The power (P) required to run a motor is equal to the voltage (E) applied to that motor times the current (I) supplied to the motor. If the motor data says the motor uses 180 watts of power and the voltage applied to the motor is 120 volts, how much current will the motor require?

Assume DC operation, where the voltage is constant.

P = EI
P = 180 W
E = 120 V
Solve for I (answer in amperes)

To calculate the current (I) required by the motor, we can use the formula:

P = E x I

Given that the power (P) is 180 watts and the voltage (E) is 120 volts, we can rearrange the formula to solve for the current (I):

I = P / E

Substituting the given values into the formula:

I = 180 watts / 120 volts

Simplifying the calculation:

I = 1.5 amperes

Therefore, the motor will require 1.5 amperes of current.
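The rearrangement I = P / E can be sketched as a small helper function (a minimal Python sketch; the function name and its guard against zero voltage are illustrative, not from the original problem):

```python
def motor_current(power_watts: float, voltage_volts: float) -> float:
    """Solve P = E * I for I, assuming DC with constant voltage."""
    if voltage_volts == 0:
        raise ValueError("voltage must be nonzero")
    return power_watts / voltage_volts

# Worked example from the problem: a 180 W motor on a 120 V supply.
print(motor_current(180, 120))  # 1.5 amperes
```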
