
algebra

The power (P) required to run a motor is equal to the voltage (E) applied to the motor times the current (I) supplied to it. If the motor's data sheet says it uses 180 watts of power and the voltage applied to the motor is 120 volts, how much current will the motor require?

  • algebra -

    Assume DC, so the voltage is constant.

    P = EI
    P = 180 W
    E = 120 V
    Solve for I by dividing both sides by E: I = P/E (answer in amperes)
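
    If you want to check the arithmetic, here is a minimal Python sketch; the function name solve_current is just illustrative, not part of the original problem.

    def solve_current(power_watts: float, voltage_volts: float) -> float:
        """Rearrange P = E * I to get I = P / E (current in amperes)."""
        return power_watts / voltage_volts

    # Values from the question: P = 180 W, E = 120 V
    current = solve_current(180.0, 120.0)
    print(f"I = {current} A")  # prints: I = 1.5 A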
