A device having a constant resistance operates at a voltage of 100 V. If the voltage supplied to the device increases by 10 percent, how does this affect the power used by the device relative to the power consumption at 100 V?

Answer:

For a device with constant resistance, the power is P = V^2/R.

In the first case the voltage is 100 V; in the second case it is 110 V (a 10% increase).

P1 = (100)^2/R = 10000/R

P2 = (110)^2/R = 12100/R

Solving each equation for R:

R = 10000/P1

R = 12100/P2

Since R is the same in both cases:

10000/P1 = 12100/P2

Cross-multiplying:

10000 · P2 = 12100 · P1

Dividing both sides by 100:

100 · P2 = 121 · P1

So P2 = 1.21 · P1.

Therefore the power in the second scenario is 21% greater than the power consumption at 100 V.
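
As a quick numerical check, here is a minimal Python sketch (the power_ratio helper is my own, not part of the original answer):

```python
def power_ratio(v1: float, v2: float) -> float:
    """Ratio P2/P1 for a fixed resistance, since P = V^2 / R.

    The R cancels, so only the voltage ratio matters.
    """
    return (v2 / v1) ** 2

# 100 V -> 110 V (a 10% increase in voltage)
ratio = power_ratio(100.0, 110.0)
print(f"P2/P1 = {ratio:.2f}")                        # P2/P1 = 1.21
print(f"Power increase: {(ratio - 1) * 100:.0f}%")   # Power increase: 21%
```

This confirms the general rule: scaling the voltage by a factor k scales the power by k^2 when resistance is constant, so a 10% voltage increase (k = 1.1) yields a 21% power increase.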