We know I = P/V, so if V is increased, I decreases for constant power. But as soon as the supply voltage is increased, the current and the voltage across the distant load will change too, so how can the power be constant?
Also, from Kirchhoff's voltage law (or by directly applying Ohm's law), the current should increase if the source voltage is increased.
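To make the numbers behind my question concrete, here is a minimal sketch of the constant-power comparison as I understand it. All values (load power, line resistance, the two transmission voltages) are assumed for illustration; the premise is that the power the load demands is held fixed while the transmission voltage changes.

```python
# Sketch of the constant-power comparison behind I = P/V.
# Assumed scenario: the *delivered* load power P is fixed,
# and we compare two transmission voltages over the same line.

def line_current(p_load, v_line):
    """Current drawn when p_load watts are sent at v_line volts."""
    return p_load / v_line

def line_loss(p_load, v_line, r_line):
    """I^2 * R loss dissipated in the line resistance r_line."""
    i = line_current(p_load, v_line)
    return i ** 2 * r_line

P = 10_000.0   # W, power the load demands (held constant by assumption)
R = 1.0        # ohm, line resistance (assumed)

for v in (100.0, 1_000.0):
    i = line_current(P, v)
    loss = line_loss(P, v, R)
    print(f"V = {v:6.0f} V  ->  I = {i:6.1f} A, line loss = {loss:8.1f} W")
```

With these assumed numbers, raising the transmission voltage from 100 V to 1000 V cuts the line current from 100 A to 10 A and the I²R loss from 10 kW to 100 W. My question is why the load power gets to be treated as the fixed quantity here, rather than the current rising with the source voltage as Ohm's law alone would suggest.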
Could someone please explain this to me?