How far can electricity be transferred over wires from a power station before the loss factor is too great? — JD, New York NY
That depends on the electricity’s voltage. The transmission lines carrying the electricity are important parts of the overall electric circuit: they waste electric power as they carry current, and the amount of power they waste is proportional to the square of the current they carry. The purpose of high-voltage transmission lines is to send as small a current as possible across the countryside so that the wires waste as little power as possible. This reduction in current is possible only if each electric charge moving in that current carries a large amount of energy, that is, if the current consists of high-voltage charges. In short, higher-voltage transmission lines carry smaller currents and waste less power than lower-voltage transmission lines.
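To see how strongly voltage matters, here is a minimal sketch of the arithmetic, using purely illustrative numbers (the 1 MW load and the 10-ohm line resistance are assumptions, not real grid figures). The wire loss is the current squared times the wire resistance, and the current needed to deliver a given power falls as the voltage rises:

```python
# Sketch: why high-voltage transmission wastes less power.
# Assumed numbers (illustrative only): deliver 1 MW through wires
# with 10 ohms of total resistance, at two different voltages.

def line_loss(power_w, voltage_v, resistance_ohm):
    """Power dissipated in the wires: P_loss = I^2 * R, with I = P / V."""
    current = power_w / voltage_v
    return current ** 2 * resistance_ohm

P = 1_000_000   # 1 MW to deliver
R = 10.0        # assumed total wire resistance, in ohms

low = line_loss(P, 10_000, R)     # 10 kV line: 100 A flowing
high = line_loss(P, 100_000, R)   # 100 kV line: only 10 A flowing

print(f"Loss at 10 kV:  {low / 1000:.0f} kW")   # 100 kW lost
print(f"Loss at 100 kV: {high / 1000:.0f} kW")  # 1 kW lost
```

Raising the voltage tenfold cuts the current tenfold and the wasted power a hundredfold, which is exactly the square-of-the-current effect described above.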
When Thomas Edison set out to electrify New York City, he used direct current at the highest practical household voltage. Nonetheless, his relatively low-voltage transmission lines wasted so much power that he had to scatter generating plants throughout the city so that no home was far from a power plant. But when George Westinghouse and Nikola Tesla realized that they could use alternating current and transformers to temporarily convert the household power to high voltages and small currents, they were able to send power long distances without wasting much of it. That realization eventually destroyed Edison’s direct current electric system and gave us the modern alternating current system. It’s now common to send electric power several hundred miles through high-voltage transmission lines. At those distances, perhaps half the power is lost en route. I doubt that transmission of power more than 1,000 miles is practical.