Why not a simple round number, say 100 and 200 V? Does it have something to do with historical reasons? Or is the average value of the AC supply somehow tied to its peak value?
I think the reason for this voltage level is historic. The original "killer app" for power distribution to homes was electric lighting, and Thomas Edison, who devised the first practical light bulb, determined that his bulbs would be powered by about 100 to 110V, as an optimal choice for his filaments. To compensate for IR losses in the wiring, distribution was set at around 120V, which Edison insisted should be DC. You can search online and find early Edison advertisements where the voltage required for his bulbs is listed as 100 to 130V.
When AC distribution won out, the final drop to individual homes in the US became 240V center-tapped: the wiring in the house was divided roughly evenly into two sets of 120V-to-neutral branch circuits, plus a few 240V outlets with safety ground for the heavy loads. Three-phase 208V service is sometimes supplied to US homes as well, where each phase to neutral provides 120V. But it's not that common, and the disadvantage is that you don't have 240V single-phase available for heavy appliances (although some 208V appliances are available, or you can make do with 208V instead of 240V).
Until the 1960s, some European countries, certainly Italy and France, also had domestic 120 and 240 (roughly, some labeled it 110 and 220, 115 and 230, or what have you). Just like in the US, European homes had 110-120 in their ubiquitous outlets, and 220-240 for water heaters, electric stoves, and the like.
In short, I think the reason for the 100+ selection was Edison's filament light bulbs. That's what they used! The choice of 120V was to compensate for IR drop. The reason for 220-240 was that with AC distribution, this became a convenient voltage level to deliver to homes. And the reason why Europe dropped the 120V domestic distribution was simply economics. You can save on copper.
The way power is distributed through neighborhoods differs, though. In the US it's typically 12,500V three-phase, then usually stepped down to 240V center-tapped single-phase for each home. In parts of Europe it appears to be about 400-415V three-phase to the building, which works out to roughly 230-240V single-phase (line-to-neutral) at the outlets.
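As a rough illustration of the copper-saving argument above, here is a minimal sketch with assumed example figures for load power and wire resistance (not from the original post); it simply applies I = P/V and P_loss = I²R:

```python
# Rough sketch: why a higher distribution voltage "saves copper".
# For the same delivered power, current halves when the voltage doubles,
# and resistive loss in the wiring goes as I^2 * R.

def line_current(power_w, voltage_v):
    """Current drawn by a load of the given power at the given voltage."""
    return power_w / voltage_v

def wiring_loss(power_w, voltage_v, wire_resistance_ohm):
    """I^2 * R loss in the supply conductors for that load."""
    i = line_current(power_w, voltage_v)
    return i ** 2 * wire_resistance_ohm

P = 2400.0   # example load in watts (assumed figure)
R = 0.5      # example round-trip wire resistance in ohms (assumed figure)

for v in (120.0, 240.0):
    print(f"{v:>5.0f} V: I = {line_current(P, v):5.1f} A, "
          f"loss = {wiring_loss(P, v, R):6.1f} W")

# Doubling the voltage quarters the I^2*R loss, so thinner (cheaper)
# conductors can deliver the same power -- the "save on copper" point.
```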
Historically, the electricity delivered to homes and businesses was first DC and then changed to AC. The standard voltage level started at 110V, went to 240V, back to 110V, and then to 220V. The frequency started at 60Hz and then went to 50Hz in most areas.
Early in the history of electricity, Thomas Edison's General Electric Company was distributing DC electricity at 110 volts in the United States. Then Nikola Tesla devised a system of three-phase AC electricity at 240 volts, having calculated that 60Hz was the most effective frequency. Tesla later compromised and reduced the voltage to 120 volts for safety reasons.
Tesla's AC system became the standard in the United States. Meanwhile, the German company AEG started generating electricity and became a virtual monopoly in Europe. They decided to use 50Hz instead of 60Hz to better fit their metric standards, but they stayed with 120V.
Unfortunately, at 120V, 50Hz AC has greater losses and is not as efficient as 60Hz. Due to the slower speed, 50Hz electrical generators are about 20% less effective than 60Hz generators. Electrical transmission at 50Hz is about 10-15% less efficient. 50Hz transformers require larger cores and windings, and 50Hz electric motors are less efficient than those meant to run at 60Hz; they are more costly to make because they must handle the electrical losses and the extra heat generated at the lower frequency.
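One way to quantify the transformer-size point is the standard transformer EMF equation, V_rms = 4.44·f·N·A·B_max: for the same voltage, turns and peak flux density, a 50Hz design needs a core area about 60/50 = 1.2 times larger. A minimal sketch (the voltage, turns and flux-density figures are assumed example values, not from the post):

```python
# Sketch of the transformer-size argument using the EMF equation
# V_rms = 4.44 * f * N * A * B_max. For the same voltage, turns and peak
# flux density, a 50 Hz core needs a larger cross-section than a 60 Hz one.

def required_core_area(v_rms, freq_hz, turns, b_max):
    """Core cross-section (m^2) implied by the EMF equation."""
    return v_rms / (4.44 * freq_hz * turns * b_max)

V, N, B = 230.0, 500, 1.5   # assumed example values

a50 = required_core_area(V, 50.0, N, B)
a60 = required_core_area(V, 60.0, N, B)
print(f"50 Hz core area: {a50 * 1e4:.2f} cm^2")
print(f"60 Hz core area: {a60 * 1e4:.2f} cm^2")
print(f"ratio 50 Hz / 60 Hz: {a50 / a60:.2f}")   # 60/50 = 1.2, i.e. ~20% larger
```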
Europe stayed at 120V AC until the 1950s, just after World War II. They then switched over to 230V for better efficiency in electrical transmission. Great Britain not only switched to 230V, but they also changed from 60Hz to 50Hz to follow the European lead. Since many people did not yet have electrical appliances in Europe after the war, the change-over was not that expensive for them.
The United States also considered converting to 220V for home use but felt it would be too costly, due to all the 120V electrical appliances people had. A compromise was made in the U.S. in that 240V would come into the house where it would be split to 120V to power most appliances. Certain household appliances such as the electric stove and electric clothes dryer would be powered at 240V.
The voltage and frequency of AC electricity varies from country to country throughout the world. Most use 230V and 50Hz. About 20% of the countries use 110V and/or 60Hz to power their homes. 240V and 60Hz are the most efficient values, but only a few countries use that combination.
The values have come from the early use of batteries to provide a DC supply. A primary zinc-carbon cell provided 1.5 volts per cell, while secondary lead-acid cells provided 2 volts per cell. Thus the common voltage factor for a power supply was 6 volts, a battery voltage that could be built from either primary cells (4 x 1.5 V) or secondary cells (3 x 2 V). DC voltages in multiples of 6 therefore became standard: 6, 12, 24, 48, and so on. For higher voltages, a simple series connection of a round 20 such 6-volt batteries gave 120 volts, while 40 gave 240 volts. Many countries around the world follow the 120-volt or 240-volt standards even today for AC supply. Later, voltages such as 110 volts, 220 volts and so on were standardized for higher levels as well, including 11 kV and 33 kV, seemingly just for the appeal of identical first two digits.
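A tiny numerical sketch of those multiples (the cell voltages are from the post above; the code itself is purely illustrative):

```python
# 6 V is the smallest voltage reachable with a whole number of both
# 1.5 V primary cells and 2 V lead-acid cells; round numbers of 6 V
# batteries in series then give 120 V and 240 V.

primary_cell, secondary_cell = 1.5, 2.0
print(6.0 / primary_cell, 6.0 / secondary_cell)   # 4.0 cells and 3.0 cells
print(20 * 6.0, 40 * 6.0)                         # 120.0 and 240.0 volts
```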
My question remains unanswered - why this odd choice of a multiple of 11? Agreed that a usual dry cell may be taken as an early reference at 1.5 V, and a multiple of that, such as 150 V, could just as well have been taken. Even if you take 150 V as a peak value, its RMS would be around 106V. I still fail to see any reason for adopting these odd numbers like 11. Of course, whether it happens to be based on conversion from earlier units of voltage, like statvolts, I do not know.
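For reference, a quick check of that peak-to-RMS arithmetic (a minimal sketch; the numbers just echo the comment above):

```python
# For a sine wave, V_rms = V_peak / sqrt(2), so a 150 V peak corresponds
# to roughly 106 V rms, and a 110 V rms supply peaks at about 156 V.

from math import sqrt

print(150 / sqrt(2))   # ~106.1 V rms for a 150 V peak
print(110 * sqrt(2))   # ~155.6 V peak for a 110 V rms supply
```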
I don't think it should be treated as a multiple of 11, since not all the possible 'multiples' are used. My opinion is that just the first two digits were chosen to be identical. So we had 110V for domestic supply in some places in the world, 220V similarly, 440V for some industrial loads, 550V DC for the Calcutta Tramways, etc. Nowadays, for new supply systems, such a convention is not followed. In India, for example, the 3-phase line voltage is 400V, which makes the line-to-neutral voltage (for single phase) about 230V. The Railways use 25kV. HV transmission lines use 132kV.
Look, I am not asking the question you are trying to answer. My question is simply why the norms chosen were 110 or 220 rather than, say, 100V or 200V. I know the multiples come from transformer step-up/step-down ratios for AC. My question is simply why these odd values became the standard. If you choose 200 V you get practically the same wattage and a similarly reasonable wire thickness to carry it as for 220V.
Yes, but again, originally it was Tesla and Edison's decision based on efficiency "at that time" (and it was 120 and 240 V). But once you have a setting (no matter whether 110, 120, 220, 230, or 240) widely spread and the appliances adapted to it, it is a big step to jump out of the box and install something merely "more logical".
So, Samuel, it was historically fixed by Edison for whatever reasons. Thanks for that info. However, it is a bit strange that it was set at 110 or so without much reason, and that most countries adopted it even while accepting different frequencies of 50 or 60 Hz. I mean, appliances are available for trains, cars, etc. based on a 12 V DC supply too. Strangely, this selection of norms went unchallenged.
But I think there were only practical reasons, like multiples of the battery voltage and efficiency considerations (50 vs 60 Hz). And once somebody starts, the rest follow.
Now I am again confused. There is seemingly no practical reason for fixing 220 instead of 200V. I agree somebody fixed it and others followed. But when the volt is already defined as a unit, fixing 110 or 220 instead of a round figure does not go down so easily.
I think there is no practical reason for fixing 110 or 220.
It is only a standard, and every manufacturing company follows it.
There is no practical logic behind it; I suppose the first manufactured products happened to be rated at 110 and 220, and after that every manufacturer followed, since anything else would require a voltage regulator, so it is better to follow the standard.
@Hanno Krieger - had it been a multiple of battery voltages, that would have been fine; in that case 240 V or 120 V is quite legitimate. But fixing 110V and its multiples lacks logic. Agreed, we adopt certain things because, as you say, they were historically fixed, by Edison in this case. But it would have helped in quick calculations had we followed a decimal scheme in this area as well - like 100V or 200V.
It depends on the voltage level of the distribution step-down transformer. For an 11/0.415 kV transformer, for instance, the secondary three-phase supply is rated at 415 V, so the voltage for a single-phase load is 415/1.732 ≈ 240 V. At 240 V AC the load current is small, which saves copper; at 110 V AC the current is high.
The standard distribution voltage level currently is 11 kV. This is reduced to a domestic supply of 415/230 volts via a distribution transformer rated at 11/0.415 kV. The 415 volts is a three-phase supply for three-phase electrical equipment such as motors, and 230 V is for consumer appliances such as TVs, computers, etc.
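A quick check of the line-to-phase relation used in the last two posts, V_phase = V_line/√3 (the list of line voltages is just illustrative):

```python
# For a balanced three-phase supply, the line-to-neutral (single-phase)
# voltage is the line-to-line voltage divided by sqrt(3).

from math import sqrt

for v_line in (415.0, 400.0, 380.0):
    print(f"{v_line:.0f} V line-to-line -> "
          f"{v_line / sqrt(3):.0f} V line-to-neutral")
# 415 V -> ~240 V, 400 V -> ~231 V, 380 V -> ~219 V
```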