Railroad Forums 

  • Corridor Electric Power Generation and Distribution

  • Discussion related to Amtrak, also known as the National Railroad Passenger Corp.

Moderators: GirlOnTheTrain, mtuandrew, Tadman

 #745927  by electricron
 
farecard wrote:The above is talking a) about the 25 Hz sections and b) about the primary side, not the secondary.
My interest is in the 2x25 kV sections that replace 25 Hz with 60 Hz at higher voltage.
I've already described why 60 Hz is just as good today as 25 Hz. It's the traction motors....
I've already described why the higher traction voltage is not only better, but why....

So that only leaves the question of why there are two 25 kV traction power lines instead of one. The answer is the same reason the PRR has two 69 kV transmission lines (or, if you like, a single 138 kV center-tapped line): reliability. If one locomotive shorts out a line, or some other failure occurs, the second line is still available to supply power.


Center-tap transformers aren't required for the much lower 25 kV lines. The only reason they were used on the 138 kV transmission lines was to hold each conductor at only 69 kV relative to ground. That way they could use far fewer insulators on the lines. You don't necessarily see transmission power lines above the traction power lines on the newer 25 kV 60 Hz systems......

Here's a general rule of thumb for transmission lines, although it's not exact......
A generator, transformer, or other electric source can push electricity down a line at roughly 1,000 volts per mile: a 138 kV source can push it about 138 miles, a 25 kV source about 25 miles, a 12 kV source about 12 miles. Since the newer 25 kV traction systems use 60 Hz power, ordinary power from the grid can feed each 25 kV rail power transformer. That wasn't true with the old 25 Hz systems; motor-generator sets were required to convert 60 Hz to 25 Hz, accomplished by giving the motor and the generator different numbers of poles. That's why 138 kV was needed at 25 Hz: to reduce the number of motor-generator sets along the railroad corridor, because that was probably the only 25 Hz line in the area, if not the entire USA.
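To make those two rules of thumb concrete, here's a minimal Python sketch. It's illustrative only: the reach rule is approximate, and the 24-pole/10-pole combination is just one hypothetical pairing that yields 25 Hz, not a claim about actual PRR hardware.

Code:

    # Rough transmission reach: about 1 mile per 1,000 volts of source voltage.
    def reach_miles(kilovolts):
        return kilovolts  # 138 kV -> ~138 miles, 25 kV -> ~25 miles

    # Motor-generator set: a synchronous motor locks to the grid frequency,
    # and the generator's pole count sets the output (f = poles * RPM / 120).
    def mg_output_hz(grid_hz, motor_poles, generator_poles):
        rpm = 120.0 * grid_hz / motor_poles    # synchronous speed of the motor
        return generator_poles * rpm / 120.0   # frequency the generator produces

    print(reach_miles(138), reach_miles(25), reach_miles(12))  # ~138, ~25, ~12 miles
    print(mg_output_hz(60, 24, 10))                            # 25.0 Hz

A 24-pole motor on 60 Hz turns at 300 RPM, so a 10-pole generator on the same shaft produces 25 Hz; the 5:12 pole ratio does the conversion.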
 #745961  by Nasadowsk
 
electricron wrote:That's why 138 kV was needed at 25 Hz: to reduce the number of motor-generator sets along the railroad corridor, because that was probably the only 25 Hz line in the area, if not the entire USA.
Circa 1930, 25 Hz power was still quite common in the US. AFAIK, today, Amtrak is the sole user of 25 Hz left in North America, if not the world.
 #745966  by farecard
 
Nasadowsk wrote: Circa 1930, 25 Hz power was still quite common in the US. AFAIK, today, Amtrak is the sole user of 25 Hz left in North America, if not the world.
Indeed, ISTM many of the original Niagara Falls generators are 25 Hz.
 #746005  by electricron
 
The original Niagara Falls generators were designed for DC operation, not AC. That's why, as AC generators, they were initially 25 Hz.....for the exact same reason I gave before.....

Read http://en.wikipedia.org/wiki/Utility_frequency for more information.

History
Many different power frequencies were used in the 19th century. Very early isolated AC generating schemes used arbitrary frequencies based on convenience for steam engine, water turbine and electrical generator design. Frequencies between 16⅔ Hz and 133⅓ Hz were used on different systems. The proliferation of frequencies grew out of the rapid development of electrical machines in the period 1880 through 1900. In the early incandescent lighting period, single-phase AC was common and typical generators were 8-pole machines operated at 2000 RPM, giving a frequency of 133 cycles per second.
The oldest continuously-operating commercial hydroelectric power plant in the United States, at Mechanicville, New York, still produces electric power at 40 Hz and supplies power to the local 60 Hz transmission system through frequency changers.

Why 50 Hz?
The German company AEG (descended from a company founded by Edison in Germany) built the first German generating facility to run at 50 Hz, allegedly because 60 was not a preferred number. AEG's choice of 50 Hz is thought by some to relate to a more "metric-friendly" number than 60. At the time, AEG had a virtual monopoly and their standard spread to the rest of Europe. After observing flicker of lamps operated by the 40 Hz power transmitted by the Lauffen-Frankfurt link in 1891, AEG raised their standard frequency to 50 Hz in 1891.

Why 60 Hz?
Westinghouse Electric decided to standardize on a lower frequency to permit operation of both electric lighting and induction motors on the same generating system. Although 50 Hz was suitable for both, in 1890 Westinghouse considered that existing arc-lighting equipment operated slightly better on 60 Hz, and so that frequency was chosen. Frequencies much below 50 Hz gave noticeable flicker of arc or incandescent lighting. The operation of Tesla's induction motor required a lower frequency than the 133 Hz common for lighting systems in 1890. In 1893 General Electric Corporation, which was affiliated with AEG in Germany, built a generating project at Mill Creek, California using 50 Hz, but changed to 60 Hz a year later to maintain market share with the Westinghouse standard.

25 Hz origins
The first generators at the Niagara Falls project, built by Westinghouse in 1895, were 25 Hz because the turbine speed had already been set before alternating current power transmission had been definitively selected. Westinghouse would have selected a low frequency of 30 Hz to drive motor loads, but the turbines for the project had already been specified at 250 RPM. The machines could have been made to deliver 16⅔ Hz power suitable for heavy commutator-type motors but the Westinghouse company objected that this would be undesirable for lighting, and suggested 33⅓ Hz. Eventually a compromise of 25 Hz, with 12 pole 250 RPM generators, was chosen. Because the Niagara project was so influential on electric power systems design, 25 Hz prevailed as the North American standard for low-frequency AC.

Standardization
As the 20th century continued, more power was produced at 60 Hz (North America) or 50 Hz (Europe and most of Asia). Standardization allowed international trade in electrical equipment. Much later, the use of standard frequencies allowed interconnection of power grids. It wasn't until after World War II, with the advent of affordable electrical consumer goods, that more uniform standards were enacted.

Stability
Regulation of power system frequency for timekeeping accuracy was not commonplace until after 1926 and the invention of the electric clock driven by a synchronous motor. Network operators will regulate the daily average frequency so that clocks stay within a few seconds of correct time. In practice the nominal frequency is raised or lowered by a specific percentage to maintain synchronization.

To add, 25 Hz in the NE USA was the standard low-frequency power grid for high-horsepower motors (industrial loads). That's just one reason why the PRR chose 25 Hz to power the NE Corridor 20 to 25 years after the Niagara Falls generators were built. 60 Hz was the standard for lighting loads. Few Americans had heavy motors in their homes, therefore most Americans had 60 Hz power to their homes.
 #746032  by MudLake
 
OK, I'm confused. Where does it say or imply that the Niagara power stations were designed for DC power generation? What I picked up was they were designed before a standard frequency was established in the USA.
 #746041  by farecard
 
More to the point, we are oscillating further and further away from my query, which is expressly about the 2x25 system with autoformers.

I still can't see the I^2R loss advantage of the 2x system, as compared to a single-ended system with a parallel feeder periodically bonded to it.

See http://www.irfca.org/docs/traction-schematics.html and note that each conductor has half the current, and thus 25% of the I^2R loss...feeder side. {The return losses are another issue...}

But if you draw out a simple cat + parallel feeder of the same size as above, guess what....each cable gets half the current, and thus its I^2R losses are....
 #746054  by David Benton
 
Because if you have a parallel feed system, both feeds are at 25 kV, whereas with the 50 kV system a good portion of the feed is at 50 kV, hence 1/2 the current and 1/4 of the loss.
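A minimal sketch of that arithmetic, with made-up numbers (the load and feeder resistance are hypothetical, not NEC data): for the same delivered power, doubling the feed voltage halves the current and quarters the I^2R loss.

Code:

    def line_loss_watts(power_w, volts, resistance_ohms):
        current = power_w / volts        # I = P / V for the same delivered power
        return current ** 2 * resistance_ohms

    P, R = 5e6, 0.5                      # hypothetical 5 MW train load, 0.5 ohm of feeder
    print(line_loss_watts(P, 25e3, R))   # 25 kV feed: 200 A -> 20,000 W lost
    print(line_loss_watts(P, 50e3, R))   # 50 kV feed: 100 A ->  5,000 W lost (1/4)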
 #746132  by electricron
 
MudLake wrote:OK, I'm confused. Where does it say or imply that the Niagara power stations were designed for DC power generation? What I picked up was they were designed before a standard frequency was established in the USA.
I thought I bolded the text so anyone could see it...

"before alternating current power transmission had been definitively selected".

That means before AC power transmission had been definitively selected.
Which means they initially designed Niagara Falls power stations for DC.

If the text read before AC frequency had been definitively selected, you'd be correct.

I can't believe intelligent people can't comprehend what they read nowadays....
 #746143  by electricron
 
farecard wrote:More to the point, we are oscillating further and further away from my query, which is expressly about the 2x25 system with autoformers.

I still can't see the I^2R loss advantage of the 2x system, as compared to a single-ended system with a parallel feeder periodically bonded to it.

See http://www.irfca.org/docs/traction-schematics.html and note that each conductor has half the current, and thus 25% of the I^2R loss...feeder side. {The return losses are another issue...}

But if you draw out a simple cat + parallel feeder of the same size as above, guess what....each cable gets half the current, and thus its I^2R losses are....
There are three drawings on the page you link. If the current I on the catenary is 1/2, then the I^2R losses are 1/4.
Kirchhoff's laws come into play with circuits, as the arrows in the drawings try to show.......

Everyone eventually gets Ohm's Law and its variants (E=IR), but not everyone sees Kirchhoff's laws.
http://en.wikipedia.org/wiki/Kirchhoff%27s_circuit_laws
Kirchhoff's current law (KCL)
The current entering any junction is equal to the current leaving that junction.
This law is also called Kirchhoff's point rule, Kirchhoff's junction rule (or nodal rule), and Kirchhoff's first rule.
The principle of conservation of electric charge implies that:
At any node (junction) in an electrical circuit, the sum of currents flowing into that node is equal to the sum of currents flowing out of that node.

The principle of conservation of electric charge stands hand in hand with the principles of conservation of matter and energy. And that's not always an easy idea to see.

Basically, by having two 25 kV lines to the motor of the locomotive, each catenary line carries half the current, therefore each line has 1/4 of the I^2R losses. But since there are two paths, the equivalent is just 1/2 of the I^2R losses. That's still a significant savings.

And while it is true the extra length of wire used for the second, farther loop increases I^2R losses somewhat, it's not twice as much, therefore overall there are I^2R loss savings.
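A minimal sketch of that split-current arithmetic (illustrative numbers only): one conductor carrying the full current versus two identical conductors each carrying half.

Code:

    I, R = 200.0, 0.5                    # hypothetical: 200 A load, 0.5 ohm per conductor
    single = I ** 2 * R                  # one path:  I^2 * R          = 20,000 W
    split = 2 * (I / 2) ** 2 * R         # two paths: 2 * (I/2)^2 * R  = 10,000 W
    print(single, split, split / single) # -> 20000.0 10000.0 0.5 (half the loss)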

Additionally, you have to remember this is an AC circuit. The current flows one way on one half-cycle and the opposite way on the other. In the real world, at 60 Hz the instantaneous current passes through zero 120 times a second.
 #746150  by Ken W2KB
 
electricron wrote: As the 20th century continued, more power was produced at 60 Hz (North America) or 50 Hz (Europe and most of Asia). Standardization allowed international trade in electrical equipment. Much later, the use of standard frequencies allowed interconnection of power grids. It wasn't until after World War II, with the advent of affordable electrical consumer goods, that more uniform standards were enacted.
The first interconnection was established in 1927 by three different utility companies, at 60 Hz, and is still in existence: the Pennsylvania-New Jersey-Maryland Interconnection ("PJM"). It encompasses a much larger area today.
 #746182  by timz
 
electricron wrote:
MudLake wrote:OK, I'm confused. Where does it say or imply that the Niagara power stations were designed for DC power generation? What I picked up was they were designed before a standard frequency was established in the USA.
...I can't believe intelligent people can't comprehend what they read nowadays...
All us unintelligent people are still confused. Maybe when you said Niagara was designed for DC, you didn't mean to imply it was built DC? It actually got redesigned, then built as AC? Or is that not what this means:

"The first generators at the Niagara Falls project, built by Westinghouse in 1895, were 25 Hz because ..."
 #746227  by MudLake
 
electricron wrote:
MudLake wrote:OK, I'm confused. Where does it say or imply that the Niagara power stations were designed for DC power generation? What I picked up was they were designed before a standard frequency was established in the USA.
I thought I bolded the text so anyone could see it...

"before alternating current power transmission had been definitively selected".

That means before AC power transmission had been definitively selected.
Which means they initially designed Niagara Falls power stations for DC.

If the text read before AC frequency had been definitively selected, you'd be correct.

I can't believe intelligent people can't comprehend what they read nowadays....
I had already read the Wikipedia article on my own. What makes no sense is that the generating capacity being designed at Niagara Falls far outstripped what could be consumed as DC power within a radius of a mile or two. That's why I'm skeptical of the timeline and the associated explanation.
 #746360  by PRRTechFan
 
A PRR/NEC two-wire, single-phase 138 kV transmission circuit is fed from a center-tapped 138 kV transformer, but the center tap is grounded through a resistor. If the transformer were not grounded at all, the voltage from the transmission lines to ground could fluctuate all over the place, and under certain circumstances the potential to ground could even exceed 138 kV, surpassing the rating of the insulators and causing a flashover to ground.

If the transformer center tap were solidly grounded and a transmission conductor shorted to ground because of a broken insulator or a ground wire coming in contact with the transmission wire, excessive current would flow to ground and the line would be tripped out.

But resistance grounding provides a ground reference to the circuit, keeping both wires in the circuit at about 69 kV above ground; if one wire should short to ground, the resistor limits the current to ground to about 200 amps. The abnormal "ground current" is detected by protective relays and trips an alarm, but the circuit stays energized.
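For what it's worth, Ohm's law gives a rough size for such a grounding resistor from the two figures in that paragraph. This is a back-of-the-envelope sketch, not actual PRR hardware data:

Code:

    V_to_ground = 69e3                     # each conductor sits about 69 kV above ground
    I_fault_max = 200.0                    # desired ground-fault current limit, in amps
    R_ground = V_to_ground / I_fault_max   # Ohm's law: R = V / I
    print(R_ground)                        # -> 345.0 ohms
    print(V_to_ground * I_fault_max)       # -> ~13.8 MW the resistor must absorb while the fault persists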

One of the previous posts had a photograph showing two two-wire 138 kV transmission circuits, but four two-wire 138 kV circuits are much more prevalent along most of the original PRR NEC. Two circuits were run on each side of the right-of-way. This gave the original system tremendous redundancy. It was unlikely that any kind of accident or damage would take out the lines on both sides of the tracks simultaneously. It allowed for a circuit or two to be out for repair or maintenance without causing a shutdown or restriction of railroad operations. I'll have to research some materials I have stored away, but my recollection is that most substations had multiple step-down transformers (138 kV to 12 kV), and between the multiple transformers and the switching capabilities at each substation on the 138 kV side, transformers and transmission circuits could be switched to accommodate almost any contingency.

As for the 25kV-0-25kV question, there are two separate things to consider. One is indeed higher transmission voltage and reduced I^2R losses: for the same power, higher voltage means lower current, and loss scales with the square of the current. This is why the PRR chose 138 kV for transmission, as this was the highest voltage that equipment could reliably operate at and be insulated for at the time. Today it would probably be 345 or 500 kV.

The second consideration has to do with the cost and configuration of transformers. A single-winding autotransformer is less expensive than a standard two-winding transformer, and an autotransformer does not need as large an electrical rating as a two-winding transformer for the same load. An autotransformer is a possible economical choice when electrical isolation of the primary and secondary circuits is not required and the change in voltage is less than 2:1. In the case of a 50 kV/25 kV system, 2:1 autotransformers are a possible choice.

However, as the ratio of primary to secondary voltage rises beyond 2:1, the advantages of the autotransformer start to fade. If electrical isolation of circuits is required, an autotransformer cannot be used and a two-winding transformer is required.
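The fade-past-2:1 point follows from the standard relation that an autotransformer's windings only have to handle the fraction (1 - Vlo/Vhi) of the throughput kVA. A quick sketch (the 138 kV/12 kV pairing is just an illustrative ratio, not a claim about any actual installation):

Code:

    def winding_fraction(v_hi, v_lo):
        # fraction of the throughput kVA the windings must physically be rated for
        return 1.0 - v_lo / v_hi

    print(winding_fraction(50e3, 25e3))    # 2:1   -> 0.50 (windings rated for half the load)
    print(winding_fraction(138e3, 12e3))   # ~12:1 -> ~0.91 (almost no saving over two windings)

At 2:1 the copper and core only see half the load, which is where the cost advantage comes from; by 10:1 or more the saving nearly disappears.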

I believe that the New Haven-Boston NEC 25 kV/60 Hz electrification is a 50 kV/25 kV system, although I have yet to find the "two-line" (...single-phase, remember) diagram.
 #746381  by farecard
 
PRRTechFan wrote:

As for the 25kV-0-25kV question, there are two separate things to consider. One is indeed higher transmission voltage and reduced I^2R losses: for the same power, higher voltage means lower current, and loss scales with the square of the current. This is why the PRR chose 138 kV for transmission, as this was the highest voltage that equipment could reliably operate at and be insulated for at the time. Today it would probably be 345 or 500 kV.
True, but today it might make sense to buy more local power; i.e., stop transporting power from Timbuktu to Kalamazoo. Or they might well buy power yearly in, say, 3 of 6 possible places (IOW, short-range backhauling), and the next year use only 1 of the 3 plus two different ones. That would let them actually get bids from the possible vendors. (That's only possible, of course, once they dump the 25 Hz.)

In any case, the primary aspects are separate from my focus on the autoformer secondary.

The second consideration has to do with the cost and configuration of transformers. A single-winding autotransformer is less expensive than a standard two-winding transformer, and an autotransformer does not need as large an electrical rating as a two-winding transformer for the same load. An autotransformer is a possible economical choice when electrical isolation of the primary and secondary circuits is not required and the change in voltage is less than 2:1. In the case of a 50 kV/25 kV system, 2:1 autotransformers are a possible choice.
I really want to straighten up the thinking here. The trainsets run on 25 kV. They are fed 25 kV, from a 25 kV power line. The I^2R losses are at 25 kV. Talking about them being fed with, quote, "50 kV" just obscures the facts here. You are the first respondent to correctly grasp/state that the autoformers are NOT there to do 2:1 voltage step-down so much as to offer balance.
However, as the ratio of primary to secondary voltage rises beyond 2:1, the advantages of the autotransformer start to fade.
I'll have to think about that for a while. I'm trying to recall the issues in autoformer design vs conventional ones.

My theory thus far is that the autoformer scheme must be as much for EMI cancellation as for stemming droop. I still can't see how it helps much on {longer-distance} transmission loss. It does seem to be useful for the literal last-mile droop in the cat itself, between transformers.
I believe that the New Haven-Boston NEC 25 kV/60 Hz electrification is a 50 kV/25 kV system, although I have yet to find the "two-line" (...single-phase, remember) diagram.
I'd be very interested. I'm also curious as to the relative gauges of the cat and feeders on any autoformer section.
Last edited by farecard on Tue Dec 08, 2009 11:35 pm, edited 1 time in total.
 #746386  by farecard
 
PRRTechFan wrote:
But resistance grounding provides a ground reference to the circuit, keeping both wires in the circuit at about 69 kV above ground; if one wire should short to ground, the resistor limits the current to ground to about 200 amps. The abnormal "ground current" is detected by protective relays and trips an alarm, but the circuit stays energized.
That's an old trick widely used by the US Navy. Both sides of shipboard power floated, with a resistor to limit the float. [It does not take much current to keep the float down.] That way, if the ship took battle damage that shorted one leg to ground, nothing stopped working. Only if the other leg was also shorted did fuses blow.

In THAT case, they had "battle shorts" -- the ultimate penny in the fusebox. They'd short the fuses and fight fires rather than lose, say, main turret power. More recently, NASA Houston did the same thing in the critical minutes before the Apollo 11 touchdown.

BTW, do not EVER let a Navy electrician near your house wiring... First off, to him, black is ground....and then it gets worse.