Ever since I can remember, a rough but useful rule of thumb to describe the sources of U.S. electricity generation is that 50% comes from coal-fired plants, 20% comes from nuclear, and the rest comes from natural gas, hydro, and various other renewable sources.
That steady relationship has been changing over the past few years, and markedly so since the middle of 2011. So much so that this past Friday, the U.S. Energy Information Administration (EIA) announced that in April 2012, for the first time since EIA began collecting monthly data almost 40 years ago, generation from natural gas-fired plants was virtually equal to generation from coal-fired plants, with each fuel providing 32% of total megawatt-hours of power.
There are a number of explanations for the trend, but an important one is natural gas prices near historic lows. The chart below shows monthly Henry Hub spot prices for natural gas (the red bars) plotted against the share of coal-fired (the blue line) and natural gas-fired (the green line) generation out of total megawatt-hours of generation in the United States.
As natural gas prices have fallen, cheaper natural gas-fired generation is increasingly dispatched at the expense of pricier coal-fired generation. The result: as natural gas-fired generation displaces coal-fired generation, natural gas accounts for a growing share of total generation.
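The dispatch logic at work here is merit order: grid operators run the cheapest available plants first, so when gas's marginal cost drops below coal's, the fleets swap places in the queue. Here is a minimal sketch of that idea; the plant names, capacities, and fuel costs are hypothetical illustrations, not EIA figures.

```python
def dispatch(plants, demand_mw):
    """Dispatch plants cheapest-first (merit order) until demand is met.

    plants: list of (name, capacity_mw, marginal_cost_per_mwh) tuples.
    Returns a dict of name -> MW actually dispatched.
    """
    output = {}
    remaining = demand_mw
    # Sort by marginal cost so the cheapest units are called on first.
    for name, capacity, cost in sorted(plants, key=lambda p: p[2]):
        used = min(capacity, max(remaining, 0))
        output[name] = used
        remaining -= used
    return output

# Hypothetical two-plant fleet: (name, capacity MW, marginal cost $/MWh).
# When gas fuel is expensive, coal is dispatched first...
fleet_high_gas = [("coal", 500, 25), ("gas", 500, 40)]
# ...but when gas fuel is cheap, gas jumps ahead of coal in merit order.
fleet_low_gas = [("coal", 500, 25), ("gas", 500, 20)]

print(dispatch(fleet_high_gas, 700))  # coal runs flat out, gas fills the gap
print(dispatch(fleet_low_gas, 700))   # cheap gas displaces coal
```

With identical demand, the only change is the gas plant's fuel cost, yet gas goes from supplying 200 MW to 500 MW: the same mechanism behind gas's rising share of U.S. generation.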
In fact, it would not be surprising if natural gas's share pulled ahead of coal's over the next couple of months. As you can see in the two charts above, generation from natural gas-fired plants as a share of total generation typically rises in the summer. Why? Because most peaking units that turn on to meet demand for air conditioning on hot summer days are gas-fired.