
David Haessig: Shining a Light on Blackout Solutions

Local reader responds to Ray Lutz's blog.

Editor's Note: This email is in response to the blog by Ray Lutz.

Ray's theory is that the Arizona line was up but not stable, and that the instability worked its way into the rest of the system. While Ray's theory may be correct, I did not see any power fluctuations here, even on a recording instrument. The power just abruptly dropped offline.

Another scenario is that the load could not be supported without the Arizona line: when it went down, San Onofre and the other remaining power sources could not carry the load, so they shut down.

If the remaining power sources had been able to supply the load, the outage would not have occurred. Maybe we should not allow the connected load to exceed the connected capacity minus the output of the largest connected power source; call that limit ML-1.

There are some solutions to consider:

1. As the load approaches ML-1, start shedding loads. If this is not sufficient, start rolling blackouts.

2. Permanently add enough capacity so the load cannot reach ML-1.

3. Permanently reduce the load so ML-1 cannot be reached.

Solution 1 is not popular, but it is less objectionable than a massive 12-hour blackout. The utility has peaking stations and backup power plants that can be started and connected when the load approaches ML-1, and that would have helped.
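
To make the ML-1 idea concrete, here is a rough sketch in Python. The source names, capacities, peaker figure, and load below are made-up illustrative numbers, not SDG&E's actual figures:

# Rough illustration of the ML-1 rule with hypothetical numbers, in MW.
sources = {"San Onofre": 2200, "Arizona import": 1900, "local plants": 1300}
peakers_available = 400      # assumed standby peaking capacity
load = 3300                  # assumed connected load

total_capacity = sum(sources.values())
ml_1 = total_capacity - max(sources.values())  # capacity minus the largest single source

if load > ml_1:
    shortfall = load - ml_1
    if shortfall <= peakers_available:
        print(f"Start peakers to restore {shortfall} MW of margin")
    else:
        print(f"Shed {shortfall - peakers_available} MW of load (rolling blackouts)")
else:
    print("Losing the largest single source should still leave the load covered")

This is essentially the familiar N-1 reserve idea: keep enough headroom that losing the single largest source does not leave the connected load unserved.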

Solar is a good way to permanently add capacity to the grid. These peak power demands always occur when it is hot and sunny, and sunny means solar panels are putting power into the grid. As long as the grid is up! And large solar arrays, like those planned for the Sunrise Power Link, may be designed to carry load on their own, acting as an independent power source. In my scenario, the Sunrise Power Link may have been large enough to prevent the blackout.

Additional energy conservation measures will reduce the load. Many of us already have energy-conserving features in our homes and workplaces, and technology keeps finding more ways to save.

The power utilities' business is selling power profitably. Regulation and public image help them focus on the cost and quality of their service. Before this happened, all lines were loaded, the costly-to-operate peaking plants were off, and there was no reserve. That is an ideal situation for selling power at the best margin. Maybe regulation and image could help them focus more on the quality of service.

Finally, we have to look at growth.  It seems we allow growth before supporting infrastructure is in place.

—David Haessig

Jason September 22, 2011 at 09:20 PM
It's still too early to be crowing about what should have been done and what wasn't done last week. The ISO and the utilities are still investigating the events of last Thursday, so we're still waiting for many critical details about what failed and why. But we know this: the entire sequence from the initial failure in Arizona to the complete countywide blackout took twelve minutes. Is twelve minutes long enough to bring additional power sources online? Would they have provided sufficient capacity? Is twelve minutes long enough to commence rolling blackouts? Could anyone responsible for those decisions have determined that those were the correct responses within twelve minutes? And if those responses were wrong, what are the consequences? Damage to overloaded equipment at San Onofre or a distribution substation would surely run into the millions, and repairs would take days or weeks. How would that downtime affect our businesses, careers, and livelihoods? If a transmission line overloads, overheats, and breaks, might it start another large fire? 2007 isn't that distant a memory. People on the East Coast and in the Midwest lost power for over a week due to storms this summer and last winter. We were back up in twelve hours. Whatever the cause, it feels like we got off easy.
PeterD September 23, 2011 at 11:41 PM
I think what Dave was suggesting is that as the load approached its maximum (ML-1), those peaking stations would be spinning up and would be able to compensate for a sudden loss. You might still have seen this type of problem if the load at the time of the Arizona incident was below the threshold to spin up the peaking generators and the incident itself caused more of a supply decrease than the load safety margin. Another alternative is that the monitoring instruments at Dave's house were isolated from the systems that were being disrupted (i.e., the voltage fluctuations would not be felt system-wide until the sources shut it down to zero). If that were the case, then you could get Ray's runaway resonance on a main line and not see anything past a substation. I don't know if that's possible, though. It would be nice to actually know what happened, but I suspect that because backup systems are overhead and dip into profits, you will only see them through regulation of the regional monopoly.
