7 Disastrous Concentration in the National Power Grid

On August 14, 2003, large portions of the Midwest and Northeast in the United States and Ontario in Canada experienced an electric power blackout. The outage affected an area with an estimated 50 million people and 61,800 megawatts of electric load in the states of Ohio, Michigan, Pennsylvania, New York, Vermont, Massachusetts, Connecticut, and New Jersey and the Canadian province of Ontario. The blackout began a few minutes after 4:00 p.m., eastern daylight time, and power was not restored for four days in some parts of the United States. Parts of Ontario suffered rolling blackouts for more than a week before full power was restored. Estimates of total costs in the United States range between $4 billion and $10 billion. (Agency 2003) The failure stranded millions of office workers and commuters from Pennsylvania to Boston and Toronto. It left Cleveland without water, shut down twenty-two nuclear plants (always a risky procedure), caused sixty fires, and in New York City alone, required eight hundred elevator rescues.

The details of the causes of the outage show the familiar string of interacting small errors. It was a hot summer day and demand was high, but that was not unusual. The device for determining the real-time state of the power system at the Midwest Independent System Operator (MISO) had to be disabled because a mismatch occurred. The device (a state estimator) was corrected, but the engineer went to lunch without re-engaging it. Normally this would not be a problem for the duration of a lunch break, but it just so happened (the refrain of normal-accident theory) that forty-five minutes later a software program that sounds an alarm to indicate an untoward event at FirstEnergy began to malfunction. (Yes, this is the company that had the hole in the head of its Davis-Besse nuclear power plant.) It was supposed to give grid operators information on what was happening on FirstEnergy’s part of the grid. Independently, perhaps, the server running the software program failed. Not to worry; there is a backup server. But when the backup server came on, the program was still in restart mode, and in that condition it failed. Moreover, it failed to indicate that it had failed, so the operators were unaware that it was not functioning properly.

Meanwhile, in the MISO office, the state estimator was restarted after lunch but again indicated a mismatch; no one knows why. (“Normal accidents” frequently have such mysteries.) At this point there was no untoward event that the program would warn about, but that unfortunately changed. Independently (normal accidents are all about independent failures whose consequences interact), faulty tree-trimming practices and the hot weather caused one of FirstEnergy’s lines to go down; just as it had cut maintenance at its Davis-Besse plant, FirstEnergy had also cut maintenance on its transmission lines. For complicated reasons only partly related to the failures, neither MISO nor the company became aware of this line failure for some time. Finally, MISO noticed and took the tripped line out of service, but FirstEnergy’s failed program did not allow FirstEnergy to know of either the trip or that the line was taken out of service. Three more lines shorted out on trees because of FirstEnergy’s cutbacks in maintenance, but the utility’s computers showed no problems. FirstEnergy became aware only when its own power went out, a signal hard to miss, and it had to switch to emergency power. By then, in just seven minutes of cascading failures, eight states and parts of Canada had blacked out. (Funk, Murray and Diemer 2004; Outage 2003, 136–37) As normal-accident theory puts it, the system was tightly coupled as well as interactively complex; hence the cascade of failures in eight states and Ontario province.

INTRODUCTION

Our national power grid, the high-voltage system that links power generation and distribution, is the single most vulnerable system in our critical infrastructure, and this is the first reason for examining it. It has been attacked by natural forces, disabled by industrial accidents, attacked by domestic terrorists, and threatened by foreign ones. It need not be as vulnerable as it is at present. The second reason for examining it is that, until recently, it was an example of a form of network structure that other systems in our critical infrastructure might learn from. The third reason is that it shows how market forces, inappropriately released and poorly regulated, can reduce its reliability.

Electric power is obviously essential for our society; virtually everything important depends on it. Large-scale outages cost billions, and inevitably some people die. But power outages from bad weather, overheated transmission lines, equipment failures, or errors by the utilities are generally of short duration—a few days at most—and many essential services have backup generators. So we are not as threatened as we would be by the radiation of a nuclear plant or the poisons of a chemical plant. A two- to four-day shutdown might only take one to two dozen lives. But the economic consequences can be very large. Even without a large blackout, the costs of routine failures are substantial. One study puts the cost of interruptions and quality problems at more than $100 billion per year in the United States. (Priman n.d.) Most important, here is where terrorists could create havoc on a much larger scale than our big blackouts have caused. They could disable equipment that could take several months to replace and would have to be custom built in a place that had power, and a really large attack could leave a substantial part of the country without power—and without the power to replace the power. In such cases the death rate would soar. I think such an attack is very unlikely, but so was 9/11.

Despite the disruptions and costs of the 2003 blackout, it is worth taking a quick look at another blackout to illustrate the resiliency of society to blackouts, or at least Canadian society. A large area of Canada, including the capital, Ottawa, was without power for weeks after three severe ice storms in short order in January 1998. In the bitter cold weather, sixty-six municipalities were without electric power, some for three weeks. A study of the response to that disaster found no panic and a smoothly working emergency-response effort. (It helped that there was ample firewood.) All the official response groups performed as trained; they had valuable experience from previous outages. Though this outage was extreme, they innovated and exchanged roles with other teams, and temporary, emergent organizations materialized to handle the inevitably unanticipated emergencies. Army crews cut firewood; private station wagons hauled in bulk food; the police, with time on their hands since there was no looting or crime, checked homes to find old people in danger of freezing; generators were flown in from distant provinces and trucks were used to power gasoline and diesel fuel pumps. There was endless innovation, and almost no deaths in three January weeks of extreme cold with no power. (Scanlon 2001) (The author of that study, Joseph Scanlon, also documents another successful Canadian disaster response when the Gander, Newfoundland, airport suddenly received thirty-eight diverted flights with 6,600 passengers at the time of the September 11, 2001, World Trade Center and Pentagon attack. The town of 10,387 grew overnight by 63 percent. Careful emergency planning and experience with diverted flights were the key to the amazingly flexible and successful response. [Scanlon 2001; Scanlon 2002])

Americans would not be as resilient if there were a terrorist attack on multiple sites of the U.S. grid. If one generating station were to fail because of an industrial accident, a storm, or even a terrorist attack, the consequences would not be great. The grid would find other sources to make up for the lost power. But if there are multiple failures (which is what we had in the 2003 Northeast blackout), the fragile web not only breaks in multiple places but brings down other stations in a cascade of failure. If the attacks not only disrupted power supply but also ruined generating equipment, much of which is custom made, it would be more serious.

I think the terrorist threat to the grid is very low, but if terrorists were able to electronically break into the control system of more than one large generating facility, their cyber attack could shut down a large portion of the national grid. Al Qaeda documents from 2002 suggest cyber attacks on the electrical grid were considered. We know there have been attempts by hackers, but probably not terrorists, to break into what are called the SCADA control systems (supervisory control and data acquisition) that run large plants. One utility reported hundreds of attacks a day, and a computer security expert said that there were a “few cases” where they have had an impact. Security consultants say that they were “able to penetrate real, running, live systems” and could disable equipment that would take months to replace. Terrorists could do this from another country without leaving a trace. (Blum 2005; GAO 2004b) They could also disrupt the national grid with a few well-placed small bombs at transmission towers. (A major concentrated line linking California and the Northwest is at the California-Oregon border.) Environmental extremists have blown up transmission towers in California. A large, well-organized terrorist group could attack all three of the major grids at the same time. Such an attack, or one on the operating systems, would do far more damage in 2006 than it would have in 1990, before the grid was deregulated and long-distance transfer of energy became a lucrative business proposition. The grid now is a larger, more attractive target. Its recent concentration can have disastrous consequences for our society.

THE GRID

Our national power grid, made up of three independent grids that are loosely tied together, is an example of a highly decentralized system. It has grown steadily and rapidly as the population, urban concentrations, and reliance on electricity have grown. Its structure is flat and slim, with a voluntary coordinating group formed in 1968, the National Electric Reliability Council, at the head, presiding over ten regional councils. Below that, the structure and the nomenclature are changing yearly. Some say there are 150 electric control operators; others speak of 120 “cohesive electrical zones.” Either way, most of these are run by the dominant organizational form, investor-owned utilities—that is, for-profit utilities. In the works are regional transmission operators and independent system operators (ISOs)—there are five ISOs now—encouraged by federal legislation. These will be inserted between the ten councils and the 150 electric control operators to facilitate transmission and coordination. At the bottom are more than three thousand local utilities and, in addition, many other diverse organizations, mostly industrial plants, that generate and sometimes sell electricity, constituting 12 percent of output in 1998. This network represents an enormous investment, including more than 15,000 generators in 10,000 power plants and hundreds of thousands of miles of transmission lines and distribution networks, whose estimated worth is more than $800 billion. In 2000, transmission and distribution alone were valued at $358 billion. Considering the size of the grid, its structure is very flat.

The three grids are the Eastern Interconnected System, covering the eastern two-thirds of the United States and adjacent Canadian provinces; the Western Interconnected System, consisting primarily of the Southwest, areas west of the Rocky Mountains, and adjacent Canadian provinces; and the Texas Interconnected System, covering Texas and parts of Mexico.

It is hard to imagine what has been called the world’s largest machine (a name also given to the Internet) as having only five hierarchical levels (resembling the Internet), with the top being a voluntary council and the bottom having more than three thousand units. But while there are central control rooms for each of the three national grids, managed by the North American Electric Reliability Council, and control rooms for the 150 control operators, all of these would better be labeled coordinating rooms, since they act more as traffic police and adjusters, responding to electronic signals that convey the state of the system. They rely on increasingly sophisticated automatic devices.

Massoud Amin, an authority who has written extensively on the prospects of self-organization through electronic “intelligent agents” on the grid, says it cannot have a centralized authority because of its size and complexity: “No single centralized entity can evaluate, monitor, and manage all the interactions in real-time.” (Amin 2001) Demand decisions are, of course, made by the final consumers, as in any economic enterprise, but the decisions about balancing the system—how much power, and from which generating stations, will be put on which of several possible transmission lines to be sent to the distributor—are made either by the 150 ISOs, where they are in operation, or by much smaller facilities in the areas that are still regulated. (Surprisingly, most of the system has not been fully deregulated. Many states in the Southeast have resisted, for example.) These “decisions,” moreover, are increasingly not made by humans, though humans of course have designed the “agents” and the decision rules they will follow. Decisions are made by electromagnetic and, increasingly, electronic controls. A government report puts it as follows: “[T]hrough proper communications (metering and telemetry), the control center is constantly informed of generating plant output, transmission lines and ties to neighboring systems, and system conditions. A control center uses this information to ensure reliability by following reliability criteria and to maintain its interchange schedule with other control centers.” (Administration 2000b) (That was in 2000; since then, information is being shared less because of increased competition.)
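
The balancing logic these control centers automate can be made concrete. A standard industry quantity is the area control error (ACE), which combines a control area’s deviation from its scheduled interchange with its share of the duty to hold frequency at sixty hertz. The following minimal sketch, in Python with invented numbers and hypothetical names throughout, shows only the shape of that loop, not any actual ISO’s software:

# A minimal sketch of the balancing loop a control center automates.
# All names and numbers here are hypothetical, not any real ISO's code.

SCHEDULED_FREQ_HZ = 60.0
FREQ_BIAS = -50.0  # MW per 0.1 Hz of frequency error (invented setting)

def area_control_error(interchange_mw, schedule_mw, freq_hz):
    """Area control error (ACE): deviation from the interchange
    schedule plus the area's frequency-support obligation.
    Negative ACE means the area should generate more."""
    interchange_error = interchange_mw - schedule_mw
    freq_error = freq_hz - SCHEDULED_FREQ_HZ
    return interchange_error - 10.0 * FREQ_BIAS * freq_error

def control_step(generators, telemetry):
    """One pass of the loop: read telemetry, compute ACE, and nudge
    each dispatchable generator toward zeroing it."""
    ace = area_control_error(telemetry["interchange_mw"],
                             telemetry["schedule_mw"],
                             telemetry["freq_hz"])
    nudge = -ace / len(generators)  # spread the correction evenly
    for g in generators:
        g["output_mw"] = min(g["max_mw"],
                             max(g["min_mw"], g["output_mw"] + nudge))
    return ace

# Frequency sagging slightly and imports running above schedule:
gens = [{"output_mw": 300.0, "min_mw": 100.0, "max_mw": 500.0},
        {"output_mw": 450.0, "min_mw": 200.0, "max_mw": 600.0}]
print(control_step(gens, {"interchange_mw": -120.0,
                          "schedule_mw": -100.0,
                          "freq_hz": 59.98}))  # about -30: generate more

A real control center runs this loop continuously, respects ramp rates and economics, and coordinates its corrections with its neighbors; but the principle (measure, compare to schedule, nudge generation) is just this.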

According to Amin, only some coordination (as of 2000) occurs without human intervention; much is still based on telephone calls between the utility control centers. (Amin 2001, 24) Indeed, when our transmission system as a whole is considered, we have what has been called a first-world country with a third-world grid, with much of the equipment for transmission dating from the 1950s. But automated coordination is a realistic goal, according to Amin. (In an unpublished editorial submission, he deplores the response to the 2003 blackout, citing short-term fixes of problems that will only expand the scope of future disasters, and the failure to fund research for more sophisticated devices. [Personal communication, 2003]) In an enthusiastic 2000 article, he wrote:

 

[I]ntelligent agents will represent all the individual components of the grid. Advanced sensors, actuators, and microprocessors, associated with the generators, transformers, buses, etc., will convert those grid components into intelligent robots, fixed in location, that both cooperate to ensure successful overall operation and act independently to ensure adequate individual performance. These agents will evolve, gradually adapting to their changing environment and improving their performance even as conditions change. For instance, a single bus will strive to stay within its voltage and power flow limits while still operating in the context of the voltages and flows imposed on it by the overall goals of the power system management and by the actions of other agents representing generators, loads, transformers, etc. (Amin 2000)

 

The automatic devices planned and being installed may be close to what has been called cognitive agents, or intelligent agents; that is, the agent learns from repeated experience. Just as voice-recognition programs grow in complexity and improve greatly as they encounter slightly different contexts, so do these agents.
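
To give a flavor of what such an agent might be, consider a deliberately toy sketch: a fixed-location controller that senses only its own bus voltage, nudges it back toward a setpoint, and crudely adapts its correction strength from the results of past actions. Everything here (the class, the numbers, the learning rule) is hypothetical illustration, not a description of any deployed system:

# A toy stand-in for Amin's "intelligent robot, fixed in location."
# Hypothetical throughout; real grid agents are far more elaborate.

class BusVoltageAgent:
    def __init__(self, target=1.0, band=0.05, gain=0.1):
        self.target = target  # per-unit voltage setpoint
        self.band = band      # deadband: no action while within limits
        self.gain = gain      # correction strength, tuned by learning

    def act(self, voltage):
        """Return a corrective nudge (a stand-in for reactive-power
        support); do nothing while the bus is inside its band."""
        error = self.target - voltage
        return 0.0 if abs(error) <= self.band else self.gain * error

    def learn(self, error_before, error_after):
        """Crude adaptation: back off after an overshoot, push
        harder when the last correction barely helped."""
        if abs(error_after) > abs(error_before):
            self.gain *= 0.5
        elif abs(error_after) > 0.5 * abs(error_before):
            self.gain *= 1.2

agent = BusVoltageAgent()
voltage = 0.90  # a sagging bus, in per-unit terms
for _ in range(10):
    before = agent.target - voltage
    nudge = agent.act(voltage)
    if nudge == 0.0:
        break  # back within limits: nothing to do, nothing to learn
    voltage += nudge  # pretend the nudge acts on the bus directly
    agent.learn(before, agent.target - voltage)
print(round(voltage, 3), round(agent.gain, 3))  # 0.956 0.249

A deployed agent would act through real actuators (capacitor banks, transformer taps) and coordinate with its neighbors; the sketch shows only the pattern Amin describes: local sensing, local action, adaptation from experience.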

Though I have not found an explicit discussion of this, it would follow that as new areas are added to the grid, they would continually organize themselves (the loose parallel to self-organizing systems), and also prompt the reconfiguration of the section of the grid they are attached to. While adjustments are made by operators, the relays, breakers, and multitude of other devices are not individually adjusted by hand, but through automatic devices. This kind of expansion, or growth dynamic, is not possible with a chemical plant, a hospital, or the CIA, no matter how decentralized they might try to be in the conventional sense. It does not mean that the utilities—say, Consolidated Edison or Pacific Gas and Electric—are organized this way; these remain centralized bureaucracies. But the grids the utilities attach to appear to be these radically decentralized, self-activating, almost self-organizing systems, run by professionals who are reasonably independent of the utilities, though the utilities fund them.

It takes a very high level of technology to design and build the billions of parts and thousands of intelligent agents that go into the system, and to design coordinating rules. The system has grown in an evolutionary way; no central authority planned it and laid it down at one time. Many mistakes have been made along the way and corrected, and “congestion”—where a transmission line threatens to become overloaded—is a persistent problem with rapid growth of the system. The system still has many reliability flaws, but most result only in minor disruptions, generally associated with peak demands during hot, humid weather.

It is true that there have been very serious blackouts in the United States, and normal-accident theory would say serious failures are to be expected because of interactive complexity and tight coupling. Equipment failures in the transmission lines and unanticipated shutdowns of generating plants are to be expected, but they may interact in unanticipated ways with each other and with peak load demands, or with the phenomenon of reverse flow of current because of “loops.”1 “Normal accidents”—simultaneous, interacting, multiple failures that are unpredictable and even hard to understand—are quite rare, as normal-accident theory would expect, and remarkably so, given the excessive demands on the system and its size and complexity. Much more likely to cause failures are single-point failures, such as overload because of weather and demand. (I will deal later with deliberate destabilization to manipulate prices, as in the Enron case in California.) In the 1990s, demand in the United States increased 35 percent but capacity increased only 18 percent. (Amin 2001) One does not need a fancy theory to explain large failures under those conditions. (Indeed, one does need a fancy theory, such as network theory, to explain why there were so few failures under these conditions.)

Failures can be large: one failure in 1996 affected eleven U.S. states and two Canadian provinces, with an estimated cost of $1.5–$2 billion; the 2003 failure in the Northeast was even larger, at $4–$10 billion. But most are local. The arrival of deregulation in the second half of the 1990s took its toll. In the first half of that decade there were forty-one major outages, affecting an average of 355,000 customers each. That is about eight medium-sized cities each year, but in terms of the nation, very small. In the second half, as deregulation began to take hold, there were fifty-eight major outages, affecting an average of 410,000 customers, a 41 percent increase in the number of outages and a 15 percent increase in the average number of customers affected by each one. The 2003 Northeast grid failure was our most severe, and we came within seconds of an equally large one in 2005. So the grid is becoming less secure.

But I am more struck by the high technical reliability of the power grids before deregulation than by the few serious cascading failures they have had over the decades. Without centralized control of the three vast grids, despite the production pressures of mounting demand and despite increased density and scope, they muddle through remarkably well when not being manipulated by energy wholesalers and starved of investment by the concentration of energy producers. What can we learn from this that might be of use in redesigning other parts of our critical infrastructure?

LESSONS FROM THE GRID?

These principles seem to have been at work:

(1) It has had bottom-up growth. The network evolved gradually as needed for reliability. It was not imposed from the top, either by a few dominant for-profit firms or by a strong central government. (State and federal agencies played monitoring and regulating roles, entering the scene only to deal with evident problems.) Replicating this feature in other existing systems in our critical infrastructure will be difficult. Something other than evolution is needed to reduce the concentrations of hazardous materials or populations in risky areas.

(2) The network formed on the basis of independent units voluntarily coming together for reasons other than horizontal market control (though that has always been a danger and is a large one at present). Horizontal market control is when a few large entities in the same business coordinate practices (collude) to achieve above-market returns, or excessive profits. (Vertical market control is where one organization integrates key functions and sets excessive prices, without fear of competition.) The local utilities cooperated with each other in order to increase reliability. Since they were largely monopolies, they could cooperate without fear of price competition; local governmental bodies were only to ensure that the rate of return was fair. (Admittedly, a weakness of the system was that the utilities had undue influence over some local bodies, receiving above-normal rates of return, sometimes buried in deceptive accounting schemes.)

(3) Shared facilities (transmission lines in this case) are routinely accessed and priced at cost, or close to it. (With deregulation, the transmission firms must be competitively priced, turning a common good into a commodity.) This exists in much of our transportation system now and in the Internet; it could be explored for the gas and oil pipelines critical for our infrastructure, with the federal government setting fair rates of return and incentives for modernization. Much of our critical infrastructure could be defined in terms of a common good, subject to regulation, and not as vulnerable to price competition or unregulated monopoly. Supreme Court rulings in recent decades have greatly narrowed the scope of common goods.

(4) Members support independent research and development facilities for technical improvements that are shared by all. R&D investments have declined since deregulation because the market does not value them. If firms were required to contribute to an R&D fund, this amount would be removed from stock price evaluations. Something similar could be done where there is private underinvestment in security and reliability in industries with concentrations of hazardous materials and in other parts of the critical infrastructure, including telecommunications.

(5) Oversight bodies (generally voluntary) establish broad regulations and accounting practices (labeled commonality interdependency in chapter 9). This enables firms to establish interoperability where facilities are shared. It is a curse of decentralized systems that interoperability is difficult to achieve. Strong oversight bodies are required to ensure independent invention and innovation that still provides interoperability. Still, the electric power industry has done reasonably well here, and the Internet has done very well, so it is possible in these two very critical parts of the critical infrastructure. Less oversight is needed in other parts, since interoperability is less essential.

While I would argue that these characteristics of a sound network fostered the reliability and increased efficiency needed for this key element of our critical infrastructure, it appears that the electric power deregulation that started in earnest with the 1992 legislation and was extended in 1999 legislation will change its character. While officially designed to increase competition and thus lower prices, deregulation shows signs of increasing consolidation, and thus market control by the largest firms, which could lead to excess profits and less reliability. The price of electricity, controlling for fuel costs, was falling before 1992 but has risen steadily since then, though deregulation was supposed to have lowered it. It is worth reviewing recent developments to see where the dangers might lie.

DEREGULATORY DANGERS

Electric power over the past fifty years is a good example of the possibilities of networking, but it also illustrates the perils. Networks are less subject to predatory behavior than other economic structures, such as hierarchies or multidivisional firms, because units cannot accumulate commanding power, reciprocity is more evident, and there is an awareness of a common fate. Nevertheless, they are not immune to predators. From the 1960s through 1992, the utilities were quite independent, loosely regulated, and gradually linking up with each other to ease demand fluctuations and to handle temporary outages. The connections were through the transmission lines, of course, and as these connections grew in number and significance, technology evolved to routinize and stabilize the intermittent exchange of power and to meter and price the exchanges.

Think of a group of manufacturers, making, say, complicated machine tools. They manufacture many parts for the tools. One or the other might run short on a part, call up a nearby manufacturer, and purchase some if the latter had the capacity to increase production. Next week the favor might be returned, at a price of course. If such exchanges became frequent, regularized systems of inventory and exchange might be installed, and a network established. If the volume exchanges became significant, one firm might specialize in handling these exchanges, perhaps forming a separate organization to do so. This is what happened to the power grid, as exchanges needed to ensure reliability increased. Common standards evolved, regulated by councils and the government.

The local utilities were monopolies, justified as “natural monopolies” because it would be wasteful to have competing plants and transmission lines in the same locale, and they were considered a common good in that everyone used them. They were regulated primarily by their local authorities, with regulators setting maximum returns for the utilities, and the utilities kept excess generating capacity in reserve to meet surges in demands and temporary failures in production or transmission. (Today these local utilities are still monopolies but are less often independently owned. They are being consolidated into holding companies and large ownership groups and thus are not subject to local community control.)

At this point, before deregulation, we had a large number of independent units linked together to share power when needed, in a quite decentralized web, or network. It allowed new units to be established to accommodate growth in the market with only minor adjustments at the edges of the system. Technological advances made many of these exchanges fairly automatic, such that few supervisory levels needed to be added to the system; it remained quite “flat,” and very reliable.

If we had remained that way, it would have been a model for a number of industries and services that form our critical infrastructure. For example, the power grids and transportation systems have a common problem, congestion, which can disrupt the grids and, in transportation systems, block rapid evacuation and the rapid movement of emergency equipment. Adding transmission lines is difficult in urban areas (because of high land values and safety considerations), and adding more highways or airports is similarly difficult, even though in both cases it is the crowded urban areas that most need the increases. Both systems attempt to increase the capacity of their existing transmission facilities. What the power system had was separate, independent, local generation and transmission facilities called independent power producers (IPPs), along with (1) research and development agencies they supported with fees (principally the Electric Power Research Institute, with a substantial budget, responsive to the locals) and (2) coordinating agencies that set standards and policed their members, principally the Electric Reliability Councils. In keeping with the best web designs, the organizations above the IPPs coordinate, police, and manage the system, rather than direct it.

The transportation system, in contrast, consists of multiple organizational interests with considerable clout: vehicle manufacturers, highway construction companies, insurance companies, governments from the national to the local level, and the police. Some are concerned with only their part of the system. In the case of truck and auto transport, the automotive companies compete with one another and share few common interests with traffic management systems within and between cities. Though the Department of Transportation attempts to coordinate and standardize devices that would reduce congestion, neither the cities nor the auto manufacturers are responsive. Manufacturing is highly centralized in a few auto companies, giving them the power to resist decongestion devices in the vehicles and in road design, and neither cities nor users are able to insist on necessary changes in vehicle designs. Cities themselves are not coordinated by any regional or national agency, nor are there any major R&D facilities that serve the industry, only scattered programs, mostly in universities.

This shows the difference between a decentralized system, such as the power grid, especially before deregulation, and a fragmented system. Decentralized systems have coordinating bodies at the top and incentives to cooperate to maintain the system. Fragmented systems lack these.

In Japan and Europe, where the national governments are much stronger, intelligent transportation systems (ITS) are much more advanced; even Europe, with its many sovereign nations, manages to cooperate and coordinate better than the states within the United States. One of the oldest and simplest ITS devices, the “easy pass,” allowing one to drive through a tollgate without stopping and receive a bill later, is not standardized in the United States. Truck drivers must have a dozen or more stickers or placards as they traverse the nation, and information on the vehicle and its contents, vital for safety, is generally not registered by the toll station. A much larger percentage of European and Japanese vehicles are equipped with driver-assistance devices warning of dangers and showing alternative routes, often responding to sensors in the roadway (these are lacking in the United States), thus reducing congestion and accidents. The transportation systems of Europe, despite the handicaps of multiple nations and cultures, high density, ancient cities, and no room to expand, appear to have managed to combine centralized planning and coordination with local authority on route planning and local safety concerns. So has Japan.

THE LINKS IN THE NETWORK

The electric power grid and the Internet are similar in that they are both one big system. But transferring electricity is quite different from transferring electronic signals. Distances on the Internet are largely irrelevant; signals from England to Spain will generally pass through North America. The electric power grid in North America is also one big machine, in that everything is or can be connected, but the distance between the place where the power is generated and the place where it is consumed makes a great deal of difference. There are problems with congestion on the Internet, especially with the advent of large video and music files, but the Internet’s electronic impulses are, in a sense, simple. They have addresses that determine where they will go, and the means to select the most efficient routing. In contrast, when electricity is generated, it must go into a transmission line (it cannot be stored, or held back, as electronic impulses can), and it will flow through the most open line, making the routing a complex matter all along the way. Sending electricity is more complex, and the complexity increases with the length of the transmission and the intricacy of the interconnections, neither of which is true of the Internet.

The burgeoning field of network analysis pays almost no attention to the links between nodes, but the difference between the Internet and other network forms is important. Packets of information on the Internet can travel over diverse paths and be recombined (because each packet carries its own addressing information), and the paths show incredible diversity because of the dense connections among nodes. Overloading on the Internet just slows things down a bit; on the power grid it heats things up and interferes with voltages and such things as reactive power. On the grid, electricity on the transmission lines will seek the shortest path that is left open by relays, even if this results in overloads, improper voltages or phase transitions, and other causes of failure. The links are crucial. (They are even more crucial in two other network forms we will discuss: networks of small firms and terrorist networks.) We should think of the grid as a large number of clusters of cells where the links between the clusters are best kept short, to prevent instabilities in the current that is transmitted.
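
The contrast is easy to put in code. In the rough “DC” approximation engineers use, a transfer between two points divides among the parallel lines connecting them in inverse proportion to their impedances; there is no header, as in a packet, that can steer it elsewhere. A minimal sketch, with invented numbers:

# Flows divide by physics, not by address. A rough "DC" sketch:
# each parallel line carries a share proportional to 1/impedance.

def split_flow(total_mw, impedances):
    """Split a transfer across parallel lines in inverse
    proportion to their (invented) impedances."""
    admittances = [1.0 / z for z in impedances]
    total = sum(admittances)
    return [total_mw * y / total for y in admittances]

# 900 MW moving between two regions over two parallel corridors:
print(split_flow(900, [1.0, 2.0]))  # [600.0, 300.0]
# Trip the low-impedance line, and the survivor takes everything,
# whatever its thermal rating happens to be:
print(split_flow(900, [2.0]))       # [900.0]

The second call shows the seed of trouble: lose one path and the physics reroutes everything onto whatever remains, with no regard for its rating.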

Changes in the generation and transmission at any point in the system will change loads on generators and transmission lines at every other point—often in ways not anticipated or easily controlled. Frequencies and phases of all power-generation units must remain synchronous within narrow limits, but the limits are even narrower on the grid, where a small shift of power flows can trip circuit breakers and trigger chain reactions. (Lerner 2003) (A relay with a faulty setting brought about the 1965 blackout.)
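
That chain reaction is also easy to caricature. The toy model below trips the most overloaded line, dumps its flow onto the survivors, and repeats; the equal-redistribution rule and all the ratings are invented (real flows redistribute by impedance, as in the sketch above):

# A caricature of a cascade: trip the worst overload, dump its
# flow on the survivors, repeat. Ratings and flows are invented.

def cascade(lines):
    """lines maps a name to [flow_mw, limit_mw]; returns the
    sequence of lines tripped before the system is stable."""
    tripped = []
    while True:
        overloaded = [n for n, (f, lim) in lines.items() if f > lim]
        if not overloaded:
            return tripped
        worst = max(overloaded, key=lambda n: lines[n][0] / lines[n][1])
        orphan = lines.pop(worst)[0]
        tripped.append(worst)
        if not lines:
            return tripped  # nothing left to carry the load
        for entry in lines.values():
            entry[0] += orphan / len(lines)  # crude reallocation

# One overloaded line takes down two healthy neighbors in sequence:
print(cascade({"A": [90.0, 100.0], "B": [95.0, 100.0],
               "C": [120.0, 100.0]}))  # ['C', 'B', 'A']

Seven minutes from first trip to eight blacked-out states in 2003 was, in essence, this loop running in real time.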

These conditions have led some engineers to say that the grid is vastly overextended because of longer transmission distances, and that regulation should restrict the amount of “wheeling,” which means transferring power from one point to another via a third party (a transmission company). Most engineers disagree, at least judging from the published literature (which, of course, reflects the industry’s point of view). Their argument is that with the proper incentives—the chance to increase profits—the industry will invest in more lines and better facilities, which will allow safe long-distance transmission, and they say the power loss from long-distance transmission is not large. In effect they say that if all the “intelligent agents” that Amin speaks of are inserted, the system will behave more like the Internet and be as risk-free. Shortly we will find that other experts disagree.

INDUSTRY CONSOLIDATION

Starting in the 1970s, deregulation of common carriers, such as air and truck transport, began, and in the 1992 Energy Policy Act this strategy was applied to electric power, on the grounds that there was insufficient incentive to improve efficiency, that excess generating capacity was wasteful, and that the cost of electricity varied widely between areas (generally states and regions). (The act also allowed the industry to make campaign contributions for the first time since the scandals of the 1930s. The interests of both Congress and the industry were thus “aligned.”)

(I am drawing freely from a report by Eric Lerner [2003] of the American Institute of Physics, a critic of deregulation. See also the work of former utility executive John Casazza, who predicted the increased risk of blackouts because of deregulation and has become an increasingly virulent critic of the industry he left. [Casazza 1998; Casazza and Delea 2003] He is routinely, but not always convincingly, contradicted by industry spokesmen. My other main source is the government’s Energy Information Administration, which publishes annual reviews of the industry and, of course, favors deregulation.)

One technical argument for deregulation was that peaks and failures could be accommodated by more extensive wheeling. The economic argument for transmitting power over longer distances was that the Northwest and the Southeast had cheap power and it should be sent to areas where power was expensive. Wheeling was already extensive and grew at an annual average rate of 8.3 percent between 1986 (well before the post-1992 reforms) and 1998. (Administration 2000a, 24) Further deregulation occurred in 1999, and wheeling was to take a steep rise after 2000.

Deregulation also allowed utilities to buy up one another, since control by local governments was reduced, and deregulation required the unbundling of generation, transmission, and distribution. Unbundling generation and transmission would allow more flexibility, though most utilities still both generate and transmit. The Energy Information Administration (EIA), a part of the U.S. Department of Energy, gives the details: expensive production facilities were closed down, and technological advances meant that it was no longer necessary to build a 1,000-megawatt generating plant to exploit economies of scale; combined-cycle gas turbines are maximally efficient at 400 megawatts, and aero-derivative gas turbines can be efficient at 10 megawatts. (Administration 2000a, ix)

Predictably, and intentionally, mergers took place. From 1992 to 1999, the investor-owned utilities (IOUs) were involved in thirty-five mergers, and an additional twelve were pending approval. The size of the IOUs correspondingly increased. The ten largest in 1992 held 36 percent of IOU-held generating capacity; this rose to 54 percent by the end of 2000. In 1992 the twenty largest already owned 58 percent; this rose to 72 percent in 2000. (Administration 2000a, x) While there are more than three thousand utilities, most are small and inconsequential; 239 IOUs own more than three-quarters of the capacity. This is a substantial consolidation of the industry; consolidation would somehow increase competition, it was curiously believed. (It should have lowered prices, but it has not. Historically, prices had been in a slow but steady decline for fifty years, but between 1993 and 2004 the retail price for all sectors rose by 9 percent, and for residential customers, 8 percent. Deregulation was supposed to reduce the differences between localities, such as states. It has not. Idaho’s rates are still half those of Connecticut, and similar though somewhat smaller differences obtain between other adjacent states.) (Administration 2005)

THE VIRTUES OF INDUSTRY CONSOLIDATION?

Rather than seeing the system up until the 1990s as resilient, reliable, and expandable, the EIA describes it as “balkanized”: “To better support a competitive industry, the power transmission system is being reorganized from a balkanized system with many transmission system operators, to one where only a few organizations operate the system.” (Administration 2000a, ix) (Some believe that the prediction that “only a few organizations” will operate the system is ominous indeed, and it suggests the immense consolidation the industry leaders may be seeking.) The EIA report clearly believes that regulation has hampered efficiency. It does not deny that regulation was necessary in the past, and it cites the 1935 Public Utility Holding Company Act that put the Securities and Exchange Commission in charge of “policing the widespread abuses of the large holding companies” and “aimed at breaking up the unconstrained and excessively large trusts that then controlled the Nation’s electric and gas distribution networks.” (29)

But the EIA believes that act has outlived its usefulness because the economic institutions that protect customers and investors are all working. As evidence, it cites the changes in recent decades that have precluded the predatory behavior of the trusts of the past. It is a remarkable list, since every one of the “changes” was proved to be illusory by the scandals of 2001 and 2002, and this was foreseeable even as the agency was writing its report:

 

• The development of an extensive disclosure system for all publicly held companies

• The increased competence and independence of accounting firms

• The development of accounting principles and auditing standards and the means to enforce them

• The increased sophistication and integrity of securities markets and securities professionals

• The increased power and ability of State regulators. (49)

 

Not a single one of these promises has proved viable.

This is not to say that the decentralized monopolies of the past fifty or so years did not need reform legislation and invigorated regulation. Rates varied greatly, with some states charging more than twice as much as others, often without such justifications as the federal hydroelectric power that keeps rates low in the Northwest, or other contextual differences. Innovation in transmission was probably not aggressively pursued by comfortable monopolies that saw little need for it. But there was no great need for long-distance transmission, and thus no need to unbundle long-distance transmission from local generation by allowing firms to emerge that specialized in transmission. Two things made the changes dangerous.

First, the physics of the grid makes long-distance transmission wasteful because power is lost along the way in the form of heat (there is disagreement on the significance of the loss), and, more important, it raises the risk of uncontrolled interactions. The basic problem is a “collision between the physics of the system and the economic rules that now regulate it,” says Lerner. (Lerner 2003) The vast system of electricity generation, transmission, and distribution that covers the United States and Canada is essentially a single machine in one respect. But everything is not effectively connected to everything else. Instead, clumps or cells of local activity are connected to the geographically adjacent cells—or at least that was the case before deregulation. The interconnections were there to insure against a sudden loss of power, so they were not in steady use.

After deregulation, the interconnections were heavily used. Engineers warned that increased long-distance trading of electric power would create dangerous levels of congestion on transmission lines where controllers did not expect them and could not deal with them. The problem would be compounded as independent power producers added new generating units at essentially random locations determined by low labor costs, lax local regulations, or tax incentives. If generators were added far from the main consuming areas, the total quantity of power flows would rapidly increase, overloading transmission lines. “The system was never designed to handle long-distance wheeling,” noted Loren Toole, a transmission-system analyst at Los Alamos National Laboratory. (Lerner 2003)

A further problem was that the separation resulted in an inadequate amount of something called reactive power, which is current 90 degrees out of phase with the voltage. It is needed to maintain voltage, and longer-distance transmission increases the need for it. However, generating companies are the main producers of reactive power, and with the new rules, they do not benefit from it. Its production reduces the amount of salable power produced. So transmission companies, under the new rules, cannot require generating companies to produce enough reactive power to stabilize voltages and increase system stability. Lerner notes: “The net result of the new rules was to more tightly couple the system physically and stress it closer to capacity, and at the same time, make control more diffuse and less coordinated—a prescription, engineers warned, for blackouts.” (Lerner 2003)
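
The relations behind this are textbook alternating-current theory, not anything particular to the sources cited here. For root-mean-square voltage V and current I separated by phase angle φ, the real, reactive, and apparent power are

$$P = VI\cos\varphi, \qquad Q = VI\sin\varphi, \qquad S = \sqrt{P^2 + Q^2}.$$

Real power P does the useful work; reactive power Q does no net work but sustains voltage along the line, and long, heavily loaded lines consume it. Because equipment is rated by apparent power S, every unit of Q a generator supplies displaces salable P, which is precisely the disincentive the new rules created.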

THE PROBLEMS OF CONSOLIDATION

These were some of the technical arguments for opposing the kind of deregulation Congress and the Clinton administration set in motion. But there were political ones as well. There was the risk that energy wholesalers, made possible by unbundling transmission and generation, would become speculators—gaming the system, as Enron, Dynegy, El Paso, and other firms did. Unbundling would require aggressive regulation to prevent predatory pricing, but regulation was not what business or the administration wanted. The separation of generation, transmission, and distribution allowed a fourth party to play a role: wholesalers such as Enron, who reconciled the difference between the generator with the cheapest electricity and the distributor who would pay the highest price. Looking for the best bargain, wholesalers were supposed to reduce price differentials, that is, perform an arbitrage role, which can be the role of the speculator. (There were no provisions for regulating these wholesalers. Unregulated arbitrage invites costly and inefficient manipulation, as the California episode made clear.)

The transmission part of the system became something of an orphan. Those producers that owned transmission lines were required to carry the power of competitors over their lines—the lines were defined as a common good, like telephone lines and cables in the Internet. But common goods that can be manipulated require regulation, and consumer groups and publicly owned utilities protested that without careful regulation and expansion, there could be technical problems and abuses. After four years of litigation, the Supreme Court upheld the new regulations on transmission and the separation of production and distribution in 2000. Producers and distributors used the lines, but they did not want to build new ones as the population grew and demand grew even faster, and communities resisted adding lines. With more energy being wheeled, the lines became overloaded.

In March 2000, the warnings began to come true. Within a month of the Supreme Court decision requiring transmission lines to be open to all, thus suddenly increasing the economic value of long-distance wheeling on a grid not designed for it, electricity trading skyrocketed, as did stresses on the grid. The number of events where line loads were relieved by shifting power to other lines increased sixfold in just two months. The average hourly frequency deviation from sixty hertz (a measure of safety) went from 1.3 millihertz in May 1999 to 4.9 in May 2000 and to an alarming 7.6 by January 2001. As predicted, the new trading had the effect of overstressing and destabilizing the grid.

Not only did energy companies run things to the limit of capacity, but they gamed the system, causing widespread blackouts and brownouts immediately after the March 2000 Supreme Court ruling. Federal investigations showed that employees of Enron, Dynegy, and other energy traders “knowingly and intentionally” filed transmission schedules designed to block competitors’ access to the grid and to drive up prices by creating artificial shortages, all the while chortling over the phone about defrauding poor widows. In California, this behavior resulted in widespread blackouts, the doubling and tripling of retail rates, and eventual costs to ratepayers and taxpayers of more than $30 billion. (It also led to the recall of the governor, who was blamed instead of the energy companies.) In the more tightly regulated Eastern Interconnection, retail prices rose less dramatically. Nationally, the cost of electricity, excluding fuel costs, increased by about 10 percent in just two years after 2000. (Lerner 2003) The vaunted savings from deregulation have never been realized.

After a pause following Enron’s collapse in 2001 and a fall in electricity demand (partly due to recession and partly to weather), energy trading resumed its frenzy in 2002 and 2003. Although power generation in 2003 had increased only 3 percent above that in 2000, generation by independent power producers, a rough measure of wholesale trading, doubled by 2004. System stress has soared, and with it, warnings by the Federal Energy Regulatory Commission and other groups. The stress was revealed in the August 14, 2003, blackout of the northeastern United States, to which we will turn shortly.

That is the minority view, critical of deregulation. The majority view emphasizes the role of the free market. First, if the transmission companies have enough of a profit incentive, they will invest in the technologies that will eliminate the kind of problems the critics refer to. One should be allowed to be skeptical about this. Shareholders in these companies are unlikely to take a long-run point of view, seeing their returns drop in order to improve reliability. It is in the interests of the companies to keep operating, of course, but a massive blackout only interrupts their income for a short time. A massive blackout will cause billions in losses, but not to the shareholders or the company. Only strict federal regulation is likely to force the companies to spend the kinds of money needed to make the vast grid highly reliable, and the 2005 energy act is quite toothless in this regard. Even a staunch advocate of deregulation notes this in his excellent analysis of the 2005 energy bill. (Eagle forthcoming)

ENRON

The Enron story does not illustrate any key vulnerability to natural, industrial, or terrorist disasters, but it suggests we have another case of executive failure: widespread malfeasance at the top that brought prison sentences for Kenneth Lay, Enron’s head (not served because of his untimely death), and for Jeffrey Skilling, Andy Fastow, and other top Enron officers. It also illustrates two institutional conditions that increase the vulnerabilities of our electrical power system: the unfortunate consequences of ill-advised deregulation and the lack of government concern with economic concentration in our critical infrastructure. A key advantage of deregulation, according to its supporters, is that it smooths out market fluctuations, thus reducing disparities and the inefficiencies of a market that does not properly “clear.” Furthermore, “Transmission congestion . . . increases consumer costs by frequently denying low-cost transactions in favor of high-cost transactions,” according to a government study. (Energy 2002) However, transmission congestion may be preferred by producers who can sell at higher prices, and this became apparent in the California scandal. (This section draws principally on the first popular book to examine Enron after the scandal broke, Power Failure: The Inside Story of the Collapse of Enron, by Mimi Swartz with Sherron Watkins [2003].)

For Enron, this started with natural gas. Instead of simply trading gas (that is, buying it with a short- or long-term contract from a producer and selling it to a distributor), it started making financial trades with suppliers and distributors based on the movement of the market for gas in general, not particular amounts or sources. It began dealing in energy derivatives, that is, swaps, options, futures, arbitrages (buying an item in a place where it is cheap and selling it in a place that will pay more), and so on. The government regulated such transactions in the gas market through the Commodity Futures Trading Commission, since derivatives presented many opportunities for abuse. But in January 1993, just before the Clinton administration came into office and replaced the first Bush administration, the chairperson of the commission secured a vote that exempted energy derivatives and related swaps from government oversight. The exemption was to have historic consequences. The chairperson securing it was Wendy Gramm, wife of Senator Phil Gramm (R-TX), who was a close friend of Kenneth Lay, the CEO of Enron, and a proponent of free markets. When the Clinton administration took over a few days after the vote, Mrs. Gramm left the commission and joined Enron’s board, earning a $50,000-a-year salary, stock options, and other cash benefits. (68) Freed of oversight and regulation, Enron began trading heavily in energy markets, exhibiting the dynamism that led to a glowing Harvard Business School case and to Fortune and Business Week heaping praise on the “CEO of the year” or the “company of the year” in the late 1990s.

Enron’s creative interpretation of a new accounting rule (Financial Accounting Standards no. 125), issued in June 1996, allowed it to effectively book in a single year all the profit streams expected from a power plant purchase over the next several years. By buying up plants each quarter and declaring on its balance sheet the profits that it anticipated over the next several years, it could show quarterly profits even if the plant failed to produce the expected profits in succeeding years or even failed entirely. (136) It moved into the electric generation business, showing astounding growth and (questionable) profits. (It failed in its attempt at energy delivery and related services.) But the main profits came from trading, free of oversight as a result of the 1993 ruling. Enron traders inserted themselves between enough buyers and sellers of power to make electricity-trading revenues soar, even if gross margins were still negligible. The revenues looked good—it did a lot of business—though a careful look at the actual profit on the business would show it was minuscule. (140)

But something more attractive than revenues was possible. With a lot of power under contract, and an understanding of the key nodes on the grid that the power had to pass through if there were disturbances in the grid, Enron could sell its power at crisis prices, reaping huge profits. Initially, in June 1998, Mother Nature provided the disturbance. A severe heat wave hit the Midwest a month early, unexpectedly catching several power plants off-line, since they were being serviced in preparation for the coming hot weather. Then storms took out some of the backup plants. Some small utilities were caught short and defaulted on supply agreements, and “power that normally sold for $20 to $40 per megawatt-hour under long-term contracts suddenly hit $500 per megawatt hour.” Worse yet, the new free-market rules had attracted many small power marketers, with contracts to sell small amounts from small utilities to the big utilities. They began defaulting as well; they did not have access to the power they were contracted to sell. The big utilities had to turn to the spot market, and now Enron, which owned many of the producers under the consolidation that deregulation had promoted, had much of the spot market. It could wait until prices spiked and sell the power on the spot market. “Prices jumped as high as $7,500 per megawatt hour in a matter of minutes.” One Enron trader made $60 million for the company in one day. Enron was awash in “a sea of cash” even as other energy traders, some quite large but perhaps not as aggressive, failed and left the business. (141) The year was 1998.

Early heat waves and storms were not new, but this crisis was new. Before deregulation, “utilities bought and sold electricity to one another at reasonable prices when such regional shortages appeared.” (141) Now there were middlemen between producers and retailers, and the powerful ones were in the same organization that controlled many producers.

The next California crisis was not caused by weather disturbances but by traders, and it was more severe. In 2000 and 2001, in addition to huge price jumps, there were four days of rolling blackouts, and the state’s largest utility, Pacific Gas and Electric, was forced into bankruptcy. (It is estimated that California’s experiment with deregulation has cost the state more than $70 billion.) Enron, Senator Gramm, White House officials, respected academics such as economist Paul Joskow (he heads a powerful energy research center at MIT, at the time funded in part by Enron), and government bureaucrats all gave reasons for the crisis that were demonstrably not true. There was no extravagant usage by the state’s citizens, as they claimed—California ranked as the second-most efficient energy consumer in the nation. In one crisis month, July 2000, Californians were using less energy than in July 1999, when there was no blackout. Demand for energy never exceeded the state’s capacity. Power usage on blackout days was lower than in previous years. The state had added 170 new generation and cogeneration facilities in the 1990s, giving it enough power to meet its growing needs. It was said that there were inadequate transmission facilities, which is probably true, but that had also been true in the years with no blackouts and high amounts of wheeling.

But there was something new about the transmission lines: traders sat on them, and the lines could be squeezed. Deregulation had created middlemen who profited by buying cheap and selling dear, but some of the middlemen were from companies, like Enron and El Paso, that also controlled the production of power. Swartz and Watkins tell a part of the story so well that I will quote them. They mention Timothy Belden, an Enron star trader (who later pleaded guilty to some minor charges in return for cooperating with federal investigators). On May 24, 1999,

Belden tried to send an enormous amount of power over some aged transmission lines—2,900 megawatts of power over a 15-megawatt path. Such a plan was destined to cause congestion on the line. California had an automated response to overloaded lines. Immediate electronic requests went to all the state’s suppliers—Do you have power coming across this line? Can you remove it? We will pay you to take it away! But on this day, there also happened to be a human watching the wires for California’s Independent System Operator, who couldn’t imagine why someone would send so much power over such a small line. “That’s what you wanted to do?” the dubious operator from the ISO asked when she called to check to be sure that the transaction had not been requested in error.

When Belden replied in the affirmative—“Yeah. That’s what we did.”—she became even more incredulous. “Can I ask why?” (Swartz and Watkins 2003, 240)

Belden could not give an explanation but agreed that “it makes the eyes pop, doesn’t it?” She said she would have to report the transaction to the power grid regulators because it seemed pointless, and Belden concurred. But from Enron’s perspective it was not pointless; it drove the price of electricity up 70 percent that afternoon. After an investigation a year later, Enron was fined $25,000, but it had cleared $10 million that day, and California customers had overpaid by around $5.5 million. “California’s human-free automated system was completely dependent on the honesty of the power suppliers.” (240)
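
The automated congestion-relief mechanism the ISO operator was watching can be sketched in a few lines of code. What follows is a toy model, not the ISO’s actual software; the path rating comes from the quoted passage, but the relief price and the payment calculation are invented for illustration. It shows why scheduling 2,900 megawatts over a 15-megawatt path guarantees a congestion event, one the scheduler may then be paid to relieve.

```python
# Toy model of an automated congestion response (hypothetical relief
# price; not the California ISO's actual algorithm).

PATH_RATING_MW = 15      # rated capacity of the aged path (from the text)
RELIEF_PRICE = 750.0     # hypothetical $/MWh paid to take power off the line

def congestion_relief(scheduled_mw: float, rating_mw: float) -> float:
    """Return the overload (MW) the automated system must pay to relieve."""
    return max(0.0, scheduled_mw - rating_mw)

# Belden schedules 2,900 MW across the 15 MW path.
overload = congestion_relief(2_900, PATH_RATING_MW)
payment = overload * RELIEF_PRICE

print(f"Overload to relieve: {overload:,.0f} MW")
print(f"Hypothetical hourly relief payment: ${payment:,.0f}")
# Nothing in the automated loop checks whether the congestion is genuine;
# the trader who manufactured it is also positioned to profit from it.
```

The point of the toy model is the absence of any honesty check: the system responds mechanically to the overload, which is exactly the dependence on supplier honesty that Swartz and Watkins describe.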

Enron and others were happy to pay the trivial fines when they were occasionally levied. Promised lower rates and better service from deregulation, Californians instead saw wholesale electricity rates jump 300 percent while Enron, Reliant, El Paso, and Dynegy reaped huge profits. (241) Amin’s “intelligent agents” can be programmed to commit fraud. Something other than a technical solution to the transmission problem is needed.

The famous cap that California’s regulatory plan had put on the prices power suppliers could charge customers, inadvisable as it may have been, was not the problem; it did not interfere with the suppliers’ gaming of the system once they had engineered volatility. When the state got the Federal Energy Regulatory Commission (FERC) to install price caps on the suppliers, out-of-state suppliers such as Enron immediately vacated the scene. By December 2000 the lights were going out, and California asked FERC to lift the caps so that power could come in again, which FERC did. The power came back at even higher prices than before, even though there was actually no shortage. One Enron executive became worried about what the company was doing and wrote an eight-page memo detailing the tactics being used: creating false congestion on power lines, as Belden had; transferring power in and out of the state to avoid price caps; and charging for services the company never actually provided. This was about the same time that Texas senator Phil Gramm said in an interview with the Los Angeles Times that Californians were suffering “the consequences of their own feckless policies” and vowed, “I intend to do everything in my power to require those who valued environmental extremism and interstate protectionism more than common sense and market freedom to solve their electricity crisis without short circuiting taxpayers in other states.” (243) And renowned professor Paul Joskow, in a New York Times op-ed piece, chided California for not building enough power plants and praised deregulation. (243) Deregulation had certainly helped one of the contributors to his MIT energy center, Enron.

Disclosures as early as the spring of 2002 made it clear that the professionals running the West Coast grid had been ordered to curtail production, to make inefficient and unnecessary circular trades, and to misrepresent carrying capacity, as the wholesalers used their market control to increase profits. See the several early news accounts of deliberate destabilization by Enron in the California case. (Kahn 2002; Oppel 2002; Van Atta 2002; Berenson 2002)

THE LITTLE ENGINE THAT COULD

The unraveling of Enron, Reliant, El Paso, and other energy companies might not have occurred were it not for a small public utility district thirty miles north of Seattle. The Snohomish district had signed a nine-year contract for power at four times the usual cost in January 2001, when, like areas in California, it was experiencing blackouts. When Enron collapsed and filed for bankruptcy protection, it sued the district for $122 million for canceling what the district considered an illegitimate contract. Rather than collect $400 per customer to pay Enron, the Snohomish district searched for evidence of illegal activity at Enron. Snohomish lawyers tracked down recordings seized by the FBI from Enron’s western trading hub in Portland, Oregon. After a short legal tussle with the Justice Department, the utility was granted access in return for sharing the transcripts with law enforcement, since Justice had no intention of examining the evidence itself. The district hired a consulting firm, which recruited three people to pore over 2,800 hours of tapes. The quotes they found made the national news when the district went to court with them. For example, one trader asks: “Do you know when you started overscheduling and making buckets of money on that?” Traders shouted “Burn, baby, burn” when a forest fire threatened energy supplies. Traders joked about stealing money from California grandmothers and about the possibility of going to jail for their actions. One transcript does us the service of explaining “arbitrage.” It goes:

“He just f–––s California,” says one Enron employee. “He steals money from California to the tune of about a million.”

“Will you rephrase that?” asks a second employee.

“OK, he, um, he arbitrages the California market to the tune of a million bucks or two a day,” replies the first.

(Conversations were recorded because they contain evidence of the verbal contracts that the firms want to preserve.)
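
What the trader called “arbitrage” was, in tactics like those the Enron memo described earlier, closer to cap avoidance: buy power at the capped in-state price, move it out of state on paper, and sell it back exempt from the cap. The sketch below is a toy illustration with invented prices and volumes, not a reconstruction of any actual trade.

```python
# Toy sketch of cap-avoidance "arbitrage" (invented prices and volumes).
# The tactic -- transferring power in and out of the state to dodge
# price caps -- is among those described in the Enron memo noted earlier.

CAPPED_PRICE = 250.0      # hypothetical capped in-state $/MWh
UNCAPPED_PRICE = 1_200.0  # hypothetical uncapped "out-of-state" $/MWh
VOLUME_MWH = 1_000        # hypothetical daily volume

# Buy at the capped price, schedule the power out of state, then sell
# it back as "imported" power exempt from the cap.
profit = (UNCAPPED_PRICE - CAPPED_PRICE) * VOLUME_MWH
print(f"Profit on {VOLUME_MWH:,} MWh: ${profit:,.0f}")
# Roughly "a million bucks ... a day," as the transcript puts it.
```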

It is telling that the little utility district received scant cooperation from federal authorities. The utility went to search Enron’s warehouses in Dallas and turned up evidence that the government first claimed it never had and then said it lacked the resources to transcribe. But the “little engine that could” did the whole job for about $130,000. (Peterson 2005; Egan 2005; Gonzales 2004)

FERC in particular did little to speed up the investigation, and indeed might be charged with suppressing it. The failures of many of our regulatory agencies have increased since the deregulation movement in the latter part of the twentieth century. By 2001, or perhaps earlier, FERC had transcripts of conversations between another power wholesaler, Williams Companies, and a power plant operator in California, showing the two conspiring to shut down a power plant for two weeks to boost electricity prices and Williams’s profits. FERC kept the evidence under wraps for a year and cut a secret deal under which Williams refunded California the $8 million it had obtained through the scam without admitting any guilt. (Leopold 2005) It was a nice deal for Williams; it precluded any criminal charges and any publicity, and the company probably made much more than $8 million in the two weeks. Houston-based Reliant Energy was not so lucky; as of December 2005, it faces millions in fines, and four of its executives face prison time.

It was not until March 2005 that the commission determined that Enron had been engaging in illegal activity at the time it entered into exorbitant contracts with the West Coast states. It was the first time the commission had acknowledged that the contracts were signed under fraudulent pretenses, though it had held evidence since 2001 that at least the Williams company had engaged in such manipulation. It blamed the Federal Power Act, and rightly so; the deregulation legislation overlooked the obvious possibility of gaming the system. In a December 2005 report to Congress, the commission noted that the legislation governing energy markets “did not address market manipulation and there was little in the way of penalty authority. If the express prohibition of market manipulation had been in place then, it is very possible that it would have deterred market participants from manipulating the market because they would have known the serious consequences of their actions.” (Commission 2005) The commission was required by Congress to investigate market manipulation, and it concluded sixty-eight investigations. But the only civil remedies that could apply involved refunds and “disgorgement of unjust profits.” By December 2005 it had facilitated settlements of more than $6.3 billion; other cases had not been settled, and lawsuits are still pending. For all the flaws of the 2005 energy bill, it does contain a provision that is supposed to forbid market manipulation. One wonders why such a provision was not in the law that established deregulation of electric power.

I should stress that this is not a “bad apple” case. The attack on the regulation of electric energy goes back at least to the time when Senator Lyndon Johnson accepted large campaign contributions from Texas energy companies in return for deposing a highly effective official who headed the principal federal energy agency of the time. (The dramatic story is well told in Robert Caro’s Master of the Senate [2002].) Nor was Enron alone; other energy companies, such as Dynegy, El Paso, Williams, and Reliant, were rigging the market. Nor will these be the last, judging from the weak regulatory provisions of the 2005 energy bill. Free markets work only when entry is easy, guaranteeing that there will be many sellers to give consumers a variety of choices, and when collusion is difficult. Neither condition obtains in the energy market.

According to an article by Jason Leopold, the 2005 energy bill was shaped by Reliant Energy even while it faced millions in fines for rigging the California market in 2001–2002. “House Energy Committee Chairman Joe Barton has for years pushed for legislation to create more competitive electricity markets. Reliant, whose executives enjoy a close relationship with Barton, played a role in shaping the energy bill, and worked side by side with Barton for years in shaping a portion of the energy bill that deals with competitive power markets.” The cozy relationship between Congress and corporations is illustrated by the fact that Barton’s staff includes two former Reliant executives, and, until recently, Reliant’s lobbying team included two former Barton aides. Joe Allbaugh, until recently the director of FEMA, was active in energy policy in 2000, when California was suffering, receiving updates on the crisis from President Bush’s economic adviser, Lawrence Lindsey, himself a former member of Enron’s advisory board. Reliant Energy hired Allbaugh’s wife as a lobbyist; she received $20,000 for consulting work from Reliant during the last three months of 2000, along with equal amounts from two other involved energy companies, TXU and Entergy (which we met in the previous chapter). She was consulting with them while her husband was getting updates from Lindsey, Bush’s adviser. Reliant need not worry about the expensive lobbying fees; its stock has increased nearly tenfold since California’s crisis. Reliant employees showed their loyalty to House Energy Committee chairman Barton by contributing more than $35,000 to his political causes between 2001 and 2004. In addition, Barton holds about $15,000 in Reliant stock. (Leopold 2005)

From such networks of power, our energy policies will be made. Without campaign financing and lobbying reform, our electricity supply is not likely to be more secure.

DEREGULATION AND THE 2003 BLACKOUT

After the 2003 blackout, extensive investigations by a joint U.S.-Canadian task force criticized the Ohio utility and virtually everything else about the system. It called for Congress to create tough mandatory reliability standards for electric utilities—with penalties for companies that violate them—to prevent massive power failures. The North American Electric Reliability Council (NERC) supervises this part of the grid, but its standards are voluntary, and it has been accused of being too cozy with its member utilities. However, the joint task force declined to consider whether deregulation played a role in the massive outage (though it called for tough new regulation, an implicit acknowledgment of deregulation’s failure), postponing this inquiry to a future time. As of 2006, the inquiry into the role of deregulation had not taken place, though there had been some stormy conferences devoted to the question. Since NERC is financed by the utilities, which have done very well under deregulation, it is not a topic NERC is likely to explore on its own.

Predictably, industry participants said that the blackout resembled those of previous decades and was unrelated to deregulation, while critics of the industry argued that competition resulted in large cuts in professional staff, cuts in maintenance expenditures, and more extensive wheeling of electricity over inadequate and poorly maintained transmission lines. (Wald 2005a) Predictably, New York Senator Charles Schumer, a Democrat, said that NERC’s recommendations were only “baby steps.” The blackout should have been “a big wake-up call, but the commission only hit the snooze button,” he said. (Funk, Murray, and Diemer 2004)

It is expected that NERC, made up of the utilities, will be given the job of establishing mandatory regulations for the maintenance and operation of the grid. It is very unlikely to restrict the long-distance transmission of electricity, which provides such handsome revenues for the wholesalers. Yet until the physics of the grid can be made compatible with this usage, such a restriction is precisely what is needed.

We can expect more serious outages. In fact, we came close to one in 2005. What Canadian engineers called a “rare glitch” occurred in May. Two apparently independent protection systems failed, and the cascading failures across Ontario came very close to extending into the United States as extensively as the August 2003 blackout. Predictably, the utility said that the short blackout proved the system was safe; the blip was contained and handled very well, it said. Engineers at the utility (who happened to be on strike at the time) said that rather than being an encouraging sign, it was just the opposite: a warning of how vulnerable the system was. They pointed out that the system came within about eighteen seconds of causing a massive two-country blackout, and their analysis was confirmed by outside experts. There were four major swings before the system stabilized, according to reports filed with the Northeast Power Coordinating Council. Had it been a hot day, an engineer noted, cascading failures would have been very likely. (Struck 2005)

Errors are inevitable in complex systems, and we have safety devices to prevent their spread. But this close call was closer than most. One of the two independent failures was a switching mistake by an operator after routine maintenance, which caused a massive short circuit in a line carrying 500,000 volts. There is protection for such an expected error, but the circuit breaker took three times as long as it should have to operate—we are talking about milliseconds here—putting a massive strain on the system. The system survived, said a utility manager, but only by good luck. (Struck 2005) We can’t count on luck forever.

CONCLUSIONS

Electric power is vital to our nation. Increasingly, everything in our critical infrastructure depends on it. It is second only to concentrations of hazardous chemicals in its vulnerability to a terrorist attack. A sophisticated terrorist attack on our power system could have consequences such as the release of hazardous chemicals, and much more. Industrial accidents and extreme weather can shut down the electric power grid for two, three, or four days, but terrorists could damage machinery that would take months to repair, because much of the equipment is customized. The grid is vulnerable to heat waves, ice storms, floods, and hurricanes, and to the inevitable equipment failures and employee mistakes that plague all complex, tightly coupled systems.

Paradoxically, the power grid is also one of four examples of a system design that offers promise for other systems in our critical infrastructure. Deregulation since the 1990s has altered the design considerably, but we can learn from many features still present in the power grid. The network evolved gradually in order to provide reliability. Independent generating stations voluntarily linked together and provided transmission facilities. The result was a decentralized system of transmission with independent control agents and increasingly sophisticated automatic control devices, such as the relays. Price competition did not exist, and rates and profits were controlled by local governments. Prices fell every decade. Transmission of power from one utility to the other was limited to balancing local demand; a power shortage at one utility could be made up by another utility at a reasonable price, since the favor might flow the other way at another time. Research and development was reasonably adequate, and utilities supported nonprofit technical institutions.

Deregulation and competitive pressures were introduced in order to increase efficiency and lower prices. Neither occurred; the mainstays of efficiency—maintenance and a professional workforce—have both declined, and after a long-term secular trend of falling prices, the cost of electric power has risen since deregulation. Reliability has arguably declined as well. Massive outages are fairly rare, so we will need two or three decades under deregulation to estimate its reliability effects, but our biggest outage occurred in 2003, and one of similar size nearly occurred in 2005. In general, as a result of deregulation, more and more power is being pushed farther and farther over transmission lines that can barely handle it. The design of the deregulated system did not provide incentives for maintaining, improving, or adding transmission lines. With intense competition, the lines are no longer seen as a common resource that all must invest in and not misuse; investment comes out of profits. Line misuse has been widespread, as the California crisis shows, and it may have contributed to the 2003 blackout.

Deregulation in the electric power industry was prompted by the same shift in ideology that led to deregulation of the airlines, trucking, and some other industries. But political interests and industry lobbying dominated the deregulatory effort. Unsavory practices by Texas Republicans can be related directly to the Enron scandal. With close ties to a House committee chairman, Reliant Energy participated in the drafting of the 2005 energy bill, even while under indictment for its illegal actions in California in 2000–2001.

Is it possible to have a grid that realizes the advantages of long-distance transmission and is still reliable and secure? Assuming that the power loss of long-distance transmission can be reduced, we would need two things. First, there would have to be more investment in electronic devices, such as Amin and others are projecting, in order to lift the grid out of its “third-world” status. The investment should come from the electric power industry, which has been quite reluctant to make it. Second, much more regulation is required to prevent gaming by power wholesalers; this is not a market with easy entry and little chance of collusion. These changes would improve reliability.

However, the changes would do nothing to decrease the size of the targets available to terrorists. Some electric power generation is being deconcentrated, as industries with excess power can sell it to power companies. But there are only small numbers of these, and their generating stations are minuscule. Concentration is increasing as ever more massive power stations are built by ever larger conglomerations of power generators. Though I doubt that these targets are as attractive to terrorists as chemical plants and nuclear power plants, they are even less protected, and their disruption could wreck large swaths of our economy for many months. What is unreliable is also insecure, and what is insecure is also unreliable. We are utterly dependent on electric power. Whether the threat is from nature, bad management, or dedicated terrorists, we should not tolerate the concentration of this most vital element of our critical infrastructure.

1. Here is how a government report explains it: “The interconnection of the transmission grid makes management a difficult and challenging task. One of the biggest problems is managing parallel path flow (also called loop flow). Parallel path flow refers to the fact that electricity flows across an electrical path between source and destination according to the laws of physics, meaning that some power may flow over the lines of adjoining transmission systems inadvertently affecting the ability of the other region to move power. This cross-over can create compensation disputes among the affected transmission owners. It also impacts system reliability if a parallel path flow overloads a transmission line and decisions must be made to reduce (curtail) output from a particular generator or in a particular area. An RTO [Regional Transmission Organization] with access to region-wide information on transmission network conditions, with region-wide power scheduling authority, and with more efficient pricing of congestion can better manage parallel path flows and reduce the incidence of power curtailment.” (Administration 2000a)
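
The loop-flow phenomenon the report describes can be illustrated with a toy calculation. The following is a minimal sketch using the standard DC power-flow approximation; the bus names and reactances are hypothetical choices of mine. The point is that power scheduled between two points divides across all parallel paths in inverse proportion to their reactance, so a neighboring system’s lines carry power that nobody scheduled on them.

```python
# Minimal sketch of parallel path (loop) flow under the DC power-flow
# approximation. Bus names and reactances are hypothetical.

def split_parallel_flow(p_mw: float, x_direct: float, x_parallel: float):
    """Divide p_mw between two parallel paths.

    In the DC approximation, flow divides in inverse proportion to
    path reactance, regardless of who scheduled the transaction.
    """
    total = x_direct + x_parallel
    return p_mw * x_parallel / total, p_mw * x_direct / total

# A utility schedules 100 MW from bus A to bus C over its own line A-C
# (reactance 0.1 per unit). The neighboring system's path A-B-C has a
# combined series reactance of 0.3 per unit.
own_line, loop_flow = split_parallel_flow(100.0, x_direct=0.1, x_parallel=0.3)

print(f"Flow on the scheduling utility's line A-C:   {own_line:.0f} MW")
print(f"Loop flow through the neighbor's A-B-C path: {loop_flow:.0f} MW")
# A quarter of the transaction flows over the neighbor's lines, consuming
# their transfer capacity; hence the compensation and curtailment
# disputes the report describes.
```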
