The Computer and the Dynamo
By Brian Hayes
How much power do the world's computers consume?
DOI: 10.1511/2001.34.390
Blackouts were rolling across California last winter when I first began to hear stories about the gluttonous energy appetite of computers, and how Silicon Valley might be partly to blame for the power crisis. Computers and the infrastructure of the Internet, the reports said, were consuming 8 percent of the nation's electricity supply. Or maybe the figure was 13 percent. In any case, by 2010, fully half of all electricity generated in the U.S. would go to keep computer hardware humming.
I first heard these numbers mentioned—without explanation or attribution—in a television newscast. They have turned up in many other places as well, from USA Today and the Wall Street Journal to Computer (the magazine of the IEEE Computer Society). They have been cited in testimony before various Congressional committees. And during the 2000 presidential campaign, George W. Bush quoted the 8 percent figure in a speech on energy policy. His remarks were promptly echoed in a Doonesbury cartoon.
The estimates of computer and network power consumption struck me as quite remarkable. If they were correct, we were approaching a notable inflection point in human affairs, where we expend as much effort in moving information as we do in moving matter. But I had my doubts about those numbers. Bits are so much lighter than atoms. Perhaps a decimal point had slipped out of place. Could it really be true that roughly a tenth of the output of all those gargantuan power plants was being squeezed through the finespun filigree of conductors on silicon chips? It seemed preposterous—but, then again, something like a tenth of all electricity squeezes through the finespun filaments of lightbulbs. The question was not to be answered by mere hand-waving.
It wasn't hard to trace the story back to its source. Typing the words "8 percent computer electricity consumption" into a search engine produced lots of leads. (The search engine told me how many seconds it spent on the query, but not how many kilowatt-hours.) All trails led to Peter Huber and Mark P. Mills, writers and consultants who publish a newsletter called Digital Power Report. In particular, I was directed to an article of theirs titled "Dig More Coal—The PCs Are Coming," which appeared in Forbes in 1999.
The Forbes article includes all the essential elements of the story: "At least 100 million nodes on the Internet, drawing from hundreds to thousands of kilowatt-hours per year, add up to 290 billion kWh of demand. That's about 8% of total U.S. demand. Add in the electric power used to build and operate stand-alone (unnetworked) chips and computers, and the total jumps to about 13%. It's now reasonable to project that half of the electric grid will be powering the digital-Internet economy within the next decade."
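For readers who want to check the arithmetic, here is a back-of-the-envelope sketch in Python. The per-node figure is simply inferred by dividing the quoted total by the quoted node count, and the U.S. total of about 3,600 billion kilowatt-hours a year is my own round number, not a figure taken from the Forbes article.

```python
# Back-of-the-envelope check of the Huber-Mills arithmetic quoted above.
nodes = 100e6            # "at least 100 million nodes on the Internet"
total_kwh = 290e9        # their claimed demand: 290 billion kWh per year
us_demand_kwh = 3.6e12   # assumed U.S. annual consumption, ~3,600 billion kWh

kwh_per_node = total_kwh / nodes        # implied average draw per node
share = total_kwh / us_demand_kwh       # fraction of national demand

print(f"implied demand per node: {kwh_per_node:,.0f} kWh/year")  # about 2,900
print(f"share of U.S. demand:    {share:.1%}")                   # about 8%
```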
Other statements of Huber and Mills are no less electrifying. Utilities have to burn a pound of coal, they calculate, for every two megabytes of data moving across the Internet. A "server farm" housing computers that serve Web pages has the power needs of a small steel mill. And then there is their most provocative claim: A Palm Pilot connected to the Internet consumes as much energy as a household refrigerator. (Of course that power doesn't come out of AAA batteries; it's the handheld unit's share of the power used by Internet routers and servers.)
Unfortunately, Huber and Mills don't always make it easy to trace the line of reasoning that led them to their conclusions. As far as I can tell, the Palm Pilot–refrigerator equation is not explained anywhere. The extrapolation from 8 or 13 percent today to 50 percent a decade hence is also presented without any supporting documentation. (Moreover, another version of this prediction says 30 to 50 percent in two decades.)
The longest and most explicit presentation of these ideas appears in a report titled The Internet Begins with Coal, authored by Mills alone. Even there, however, certain blanks remain unfilled. A crucial starting point for the numerical estimates is an assumption that "your typical PC and its peripherals require about 1,000 watts of power," but the documentation for this number is vague and confusing. A footnote mentions as one source of information an "online configuration tool" provided by a manufacturer of uninterruptible power supplies, but that tool's rating for the configuration that Mills discusses is not 1,000 watts but only 205 watts. Mills then remarks: "The 1,000 W figure for the PC nominally accounts for the power needs of otherwise unaccounted microprocessor devices on the network." Those devices remain unaccounted, so that four-fifths of the power drain attributed to PCs comes from unidentified "behind-the-wall components."
This shadow world of unseen power loads is also mentioned in the Forbes article. "For every piece of wired hardware on your desk," Huber and Mills write, "two or three pieces of equipment lurk in the network beyond—office hubs and servers, routers, repeaters, amplifiers, remote servers and so forth." Taken literally, this statement implies that there are 200 or 300 million hubs, routers, etc.—two or three for each of the 100 million desktop computers that Huber and Mills count as being connected to the Internet. The preponderance of hidden devices is hard to fathom, since the Internet has a treelike structure, in which the leaf nodes—desktop PCs—ought to be more numerous than machines along the trunk and branches. Elsewhere, Huber and Mills themselves assume there are only 7 million routers and Web servers.
Such gaps in documentation are no definitive refutation of the Huber and Mills thesis, but they don't inspire great confidence. Neither does the provenance of the report. The Internet Begins with Coal was published by an organization called the Greening Earth Society, where Mills serves as science advisor. The name of this group might evoke images of the Green Party and Greenpeace, but the agenda is rather different. The name reflects a conviction that higher concentrations of atmospheric CO2 are "beneficial to humankind and all of nature," because they promote plant growth. The society was created by the Western Fuels Association, a consortium of electric utilities and coal-mining companies, whose main business, of course, is digging up C and combining it with O2. Both the society and the association argue that coal-fired power plants will remain essential to continued prosperity; in particular, they disparage the notion that the Internet will usher in a new economy without smokestacks, where demand for electricity would remain static or decline.
The fact that an argument serves the publisher's economic interest certainly does not invalidate the argument. But when interests and arguments are so closely aligned, readers can be expected to give the supporting evidence rather careful scrutiny.
The mirror image of the Greening Earth Society is the Center for Energy and Climate Solutions, a division of the Global Environment and Technology Foundation, which describes itself as "a nonprofit dedicated to building the infrastructure for sustainable development." The center's director, Joseph Romm, presents a view diametrically opposite to that of Huber and Mills. The Internet has not inflated energy use, he says, but instead produces a net savings of energy, mainly through "dematerialization." For example, software delivered online saves energy that would have been expended on manufacturing and shipping. Other efficiencies of electronic commerce allow companies to reduce inventory levels, saving the energy needed to build and operate warehouses. Telecommuting saves gasoline.
The trouble is, measuring these diffuse effects of computer and communications technology is probably even harder than documenting the direct electricity demand of the Internet. Furthermore, the two trends are not mutually exclusive; in principle, Romm and Mills could both be right. The Internet economy could be saving energy overall but still consuming more electric power—in effect siphoning energy out of automobile gas tanks and dumping it onto the coal heaps of electric generating stations.
On the specific issue of electricity consumption, Romm points out that growth in demand actually slackened at just about the time the Internet boom began. Before 1996, according to Romm's figures, electricity output was growing at 2.9 percent per year, but since then the growth rate has been only 2.2 percent. Therefore, if computers and the Net have suddenly introduced enormous new loads, other uses of electricity must have held steady or declined.
The governmental body charged with gathering statistics about the production and consumption of electricity is the Energy Information Administration, or EIA, within the Department of Energy. Their figures on consumption come from surveys of three sectors—residential, commercial and industrial—which each consume about a third of the nation's kilowatt-hours.
The EIA data on computer power demand were summarized in February 2000 by Jay E. Hakes, who was then the Administrator, in Congressional testimony. In the residential sector, Hakes said, PCs account for about 2 percent of electric power consumption, and in the commercial sector about 3 percent. Because computers are not a significant factor in the industrial energy budget, the computer's share of total electricity use works out to 1.6 percent.
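The rounded 1.6 percent follows from weighting each sector's PC share by that sector's slice of total consumption. In the sketch below I treat the three sectors as exactly equal thirds, which is a simplification; that is why the result comes out closer to 1.7 percent.

```python
# Rough reconstruction of the 1.6 percent figure from the sector numbers above,
# assuming (my simplification) that the three sectors split consumption evenly.
sector_weight = {"residential": 1/3, "commercial": 1/3, "industrial": 1/3}
pc_fraction   = {"residential": 0.02, "commercial": 0.03, "industrial": 0.0}

overall = sum(sector_weight[s] * pc_fraction[s] for s in sector_weight)
print(f"PCs' share of total electricity use: {overall:.1%}")  # about 1.7%
```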
One might suppose that the government statistics would carry enough weight to put an end to the argument, but there are enough complications and inconsistencies to leave room for doubt. In the residential survey, a PC was defined as a CPU and a monitor, but printers and other accessories were relegated to a different category, "electronics," which also included some audio and video equipment. The electronics category accounted for 10 percent of residential electricity use—five times the PC segment. How much of that 10 percent should be allocated to computer peripherals? In the commercial survey, laser printers were included in the computer category, but "Internet-related infrastructure equipment" was counted under another heading. Although it seems implausible that any rearrangement of the data could make up the difference between 1.6 percent and 8 or 13 percent, the differences in classification make comparisons awkward.
Another group of energy analysts has undertaken a direct rebuttal of the Huber-Mills thesis. At the Lawrence Berkeley National Laboratory, Jonathan G. Koomey heads the Energy Analysis Department of the Environmental Energy Technologies Division, which has carried out numerous studies of energy consumption, mostly funded by the Department of Energy and the Environmental Protection Agency. When the Forbes article appeared, Koomey immediately disputed its conclusions, citing data from his own group's survey of energy use by computers and other office equipment.
Koomey and his colleagues question nearly all the assumptions that underlie the Huber-Mills energy estimates, starting with the power demands of individual machines. A desktop PC is not a 1,000-watt device, Koomey says, even if the nameplate attached to the chassis gives a rated power in this range. For a 500-megahertz Pentium III computer and a 17-inch monitor, Koomey's measurements indicate that power demand is no greater than 150 or 200 watts, even including a share of the electricity consumed by a workgroup laser printer. For the somewhat larger computers used as servers, Huber and Mills specify 1,500 watts, and Koomey reduces it to 300. In the case of mainframe computers the disparity is even greater. For these machines Huber and Mills adopt a figure of 250 kilowatts (half for the computer itself and half for air conditioning). Koomey finds that only exotic supercomputers with hundreds of processors approach this level of power use, and that a more realistic estimate for a typical mainframe is 10 or 20 kilowatts.
There are also disagreements about counting. Relying on a compilation of computer sales statistics, Mills asserts that the inventory of computers in use is growing by 40 million a year. Koomey points out that some fraction of the new computers are not additions to the stock but replacements for retired equipment. (The EIA says that computers in offices are nearing "saturation," with four computers for every five employees.)
In August 2000 the LBNL group released a new report on power consumption by computers and network equipment, with further supporting data published in February 2001. The survey includes energy used by all kinds of office machinery, including not only computers and their peripherals but also unrelated devices such as copiers and fax machines. The conclusion: The entire spectrum of equipment dissipates 74 terawatt-hours per year, which is about 2 percent of U.S. electricity consumption. Adding in an allowance for a few other items that Huber and Mills count (such as the energy needed to manufacture computers, and a share of the energy consumed by telephone switchgear) brings the total to 3.2 percent—still only a fourth of the 13 percent level claimed by Huber and Mills.
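Those percentages are easy to check against a round-number total. The sketch below assumes U.S. consumption of roughly 3,600 terawatt-hours a year; only the 74-terawatt-hour, 3.2 percent and 13 percent figures come from the reports themselves.

```python
# Comparing the LBNL figures with the Huber-Mills claim, assuming total U.S.
# consumption of roughly 3,600 terawatt-hours per year.
office_equipment_twh = 74     # LBNL estimate for all office equipment
us_total_twh = 3600           # assumed round number
lbnl_broader_share = 0.032    # LBNL total after adding manufacturing, telecom
huber_mills_share = 0.13

print(f"office equipment share: {office_equipment_twh / us_total_twh:.1%}")      # ~2%
print(f"ratio of the two claims: {huber_mills_share / lbnl_broader_share:.1f}x")  # ~4x
```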
The debate between Huber and Mills and their critics has been conducted via letters to the editor, press releases and public e-mails. The tone has not always been collegial. In February 2000, when Mills testified before the House Subcommittee on National Economic Growth, Natural Resources, and Regulatory Affairs, he seized the opportunity to defend his ideas against the attacks by Koomey and others. Koomey responded with a memo offering further rebuttal. He wrote: "In the past year and a half, I have been witness to an extraordinary event: an analysis based on demonstrably incorrect data and flawed logic has achieved the status of conventional wisdom, in spite of my and my colleagues' best efforts to refute its assertions. The results continue to be cited by an unsuspecting press, and even by people who ought to know better."
A further year and a half later, the "conventional wisdom" is still very much in circulation. A few weeks ago, Roger N. Anderson of the Lamont-Doherty Earth Observatory published an op-ed essay in the New York Times alluding to the Huber-Mills conclusions, with no hint that they might be controversial.
After reading so many disparate claims about the wattage of PCs, I became curious about my own computers' contribution to the energy budget. If national statistics are hard to pin down, maybe I could at least figure out what proportion of my own household electric bill feeds my digital habits. To make a first crude estimate, I went around the house with a flashlight, crawling under desks to read the power ratings on nameplates. I knew that the result of this exercise would be an overestimate; in fact it proved useless even as an upper bound.
Adding up all the nameplate data suggested that the computer equipment in my home could draw as much as 2,400 watts. This is a fair amount of power; it would fully load a 20-ampere circuit. That the actual consumption can't be anywhere close to this level became apparent when I dug out the old utility bills. For the past two years the average electricity demand for the whole house was 868 watts—roughly a third of the nameplate rating for the computer equipment alone.
Obviously, I needed a better measurement technology. To this end I was aided by Ethan Brand, of Brand Electronics in Whitefield, Maine, who lent me a digital power meter that measures both demand in watts and cumulative energy consumption in watt-hours. After crawling back under the desk to plug in this instrument, I soon had a clearer picture. A computer rated at 400 watts actually draws about 50 watts in active use; in its standby or "sleep" mode the power consumption falls to 3 watts. The color monitor attached to this computer uses more power than the CPU—97 watts in active mode, 6 watts when sleeping.
Adding up figures for all of the monitored equipment, I found that the most I could manage to consume was about 700 watts—and I could get near that level only by having three computers simultaneously spit out pages from three printers, while at the same time I scanned a photograph and burned a CD-ROM. The typical wattage reading, at times when I was working, ranged from 150 to 170 watts, right in line with Koomey's estimate.
But that's not the end of the story. I left the meter attached to the big bundle of power cords under the desk, measuring total energy consumption over 10 days. In 240 hours, some 18,540 watt-hours of electricity flowed through the meter. That works out to an average demand of 77 watts, which implies that computer equipment is responsible for some 9 percent of my electricity consumption.
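For anyone who wants to redo the arithmetic, here it is in a few lines of Python, using the meter reading described above and the 868-watt household average from my utility bills.

```python
# The arithmetic behind the 9 percent figure, from the measurements above.
energy_wh = 18_540        # watt-hours through the meter in 10 days
hours = 240
household_avg_w = 868     # two-year whole-house average from utility bills

computer_avg_w = energy_wh / hours
share = computer_avg_w / household_avg_w

print(f"average computer demand: {computer_avg_w:.0f} W")  # about 77 W
print(f"share of household use:  {share:.0%}")             # about 9%
```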
Of course it would be foolish to extrapolate from my home office to the entire national economy. I have no reason to believe my experience is representative; on the contrary, as someone who writes about computers, I surely spend more time at the keyboard than most people do. But, the fact remains, the proportion of my electric bill that goes to bit-shuffling is far greater than I ever would have guessed. I had assumed that my power consumption would be so dominated by air conditioning, refrigeration, water heating and lighting that the computing load would barely be detectable. Now I know otherwise.
For Huber and Mills, the power demands of computers and the Internet signal inevitable future growth in overall electricity consumption. But you can equally well look at the situation as an opportunity for conservation. If computers really were responsible for some large fraction of the nation's energy consumption, then measures to make the machines more efficient would have a major impact. And even if computers do not make up such a fat slice of the energy pie, they may nonetheless be an attractive target for conservation measures, because there's so much room for improvement. Cutting another 10 percent off the energy consumption of a refrigerator or a water heater is an engineering challenge, but the energy per operation and per device in computers has been falling for decades, and that trend will surely continue. Indeed, there is no intrinsic limit to it; in principle, computing can be done without energy loss.
In the 1940s, the ENIAC had 18,000 vacuum tubes and consumed 174 kilowatts—roughly 10 watts per tube. If modern silicon chips required 10 watts per transistor, a Pentium would suck 100 megawatts out of the power grid, and the computer on your desk would easily swallow the entire output of a nuclear plant. Instead, a few tens of watts are enough to power a chip with transistors numbering in the tens of millions.
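The comparison is easy to reproduce. In the sketch below, the transistor count of about 10 million is my own round figure for a Pentium-class chip of that era, not a published specification.

```python
# Scaling ENIAC's power per switching element up to a modern chip; the
# transistor count of ten million is an assumed order of magnitude for a
# Pentium-class processor.
eniac_watts = 174_000
eniac_tubes = 18_000
watts_per_tube = eniac_watts / eniac_tubes   # roughly 10 W per tube

pentium_transistors = 10_000_000             # assumed order of magnitude
hypothetical_watts = pentium_transistors * watts_per_tube

print(f"ENIAC: {watts_per_tube:.1f} W per tube")                   # about 10 W
print(f"hypothetical Pentium: {hypothetical_watts / 1e6:.0f} MW")  # roughly 100 MW
```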
For the most part, these gains in transistor efficiency have been exploited not to minimize the power consumption of each chip but to maximize the number of transistors per chip, while keeping the power density just below the level at which the chip would melt. There is little incentive to do otherwise. This is the muscle-car era of computer design; what sells hardware is performance, not fuel economy. Until the electric bill for running a machine begins to approach the purchase price, no one is going to care much about energy consumption.
But energy use in computers may well decline anyway, even without economic impetus. Laptop computers offer a proof by example that electric power can be reduced by an order of magnitude without greatly impairing computer power. Some of the technologies and components of laptops will find their way into desktop machines, again not because lower power consumption is a selling point but because smaller fans, heat sinks and power supplies can save the manufacturer a few dollars. Flat-panel displays are already migrating from laptops to desktops.
Power-management systems that put idle machines and monitors to sleep have also had an effect. In the 2000 office-equipment survey, Koomey and his colleagues find that such reduced-power modes save 23 terawatt-hours per year; if everyone used the sleep modes, they would save another 17 terawatt-hours.
At the end of the day, I am left with the sense that the issue of computer power consumption is not going to be the determining factor in national energy policy. Even if the Huber-Mills analysis were correct, most of our kilowatts would still be flowing into the real world, not into cyberspace. Fluctuations in prices and the weather have a bigger impact on power demand than any conceivable events on the World Wide Web. Computers will not save the coal mines, nor will they save the planet.
But efficiency is more than a matter of economics and industrial policy; it has an aesthetic aspect, and even an ethical one. As Vice President Cheney recently observed, energy conservation is a sign of personal virtue. There is satisfaction in accomplishing more with less, in wringing the most results out of the least resources. For a long time this was a prominent strand in the mental habits of computer enthusiasts. To waste a CPU cycle or a byte of memory was an embarrassing lapse. To clobber a small problem with a big computer was considered tasteless and unsporting, like trout fishing with dynamite. Not even rolling blackouts will roll us back to that quaint age of frugal computing, but there is much to admire in its ethos.
© Brian Hayes
Click "American Scientist" to access home page
American Scientist Comments and Discussion
To discuss our articles or comment on them, please share them and tag American Scientist on social media platforms. Here are links to our profiles on Twitter, Facebook, and LinkedIn.
If we re-share your post, we will moderate comments/discussion following our comments policy.