Ecocentric

The Surprisingly Large Energy Footprint of the Digital Economy [UPDATE]

Our computers and smartphones might seem clean, but the digital economy uses a tenth of the world's electricity — and that share will only increase, with serious consequences for the economy and the environment


A server room at a data center. One data center can use enough electricity to power 180,000 homes

Which uses more electricity: the iPhone in your pocket, or the refrigerator humming in your kitchen? Hard as it might be to believe, the answer is probably the iPhone. According to a new report by Mark Mills — the CEO of the Digital Power Group, a tech- and investment-advisory firm — a medium-size refrigerator that qualifies for the Environmental Protection Agency’s Energy Star rating uses about 322 kWh a year. The average iPhone, by Mills’ calculations, uses about 361 kWh a year once the wireless connections, data usage and battery charging are tallied up. And the iPhone — even the latest iteration — doesn’t even keep your beer cold. (Hat tip to the Breakthrough Institute for noting the report first.)

[UPDATE: You can see the calculations behind the specific iPhone comparison, which was done by Max Luke of the Breakthrough Institute, at the bottom of this post. It’s important to note that the amount of energy used by any smartphone will vary widely depending on how much wireless data the device uses, as well as on the amount of power consumed in making those wireless connections—estimates for which vary. The example above assumes relatively heavy use of 1.58 GB a month—a figure taken from a survey of Verizon iPhone users last year (details at the bottom of the post)—which accounts for the high-end estimate of the total power the phone consumes over the course of a year. NPD Connected Intelligence, by contrast, estimates that the average smartphone uses about 1 GB of cellular data a month, and in the same survey that reported high data use among Verizon iPhone users, T-Mobile iPhone users reported just 0.19 GB a month—though that’s much lower than on any other service. Beyond the amount of wireless data being streamed, total energy consumption also depends on estimates of how much energy is consumed per GB of data. The example at the top assumes that every GB burns through 19 kWh of electricity, which is close to a worst-case model. The Centre for Energy-Efficient Communications (CEET) in Melbourne assumes a much lower figure of 2 kWh per GB of wireless data, which would lead to a much lower consumption estimate as well—as little as 4.6 kWh a year with the low T-Mobile data use. In the original version of this post, I should have noted that estimates of the power used by wireless networks span a significant range, and that this study goes with the very high end.]
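To make that spread concrete, here is a minimal back-of-envelope sketch in Python, using only the figures cited above (the scenario labels are mine, and these are published estimates rather than measurements of any particular phone):

```python
# Rough range of annual smartphone energy use, from the figures above.

MONTHS_PER_YEAR = 12

# scenario label: (GB of wireless data per month, kWh of network energy per GB)
scenarios = {
    "high (Verizon survey, worst-case network)": (1.58, 19.0),
    "middle (NPD average, worst-case network)": (1.00, 19.0),
    "low (T-Mobile survey, CEET network)": (0.19, 2.0),
}

for label, (gb_per_month, kwh_per_gb) in scenarios.items():
    annual_kwh = gb_per_month * MONTHS_PER_YEAR * kwh_per_gb
    print(f"{label}: {annual_kwh:.1f} kWh/year")

# high:   ~360 kWh/year, the estimate compared against the refrigerator
# middle: ~228 kWh/year
# low:    ~4.6 kWh/year, nearly two orders of magnitude below the high case
```

That two-orders-of-magnitude spread is the whole dispute in miniature: the refrigerator comparison holds only under the high-end assumptions.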

The iPhone is just one reason why the information-communications-technologies (ICT) ecosystem, otherwise known as the digital economy, demands such a large and growing amount of energy. The global ICT system includes everything from smartphones to laptops to digital TVs to — especially — the vast and electron-thirsty server farms that make up the backbone of what we call “the cloud.” In his report, Mills estimates that the ICT system now uses about 1,500 terawatt-hours of electricity per year. That’s roughly 10% of the world’s total electricity generation, or about the combined power production of Germany and Japan. It’s the same amount of electricity that was used to light the entire planet in 1985. We already use 50% more energy to move bytes than global aviation uses to move planes. No wonder your smartphone constantly seems on the verge of running out of juice.

As our lives migrate to the digital cloud — and as more and more wireless devices of all sorts become part of our lives — the electrons will follow. And that shift underscores how challenging it will be to reduce electricity use and carbon emissions even as we become more efficient.

Here’s an example: the New Republic recently ran a story arguing that the greenest building in New York City — the Bank of America Tower, which earned the Leadership in Energy and Environmental Design’s (LEED) highest Platinum rating — was actually one of the city’s biggest energy hogs. Author Sam Roudman argued that all the skyscraper’s environmentally friendly add-ons — the waterless urinals, the daylight dimming controls, the rainwater harvesting — were outweighed by the fact that the building used “more energy per square foot than any comparably sized office building in Manhattan,” consuming more than twice as much energy per square foot as the 80-year-old (though recently renovated) Empire State Building.

Why did an ultra-green tower need so much electricity? The major culprit was the building’s trading floors, fields of energy-thirsty workstations with five computers to a desk:

Assuming no one turns these computers off, in a year one of these desks uses roughly the energy it takes a 25-mile-per-gallon car engine to travel more than 4,500 miles. The servers supporting all those desks also require enormous energy, as do the systems that heat, cool and light the massive trading floors beyond normal business hours. These spaces take up nearly a third of the Bank of America Tower’s 2.2 million total square feet, yet the building’s developer and architect had no control over how much energy would be required to keep them operational.
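That car comparison is easy to sanity-check. Here is a rough sketch, assuming the EPA’s standard gasoline-equivalence figure of 33.7 kWh per gallon (that conversion factor is my addition, not TNR’s):

```python
# Convert "a 25-mpg car driving more than 4,500 miles" into electrical terms.
# The 25 mpg and 4,500 miles come from the quoted passage; 33.7 kWh per
# gallon is the EPA's energy-equivalence figure for gasoline.

MILES = 4_500
MPG = 25
KWH_PER_GALLON = 33.7
HOURS_PER_YEAR = 24 * 365

gallons = MILES / MPG                      # 180 gallons a year
annual_kwh = gallons * KWH_PER_GALLON      # ~6,066 kWh per desk per year
average_draw_watts = annual_kwh / HOURS_PER_YEAR * 1000

print(f"~{annual_kwh:,.0f} kWh/year, a constant draw of ~{average_draw_watts:.0f} W")
# ~6,066 kWh/year works out to ~690 W around the clock, or roughly 140 W
# per machine for a five-computer desk that is never switched off.
```

Roughly 140 W per always-on workstation is entirely plausible, which is the point: the computing load, not the building, drives the bill.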

I think — and others agree — that the TNR article was unfair. There’s lots of silliness in the LEED ratings system — see this Treehugger post for evidence — but it’s not the Bank of America building itself that’s responsible for that massive carbon footprint. It’s what’s being done inside the building, as those hardworking computers suck electricity 24 hours a day, seven days a week. The fact that a skyscraper with so many cutting-edge, energy-efficient features can still use so much energy because it needs to play a full-time role in the cloud underscores just how electricity-intensive the digital economy can be.

That’s because the cloud uses energy differently than other sectors of the economy. Lighting, heating, cooling, transportation — these are all power uses with rough limits. As your air conditioner or lightbulb becomes more efficient, you might decide to use it more often — what’s known in energy efficiency as the rebound effect. But you can only heat your home so much, or drive so far, before you hit clearly diminishing returns. Just because my Chevy Volt can get 100 miles per gallon doesn’t mean I’m going to drive back and forth to Washington each day. So it stands to reason that as these appliances become more efficient, we can limit and even reduce energy consumption without losing value — which is indeed what’s happened in recent years in the U.S. and other developed nations.

But the ICT system derives its value from the fact that it’s on all the time. From computer trading floors or massive data centers to your own iPhone, there is no break time, no off period. (I can’t be the only person who keeps his iPhone on at night for emergency calls because I no longer have a home phone.) That means a constant demand for reliable electricity. According to Mills, efficiency improvements in the global ICT system began to slow around 2005, even as global data traffic began to spike thanks to the emergence of wireless broadband for smartphones and tablets. As anyone who has ever tried to husband the battery of a dying smartphone knows, transmitting wireless data — whether via 3G or wi-fi — adds significantly to power use. As the cloud grows bigger and bigger, and we put more and more of our devices on wireless networks, we’ll need more and more electricity. How much? Mills calculates that it takes more electricity to stream a high-definition movie over a wireless network than it would have taken to manufacture and ship a DVD of that same movie.

Look at our smartphones: as they become more powerful, they also use more power. Slate’s Farhad Manjoo called this the “smartphone conundrum” in a piece earlier this year:

Over the next few years, at least until someone develops better battery technology, we’re going to have to choose between smartphone performance and battery life. Don’t worry — phones will keep getting faster. Chip designers will still manage to increase the speed of their chips while conserving a device’s power. The annual doubling in phone performance we’ve seen recently isn’t sustainable, though. Our phones are either going to drain their batteries at ever increasing rates while continuing to get faster — or they’re going to maintain their current, not-great-but-acceptable battery life while sacrificing huge increases in speed. It won’t be possible to do both.

And that’s just our phones. What’s unique about the ICT system is that companies keep introducing entirely new product lines. In 1995, you might have had a desktop computer and perhaps a game system. In 2000, maybe you had a laptop and a basic cell phone. By 2009, you had a laptop and a wireless-connected smartphone. Today you may well have a laptop, a smartphone, a tablet and a streaming device for your digital TV. The even more connected might be wearing a Fitbit tracker, writing notes with a wi-fi-enabled Livescribe pen and tracking their runs with a GPS watch. And there will certainly be more to come, as the best minds of our generation design new devices for us to buy. In a piece yesterday, Manjoo reviewed the Pebble, the first — but almost certainly not the last — major “smartwatch.” At a moment when young people are buying fewer cars and living in smaller spaces — reducing energy needs for transportation and heating/cooling — they’re buying more and more connected devices. Of course the electricity bill is going to go up.

None of this is to argue that energy efficiency isn’t important in the ICT sector. Just as the Bank of America Tower’s green features keep its gigantic electricity demand from ballooning even more, efficient smartphones and laptops can slow the growth of the cloud’s carbon footprint. But grow it will. Energy efficiency has never been a big part of the sales strategy for digital devices, probably because electricity is still cheap in the U.S. and it’s something we pay for in bulk at the end of the month. Compare the feeling of paying your utility bill to the irritation of forking out $3.50 a gallon to fill up your car. The costs of electricity are hidden in our society.

That includes the environmental costs. The full title of Mills’ report is The Cloud Begins With Coal: Big Data, Big Networks, Big Infrastructure and Big Power, and it’s sponsored by the National Mining Association and the American Coalition for Clean Coal Electricity. Unsurprisingly, the report argues that coal — still the single biggest source of electricity in the U.S. — essentially powers our wonderful cloud. (And it is wonderful! The cloud generates a lot of value for all the electricity it uses.) Coal is hardly the only source of electricity that can keep the ICT system going — cleaner natural gas is already gaining, nuclear provides carbon-free base-load power, and renewables are growing fast. Certain aspects of the ICT system will also help reduce energy use, as smart grids and smart meters promote conservation. But users of the wireless cloud are likely to grow from 42.8 million people in 2008 to nearly 1 billion in 2014 — and that’s just the beginning, as smartphones spread from the developed to the developing world. We already have a gigantic digital cloud, and it’s only going to get bigger. What we need is a cleaner one.

[Update: Along those lines, digital companies have been taking steps to clean the cloud by procuring more of their energy from low-carbon sources. Apple’s data centers, for instance, are 100% powered by renewable energy, and the company is working to increase its renewable energy use overall. Google gets 34% of the energy for its operations from renewable sources. Smart companies are looking to site power-hungry data centers near reliable sources of renewable energy: large hydro plants, like the ones near the new data center Facebook recently opened in Sweden, or utility-scale wind farms. Ultimately, though, it’s less the responsibility of the companies themselves than of the economy as a whole to make the shift to cleaner energy. As more and more people buy more and more cloud-connected devices—and as electric cars and other forms of electrified transport replace petroleum-powered vehicles—the demand for electricity will grow. It’s up to us to push to make it cleaner.]

*A note on the smartphone energy-use calculations. This comes from an email by Max Luke, a policy associate at the Breakthrough Institute, which posted about Mills’ study:

Last year the average iPhone customer used 1.58 GB of data a month, which times 12 is 19 GB per year. The most recent data put out by A.T. Kearney for the mobile industry association GSMA (p. 69) says that each GB requires 19 kWh. That means the average iPhone uses (19 kWh per GB × 19 GB) 361 kWh of electricity per year. In addition, A.T. Kearney calculates each connection at 23.4 kWh. That brings the total to 384.4 kWh. The electricity used annually to charge the iPhone is 3.5 kWh, raising the total to 388 kWh per year. EPA’s Energy Star lists refrigerators that use as little as 322 kWh annually.
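Laid out step by step, that arithmetic looks like this (a sketch that simply reproduces Luke’s quoted figures; nothing here is independently measured):

```python
# Reproduce Max Luke's iPhone estimate from the figures quoted above.

gb_per_month = 1.58                   # Verizon iPhone survey
annual_gb = round(gb_per_month * 12)  # 18.96, rounded to 19 GB/year
kwh_per_gb = 19.0                     # A.T. Kearney / GSMA estimate
connection_kwh = 23.4                 # A.T. Kearney per-connection energy
charging_kwh = 3.5                    # annual battery charging

data_kwh = annual_gb * kwh_per_gb     # 19 GB * 19 kWh/GB = 361 kWh
total_kwh = data_kwh + connection_kwh + charging_kwh

print(f"data {data_kwh:.0f} + connection {connection_kwh} "
      f"+ charging {charging_kwh} = {total_kwh:.1f} kWh/year")
# 361 + 23.4 + 3.5 = 387.9, i.e. ~388 kWh/year, vs. 322 kWh for an
# efficient Energy Star refrigerator.
```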

Breakthrough ran the numbers on the iPhone specifically—Mills’ endnotes (see page 44 of the report) refer to smartphones and tablets more generally—but Luke notes that Mills confirmed the calculations.

As I noted in the update at the top of the post, these estimates are at the very high end—other researchers have argued that smartphone power use is much lower. And the Mills study itself has come in for strong criticism from other experts, as this MSN post notes:

Gernot Heiser, a professor at the University of New South Wales in Sydney and co-author of a 2010 study on power consumption in smartphones, echoed [energy researcher Jonathan] Koomey’s sentiments that Mills’ work was flawed.

Writing to MSN News, Heiser said Mills’ work “seems blatantly wrong.” He said Mills overestimates the amount of power used by a modern smartphone, in this case a Galaxy S III, by more than four times.

“I’d have to have a quick look to see how they arrive at this figure, but it certainly looks like baloney to me,” Heiser said.

Gang Zhou, an associate professor of computer science at the College of William and Mary, was less direct in attacking Mills’ claims, but said Mills’ figures for the power consumption of smartphones were at least “one or two magnitude” higher than they should be. Nonetheless, Zhou said the subject of data-center electricity usage is an important issue and it “should raise concern.”

Still, I think the takeaway from this isn’t about the energy use of individual brands or even whole classes of devices. The point is that as our always-on digital economy grows more extensive—and it will—we need to be more aware of the energy demands that will follow. The study from CEET in Melbourne that I noted in the update at the top of the post assumes much lower power consumption by individual devices than Mills’ work, but it still raises the alarm about the growing energy demand from cloud services.

As I write above, the nature of a smartphone or a tablet makes it hard to realize how much energy it may be using—especially given that the electricity is often produced at plants far away from our outlets. At a gas station, by contrast, the immediate cost and the smell of petrol are potent reminders that we’re consuming energy. The digital economy is built on the sensation of seamlessness—but it still comes with a utility bill.