The Surprisingly Large Energy Footprint of the Digital Economy [UPDATE]

Our computers and smartphones might seem clean, but the digital economy uses a tenth of the world's electricity — and that share will only increase, with serious consequences for the economy and the environment

Y.C via Getty Images

A server room at a data center. One data center can use enough electricity to power 180,000 homes

Which uses more electricity: the iPhone in your pocket, or the refrigerator humming in your kitchen? Hard as it might be to believe, the answer is probably the iPhone. As you can read in a post on a new report by Mark Mills — the CEO of the Digital Power Group, a tech- and investment-advisory firm — a medium-size refrigerator that qualifies for the Environmental Protection Agency’s Energy Star rating will use about 322 kWh a year. The average iPhone, according to Mills’ calculations, uses about 361 kWh a year once the wireless connections, data usage and battery charging are tallied up. And the iPhone — even the latest iteration — doesn’t even keep your beer cold. (Hat tip to the Breakthrough Institute for noting the report first.)

[UPDATE: You can see the calculations behind the specific iPhone comparison, which were done by Max Luke of the Breakthrough Institute, at the bottom of the post. It’s important to note that the amount of energy used by any smartphone will vary widely depending on how much wireless data the device uses, as well as the amount of power consumed in making those wireless connections—estimates of which vary. The example above assumes relatively heavy use of 1.58 GB a month—a figure taken from a survey of Verizon iPhone users last year. (Details at the bottom of the post.) That accounts for the high-end estimate of the total power the phone would consume over the course of a year. NPD Connected Intelligence, by contrast, estimates that the average smartphone uses about 1 GB of cellular data a month, and in the same survey that reported high data use among Verizon iPhone users, T-Mobile iPhone users reported just 0.19 GB of data use a month—though that’s much lower than any other carrier’s figure. Beyond the amount of wireless data being streamed, total energy consumption also depends on estimates of how much energy is consumed per GB of data. The example above assumes that every GB burns through 19 kWh of electricity. That would be close to a worst-case model. The Centre for Energy-Efficient Communications (CEET) in Melbourne assumes a much lower estimate of 2 kWh per GB of wireless data, which would lead to a much lower electricity consumption estimate as well—as little as 4.6 kWh a year with the low T-Mobile data use. In the original version of the post, I should have noted that there is a significant range in estimates of power use by wireless networks, and that this study goes with the very high end.]
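To see just how wide that range is, here is a quick back-of-envelope sketch in Python. All four input figures are the estimates quoted above (the Verizon and T-Mobile data-use surveys, and the high-end 19 kWh/GB versus CEET's 2 kWh/GB network intensity); none of them is a measurement of any particular phone.

```python
# Rough annual smartphone energy attributed to wireless data, varying
# two assumptions: monthly data use and network energy intensity.
# Figures are the estimates quoted in the update above.

def annual_wireless_kwh(gb_per_month, kwh_per_gb):
    """Electricity attributed to a phone's wireless data over one year."""
    return gb_per_month * 12 * kwh_per_gb

high = annual_wireless_kwh(1.58, 19)  # heavy Verizon user, high-end 19 kWh/GB
low = annual_wireless_kwh(0.19, 2)    # light T-Mobile user, CEET's 2 kWh/GB

print(f"high-end estimate: {high:.0f} kWh/year")  # roughly 360 kWh
print(f"low-end estimate:  {low:.1f} kWh/year")   # roughly 4.6 kWh
```

Depending on which assumptions you pick, the annual figure swings from a few kWh to several hundred, which is the whole point of the caveat above.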

The iPhone is just one reason why the information-communications-technologies (ICT) ecosystem, otherwise known as the digital economy, demands such a large and growing amount of energy. The global ICT system includes everything from smartphones to laptops to digital TVs to — especially — the vast and electron-thirsty computer-server farms that make up the backbone of what we call “the cloud.” In his report, Mills estimates that the ICT system now uses 1,500 terawatt-hours of electricity per year. That’s about 10% of the world’s total electricity generation, or roughly the combined power production of Germany and Japan. It’s the same amount of electricity that was used to light the entire planet in 1985. We already use 50% more energy to move bytes than we do to move planes in global aviation. No wonder your smartphone’s battery juice constantly seems on the verge of running out.

As our lives migrate to the digital cloud — and as more and more wireless devices of all sorts become part of our lives — the electrons will follow. And that shift underscores how challenging it will be to reduce electricity use and carbon emissions even as we become more efficient.

Here’s an example: the New Republic recently ran a story arguing that the greenest building in New York City — the Bank of America Tower, which earned the Leadership in Energy and Environmental Design’s (LEED) highest Platinum rating — was actually one of the city’s biggest energy hogs. Author Sam Roudman argued that all the skyscraper’s environmentally friendly add-ons — the waterless urinals, the daylight dimming controls, the rainwater harvesting — were outweighed by the fact that the building used “more energy per square foot than any comparably sized office building in Manhattan,” consuming more than twice as much energy per square foot as the 80-year-old (though recently renovated) Empire State Building.

Why did an ultra-green tower need so much electricity? The major culprit was the building’s trading floors, full of fields of energy-thirsty workstations with five computers to a desk:

Assuming no one turns these computers off, in a year one of these desks uses roughly the energy it takes a 25-mile-per-gallon car engine to travel more than 4,500 miles. The servers supporting all those desks also require enormous energy, as do the systems that heat, cool and light the massive trading floors beyond normal business hours. These spaces take up nearly a third of the Bank of America Tower’s 2.2 million total square feet, yet the building’s developer and architect had no control over how much energy would be required to keep them operational.
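For scale, that car comparison can be converted to electricity under one added assumption that is not in the article: a gallon of gasoline contains roughly 33.7 kWh of energy, the figure the EPA uses for a gasoline-gallon equivalent.

```python
# Back-of-envelope conversion of the trading-desk comparison into kWh.
# Assumption (not from the article): one gallon of gasoline holds roughly
# 33.7 kWh of energy, per the EPA's gasoline-gallon-equivalent figure.

MILES = 4500
MPG = 25
KWH_PER_GALLON = 33.7  # assumed energy content of gasoline

gallons = MILES / MPG                # 180 gallons burned over the year
desk_kwh = gallons * KWH_PER_GALLON  # ~6,000 kWh per five-computer desk

print(f"{desk_kwh:,.0f} kWh/year per desk")
```

That puts a single always-on trading desk at many times the consumption of the Energy Star refrigerator mentioned earlier.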

I think — and others agree — that the TNR article was unfair. There’s lots of silliness in the LEED ratings system — see this Treehugger post for evidence — but it’s not the Bank of America building itself that’s responsible for that massive carbon footprint. It’s what’s being done inside the building, as those hardworking computers suck electricity 24 hours a day, seven days a week. The fact that a skyscraper with so many cutting-edge, energy-efficient features can still use so much energy because it needs to play a full-time role in the cloud underscores just how electricity-intensive the digital economy can be.

That’s because the cloud uses energy differently than other sectors of the economy. Lighting, heating, cooling, transportation — these are all power uses that have rough limits. As your air conditioner or lightbulb becomes more efficient, you might decide to then use them more often — in energy efficiency, that is what’s known as the rebound effect. But you can only heat your home so much, or drive so far before you reach a period of clearly diminishing returns. Just because my Chevy Volt can get 100 miles per gallon doesn’t mean I’m going to drive back and forth to Washington each day. So it stands to reason that as these appliances become more efficient, we can potentially limit and even reduce energy consumption without losing value — which is indeed what’s happened in recent years in the U.S. and other developed nations.

But the ICT system derives its value from the fact that it’s on all the time. From computer trading floors or massive data centers to your own iPhone, there is no break time, no off period. (I can’t be the only person who keeps his iPhone on at night for emergency calls because I no longer have a home phone.) That means a constant demand for reliable electricity. According to Mills, efficiency improvements in the global ICT system began to slow around 2005, even as global data traffic began to spike thanks to the emergence of wireless broadband for smartphones and tablets. As anyone who has ever tried to husband the battery of a dying smartphone knows, transmitting wireless data — whether via 3G or wi-fi — adds significantly to power use. As the cloud grows bigger and bigger, and we put more and more of our devices on wireless networks, we’ll need more and more electricity. How much? Mills calculates that it takes more electricity to stream a high-definition movie over a wireless network than it would have taken to manufacture and ship a DVD of that same movie.

Look at our smartphones: as they become more powerful, they also use more power. Slate’s Farhad Manjoo called this the “smartphone conundrum” in a piece earlier this year:

Over the next few years, at least until someone develops better battery technology, we’re going to have to choose between smartphone performance and battery life. Don’t worry — phones will keep getting faster. Chip designers will still manage to increase the speed of their chips while conserving a device’s power. The annual doubling in phone performance we’ve seen recently isn’t sustainable, though. Our phones are either going to drain their batteries at ever increasing rates while continuing to get faster — or they’re going to maintain their current, not-great-but-acceptable battery life while sacrificing huge increases in speed. It won’t be possible to do both.

And that’s just our phones. What’s unique about the ICT system is that companies keep introducing entirely new product lines. In 1995, you might have had a desktop computer and perhaps a game system. In 2000, maybe you had a laptop and a basic cell phone. By 2009, you had a laptop and a wireless-connected smartphone. Today you may well have a laptop, a smartphone, a tablet and a streaming device for your digital TV. The even more connected might be wearing a Fitbit tracker, writing notes with a wi-fi-enabled Livescribe pen and tracking their runs with a GPS watch. And there will certainly be more to come, as the best minds of our generation design new devices for us to buy. In a piece yesterday, Manjoo reviewed the Pebble, the first — but almost certainly not the last — major “smartwatch.” At a moment when young people are buying fewer cars and living in smaller spaces — reducing energy needs for transportation and heating/cooling — they’re buying more and more connected devices. Of course the electricity bill is going to go up.

None of this is to argue that energy efficiency isn’t important in the ICT sector. Just as the Bank of America Tower’s green features keep its gigantic electricity demand from ballooning even more, efficient smartphones and laptops can slow the growth of the cloud’s carbon footprint. But grow it will. Energy efficiency has never been a big part of the sales strategy for digital devices, probably because electricity is still cheap in the U.S. and it’s something we pay for in bulk at the end of the month. Compare the feeling of paying your utility bill to the irritation of forking out $3.50 a gallon to fill up your car. The costs of electricity are hidden in our society.

That includes the environmental costs. The full title of Mills’ report is The Cloud Begins With Coal: Big Data, Big Networks, Big Infrastructure and Big Power, and it’s sponsored by the National Mining Association and the American Coalition for Clean Coal Electricity. Unsurprisingly, the report argues that coal — still the single biggest source of electricity in the U.S. — essentially powers our wonderful cloud. (And it is wonderful! The cloud generates a lot of value for all the electricity it uses.) Coal is hardly the only source of electricity that can keep the ICT system going — cleaner natural gas is already gaining, nuclear provides carbon-free base-load power, and renewables are growing fast. Certain aspects of the ICT system will also help reduce energy use, as smart grids and smart meters promote conservation. But users of the wireless cloud are likely to grow from 42.8 million people in 2008 to nearly 1 billion in 2014 — and that’s just the beginning, as smartphones spread from the developed to the developing world. We already have a gigantic digital cloud, and it’s only going to get bigger. What we need is a cleaner one.

[Update: Along those lines, digital companies have been taking steps to clean the cloud by procuring more of their energy from low-carbon sources. Apple’s data centers, for instance, are 100% powered by renewable energy, and the company is working to increase renewable energy use overall. Google gets 34% of the energy for its operations from renewable sources. Smart companies are looking to site power-hungry data centers near reliable sources of renewable energy: large hydro plants, like the ones near the new data center Facebook recently opened in Sweden, or utility-scale wind farms. Ultimately, though, it’s less the responsibility of the companies themselves than of the economy as a whole to make the shift to cleaner energy. As more and more people buy more and more cloud-connected devices—and as electric cars and other forms of electrified transport replace petroleum-powered vehicles—the demand for electricity will grow. It’s up to us to push to make it cleaner.]

*A note on the calculations of smartphone energy use. This comes from an email by Max Luke, a policy associate at the Breakthrough Institute, which posted about Mills’ study:

Last year the average iPhone customer used 1.58 GB of data a month, which times 12 is 19 GB per year. The most recent data put out by AT Kearney for the mobile industry association GSMA (p. 69) says that each GB requires 19 kWh. That means the average iPhone uses (19 kWh × 19 GB) 361 kWh of electricity per year. In addition, AT Kearney calculates each connection at 23.4 kWh. That brings the total to 384.4 kWh. The electricity used annually to charge the iPhone is 3.5 kWh, raising the total to 388 kWh per year. EPA’s Energy Star listings include refrigerators that use as little as 322 kWh annually.
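Luke's tally is straightforward to reproduce; the sketch below simply restates his arithmetic, with every figure taken from the email above.

```python
# Reproducing the Breakthrough Institute tally quoted above.
# All inputs are Luke's figures, not independent measurements.

data_gb_per_year = round(1.58 * 12)  # 1.58 GB/month -> 19 GB/year
network_kwh = data_gb_per_year * 19  # 19 kWh per GB -> 361 kWh
connection_kwh = 23.4                # AT Kearney's per-connection figure
charging_kwh = 3.5                   # annual battery charging

total = network_kwh + connection_kwh + charging_kwh
print(f"{total:.1f} kWh/year")  # 387.9, the "388 kWh per year" in the quote
```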

Breakthrough ran the numbers on the iPhone specifically—Mills’ endnotes (see page 44 in the report) refer to smartphones and tablets more generally—but Luke notes that Mills confirmed the calculations.

As I noted in the update at the top of the post, these estimates are at the very high end—other researchers have argued that power use by smartphones is much lower. And the Mills study itself has come in for strong criticism from other experts, as this MSN post notes:

Gernot Heiser, a professor at the University of New South Wales in Sydney and co-author of a 2010 study on power consumption in smartphones, echoed Koomey’s sentiments that Mills’ work was flawed.

Writing to MSN News, Heiser said Mills’ work “seems blatantly wrong.” He said Mills overestimates the amount of power used by a modern smartphone, in this case a Galaxy S III, by more than four times.

“I’d have to have a quick look to see how they arrive at this figure, but it certainly looks like baloney to me,” Heiser said.

Gang Zhou, an associate professor of computer science at the College of William and Mary, was less direct in attacking Mills’ claims, but nonetheless said Mills’ figures for the power consumption of smartphones were at least “one or two magnitude” higher than they should be. Still, Zhou said the subject of data center electricity usage is an important issue and it “should raise concern.”

Still, I think the takeaway from this isn’t about the energy use of individual brands or even whole classes of devices. The point is that as our always-on digital economy grows more extensive—and it will—we need to be more aware of the energy demands that will follow. The study from CEET in Melbourne that I noted in the update at the top of the post assumes much lower power consumption by individual devices than Mills’ work, but it still raises the alarm about the growing energy demand from cloud services.

As I write above, the nature of a smartphone or a tablet makes it hard to realize how much energy it may be using—especially given the fact that the electricity is often produced at plants far away from our outlets. At a gas station, for instance, the immediate cost and the smell of petrol is a potent reminder that we’re consuming energy. The digital economy is built on the sensation of seamlessness—but it still comes with a utility bill.


Of all the places where wind and solar power are useless and the load is clearly base load (constant!), a data center would seem the most obvious site for a nuclear power plant. And there are now three designs for breeder nuclear fission reactors that are small in size and immune to meltdown, besides producing less than a ton of easily controllable waste products per gigawatt-year.






I don't know if the data are right, but if true, it's all the more reason to get our electrical grid updated everywhere and protected from attacks. I do believe we are totally reliant on digital devices. Everything is transmitted electronically, and so much information is accessible only through an internet connection.

Hope somebody is working on it. Do you think the Republicans will appropriate money for that?


<b>Hard as it might be to believe, the answer is probably the iPhone.</b>

Hard to believe because it simply isn't true. By the ridiculous metrics used, every time I listen to a song on the radio I'm using the energy equivalent of a stick of dynamite to account for the studio recordings, remixing, advertising costs and everything else associated all along the chain to get that song to my ears, including the electricity used to manufacture the radio.

Or as FBN.WEB notes...

<b><i>shouldn't you also factor in refrigerators and freezers at the grocery store for my refrigerator energy consumption?  I mean, that gallon of milk sitting in my fridge came from somewhere just like the data on my phone, right?</i></b>

And since Apple has data centers powered entirely by renewable energy, it pretty much invalidates the claim completely, given that so many of the calculated guesses are extraordinarily high.


"And since Apple has data centers powered entirely by renewable energy, it pretty much invalidates the claim completely"
No - the claim was about the amount of kWh used by the phone including the servers that supply its data - the article specified YouTube as an example, which is not owned by Apple. But in any case, the source of the energy is irrelevant to the quantum of energy specified in kWh, regardless of whether that energy came from coal or renewables.


Mills was spectacularly wrong in 1999, and he is wrong again. He barely even bothered to change the title of his "report" ("The Internet Begins with Coal" in 1999 became "The Cloud Begins with Coal" in 2013). Bryan, if you write in the "Science" section of Time, it is your responsibility to check whether your source is credible.


If you're going to factor in the cell towers in my iPhone energy consumption, then shouldn't you also factor in refrigerators and freezers at the grocery store for my refrigerator energy consumption?  I mean, that gallon of milk sitting in my fridge came from somewhere just like the data on my phone, right?


Veganism Is Direct Action!  

The following points and facts are excerpted from Please Don't Eat the Animals (2007) by the mother-daughter writing team of Jennifer Horsman and Jaime Flowers:

The Worldwatch Institute estimates one pound of steak from a steer raised in a feedlot costs:  five pounds of grain, a whopping 2,500 gallons of water, the energy equivalent of a gallon of gasoline, and about 34 pounds of topsoil.

Thirty-three percent of our nation's raw materials and fossil fuels go into livestock destined for slaughter.  In a vegan economy, only two percent of our resources will go to the production of food.

"It seems disingenuous for the intellectual elite of the first world to dwell on the subject of too many babies being born in the second- and third-world nations while virtually ignoring the overpopulation of cattle and the realities of a food chain that robs the poor of sustenance to feed the rich a steady diet of grain-fed meat."

--Jeremy Rifkin, pro-life AND pro-animal author, Beyond Beef: The Rise and Fall of the Cattle Culture, and president of the Greenhouse Crisis Foundation
"Carl Pope could probably affect the world more by being a vegetarian than through his job as president of the Sierra Club," quipped Jennifer Horsman.


"Nearly 75% of the grain grown and 50% of the water consumed in the U.S. are used by the meat industry. (Audubon Society)

It takes nearly one gallon of fossil fuel and 5,200 gallons of water to produce just one pound of conventionally fed beef. (Mother Jones)

According to the editors of World Watch, July/August 2004: "The human appetite for animal flesh is a driving force behind virtually every major category of environmental damage now threatening the human future -- deforestation, topsoil erosion, fresh water scarcity, air and water pollution, climate change, biodiversity loss, social injustice, the destabilization of communities and the spread of disease."

The number of animals killed for food in the United States is nearly 75 times larger than the number of animals killed in laboratories, 30 times larger than the number killed by hunters and trappers, and 500 times larger than the number of animals killed in animal pounds.

People for the Ethical Treatment of Animals (PETA) is challenging those who think they can still be "meat-eating environmentalists" to go vegan, if they really care about the planet.

peta2 is now the largest youth movement of any social change organization in the world.

peta2 has 267,000 friends on MySpace and 91,000 Facebook fans.

A few years ago, PETA was the top-ranked charity when a poll asked teenagers what nonprofit group they would most want to work for. PETA won by more than a two to one margin over the second place finisher, The American Red Cross, with more votes than the Red Cross and Habitat for Humanity combined.


"Mills calculates that it takes more electricity to stream a high-definition movie over a wireless network than it would have taken to manufacture and ship a DVD of that same movie."

Interesting claim but one of the largest companies in cloud and vod services is releasing totally different figures and I'd assume that they actually know how much energy it cost to do whatever they do. The claim that for making and delivering 1 DVD you can watch 3 days of youtube:


Case of the iPhone:

Charger: 5 V × 1 A = 5 W

Battery: 1.5 Ah

Efficiency: 50-90%

Charging time: 1.5 Ah / (50% × 1 A) = 3 h

In the case of every-night charging: 365 × 3 h × 5 W = 5,475 Wh = 5.475 kWh

Every other activity draws on this same charged energy, so it must not be added again on top.
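The charging arithmetic in that comment checks out; here it is restated in Python, with every input being the commenter's own assumption (a 5 W charger, a 1.5 Ah battery, 50% efficiency, one full charge every night).

```python
# The commenter's worst-case charging estimate, spelled out.
# All figures are the commenter's assumptions, not measurements.

charger_watts = 5.0   # 5 V * 1 A charger
battery_ah = 1.5      # battery capacity
efficiency = 0.5      # worst case of the stated 50-90% range
charge_current = 1.0  # amps delivered by the charger

hours_per_charge = battery_ah / (efficiency * charge_current)  # 3 h
annual_wh = 365 * hours_per_charge * charger_watts

print(f"{annual_wh / 1000} kWh/year")  # 5.475 kWh: charging alone is tiny
```

The point of the comparison stands: direct charging accounts for only a few kWh a year, so the large estimates in the article rest almost entirely on the network and data-center side.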


Gilder’s Law: Winner’s Waste. “The best business models, he said, waste (Leverage) the era’s cheapest resources in order to conserve the era’s most expensive resources”.  

This is really not surprising.


sure we use the cloud - but I went to a city talk recently about energy efficiency where a council guy showed a graphic of the largest CBD electricity users - one was off the chart - at least double number 2's usage - I asked him later - looks like near where I live - he didn't want to say - I said 'maybe Global Switch' (datacentre) - he finally acknowledged 'yes it's Global Switch' - so in my CBD the largest electricity user - by far - is a computer data centre - which is also expanding by about 40% - so make that way largest+40% - that's about 500m from my home

nearer to me - about 150m from my home - is a smaller data centre - still a large multi-storey commercial building - that in winter I see discharging steam from 4 large roof vents - so I'd say a lot of heat, a lot of cooling, and a lot of water used in cooling towers as I almost daily see the street gutters flushing with water as they refresh their cooling system.

So yeah we love the cloud - but don't think it ain't using a heap of electricity - that's why they're moving to hydroelectricity/geothermal regions - either the computer power or the cooling requirement or both - a LOT of electricity !


So, we have a lot more devices presently that use electricity than we used to so now we use a lot more electricity with those devices than we used to. Common sense, but I understand the purpose of the article.

As far as conserving electricity as some on here have mentioned, it shouldn't matter how much electricity we use as long as we pump it out using clean methods. Electricity is "created". It is not a resource we dig out of the ground. There are many ways to harvest electricity without burning coal like we used to. That includes nuclear power, wind turbines, solar panels, and what we should focus more on is hydroelectric power...particularly in coastal areas. I think nuclear power is too big a part of our energy grid and we should focus more on the natural sources. I lived in Germany for a couple of years while in the Army and the landscape had plenty of wind turbines and solar panels. Nuclear power is even being phased out there as it isn't as necessary and it poses potential disasters, although they are rare. The US should take a page out of Germany's playbook in the energy sector and we will be just fine.


@michaelbrown You are right Michael - the way we generate electricity should be optimized to use renewable energy sources instead of radiation-spewing nuclear reactors, which pollute air, water, soil and genes. Germany and Switzerland are making tremendous progress in phasing out energy generation from nuclear and fossil fuel sources and moving ahead with renewable energy sources.


@michaelbrown It's a good idea in principle, but (with a few exceptions) none of those renewable energy sources or fossil reserves are enough, individually or together, to do more than make a dent in our consumption.  Not even nuclear fission, whose energy reserves are actually substantially less than petroleum's (though more if you allow widespread use of breeder reactors, accepting the proliferation risk).

Now, the aforementioned exceptions:

* Nuclear fusion, of course, if it ever gets off the ground.  Not technically renewable but the reserves vastly exceed any projected consumption for centuries.

* Industrial-scale solar collectors covering a non-negligible fraction of the Earth's surface, with attendant environmental consequences.

* Orbiting solar collectors: as above but without the environmental side effects, assuming you can find an efficient way to get them up there.


Energy storage and smart grid electronics can help integrate wind and solar into the grid.  Due to drought, much of hydro-power is reduced to energy storage rather than being primary energy.  A wind turbine stacked on top of a municipal water tank over a large cistern, with a pump in the up pipe and a water turbine in the down pipe, could be wind power with a really cute energy storage system right under it.
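Sizing that idea with the standard potential-energy formula E = mgh shows why tank-scale pumped storage is cute but small. The tank volume, head and round-trip efficiency below are invented purely for illustration.

```python
# Rough sizing of a tank-over-cistern pumped-storage scheme using
# E = m * g * h. Tank size, head and efficiency are illustrative guesses.

RHO = 1000.0  # kg per cubic metre of water
G = 9.81      # gravitational acceleration, m/s^2

def storage_kwh(volume_m3, head_m, round_trip_eff=0.75):
    """Recoverable energy from dropping a tank of water through head_m."""
    joules = RHO * volume_m3 * G * head_m * round_trip_eff
    return joules / 3.6e6  # convert joules to kWh

# A 2,000 m^3 municipal tank with a 30 m head stores only ~120 kWh:
print(f"{storage_kwh(2000, 30):.0f} kWh")
```

Roughly 120 kWh is a useful buffer for a neighborhood, but it is orders of magnitude below what a large data center draws in a single hour, which is why operators look to utility-scale hydro instead.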


We can minimize the energy impact by charging our smartphones with a car charger when driving, or better yet with a solar charger - these can even be used to charge laptops. Do your best to save and conserve electricity, in turn saving the only planet we call HOME.


@mahendrantgt That will not help. Charging your phone in the car will increase the load on your car's electrical system, increasing the load on the alternator and then the engine, which will burn more gas to compensate. Because cars are not thermally efficient compared to a power plant, this actually results in more fuel consumed to generate the power your phone uses, compared with using grid power. 

Not to mention doing this guarantees your phone will be charged by burning fossil fuels, but depending on where you live, using the wall outlet will be at least partly generated by clean/renewable sources.