It all seems virtual, but the Internet really does have a physical home in the thousands and thousands of computer data centers operated by IT companies around the world. And those data centers require electricity—potentially, lots of it, with a thirst that will likely only grow as our lives become more digital. In 2007, a report by the Environmental Protection Agency (EPA) estimated that power consumed by data centers would double between 2005 and 2010—to roughly 100 billion kilowatt-hours of energy—raising concerns that the digital economy might be just as unsustainable as the physical one.
It turns out, however, that the EPA forecast was a bit off. A new report by Jonathan Koomey—a consulting professor in the civil and environmental engineering department at Stanford University—has found that the growth in the number of data centers, as well as in their power usage, was less than originally forecast. Thanks in part to the recession, which curbed demand for computing, as well as the rise of more efficient computer chips and server technology, electricity used by data centers worldwide increased by only 56% between 2005 and 2010, and by just 36% in the U.S.
"Mostly because of the recession, but also because of a few changes in the way these facilities are designed and operated, data center electricity consumption is clearly much lower than what was expected, and that's really the big story," Koomey said.
Koomey can't say how much of the slowdown came from the recession curbing demand and how much from improvements in energy efficiency, but his numbers are surprising because the Internet has become so much more central to our lives and the economy than it was just five years ago. More and more of our data is stored in the "cloud"—meaning in a powered data center somewhere—ready to be accessed on demand. Companies like Google and Facebook, whose businesses depend on huge banks of computer servers, have grown enormously over the past five years. Yet Koomey estimates that data centers use perhaps 1.1% to 1.5% of global electricity supplies, which makes the IT sector a relatively small part of the global climate problem.
But even if data centers are using much less power than we thought, that demand is still growing much faster than in nearly every other part of the economy. And the rate could continue to climb as the computing cloud expands, with services like Dropbox and Evernote already allowing computer users to store data online cheaply. Just today Apple began rolling out the details of its new iCloud service, which will allow users to store much of their data on the company's servers. Anyone who signs up for iCloud will get up to 5 GB of storage free—and that doesn't even count storage for music, apps or books, which if you're like me will take up far more space. (I think I own about 600 MB worth of Arcade Fire songs alone.) Worse, as chips get faster they draw more power, which could limit how much more energy-efficient data centers can become in the future.
One way to get around this problem is to power data centers with renewable electricity whenever possible—something Google, one of the IT sector's biggest consumers of electricity, has pioneered. Unlike many other major power users, such as factories or military bases, data centers don't have to be located near population centers, which means they can be built in areas that can more easily take advantage of renewable resources, like deserts or windy plains. Of course, data centers must have steady sources of power, and renewables are often intermittent. An on-demand digital life requires on-demand power, and lots of it—so expect the IT sector's power bill to keep growing and growing.