“GREEN” COMPUTING CAN’T POWER THE CLOUD

May 22, 2014
Economics21.org

Facebook’s initial public offering was all about superlatives. The May 2012 event was the largest-ever IPO for a US technology company and the third-largest in US history. It marked, or so the hype claimed, the coming of age for social media companies. But amid the coverage of the company’s stock price, revenues, and growth potential, the media paid almost no attention to the vast quantities of electricity that Facebook and other tech companies need to operate their businesses.

In April 2012, Greenpeace spotlighted the issue of power demand in data centers in a report called “How Clean is Your Cloud?” The environmental group graded a series of technology companies, including Facebook, Apple, and Amazon, on the percentage of what it called “dirty energy” being used by their data centers. Greenpeace—which, of course, has a Facebook page—gave the social media company a “D” for what it called “energy transparency.” And the group went on to claim that it had convinced Facebook to “unfriend” coal-fired electricity.

Never mind that about 40 percent of all global electricity production comes from coal. Let’s consider what the “clean energy” footprint of one of these big data centers might look like.

In 2012, James Hamilton, a vice president and engineer at Amazon Web Services, wrote about Apple’s new iCloud data center in Maiden, North Carolina. Hamilton was responding to Apple’s claim that it was going to use solar energy to help run the site. In a blog posting called “I love solar but…” Hamilton calculated that each square foot of data center space would require about 362 square feet of solar panels. In all, Hamilton estimated that powering Apple’s 500,000-square-foot data center would require about 6.5 square miles (16.8 square kilometers) of solar panels. Hamilton said that setting aside that much space, particularly in the densely populated regions where many data centers are built, is “ridiculous on its own” and would be especially difficult because that land couldn’t have any trees or structures that could cast shadows on the panels.
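Hamilton’s arithmetic is easy to verify. Here is a rough sketch in Python using only the figures above, the 362-to-1 area ratio and the 500,000-square-foot building; the unit conversions are standard:

    # Rough check of Hamilton's solar estimate for the Maiden, NC data center
    data_center_sqft = 500_000            # Apple's iCloud facility
    panel_sqft_per_dc_sqft = 362          # Hamilton's ratio of panel area to data-center area

    panel_area_sqft = data_center_sqft * panel_sqft_per_dc_sqft
    SQFT_PER_SQ_MILE = 5280 ** 2          # 27,878,400 square feet per square mile
    panel_area_sq_miles = panel_area_sqft / SQFT_PER_SQ_MILE
    panel_area_sq_km = panel_area_sq_miles * 2.59  # square kilometers per square mile

    print(f"~{panel_area_sq_miles:.1f} sq mi (~{panel_area_sq_km:.1f} sq km) of panels")
    # -> about 6.5 square miles (16.8 square kilometers), matching Hamilton's figure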

Using wind energy to power data centers would be equally problematic. To see why, consider the Facebook data center in Prineville, Oregon, which needs 28 megawatts of power. The areal power density of wind energy—and it doesn’t matter where you put your wind turbines—is about one watt per square meter. Therefore, powering the Facebook data center with wind alone would require about 28 million square meters of land. That’s 28 square kilometers, or nearly 11 square miles—about half the size of Manhattan Island, or about eight times the size of New York City’s Central Park.
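The wind arithmetic is just as simple. A quick sketch, using the 28-megawatt load and the one-watt-per-square-meter power density cited above; Central Park’s area of roughly 3.4 square kilometers is added here only for the comparison:

    # Land needed to supply the Prineville data center from wind alone
    load_watts = 28e6                 # 28 megawatts
    wind_watts_per_sq_m = 1.0         # areal power density of wind cited above

    land_sq_m = load_watts / wind_watts_per_sq_m
    land_sq_km = land_sq_m / 1e6
    land_sq_miles = land_sq_km / 2.59
    central_park_sq_km = 3.41         # approximate area of Central Park (assumption)

    print(f"~{land_sq_km:.0f} sq km (~{land_sq_miles:.0f} sq mi), "
          f"or ~{land_sq_km / central_park_sq_km:.0f} Central Parks")
    # -> about 28 sq km, roughly 11 sq mi, or about 8 Central Parks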

The mismatch between the power demands of Big Data and the renewable-energy darlings of the moment is obvious. US data centers are now consuming about two percent of domestic electricity. That amounts to about 86 terawatt-hours of electricity per year, or about as much as is consumed by the Czech Republic, a country with 10 million residents. Put another way, US data centers are consuming about 47 times as much electricity as was produced by all the solar-energy projects in America in 2011.
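For scale, that 47-to-1 comparison implies that all US solar projects together produced a little under two terawatt-hours in 2011. A quick back-solve, using nothing beyond the figures in the paragraph above:

    # What the 47-to-1 comparison implies about 2011 US solar output
    data_center_twh = 86.0     # annual US data-center consumption cited above
    ratio = 47                 # data-center use versus 2011 US solar output

    implied_solar_twh = data_center_twh / ratio
    print(f"Implied 2011 US solar output: ~{implied_solar_twh:.1f} TWh")
    # -> roughly 1.8 terawatt-hours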

The hard reality is that our iPhones, Droids, laptops, and other digital devices require huge amounts of electricity. According to Jonathan Koomey, a research fellow at Stanford University who has worked on the issue of power consumption in the information-technology sector for many years, data centers consume about 1.3 percent of all global electricity. That quantity of electricity, about 277 terawatt-hours per year, is nearly the same as what is consumed by Mexico. While that’s a lot of energy, Koomey’s estimate of 277 terawatt-hours doesn’t account for the energy used by home computers, TVs, iPads, iPods, video monitors, routers, DVRs, and mobile phones.
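Koomey’s two numbers are internally consistent: 277 terawatt-hours at a 1.3 percent share implies world electricity use of a bit over 21,000 terawatt-hours a year. A sketch of that back-solve, again using only the figures above:

    # World electricity total implied by Koomey's data-center figures
    data_center_twh = 277.0    # global data-center consumption cited above
    share = 0.013              # 1.3 percent of all global electricity

    implied_world_twh = data_center_twh / share
    print(f"Implied world electricity use: ~{implied_world_twh:,.0f} TWh per year")
    # -> about 21,300 terawatt-hours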

In 2013, Mark Mills, a colleague at the Manhattan Institute, wrote a report called “The Cloud Begins with Coal,” which put the total even higher. Mills estimated that when all the energy used for telephony, the Internet, data storage, and the manufacturing of information-technology hardware is included, about 7 percent of all global electricity is being used in our effort to stay connected. That now amounts to about 1,500 terawatt-hours per year, or nearly as much electricity as is used annually by Japan and Germany combined.
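Mills’s estimate squares with that same world total: 7 percent of roughly 21,000 terawatt-hours is on the order of 1,500 terawatt-hours. A sketch of the check, with the world total carried over from the back-solve above rather than independently sourced:

    # Check that Mills's 7 percent share matches the ~1,500 TWh figure
    world_twh = 21_300         # carried over from the Koomey back-solve (assumption)
    ict_share = 0.07           # 7 percent, per "The Cloud Begins with Coal"

    print(f"~{world_twh * ict_share:,.0f} TWh per year")
    # -> roughly 1,500 terawatt-hours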

Regardless of the precise amount of energy being used to run our digital communications network, it’s readily apparent that communications-related electricity demand is growing rapidly. Between 2005 and 2010, global use of electricity in data centers grew by about 56 percent. That’s more than three times as fast as the growth in global electricity consumption over that same time frame. Demand for electricity to power data centers will continue to grow as more people, and more things, get connected to the Internet. A plethora of digital devices – ranging from smartphones to GPS-enabled locators on shipping containers – is connecting to the network. In 2012, Intel estimated that there were about 2.5 billion devices connected to the Web. By 2015, it expects there will be 15 billion Net-connected devices. Ericsson predicts 50 billion by 2020.

Again, the exact numbers are not as important as the trend. The push for Smaller Faster digital devices requires moving ever-more information. The more computing power we use, the more electricity we consume. Big Data has always demanded Big Electron. And as we’ve managed to move more and more bits, we’ve seen a corresponding increase in the demand for electricity.
