In 2008, Satoshi Nakamoto (a pseudonym) announced plans to build a new electronic currency called Bitcoin, one that would be entirely peer-to-peer and require no third-party intermediaries. To obtain new Bitcoins, users would install programs on their computers called “Bitcoin miners” that solve complex mathematical puzzles. Because the puzzles are difficult and solvable only after heavy computation, coins would be introduced into the system slowly over time and distributed essentially at random among miners. These mining programs search for a sequence of data that produces a particular pattern; when such a sequence is found, the miner is rewarded with a small amount of Bitcoins. Simply put, users could make Bitcoins by devoting their computers’ processing power to solving these puzzles and generating new coins. Starting in 2009, the rate at which new Bitcoins are created was designed to halve roughly every four years until 2140, when the total supply will reach its maximum of 21 million coins and no more Bitcoins will enter circulation.
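The puzzle-solving described above is a proof-of-work search: miners repeatedly hash candidate data until the result matches a required pattern. A minimal sketch in Python, using a simplified zero-prefix target rather than Bitcoin’s actual difficulty encoding (the function and parameter names here are illustrative, not Bitcoin’s):

```python
import hashlib

def mine(block_data: str, difficulty: int) -> int:
    """Try nonces until the SHA-256 digest starts with `difficulty` zero hex digits."""
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce  # the winning nonce is verifiable with a single hash
        nonce += 1

# Each additional required zero makes the search about 16x harder, which is
# how puzzle difficulty (and thus the rate of new coins) can be tuned.
# The 21 million cap follows from the reward schedule: 50 coins per block,
# halving every 210,000 blocks (roughly every four years).
total_supply = sum(210_000 * 50 / 2**halving for halving in range(33))
print(round(total_supply))
```

Note the asymmetry that makes this work as a currency mechanism: finding the nonce takes an enormous number of attempts, but anyone can verify the result with one hash.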
This system worked well at first, but since Bitcoin mining became widely practiced in 2009, the easy puzzles have been solved, and ever more processing power has been needed to solve the increasingly difficult ones. Though there are other ways to obtain Bitcoins (buying them with other currencies, trading them for products and services, or collecting processing fees), mining is still the only way to introduce new coins into the system. Because Bitcoin mining demands ever-increasing computing power for diminishing returns, the low-powered computers found in homes and offices are no longer up to the task of virtual mining.
In April 2013, Mark Gimein at Bloomberg published an article calling Bitcoin mining an “environmental disaster” that consumes 982 megawatt hours a day, enough power to run 31,000 US homes. Additionally, the value of Bitcoins is subject to massive fluctuations in the currency trading markets, threats by various governments to shut down the experiment, and hacker attacks on the Bitcoin system. Just three days before Gimein published his article, Bitcoin values plummeted by 77% after hackers and new users put pressure on the system. A month later, US authorities seized the world’s largest Bitcoin exchange, and earlier this week the IRS declared Bitcoins taxable income. While Bitcoin has made a few people wealthy, Bitcoin miners are quite literally converting thousands of megawatt hours into virtual currency, the future of which is extremely uncertain. Just like mining for gold in the real world, mining for virtual coins presents serious political, economic, and environmental issues.
While Bitcoin may be one of the most obvious challenges to the virtual-material divide, it may not be the most significant. In September 2012, the New York Times estimated that digital data centers worldwide use about 30 billion watts of electricity (roughly the output of 30 nuclear power plants), with the US responsible for about one-third of that usage. According to Google, a single search uses about 0.0003 kWh (1080 joules) of energy, roughly the same as running a 60W light bulb for 17 seconds. Another estimate found that a 140-character Tweet consumes about 90 joules, enough to power that same light bulb for about 1.4 seconds.
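These per-action figures are straightforward unit conversions (1 kWh is 3.6 million joules, and a 60 W bulb draws 60 joules per second). A quick back-of-the-envelope check, which lands slightly above the published figures since those presumably round from more precise underlying estimates:

```python
JOULES_PER_KWH = 3_600_000  # 1 kWh = 3.6 million joules

# Google's figure: 0.0003 kWh per search
search_joules = 0.0003 * JOULES_PER_KWH  # about 1080 J
search_bulb_seconds = search_joules / 60  # a 60 W bulb for ~18 s

# The tweet estimate: 90 J per 140-character tweet
tweet_bulb_seconds = 90 / 60  # the same bulb for ~1.5 s

print(search_joules, search_bulb_seconds, tweet_bulb_seconds)
```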
But what about when no one is actively using these services? A McKinsey & Company report estimated that an average data center uses only 6 to 12% of its electricity for computation, while nearly 90% goes into keeping servers idling in case a surge in activity threatens to crash operations. Companies keep their facilities running around the clock at maximum capacity, regardless of demand, because they fear what might happen if their services are interrupted.
Earlier this month, Google hosted a summit at the Googleplex to consider “How green is the Internet?” In his keynote address, energy researcher Jon Koomey estimated that the Internet is probably responsible for about 10% of the world’s total electricity consumption. Koomey, who has been studying the material impact of the Internet since 2000, noted that the numbers are difficult to track, but suggested that companies that have made their names collecting data could do a better job tracking electrical use. Eric Masanet from Northwestern University found this lack of data troubling enough that he launched a publicly available model for assessing the energy effects of cloud computing called CLEER.
Koomey also noted that moving to digital communications and networks has reduced overall electricity use. For example, Koomey argues, businesses and organizations reduce their electricity use by allowing companies like Google to host their email servers rather than running their own. The subtext of many of the “How green is the Internet?” keynotes was fairly obvious: if you care about the environment, move your data and processing to the cloud. Google made this connection clear when the company posted on its blog about the summit and cited a study (sponsored by Google) from the Lawrence Berkeley National Laboratory that found that migrating all US office workers to the cloud could save up to 87% of IT energy use (enough to power the city of Los Angeles for a year).
From an environmental perspective, Google has done its best to make migrating to the cloud attractive. Google is one of the largest investors in renewable energy, has commissioned several wind farms, and uses more efficient cooling towers for its servers than most Internet companies (though currently only 33% of its energy use is renewable). Other companies are investing in clean and green technologies too. Last year Facebook opened a data center in a building designed to make its servers 40% more energy efficient, and this year it opened a data center in Sweden that runs entirely on hydropower. Apple states that its data centers are completely powered by solar, wind, hydro, and geothermal energy. Microsoft has pledged to become carbon neutral in 2013 and earned its place on the Environmental Protection Agency’s 2013 list of the top 10 renewable energy-using organizations in the US, along with Intel, Starbucks, Wal-Mart, and Lockheed Martin.
Recently, several cloud computing companies like Cloud Hashing have begun offering services that allow users to outsource their Bitcoin mining to cloud servers. Bitcoin mining isn’t the only service being migrated to the cloud. Last year, Adobe announced its decision to begin offering its Creative Suite of products, like the popular Photoshop and Illustrator, exclusively through its cloud service. Adobe reported this week that 700,000 users have signed up for its “Creative Cloud” service, and the company hopes to have 4 million users by 2015. Adobe’s move to the cloud was primarily motivated by its need to combat software piracy and to roll out updates more quickly, not necessarily by a desire to decrease end-user energy consumption. Other data-gathering companies have a stronger interest in collecting, storing, and mining user data. Last week, Google caused some controversy and user confusion after completing an update to its mobile Gmail app, making archive (rather than delete) the default setting for mobile users. Google didn’t remove the delete option—it’s still available through menu actions—but the company is clearly nudging users away from deleting emails. Much like mining Bitcoins, mining user data or letting users search archived messages requires sifting through massive amounts of data looking for particular patterns or text.
While Adobe’s new cloud services might use less energy than individual computers running the software would require, and companies like Apple and Google are moving to renewable energy, the lack of transparency about energy usage prevents users from knowing the actual costs of cloud computing. Many people go out of their way to turn off the lights when they leave a room or to recycle soda cans, but become angry when a site loads slowly or they can’t instantly find an email archived four years ago in Gmail. The data centers that store and process old emails and tweets already use more than 2% of the US electricity supply (more than the notoriously energy-demanding paper industry). When one considers how much energy is involved in Internet use, “the cloud” rapidly comes down to Earth.