Data centers use at least $7.2 billion in electricity globally
mongabay.com
February 15, 2007
U.S. data centers consume 45 billion kilowatt-hours of electricity per year, according to a new study commissioned by computer chip maker AMD.
Jonathan Koomey, a staff scientist at Lawrence Berkeley National Laboratory and a consulting professor at Stanford University, calculated that in 2005 total data center electricity consumption in the U.S., including servers, cooling and auxiliary equipment, was approximately 45 billion kWh, resulting in total utility bills amounting to $2.7 billion. Globally, data centers used $7.2 billion in electricity.
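The U.S. figures above imply an average electricity rate, which can be sanity-checked with quick arithmetic (the kWh and dollar totals are from the study; the per-kWh rate is derived here, not stated in the article):

```python
# Figures reported for U.S. data centers in 2005
us_consumption_kwh = 45e9   # 45 billion kWh per year
us_utility_bills = 2.7e9    # $2.7 billion in total utility bills

# Implied average electricity price, dollars per kWh
avg_price = us_utility_bills / us_consumption_kwh
print(f"${avg_price:.2f}/kWh")  # prints $0.06/kWh
```

A rate of about six cents per kilowatt-hour is consistent with typical mid-2000s U.S. commercial electricity prices, suggesting the two figures are internally consistent.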
Koomey’s report also estimates that server energy use has doubled in the past five years and without energy efficiency improvements will continue to climb.
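A doubling over five years corresponds to a compound annual growth rate of roughly 15 percent, which can be derived as follows (the doubling claim is the report's; the annualized rate is computed here for illustration):

```python
# If energy use doubled over five years, the implied
# compound annual growth rate (CAGR) is 2^(1/5) - 1.
years = 5
cagr = 2 ** (1 / years) - 1
print(f"{cagr * 100:.1f}% per year")  # prints 14.9% per year
```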
“Though we have long known that data centers worldwide consume a significant amount of energy, AMD believes Dr. Koomey’s findings are a wake-up call not just for the IT industry, but also for global business, government and policy leaders,” said Randy Allen, AMD’s corporate vice president of the Server and Workstation Division, in a keynote address at the LinuxWorld OpenSolutions Summit in New York. “This study demonstrates that unchecked demand for data center energy use can constrain growth and present real business challenges. New generations of energy-efficient servers are now able to help provide IT departments with a path to reduce their energy consumption while still achieving the performance they require.”
Despite the large numbers, Koomey says his study likely underestimates actual power consumption since it doesn’t include unpublicized server installations such as Google’s estimated 450,000 servers.
Growth in Internet video and other bandwidth-intensive media is helping to fuel rising power demand from data centers.