In July, Koomey released a report that showed, among other findings, that the electricity used in data centers worldwide increased by about 56 percent from 2005 to 2010—a much lower rate than the doubling that was observed from 2000 to 2005.
While better energy efficiency played a part in this change, total data-center electricity use also came in below the 2010 forecast because fewer new servers were installed than expected. Technologies such as virtualization allowed existing systems to run more programs simultaneously, reducing the need for new hardware. Koomey notes that data-center computers rarely run at peak power; most are, in fact, "terribly underutilized," he says.
To Koomey, the most interesting aspect of the trend is what it suggests about the future possibilities of computing. The theoretical limits are still far away, he says. In 1985, the physicist Richard Feynman analyzed the electricity needs of computers and estimated that, setting aside new technologies such as quantum computing, efficiency could theoretically improve by a factor of 100 billion before hitting a limit. Since then, efficiency has improved by a factor of only about 40,000. "There's so far to go," says Koomey. "It's only limited by our cleverness, not the physics."