Internet Energy Analysis Pitfalls
When it comes to understanding and predicting trends in energy use, the internet is a tough nut to crack. So say energy researchers Eric Masanet, of UC Santa Barbara, and Jonathan Koomey, of Koomey Analytics. The two just published a peer-reviewed commentary in the journal Joule discussing the pitfalls that plague estimates of the internet’s energy and carbon impacts.
The paper describes how these errors can lead well-intentioned studies to predict massive energy growth in the information technology (IT) sector, which often doesn’t materialize. “We’re not saying the energy use of the internet isn’t a problem, or that we shouldn’t worry about it,” Masanet explained. “Rather, our main message is that we all need to get better at analyzing internet energy use and avoiding these pitfalls moving forward.”
Masanet, the Mellichamp Chair in Sustainability Science for Emerging Technologies at UCSB’s Bren School of Environmental Science & Management, has researched energy analysis of IT systems for more than 15 years. Koomey, who has studied the subject for over three decades, was for many years a staff scientist and group leader at Lawrence Berkeley National Lab, and has served as a visiting professor at Stanford University, Yale University and UC Berkeley. The article, which has no external funding source, arose out of their combined experiences and observations and was motivated by the rising public interest in internet energy use. Although the piece contains no new data or conclusions about the current energy use or environmental impacts of different technologies and sectors, it raises some important technical issues the field currently faces.
Masanet and Koomey’s work involves gathering data and building models of energy use to understand trends and make predictions. Unfortunately, IT systems are complicated and data is scarce. “The internet is a really complex system of technologies and it changes fast,” Masanet said. What’s more, in the competitive tech industry, companies often guard energy and performance data as proprietary trade secrets. “There’s a lot of engineering that goes into their operations,” he added, “and they often don’t want to give that up.”
Four fallacies
This feeds directly into the first of four major pitfalls the two researchers identified: oversimplification. Every model is a simplification of a real-world system. It has to be. But simplification becomes a pitfall when analysts overlook important aspects of the system. For example, models that underestimate improvements to data center efficiency often overestimate growth in their energy use.
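To make the arithmetic concrete, here is a minimal sketch with entirely invented numbers (the project_energy helper and every rate below are assumptions for illustration, not figures from the Joule commentary). It compounds workload growth against annual efficiency gains and shows how a model that assumes flat efficiency overshoots:

```python
# Hypothetical illustration only: all rates and the 100 TWh baseline are invented.
# Energy grows with workload but shrinks as energy per unit of workload improves.

def project_energy(years, workload_growth, efficiency_gain, base_energy_twh=100.0):
    """Compound annual workload growth against an annual efficiency improvement."""
    energy = base_energy_twh
    for _ in range(years):
        energy *= (1 + workload_growth) * (1 - efficiency_gain)
    return energy

# A simplified model: 20%/yr workload growth, efficiency assumed flat.
naive = project_energy(years=10, workload_growth=0.20, efficiency_gain=0.0)

# The same workload growth, but crediting a 15%/yr efficiency improvement.
with_gains = project_energy(years=10, workload_growth=0.20, efficiency_gain=0.15)

print(f"Ignoring efficiency gains:  {naive:.0f} TWh")      # ~619 TWh after 10 years
print(f"Including efficiency gains: {with_gains:.0f} TWh") # ~122 TWh after 10 years
```

With these made-up rates, the simplified projection overshoots by roughly a factor of five, the kind of gap the authors say can open up when efficiency trends are left out of a model.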
Some simplification is understandable, said Koomey, since often researchers simply don’t have enough data. But too much simplification runs the risk of producing inaccurate results, he stressed.
The second pitfall is essentially the conflation of internet usage with energy demand: data traffic and energy use are not equivalent. “It seems rational to say that a 20% increase in data traffic would lead to a 20% increase in the energy use of the internet,” Masanet said, “but that’s not the way the system works.” Networks have high fixed energy use, so energy demand doesn’t change much when data traffic changes.
Imagine data throughput on the internet as passengers on a train. Most of the energy goes into moving the train. Doubling the number of people on the train won’t double the amount of energy the train requires. “So there’s this smaller, marginal effect that’s well known to network engineers but is not always known among energy analysts,” Masanet said.
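A minimal sketch makes the train analogy concrete. The split between fixed and marginal energy below is invented for illustration and does not come from the paper:

```python
# Hypothetical illustration of high fixed energy use; all numbers are invented.

FIXED_ENERGY_TWH = 90.0      # assumed energy to keep the network running, regardless of load
MARGINAL_TWH_PER_EB = 0.05   # assumed incremental energy per exabyte carried

def network_energy(traffic_eb):
    """Simple linear model: total energy = fixed baseline + marginal cost of traffic."""
    return FIXED_ENERGY_TWH + MARGINAL_TWH_PER_EB * traffic_eb

baseline_traffic_eb = 200.0                          # assumed annual traffic
before = network_energy(baseline_traffic_eb)
after = network_energy(baseline_traffic_eb * 1.20)   # traffic up 20%

print(f"Traffic +20% -> energy +{100 * (after - before) / before:.1f}%")
# With these assumed numbers, energy rises only about 2%, not 20%.
```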
The pace and nature of changes in internet technologies and data demand bring about the third pitfall: projecting too far into the future. In a retrospective study published in 2020, Masanet, Koomey and their colleagues found that earlier projections had overestimated data center energy growth; those projections didn’t foresee large increases in IT virtualization or shifts of workloads to the cloud.
Not only do we develop new and improved technologies, but industry structures and consumer demands often change as well. Just five years ago, for instance, few people could have predicted the massive amounts of processing power now devoted to bitcoin mining. That said, the researchers caution against extrapolating such early growth trends too far into the future. “When the internet was growing rapidly in the late 1990s, some analysts projected that IT would account for half of U.S. electricity use within a decade,” Koomey said.
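A toy calculation shows how quickly naive compounding runs away. The starting share and growth rate below are invented for illustration; they are not the figures behind those 1990s projections:

```python
# Hypothetical illustration of runaway extrapolation; both numbers are invented.

it_share = 0.03        # assume IT starts at 3% of national electricity use
annual_growth = 0.40   # assume a 40%/yr growth spurt observed in the early years

for year in range(1, 11):
    it_share *= 1 + annual_growth
    print(f"Year {year:2d}: extrapolated IT share = {it_share:.0%}")

# Compounding the early rate pushes the share past 50% around year 9, and past
# 100% soon after, an outcome the real system never allows: growth slows as
# markets saturate and technologies become more efficient.
```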
Given all this uncertainty, it’s no wonder that analysts can miss the mark in their predictions. IT changes so rapidly that projections simply won’t be accurate beyond a few years, Masanet said. In contrast, projecting decades out is common in other domains of energy analysis, where it can be crucial for planning power grid capacity or transportation infrastructure. That norm can create unrealistic expectations for IT energy forecasts, because IT evolves far more rapidly and unpredictably.
The final pitfall the duo identified stems from a lack of proper scope: overgeneralization. When data is scarce, it’s tempting to apply growth rates from one part of a system to the system as a whole. Masanet offered the rise of cloud computing as one example. Although the energy use of many cloud companies grew rapidly over the last decade, this wasn’t the whole picture for data centers: the energy use of traditional data centers fell concurrently as that part of the sector shrank, keeping the sector’s overall energy use in check.
Similarly, while the rise in streaming video may drive up energy use for data centers, it could reduce home energy use by decreasing the number of TV set-top boxes, Koomey explained.
“You’ve got to look at the whole system and avoid extrapolating from just one part,” Masanet said.
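As a purely hypothetical sketch of that whole-system arithmetic (the segment sizes and rates below are invented, not the actual data center figures), rapid growth in one segment can coexist with a nearly flat total:

```python
# Hypothetical illustration of the overgeneralization pitfall; all numbers are invented.

cloud_twh = 20.0          # assumed starting energy use of cloud data centers
traditional_twh = 80.0    # assumed starting energy use of traditional data centers

CLOUD_GROWTH = 0.15         # assumed 15%/yr growth in the cloud segment
TRADITIONAL_DECLINE = 0.12  # assumed 12%/yr decline in the traditional segment

for _ in range(10):
    cloud_twh *= 1 + CLOUD_GROWTH
    traditional_twh *= 1 - TRADITIONAL_DECLINE

print(f"Cloud after 10 years:       {cloud_twh:.0f} TWh")        # ~81 TWh, roughly 4x
print(f"Traditional after 10 years: {traditional_twh:.0f} TWh")  # ~22 TWh
print(f"Sector total:               {cloud_twh + traditional_twh:.0f} TWh")  # ~103 TWh vs. 100 at the start
```

Extrapolating the cloud segment’s growth to the whole sector would have predicted a quadrupling; the whole-system total barely moves.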
Going forward
On top of scarce data and a complex system, tech companies and analysts lack any standards for reporting internet energy use. Automobiles have miles per gallon, the agreed-upon efficiency metric in the U.S., but there’s no analogue for data centers yet. One reason is that every data center is different: It’s difficult to compare a center primarily engaged in scientific computing with another that mostly handles web hosting, Masanet pointed out.
Congress recently passed the Energy Act of 2020, which has provisions for data centers. “It’s a positive sign that we’re moving toward having those benchmarks that could enable more reporting from companies, at least in the U.S.,” Masanet said.
“One thing the research community can do is help develop these metrics so that if companies do want to report and still stay confidential, they can have standard, agreed-upon, scientific metrics to use,” he added.
“The world needs better IT energy predictions, and the analysis community needs to get a lot better at producing these, ourselves included,” Masanet continued. “We’ve encountered these pitfalls in our own work.
“Now we need to recognize them and figure out how to avoid them in the future so that we all can provide more rigorous outputs, because those outputs are becoming more and more important.”
Koomey emphasized the importance of exercising restraint when confronting complex systems with persistent data gaps. While it can be appealing to make assumptions when data doesn’t exist, that’s not the best approach, he said. It’s better to collect more data, acknowledge caveats and remain modest when making claims.
“Our goal is to promote accurate analysis of information technology, so that policymakers can make judgments based on reality rather than misconceptions,” he said. “Data on IT electricity use will always lag behind reality because much relevant data are closely-guarded secrets, and these systems change so quickly. Analysts need to accept these inherent limitations and not make strong claims based on speculation or too many assumptions.”