Data centers already consume about 5% of global electrical energy, and if current trends hold, their share of global energy consumption may increase substantially in coming years. Artificial Intelligence / Machine Learning (AI/ML) workloads make up an ever-growing component of this energy and computational footprint. Researchers at the Institute for Energy Efficiency are attacking fundamental issues in reducing the energy footprint of data center computing, of moving data within and between data centers, and of training and executing AI/ML models.
Grand Challenge
Multiple orders of magnitude improvement in Data Center energy efficiency.
Data centers are projected to see a 1000x increase in the volume of data, with corresponding growth in computing and communications loads, by the middle of this decade. Handling this dramatic increase in throughput while reducing data center energy consumption at every level is a significant challenge that will require a multi-faceted approach. Institute researchers are working to improve the fundamental efficiency of algorithms, computing and systems architecture, and communications and interconnect to achieve these aggressive goals.
Institute researchers are developing new computing and systems architecture solutions that reduce the energy use associated with cooling loads, inefficient server use, and wasteful computer processes. Our researchers are also developing new computing hardware and architectures that address AI processing problems in a dramatically more efficient way.
IEE’s world-class photonics faculty have pioneered optical on-chip interconnects that are widely licensed and used in data centers around the world. These systems have dramatically improved the efficiency of data center interconnects and communications worldwide, and our researchers’ new developments and emerging research promise even more progress in coming years.
Grand Challenge
1000X more efficient AI/machine learning training and processing.
Training AI/machine learning models, which involves processing vast amounts of data, is an energy-intensive process. One recent estimate in the literature suggested that training a single typical AI/ML model creates a carbon dioxide (CO2) footprint of 626,000 pounds, roughly five times the lifetime emissions of the average American car. That same paper asserted that using a state-of-the-art language model for natural language processing is comparable to the CO2 emissions of one person over 30 years. Both findings provide a jarring quantification of AI’s environmental impact.
IEE Machine Learning researchers have demonstrated significant early results in improving the energy efficiency of training AIs via an algorithmic and systems approach.
IEE’s researchers are also designing and fabricating brain-inspired hardware for solving some of the hardest AI and optimization problems. This effort employs mixed-signal neuromorphic circuits with integrated metal-oxide memristors. Such circuits enable very dense, fast, and energy-efficient implementation of the most common operations in bio-inspired optimization algorithms. Preliminary results from this group estimate that the proposed hardware implementation is 70 times faster and 20,000 times more energy efficient than the most efficient conventional approach.
Grand Challenge
Develop the computational and communication energy efficiencies necessary to achieve a fully instrumented society using the Internet of Things (IoT).
The Internet of Things (IoT) embeds ordinary physical objects in our environment with sensing, control, communications, and computing capabilities. By making it easy and economical to collect, mine, and analyze information (extracting inferences and predictions) from any physical object and location, IoT has the potential for unparalleled societal impact, beyond even that wrought by Internet search and e-commerce. IoT will facilitate data-driven automation and real-time sensor analysis and tracking to enhance situational awareness and effective decision making, literally extending human perception and control of the physical world through digital infrastructure.
However, the power and energy requirements that must be met to instrument every “thing” and then connect it to the Internet are staggering. For example, many accounts in the popular press predict one trillion connected devices on the Internet within the next decade. If each device draws 1 watt (the typical power consumption of a cell phone), the fleet would require roughly 8,760 TWh/year of additional electricity, along with the concomitant ambient heat generation and carbon footprint. Further, the power infrastructure necessary to deliver electricity to remote areas is substantially more expensive than the instrumentation and actuation devices it will power.
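The fleet-energy figure above follows from simple arithmetic; a minimal back-of-the-envelope check, using the assumptions stated in the text (one trillion devices at 1 watt each, always on):

```python
# Back-of-the-envelope check of the IoT fleet energy figure.
# Assumptions from the text: 1 trillion always-on devices, 1 W average draw.
HOURS_PER_YEAR = 24 * 365  # 8,760 hours

def fleet_energy_twh_per_year(num_devices: float, watts_per_device: float) -> float:
    """Annual energy in terawatt-hours for a fleet of always-on devices."""
    total_watts = num_devices * watts_per_device
    watt_hours = total_watts * HOURS_PER_YEAR
    return watt_hours / 1e12  # 1 TWh = 1e12 Wh

print(fleet_energy_twh_per_year(1e12, 1.0))  # → 8760.0
```

The exact result is 8,760 TWh/year, which the text rounds to roughly 8,700.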
Institute Professors Krintz and Wolski are developing the power-optimized systems necessary to make the Internet of Things, and the societal benefits it will bring, a feasible reality. They have developed CSPOT, a portable, multi-scale programming infrastructure for cloud-based IoT applications that is between one and three orders of magnitude more power efficient than current commercial IoT cloud systems. The CSPOT software is also freely available as open source, stimulating a community of IoT researchers and developers who are actively pursuing the ubiquity of IoT. As a proving ground for this research, Krintz and Wolski have created the UCSB SmartFarm project, which is developing the power-efficient IoT systems and analytics necessary to implement precision agriculture in rural locations where power and computational infrastructure are not available. SmartFarm provides Institute researchers with a rich set of test applications that require highly power-optimized and durable systems to enable new sustainable farming techniques.
Research Highlights
Energy Efficient Computation and Classification Professors Dmitri Strukov and Tim Sherwood - When extremely low-energy processing is required, the choice of data representation makes a tremendous difference, and we have much to learn from nature in this regard. For example, the brain appears to use some form of “time-based” representation: encodings where the temporal relationship between spike arrivals carries useful information. See, for example, the 2019 ASPLOS Best Paper, Boosted Race Trees for Low Energy Classification.
UCSB Center for Responsible Machine Learning (CRML) Director William Wang - The Center for Responsible Machine Learning connects cutting-edge AI research with its societal impacts. In their IEE-related research, Professor Wang and colleagues in CRML are applying AI and machine learning to help solve societal energy efficiency problems and are using an algorithmic and systems approach to make AI/ML training and inference dramatically more energy efficient.
Analytical Architectural Modelling Including Energy vs. Performance Tradeoffs Professor Tim Sherwood - Developed CHARM, a language for Closed-Form High-Level Architectural Modelling, and is applying CHARM to the design of more energy-efficient computing architectures.
Energy Efficient Programming for the Cloud Professors Chandra Krintz and Rich Wolski - Developed CSPOT, a portable, multi-scale programming infrastructure for cloud-based IoT applications that is between one and three orders of magnitude more power efficient than current commercial IoT cloud systems. The CSPOT software is also freely available as open source, stimulating a community of IoT researchers and developers who are actively pursuing the ubiquity of IoT. Professor Wolski is also part of a team focused on The Zero-Carbon Cloud with colleagues at the University of Chicago.