To develop a new Moore's Law for computing energy efficiency.

Data centers consume about 1.5% of global energy and, if current trends hold, the volume of data will grow roughly 1000x by 2025. Computing efficiency, meanwhile, is expected to improve only about 25x over the same period, leaving a roughly 40x gap between the growth of demand and the growth of efficiency.

Institute researchers are working to close this looming gap by pursuing solutions that reduce the energy use associated with cooling loads, inefficient server utilization, and wasteful computing processes.


Energy-proportional Computation: Developing software to make server energy use proportional to load; developing memristors for memory, storage and computation; and optimizing components such as hard drives, RAM and microchips (a minimal sketch of the energy-proportionality idea follows this list).
Cloud Computing: Developing open-source cloud infrastructures and application platforms for researching and improving the capabilities of cloud-based networks.
Wireless Networks: Developing low-cost, energy-efficient wireless networks that automatically adjust to service disruptions, and using wireless links in data centers to adapt to bursty traffic.
Cooling Technologies: Developing thermal management devices to spot-cool computer chips and reduce the demand for server-room cooling.
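
As a rough illustration of the energy-proportionality idea referenced above, the following Python sketch compares an always-on server fleet with one that powers on only as many servers as the offered load requires. It is a toy model, not Institute software: the per-server power figures, capacity and fleet size are assumed values chosen only to make the idle-power waste visible.

# Toy illustration of energy-proportional computing (assumed values, not Institute code).
import math

IDLE_WATTS = 150.0       # assumed per-server draw at 0% utilization
PEAK_WATTS = 300.0       # assumed per-server draw at 100% utilization
SERVER_CAPACITY = 100.0  # assumed requests/s one server can handle
FLEET_SIZE = 10

def server_power(utilization: float) -> float:
    """Simple linear power model: idle draw plus an active component."""
    return IDLE_WATTS + (PEAK_WATTS - IDLE_WATTS) * utilization

def always_on_power(load: float) -> float:
    """Every server stays on; the load is spread evenly across the fleet."""
    utilization = load / (FLEET_SIZE * SERVER_CAPACITY)
    return FLEET_SIZE * server_power(utilization)

def proportional_power(load: float) -> float:
    """Only enough servers to carry the load are powered on."""
    active = max(1, math.ceil(load / SERVER_CAPACITY))
    utilization = load / (active * SERVER_CAPACITY)
    return active * server_power(utilization)

if __name__ == "__main__":
    for load in (50.0, 200.0, 500.0, 1000.0):  # requests per second
        print(f"load={load:6.0f} req/s  "
              f"always-on={always_on_power(load):6.1f} W  "
              f"proportional={proportional_power(load):6.1f} W")

With these assumed numbers, the always-on fleet draws over 1.5 kW at light load while the load-proportional fleet draws a few hundred watts; the two converge only at peak load, which is why idle power dominates data-center waste when energy use is not proportional to work.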

Research Projects