Trends in Energy Estimates for Computing in AI/Machine Learning Accelerators, Supercomputers, and Compute-Intensive Applications

Reference Type: Conference Paper

Shankar, Sadasivan, and Albert Reuther. 2022. “Trends in Energy Estimates for Computing in AI/Machine Learning Accelerators, Supercomputers, and Compute-Intensive Applications.” In 2022 IEEE High Performance Extreme Computing Conference (HPEC), 1–8. https://doi.org/10.1109/HPEC55821.2022.9926296.

We examine the computational energy requirements of different systems driven by geometrical scaling laws (Moore's law and Dennard scaling) and by the increasing use of Artificial Intelligence/Machine Learning (AI/ML) over the last decade. With more scientific and technology applications based on data-driven discovery, machine learning methods, especially deep neural networks, have become widely used. To enable such applications, both hardware accelerators and advanced AI/ML methods have led to the introduction of new architectures, system designs, algorithms, and software. Our analysis of energy trends indicates three important observations: 1) energy efficiency due to geometrical scaling is slowing down; 2) energy efficiency at the bit level does not translate into efficiency at the instruction level or at the system level for a variety of systems, especially for large-scale AI/ML accelerators or supercomputers; 3) at the application level, general-purpose AI/ML methods can be computationally energy intensive, offsetting the gains in energy from geometrical scaling and special-purpose accelerators. Further, our analysis provides specific pointers for integrating energy efficiency with performance analysis to enable high-performance and sustainable computing in the future.
