November 18, 2024

Power Surge

Much has been said about the promise and peril of artificial intelligence (AI). Regardless of the debate’s outcome, one thing is not in question: we need more power.

The proliferation of AI technology has triggered unprecedented demand for data processing, computation, and storage. Data center growth has been robust, at 6% per year, and energy consumption at these facilities was already significant. With the shift to AI-focused buildout, that energy demand will climb much faster, because AI computing is four times as energy-intensive as conventional computing. As models increase in size and sophistication, their incremental power requirements grow, too.

Training AI models, particularly deep learning models, involves processing massive datasets across thousands of interconnected hardware units. Training just one large language model can consume hundreds of megawatt-hours of electricity, equivalent to the annual electricity use of dozens of households. As these models grow in scale and complexity, their energy requirements climb steeply.
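A quick back-of-the-envelope check makes the household comparison concrete. The figures below are illustrative assumptions, not numbers from this piece: 300 MWh for one large training run, and roughly 10,500 kWh per year for a typical U.S. household.

```python
# Rough sanity check: how many households' annual electricity use
# does one large AI training run consume? Both figures are assumed
# for illustration, not sourced from the article.

TRAINING_RUN_MWH = 300.0          # assumed energy for one large training run
HOUSEHOLD_KWH_PER_YEAR = 10_500   # rough average annual U.S. household use

households = TRAINING_RUN_MWH * 1_000 / HOUSEHOLD_KWH_PER_YEAR
print(f"~{households:.0f} households' annual electricity")  # ~29 households
```

Under these assumptions, a single run lands in the dozens of households, and larger runs measured in the low thousands of megawatt-hours would reach into the hundreds.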

Generative applications like ChatGPT and DALL-E are particularly energy-demanding. They require constant high-speed processing to deliver real-time responses and high-quality outputs. Compounding this is the trend toward “hyperscale” data centers—facilities with large-scale computing resources capable of handling vast AI workloads. Tech giants like Google, Amazon, and Microsoft rely heavily on these centers, each consuming as much power as a small city.

A study by the International Energy Agency (IEA) projects that data centers’ share of global electricity demand could nearly double over the next decade, with AI-intensive operations the leading contributor. Some estimates suggest that within a few years, the most advanced AI models will require as much energy as it takes to power Manhattan.
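The "nearly double" projection squares with the 6% annual growth rate cited earlier, assuming that rate persists for the full decade:

```python
# Compound the article's 6% annual data center growth rate over ten
# years; the persistence of that rate is an assumption.
annual_growth = 0.06
years = 10
multiplier = (1 + annual_growth) ** years
print(f"{multiplier:.2f}x over {years} years")  # 1.79x, i.e. nearly double
```

And that compounding reflects capacity growth alone, before accounting for AI workloads being several times more energy-intensive per unit of computing.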

To meet this energy need, data center providers are getting creative. It was recently reported that Microsoft entered into a 20-year agreement to purchase electricity from the nuclear power plant at Three Mile Island. This is a 180-degree turn for the facility, which closed five years ago and was in the process of being decommissioned. Now, $1.6 billion will be invested to bring the plant back online to feed the needs of AI. Restarts of mothballed nuclear power plants are being considered elsewhere, such as in Iowa and Michigan.

Other possible solutions include small modular nuclear reactors (SMRs), a promising new class of flexible nuclear power systems with potential benefits in cost, safety, and adaptability. Last month, Google announced it would back seven of these SMRs as part of its effort to power its data centers.

There is an interesting parallel here. The power needed to run AI is far greater than that for ‘regular’ computing, just as the human brain, which comprises only 2% of body mass, consumes 20% of our energy. Apparently, the ability to think, whether natural or artificial, is energy-intensive. It’s a good bet that natural human brainpower will find ways to meet the energy needs of artificial ‘brains.’ Let’s hope we don’t regret it!

Jeff Buck