The United States is in an artificial intelligence race with China, and that race has big implications for the US energy grid.
In the coming years, the public and private sectors plan to ramp up production of AI data centers on scales that seem unfathomable today. ChatGPT maker OpenAI has told government officials it will eventually require data centers that consume an astonishing five gigawatts of power at any one time.
OpenAI is just one company in the race. Amazon, in its most recent earnings release, said its capital expenditures will reach $75 billion in 2024, much of it going toward building AI-enabled data centers. Alphabet, Meta and others are following a similar path.
The AI revolution, which is both a private sector boom and a national security imperative, will require a reimagining of the US energy grid, fed by diverse energy sources, from nuclear power to solar, wind and geothermal.
The good news for the renewables industry is that all of the US companies planning to massively ramp up AI data centers have made commitments to reduce their carbon footprints.
The role of government in AI is also likely to grow. Amid a transfer of power in Washington, we will be monitoring this area to determine how our clients can benefit.
The incoming administration views winning the AI race with China as a national priority and has already floated the idea of “Manhattan Projects” for AI. While Republicans have tended to avoid subsidies for renewable energy, there is no way to achieve the necessary data center capacity without wind, solar and other forms of sustainable production.
It’s important to note that we don’t know what specific policies will change at this point. Much will depend on who is appointed to cabinet-level positions in the coming weeks. But in informal policy signals in the lead-up to the election, Trump officials floated the idea of rolling back red tape that could slow the development of AI.
Sunridge will be watching this space closely. Changes in policy, combined with the new data center economics, could change the profitability of existing projects as well as the viability of new storage and production technologies.
Large geothermal projects, for instance, could become more attractive. The economics of energy storage could also shift: solar power could be one of the fastest and cheapest ways to increase energy capacity, and it might make sense to spend more money on storage rather than wait for nuclear plants to come online.
These are questions that could be answered in the first half of 2025 and we will be ready to help clients navigate this changing landscape.
As with any new technology, there is a possibility that this AI wave is more hype than reality. But even if the development of AI models reaches a plateau, we expect massive investment in data centers to continue for some time.
One reason is that big tech companies cannot afford to miss what is potentially the most important technological development since the industrial revolution. Another is that even current AI models have yet to reach their full potential, and won’t get there until there is more compute capacity.
To help understand this dynamic, here is some background on how we got here.
Traditional computers run on central processing units, or CPUs. Those chips, made by companies like Intel and AMD, can do many different types of calculations, allowing you to run everything from Microsoft Word to Zoom.
AI chips, such as the graphics processing units (GPUs) made by Nvidia, need to do only one kind of calculation over and over again, allowing them to crunch massive amounts of numbers at breakneck speed. (They are called graphics processors because they were originally designed to render 3D computer graphics.)
AI researchers have been using GPUs for years, repurposing the technology to perform the repetitive matrix math that underpins the field.
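To make that concrete, here is a small Python sketch, purely for illustration, of the kind of work a GPU accelerates: the same multiply-and-add operation repeated millions of times. It uses NumPy on an ordinary CPU, but computing a matrix product as one bulk operation rather than one step at a time in a loop hints at why hardware built for massively parallel arithmetic matters.

```python
import time

import numpy as np

# The work AI chips excel at: the same multiply-and-add applied millions
# of times. A naive Python loop does the steps one at a time; NumPy's
# vectorized matrix multiply does them in bulk, the way a GPU would in
# parallel across thousands of cores.
n = 120
A = np.random.rand(n, n)
B = np.random.rand(n, n)

t0 = time.perf_counter()
C_loop = [[sum(A[i, k] * B[k, j] for k in range(n)) for j in range(n)]
          for i in range(n)]
t_loop = time.perf_counter() - t0

t0 = time.perf_counter()
C_vec = A @ B  # one bulk operation instead of n*n*n Python steps
t_vec = time.perf_counter() - t0

assert np.allclose(C_loop, C_vec)  # same answer, very different speed
print(f"loop: {t_loop:.3f}s, vectorized: {t_vec:.5f}s")
```

The gap between the two timings on a CPU is already large; specialized AI hardware widens it by orders of magnitude.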
In 2017, researchers at Google published a paper called “Attention Is All You Need.” It showed how large language models could get “smarter” by teaching computers to recognize patterns in the way fragments of words relate to one another – or “pay attention” to one another.
This technique allowed something remarkable to take place. Unlike AI models of the past, these so-called “transformer” models that used the attention mechanism could keep getting better with more and more data and processing power.
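The core of that attention mechanism is a small amount of linear algebra. In the minimal Python sketch below, random toy vectors stand in for learned token representations and the multi-head machinery of real transformers is omitted: each token scores its relevance to every other token, and those scores become the weights of an average.

```python
import numpy as np

def attention(Q, K, V):
    """Scaled dot-product attention: each token 'pays attention' to the
    others by scoring them, then averaging their values by those scores."""
    scores = Q @ K.T / np.sqrt(Q.shape[-1])         # token-to-token relevance
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax: rows sum to 1
    return weights @ V                              # weighted mix of values

# Toy example: 3 tokens, each represented by a 4-dimensional vector.
rng = np.random.default_rng(0)
tokens = rng.normal(size=(3, 4))
out = attention(tokens, tokens, tokens)  # self-attention
print(out.shape)  # (3, 4): one updated vector per token
```

Real transformers stack many such layers with learned projection matrices, but the point is that “paying attention” is ordinary matrix math, which is exactly what GPUs do fastest.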
OpenAI and Microsoft decided to take this concept and run with it. They built one of the world’s largest supercomputers with 10,000 GPUs and spent months making these models bigger and bigger.
OpenAI’s research sent shockwaves through Silicon Valley, but it wasn’t until the company turned its most massive large language model into a consumer product called ChatGPT that the world took notice.
ChatGPT became the fastest-growing internet product of all time. But more importantly, it kicked off a race among the world’s biggest technology companies to build better and better models.
Almost every major company in the world is now experimenting with how to use this kind of AI model in new and creative ways to automate tasks.
The most impactful use case so far has been software development. Google recently disclosed that more than 25% of its new code is AI-generated. That figure alone should tell us a lot about whether AI is hype or reality.
In the coming AI race, data centers will be used for two main purposes: Training and inference. Training is the initial creation of an AI model like GPT-4o, which now powers ChatGPT. Inference is what the model does when you type a prompt into ChatGPT.
Training is a lot trickier than inference. Most of the advanced AI models today require all of the GPUs (or other AI chips) to be clustered inside a single building. The biggest ones today have about 100,000 GPUs connected together. The goal for AI companies is to have all of those chips running at maximum power for the entirety of a training run, which could last months.
These chips draw so much power that the heat they generate is an engineering challenge in itself. To keep these $40,000 devices from melting, companies use water-cooling systems that themselves consume massive amounts of energy from the grid.
AI data centers could eventually get so large, and consume so much energy, that government intervention would be required to make them possible. Hence the “Manhattan Project” moniker floated by the incoming administration.
Building data centers large enough for the biggest training runs creates an energy puzzle. How do you get enough power generation concentrated in one place to handle an unprecedented level of consumption?
While AI inference requires less power concentrated in one place, it is also shaping up to be a significant energy drain.
While AI companies have figured out ways to make inference much more efficient and less expensive, the appetite for more and more of it is insatiable. At OpenAI, inference prices have come down about 99%. Normally, it’s a bad sign when a company’s product costs $100 one year and $1 the following year. Not in AI, apparently.
The reason is that AI companies have found ways to give the same models more “reasoning” ability by using more compute during the inference stage. This is sometimes referred to as “chain of thought” reasoning. These methods take prompts and divide them into steps and then use the model to continuously check its own work, improving the quality, accuracy and reliability of the responses.
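A rough sketch of that idea in Python, with an invented stub standing in for a real language-model API: the function names, the tiny answer vocabulary and the majority-vote self-check below are all illustrative assumptions, not any vendor’s actual method. The point is simply that spending more compute per prompt, by sampling several candidate answers and checking them against one another, can improve the final response.

```python
import random

def call_model(prompt, seed):
    """Stub standing in for a real LLM API call; invented for illustration."""
    random.seed(seed)
    return random.choice(["A", "B", "B"])  # pretend answer to the prompt

def answer_with_reasoning(prompt, n_samples=5):
    # More inference compute: sample several independent candidate answers.
    candidates = [call_model(prompt, seed=i) for i in range(n_samples)]
    # Self-check stand-in: keep the answer the candidates agree on most,
    # a crude proxy for a model verifying its own chain of thought.
    votes = {}
    for answer in candidates:
        votes[answer] = votes.get(answer, 0) + 1
    return max(votes, key=votes.get)

print(answer_with_reasoning("Which option is correct?"))
```

Each extra sample is another full pass through the model, which is why this style of “reasoning” multiplies the energy cost of every prompt.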
As more compute power becomes available, AI companies will likely find ways to use more and more compute during the inference stage to improve their products.
To meet this demand, companies like Amazon, Google, Microsoft, Oracle and others will want to dot the country with AI data centers, all of which will use more energy than the data centers built over the past decade.
This will require holistic upgrades to the energy grid: more generation capacity, new transmission lines and other infrastructure.
Solar power could be key to this strategy. Companies may incentivize customers to use their products during the day, when solar is available. For instance, customers might be offered lower prices if they are willing to wait for the completion of an intensive AI task, allowing companies to prioritize compute power when more energy is available.
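As a hypothetical illustration of such a pricing scheme, the Python sketch below defers jobs flagged as flexible into an assumed daytime solar window in exchange for a discounted rate. The hours, rates and the `Job` structure are all invented for this example and are not any provider’s actual offering.

```python
from dataclasses import dataclass

SOLAR_WINDOW = range(9, 17)            # assumed solar-heavy hours, 9:00-17:00
FULL_RATE, DISCOUNT_RATE = 1.00, 0.60  # $/GPU-hour, made-up numbers

@dataclass
class Job:
    name: str
    flexible: bool  # can this task wait for cheaper, solar-backed power?

def schedule(job: Job, submitted_hour: int):
    """Return (start_hour, rate) for a job submitted at submitted_hour (0-23)."""
    if job.flexible and submitted_hour not in SOLAR_WINDOW:
        # Defer the flexible job to the next solar window at a discount.
        next_window = 9 if submitted_hour < 9 else 9 + 24  # tomorrow if past 9am
        return next_window % 24, DISCOUNT_RATE
    return submitted_hour, DISCOUNT_RATE if job.flexible else FULL_RATE

print(schedule(Job("batch-training-eval", flexible=True), submitted_hour=22))   # (9, 0.6)
print(schedule(Job("live-chat-request", flexible=False), submitted_hour=22))    # (22, 1.0)
```

A deferrable batch job submitted at night waits for the morning solar window and pays less; an interactive request runs immediately at full price.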
© Copyright 2024 Sunridge Legal, LLP. All rights reserved.