Source URL: https://simonwillison.net/2025/Jan/12/generative-ai-the-power-and-the-glory/#atom-everything
Source: Simon Willison’s Weblog
Title: Generative AI – The Power and the Glory
Feedly Summary: Generative AI – The Power and the Glory
Michael Liebreich’s epic report for BloombergNEF on the current state of play with regards to generative AI, energy usage and data center growth.
I learned so much from reading this. If you’re at all interested in the energy impact of the latest wave of AI tools I recommend spending some time with this article.
Just a few of the points that stood out to me:
This isn’t the first time a leap in data center power use has been predicted. In 2007 the EPA predicted data center energy usage would double: it didn’t, thanks to efficiency gains from better servers and the shift from in-house to cloud hosting. In 2017 the WEF predicted cryptocurrency could consume all the world’s electricity by 2020, a trajectory cut short when the first crypto bubble burst. Is this time different? Maybe.
Michael reiterates (Sequoia’s) David Cahn’s $600B question, pointing out that if the anticipated infrastructure spend on AI requires $600bn in annual revenue, then either 1 billion people will need to spend $600/year or 100 million intensive users will need to spend $6,000/year.
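The arithmetic behind those two scenarios is simple enough to check for yourself (a quick sketch; the revenue target and user counts come from the article):

```python
# Back-of-the-envelope check on David Cahn's $600B question:
# dividing the required annual revenue by a user base gives the
# per-user annual spend needed to justify the infrastructure.
required_annual_revenue = 600_000_000_000  # $600bn

for users in (1_000_000_000, 100_000_000):
    per_user = required_annual_revenue / users
    print(f"{users:>13,} users -> ${per_user:,.0f}/year each")
# -> 1,000,000,000 users -> $600/year each
# ->   100,000,000 users -> $6,000/year each
```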
Existing data centers often have a power capacity of less than 10MW, but new AI-training focused data centers tend to be in the 75-150MW range, due to the need to colocate vast numbers of GPUs for efficient communication between them – these can at least be located anywhere in the world. Inference is a lot less demanding as the GPUs don’t need to collaborate in the same way, but it needs to be close to human population centers to provide low latency responses.
NVIDIA are claiming huge efficiency gains. “Nvidia claims to have delivered a 45,000x improvement in energy efficiency per token (a unit of data processed by AI models) over the past eight years” – and that training a 1.8 trillion-parameter model required only 4MW using Blackwell GPUs, versus 15MW on the previous Hopper architecture.
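It’s worth unpacking what that headline number implies as a compound annual rate (the 45,000x and eight-year figures are from the claim; the annualization is my own arithmetic):

```python
# What does a 45,000x efficiency gain over 8 years imply per year?
# The annualized factor f satisfies f ** 8 == 45_000.
total_gain = 45_000
years = 8

annual_factor = total_gain ** (1 / years)
print(f"~{annual_factor:.2f}x more energy-efficient per token each year")
# -> ~3.82x more energy-efficient per token each year
```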
Michael’s own global estimate is "45GW of additional demand by 2030", which he points out is "equivalent to one third of the power demand from the world’s aluminum smelters". But much of this demand needs to be local, which makes things a lot more challenging, especially given the need to integrate with the existing grid.
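For scale, the aluminum smelter comparison can be inverted (the one-third figure is from the article; treating it as exact is my simplification):

```python
# The article says 45GW of extra AI demand equals one third of the
# power drawn by the world's aluminum smelters, so the smelters
# draw roughly three times that amount.
additional_ai_demand_gw = 45
implied_smelter_demand_gw = additional_ai_demand_gw * 3
print(f"Implied global aluminum smelter demand: ~{implied_smelter_demand_gw}GW")
# -> Implied global aluminum smelter demand: ~135GW
```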
Google, Microsoft, Meta and Amazon all have net-zero emission targets which they take very seriously, making them "some of the most significant corporate purchasers of renewable energy in the world". This helps explain why they’re taking very real interest in nuclear power.
Elon’s 100,000-GPU data center in Memphis currently runs on gas:
When Elon Musk rushed to get x.AI’s Memphis Supercluster up and running in record time, he brought in 14 mobile natural gas-powered generators, each of them generating 2.5MW. It seems they do not require an air quality permit, as long as they do not remain in the same location for more than 364 days.
Here’s a reassuring statistic: "91% of all new power capacity added worldwide in 2023 was wind and solar".
There’s so much more in there; I feel like I’m doing the article a disservice by attempting to extract just the points above.
Michael’s conclusion is somewhat optimistic:
In the end, the tech titans will find out that the best way to power AI data centers is in the traditional way, by building the same generating technologies as are proving most cost effective for other users, connecting them to a robust and resilient grid, and working with local communities. […]
When it comes to new technologies – be it SMRs, fusion, novel renewables or superconducting transmission lines – it is a blessing to have some cash-rich, technologically advanced, risk-tolerant players creating demand, which has for decades been missing in low-growth developed world power markets.
(BloombergNEF is an energy research group acquired by Bloomberg in 2009, originally founded by Michael as New Energy Finance in 2004.)
Via Jamie Matthews
Tags: ai, ethics, generative-ai, energy
AI Summary and Description: Yes
**Summary:** The text discusses Michael Liebreich’s report on the energy impact of generative AI, highlighting significant predictions regarding data center energy usage and efficiency improvements in AI training infrastructure. It outlines historical predictions about data center power consumption, the financial implications of AI infrastructure, and the challenges that come with increased energy demands, particularly emphasizing the interest of major tech companies in sustainability practices.
**Detailed Description:**
The text provides a thorough analysis of generative AI and its associated energy requirements, making it highly relevant for professionals involved in cloud computing, data security, and information governance. Here are the critical points and implications:
– **Historical Context of Energy Predictions:**
– Previous predictions from bodies such as the EPA and WEF proved overly dramatic: efficiency gains and market corrections prevented them from materializing. While AI is projected to escalate energy usage, similar adaptations could temper today’s forecasts.
– **Financial Implications of AI Infrastructure:**
– The anticipated $600 billion infrastructure spending on AI implies a massive scalability requirement, suggesting financial models where either billions of users contribute small amounts or a smaller number of intensive users contribute significantly more.
– **Data Center Requirements:**
– The text contrasts the typical power capacity of existing data centers (under 10MW) with newer AI-training focused centers that require 75-150MW, spotlighting the need for infrastructure that can support high-performance GPUs which operate best in close proximity.
– **Efficiency Improvements:**
– NVIDIA’s claims of improving energy efficiency by 45,000 times over eight years signify critical advancements in AI technology and infrastructure that could alleviate some energy concerns. This demonstrates a direct link between tech advancements and sustainability.
– **Projected Energy Demand:**
– Liebreich’s projection of an additional 45GW of demand by 2030 represents a substantial addition to global energy infrastructure, indicating an urgent need for advances in energy generation and distribution systems.
– **Corporate Sustainability Efforts:**
– Major corporations (Google, Amazon, Microsoft, and Meta) are recognized for their commitments to net-zero emissions, which positions them as leading purchasers of renewable energy and explains their growing interest in low-carbon sources such as nuclear power. This reflects a broader trend of tech companies taking sustainability seriously.
– **Operational Choices and Regulations:**
– The mention of Elon Musk’s GPU data center running on natural gas generators speaks to regulatory complexities and operational choices in energy usage for high-performance computing.
– **Renewable Energy Trends:**
– The statistic that 91% of new power capacity in 2023 came from wind and solar reflects a significant positive trend in renewable energy adoption, which is essential for the long-term sustainability of AI infrastructure.
– **Outlook and Conclusion:**
– The report concludes optimistically about the tech industry’s ability to innovate and collaborate on energy solutions. It emphasizes the need for a resilient grid and local partnerships as crucial steps for sustainable growth in AI-powered services.
Overall, the insights from this text are invaluable for security and compliance professionals who are concerned about the implications of rapidly expanding AI capabilities on energy use, infrastructure security, and sustainability initiatives in technology ecosystems.