Published Date : 03/11/2025
Microsoft CEO Satya Nadella recently revealed a significant challenge facing the AI industry: the lack of power to accommodate all the AI GPUs. During an interview with OpenAI CEO Sam Altman on the Bg2 Pod, Nadella emphasized that the company is currently facing a problem of not having enough power to plug in some of the AI GPUs they have in inventory.
Nadella explained that the issue is not a glut of compute resources but a shortage of power. “The biggest issue we are now having is not a compute glut, but it’s power — it’s the ability to get the builds done fast enough close to power,” he said. “So, if you can’t do that, you may actually have a bunch of chips sitting in inventory that I can’t plug in. In fact, that is my problem today. It’s not a supply issue of chips; it’s actually the fact that I don’t have warm shells to plug into.”
This problem has already caused consumer energy bills to skyrocket, showing how the AI infrastructure being built out is negatively affecting the average American. OpenAI has even called on the federal government to build 100 gigawatts of power generation annually, stating that it’s a strategic asset in the U.S.’s push for supremacy in its AI race with China.
This comes after some experts said Beijing is miles ahead in electricity supply due to its massive investments in hydropower and nuclear power. The lack of power is just one of the many challenges the AI industry faces. Alongside this, there is the possibility of more advanced consumer hardware hitting the market.
“Someday, we will make an incredible consumer device that can run a GPT-5 or GPT-6-capable model completely locally at a low power draw — and this is like so hard to wrap my head around,” Altman said. Bg2 Pod host Brad Gerstner commented, “That will be incredible, and that’s the type of thing that scares some of the people who are building, obviously, these large, centralized compute stacks.”
This highlights another risk that companies must bear as they bet billions of dollars on massive AI data centers. While infrastructure would still be needed to train new models, the data center demand that many estimate will come from the widespread use of AI might not materialize if semiconductor advancements enable models to run locally on consumer devices.
This could hasten the popping of the AI bubble, which some experts, such as former Intel CEO Pat Gelsinger, say is still several years away. In Gelsinger’s view, the industry is displacing the internet and service provider industry as we know it today, and the challenges, while significant, are not insurmountable.
Q: What is the main challenge Microsoft is facing with AI GPUs?
A: The main challenge is the lack of power to plug in all the AI GPUs they have in inventory.
Q: How is the AI industry affecting consumer energy bills?
A: The AI infrastructure being built out is causing consumer energy bills to skyrocket due to the high power consumption of AI data centers.
Q: What has OpenAI called for to address the power issue?
A: OpenAI has called on the federal government to build 100 gigawatts of power generation annually to support the U.S. in the AI race with China.
Q: What is the potential impact of advanced consumer hardware on AI data centers?
A: Advanced consumer hardware that can run AI models locally at a low power draw could reduce the demand for large, centralized data centers.
Q: What is the current state of the AI bubble according to Pat Gelsinger?
A: Pat Gelsinger, former Intel CEO, says the AI bubble is still several years away from popping, but the industry is displacing the internet and service provider industry as we know it.