According to the Economist, amid the whirlwind of the artificial intelligence (AI) revolution, Nvidia has emerged as an irreplaceable technology icon. On August 27, the company reported second-quarter revenue of $46.7 billion, beating Wall Street's forecast of $46 billion, on the back of strong demand for Blackwell chips and the GB200 superchip, the "heart" of advanced AI models such as ChatGPT.
With over 600,000 Blackwell chips and a similar number of GB200 chips sold in the quarter, Nvidia captured nearly 60% of the revenue from AI products, solidifying its leading position. Wall Street optimistically predicts the American chipmaker could reach a market capitalization of $5 trillion, after reaching $4 trillion in July.
But the US power grid, which is aging and overloaded, could derail Nvidia's dream.
Nvidia's H20 chip. Photo: Reuters
Experts say AI is upending how the technology industry uses electricity. For years, even as internet traffic surged, data centers' energy consumption stayed roughly flat thanks to ever more energy-efficient chips. AI has broken that rule. A typical server rack draws about 12 kW, while a rack of AI hardware consumes up to 80 kW when training a large language model and around 40 kW when serving user requests.
Nvidia's Blackwell chip, with its superior performance, consumes about 1 kW per chip, three times more than the previous generation. A GB200 NVL72 rack, which integrates 72 Blackwell GPUs and 36 Grace CPUs, draws 132 kW, not counting a cooling system that can require an additional 160 kW. Nvidia is projected to sell 6 million Blackwell chips and 5.5 million GB200 superchips between February 2024 and February 2026. If half of those chips are deployed in the US, they would add roughly 25 GW of electricity demand, nearly double the capacity the country's power system added in all of 2022.
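One way to reproduce the roughly 25 GW figure is the back-of-envelope sketch below. The per-rack and per-chip numbers come from the article; the split between standalone Blackwell chips and GB200 racks, and the assumption of 36 GB200 superchips per 292 kW rack (132 kW compute plus 160 kW cooling), are illustrative assumptions, not Nvidia-confirmed figures.

```python
# Back-of-envelope estimate of added US electricity demand from projected
# Nvidia chip sales, using the figures quoted in the article. The product
# split and rack layout are assumptions made for illustration.

BLACKWELL_CHIPS = 6_000_000   # projected standalone Blackwell sales, Feb 2024 - Feb 2026
GB200_SUPERCHIPS = 5_500_000  # projected GB200 superchip sales, same window

BLACKWELL_KW = 1              # ~1 kW per standalone Blackwell chip
GB200_PER_RACK = 36           # assumed superchips per NVL72-style rack
RACK_COMPUTE_KW = 132         # per-rack compute draw
RACK_COOLING_KW = 160         # per-rack cooling draw

US_SHARE = 0.5                # assume half the chips land in the US

blackwell_gw = BLACKWELL_CHIPS * BLACKWELL_KW / 1e6
racks = GB200_SUPERCHIPS / GB200_PER_RACK
gb200_gw = racks * (RACK_COMPUTE_KW + RACK_COOLING_KW) / 1e6

us_demand_gw = US_SHARE * (blackwell_gw + gb200_gw)
print(f"Estimated added US demand: {us_demand_gw:.1f} GW")  # ≈ 25 GW
```

Under these assumptions the total comes out near 25 GW, matching the article's figure; changing the rack layout or the US share shifts the result accordingly.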
The US power grid, once the foundation of industrial progress, is showing cracks. For decades, generating capacity has grown only sluggishly, in single-digit percentages, unable to keep up with AI's thirst for energy.
According to a survey by Schneider Electric, data center operators are now more worried about securing power than about securing GPUs. Bernstein predicts the US could face a shortfall of 17-62 GW by 2030, depending on advances in chip efficiency, while Morgan Stanley puts the figure at 45 GW by 2028.
If the problem isn't addressed, Nvidia faces two scenarios: chip sales stagnate because data centers cannot expand, or chips are sold but sit unused for lack of power. Either outcome threatens the profits of Nvidia and of major customers such as Microsoft and Amazon, which are pouring billions of dollars into GPUs.
The US power industry is working to turn the tide. Since ChatGPT started the AI craze in 2022, capital spending by the 50 largest listed power providers has increased by 30%, reaching $188 billion by the end of June. Ambitious plans promise to add 123 GW of new capacity, in addition to the existing 565 GW.
However, only 21 GW of new capacity has been built. Global electrical equipment manufacturers have exacerbated the situation by cutting investment by 3% annually since 2022, pushing up equipment prices, compounded by tariff pressures.
Power companies face legal hurdles when they want to raise electricity prices to offset investment costs. Meanwhile, consumers, sensitive to inflation, may react strongly, creating additional political pressure.
Recognizing this risk, tech giants are seeking their own solutions. Alphabet is deploying solar panels and energy storage at its data centers, while Meta's site in Louisiana runs on locally sourced natural gas. But these are only local fixes and cannot replace the power companies, the main suppliers to the US grid.
Nvidia is also taking action, committing $30.8 billion to secure component supply and production capacity for the Blackwell B300 Ultra chip. The company has also released microservice software to optimize energy use, but that is only a short-term fix. Electricity demand is still growing faster than supply, and the power companies remain the key players.
In addition, the company is partnering with the Electric Power Research Institute (EPRI) to develop AI-based solutions to manage the increasing electricity consumption from data centers and computing. The initiative, called the "Open Power AI Consortium", brings together major utility companies and technology corporations, including PG&E, Con Edison, Constellation Energy, Duke Energy, Tennessee Valley Authority, ENOWA (NEOM's energy and water company), Microsoft and Oracle.
The consortium plans to leverage specialized AI models to improve grid efficiency, optimize energy distribution, and prevent supply shortages. These AI models will be open-source, allowing researchers across industries and academia to contribute to grid improvements.
The impact of the energy crunch is not limited to the US. According to S&P Global, global data centers will consume over 1,580 TWh a year by 2034, roughly the annual electricity consumption of all of India.
Nvidia is seeking to reduce risk by expanding into new markets, taking advantage of stable and low-cost power sources. However, even in these locations, the company faces challenges in infrastructure and local competition. These efforts show that Nvidia is trying to diversify to avoid over-reliance on the US power grid.
Nvidia CEO Jensen Huang emphasized in a Bipartisan Policy Center seminar that "AI can help solve energy problems by optimizing efficiency, but it needs to be deployed in areas with excess energy." However, such areas are increasingly scarce in the US.
The American chipmaker is at a crossroads. With the Blackwell and GB200 chips, the company has shaped the future of AI, but that future depends on a seemingly simple factor: electricity. If the US power grid cannot keep up with this trend, the company's dream of raising its market capitalization to $5 trillion could be delayed, dragging down the entire AI industry.
Technology expert Chirag Dekate from Gartner believes Nvidia leads thanks to its chip and software ecosystem, but energy is a factor beyond its control. "If the power grid doesn't keep pace, the entire AI industry could stall," he warns.
The battle now is not just about technology, but also about energy - a battle Nvidia cannot win alone.
Phong Lam (According to Economist, Apnews, Invezz)