The power hunger of AI servers is becoming the toughest ceiling for industry expansion.
A certain tech giant recently admitted publicly that the real bottleneck in scaling AI is not chips but **power supply**. The statement was quickly followed by action: an order for five additional 380 MW gas turbines from South Korea's Doosan Energy, specifically to power new AI server clusters exceeding 600,000 units.
How crazy is that number? A quick calculation makes it clear. A high-end GB200-class server cabinet draws over 100 kW, so 600,000 units stacked together approach a total power draw of **60 gigawatts**, a figure the article puts on par with the annual electricity consumption of an entire country like Switzerland. Wind and solar alone obviously cannot cover this; they cannot meet the high-density, high-stability demands of AI computing clusters. Natural gas generation has therefore become the pragmatic choice: fast to deploy, stable in output, and immediately effective.
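For anyone who wants to reproduce the arithmetic, here is a minimal back-of-envelope sketch in Python. The 100 kW per cabinet and the 600,000-unit count are the figures quoted above; the 80% utilization factor is purely an illustrative assumption and not something the article states.

```python
# Back-of-envelope check of the numbers above. The per-cabinet draw (100 kW)
# and the cabinet count (600,000) are the figures quoted in the article;
# the utilization factor is an illustrative assumption only.

CABINET_POWER_KW = 100      # quoted draw of one GB200-class cabinet
NUM_CABINETS = 600_000      # quoted cluster size
UTILIZATION = 0.8           # assumed average load factor (not from the article)

total_power_gw = CABINET_POWER_KW * NUM_CABINETS / 1_000_000      # kW -> GW
annual_energy_twh = total_power_gw * UTILIZATION * 8_760 / 1_000  # GW over a year -> TWh

print(f"Total cluster draw: {total_power_gw:.0f} GW")             # ~60 GW
print(f"Annual energy at {UTILIZATION:.0%} load: {annual_energy_twh:.0f} TWh")
```

Even at partial load, the resulting annual energy runs into the hundreds of terawatt-hours, which is why the comparison is framed at the scale of a whole country's consumption.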
This is not a compromise in energy policy but the **harsh reality of the computing power race**. As AI models grow larger and more capable, their power consumption climbs exponentially. The current AI arms race has essentially evolved into an energy war: whoever can secure power supply fastest and most reliably holds the key to leading in computing power.
Behind this trend lies opportunity: **AI-specific energy infrastructure** is becoming a trillion-yuan new track. From power generation equipment to energy storage systems, from grid upgrades to microgrid solutions, the entire industry chain is being reshaped.
GateUser-c802f0e8 · 13h ago
60 gigawatts... that's really outrageous, equivalent to the entire electricity consumption of Switzerland. AI just keeps getting hungrier.
ForkTrooper · 13h ago
60 gigawatts? Damn, that's just burning money. Energy is the real moat.
SnapshotBot · 13h ago
60 gigawatts? This is the real arms race, way more intense than the chip bottleneck.