Tesla CEO reveals China's AI edge as xAI sparks US data center power race

Amanda Liang, Taipei; Charlene Chen, DIGITIMES Asia

Credit: AFP

According to Business Insider, Elon Musk, CEO of Tesla and founder of xAI, recently highlighted China's dominant position in AI computing power during a podcast appearance, warning that the country will far surpass the rest of the world in its capacity to support AI operations.

China's power generation advantage

Musk located China's key advantage in the AI race not in chips or models but in its massive expansion of electricity generation capacity. He projected that in 2026, China's power generation could reach nearly three times that of the US, enabling the rapid growth of energy-intensive AI data centers.

As computing power becomes critical infrastructure, electricity is shifting from a public utility into a core capability that AI companies must manage directly. Industry insiders say the real competition hinges on who can connect power to compute fastest.

As AI scales massively, power constraints have evolved from cost concerns into hard limits on whether compute comes online on schedule. Musk noted that rising chip performance drives up power consumption, making electricity delivery the main bottleneck for expanding AI systems, a challenge he said has long been underestimated.

Power delivery lags behind data center growth

Semiconductor research firm SemiAnalysis recently reported that AI data centers face a mismatch between the pace of compute expansion and the speed of power delivery. While data center build cycles have compressed to 12-24 months, grid upgrades, transmission construction, and interconnection approvals often take three to five years.

When compute demand hits gigawatt levels, waiting for power becomes an untenable risk. As tech giants race to build AI data centers, single campuses consume electricity comparable to small cities, with energy supply increasingly overtaking chips and algorithms as the chief constraint on AI growth.
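For a sense of scale, the back-of-envelope sketch below estimates how a gigawatt-class campus compares to residential demand. The campus size and household consumption figures are illustrative assumptions, not numbers from SemiAnalysis or the article.

```python
# Back-of-envelope: annual electricity use of a gigawatt-scale AI campus
# versus US households. All inputs are illustrative assumptions.

CAMPUS_POWER_MW = 1_000            # assumed 1GW campus running around the clock
HOURS_PER_YEAR = 8_760
AVG_US_HOME_KWH_PER_YEAR = 10_500  # rough US average annual household usage

# Total annual energy drawn by the campus, in MWh
campus_mwh_per_year = CAMPUS_POWER_MW * HOURS_PER_YEAR

# Convert MWh to kWh and divide by per-household usage
equivalent_homes = campus_mwh_per_year * 1_000 / AVG_US_HOME_KWH_PER_YEAR

print(f"Annual campus consumption: {campus_mwh_per_year / 1e6:.2f} TWh")
print(f"Equivalent US households:  {equivalent_homes:,.0f}")
```

Under these assumptions, a single 1GW campus draws roughly 8.8TWh a year, on the order of the annual usage of several hundred thousand homes, which is why grid operators treat such requests as city-sized additions.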

US firms embrace onsite power generation

To save time, more US AI data centers are opting to generate their own power onsite rather than wait for grid connections. They rapidly deploy gas turbines, engines, or fuel cells within campuses to bring compute online faster. This approach, once reserved for extreme cases, has become a practical solution known as Bring Your Own Generation (BYOG).

Industry voices stress that BYOG is not about permanently disconnecting from the grid but about buying time: sites initially operate off-grid, then gradually integrate with utilities, turning the onsite plants into backups. In the AI era, the urgency to launch trumps traditional infrastructure timelines.

SemiAnalysis' report highlights xAI's strategy of bypassing the public grid entirely, using quickly deployable gas turbines and engines to achieve over 500MW of onsite generation. Leasing equipment rather than purchasing it further shortens setup time.

By the end of 2025, self-built power plants had become systemic rather than exceptional among US AI firms. Examples include OpenAI and Oracle's 2.3GW onsite gas plant in Texas, as well as Meta, Amazon AWS, and Google deploying bridging power solutions across multiple campuses, with some AI supercomputing clusters running before formal grid connection.

Export controls won't stop China's chip progress

As power emerges as a critical limit on compute, Musk also warned against relying solely on export controls to restrict China's progress in advanced semiconductors. He believes such measures will grow less effective over time and bluntly asserted that China will "figure out the chips."

Article edited by Jerry Chen