Nvidia and Emerald AI said on Tuesday that they are joining forces with a group of major US power producers — including AES Corporation, Constellation Energy, Invenergy, NextEra Energy, Nscale Energy & Power, and Vistra — to develop a new generation of "AI factories" designed to come online faster and operate as active participants in the power grid.
The effort reflects a broader push to align the rapid expansion of artificial intelligence infrastructure with the realities of an increasingly strained US electricity system. By combining computing, energy generation, and grid management, the companies aim to build data centers that not only consume vast amounts of power but also respond dynamically to grid conditions.
At the center of the initiative is Nvidia's Vera Rubin DSX reference architecture, paired with its DSX Flex software, which allows large-scale AI facilities to interact with grid services. The design enables so-called hybrid AI factories to initially rely on co-located power generation and battery storage — often necessary given long grid interconnection timelines — before transitioning to a more flexible role in supplying electricity back to the grid when needed.
Such facilities could help accelerate the deployment of AI infrastructure while easing pressure on existing power networks. The companies said that even AI factories without dedicated on-site generation could benefit from the architecture by achieving faster and larger grid connections.
Emerald AI's Conductor platform is designed to coordinate computing workloads with on-site energy resources, including batteries and behind-the-meter systems, allowing operators to adjust power consumption in real time without compromising performance. That flexibility, the companies said, could shorten reliance on temporary "bridge power," reduce infrastructure costs, and improve overall grid stability.
"AI factories are the engines of the intelligence era," said Nvidia CEO Jensen Huang, adding that future systems must integrate computing, energy, networking, and cooling into a single architecture. Emerald AI founder and CEO Varun Sivaram said such facilities should not be treated as passive electricity loads, but as assets capable of generating both economic value and grid support.
The initiative comes as power demand from AI data centers surges, exposing structural inefficiencies in the US grid, which is built to meet peak demand but remains underutilized much of the time. The companies estimate that more flexible AI infrastructure could unlock as much as 100 gigawatts of additional capacity by better using existing resources and selectively adding new generation.
Major energy partners involved in the effort said they see flexible, grid-responsive AI facilities as a way to meet accelerating demand without overburdening the system. By pairing large computing loads with adaptive power usage and new generation capacity, they aim to speed up project timelines while maintaining reliability.
The model also addresses a growing tension in the industry. Many large-scale AI projects have turned to on-site power generation to bypass slow grid connections, but fully isolating those resources can lead to inefficiencies and higher long-term costs. A hybrid approach, the companies argue, allows those assets to serve both AI workloads and the broader grid.
Over the past year, Nvidia and Emerald AI have tested these concepts at several data centers globally. They expect to deploy the DSX Flex system at commercial scale later this year at an AI research facility in Virginia, which is planned as one of the first fully power-flexible AI factories built on the Vera Rubin platform.
The companies said they plan to expand the model across future projects to accelerate AI infrastructure deployment, improve grid resilience, and extend the economic benefits of both energy and computing investments to local communities.
Article edited by Jack Wu