Monday, November 18, 2024
AI data centers and supercomputers packed with hundreds or thousands of GPUs consume enormous amounts of energy, and Gartner predicts that by 2027, 40% of AI data centers may not have enough power to operate at full capacity.
As more AI data centers connect to the grid, like Elon Musk's xAI supercomputer in Tennessee, their collective power demand increases and could hit 500 terawatt-hours (TWh) per year by 2027. That is more than double current needs, according to a new report from Gartner.
Meta is also a big player in the data center space, with numerous facilities under construction. Microsoft is looking to expand its data center portfolio and is even planning to restart the Three Mile Island nuclear plant to power its AI ambitions. These tech giants, along with Google and Amazon, are increasingly turning to nuclear power to meet their energy needs.
"New larger data centers are being planned to handle the huge amounts of data needed to train and implement the rapidly expanding large language models (LLMs) that underpin GenAI applications," says Gartner VP Analyst Bob Johnson. "However, short-term power shortages are likely to continue for years as new power transmission, distribution, and generation capacity could take years to come online and won’t alleviate current problems."
Gartner predicts this sustained surge in electricity demand will push power prices higher over time. It will also make it harder for utility providers to reduce their carbon emissions, meaning AI model training and operation may exacerbate climate change.
While some academics believe AI could help humans solve the climate crisis, the technology is creating energy and carbon emission problems in the meantime. Potential mitigations include developing more efficient computer hardware, expanding renewable energy sources, scheduling data center workloads for off-peak hours, and building facilities in colder regions to reduce cooling costs.
By: DocMemory Copyright © 2023 CST, Inc. All Rights Reserved