
Meta is building a huge new data center called Hyperion, designed to deliver five gigawatts (GW) of computing power for its growing AI ambitions.
That’s enough power to run millions of homes.
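For a rough sense of scale, here's a back-of-envelope sketch (our own illustrative numbers, not Meta's): assuming an average US household draws roughly 1.2 kW of continuous power, five gigawatts works out to around four million homes.

```python
# Back-of-envelope: how many average US homes could 5 GW supply?
# Assumption: an average US household draws roughly 1.2 kW on average
# (about 10,500 kWh per year). Illustrative figure only, not from Meta.

HYPERION_CAPACITY_GW = 5           # Hyperion's planned capacity
AVG_HOUSEHOLD_DEMAND_KW = 1.2      # assumed average continuous household draw

capacity_kw = HYPERION_CAPACITY_GW * 1_000_000   # 1 GW = 1,000,000 kW
homes = capacity_kw / AVG_HOUSEHOLD_DEMAND_KW

print(f"~{homes / 1e6:.1f} million homes")       # prints "~4.2 million homes"
```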
CEO Mark Zuckerberg says the facility's footprint will be big enough to cover most of Manhattan, and it is reportedly being built in Richland Parish, Louisiana, where Meta already has a $10 billion data center project underway. (Via: TechCrunch)
This move is part of Meta’s push to stay ahead in the race to develop AI, competing with companies like OpenAI and Google DeepMind.
After hiring top AI experts, Meta is now focusing on the computing power needed to train the most advanced AI models.
By 2030, the company plans to bring two gigawatts online, then scale up to five gigawatts in the years that follow.
Zuckerberg also announced another major project called Prometheus, a supercomputer cluster being built in New Albany, Ohio, which is set to go online in 2026 and provide one gigawatt of computing power.
This would make Meta one of the few tech companies to operate such a massive AI data center.
However, building and running these centers takes a toll on local communities.
For example, a Meta project in Georgia reportedly caused water shortages for nearby residents.
Other data centers, like those from AI company CoreWeave in Texas, could double the electricity demand of entire cities.
Despite these issues, the tech industry is racing to build more of these powerful facilities.
Other major projects include OpenAI’s Stargate (with Oracle and SoftBank) and Elon Musk’s xAI Colossus supercomputer.
The US government is backing this trend, from President Donald Trump to Energy Secretary Chris Wright.
Wright argues that AI is worth the high energy cost because it turns electricity into “intelligence.”
Some experts warn, though, that without increasing energy production, data centers might soon consume up to 20% of US electricity, up from just 2.5% in 2022, creating more strain on local infrastructure.
Do you think the massive energy demands of AI data centers are justified by the potential benefits? Or should there be limits on how much power these facilities can consume? Tell us below in the comments, or reach us via our Twitter or Facebook.
