The engines of modern artificial intelligence are firing on all cylinders, and they run in data centers. These sprawling, energy-hungry factories of the digital age devour staggering amounts of electricity and water: electricity keeps them running, and water keeps them cool.
Now the appetite for AI is growing so fast that data centers are starting to compete with local communities for these resources, especially water. China, however, is betting on a bold solution: moving its data centers to the wettest place on Earth, the ocean.
China already has two large underwater data centers. The first, off the coast of Hainan, was launched in 2022 and is now in full commercial use. A second $226 million project recently came online off Shanghai and is powered in part by offshore wind. Both rely on naturally cold ocean water for cooling.
The cloud under the ocean
Data centers are dedicated buildings, packed with thousands of powerful computers called servers. Servers have become critical infrastructure, storing, processing, and distributing data for everything from your emails to streaming to AI training. They were already booming due to explosive data growth, and AI kicked things into overdrive. The enormous computational demands of AI have created a voracious and relentless appetite for more processing power, forcing companies to build these data factories at an unprecedented pace.
These powerful computers generate a large amount of waste heat that must be removed. Cooling alone can account for roughly 40% of a data center's total electricity consumption, which is already very high.
Data centers consume between 2% and 3% of the world’s electricity, according to the International Energy Agency. AI is expected to increase this consumption by 165% by 2030. Simply put, they consume a lot of energy and will consume even more.
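To put those percentages in perspective, here is a back-of-the-envelope sketch using the figures quoted above. The global electricity total is a hypothetical round number assumed for illustration, not a figure from the article.

```python
# Illustrative arithmetic only; the global total is an assumed round number.
GLOBAL_TWH = 30_000   # assumed world electricity use per year, in TWh
share_today = 0.025   # data centers: ~2-3% of world electricity (midpoint)
growth = 1.65         # AI expected to increase this consumption by 165% by 2030

today_twh = GLOBAL_TWH * share_today          # current data-center draw
in_2030_twh = today_twh * (1 + growth)        # +165% means multiplying by 2.65

print(f"Data centers today: ~{today_twh:.0f} TWh/yr")
print(f"Projected by 2030:  ~{in_2030_twh:.0f} TWh/yr")
```

Under these assumptions, the projected 2030 figure is more than two and a half times today's, which is the sense in which "they will consume even more."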
On land, much of that cooling energy goes into chilling water, which is then evaporated or sprayed into the air to carry heat away. The idea behind an underwater data center is to use the cold, stable temperature of the ocean as a massive, free "heat sink."
“We placed the entire data cabin in the deep sea because seawater can help cool the temperature,” Pu Ding, project director of Shenzhen HiCloud Data Center Technology, told Chinese media outlet Financial News. “Compared to terrestrial data centers, underwater data centers can reduce the energy consumption required for cooling, which helps reduce operating costs.”
The engineering part is, in principle, quite simple. The system pumps cold seawater through a radiator at the back of the server racks, absorbing heat and carrying it away. To obtain power, engineers can connect the system to a nearby offshore wind farm, such as the one in Shanghai. The result, on paper, is a stand-alone data center with a near-zero footprint that requires no fresh water or grid connection. Of course, this is easier said than done, and there is another major problem.
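The economics of that design can be sketched with a toy model built on the ~40% cooling figure cited earlier. The 90% cooling-energy reduction below is a hypothetical assumption chosen for illustration; the article does not give an exact savings figure.

```python
# Toy model of ocean-cooling savings. The 90% reduction is an assumption
# for illustration, not a number reported for the Chinese projects.
it_load_mw = 24.0        # IT load, e.g. the 24-megawatt Shanghai prototype
cooling_fraction = 0.40  # cooling's share of total power on land (from article)

# On land, total draw = IT load + cooling, with cooling at 40% of the total.
land_total_mw = it_load_mw / (1 - cooling_fraction)
land_cooling_mw = land_total_mw - it_load_mw

# Underwater: assume (hypothetically) seawater avoids 90% of cooling energy.
cooling_saved = 0.90
sea_total_mw = it_load_mw + land_cooling_mw * (1 - cooling_saved)

print(f"Land-based total draw: {land_total_mw:.1f} MW")
print(f"Underwater total draw: {sea_total_mw:.1f} MW")
print(f"Savings:               {land_total_mw - sea_total_mw:.1f} MW")
```

Even in this rough sketch, a 24 MW facility would shed well over 10 MW of continuous draw, which is the kind of margin the operators are counting on to offset the logistics costs discussed below.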
What happens if something breaks?
That’s the billion-dollar question. Obviously, you can’t just send a technician with a screwdriver to the seafloor; nothing can be repaired on site. The entire system rests on two basic principles: extreme reliability and modularity.
The ideal scenario is that, of course, nothing breaks. That’s not as far-fetched as it sounds. In large, well-designed data centers, most server failures are caused by humans, humidity, or dust. These capsules are sealed and filled with inert nitrogen rather than reactive oxygen, which stops corrosion in its tracks. A 2020 Microsoft study found that servers submerged in the ocean were eight times more reliable than their land-based counterparts.
But if something goes wrong, China is opting for a “trade, not fix” approach.
The data center is not a single giant structure; it is a series of enormous 1,400-ton "cabin capsules" chained together on the seabed. Each capsule contains 24 server racks holding up to 500 servers. If something breaks, the capsule can be hauled back to shore, the faulty module swapped out for a working one, and the whole unit sunk again. The idea is that the huge energy savings will offset this risk and the logistics cost.
China races, or bets, ahead
This idea is not new. Microsoft pioneered the technology more than a decade ago with Project Natick, launched in 2014, and submerged an experimental data center off the coast of Scotland in 2018. As of 2024, however, Microsoft says it is no longer pursuing the approach. The company never really explained why it stopped the project, but the high logistical cost of large-scale ocean deployment and repair is the likely culprit.
China thinks it has that covered. Or rather, it is willing to take the risk. Earlier this year, the country opened another data center in Shanghai. The Hainan project aims for a network of 100 capsules, and the $226 million Shanghai facility is a 24-megawatt prototype for planned 500-megawatt versions.
China is making a huge bet: that the engineering challenge of servicing deep-sea capsules is a far easier problem than the global energy and water crunch that AI is creating. While the West crunched the numbers and stepped back, China is pressing forward. The rest of the world is now watching to see whether this gamble helps cool a dangerously hot planet or becomes a very expensive piece of high-tech folly at the bottom of the sea.