Are you ready for the cloud beneath the sea? Microsoft has built and deployed a submarine data center, running servers on the ocean floor for three months in a submersible container. The research prototype, dubbed Project Natick, is part of Microsoft’s ongoing quest to find affordable ways to deploy sustainable cloud data centers.
Project Natick represents a radical new approach to deploying data center capacity, which could enable Microsoft to shift its factory-built modular designs from earth to sea. In an era of exciting advances in data center design, Microsoft’s experiment seeks to extend the frontiers of edge computing, bringing cloud capacity closer to population centers concentrated along coastlines around the world.
“Moving data centers to the ocean made a great amount of sense to be able to make the cable to our customers as short as possible,” said Microsoft Research Engineer Jeff Kramer. “Natick could have a lot of impact, both currently and into the future.”
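The latency case for shorter cables is simple physics: light in optical fiber travels at roughly two-thirds of its vacuum speed, or about 200 kilometers per millisecond, so every kilometer of distance adds directly to round-trip time. As a rough illustration (the figures below are illustrative assumptions, not from Microsoft):

```python
# Rough illustration: why cable length matters for latency.
# Light in optical fiber propagates at roughly 2/3 the vacuum speed
# of light, i.e. about 200 km per millisecond.
FIBER_SPEED_KM_PER_MS = 200.0

def round_trip_ms(distance_km: float) -> float:
    """Best-case, propagation-only round-trip time over fiber."""
    return 2 * distance_km / FIBER_SPEED_KM_PER_MS

# A coastal data center ~100 km from users vs. an inland site ~1,000 km away
# (hypothetical distances for illustration):
print(round_trip_ms(100))   # 1.0 ms
print(round_trip_ms(1000))  # 10.0 ms
```

Real-world latency adds routing, switching and queuing delays on top of propagation, but the distance term alone shows why siting capacity near coastal population centers pays off.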
Microsoft’s experiment continues the data center industry’s decade-long effort to harness the power of the sea to create sustainable data centers, tapping the waves and water to power and cool armadas of cloud servers. It ties together three data center trends we’ve been tracking here at Data Center Frontier – ocean-based facilities, the emergence of edge computing and unmanned data centers.
Significantly, Microsoft kept Project Natick under wraps until it had been successfully deployed, placing it in a different league from previous visions of seagoing data centers from Google and others. This shifts the project beyond the hypothetical, allowing the industry to evaluate the concept against real-world results.
“The overall goal here is to deploy data centers at scale, anywhere in the world, from decision to power-on within 90 days,” said Ben Cutler, Microsoft’s Project Manager with the Natick team.
An Undersea Moonshot
Project Natick’s origin story dates to 2013 and a research paper from Microsoft’s Sean James proposing an underwater data center powered by renewable ocean energy. In 2014 Microsoft Research created a team to explore the feasibility of the concept, which led to the creation of a submersible vessel housing a single server rack, dubbed the Leona Philpot (named for a character in the popular Xbox game Halo, who broke her neck diving into a pool but became homecoming queen).
The vessel was deployed last August 15 in about 30 feet of water off the coast of California. The Microsoft team tested and monitored the servers as the data center operated for 105 days on the ocean floor, until it was retrieved in November and trucked back to Microsoft headquarters in Redmond, Washington for further analysis. Microsoft describes the initial voyage as “very successful,” with no leaks or hardware failures, allowing the researchers to extend the project and even run live workloads from its Azure cloud.
The Project Natick website didn’t include any details on the power or cooling setup, but the New York Times noted that the Leona Philpot featured “a single data center computing rack that was bathed in pressurized nitrogen to efficiently remove heat from computing chips.”
Here’s a video from Microsoft that provides more information:
The Power of the Ocean
Microsoft’s effort is a new take on an old idea: using the sea to power and cool a data center, transforming both the economics and sustainability of cloud computing platforms. The concept dates to 2007, when Google gained a patent for a water-based data center, stirring visions of a fleet of futuristic offshore data havens powered and cooled by the waves. The company has never built the sea-going “Google Navy” described in its patents, but other companies have pursued the idea.
The most recent is Nautilus Data Centers, which has created a floating colocation facility on a barge moored in San Francisco Bay. Nautilus says it has successfully tested a design for a floating data center that can dramatically slash the cost of running IT operations.
The primary advantage of a maritime data center is the ability to slash costs by using water to power or cool the data center, and avoiding the expense of real estate and property taxes. These ideas build on previous seagoing IT operations – both the U.S. Navy and major cruise lines have maintained sophisticated telecom and IT infrastructure for decades – and add power and cooling technologies that can slash costs.
Harnessing Wind and Waves
Microsoft’s Project Natick appears to have used grid power for its prototype, but the company is said to be working on next-generation designs featuring larger containers that can be powered by turbines driven by waves or tides.
The company also sees the submarine data center project as an opportunity to rethink many of the form factors that have traditionally been used for servers and storage, which must account for the need for humans to access the equipment and replace components or refresh servers. Operating in fully unmanned “lights out” mode allows new approaches that need to account only for heat removal, rather than access.
Microsoft says the Natick containers are designed to operate as unmanned units submerged for up to five years at a time. An interesting wrinkle is that the company believes it may be able to go five years in production without refreshing its servers. Most hyperscale providers refresh their servers and processors every three years (as noted recently by research from Coolan).
“With the end of Moore’s Law, the cadence at which servers are refreshed with new and improved hardware in the datacenter is likely to slow significantly,” the Project Natick team said in its FAQ. “We see this as an opportunity to field long-lived, resilient datacenters that operate ‘lights out’ – nobody on site – with very high reliability for the entire life of the deployment, possibly as long as 10 years.”
The Road Ahead
It’s easy to be skeptical about the potential for underwater data centers. But Microsoft now has data and a working prototype, as well as a history of converting these “moonshot” ideas into megascale data center operations. One of Sean James’ previous brainstorms plays a central role in this history.
In 2008, James and Microsoft colleague Christian Belady ran a single rack of servers in a tent in the fuel yard for one of the company’s data centers for seven months, with no equipment failures. The experiment proved that servers were hardier than believed, clearing the way for Microsoft to envision running containerized servers in the outdoors.
Microsoft has since deployed tens of thousands of servers – and perhaps hundreds of thousands – in modular units known as IT PACs that sit outdoors on concrete pads at Microsoft data center campuses in Washington state, Iowa and southern Virginia.
Belady now heads Microsoft’s data center operations, and has advanced a number of innovative projects combining portable computing and renewable energy, like data plants and landfill-powered containers.
Microsoft says it’s still “early days” in evaluating whether this concept could be adopted by Microsoft and other cloud service providers.
“It’s not a moonshot in the sense that it’s just this outlandish thing,” said Spencer Fowers, a researcher on the Natick team. “It’s actually a viable product that we could make.”

The Microsoft Project Natick team with the Leona Philpot underwater data center prototype. (Photo: Microsoft)
Is this genius or madness? What’s your opinion? Share your thoughts in the comment section.