Microsoft’s Project Natick team gathers on a barge tied up to a dock in Scotland’s Orkney Islands in preparation to deploy the Northern Isles data center on the seafloor. Pictured from left to right are Mike Shepperd, senior R&D engineer, Sam Ogden, senior software engineer, Spencer Fowers, senior member of technical staff, Eric Peterson, researcher, and Ben Cutler, project manager. (Photo: Microsoft/Scott Eklund/Red Box Pictures)

Microsoft Has Sunk Another Data Center: Here’s What You Need to Know

DCK explains the latest milestone in Project Natick, Microsoft’s crazy effort to test underwater data centers.

Project Natick, Microsoft’s crazy-sounding research effort to see whether it makes sense to deploy data centers submerged on the ocean floor, has reached a big milestone.

The team of researchers recently powered on and sank a 40-foot prototype submarine vessel full of servers 117 feet below the ocean’s surface at the European Marine Energy Centre, not far from Scotland’s Orkney Islands. This is a much larger vessel than the first Project Natick deployment in 2015 off the coast of California.

Microsoft detailed the latest deployment in a blog post that went up this week. Here’s what you need to know to understand the announcement:

Why sink a data center in the ocean and not just build another one on dry land?

Microsoft is building data centers on dry land like crazy, investing billions in expanding the global infrastructure underneath its hyperscale cloud platform.

Project Natick is one of its “relevant moonshots,” or projects that pursue ideas that have “the potential to transform the core of Microsoft’s business and the computer technology industry.” It may or may not work – the point is to test this idea out – but if it does work, it would be a massive paradigm shift in the way companies think about large-scale computing infrastructure.

As Microsoft put it, “Everything learned from the deployment – and operations over the next year and eventual recovery – will allow the researchers to measure their expectations against the reality of operating underwater data centers in the real world.”

Is this a “science project” or real computers supporting real applications?

Project Natick is “an applied research project,” but it may eventually run some real-life cloud computing workloads.

“Like any new car, we will kick the tires and run the engine at different speeds to make sure everything works well,” Spencer Fowers, one of the Microsoft researchers, said. “Then, once we are completely ready to go, we will grab one or two of our clients and hand them over the keys and let them start deploying jobs onto our system.”

How is this phase of Project Natick different from the last one?

The first phase was a proof of concept, meant to show whether the idea is even feasible. The second phase is meant to find out whether it’s practical in terms of economics, logistics, and environmental sustainability.

The second vessel is much larger than the first one, which was a vertical cylinder that contained only one IT rack. The new 40-foot submarine data center holds 864 servers in 12 racks, along with all the necessary cooling infrastructure.

The first vessel operated underwater for only 105 days. The second one is designed to stay on the ocean floor for years. It is a full-scale module, closer to what Microsoft’s underwater data centers would look like if the experiment proves successful and the company starts deploying them as part of its global-scale cloud platform.

How is this thing powered, and how does it send data to land?

The data center is connected to a power cable running from the Orkney Islands grid, which gets all its energy from renewable sources (wind and solar). The same cable includes fiber-optic wiring that connects the capsule to Microsoft’s network.

Is it cooled by seawater?

Yes! Naval Group, the French maker of military-grade ships and submarines that Microsoft hired to design its data center vessel, put in the same cooling technology used in its submarines. It takes in seawater, pipes it directly through radiators on the back of each server rack, and spits it back out into the ocean.
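
For a rough sense of scale, here is a minimal back-of-envelope sketch of how much seawater such a loop would need to move. The ~240 kW IT load and the 5-degree temperature rise across the radiators are illustrative assumptions, not published Natick figures:

    # Back-of-envelope check on the seawater cooling loop described above.
    # The IT load and temperature rise are assumptions for illustration.
    SEAWATER_SPECIFIC_HEAT = 3_985   # J/(kg*K), typical for seawater
    SEAWATER_DENSITY = 1_025         # kg/m^3

    def required_flow(it_load_watts: float, delta_t_kelvin: float) -> float:
        """Seawater mass flow (kg/s) needed to absorb it_load_watts while
        warming by delta_t_kelvin: m_dot = P / (c_p * dT)."""
        return it_load_watts / (SEAWATER_SPECIFIC_HEAT * delta_t_kelvin)

    m_dot = required_flow(240_000, 5.0)          # assume 240 kW, 5 K rise
    print(f"{m_dot:.1f} kg/s (~{m_dot / SEAWATER_DENSITY * 1000:.0f} L/s)")
    # -> about 12 kg/s, roughly 12 liters of seawater per second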

What if something breaks inside?

Well, the hope is that nothing will – at least for a while. The latest Natick module is designed to hum along without maintenance for “up to five years,” although the company plans to monitor and collect data from this particular deployment over the next 12 months. Maintenance is one of the hardest parts of this whole concept, introducing all kinds of logistical complications. If something does go wrong sooner than planned, chances are the vessel will be pulled out prematurely, either to fix the problem and dunk it back in or to work with the data collected up to that point.

Isn’t it cheaper and less risky to store expensive computers in a building on dry land?

Sure, but there’s a lot to be gained from submerging a data center. According to Microsoft, it can be easier to secure permits to deploy a data center underwater than to build a massive facility in a dense population center.

There’s also the reduction in energy costs. Since a submerged data center can be cooled by seawater, it doesn’t need mechanical chillers, which are some of the biggest consumers of energy in a traditional data center.
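
To see why that matters, consider a hedged comparison of overhead energy at different PUE (power usage effectiveness) values. Both PUE figures below are illustrative assumptions, not measurements from Project Natick:

    # Rough comparison of non-IT ("overhead") energy at different PUE values.
    # PUE = total facility energy / IT energy; figures here are assumptions.
    HOURS_PER_YEAR = 8_760

    def overhead_mwh_per_year(it_load_kw: float, pue: float) -> float:
        """Annual energy (MWh) spent on cooling and other overhead."""
        return it_load_kw * (pue - 1.0) * HOURS_PER_YEAR / 1_000

    it_load_kw = 240  # assumed IT load, as in the cooling sketch above
    for label, pue in [("chiller-cooled facility", 1.5),
                       ("seawater-cooled pod", 1.1)]:
        print(f"{label} (PUE {pue}): "
              f"{overhead_mwh_per_year(it_load_kw, pue):.0f} MWh/year overhead")
    # Dropping from PUE 1.5 to 1.1 cuts overhead energy by roughly 80%.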

Microsoft is also looking at potentially linking up with companies that produce energy using offshore wind farms and tidal turbines. A submarine data center can be an anchor customer for a bank of tidal turbines, for example, providing the assurance necessary to get the offshore energy project financed and built, “allowing the two industries to evolve in lockstep.”

How much does deploying underwater data centers complicate the logistics?

A lot, actually, and one of the biggest goals of Project Natick is to figure out whether it’s worth the trouble. Microsoft wants to maximize the use of its existing data center supply chain for its submarine data center project.

The first complication is getting one built. Microsoft can hire essentially any major construction firm to build a brick-and-mortar data center shell anywhere around the world. There aren’t nearly as many companies that specialize in submarine building.

Once the vessel is designed, built, and stuffed with servers and networking gear, it needs to be transported to the port of departure. The latest Natick vessel was shipped from France to Scotland on an 18-wheel flatbed truck. It was intentionally designed to have dimensions similar to those of a standard cargo container.

Once it arrived, it was attached to a ballast-filled base and “towed out to sea partially submerged and cradled by winches and cranes between the pontoons of an industrial catamaran-like gantry barge. At the deployment site, a remotely operated vehicle retrieved a cable containing the fiber optic and power wiring from the seafloor and brought it to the surface where it was checked and attached to the data center, and the data center powered on.”

The hardest part was lowering it, foot by foot, onto the seafloor, 117 feet under the surface. It took 10 winches, a crane, a gantry barge, and a remotely operated vehicle that accompanied the data center underwater.

Isn’t it bad for marine life?

The biggest environmental concern with a system like this is the heat it produces as a byproduct of computing. Microsoft researchers claim the heat it puts out doesn’t make enough of a difference to be concerned about. Results from the first phase of Project Natick showed that the heat from the pod quickly mixed with cold water and dissipated due to currents.

“The water just meters downstream of a Natick vessel would get a few thousandths of a degree warmer at most,” the researchers wrote in an article for IEEE Spectrum. “So the environmental impact would be very modest.”
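
The same heat-balance arithmetic as in the cooling sketch above shows why the warming is so small: an ocean current sweeping past the vessel supplies an enormous mixing flow. The current speed and plume cross-section below are illustrative assumptions:

    # Why downstream warming is tiny: the waste heat mixes into the huge
    # volume of water a current carries past the vessel. The current speed
    # and mixing cross-section are assumptions for illustration.
    SEAWATER_SPECIFIC_HEAT = 3_985   # J/(kg*K)
    SEAWATER_DENSITY = 1_025         # kg/m^3

    it_load_watts = 240_000          # assumed, as in the sketches above
    current_speed = 0.2              # m/s, a modest tidal current
    mixing_area = 300.0              # m^2, assumed plume cross-section

    mass_flow = SEAWATER_DENSITY * current_speed * mixing_area   # kg/s
    delta_t = it_load_watts / (SEAWATER_SPECIFIC_HEAT * mass_flow)
    print(f"Downstream temperature rise: ~{delta_t * 1000:.1f} millidegrees")
    # -> on the order of a thousandth of a degree, consistent with the quote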

In fact, Microsoft has filed a patent for designing submarine data centers in a way that encourages marine life to occupy the enclosures, turning them into artificial reefs. Even without specific design measures, crabs and fish started gathering around the first Natick vessel within 24 hours of its deployment off the US West Coast. In a process called biofouling, single-cell organisms colonize submerged surfaces, in turn attracting larger creatures that eat them, and so on.

Biofouling creates a bit of a design problem for a submerged data center, however, since it can obstruct water coming in and out of the cooling system. As of last February, when the IEEE article came out, project researchers were still working through this issue.

What’s the big-picture vision here?

The fact is that most of us live near water. In Microsoft’s estimate, “more than half of the world’s population lives within about 120 miles of the coast.”

Another fact is that we haven’t figured out a way to make data (or anything else for that matter) travel faster than the speed of light. That means that even if we ever do manage to bring the time it takes to process data, encrypt it, and send it to the right address down to zero, chances are it will always take some time for data to travel from point A to point B. That leaves physical distance as the only remaining knob we can tweak to shrink latency.
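
A quick worked example makes the distance-to-latency link concrete. Light in optical fiber travels at roughly two-thirds the speed of light in a vacuum, about 200,000 km per second, so propagation alone puts a floor under round-trip time:

    # Propagation delay alone sets a floor on network round-trip time.
    # Light in optical fiber travels at roughly 2/3 the speed of light.
    FIBER_SPEED_KM_PER_MS = 200.0   # ~200,000 km/s, expressed per millisecond

    def round_trip_ms(distance_km: float) -> float:
        """Best-case round-trip time over a straight fiber run."""
        return 2 * distance_km / FIBER_SPEED_KM_PER_MS

    for km in (193, 1_000, 5_000):  # 193 km is about 120 miles
        print(f"{km:>5} km: {round_trip_ms(km):5.1f} ms minimum RTT")
    # 193 km -> ~1.9 ms; 5,000 km -> 50 ms, before any processing or queuing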

Network latency matters a lot for next-generation applications. The ability to make split-second decisions by autonomous vehicles, for example, can be a matter of life and death. Less dramatically, the ability of a VR headset to mimic the way the world presents itself to the human brain depends to a large extent on lightning-fast image rendering. Put simply, if your virtual reality is slower than actual reality, you will feel nauseous.

In Microsoft’s words, “By putting data centers in bodies of water near coastal cities, data would have a short distance to travel to reach coastal communities, leading to fast and smooth web surfing, video streaming, and game playing, as well as authentic experiences for AI-driven technologies.”

Has anyone else deployed data centers in the ocean before?

Besides Project Natick Phase I, there’s been no other publicly reported attempt to submerge a data center. A floating data center, however, does exist: a California company called Nautilus Data Technologies has built one on a barge, where it’s selling colocation space.

While it’s unlikely that Google actually has one running, the Alphabet subsidiary holds a patent for a floating data center powered by tidal energy.
