Saturday, June 9, 2018

Why Microsoft wants to put data centers at the bottom of the ocean

Earlier this week, Microsoft announced the second phase of Project Natick, a research experiment that aims to understand the benefits and challenges of deploying large-scale data centers under water. In this second phase, the team sank a shipping container-sized vessel packed with server racks off the coast of the Orkney Islands and plans to keep it there for a few years to see if this is a viable way of deploying data centers in the future.

Computers and water famously don’t mix, as anyone who has ever spilled a cup of water over a laptop can attest, so putting server racks under water sure seems like an odd idea. But as Microsoft Research’s Ben Cutler told me, there are good reasons why the bottom of the ocean may be a sensible place to set up servers.

The vast majority of people live within 200 kilometers of the ocean, Cutler noted, and Microsoft’s cloud strategy has long been about putting its data centers close to major population centers. With large offshore wind farms potentially providing renewable power and the obvious cooling benefits of being under water (cooling is a major cost factor for data centers), an experiment like this makes sense.

“Within Microsoft, we’ve spent an enormous amount of energy and time on cloud — and obviously money,” Cutler explained when I asked him about the genesis of this project. “So we’re always looking for new ways that we can innovate. And this idea sort of gelled originally with one of our employees who worked on a U.S. Navy submarine and knew something about this technology, and that this could maybe be applied to data centers.”

So back in 2013, the team launched phase one and dropped a small pressure vessel with a few servers into the waters of the Pacific Ocean. That experiment worked out pretty well; even the local sea life seemed to appreciate it. The team found that the water immediately around the vessel was only a few thousandths of a degree Celsius warmer than the water a few feet away. The noise, too, was pretty much negligible. “We found that once we were a few meters away from the vessel, we were drowned out by background noise, which is things like snapping shrimp, which is actually the predominant sound of the ocean,” Cutler told me, stressing that the team’s job is to measure all of this, as the ocean is obviously a very sensitive environment. “What we found was that we’re very well received by wildlife and we’re very quickly colonized by crabs and octopus and other things that were in the area.”

For this second phase, the team chose a location off the coast of Scotland because it is home to the European Marine Energy Centre, so the infrastructure for powering the vessel with renewable energy from onshore and offshore sources was already in place.

Once the vessel is in the ocean, maintenance is pretty much impossible. The idea here is to accept that things will fail and can’t be replaced. Then, after a few years, the plan is to retrieve the vessel, refurbish it with new machines and deploy it again.

But as part of this experiment, the team also thought about how best to make these servers last as long as possible. Because nobody has to go inside the vessel to replace a broken hard drive, for example, the team decided to fill it with a nitrogen atmosphere to prevent corrosion. To measure the impact of that choice, Microsoft also maintains a similar vessel on land so it can compare how the two systems fare over time.

Cutler stressed that nothing here is cutting-edge technology: there are no exotic servers inside, and both underwater cabling and building vessels like this are well understood at this point.

Over time, Cutler envisions a factory that can prefabricate these vessels and ship them to where they are needed. That’s why the vessel is about the size of a shipping container; the team had it fabricated in France, loaded it onto a truck and shipped it to England to test this logistics chain.

Whether that comes to pass remains to be seen, of course. The team is studying the economics of Natick for the time being, and then it’s up to Microsoft’s Azure team to take this out of the research labs and put it into more widespread production. “Our goal here is to drive this to a point where we understand that the economics make sense and that it has the characteristics that we wanted it to, and then it becomes a tool for that product group to decide whether and where to use it,” said Cutler.



from TechCrunch https://ift.tt/2MbTseR
