Azure Stack Brings Microsoft's Cloud Outside Microsoft Data Centers (December 2017)

If cloud is a model and not a place, you should be able to have it anywhere.


If cloud is a model and not a place, then the promise of Microsoft’s Azure Stack is that you should be able to have that model anywhere, be it a Microsoft data center or your own facility. And it doesn’t necessarily have to be a data center.

Applications for the on-premises version of the world’s second most popular enterprise cloud (Amazon Web Services is the market leader) are expanding well beyond the initial goal of simply providing on premises the same applications and services that are available in Azure. Vendors like the Swiss industrial automation giant ABB, the French oilfield services multinational Schlumberger, and the San Francisco-based enterprise software company Pivotal are bringing platform solutions to Azure Stack. They run as a mix of on-prem and public cloud services, and in some verticals the on-prem portion is playing a bigger and bigger role.

This is especially true in “industrial Internet of Things” applications, where the likes of ABB and GE collect data from sensors on their equipment and analyze it to help with things like predictive maintenance and capacity planning. Often, for performance and/or security reasons, the compute gear that does the analytics is placed directly on oil rigs, power plants, factory floors, in mines, and so on. Sometimes it’s connected to the cloud, and sometimes it isn’t.

“The message from our customers was, ‘We will only start this journey with you if you can put all the infrastructure we need on our premises, isolated from the internet, so we can be assured of security, of governance, of adequate latency for decision making,’” Ciaran Flanagan, group VP and head of ABB's Global Datacenter business, said. He predicted that for industrial IoT, between 60 percent and 70 percent of processing, transactions, and data management is going to happen at the edge.

Industry verticals Azure Stack appeals to include oil and gas, manufacturing, retail, healthcare, and government, Natalia Mackevicius, director of program management for Azure Stack, told us. “We’re looking at military and defense, where they often need to be fully air-gapped, or disconnected, but they want to use the same type of app development as in the cloud.”

Some customers want a hybrid cloud strategy for unified application development, with the same code running in their data center as in the public cloud. Others have data sovereignty or regulatory requirements that make it hard to use public cloud. Yet others want it for locations where public cloud is inaccessible, or where unreliable connectivity or network latency makes cloud unfeasible.

Especially in financial services, some organizations want to modernize legacy apps and apps inherited through acquisitions without using public cloud, she noted. “If they have a mainframe, Oracle, or SAP environment, that’s the system of record for the organization that they can't easily migrate. They want to bring the cloud application model to that location and start modernizing the application that way.”

That kind of software refresh is why Pivotal is bringing its Cloud Foundry application platform to Azure Stack, Richard Seroter, Pivotal’s director of product, told us. “Companies need to add more agility. If you can make your on-premises team operate in an agile, on-demand manner, location doesn’t matter as much. Azure Stack makes a lot of sense, because you get the actual same user experience and paradigms on premises and in the cloud; that’s not something AWS or Google or VMware can do.”

Coming from someone working for a company closely associated with VMware, that statement is telling. Pivotal was spun out of VMware and its parent company EMC in 2013 and was for several years run by VMware’s former CEO Paul Maritz. The company that’s now called Dell Technologies still lists both VMware and Pivotal as its subsidiaries following the merger between Dell and EMC. VMware and Pivotal also have a partnership with Google Cloud Platform, providing an enterprise-friendly version of the open source container orchestration platform Kubernetes as a service.

Through partnerships or otherwise, every major cloud provider now has an on-premises story to tell, each in its own stage of maturity, with Azure Stack the most mature of the offerings. AWS and VMware are offering customers the ability to extend their on-prem VMware environments to the AWS cloud; Google has partnerships with Cisco, Nutanix, and Scale Computing to bridge the gap between its cloud and enterprise customers’ own data centers. Oracle and IBM also have on-prem cloud offerings that gel with their public clouds.

Hardening the Edge

Some customers are using Azure Stack without any cloud connectivity at all; others have occasionally connected architectures where Azure Stack takes care of the local compute and real-time processing, but they might also use public Azure services for aggregate data analysis, machine learning and AI. “They’re getting some of the data to Azure periodically to make use of services like Azure Functions and Azure Batch and Cosmos DB,” Mackevicius said.

Hybrid scenarios -- where Azure Stack handles local near-real-time processing, while public cloud is used for aggregate data analysis -- are interesting to industrial customers, which is why ABB picked Azure Stack as the basis for a hardened edge data center. It developed the solution for oil rigs, mines, power plants, factory floors, and other environments that are different from data centers and don’t have data center managers manning them.

To deliver that, ABB’s Ability software runs on an Azure Stack system from HPE, housed in an IP65-rated Rittal cabinet with its own UPS and cooling, forming a secure edge data center. Flanagan called it “a fully autonomous single-rack data center to host wherever you need, in whatever context.”

Ability pools ABB’s IoT-enabled devices into a single managed platform, he said. “Our customers are very interested in what they can do in terms of business benefit for efficiency for maintenance, for capacity planning, for scalability, but they also have concerns around the convergence of IT and operational technology. Industrial customers are very expert at OT, but they may not be so expert at IT, so we have to make sure IT is not an inhibitor to the rollout of industrial IoT.”

Industrial processes often require quick and reliable decisions based on the analysis of vast amounts of sensor data, Volkhard Bregulla, VP of Global Manufacturing Industries at HPE, ABB’s hardware partner, told us. “Think of a machine which has to be stopped within fractions of a second to avoid damage. There’s no option to transfer the raw data to a remote data center or cloud for analysis, because this would cost too much time, it would exceed the capacity of the network, and it would not be reliable enough.”
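The kind of sub-second shutdown decision Bregulla describes can be sketched as a purely local check, with no network round-trip in the loop. This is a minimal, hypothetical illustration of the pattern, not ABB's or HPE's actual software; the threshold value and function names are assumptions made for the example.

```python
# Hypothetical sketch: a safety decision that must run at the edge, since
# a cloud round-trip would take too long. Names and values are illustrative,
# not taken from any ABB or HPE product.

VIBRATION_LIMIT_MM_S = 7.1  # assumed alarm threshold, in mm/s

def should_stop(readings, limit=VIBRATION_LIMIT_MM_S):
    """Trip immediately if any recent vibration sample exceeds the limit."""
    return any(sample > limit for sample in readings)

# A burst of samples arriving from a sensor, in mm/s:
recent = [2.3, 2.5, 2.4, 9.8, 2.6]
if should_stop(recent):
    print("STOP: vibration limit exceeded")  # actuated locally, no network hop
```

Because the check runs on hardware sitting next to the machine, it keeps working even when the site is disconnected, which is precisely the point of placing compute on the rig or factory floor.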

But if you’re correlating data from several plants or coordinating processes across sites or even between different companies, public cloud is a better way to do it. Oil rig equipment maintenance is a huge challenge for oil and gas companies, Bregulla said, because the equipment is exposed to extremely rough environments, and maintenance costs can be very high. Running applications that monitor the condition of the equipment on the oil rig itself means they can analyze more and richer datasets and get better predictions than they would if they had to trim the data down to suit the low-bandwidth network links that connect the rigs. Sending a subset of data to public Azure means they can optimize maintenance scheduling across several rigs or even oilfields, and it can be the same application in Azure and Azure Stack handling that.
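The "trim the data down" step described above can be pictured as reducing each window of raw samples to a handful of aggregate fields before upload. The sketch below is a hypothetical illustration of that summarization; the field names and the upload boundary are assumptions, not part of any real product.

```python
# Hypothetical sketch of trimming edge data for a low-bandwidth link:
# keep raw samples on the rig for local analytics, but forward only a
# compact per-window summary to the public cloud. Names are illustrative.

from statistics import mean

def summarize_window(samples):
    """Reduce a window of raw sensor samples to a few aggregate fields."""
    return {
        "count": len(samples),
        "mean": round(mean(samples), 3),
        "max": max(samples),
        "min": min(samples),
    }

window = [71.2, 71.5, 70.9, 72.0, 71.8]  # e.g. pump temperature samples
summary = summarize_window(window)
# `summary` is what would be queued for upload; the raw window stays local.
print(summary)
```

A dictionary of four numbers crosses the satellite link in place of thousands of raw samples, while the full-resolution data remains available on the rig for richer local analysis.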

By enabling customers to run some of Azure's PaaS services in their own data centers, Azure Stack simplifies this new class of hybrid cloud design patterns, where part of the app runs in Azure Stack and other components in Azure. For many customers, the appeal lies in the fact that it is less a stepping stone to cloud migration and more a bridge to bring cloud to where they need it.

About the Author(s)

Data Center Knowledge

Data Center Knowledge, a sister site to ITPro Today, is a leading online source of daily news and analysis about the data center industry. Areas of coverage include power and cooling technology, processor and server architecture, networks, storage, the colocation industry, data center company stocks, cloud, the modern hyper-scale data center space, edge computing, infrastructure for machine learning, and virtual and augmented reality. Each month, hundreds of thousands of data center professionals (C-level, business, IT and facilities decision-makers) turn to DCK to help them develop data center strategies and/or design, build and manage world-class data centers. These buyers and decision-makers rely on DCK as a trusted source of breaking news and expertise on these specialized facilities.
