
Edge Computing Benefits for AI Crystallizing

The prospect of performing AI at the edge is still mostly theoretical, but organizations are exploring potential edge computing benefits.

Interest in edge computing continues to build, as does confusion surrounding the architecture. The situation is similar when it comes to artificial intelligence. The prospect of moving AI to the edge might sound like a recipe for even more confusion. 

Performing artificial intelligence at the edge is often “just theory quoted in articles,” said Martin Davis, managing partner at DUNELM Associates.  

Still, the concept of edge AI is increasingly hard for industrial and enterprise organizations to ignore. Resource-intensive operations such as deep learning and computer vision have traditionally taken place in centralized computing environments. But the growing availability of high-performance networking and computing hardware opens up the possibility of shifting that activity from a “centralized cloud architecture to the [edge],” as consultant Chaitan Sharma wrote. “It won’t happen overnight, but it is inevitable.” Gartner predicts that three-fourths of enterprise data will be processed at the edge by 2025, while Grand View Research expects the edge computing market to grow at a 54% annual rate through 2025.

At the Edge of Industry

The question of where exactly edge computing takes place is not always clear. The Open Glossary of Edge Computing defines the architecture as the “delivery of computing capabilities to the logical extremes of a network.” Located outside of traditional data centers and the cloud, the edge is concentrated at the “last mile” of the network, as close as possible to the things and people producing data or information.


Given the difficulty of using cloud computing in environments such as factories or mines, the industrial sector is a good candidate for edge computing architecture. A factory, for instance, might require 99.9999% network uptime and latency in the low milliseconds, and it might place constraints on sending data off-premises. Given such limitations, most factories have traditionally deployed physical cabling and proprietary wired protocols from industrial vendors. The result is a “fragmented technology environment,” which technologies like edge computing could help unify, according to the Ovum Market Radar: CSPs’ Industrial IoT Strategies and Propositions.

An edge computing architecture that operates without the cloud is not to be confused with local compute scenarios in which all data is processed on individual devices. While such on-board computing can support critical decision making in real time, the device hardware is costly, according to Harald Remmert, senior director, research and innovation at Digi International. Additionally, the ability of such local compute configurations to support operations such as machine learning is often limited.   

Conversely, an AI-enabled edge computing system in a factory could contextualize data from multiple machines to detect and ultimately predict problems that cause downtime. “Performing machine learning inference at the edge is an enabler for the scale of applications, even when low latencies are not required,” concluded Gal Ben-Haim, head of architecture at Augury, a company creating machine learning technology for the process industry.
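To make the idea concrete, the sketch below shows, in broad strokes, how an edge gateway might score readings from several machines against a locally hosted anomaly model. The model choice, feature values and machine names are illustrative assumptions, not Augury’s actual system.

```python
# Hypothetical sketch: anomaly scoring on an edge gateway that aggregates
# readings from several factory machines. All names and numbers are illustrative.
import numpy as np
from sklearn.ensemble import IsolationForest

# Train (or load) a model on known-good baseline data collected from the machines.
baseline = np.random.normal(loc=0.0, scale=1.0, size=(1000, 3))  # stand-in for real features
model = IsolationForest(contamination=0.01, random_state=0).fit(baseline)

def score_window(machine_id: str, features: np.ndarray) -> None:
    """Run inference locally; only escalate anomalies off the gateway."""
    if model.predict(features.reshape(1, -1))[0] == -1:   # -1 marks an outlier
        print(f"[edge] anomaly on {machine_id}: flag for maintenance / forward to cloud")

score_window("press-07", np.array([4.2, 3.9, 5.1]))  # an outlier relative to the baseline
```

In this kind of layout, only flagged events and periodic summaries need to leave the plant, while routine readings are scored and discarded locally.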

That doesn’t mean deploying machine learning at the edge is necessarily easy, however. It “requires more mature machine learning models and new ways to manage their deployments,” Ben-Haim said. 

From the Cloud to Edge and Back 

While some edge computing scenarios may not use centralized computing models at all, many analysts see edge computing enabling a continuum of computing with both distributed and centralized aspects. Rather than representing a pendulum swing away from centralized data centers, edge computing offers a “truce,” said Gartner analyst Bob Gill in a 2018 webinar.

“Some models of edge computing assert it will replace cloud; I do not believe that will happen,” said Bill Malik, vice president of infrastructure strategies at Trend Micro. 

“There are few use cases where edge being self-contained makes sense,” agreed Daniel Newman, principal analyst at Futurum Research. 

Most of the time, the flow of data will be bidirectional between the edge and the cloud. While the cloud can foster tracking of broad trends and second-order effects such as changes in energy consumption or air quality, “edge computing gives local answers to local questions,” Malik said. 

Accenture sees edge computing as a cloud extension. “Edge is used by many of our clients in tandem with cloud analytics and machine learning technology to make new and valuable business services possible,” said Charles Nebolsky, managing director and network practice lead for Accenture Technology. One example is Accenture’s Connected Mine initiative to streamline how mining companies manage their in-pit operations. “We’ve extended the Connected Mine solution with edge computing at an industrial mining client where they use high-resolution video off of drilling equipment to determine rock density,” Nebolsky added. That capability allows the drill to adjust angle and speed in real time and also supports predictive maintenance of equipment. “The bandwidth of the high-density video streams required can’t be transported back to the cloud with the required frames per second in a cost-effective manner for direct cloud processing,” Nebolsky said.
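A rough back-of-the-envelope calculation illustrates the bandwidth argument. The camera specs, compression rate and uplink capacity below are assumptions for illustration, not Accenture’s figures.

```python
# Illustrative arithmetic (assumed numbers) showing why drill video is analyzed
# at the edge instead of being streamed to the cloud.
width, height, fps, bits_per_px = 3840, 2160, 30, 24   # assumed 4K RGB feed
raw_mbps = width * height * fps * bits_per_px / 1e6     # uncompressed stream
compressed_mbps = 40                                    # assumed H.265-class encode per camera
cameras = 4                                             # assumed cameras per drill
uplink_mbps = 100                                       # assumed remote-site backhaul

print(f"raw per camera:       {raw_mbps:,.0f} Mbit/s")
print(f"compressed, all cams: {compressed_mbps * cameras} Mbit/s vs {uplink_mbps} Mbit/s uplink")
# Even compressed, the streams exceed the site's backhaul, while an on-site
# compute node can extract rock-density estimates and send back only the results.
```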

Another example of this circular data flow comes via Volvo Trucks, which deploys telematics and remote diagnostic systems in recent vehicles. The system works, in part, through an on-board computer that detects abnormal parameters and triggers trouble codes. From there, the telematics system streams the troubling operational data to Volvo’s Uptime Center, which can coordinate responses with relevant parties such as repair shops, dealers and customer service agents. While on-board computing on the trucks helps diagnose problems, the centralized aspect of the deployment enables repair shops and dealers to prepare for trucks arriving for maintenance.

“Volvo is progressing down what is quickly becoming a common maturity model related to edge analytics and artificial intelligence and machine learning,” said Bill Roberts, IoT director at SAS. A reasonable next step would be to enable the edge computing capability on trucks to determine which fault data is actionable. Such a shift would free up “bandwidth to collect additional telematic data [that] leads to more analytic insights developed in the cloud,” Roberts said. “Those insights can be operationalized anywhere from the edge or the cloud depending on what the use case dictates.”
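As a hedged illustration of that next step, the sketch below shows how an on-truck edge application might triage diagnostic trouble codes and escalate only the actionable ones. The codes, classifications and helper functions are hypothetical, not Volvo’s or SAS’s implementation.

```python
# Hypothetical edge-side triage of diagnostic trouble codes (DTCs) on a truck.
# Only actionable codes use uplink bandwidth immediately; the rest are buffered.
ACTIONABLE = {"P0217": "engine overheat", "P0087": "fuel rail pressure low"}
IGNORABLE = {"P0562"}  # e.g., transient low-voltage events already handled locally

def send_to_uptime_center(code: str, reading: dict) -> None:   # stand-in for the telematics uplink
    print(f"escalating {code}: {ACTIONABLE[code]}", reading)

def log_locally(code: str, reading: dict) -> None:             # keep for the next batched upload
    print(f"buffering {code} for later upload", reading)

def handle_trouble_code(code: str, reading: dict) -> None:
    if code in ACTIONABLE:
        send_to_uptime_center(code, reading)
    elif code not in IGNORABLE:
        log_locally(code, reading)

handle_trouble_code("P0217", {"coolant_temp_c": 121})
```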

The Distributed Energy Resources Integration test bed provides another example of combined distributed and cloud computing. The project provides an alternative to traditional centralized alternating-current power grids, which struggle to efficiently use power from distributed direct-current sources such as solar panels or wind turbines. The test bed deploys real-time, edge-based analytics on hardware interspersed throughout the grid, bridging heterogeneous legacy equipment and centralized control while preserving real-time response and autonomous operation, according to Erik Felt, market development director of future grid at RTI, and Neil Puthuff, software integration engineer at RTI. The platform is equipped for autonomous operation and edge-based analytics while providing data and control to one or more control centers.
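A generic sketch of that split might look like the following; it assumes hypothetical sensor and control-center functions and does not reflect RTI’s actual implementation.

```python
# Generic sketch of the division of labor the test bed describes: the edge node
# reacts autonomously in real time, and only summaries flow up to the control center.
import time, statistics

def read_local_voltage() -> float:          # stand-in for a meter on a DC feeder
    return 48.0

def curtail_inverter() -> None:             # hypothetical local protective action
    print("[edge] curtailing inverter output")

def publish_summary(summary: dict) -> None: # stand-in for the link to the control center
    print("[control center] received", summary)

readings = []
for _ in range(5):                          # short sampling loop for illustration
    v = read_local_voltage()
    readings.append(v)
    if v > 52.0:                            # autonomous, fast response stays local
        curtail_inverter()
    time.sleep(0.01)

publish_summary({"mean_v": statistics.mean(readings), "n": len(readings)})
```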

5G connectivity has also sparked interest in edge architecture as a way to enable computing outside of traditional data centers. While there are few examples of organizations with 5G-enabled edge computing projects, that could change as 5G networks mature. The benefits of this approach are similar to those of the cloud, albeit with lower latency, Remmert noted. “This architecture is very popular for machine learning applications,” he concluded.
