Akamai to Deploy Advanced AI Across Its Global Edge Network

With partner Neural Magic's software, user benefits may include lower latency, higher service levels, and faster response times.

Cloud and content delivery network provider Akamai Technologies last week teamed with Neural Magic to deploy advanced artificial intelligence (AI) software across its global edge server network.

The duo's efforts could provide businesses with lower latency, higher service levels, and faster response times. Adding the software could enable use cases such as AI inference, immersive retail, and spatial computing.

Long ago, Akamai built a distributed global network of edge servers that cache content close to users, cutting delivery times and boosting performance for rich media such as streaming video. Now, the provider is using the same network to run Neural Magic's AI much closer to the sources of user data.

The company said it intends to "supercharge" its deep learning capabilities by leveraging Neural Magic's software, which enables AI workloads to run more efficiently on traditional central processing unit (CPU)-based servers, as opposed to more advanced hardware powered by graphics processing units (GPUs).
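Neural Magic's efficiency gains come from exploiting sparsity: pruning most of a model's weights to zero so a CPU runtime can skip the corresponding multiplications. The snippet below is an illustrative sketch of that idea in NumPy, not Akamai's or Neural Magic's actual code; the 75% pruning ratio and toy matrix sizes are assumptions chosen only for demonstration.

```python
import numpy as np

# Illustrative sketch: why pruned (sparse) weights cut CPU work.
# The pruning ratio and layer sizes here are arbitrary examples.
rng = np.random.default_rng(0)
weights = rng.standard_normal((4, 8))        # a tiny dense "layer"

# Prune roughly 75% of the weights to zero (magnitude-free toy pruning).
prune_mask = rng.random(weights.shape) < 0.75
sparse_weights = np.where(prune_mask, 0.0, weights)

x = rng.standard_normal(8)                   # a toy input vector

dense_out = weights @ x                      # full dense matrix-vector product
sparse_out = sparse_weights @ x              # same op, but most terms are zero

# A sparsity-aware CPU runtime stores and computes only the nonzeros,
# so the arithmetic cost scales with nnz rather than the full weight count.
nnz = np.count_nonzero(sparse_weights)
print(f"nonzero weights: {nnz} of {weights.size}")
```

In a real deployment the pruned model is also retrained to recover accuracy; the point here is simply that the fraction of surviving weights, not the nominal layer size, drives the CPU compute cost.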

Potential Business Benefits of AI at the Edge

One expert sees several potential benefits for Akamai's content delivery network (CDN) business customers in pairing the network with Neural Magic's AI acceleration software.

"This could potentially lower the cost of service and still meet the requirements for the AI workloads," said Baron Fung, senior research director at Dell'Oro Group, a global telecom market research and consulting firm. "Lower cost can be achieved because the service provider (Akamai) can use general-purpose servers that use traditional infrastructure, rather than expensive dedicated AI/GPU servers and infrastructure."

Application benefits are possible as well: "Because these nodes are situated at the network edge, close to where the users or machines are located, faster response times for customer applications could be realized, especially for workloads that are AI-related."

Higher service levels could also be attained: "Because of the scalable nature of the solution, new CDN nodes suitable for AI workloads could be scaled quickly in high-demand regions."

Read the rest of this article on Network Computing.

About the Authors

Bob Wallace

Contributor, Network Computing

A veteran business and technology journalist, Bob Wallace has covered networking, telecom, and video strategies for global media outlets such as IDG and UBM. He has specialized in identifying and analyzing trends in enterprise and service provider use of enabling technologies. Most recently, Bob has focused on developments at the intersection of technology and sports. A native of Massachusetts, he lives in Ashland. 

Network Computing

Network Computing, a sister site to ITPro Today, provides community members with in-depth analysis on new and emerging infrastructure technologies, real-world advice on implementation and operations, and practical strategies for improving their skills and advancing their careers. Its community is a trusted resource for IT architects and engineers who must understand business requirements as well as build and manage the infrastructures to meet those needs.
