
Is Artificial Intelligence Storage Better Than Traditional Cloud Storage?

NetApp's Octavian Tanase discusses the ways in which artificial intelligence storage will increase companies' ability to exploit data.

NetApp, along with Nvidia, is at the forefront of the burgeoning field of artificial intelligence storage or, more specifically, artificial intelligence-infused storage. Together, they have developed ONTAP AI, designed to help organizations achieve “edge to core to cloud” control over their data. It’s powered by Nvidia’s DGX supercomputers and NetApp’s AFF A800 cloud-connected flash storage. Octavian Tanase, senior vice president of NetApp’s ONTAP Software & Systems, explains why he believes artificial intelligence and machine learning will soon be the new mission-critical applications, and why the data associated with them will be invaluable.

ITPro Today: Why do you believe the time has come for true AI-infused storage?

Tanase: It’s all about the tremendous growth in the volume and variety of data. People are starting to think of data as the oil of the 21st century, something that powers all organizations. A lot of data today is being created outside of the traditional data center, at the edge or in the cloud, and we saw the need for a data pipeline from the edge to the core to the cloud that would enable data scientists to analyze it, apply new algorithms, and draw more value from it.

How does your approach to artificial intelligence storage achieve these goals? 

We’ve combined the best technology from Nvidia and NetApp to enable tremendous performance and scalability for organizations looking to apply AI and deep learning algorithms to their data sets. Nvidia contributes the capabilities of its DGX-1 AI supercomputer, which packs multiple GPUs (graphics processing units) into a very small footprint. It provides the necessary density and the computational capabilities needed for complex calculations. NetApp’s Data Fabric allows organizations to build data lakes and keep that data protected and highly available. The solution is called ONTAP AI.

How is this solution different from others available today? 

One way is performance. We believe that our solution performs as much as four times faster than our closest competitor’s. The second is that we enable the data pipeline from the edge to the core to the cloud. We believe we’re the only vendor that offers consistent data management across edge devices, the traditional data center and the cloud. I also think ours is the only such solution available at the locations where data is actually created.

What does the artificial intelligence storage component provide? 

A lot of work and time goes into preparing the data so you can apply algorithms to it. Then you have to train your AI model so it can learn from the different data sets you’re throwing at it. You also need to be able to correct that algorithm over time so it becomes intelligent and faster than the human brain at performing certain tasks. This solution integrates all of those processes.
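In practice, the workflow described here (prepare the data, train a model, then keep correcting it as new data arrives) follows a familiar pattern. The minimal sketch below illustrates it with an incrementally trained scikit-learn classifier; the synthetic batches, features and model choice are illustrative assumptions, not part of ONTAP AI.

# Minimal sketch of the prepare / train / refine loop described above.
# The synthetic data and model choice are illustrative, not ONTAP AI specifics.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import SGDClassifier

def prepare(raw):
    """Data preparation: drop incomplete records, split features/labels, normalize."""
    clean = raw[~np.isnan(raw).any(axis=1)]
    features, labels = clean[:, :-1], clean[:, -1]
    return StandardScaler().fit_transform(features), labels

def make_batch(rng, n=1000):
    """Generate a toy telemetry batch whose label depends on the first feature."""
    batch = rng.normal(size=(n, 5))
    batch[:, -1] = (batch[:, 0] > 0).astype(float)
    return batch

rng = np.random.default_rng(0)

# Initial training pass on the first prepared data set.
X, y = prepare(make_batch(rng))
model = SGDClassifier()
model.partial_fit(X, y, classes=[0.0, 1.0])

# "Correcting the algorithm over time": refine the model as new data arrives.
for _ in range(10):
    X_new, y_new = prepare(make_batch(rng))
    model.partial_fit(X_new, y_new)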

Can you provide an example of how it might work?

Take a high-tech company that has assets deployed in the field at customer sites that perform some kind of task and send data back. With this solution, they could collect all of those data points and process them. Through analysis, they could derive information about the health and mean time to failure of certain components. That way, they could proactively replace a component without any human intervention. They could also gain important information about usage patterns and, by analyzing that data, derive best practices for using the device.
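The predictive-maintenance flow in that example can be sketched in a few lines: field telemetry feeds an estimate of mean time to failure per component type, and units that have consumed most of their estimated life are flagged for proactive replacement. The telemetry schema, component names and 80 percent threshold below are assumptions for illustration only.

# Hypothetical sketch of the field-telemetry / mean-time-to-failure flow above.
from statistics import mean

# Failure events reported by devices in the field (illustrative schema).
failures = [
    {"component": "pump", "hours_at_failure": 9800},
    {"component": "pump", "hours_at_failure": 10400},
    {"component": "fan", "hours_at_failure": 22000},
    {"component": "fan", "hours_at_failure": 25000},
]

# Units currently deployed at customer sites (illustrative).
fleet = [
    {"unit": "site-12/pump-3", "component": "pump", "hours_in_service": 9100},
    {"unit": "site-40/fan-1", "component": "fan", "hours_in_service": 6000},
]

# Estimate mean time to failure (MTTF) for each component type.
mttf = {}
for component in {f["component"] for f in failures}:
    mttf[component] = mean(f["hours_at_failure"] for f in failures if f["component"] == component)

# Flag any unit that has consumed 80% of its estimated life for replacement.
for unit in fleet:
    if unit["hours_in_service"] >= 0.8 * mttf[unit["component"]]:
        print(f"schedule proactive replacement: {unit['unit']}")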

Is this solution best-suited to the high-tech industry, then?

No, it’s applicable to any industry in any sector. We validated this solution before launch with more than a dozen customers. In one case, a healthcare company is using it to collect data points from Internet-connected inhalers, with the goal of being able to adjust and customize doses for specific patients. And a manufacturer is using it to manage data in autonomous vehicles.

How would companies try to achieve these tasks before?

They could still do the work, but calculations that now take a week might have taken a year, and calculations that now take hours might have taken a week. 

What other efficiencies does it provide?

We’re all about making it as simple as possible to enable the data lifecycle from the edge to the core to the cloud. So we have tools that make it easier to manage the data and operationalize the infrastructure. For example, we use a technique called thin provisioning to enable users to clone a lightweight version of the data and analyze it without having to make a full copy.
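The lightweight-clone behavior described here is essentially a copy-on-write scheme: a clone shares the parent volume's blocks and stores only the blocks it changes. The toy model below illustrates that idea; the class names and block layout are assumptions for illustration, not NetApp's actual implementation.

# Toy copy-on-write model of a lightweight clone (illustrative only).
class Volume:
    def __init__(self, blocks):
        self.blocks = blocks              # block number -> data

    def clone(self):
        return Clone(self)                # no data is copied at clone time

class Clone:
    def __init__(self, parent):
        self.parent = parent
        self.overrides = {}               # only modified blocks consume new space

    def read(self, block_no):
        return self.overrides.get(block_no, self.parent.blocks[block_no])

    def write(self, block_no, data):
        self.overrides[block_no] = data   # copy-on-write: the parent is untouched

base = Volume({0: "header", 1: "records-v1"})
scratch = base.clone()
scratch.write(1, "records-transformed-for-analysis")
print(base.blocks[1])    # records-v1 (original data intact)
print(scratch.read(1))   # records-transformed-for-analysis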

How might AI change the nature of data storage in general over the next five years or so?

This is probably one of the more transformational trends to affect our industry. You will see the emergence of data lakes that transcend the traditional data center, living partly in the cloud and partly in the traditional data center. I also believe that customers will start looking at their data differently. They will look at data as their main asset rather than as collateral that they employ only when they need to do some sort of analysis. In the next three to five years, I believe that artificial intelligence, machine learning and data lakes will be the new mission-critical applications, and the data associated with them will be critical to enterprises regardless of industry sector.
