Software intelligence specialist Dynatrace has released Grail, a core platform technology designed to unify data while retaining its context and to drive AI-powered automation.
The end goal is to improve how organizations analyze observability, security, and business data from cloud-native and multicloud environments.
Dynatrace's Grail provides a purpose-built data lakehouse for observability and security, with a massively parallel processing (MPP) analytics engine that delivers speed, scale, and instant analytics without the need for rehydration or index generation.
"Organizations today demand unlimited analytics — speed, scale, and low cost. Grail can tap into all data with lightning-fast queries, while maintaining the context of the data and doing this cost-effectively," said Guido Deinhammer, chief product officer at Dynatrace. "Grail makes the AIOps and analytics capabilities of the Dynatrace platform even more impactful."
The company's unified software intelligence platform for AI-powered answers and automation comprises core technologies that work in concert with one another: OneAgent, Smartscape, PurePath, Davis AI, and now Grail.
Dynatrace's Grail Provides Precise, Real-Time Answers
Grail can power log analytics and management, giving teams a single source for all their data to deliver precise answers and intelligent automation from data in real time.
While the data explosion is becoming impossible for humans to manage on their own, data is required to power the digital experiences in our day-to-day lives, from telehealth services and mobile banking to communicating with our friends, family, and colleagues, Deinhammer said.
Traditional approaches to managing and analyzing this data rely on fractured toolsets that lack data context, require manual analytics, and are too slow and costly for rapidly changing cloud environments or security threats.
"Adding this context requires continuous manual input, which overburdened teams don't have time to provide," he said. "Moreover, organizations become more selective about what data they analyze to control costs."
A recent Dynatrace survey found that, on average, organizations use just 10% of available observability and security data for analytics.
In addition, different teams store distinct data types (app traces, infrastructure metrics, security events, etc.) in specific repositories to facilitate their individual analytics needs.
"These actions create silos that make it hard to create a unified data management strategy or for teams to collaborate to align on a single, data-backed version of the truth," Deinhammer pointed out.
He added that cloud environments are getting more complex as the data explosion continues.
"While organizations make the shift to hybrid and multicloud environments, they're finding that too much of their data is siloed, locked in cold storage, discarded, or can't be managed and analyzed manually," he said.
Observability and smarter data management are critical to keeping all data and its context accessible, so infrastructure teams can get precise, actionable insights instantly rather than losing time to manual analytics and infrastructure maintenance.
MPP Delivers Instant Analytics
Meanwhile, massively parallel processing allows for speed, scale, and instant analytics, which addresses the need for rapid ingestion of large amounts of unstructured data.
"While traditional approaches require expensive infrastructure and storage solutions that call for manual upkeep, using an MPP approach allows us to deliver a pulse of processing power within milliseconds for large-scale data queries when an organization needs it," Deinhammer said.
As AI and ML innovations evolve, the way organizations analyze and manage their data will need to evolve as well.
Deinhammer said these technologies play a vital role in continued digital transformation as organizations are faced with managing the ongoing data boom.
"Being able to extract value from the data generated is getting increasingly difficult as we approach a flattening of the data value curve, which could lead to a major disruption in the digital services we rely on across banking, retail, government services, and beyond," he said.
From his perspective, that's where AI and ML, in combination with domain-specific knowledge about technologies and business processes, will create actionable insight from this information and drive autonomous optimization and real-time self-healing in complex multicloud environments.
About the author: Nathan Eddy is a freelance writer for ITPro Today. He has written for Popular Mechanics, Sales & Marketing Management Magazine, FierceMarkets, and CRN, among others. In 2012 he made his first documentary film, The Absent Column. He currently lives in Berlin.