In June, a new winner emerged in the competition to be the world's fastest supercomputer, with the US taking the crown back from China.
Oak Ridge National Lab's Summit supercomputer can process more than 122 petaflops – that's 122 thousand trillion floating-point operations per second. China's Sunway TaihuLight, which had held the top spot for the previous two years, can do 93.
Despite being faster, Summit has one-fifth as many cores as the Chinese system and uses half the power – a sign of how fast supercomputers are evolving these days. The supercomputer used on the Human Genome Project in the 1990s was less powerful than one of today's smartphones.
But it takes money to stay ahead of the curve. Summit cost ORNL $200 million.
Because of the expense, supercomputers are typically used for the most intensive calculations, like predicting climate change, or modeling airflow around new aircraft.
Bruce Beam gained personal experience with supercomputers when he was an IT director with the US Navy.
"We modeled things that you couldn't practically do, like modeling the effects of nuclear weapons," Beam, who is now director of security and infrastructure at the International Information System Security Certification Consortium, said. "It's very impractical to do that in the real world."
Supercomputers could have applications in cybersecurity as well but, according to experts, that reality is still a long way off.
For example, IBM is using a supercomputer to analyze threat data, Beam said, but the project is still in its initial stages.
"I don't think any of it is operational yet," he said.
Commodity servers can be grouped together to do similar tasks far more cheaply and efficiently, he said. "You're not quite a supercomputer, but it's enough right now for most threat-hunting and cybersecurity uses."
And, as commodity servers get faster, they will be able to take on bigger and bigger challenges.
"I don't know if we would ever get to the point where we're using a supercomputer for cybersecurity," he said. "When I was in the Navy, we had some of the fastest supercomputers in the world, and I had never once seen any of them used for any cybersecurity application."
Jeff Williams, CTO and cofounder at Contrast Security, agreed that lower-cost commodity servers in distributed deployments make more sense than a single centralized supercomputer. "I'd much rather see sensors that do local analysis and report up in a hierarchical fashion."
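The hierarchical approach Williams describes can be sketched in a few lines of Python. This is a minimal illustration, not any vendor's implementation; the `Sensor` and `Aggregator` names and the scoring threshold are hypothetical. The key idea is that each sensor analyzes its own events locally and forwards only a compact summary upward, rather than shipping every raw event to a central system.

```python
from dataclasses import dataclass, field

@dataclass
class Sensor:
    """Hypothetical local sensor: analyzes its own events in place."""
    name: str
    events: list = field(default_factory=list)

    def local_summary(self):
        # Local analysis: only events flagged as suspicious travel upstream,
        # plus a count, instead of the full raw event stream.
        suspicious = [e for e in self.events if e.get("score", 0) >= 0.8]
        return {"sensor": self.name, "suspicious": suspicious, "total": len(self.events)}

class Aggregator:
    """One level up the hierarchy: collates summaries, never raw data."""
    def __init__(self, sensors):
        self.sensors = sensors

    def collect(self):
        return [s.local_summary() for s in self.sensors]

# Two sensors, three events; only the one high-scoring event is reported up.
web = Sensor("web", [{"score": 0.9, "id": 1}, {"score": 0.1, "id": 2}])
db = Sensor("db", [{"score": 0.2, "id": 3}])
report = Aggregator([web, db]).collect()
```

In a real deployment the aggregator would itself summarize for the next level up, which is what keeps the data volume manageable without a supercomputer.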
It is possible to collect all the data from all the new sensors in operating systems, containers, cloud environments, and the application layer – so much of it that you would need a supercomputer to look at it all, he said. "But it's a fool's errand – you just wind up analyzing a ton of garbage."
Still, an event that looks innocent by itself in one location might, when combined with events from other locations and systems, add up to something meaningful.
One advantage supercomputers offer over traditional approaches is the ability to look at a large volume of data all at once.
"It can find those nuanced relationships across systems, across users, across geolocations, that could indicate early warning of a potential breach," said Anthony Di Bello, senior director of security, discovery, and analytics at OpenText.
A global enterprise with 200,000 machines could be processing petabytes of data every day, he said. "I'm looking for a needle in a haystack of needles. I need faster computing."
But we're not quite there yet, he added.
"That technology has yet to be implemented in such a way as to benefit from supercomputing," he said. "But it's heading in that direction."
It could be at least two or three years before we start seeing real-world uses of supercomputers for cybersecurity, he said. "The big tech giants are more focused on other use cases at this point."