15 Technology Predictions for 2020
Dealing with data will challenge IT professionals on several fronts, from storage to security to compliance. And expect big battles – tech companies versus governments, competing AI ventures and rival collaboration tools – to shape the daily IT landscape. Here are our top technology predictions for 2020.
January 2, 2020
![2020 technology](https://eu-images.contentstack.com/v3/assets/blt07f68461ccd75245/blteac09563529b994d/6616abada6015763ec0f0ed3/GettyImages-1154710777.jpg?width=700&auto=webp&quality=80&disable=upscale)
Getty Images
Says who? "A trend is the competition for cloud native development platforms, with HPE entering the ring and Mirantis absorbing Docker Enterprise to contend with key players Red Hat OpenShift, Pivotal Platform and Cloud Foundry." – Michael Azoff, Distinguished Analyst, Ovum
How likely is this? Likely.
Why? An increasing number of vendors are vying to launch platforms that make it easy to build and deploy applications that feature all the hallmarks of the "cloud native" trend – including microservices architectures, containers and CI/CD pipelines. Pivotal, Cloud Foundry and Red Hat had an early lead in this niche, but growing demand has motivated other vendors to offer competing solutions. HPE announced a new Kubernetes-based deployment tool in November 2019, doubling down on its hybrid-cloud-friendly container strategy. And the acquisition by Mirantis of Docker Enterprise in the same month positioned Mirantis to add Docker container development and deployment tooling to its Kubernetes support services. – Christopher Tozzi
Says who? "Organizations are creating application clusters dependent on internal storage in each node to lower latency, but they sacrifice efficiency. NVMe Over Fabrics (NVMe-oF) enables [those organizations] to gain similar low latency but with the efficiency of shared storage." -- George Crump, president of Storage Switzerland, an IT analyst firm focused on storage and virtualization
How likely is this? Very likely. Interface cards and network switching are now NVMe-ready, and storage systems that both use NVMe internally and have native NVMe connections externally are now coming to the market. Storage software supporting NVMe-oF is a little less mature, but it's gaining steam.
Why? Artificial intelligence, machine learning and large containerized environments all need the performance and low latency that NVMe-oF can provide. – Karen Schwartz
Says who? "As efforts to derive value from data are stymied by data siloed in fragmented application deployments (core, cloud, edge), data-driven organizations will look for means to gather and federate data so it can be correlated meaningfully for decision-makers. This will include methodologies such as DataOps to speed access to curated data, using technologies such as metadata management, data lakes, data capture/movement/ingest and policy management." – Phil Goodwin, Research Director, Cloud Data Management for Protection, IDC
How likely is this? Somewhat likely. 2020 will be the year that more organizations become aware of the benefits of DataOps. As they become familiar with the methodology and understand its value, they will begin adding DataOps practices to their organizations.
Why? DataOps is a methodology, like DevOps, that fosters collaboration, communication, integration and automation of data flows. It automates the design, deployment and management of data delivery, using metadata to improve the usability and value of data in a dynamic environment, according to Gartner. The results include reduced frustration, better data utilization, improved governance and faster time to market. – Karen Schwartz
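To make the metadata-driven delivery that Gartner describes concrete, here is a deliberately simplified sketch – the dataset fields, policy table and routing logic are illustrative assumptions, not a depiction of any specific DataOps product:

```python
# A toy illustration of metadata-driven data delivery: each dataset carries
# metadata, and a policy table (not hand-written wiring) decides where a
# curated copy is delivered. All names here are hypothetical.
datasets = [
    {"name": "sales_q4", "domain": "finance", "pii": False},
    {"name": "customers", "domain": "crm", "pii": True},
]

# Policy: metadata value -> delivery target.
policies = {
    "finance": "analytics_lake",
    "crm": "masked_zone",
}

def route(dataset):
    """Use metadata, not per-dataset code, to decide the delivery path."""
    target = policies[dataset["domain"]]
    # PII-bearing datasets get an extra masking step before delivery.
    step = "mask_pii -> " if dataset["pii"] else ""
    return f"{dataset['name']}: {step}{target}"

for d in datasets:
    print(route(d))
```

The design point is that adding a dataset or changing governance rules means editing metadata and policy, not rewriting pipeline code – which is where the "reduced frustration, better governance" payoff comes from.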
Says who? Microsoft announced its intention to comply with California’s coming data privacy law across the U.S., throwing down the gauntlet for the industry. "We’ll find out if the rest of the tech industry decides to extend CCPA to all Americans, and not just California residents," said Peter Reinhardt, CEO and co-founder of customer data and analytics firm Segment.
How likely is this? Very likely. California’s law faced opposition from the tech industry but will go into effect on January 1. Microsoft made its announcement in November.
Why? Following GDPR in Europe, California’s own data privacy legislation is likely the first of several state-level regulatory frameworks — or the precursor for a national one in the United States. Businesses aren’t ready but public and industry appetite for federal laws is there, which won’t go unnoticed in an election year. – Terri Coles
Says who? "This is an area where application development and data protection must be tightly integrated. There will have to be some awareness or synchronization here around which containers need backup and which ones do not. And application consistency is critical as container usage expands. The recipe will likely include a healthy dose of Kubernetes." -- Christophe Bertrand, senior analyst, Enterprise Strategy Group
How likely is this? Very likely. Twenty-four percent of respondents in a recent ESG survey say that data protection for containers will become as important as VM protection in the next 13 to 24 months, while another 24 percent say it will become so in more than 24 months.
Why? The adoption of containers is accelerating, and containers require different backup and recovery techniques than traditional environments. – Karen Schwartz
Says who? "There is too much going on and not enough people to handle it if we continue to do things the way they have been done. Automation can help in many ways, not as a replacement for individuals but as a force multiplier in many different areas." – Fernando Montenegro, senior analyst for information security, 451 Research
How likely is this? Very likely. According to a recent survey, more than 57 percent of cybersecurity professionals anticipate changes to the focus of their use of automation in the next 12 months.
Why? The workload demands on security teams can’t be met by just increasing staff and budget. Automation via scripting can help IT professionals perform tasks faster; automation via workflows can expedite coordination; and automation within business processes can preempt issues before they happen. – Karen Schwartz
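As a concrete sketch of the scripting point above – the log format, IP addresses and threshold here are illustrative assumptions, not tied to any particular product – a few lines of Python can automate a routine triage task such as flagging source IPs with repeated failed logins:

```python
from collections import Counter

def flag_brute_force(log_lines, threshold=5):
    """Count failed-login attempts per source IP and flag repeat offenders."""
    failures = Counter()
    for line in log_lines:
        # Match lines shaped like: "Failed password for root from 203.0.113.7 port 22"
        if "Failed password" in line and " from " in line:
            ip = line.split(" from ")[1].split()[0]
            failures[ip] += 1
    return {ip: n for ip, n in failures.items() if n >= threshold}

# Hypothetical sample log lines for demonstration.
logs = (
    ["Failed password for root from 203.0.113.7 port 22"] * 6
    + ["Failed password for admin from 198.51.100.2 port 22"] * 2
    + ["Accepted password for alice from 192.0.2.10 port 22"]
)
print(flag_brute_force(logs))  # only 203.0.113.7 crosses the threshold
```

A script like this is the "force multiplier" in miniature: it doesn't replace an analyst, but it turns minutes of log reading into a single automated check that can feed a workflow.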
Says who? This year, Nvidia, which makes the GPUs powering most HPC deployments, teamed with ScaleMatrix, which operates high-density colocation data centers, and Microway, a provider of computer clusters, servers and workstations for HPC and AI, to produce "AI Anywhere," an out-of-the-box solution that includes Nvidia's AI-focused DGX software stack and its DGX hardware. AI Anywhere delivers a payload of up to 13 petaFLOPS, more than enough for demanding AI or ML workloads. Closed-loop, water-assisted cooling is built into the unit, making it completely plug-and-play – even in remote locations with no cooling infrastructure.
"Any enterprise that wants to be a supercomputing enterprise could have never imagined deploying the scale of infrastructure that they can support now with this solution," Tony Paikeday, director of product marketing of AI and deep learning at Nvidia, told us a few days before the product's launch. "Prior to this, they would have needed an AI-ready data center – the kind of facility that is optimized for the power and cooling demand of these accelerated computing systems. Now you can literally drop a supercomputing facility in places that would have been unimaginable before."
How likely is this? Very likely. Given the competitive nature of the tech industry, and the fact that AI is on many enterprises' Christmas wish list, it's a given that similar ready-to-deploy supercomputers from the likes of HPE, Lenovo and Dell will follow. When that happens and AI becomes just another workload, expect software vendors to find even newer ways to take advantage of any excess petaFLOP capacity taking up data center floor space. That's why it's one of our technology trends of 2020.
Why? Artificial intelligence and machine learning are making this necessary, and the industry has been busy removing the remaining roadblock to adoption. – Christine Hall
Says who? "I see DevOps embracing more interoperability than it has in the past, given the proliferation of tools. 2020 will be the year of Tekton – it will grow widely and be supported by more tools. The Tekton project and ecosystem will encourage a new era of DevOps tooling interoperability beyond what was possible in the past." – Chris Aniszczyk, vice president, Developer Relations at the Linux Foundation.
How likely is this? Somewhat likely. Certainly, more interoperability across DevOps tools is possible, though it's not entirely clear whether any one project will be the big winner.
Why? Every developer has their own set of favorite tools, and having those different tools interoperate could make management easier and help eliminate silos. – Sean Michael Kerner
Says who? "The new challenge for the DevOps community in 2020 will be the emergence of edge computing and the demands that it will place on infrastructure teams. Edge computing requires awareness of location, greater hardware heterogeneity and unique security challenges. Developers and IT Operations personnel will need to collaborate on a new model for writing, deploying, running and managing software at the edge — ‘EdgeOps.’” — Mike Milinkovich, Executive Director, Eclipse Foundation
How likely is this? Very likely. Edge computing will likely dominate many IT discussions in 2020 as 5G becomes real and is more widely deployed around the world.
Why? For much of the last decade, developers have been told they need to move to the cloud. The cloud does not answer all problems, though, and the need for impactful, local, low-latency experiences is driving demand to the edge. It's a trend that developers would be unwise to ignore. – Sean Michael Kerner
Says who? "The environmental credentials of the cloud providers will become more of a focus as more attempt to be carbon neutral and more will increase the use of renewable energy." - Roy Illsley, Distinguished Analyst, Ovum.
How likely is this? Somewhat likely.
Why? The climate change discussion continues to grow in intensity and more folks are realizing that IT infrastructure is a big (if somewhat invisible) part of society's collective carbon footprint. Promising more eco-friendly cloud infrastructure is a way for cloud vendors not only to bolster their public image, but also to reinforce the message that cloud architectures can save money by, among other things, consuming less energy overall than on-premises data centers. Microsoft has taken the lead in promoting its renewable-energy strategy for its data centers while also pushing for a carbon tax. Other cloud providers may follow suit in the coming year.
The caveat, of course, is that clean energy tends to be more expensive energy. Finding alternative, renewable energy sources for cloud data centers that were built before eco-friendliness was a primary consideration may be logistically and financially difficult. – Christopher Tozzi
Says who? Several leaders in business, government and academia have expressed concern not only about how artificial intelligence is being used in China but also about America’s ability to keep up. For enterprises, the competition between the two countries raises questions of ethics, product development and deployment, and staffing.
How likely is this? Very likely. The U.S. is at the forefront of AI, but China is catching up. Ongoing visa and immigration concerns might benefit China by keeping talent out of the United States.
Why? Right now, the United States is leading globally in AI investment and deployment, but it shouldn’t be assumed that lead will hold. China’s massive user base and increasing investment in this space will continue to pay off, and there are implications not only for enterprises to consider but also questions of ethics, data security and global politics. – Terri Coles
Says who? Anyone with an avid interest in chat-based workspaces, but especially Slack's investors, many of whom want to see the company – valued at $23 billion on the first day it went public – provide a solid return on investment.
How likely is this? Very likely. Microsoft mulled a Slack acquisition, then decided against it to focus on Skype in 2016. However, Slack continued to entrench itself as the de facto standard for a new collaborative workspace. Thanks to the one-two-three punch of web-based app/desktop app/mobile app, Slack users could maintain continuous communications and file sharing no matter what device they were on. Thus a collaboration tool became what the computer desktop used to be: The starting point for all work activity – with the bonus of not being tied to a specific device. Meanwhile, Skype continued to skid and Microsoft folded Skype for Business into Teams and is planning to phase it out for enterprises.
The competition has been good for enterprise customers. This year, both products incorporated a wide array of features that facilitate multiple types of communication among end users – like email prompts alerting one to a workspace-based conversation they should know about – and streamline workflow processes. People can now coordinate their calendars via Slack or Teams, for example. And both products are currently looping other workflow apps into their collaboration ecosystem, like Dropbox or Trello.
While Slack's been careful to emphasize its continued paying-customer growth and Microsoft's talked about Teams as being more big-customer focused than Slack is, the two companies are effectively making the same product for the same audience – office workers. And Microsoft has started pointing out that Slack's integration of Office products is not something that happened because of company collaboration – Slack used available APIs to build the products on its own.
Why? As of right now, Slack has the cultural mindshare and 105,000 paid customers as of December 2019. But it's up against Microsoft and Microsoft's hefty customer base – all of whom will have access to Teams as it's rolled into the Microsoft services they're already using. As of March 2019, Teams' second birthday, Microsoft said more than 500,000 organizations were using Teams – that includes 91 of the Fortune 100 companies. It also recently said it has 20 million daily active users. IT departments will have to answer the question, "Why should we pay extra for Slack when we already have Teams?" As Hunter Willis, product marketing manager at Avepoint, said, "Microsoft has also been clear that Teams is its number one product focus within Office 365. Twenty million active daily users is impressive, but it represents only 10 percent of the more than 200 million Office 365 commercial users. From this perspective, it's hard to see how any enterprise productivity software has stronger growth potential in 2020." – Lisa Schmeiser
Background information: Microsoft has been sending out consistent messaging about the need to refine workflow so it necessitates fewer interruptions or mode changes due to users moving from application to application as they manage communications or act on requests that have been communicated to them. A lot of the collaboration product news dropped at Microsoft Ignite focused on increased integration of different software functions within a collaborative space so people wouldn't have to alt-tab between different apps. (For example, a new Tasks experience in Teams scheduled for release in Q1 2020 provides users with a unified view of tasks created in three separate apps – Teams, Planner and To Do – all within the Teams app.)
How likely is this? Very likely. The demand is there – a January 2018 survey of 2,000 knowledge workers across all industries in the US, UK and Australia, conducted by CITE Research on behalf of RingCentral, found that 68 percent of workers toggle between apps up to 10 times an hour, and 68 percent of workers estimate they're losing up to an hour a day to application switching. Two-thirds of them want a unified comms platform. And according to Workfront's 2018-2019 "The State of Work" report, 55 percent of U.S. employees are losing work time to email overload. Reducing superfluous communications and mode-switching are going to be big priorities for any company that wants to stay competitive in the collaboration space.
Why? Because improving productivity metrics is perceived as a net win for a business's bottom line. In the productivity space, "notification overload" has been cited as a major enemy of focus and flow. As Kamal Janardhan, Partner Director of Program Management at Microsoft, said, "We are fighting the war for attention," and that war costs an estimated $650 billion a year in lost productivity. Expect the big players in the collaboration space to roll out tools to help users manage their interruptions via notifications – without, presumably, sacrificing the ability to be reached when it's really important – and expect smaller companies to offer plug-ins and apps to define and address the problem. – Lisa Schmeiser
Background information: If every company is in the midst of a digital transformation – as Microsoft, Google, Amazon and other major players have been saying over the last year – then there is a lot of digital data out there that needs protecting. One key area for protecting this data from unauthorized access will be the expanding use of multi-factor authentication (MFA), often implemented as two-factor authentication.
How likely is this? Likely. If enterprise leaders and managers truly want to protect their business-related data and services, they must begin adopting MFA in earnest. To the benefit of these companies looking to start or grow their use of MFA, the tools are available either for free or as part of most subscription services. According to a survey by Duo Labs in November 2017, the biggest obstacle will be the perception of the difficulties around MFA. While most users began incorporating MFA into their daily security routines voluntarily (54%), many found the process stressful and frustrating, with a steep learning curve.
Why? Data security, or the lack thereof, will be expensive for businesses in this age of the General Data Protection Regulation (GDPR) and the California Consumer Privacy Act (CCPA), which goes into effect on January 1, 2020. Paying fines and then spending money to fix the business process that resulted in the loss of customer data will cost a company much more than simply implementing MFA. – Richard Hay
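The most common form of MFA – the six-digit codes generated by authenticator apps – is the TOTP algorithm standardized in RFC 6238, which can be sketched with nothing but the Python standard library (an illustrative sketch, not a hardened implementation):

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, when=None, digits=6, step=30):
    """Derive a time-based one-time password (RFC 6238) from a base32 secret."""
    key = base64.b32decode(secret_b32, casefold=True)
    # The moving factor is the number of 30-second steps since the Unix epoch.
    counter = int((when if when is not None else time.time()) // step)
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    # Dynamic truncation (RFC 4226): the last nibble picks a 4-byte window.
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)
```

With the RFC 6238 test secret ("12345678901234567890", base32-encoded as "GEZDGNBVGY3TQOJQGEZDGNBVGY3TQOJQ"), `totp(..., when=59)` yields "287082", matching the published SHA-1 test vector truncated to six digits. Because server and phone each derive the code independently from a shared secret and the clock, no code ever travels over the network until the user types it in – which is what makes TOTP cheap for vendors to bundle into subscription services.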
Background information: In October 2019, Microsoft surprised everyone when they announced two upcoming dual-screen mobile devices that will be part of the Surface line of hardware. The Surface Neo will run a variant of Windows 10 called Windows 10X while the Surface Duo, a more compact device, will run the Android operating system. This mobile device – which is really a smartphone despite Microsoft’s unwillingness to call it that – could be the new norm for an active Android update approach by an OEM.
How likely is this? Somewhat likely. Right now, the biggest challenge faced by enterprise customers who manage Android-based devices for their users is the slow pace of monthly security update releases. First-party hardware like Google’s Pixel devices consistently gets these monthly updates in a very timely manner. For other OEMs, these updates sometimes run a couple of months behind, and even longer in some cases.
Why? Many complain about Microsoft’s Patch Tuesday approach to updating its operating system and hardware. However, over the more than 10 years since implementing this approach, Microsoft has developed the processes and systems that push these updates to millions of machines every month. That regular cadence keeps Windows more secure. Microsoft could adopt this same approach with its Surface Duo device and become a lead OEM in partnership with Google to get Android security updates out at the same pace Google ships them to its Pixel hardware. – Richard Hay
If data is the new oil, IT professionals have become the new prospectors, pipeline builders and refiners. Dealing with data in 2020 will challenge IT professionals on several fronts, from storage to security to compliance – but that's just one of the many challenges IT professionals will face, according to our technology predictions for 2020.
The year ahead will also be marked with some shifts toward a new way of doing DevOps, plus new developments in ongoing conflicts between software giants, showdowns between tech giants and governments and a battle for who gets to shape the direction of AI development and deployment.
Our team at ITPro Today talked to industry experts about their technology predictions for 2020 on a range of enterprise IT topics, including DevOps, storage, cybersecurity, AI, mobility, containers, collaboration, edge computing, compliance and more. Click through the slideshow to see our 15 technology predictions for 2020 – and how they could affect your IT strategies.