2018: What Mattered in IT?

What We Got Right (and Wrong) in Our 2018 Predictions

The ITPro Today team looks back to the claims we made on the cusp of 2018 to see how they panned out this year.

In some ways it seems like it couldn’t possibly have been 12 months ago that the ITPro Today team was sitting down to make our 2018 predictions, yet here we are. To keep ourselves accountable, we are looking back to the claims we made on the cusp of 2018 to see how they panned out this year.


"If you're looking for a trend to watch in 2018, you could do worse than keeping your eye on the blockchain phenomenon." -- Christine Hall

I wrote those words on New Year's Day in 2018. Not only that, I went on to use my crystal ball to predict that within a year we'd find the technology beginning to have an impact on everyday online life.

For most of us, that day has yet to arrive, which I'd come to accept by late summer when I opined that "the blockchain technology hasn't taken off as quickly as, say, containers, which went from unknown to everywhere at something like the speed of light."

So what went wrong? Nothing, really. Blockchain is still on track to be a technology that helps define the connected world going forward, but maybe (OK, definitely) not as fast as I first thought. Part of the reason blockchain is taking time to get up to speed as a disruptor is technical: developers are still working through issues with the technology itself. Perhaps a more important reason was expressed by Chris Ferris, CTO of open technology at IBM and the chairperson of the Hyperledger Project's technical steering committee, when he told me at OSCON, "I think people are, in just about every industry, still looking for use cases that are valid."

There's also a bit of herding cats when it comes to blockchain, since most implementations require that several companies within an industry get on the same page.

None of this means that we won't be interacting with blockchain on a daily basis soon -- just not as soon as I initially thought.


"In 2018, expect to see a lot more machine learning capabilities built into public cloud as hyperscale cloud vendors embrace artificial intelligence trends, and help users see how the tools can add business value." -- Nicole Henderson

I’ll admit that it was far from a bold claim to predict that hyperscale cloud vendors would move to provide more support around artificial intelligence (AI) and machine learning in 2018, since they were already on that trajectory this time last year. But 2018 brought cloud providers to the center of the conversation and solidified their role in providing education around the potential machine learning brings to enterprise IT.

For example, in November at AWS re:Invent, Amazon Web Services (AWS) opened up its machine learning training to its end-users, including developers, data scientists, data platform engineers and business professionals. The company said it is the same training its own employees use. And it is free.

Cloud vendors understand that this education is not only in demand but also very expensive, and that training end-users on their own flavor of machine learning will only mean more customers in the long run.

“I think that machine learning is probably top of the hype cycle where virtually every company I talked to either is doing machine learning or more common, are planning on doing some machine learning, or have been told they need to do some machine learning, and it is far from easy, far from well-understood,” Christian Kleinerman, VP of product at Snowflake, said in an earlier interview with ITPro Today.

If the thought of a self-paced online curriculum makes your eyes glaze over (hey, online training is not for everyone), AWS has thought of you, too: In November it also launched DeepRacer, an autonomous model race car to help developers train on machine learning. The micro-racer is a way for end-users to practice reinforcement learning (RL), an advanced type of machine learning. As we move into 2019, I expect the number of training opportunities to explode, especially as earlier fears that artificial intelligence would replace all the jobs are beginning to seem like a bit of hyperbole.
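For readers new to the concept DeepRacer teaches: reinforcement learning trains an agent through reward feedback rather than labeled examples. DeepRacer's own training runs on AWS's services, but the core idea can be sketched with tabular Q-learning on a toy one-dimensional "track" -- every state, reward, and parameter below is illustrative, not anything from AWS's actual API:

```python
import random

# Toy "track": positions 0..5. The agent starts at 0 and earns a reward
# of +1 only for reaching position 5. Actions: 0 = step left, 1 = step right.
N_STATES, GOAL = 6, 5
ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.1  # learning rate, discount, exploration rate

q = [[0.0, 0.0] for _ in range(N_STATES)]  # Q-table: q[state][action]

random.seed(0)
for episode in range(500):
    state = 0
    while state != GOAL:
        # Epsilon-greedy: usually exploit the best-known action, sometimes explore.
        if random.random() < EPSILON:
            action = random.randint(0, 1)
        else:
            action = 0 if q[state][0] > q[state][1] else 1
        next_state = max(0, state - 1) if action == 0 else min(GOAL, state + 1)
        reward = 1.0 if next_state == GOAL else 0.0
        # Q-learning update: nudge the estimate toward reward + discounted future value.
        q[state][action] += ALPHA * (reward + GAMMA * max(q[next_state]) - q[state][action])
        state = next_state

# After training, the greedy policy should always step right, toward the reward.
policy = [0 if q[s][0] > q[s][1] else 1 for s in range(GOAL)]
print(policy)  # -> [1, 1, 1, 1, 1]
```

DeepRacer applies the same loop at much larger scale: the "state" is a camera image, the "actions" are steering and throttle choices, and the reward function is the part the developer writes.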


"In 2018, expect to see at least one story about the data being collected on users -- and most likely we'll get that story in one of two ways: When the app maker brags about how it used that data we didn't know it was collecting, or when a cracker brags about the ransom it gets when it invariably breaches the app maker's servers." -- Lisa Schmeiser

Okay, so I was half-right. This was the year we discovered -- repeatedly -- that everyone had been collecting data. We learned it when businesses had to admit to security breaches. We learned it when the European Union's General Data Protection Regulation hit and suddenly businesses were vulnerable to new compliance rules. And we learned that sometimes, companies had been persistently dishonest about what data they were collecting and how they were using it.

For example, we learned that Facebook had been collecting user data and passing it along in all manner of undisclosed ways -- repeatedly. See also: "Facebook CEO Vows to Alert All Whose Data May've Been Exposed" (March 2018), "Facebook Scandal a 'Game Changer' in Data Privacy Regulation" (April 2018), "Facebook Updates Data Policies After User Privacy Outcry" (April 2018), "Facebook Suspends 200 Apps in Investigation Over Data Abuse" (May 2018), "Facebook Disputes Report It Grants Phone Makers Deep Data Access" (June 2018), "If You're A Facebook User, You're Also a Research Subject" (June 2018), "Facebook Says Security Breach Affected About 50 Million Accounts" (September 2018), "Facebook Wielded User Data to Reward, Punish Rivals, Emails Show" (December 2018), "Facebook Bug Gave Developers Broader Access to User Photos" (December 2018) -- to name only a few of the stories we ran this year on one company's ever-expanding list of ways in which it played fast and loose with its user data.


"As our workforce grows increasingly reliant on a liquid computing environment where workers move between desktop, tablet and phone, we'll also be increasingly dependent upon app makers who are under no obligation to tell us what data they're collecting about us while we work." -- Lisa Schmeiser

We're seeing this in both the mobile app and desktop app spaces -- and machine learning is powering a lot of that data collection. Both Google and Microsoft now rely on AI to streamline email replies and nudge conversations along. Microsoft, in fact, has incorporated its data collection about users into its collaboration and productivity features across the board: In June, it brought "zero-query search" -- which anticipates your search queries with "recommendations powered by AI and the Microsoft Graph" -- to Outlook, a feature necessarily based on data the company collects about users; in September, during the annual Microsoft Ignite event, the company talked about how its customers can now use data collection on their workers at either an organizational or an individual level; and by November, we learned the company is expanding its data-fueled AI to shape how people work in Word, too.

Some of the metrics Microsoft shares with users seem transparent -- my weekly email shows who I interact with most, how scheduled meetings ebb and flow in my calendar, and how much time I spend answering email. But there's nothing anywhere in that experience that tells me whether Microsoft is tracking other data I'm generating. And right now, there doesn't have to be.


"While [Microsoft] continued work in many areas, it is these kinds of advanced technologies [around quantum computing] that become reality, and they will be part of powering their products and services in 2018 and beyond." -- Richard Hay

In 2018, Microsoft continued its momentum in making quantum computing accessible to developers at all levels. During 2018, the company posted 22 entries on its Microsoft Quantum blog to provide deeper insight and new information about its quantum computing work.

Among the areas of focus were the Quantum Development Kit, the scalability and practicality of quantum computing, and how to solve problems using the technology.

Microsoft is not the only big cloud provider working with quantum computing. Google currently focuses the technology on its artificial intelligence efforts, while Amazon -- after first floating a "Quantum Compute Cloud" (which it called QC2 at the time) as an April Fools' Day joke in 2010 -- has been pretty quiet on this front, except for a recent blog post expressing support for the National Quantum Initiative Act in Congress.

In an area that has largely been inhabited by scientists and research organizations, quantum computing has the potential to help us solve complex problems in a fraction of the time it would take conventional computers. While the day when quantum computers are routine is still some time off, the groundwork for the technology is being laid now. If you work in an industry that deals with complex data and calculations, this is an area you should continue to watch.


"Microsoft should ... shift to one feature update each calendar year. This will still get new capabilities out into the operating system much faster than the three-year cycle of service packs that existed in previous versions of Windows and yet give organizations a more manageable update cycle." -- Richard Hay

In 2018, things got a lot more stressful for Windows 10. Both the April 2018 Update and the October 2018 Update faced challenges as they arrived. The issues were so bad with the October 2018 Update that it had to be pulled after its public release because of a serious data loss issue related to new OneDrive functionality and how some key folders were handled. It took more than a month to fix the problem and get the update back out to the public.

While the April 2018 Update release was delayed by 20 days, irritating enthusiasts and Windows Insiders who follow these things regularly, the fact that the October 2018 Update had to be pulled due to data loss was a significant setback for the Windows team. 

Following a month of testing, the October 2018 Update was re-released to users who manually checked for the update via Windows Update.

A few days after that re-release began, the new Corporate Vice President of Windows, Michael Fortin, penned the blog post "Windows 10 Quality approach for a complex ecosystem" on the official Windows Blog. In it, he went into detail about the history of Windows 10 since its July 2015 release and the metrics the company uses to measure its reliability and quality. Fortin promised more transparency and responsiveness as the team moved forward from this challenging release.

Since then, there have been blog posts detailing the process behind the monthly security and quality updates for Windows and for hardware drivers in the Windows ecosystem. Microsoft is also now updating the Windows 10 Update History page with a list of systems blocked from the feature update because of ongoing issues. This is a good start on the transparency promises, but we may have to wait until the next hitch in an update to see how responsive the company will be in communicating about issues in real time.

If the current development cycle of the next feature update, Windows 10 19H1, is any indication, Microsoft may already be focusing on fit-and-finish work. That low-key approach was part of the April 2018 Update, and it might be the focus for future updates as well.
