In November of this year, Microsoft launched a new feature in its Office 365 suite of products: Productivity Score. The tool allows employers to track how employees use Microsoft’s software across more than 70 measures and provides a productivity score based on factors such as content collaboration and network connectivity.
Microsoft bills Productivity Score as support for “the journey to digital transformation” and as a way to improve productivity and satisfaction by suggesting more effective ways for employees to collaborate or by ensuring that software is working effectively. Such workplace monitoring tools have taken on new value now that employees who usually work on site are more likely to be working remotely, and as some companies plan to continue with remote or hybrid working environments long term.
Productivity Score is not the only recent innovation in employee monitoring to draw a swift backlash. In June, PwC announced a tool that uses facial recognition via webcams to log when remote workers are away from their screens. PwC said the technology was developed specifically for financial institutions for compliance reasons, but its potential privacy implications are wide-ranging.
To some critics, tools such as Productivity Score represent corporate overreach into employee activities — for many employees, activities currently happening in their homes — and a hyper-focus on a definition of “productivity” that may not relate well to a person’s actual job responsibilities.
The founder of Basecamp, a project management tool designed for remote work, criticized the Microsoft offering on Twitter. “Just as the reputation of a new and better company was being built, they detonate it with the most invasive work-place surveillance scheme yet to hit mainstream,” David Heinemeier Hansson wrote.
Despite the criticism, workplace monitoring tools are increasingly popular — and increasingly sophisticated, thanks to the integration of machine learning (ML), artificial intelligence (AI) and data analysis.
In September, University of Edinburgh digital health and society researcher Claudia Pagliari told The Guardian that employer surveillance was increasing, from timecard-style punching in and out to sophisticated tracking software installed on work-issued devices. And research firm Gartner found that 16% of employers are using workplace monitoring tools more frequently to track measures like internal communications, computer use and engagement. Even in 2019, half of employers were using some kind of nontraditional monitoring method, such as analyzing email content — up from 30% in 2015.
“With increasing numbers of staff now working on critical content and processes from home, chasing efficiency and applying new systems can be difficult,” said Anthony Macciola, chief innovation officer at ABBYY, a digital intelligence company. “Essentially, productivity will always take center stage.”
Several companies are making investments in this area. Besides Microsoft and PwC, Amazon is launching industrial monitoring tools that use AI; startup Headroom tracks how many words participants say in a meeting and measures attentiveness.
And one technology in this space isn’t always enough — companies already using robotic process automation are realizing that they need new AI technologies such as task mining and process intelligence for optimal efficiency and return on investment, Macciola said.
But the decisions about whether to implement these technologies, and how they should be used, involve more than an examination of return on investment. “This is ethics — the requirement for human intervention in a business or operational process to ensure that decisions are made appropriately and in a nondiscriminatory manner,” said Avani Desai, president of Schellman & Company, a global independent security and privacy compliance assessor.
“Adopter management constantly needs to evaluate the output of such tools, consider things such as reliability of data, potential bias at the input and output layers, and make course adjustments accordingly,” Desai said.
Ultimately, though data monitoring and process analysis are critical, companies also need to understand how their employees are working, Macciola said.
Task mining has a role to play here, he said.
“Task mining captures and analyzes how people interact with systems and each other through recordings and snapshots, helping companies identify and have a deeper understanding of what employees actually do when they perform a particular task and identify the common actions,” Macciola said.
The resulting data can guide automation, improve processes and provide a view of the workflow, he said. It can also be valuable in ensuring staff focus on higher-value tasks instead of more repetitive ones — and on a more granular level, as a personal productivity tool.
“For example, if an employee doesn’t see the benefit from a program in place, they can show their employer how many hours they lose with the current program in place or how it’s negatively impacting their work productivity,” Macciola said.
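To make the idea concrete, the event capture and aggregation that task mining relies on can be sketched in a few lines. This is a minimal illustration, not any vendor's actual implementation: the event schema, field names and sample data below are all hypothetical.

```python
from collections import Counter
from dataclasses import dataclass

@dataclass
class InteractionEvent:
    # One recorded user interaction (hypothetical schema for illustration)
    user: str
    app: str
    action: str

def most_common_actions(events, top_n=3):
    """Aggregate recorded events into the most frequent (app, action) pairs,
    approximating how task mining surfaces the common steps in a workflow."""
    counts = Counter((e.app, e.action) for e in events)
    return counts.most_common(top_n)

# Hypothetical capture of a repetitive copy-paste task across two employees
events = [
    InteractionEvent("u1", "excel", "copy"),
    InteractionEvent("u1", "crm", "paste"),
    InteractionEvent("u2", "excel", "copy"),
    InteractionEvent("u2", "crm", "paste"),
    InteractionEvent("u1", "excel", "copy"),
]

print(most_common_actions(events))
# The top pair reveals the repetitive step that is a candidate for automation.
```

A real deployment would capture far richer data (timestamps, screenshots, window titles), but the same frequency analysis is what turns raw interaction logs into the "common actions" Macciola describes.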
Privacy and Ethics Concerns
However, there are privacy and ethics concerns to consider.
For AI technologies, ethical considerations are right up there with employee privacy, Desai said. “Ethics is at the cornerstone of artificial intelligence,” she said. “Why that is important has to do with specific use cases.” When an organization adopts an AI-based technology, it’s important to consider its sources and uses just as you would with any other new enterprise tech.
Also, some employees may see activity tracking and task-level monitoring as a Big Brother-style encroachment into their workdays. In cases where a company has not provided a work-issued device or work-issued software, it can be difficult to ensure that only data related to an employee’s job role is tracked. (This, along with other security concerns, is an argument for providing work-issued technology for remote workers.)
“Organizations must look to implement advanced task mining tools which have protocols and settings in place to safeguard users’ personal data since they log user interactions in real time, helping avoid infringing on users’ privacy,” Macciola said. For example, Microsoft announced changes to its privacy measures within Productivity Score in a blog post published about a week after the tool was announced, including the removal of user names from the product to prevent individual employee tracking. However, different software has different privacy measures in place, and it is important to understand which are built into the tools you implement.
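One common safeguard of the kind Macciola describes is pseudonymizing user identifiers before interaction logs are aggregated, so reports can be produced per user without exposing who that user is. The sketch below is a generic illustration of the technique, not Microsoft's or any vendor's implementation; the salt value and record fields are assumptions.

```python
import hashlib

def pseudonymize(user_id: str, salt: str) -> str:
    """Replace an identifying user name with a salted SHA-256 digest.
    The same (user_id, salt) always maps to the same token, so activity
    can still be aggregated per user without revealing identity."""
    return hashlib.sha256((salt + user_id).encode("utf-8")).hexdigest()[:12]

# Hypothetical monitoring record before and after pseudonymization
record = {"user": "alice@example.com", "app": "excel", "minutes_active": 42}
safe = {**record, "user": pseudonymize(record["user"], salt="org-secret")}

print(safe["user"])  # a short hex token instead of the email address
```

Keeping the salt secret and rotating it periodically limits re-identification; stripping identifiers entirely, as Microsoft did, goes a step further by making per-individual tracking impossible.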
It’s also vital to consider the value of the data your company will generate via these applications, Desai said.
“Users of ML technology need to understand the quality, quantity and especially the limitations of source data,” she said. “As such, the old saying of ‘garbage in, garbage out’ applies, especially when business decisions are made based on the outputs of the ML technology.”
It’s important to understand what the sources of this data are, and how those sources can or cannot be effectively brought together. Enterprises also need to know what data is NOT there — for example, relevant customer data — and how those omissions can affect decision-making processes, Desai said.
Additionally, it is unwise to introduce task mining or other methods of digital tracking just because you can. A clear understanding of why the software and resulting data are necessary is vital — for employers and employees.
“In a nutshell, it is the alignment of ‘people, processes and data’ that is critical to the success of any digital transformation effort,” Macciola said. Understanding how your employees are doing their jobs depends on more than just what you can learn through even the most innovative software.