Before artificial intelligence takes your job, it might become your boss.

While lawmakers, unions and experts have raised concerns about AI displacing workers, a new report from Cracked Labs offers a window into how intrusive AI-powered technologies can be for employees who still have their jobs — and offers some thoughts on where it all might be heading.

Companies have used automated tools to track and manage workers for years — a 2019 survey found that 45 percent of companies were monitoring their employees’ keystrokes and mouse clicks. The rise of AI has accelerated concerns about how that data is used and what the impact on workers will be — enough that the Biden White House put out a call this summer for examples of the technology, noting cases of it already being used on nurses, warehouse workers, drivers and call center employees.

Cracked Labs, an Austrian nonprofit focused on consumer organizations and unions, took a different approach: It went deep on a single employee-tracking company, virtually unknown to the public, and documented exactly how it promises to track and manage people in the workplace.

The new report, published by researcher Wolfie Christl on Wednesday, focuses on Celonis, a German software company that describes itself as “the global leader in process mining.” The company is listed among the top five “AI private investment activities” in Stanford University’s 2023 AI Index Report, with $1.2 billion in funding.

Celonis promotes its services as a way to find and fix workplace inefficiencies. Its pitch to clients is based on collecting data on employees — everything from what they’re typing and clicking on to the content, length and sentiment of call center workers’ conversations, according to Christl’s analysis. Celonis markets itself on its AI capabilities, using algorithmic techniques to crunch the data and make companies more efficient, though Christl cautions that AI is a vague term, and that the deeper concern is how closely workers are watched and how the information is used.

Constantly watched: In one of the examples Christl found, the company discusses its capabilities for an airline’s customer service call center. The documentation shows the software tracking every stage of a call, from when the worker picks up, to the discussion of the reason for the call, to when the call is resolved. It also checks how closely the employee follows a script, whether they offer to sell the airline’s credit card and the worker’s “sentiments” during the call.

Contacted about the details in the report, Celonis neither disputed nor confirmed any of the individual descriptions of its products, but offered a blanket statement that its software comes with safeguards to prevent abuse, noting also that the use of AI in the workplace is at its customers’ discretion. “Celonis is committed to using AI responsibly and upholding the highest standards of accountability and responsibility,” said Anna Rocke, the company’s director of data privacy.

The Cracked Labs report is part of a broader series on the impact of surveillance and digital control in the workplace. Beyond feeding data to management, Christl raised concerns in an interview that such granular activity logging could give employers a way to boil down even expert office work into step-by-step instructions that anyone could follow. “It could lead to lower-scale workers who just need to follow guides, not people who know a lot about the systems,” he said.
(Celonis does not offer this capability, but Christl believes it would be plausible for managers to use the data for this purpose.)

More broadly, Christl argues that this granular level of data collection widens an already significant power imbalance between bosses and their employees, who can have every move they make tracked and used against them.

“This is not just an issue about data, this is an issue about how do we want to balance the rights of employers and employees?” Christl said. “We have to look at the broader picture: How can employers use technology to achieve their goals, and how can we counterbalance that so they cannot use all the data and shift all the power to themselves?”

Putting limits on early: The U.S. doesn’t have any laws restricting how AI can be used to manage employees, though agencies like the National Labor Relations Board have issued memos signaling that they intend to use existing labor laws to protect workers’ rights. In May, the Biden administration published a request for public comment asking how automated tools are used to surveil, monitor, evaluate and manage workers, to help understand how the technology works and to develop best practices for preventing risks.

Christl notes that in countries like Germany, workers have a right of “co-determination” that allows them to be involved in managerial decisions, and that they have the right to be informed when AI management software is being deployed. He recommended setting limits on what data collected at work can be used for and on the extent to which workers can be tracked.

“Workers and union representatives would be advised to carefully examine the introduction of a system like Celonis at their company and demand the highest level of information and participation in this system,” Christl said.