As long as there have been employees, there has been workplace surveillance — the boss with the keen eye, the manager with quotas and a notebook, the timeclock and punch card.
Frederick Winslow Taylor, an influential early-20th-century management consultant, is known for the tenet that an unobserved worker is an inefficient one.
But the nature of surveillance has changed dramatically in both scale and scope since Taylor's time, and the global pandemic increased both the use and the awareness of monitoring, not least because it affected more white-collar workers.
From listening in on chatter and phone calls, to monitoring employees' computer screens, and now to gauging the emotional condition of workers with artificial intelligence, the possibilities and the pitfalls are endless.
There can be good reasons for a company to monitor its employees, but it’s also easy for so-called ‘surveillance creep’ to establish a foothold and drive surveillance to excess. That can lead to ethical and legal problems, not to mention a workplace culture poisoned with distrust.
In the end, productivity, trust, and efficiency can all suffer — the very problems a company is likely trying to avoid.
“The research does indicate that there is an expansion or an intensification of surveillance taking place in the workplace,” said Joe Masoodi, a senior policy analyst with The Dais, a public policy and leadership institute at Toronto Metropolitan University, and a co-author of Workplace Surveillance and Remote Work.
The urge to overreach
Masoodi said a certain amount of monitoring can contribute to professional growth and development for employees, but employers need to be careful not to overreach. He pointed to guidelines from the Privacy Commissioner of Canada as an example of what's reasonable.
But just as employees settle into new routines or hybrid work, managers can struggle to rein in their own productivity paranoia. Workers, meanwhile, struggle to appear productive, burning themselves out in the process.
The motivations for taking things too far are abundant, but so too are the technologies.
Contemporary surveillance can track the keystrokes of employees or how often they move their mouse. Video cameras can, and do, record employees at work and home, while some systems allow managers to view employee computer screens throughout the day and track which apps and websites they’re spending their time on.
There’s plenty of indication that those techniques can backfire, with employees engaging in “productivity theatre” under even benign or imagined surveillance that comes with the expectation of being online, or resorting to outright sabotage to push back against real surveillance.
“It could lead to a lot of distrust between employees and managers,” said Masoodi.
“It could contribute to increased absenteeism, there’s a higher turnover rate. It could also introduce this [situation] where individuals resist the forms of surveillance that are taking place. So they create countermeasures to counteract that surveillance. For instance, they’ll buy a mouse that kind of periodically moves.”
The rise and the risk of emotional AI
New AI systems are designed to take things a step further, monitoring employees' emotional states to gauge everything from interest and mood to security threats, crossing a threshold from productivity into employees' innermost privacy.
The countermeasures that accompany that level of surveillance can be just as personal, and just as harmful to workers.
Research from the University of Michigan on emotional AI monitoring shows employees have strong reactions to the technology, and can engage in emotional labour in order to mask their true feelings and comply with expectations implied by the technology.
Respondents in the research described psychological, emotional, and physical harms from the use of emotional AI surveillance, along with fears about reputational and employment impacts.
The practice can also expose existing inequalities based on race, gender, and socioeconomic status, whether through power imbalances that put greater pressure on some employees or through the technologies themselves.
“We don’t know how those algorithms are designed, what types of biases they include, and how they make decisions,” said Masoodi. “They’re often very opaque and black box, and very often they can be discriminatory. It’s an area of concern.”
There are also questions about the accuracy of things like facial recognition software that can be part of emotional surveillance, particularly across cultures.
Communication and transparency are critical
According to Masoodi, companies selling monitoring technologies lure employers with promises of increased efficiency and profits, but not all workplaces are the same and not all employees will thrive under a watchful eye.
Reid Blackman, a consultant and author on ethics and technology, said surveillance should be approached cautiously and recommends employers decide what they want to measure and why as a first step.
But he, along with plenty of research, also says communication and transparency are key if an employer wants to avoid backlash.
When employees know they are being watched, and the reasons why, they are far more likely to accept the monitoring.
“A clear place to start is, in a meaningful way, to bring workers into the process of determining what technology will be used, how the data it collects will be treated and who will have access to those data, and really thinking through how the technology can help workers to accomplish their work, rather than as a threat or a policing tool,” Karen Levy, an assistant professor of sociology at Cornell University and author of Data Driven, told the BBC.
That involvement can reduce the stress of surveillance and ease anxiety around what exactly is being watched.
Others, like the University of Michigan researchers, have found that even employee engagement and policy changes would likely be insufficient to mitigate the impacts of emotional AI surveillance on employees.
Beyond ethics, legal challenges await
Those ethical and equity considerations aren’t the only concern, however. Employers also have to think about how the legal and policy landscape can, and will, change as new technologies force governments to wrestle with the consequences.
In Canada, Bill C-27 would introduce new privacy requirements for businesses covering workplace surveillance data, consent for collecting that data, and the use of AI technologies. It also comes with fines ranging from $10 million to $25 million for violations.
In the U.S., the National Labor Relations Board is pushing back against the use of AI surveillance technologies in the workplace.
Those are just two examples in a rapidly changing global regulatory environment.
But ultimately, the decisions companies make, and the way they balance understanding their workforce against treating it with respect, are downright personal.
“There’s a very direct relationship between the surveillance technologies, and the direct impact on workers,” said Masoodi. “It could influence whether a person is hired, or whether they’re promoted, whether they’re demoted, whether they’re terminated.”
It can also determine whether an employee wants to work with a company in the first place.