Employee Work Tracking Is on the Rise
New research reveals how people feel about being judged by algorithms—and why employees should have first dibs on their data.
Employers have always kept tabs on employees, but with the rise of remote work during the pandemic, such monitoring has reached absurd extremes. It’s boom times for sellers of “tattleware,” which can track hours worked and applications used, and even secretly record keystrokes and capture screen images. In a July survey of 2,000 employers, a whopping 78 percent said they use monitoring software, most of it implemented in the last six months.
At companies such as PwC and Teleperformance, bosses can mandate that webcams stay on, or that IT staff be able to remotely activate the camera on a worker’s device at any time. Not surprisingly, some workers find that sort of oversight intrusive and demoralizing, but new research reveals that not all behavior tracking is the same. People are much more accepting of, even motivated by, tracking done by algorithms rather than by other people, and they may actually embrace having their daily work tracked if it is meant to inform them of their performance rather than to creepily control them.
Smartwatch use jumped in 2015, and the widespread adoption of nanny-like exercise apps intrigued Roshni Raveendhran, a business professor at the University of Virginia’s Darden School of Business and the primary author of a recent paper on people’s perceptions of tracking. “If our partners came in and said, ‘Hey, you’ve been sitting too long, take a walk,’ or if our bosses said, ‘Hey, you’ve been on this website too long,’ that’s really aversive,” Raveendhran says. “And yet many of these devices seem to do that, and we actually welcome it.” She and her co-author, Nathanael Fast, a management professor at the University of Southern California, wondered what was going on.
Those told they’d received feedback from humans versus algorithms were twice as likely to freely report that they felt judged.
Their paper, published in May in Organizational Behavior and Human Decision Processes, describes a series of experiments. First, they established that people prefer tracking by algorithms over tracking by people. In one experiment, undergraduate participants were told they’d be demoing a program called Aptitude Tracker that would provide real-time feedback during an aptitude test. It would send messages comparing them with others in terms of time spent on questions, use of online calculators, and other metrics. The majority of participants, 79 percent, said they’d prefer the version in which the messages were sent by an algorithm rather than a human. (The feedback systems were described as equally accurate.) Afterward, those told they’d received feedback from humans versus algorithms were twice as likely to freely report that they felt judged (36 percent versus 17 percent).
In a second experiment, MBA students read two job descriptions. Both jobs required wearing a “sociometric badge,” which would record time spent at their desk, amount of face-to-face interaction, and time spent talking in meetings. Employees would receive feedback based on the metrics. At one company, an algorithm analyzed the data, while at the other a human performed the task. Participants said they’d feel less judged at the company using an algorithm, and as a result said they’d feel greater autonomy, and in turn would prefer to work there.
“It’s interesting to see people are more willing to accept an automated Big Brother compared to a human Big Brother,” says Shyam Sundar, a media studies professor at Pennsylvania State University who studies human-computer interaction.
Why do people prefer machine tracking?
One possible reason participants preferred tracking by algorithm is that they thought humans and computers would use the data differently. Raveendhran and Fast tested that hypothesis in the next experiment. Employed American adults, surveyed online, imagined working at a firm that used sociometric badges. There were four conditions: analysis was done by a human or a computer, and was meant for personal feedback or for personnel decisions. Respondents rated their acceptance of the tracking and their intrinsic job motivation—an internal drive that has been shown to enhance performance and wellbeing. Participants preferred, and felt more intrinsically motivated by, the scenarios in which tracking was meant for feedback rather than control—whether the tracker was human or algorithmic. More interestingly, whether the tracking was meant to inform them or control them, they also preferred the scenarios in which algorithms rather than humans did the tracking. So people don’t prefer algorithmic tracking merely because they think computers are less likely than people to use the data against them.
A fourth study revealed that workers would feel most intrinsically motivated when there was no tracking at all, followed by tracking by an algorithm, a stranger, and a coworker, in that order. Finally, the researchers found that whether a smartwatch provided feedback on work behaviors or health behaviors, algorithmic rather than human analysis reduced the feeling of being judged and thus increased willingness to use a tracking device.
In follow-up studies, the researchers are looking at differences in who’s open to tracking, based on variables such as age, ethnicity, and socioeconomic status, as well as tracking’s effects on perceptions of privacy and on actual performance.
Raveendhran says some work suggests that people are equally productive working at home or the office. “So then the question becomes, What, really, is the purpose of this tracking?” she says. “Is it more of a default? ‘Okay, we need to monitor them now that they’re not in the office.’ It seems a little unnecessary.”
“You, as the employee, should get first dibs on your data.”
While she doesn’t see the tech going away, she does think companies can pivot to use it differently and more effectively. First, they should automate as much as possible. There’s little appetite for real-time human monitoring of an employee in most cases anyway. “I’ve talked to enough managers to know that, especially during the pandemic, they’re like, ‘I don’t even want to look at some of these data. I don’t care about the minute details,’” she says.
Second, companies should use it for feedback, not control. “You, as the employee, should get first dibs on your data,” she says. You might see that you’re more productive in the afternoons or you aren’t talking much in certain meetings. “You can then use it as a coaching opportunity. You go to your manager and say, ‘Hey, this is what I’m seeing with my data. Could you help me?’”
Sundar adds that users should have a chance to interact with and correct tracking systems. For example, if a fitness tracker miscounts your steps by including movements and gestures that you don’t consider to be steps, you should be able to adjust its settings. “Customization affords a greater sense of user agency and gives users more autonomy,” he says.
Raveendhran summarizes the big lesson from her study: “Tracking does not need to always be monitoring. Tracking can be empowering.”