A blog post by workers’ compensation insurance company Safety National asked “Could the fitness tracker you are wearing be a key to the future of workplace injury prevention?”
The insurance industry likes to talk about “wearables” like Fitbits and Apple Watches as a workplace safety tool. I would be a Luddite to say technology couldn’t be used to improve workplace safety.
But you have to be a Silicon Valley simp to blindly believe in technology as the solution for workplace safety. Case in point: Amazon. Amazon ruthlessly uses technology to monitor its warehouse employees, yet according to the Washington State Department of Labor and Industries, its warehouses have a much higher rate of injury than comparable warehouses.
The “Black Box” in workers’ compensation claims
So in short, technology can cut both ways for workplace safety. A lot depends on how the technology is used and how it is regulated. Law professor Frank Pasquale wrote The Black Box Society in 2015 to warn about the dangers of the unregulated use of data collected from people and of the algorithms used to analyze that data.
Professor Ifeoma Ajunwa applied these ideas to workplace law in the article “The ‘black box’ at work.” In that article, Ajunwa specifically mentions an employer using data about an employee’s sleeping patterns to deny a workers’ compensation claim. The insurance industry touts the use of devices like Fitbits to track employee sleeping patterns, so her concerns about employer use of data collected from wearables are well-placed.
Critics of the “black box” society have called for a “grand bargain” on the use of data in general. I think there needs to be a grand bargain for data generated in the workplace that includes employee access to that data and a right to an explanation of decisions made by algorithms.
“Black Box” workers compensation in Nebraska?
How would health data generated by a Fitbit or Apple Watch fit into Nebraska workers’ compensation? The rules of civil discovery apply in Nebraska workers’ compensation, so that data would in theory be accessible and discoverable. But if an employer is using technology from a third party, that third party may not readily provide such information. Technology companies have a history of resisting requests for information from the judicial system.
But while some of the gadgetry may be novel, the legal issues raised by the use of algorithms and technology exist apart from the technology itself. Algorithms are just complicated mathematical formulas. But those formulas can be set up to lead to certain conclusions. I remember a case where the defendant’s main argument was that my client couldn’t have gotten carpal tunnel at work because their IME doctor cited a batch of statistical studies that concluded as much.
The trial judge in that case was troubled by the defendant relying on generalized information rather than focusing on the specific work duties of my client. She won her case. At least in my experience, judges in workers’ compensation cases are willing to question conclusions drawn from generalized data or algorithms rather than from the individual circumstances of the worker.
The black box and employment law
I remember taking the deposition of an HR manager at a large meatpacking plant in a wrongful discharge case. I remember her answering that my client was “administratively terminated.”
I blurted out, “Well, aren’t you the administrator?” But I think her point was that the company wasn’t responsible for my client being fired. I frequently read HR people writing that they had “no choice” but to terminate my client for some violation of policy or rules. Having personnel decisions made by algorithm allows management to assign responsibility for a decision to a computer program. But even without algorithms and computer programs, there is an automated and inhuman quality to how HR makes decisions. An algorithm or computer program further dehumanizes the decision by incorporating that logic into the programming.