Algorithmic tracking is ‘damaging mental health’ of UK workers
Report by MPs and peers says monitoring worker performance using AI should be regulated by law
Monitoring of workers and setting performance targets through algorithms is damaging employees’ mental health and needs to be controlled by new legislation, according to a group of MPs and peers.
An “accountability for algorithms act” would ensure that companies evaluate the effect of performance-driven regimes such as queue monitoring in supermarkets or deliveries-per-hour guidelines for delivery drivers, said the all-party parliamentary group (APPG) on the future of work.
“Pervasive monitoring and target-setting technologies, in particular, are associated with pronounced negative impacts on mental and physical wellbeing as workers experience the extreme pressure of constant, real-time micro-management and automated assessment,” said the APPG members in their report, The New Frontier: Artificial Intelligence at Work.
The report recommends bringing in a new algorithms act, which it says would establish “a clear direction to ensure AI puts people first”. It warns that “use of algorithmic surveillance, management and monitoring technologies that undertake new advisory functions, as well as traditional ones, has significantly increased during the pandemic”.
Under the act workers would be given the right to be involved in the design and use of algorithm-driven systems, where computers make and execute decisions about fundamental aspects of someone’s work – including in some cases allocation of shifts and pay, or whether they get a job in the first place.
The report also recommended that corporations and public sector employers complete algorithmic impact assessments, aimed at ironing out any problems caused by the systems, and that the new umbrella body for digital regulation, the Digital Regulation Cooperation Forum, be expanded to introduce certification and guidance for the use of AI and algorithms at work.
The MPs added that the use of AI and algorithms produced a sense of unfairness and a lack of independence among workers, who are also unaware of the role personal information plays in guiding decisions about how they go about their jobs. Regulation of social media and video platforms will also be included in the online safety bill, which will become law towards the end of next year.
David Davis MP, the Conservative chair of the APPG on the future of work, said: “Our inquiry reveals how AI technologies have spread beyond the gig economy to control what, who and how work is done. It is clear that, if not properly regulated, algorithmic systems can have harmful effects on health and prosperity.”
Clive Lewis, a Labour member of the APPG, added: “Our report shows why and how government must bring forward robust proposals for AI regulation. There are marked gaps in regulation at an individual and corporate level that are damaging people and communities right across the country.”
The APPG inquiry was established after the publication in May this year of a report on the role of AI and algorithms in modern work by the Institute for the Future of Work, a research body, entitled The Amazonian Era. The report focused on retail workers and included testimony from delivery drivers and checkout workers who complained of monitoring systems and target-setting that produced high levels of anxiety.
“A lot of professional drivers will sometimes jump a red light or brake too hard because they are under time constraints and often they have to use their mobile while driving,” one supermarket delivery driver said in the report. The IFoW study also included testimony from manufacturing workers who had to log 95% of their activity on shifts, so their working day could be planned more intensively.