TUC taskforce to address ‘punitive’ use of AI in staff monitoring


One in seven workers say that monitoring and surveillance at work has increased during the Covid-19 pandemic, prompting the TUC to launch a taskforce with the aim of protecting workers from ‘punitive’ forms of performance management.

According to the trade union body, there has been a rapid increase in the use of artificial intelligence and other emerging technologies to monitor employees’ performance, assess candidates’ suitability for roles, and even provide day-to-day line management and make redundancy decisions.

Research for its Technology Managing People: The Worker Experience report found that just 31% of staff were consulted by their employer when new forms of technology were introduced, and 56% felt the introduction of technology to monitor employees was damaging trust between workers and organisations.

When asked whether it was possible that AI-powered technologies were being used at their workplace without their knowledge, 89% said either “yes” or “not sure”.

Only 28% were comfortable with technology being used to make decisions about people at work and only 5% said they would trust the decisions made about them by AI, machine learning and algorithms.

TUC general secretary Frances O’Grady said: “Worker surveillance tech has taken off during this pandemic as employers have grappled with increased remote working.

“Big companies are investing in intrusive AI to keep tabs on their workers, set more demanding targets – and to automate decisions about who to let go. And it’s leading to increased loneliness and monotony.

“Workers must be properly consulted on the use of AI, and be protected from punitive ways of working. Nobody should have their livelihood taken away by an algorithm.

“As we emerge from this crisis, tech must be used to make working lives better – not to rob people of their dignity.”

The report cited a recent survey conducted for the European Commission, which found that 42% of organisations currently use at least one AI technology, a quarter use at least two types, and 18% plan to adopt AI technologies in the next two years.

When asked about their experience of technologies making or informing decisions about them at work, 22% of workers said they had experienced this for absence management, 15% for ratings, 14% for work allocation, 14% for shift scheduling and 14% for the assessment and allocation of training.

Many workers managed by AI described a sense of loneliness and pressure. One worker told the TUC their working life had become “increasingly robotic, alienating, monotonous and lonely”.

In response to the report’s findings, the TUC is launching a taskforce made up of trade union representatives and legal experts to develop new proposals on the use of AI and monitoring technology at work.

Next year, the taskforce and AI law experts Robin Allen QC and Dee Masters will publish a legal report on the issue.

The taskforce aims to:

  • Enable collective bargaining on the use of technology and data
  • Achieve more worker consultation on the development, introduction and operation of new technologies
  • Empower workers and unions with the technical knowledge and understanding to negotiate better deals for workers in organisations that wish to use such technologies.

The report used polling data from a BritainThinks survey of 2,133 workers in England and Wales conducted between 31 July and 5 August 2020, and two TUC surveys of workers and trade union reps, which received 1,000 responses.

