The transportation company Uber has been taken to court by drivers who allege they were fired by the company’s automated algorithms.

British drivers have asked courts in the Netherlands, where the company’s data is held, to overturn the automated decisions that led to their dismissal, citing a breach of Article 22 of the General Data Protection Regulation (GDPR).

Under Article 22 of the GDPR, an individual has the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal or similarly significant effects for them.

However, Uber has argued that manual review processes were in place to check and confirm these dismissals.

A spokesperson for Uber said:

As part of our regular processes, the drivers in this case were only deactivated after manual reviews by our specialist team.

Responding to these claims, Anton Ekker, a lawyer based in Amsterdam who is representing the British drivers, said:

We know for sure that Uber is using algorithms for decisions about fraud and deactivation of drivers. This is happening everywhere.

If it is automated decision-making, then the GDPR says they must have legal grounds to use such technology, and they must give drivers the possibility to object to an automated decision, which they clearly did not do.

The App Drivers and Couriers Union (ADCU), which is bringing the legal challenge, says that since 2018 it has seen more than one thousand drivers accused of fraudulent activity and have their accounts terminated without the chance to appeal.

James Farrar, the ADCU’s general secretary, said:

For any private hire operator in London, if they fire someone, there is a requirement where they have to report the driver to Transport for London (TfL).

This is putting drivers in a Kafkaesque situation where they may be called in by TfL, they’re given 14 days to explain the situation and why they should keep their licence. Our drivers are in a terrible position because they don’t know what the issue is, Uber hasn’t told them.

Yaseen Aslam, president of the ADCU, added:

Uber has been allowed to violate employment law with impunity for years and now we are seeing a glimpse into an Orwellian world of work where workers have no rights and are managed by machine.

If Uber is not checked, this practice will become the norm for everyone.

According to experts, this could be the biggest legal case yet brought under Article 22 of the GDPR.

David Greenhalgh, a specialist employment lawyer at Excello Law, says the case should act as a warning:

This class action case in the Netherlands should act as a warning to UK employers about the reliance on AI to make decisions, without any human involvement, in relation to its employees or consultants. This over-reliance on AI robots could be in relation to the recruitment process in employment, during the employment relationship (perhaps around pay increases or promotion) and in relation to disciplinary action or termination. The GDPR has teeth and class action lawyers will not be afraid to bite.

Monica Sharma is an English Literature graduate from the University of Warwick. As Editor for HRreview, her particular interests in HR include issues concerning diversity, employment law and wellbeing in the workplace. Alongside this, she has written for student publications in both England and Canada. Monica has also presented her academic work concerning the relationship between legal systems, sexual harassment and racism at a university conference at the University of Western Ontario, Canada.