Tech giant Facebook has been accused of breaking UK law after an investigation revealed that the company’s algorithm overwhelmingly targeted certain job adverts to specific genders.

An investigation carried out by Global Witness, a London-based NGO, found that UK job openings on Facebook’s advertising platform were being promoted to specific genders.

Global Witness submitted two job adverts to Facebook’s advertising platform, asking for one not to be shown to women and the other not to be shown to anyone over the age of 55.

Although the adverts were ultimately pulled before they went live, Facebook approved both, asking only that the advertiser confirm they would not unfairly discriminate against the excluded groups.

To test for bias further, Global Witness also submitted four job adverts to Facebook, each linking to a vacancy listed on Indeed, a job search platform.

When specifying the target audience, the NGO only asked for the adverts to be shown to UK adults, stipulating no specific gender.

This, the lead investigator stated, would mean “that it was entirely up to Facebook’s algorithm to decide who to show the adverts to”.

The experiment revealed that two of the job roles – mechanic and airline pilot – were overwhelmingly targeted towards men (96 per cent and 75 per cent respectively).

Conversely, the remaining two adverts, searching for nursery nurses and psychologists, were pushed more towards women (95 per cent and 77 per cent).

As such, the NGO has accused Facebook of perpetuating existing recruitment biases and claims its practices have breached the UK’s equality laws.

Ravi Naik, a data-rights lawyer acting for Global Witness, stated:

[These findings are] massively consequential because Facebook’s entire business model is advertising and if that business model results in discriminatory practices, that undermines the ability of Facebook to operate properly in this country.

However, a spokesperson for Facebook said:

Our system takes into account different kinds of information to try and serve people ads they will be most interested in, and we are reviewing the findings within this report.

This is not the first time a technology giant has been accused of bias in recruitment.

In 2018, Amazon found that an AI tool built to review candidates for software development jobs was biased in favour of men. The system had taught itself that male candidates were preferable because it was trained on the CVs of previously successful applicants, most of whom were men.

Following its findings, Global Witness has filed a complaint against Facebook with the Equality and Human Rights Commission (EHRC) and the Information Commissioner, warning of discriminatory practices.


Monica Sharma is an English Literature graduate from the University of Warwick. As Editor for HRreview, her particular interests in HR include issues concerning diversity, employment law and wellbeing in the workplace. Alongside this, she has written for student publications in both England and Canada. Monica has also presented her academic work concerning the relationship between legal systems, sexual harassment and racism at a university conference at the University of Western Ontario, Canada.