“There are still many questions that are not answered” – Nicolas Kayser-Bril on investigating algorithmic discrimination on Facebook

When deciding who to show an ad to, Facebook relies on gross stereotypes

 

In a special guest post for OJB, Vanessa Fillis speaks to AlgorithmWatch’s Nicolas Kayser-Bril about his work on how online platforms optimise ad delivery, including his recent story on how Facebook draws on gender stereotypes.

Kayser-Bril first became aware of automated discrimination when he read about an experiment by researchers at Northeastern University in the US. Seeing that the analysis could be replicated in Europe, he decided to take a closer look at the ad-delivery systems of Facebook and Google.

“Automated systems are supposed to bring relevant content to the users,” says Nicolas. “And I use ‘relevant’ because it’s the adjective that Facebook uses — and there is a sense that relevant content is determined based on the actions of the users themselves.”

But in reality, everything Kayser-Bril knows about large-scale automated systems like Facebook’s news feed suggests that their decisions about what to show to a user are instead based on many different factors.

“This includes the activity of users,” the journalist says, “but also the activity of other users and other unknown variables which we suspect are aimed at increasing the income and profitability of the firms deploying these systems.”

Identifying algorithmic discrimination: how the experiment was funded 

For the experiment Kayser-Bril worked together with Moritz Zajonz, a German data journalist. To get the money to buy the adverts, they applied to Journalismfund.eu.

Overall, they had a budget of 1,200 euros. Their estimates were based on the American experiment, and since ads in Europe are cheaper than in the US, they didn’t use all of their budget.

“You need a lot of money to do this kind of research, but it’s not impossible,” Kayser-Bril says.

“I’m pretty sure many newsrooms can afford spending 1,000 euros on this.”

Collaborating with different media partners meant working closely with them on how to tell the story. They were also able to run some additional experiments based on the partners’ own hypotheses.

Conducting the experiment: advertising jobs with and without a gender bias

Kayser-Bril and Zajonz advertised for six different types of jobs in five countries. Each ad pointed to the jobs site Indeed, which was chosen because it is available in all five countries.

Some of the selected jobs had a strong gender bias; others did not.

“Regarding the jobs with a gender bias, we took jobs where the gender bias was stereotypical, but not factual.

“One type of job that is stereotypically female and actually female would be midwives, for example. So, we didn’t take this type of job. We chose nurses and childcare workers which are stereotypically female, but in reality they are not exclusively female.”

They then chose pictures to illustrate each of the occupation categories and ran the ads on Facebook and Google.

Another important consideration was not to advertise fake positions: Nicolas believes this would not have been ethical.

“We didn’t send people to individual jobs, but sent them to the search results [for similar jobs]. We had something like 300,000 impressions and a few thousand clicks, so maybe some people who were looking for jobs actually applied. This wouldn’t be a bad outcome.”

Each ad ran for just a few hours. Kayser-Bril would have preferred a longer duration, but it wasn’t possible with the budget.

“Also, Facebook suspended several of our campaigns on spurious grounds,” he adds.

Analysing the data to identify whether gender was used to target ads

While the ads were online, they pulled the data at several points so they could see how the numbers changed over time.

For each ad, they had data from Facebook and Google on how many males and females it was shown to.

“That’s how we could show that Facebook was not using the actual user behaviour to decide who to show the ad to, but already made the decision before an interaction had taken place.

“From the beginning there is already a huge difference between male and female.”
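This kind of first-snapshot check is straightforward to reproduce. Below is a minimal sketch in Python: it assumes the per-ad delivery breakdown has already been exported from the platforms’ reporting tools at several points after launch, and all figures and times are hypothetical, not the study’s data.

```python
# Minimal sketch: test each delivery snapshot against an even gender split.
# All numbers below are hypothetical, not AlgorithmWatch's data.
from scipy.stats import binomtest

# (hours after launch, impressions shown to men, impressions shown to women)
snapshots = [
    (1, 480, 1520),
    (3, 1450, 4550),
    (6, 2900, 9100),
]

for hours, men, women in snapshots:
    total = men + women
    # Probability of seeing a split this lopsided if delivery were 50/50.
    p = binomtest(women, total, p=0.5).pvalue
    print(f"t+{hours}h: {women / total:.1%} women (n={total}), p={p:.2e}")
```

The earliest snapshot is the decisive one: a large, statistically solid skew within the first hour cannot be explained by how users later interacted with the ad, which is the point Kayser-Bril makes above.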

Adapting to problems and obstacles 

Facebook suspending some of the ads wasn’t the only problem they came across. On Google, they weren’t able to use a feature called “uncapped bids”.

When running an online campaign, ad buyers have the option to tell the platform to uncap the bids. In that case, Facebook or Google decides how much to bid for each impression.

The other possibility is to cap the bids, which means setting a maximum amount of money that the advertiser pays for each click on the ad.

As a consequence, they had to limit their bids on Google.

“This probably explains why we did not find much discrimination on Google because apparently the discrimination does not take place when advertisers use capped bids.”
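To make the capped/uncapped distinction concrete, here is a toy sketch of the two modes. It illustrates the general mechanics only; it is not either platform’s real auction logic, and the numbers are made up.

```python
# Toy illustration of capped vs. uncapped bidding -- not real auction code.

def uncapped_bid(platform_estimate: float) -> float:
    """The platform picks the bid per impression on the advertiser's behalf,
    so it can bid more aggressively for users its models score highly --
    which is where delivery optimisation (and any skew) has room to act."""
    return platform_estimate

def capped_bid(platform_estimate: float, max_cpc: float) -> float:
    """The advertiser sets a ceiling (a maximum cost per click);
    the platform may bid below the cap but never above it."""
    return min(platform_estimate, max_cpc)

# With a 0.30 cap, a user the platform's model values at 0.80
# cannot be pursued with a higher bid:
print(uncapped_bid(0.80))      # 0.8
print(capped_bid(0.80, 0.30))  # 0.3
```

If that reading is right, capping removes some of the platform’s freedom to chase its preferred audience, which would fit Kayser-Bril’s observation that the discrimination did not appear under capped bids.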

Getting comments from Facebook and Google also proved difficult.

“After we sent Google our questions, they suspended our advertiser account on the false pretence that we were trying to disturb their advertising systems.

“Facebook called me after we sent them our questions and then asked to see all of our data and have the names and contacts of all the journalists we were working with.

“Obviously, I did not comply with this request.”

In the end, neither Facebook nor Google answered.

Unanswered questions

The results were very similar to the findings in the US, suggesting that Facebook’s systems are no different in Europe than they are in the US.

“This raises important questions because in the US the researchers were able to show discrimination based on race and there is no reason to think that this is not happening in Europe as well.

“Unfortunately, we were not able to test this hypothesis, but it shows that there are still many questions that are not answered.”

Kayser-Bril also sees no reason why the platforms would change the way they optimise ad delivery anytime soon.

“I strongly encourage any journalist to take a closer look at the advertising systems of Facebook, Google and others. They are immensely important for society.”

The data and code for AlgorithmWatch’s investigation are available on GitHub.

Vanessa is a student on the MA in Data Journalism at Birmingham City University.
