Twitter investigates racial bias in photos

Social platform under fire after users discover algorithm favours images of white people.

[Image: a diverse group of people hugging]

Wednesday, 23 September 2020

Racist code

Twitter is in hot water again over a glaring flaw in its image-cropping algorithm, which turns out to favour white faces and crop out people with darker skin tones when a picture is too large to display in full.

Users discovered the error when uploading images of people that were ‘too big’ for Twitter’s standard preview sizes. When an image contained multiple people, the site consistently cropped out those with darker skin tones.
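
Twitter has said publicly that its preview crop is chosen by a ‘saliency’ model, which scores every part of a picture by how likely it is to draw the eye and then centres the crop on the highest-scoring region. Purely as an illustration (this is not Twitter’s code, and the function and variable names are invented), a minimal version of that idea looks something like this in Python:

    import numpy as np

    def saliency_crop(image, saliency, crop_h, crop_w):
        """Crop `image` to (crop_h, crop_w), centred on the most salient pixel.

        `saliency` is a per-pixel score map with the same height and width
        as `image`; higher scores mean "more likely to draw the eye".
        """
        # Find the coordinates of the single highest-scoring pixel.
        y, x = np.unravel_index(np.argmax(saliency), saliency.shape)

        # Centre the crop window on that pixel, clamped to the image bounds.
        h, w = image.shape[:2]
        top = min(max(y - crop_h // 2, 0), h - crop_h)
        left = min(max(x - crop_w // 2, 0), w - crop_w)
        return image[top:top + crop_h, left:left + crop_w]

The cropping step itself is mechanical; any bias comes from how the saliency scores are produced. If the underlying model routinely scores lighter faces higher, the crop window follows them and darker faces end up outside the frame.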

Some users went further in testing the fault and found that even chocolate Labradors were cropped out of images!

Work to do

In a statement, Twitter claimed it had tested for racial and gender bias during development of the algorithm. However, it conceded: “It’s clear that we’ve got more analysis to do.”

Parag Agrawal, Twitter’s chief technology officer, tweeted: “We did analysis on our model when we shipped it – but it needs continuous improvement.”

But in the face of vague statements like these, users will surely still have questions. It’s unclear exactly when the algorithm was introduced, but it’s certainly no relic of the 1950s. So how on earth could this have happened in 2020?

‘Many questions’

One Twitter researcher said:

There are many questions that will need time to dig into […] More details will be shared after internal teams have had a chance to look at it.

Twitter reported that tests on the algorithm in 2017 had used faces from a range of ethnic backgrounds, and that the study found “no significant bias between ethnicities (or genders)”. Twitter says it will now review that study.

Zooming in on bias

The issue initially came to light because of a fault with Zoom. Staff at a Canadian university were looking into reports of Zoom removing Black staff members’ heads on video calls when Colin Madland, a university manager in Vancouver, realised what was happening.

Zoom has a feature that replaces your background so colleagues or students can’t see your home. It transpired that the video-chat software was mistaking dark skin tones for part of the background, so users with darker skin were appearing headless in calls!
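
Zoom has not said exactly how its background segmentation works, so the toy Python sketch below is only an illustration of the failure mode described above, not Zoom’s method (the function name and the brightness cutoff are invented): if whatever rule separates ‘person’ from ‘background’ correlates with pixel brightness, dark skin can land on the wrong side of the cut and be replaced along with the wall behind it.

    import numpy as np

    def naive_person_mask(frame, brightness_cutoff=60):
        """Toy 'virtual background' mask: keep pixels brighter than a cutoff
        and treat everything else as background to be replaced.

        Real products use learned person-segmentation models; this crude rule
        just shows how a brightness-correlated decision can erase dark skin
        tones along with the genuine background.
        """
        brightness = frame.mean(axis=2)        # average of R, G and B per pixel
        return brightness > brightness_cutoff  # True = keep, False = replace

    def apply_virtual_background(frame, background):
        # Keep 'person' pixels from the camera frame and take the rest
        # from the replacement background image.
        mask = naive_person_mask(frame)
        return np.where(mask[..., None], frame, background)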

However, when Madland began posting about the issue on Twitter, he found that his colleague’s face was consistently cropped out of the image previews.

Madland and other users began changing image layouts and posting new pictures to test the theory. Even in images featuring Barack Obama, arguably one of the most famous people in history, the algorithm cropped him out.

Profiling

Zoom and Twitter aren’t the only platforms with this issue. Over the past few years there have been repeated reports of facial recognition software struggling to correctly identify Black and Asian faces.

In 2019, UK police officers reported that algorithms were “amplifying” prejudices and called for more scrutiny of the technology’s effectiveness.

Taking things personally

When faced with criticism over the issue, some Twitter staff took things personally.

Twitter’s chief design officer, Dantley Davis, tweeted: “I know you think it’s fun to dunk on me – but I’m as irritated about this as everyone else. However, I’m in a position to fix it and I will,” going on to add: “It’s 100% our fault. No-one should say otherwise.”

Let’s hope that this ‘irritation’ translates to action.

By Natalie Dunning

Natalie Dunning is a freelance writer and Media Psychology researcher based in Manchester.