Are computer algorithms sexist?

  • August 8, 2016

James Yang Zou co-authors paper which attempts to counter inherent sexism in computer algorithms.

By reducing the bias in today’s computer systems, which are increasingly reliant on word embeddings (or at least not amplifying the bias), debiased word embeddings can hopefully contribute in a small way to reducing gender bias in society.

James Yang Zou and colleagues

Are computer algorithms inherently sexist and, if so, what can be done about it? A research paper co-authored by a Gates Cambridge Scholar shows how the data sets behind web searches embed sexist assumptions, and investigates how to counter this.

The paper, Man is to Computer Programmer as Woman is to Homemaker? Debiasing Word Embeddings, is published on arXiv.org. Led by Tolga Bolukbasi from Boston University and co-authored by Gates Cambridge Scholar James Yang Zou [2007], who is currently an assistant professor at Stanford University, it investigates patterns in the way words on the internet appear next to each other, based on a powerful set of word embeddings called Word2vec, devised by Google researchers and trained on Google News articles. Word2vec maps each word to a point in a high-dimensional vector space, positioned so that words used in similar contexts sit close together and analogies correspond to simple vector arithmetic.

But the new study finds the vector space is blatantly sexist, with embedded she:he pairings including midwife:doctor, sewing:carpentry, registered_nurse:physician, whore:coward, hairdresser:barber, nude:shirtless, boobs:ass, giggling:grinning and nanny:chauffeur. This occurs because any bias in the Google News articles that make up the Word2vec corpus is captured in the geometry of the vector space. The researchers are concerned about the role of such embeddings in web searches: for instance, they could affect searches for potential candidates for jobs in professions deemed more "male", such as computer programming. They say this could have the effect of increasing bias, rather than simply reflecting it.
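The analogy pairings above fall out of simple vector arithmetic. A minimal sketch of the idea, using invented three-dimensional toy vectors rather than the paper's actual 300-dimensional Word2vec embeddings (the words and values below are made up purely for illustration):

```python
import numpy as np

# Toy "embeddings" invented for illustration only -- real Word2vec
# vectors have 300 dimensions and are learned from Google News text.
vectors = {
    "he":      np.array([1.0, 0.0, 0.5]),
    "she":     np.array([-1.0, 0.0, 0.5]),
    "doctor":  np.array([1.0, 1.0, 0.0]),
    "midwife": np.array([-1.0, 1.0, 0.0]),
}

def analogy(a, b, c):
    """Solve 'a is to b as c is to ?' via vector arithmetic: b - a + c."""
    target = vectors[b] - vectors[a] + vectors[c]
    best, best_sim = None, -np.inf
    for word, vec in vectors.items():
        if word in (a, b, c):   # exclude the query words themselves
            continue
        sim = vec @ target / (np.linalg.norm(vec) * np.linalg.norm(target))
        if sim > best_sim:
            best, best_sim = word, sim
    return best

print(analogy("he", "doctor", "she"))  # -> midwife (in this toy space)
```

In a biased embedding space, the gendered offset between "he" and "she" drags occupation words along with it, which is exactly what the nearest-neighbour search above surfaces.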

To counter this, the researchers use standard mathematical tools to manipulate the vector space. That involves crowdsourcing judgments through Amazon's Mechanical Turk on whether particular embedded pairings are appropriate or inappropriate.

After compiling a list of gender-biased pairs, the team subjected it to a process of "hard de-biasing", which removes the sexist bias from the vector space. The debiased pairings were then evaluated again via Mechanical Turk, and both direct and indirect bias were significantly reduced.

“One perspective on bias in word embeddings is that it merely reflects bias in society, and therefore one should attempt to debias society rather than word embeddings,” say the researchers. “However, by reducing the bias in today’s computer systems, which are increasingly reliant on word embeddings (or at least not amplifying the bias), in a small way debiased word embeddings can hopefully contribute to reducing gender bias in society… At the very least, machine learning should not be used to inadvertently amplify these biases.”

Picture credit: Wikipedia.

James Zou


  • Alumni
  • United States
  • 2007 CASM Applied Mathematics
  • Jesus College

I am participating in the Part III program in Applied Mathematics at Cambridge. I'm interested in the quantitative aspects of a wide range of topics: biology, sociology, and AI. I hope to explore the synthesis of these diverse topics at a fundamental level. I look forward to completing a Ph.D. after Part III.
