Are computer algorithms sexist?

  • August 8, 2016

James Yang Zou co-authors a paper that attempts to counter inherent sexism in computer algorithms.

Are computer algorithms inherently sexist and, if so, what can be done about it? A research paper co-authored by a Gates Cambridge Scholar shows how the data sets behind web searches embed sexist assumptions and investigates how to counter this.

The paper, Man is to Computer Programmer as Woman is to Homemaker? Debiasing Word Embeddings, is published on arXiv.org. Led by Tolga Bolukbasi from Boston University and co-authored by Gates Cambridge Scholar James Yang Zou [2007], who is currently an assistant professor at Stanford University, it investigates patterns in the way words appear next to each other on the internet, using a powerful set of word embeddings known as Word2vec that Google researchers trained on Google News articles. Word embeddings map each word to a point in a vector space in which related words sit close together and relationships between words appear as directions, so that analogies such as "man is to king as woman is to queen" can be completed by simple vector arithmetic.

But the new study finds that this vector space is blatantly sexist, with she:he pairings that include midwife:doctor, sewing:carpentry, registered nurse:physician, whore:coward, hairdresser:barber, nude:shirtless, boobs:ass, giggling:grinning and nanny:chauffeur. This occurs because any bias in the Google News articles that make up the Word2vec corpus is captured in the geometry of the vector space. The researchers are concerned about the role of such embeddings in web searches: bias in the embeddings could, for instance, affect searches for potential candidates for jobs in professions deemed more "male", such as computer programming. They say this could have the effect of amplifying bias rather than simply reflecting it.
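The kind of analogy the paper's title refers to can be reproduced with the publicly released Google News vectors. The following is a minimal sketch, assuming the gensim library is installed and the pretrained binary file has been downloaded; the file path is an assumption about where that file lives locally.

```python
# Minimal sketch: completing word2vec analogies with gensim.
# Assumes the pretrained Google News vectors are available locally;
# the path below is an assumption, not part of the paper.
from gensim.models import KeyedVectors

vectors = KeyedVectors.load_word2vec_format(
    "GoogleNews-vectors-negative300.bin", binary=True
)

# Complete "man is to computer_programmer as woman is to ?" by vector
# arithmetic: computer_programmer - man + woman, then nearest neighbours.
result = vectors.most_similar(
    positive=["computer_programmer", "woman"],
    negative=["man"],
    topn=3,
)
print(result)  # the paper's title reports "homemaker" as the answer
```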

To counter this, they use standard mathematical tools to manipulate the vector space. That involves generating candidate pairings from the vector space and asking workers on Amazon's Mechanical Turk to judge whether each pairing is appropriate or reflects a gender stereotype.

After compiling a list of gender-biased pairs, the team subjected the vector space to a process of "hard de-biasing", which removes the sexist bias from the embedding. The resulting pairings were then evaluated on Mechanical Turk again, and both direct and indirect bias were significantly reduced.
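At its core, this kind of de-biasing identifies a gender direction in the vector space and, for gender-neutral words, removes the component that lies along it. The sketch below is a simplified illustration in numpy that uses only the she-he difference as the gender direction; the paper's full method derives the direction from several definitional pairs and adds a further "equalise" step for pairs such as grandmother/grandfather.

```python
import numpy as np

def gender_direction(vec_she: np.ndarray, vec_he: np.ndarray) -> np.ndarray:
    """Simplified gender direction: the normalised difference she - he."""
    d = vec_she - vec_he
    return d / np.linalg.norm(d)

def neutralize(word_vec: np.ndarray, g: np.ndarray) -> np.ndarray:
    """Project out the gender component so the word vector is orthogonal
    to the gender direction, then renormalise to unit length."""
    projection = np.dot(word_vec, g) * g
    debiased = word_vec - projection
    return debiased / np.linalg.norm(debiased)

# Hypothetical usage with vectors drawn from a word embedding model:
# g = gender_direction(vectors["she"], vectors["he"])
# vectors["doctor"] = neutralize(vectors["doctor"], g)
```

The design choice here is to alter only words that should be gender-neutral, such as occupations, while leaving genuinely gendered words like "she" and "he" intact, so the embedding keeps its useful structure.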

“One perspective on bias in word embeddings is that it merely reflects bias in society, and therefore one should attempt to debias society rather than word embeddings,” say the researchers. “However, by reducing the bias in today’s computer systems (or at least not amplifying the bias), which is increasingly reliant on word embeddings, in a small way debiased word embeddings can hopefully contribute to reducing gender bias in society…At the very least, machine learning should not be used to inadvertently amplify these biases.”

*Picture credit: Wikipedia.

James Zou

  • Alumni
  • United States
  • 2007 CASM Applied Mathematics
  • Jesus College

I am participating in the Part III program in Applied Mathematics at Cambridge. I'm interested in the quantitative aspects of a wide range of topics--biology, sociology, and AI. I hope to explore the synthesis of these diverse topics at a fundamental level. I look forward to completing a Ph.D. after Part III.
