Is filling data holes the key to climate justice?

August 17, 2023

Ramit Debnath is primary author of a paper warning that holes in the data may bias climate change modelling

When the information on climate change is over-represented by the work of well-educated individuals at high-ranking institutions within the Global North, AI will only see climate change and climate solutions through their eyes

Dr Ramit Debnath

Bias in the collection of data on which Artificial Intelligence (AI) computer programmes depend can limit the usefulness of this rapidly growing tool for climate scientists predicting future scenarios and guiding global action, according to a new paper by researchers at the University of Cambridge published in Nature’s npj Climate Action series.

AI computer programmes used for climate science are trained to trawl through complex datasets looking for patterns and insightful information. However, missing information from certain locations on the planet, time periods or societal dynamics creates “holes” in the data that can lead to unreliable climate predictions and misleading conclusions.
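As a purely illustrative sketch (not from the paper), the short Python example below shows how a trend estimated only from a well-observed region can diverge from the trend across all regions; the region names, warming rates and noise levels are all invented for the demonstration.

```python
# Hypothetical illustration of a "data hole": a trend fitted only to the
# data-rich region understates the warming seen when all regions are included.
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1990, 2021)

# Assumed synthetic warming rates (degrees C per decade), for illustration only.
north_trend, south_trend = 0.20, 0.35
north = 14.0 + north_trend * (years - 1990) / 10 + rng.normal(0, 0.1, years.size)
south = 25.0 + south_trend * (years - 1990) / 10 + rng.normal(0, 0.1, years.size)

def decadal_trend(series):
    """Least-squares slope of the series, expressed per decade."""
    return np.polyfit(years, series, 1)[0] * 10

# Training only on the well-observed region misses part of the warming signal.
print(f"Trend from data-rich region only: {decadal_trend(north):.2f} C/decade")
print(f"Trend averaged over both regions: {decadal_trend((north + south) / 2):.2f} C/decade")
```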

Primary author and Cambridge Zero Fellow Dr Ramit Debnath [2018], a Gates Cambridge Scholar, said that individuals with access to technology, such as scientists, teachers, professionals and businesses in the Global North are more likely to see their climate priorities and perceptions reflected in the digital information widely available for AI use.

By contrast, those without the same access to technology, such as indigenous communities in the Global South, are more likely to find their experiences, perceptions and priorities missing from those same digital sources.

“When the information on climate change is over-represented by the work of well-educated individuals at high-ranking institutions within the Global North, AI will only see climate change and climate solutions through their eyes,” Debnath said.

The researchers said “biased” AI has the potential to misrepresent climate information. For example, it could generate ineffective weather predictions or underestimate carbon emissions from certain industries, which could then misguide governments trying to create policy and regulations aimed at mitigating or adapting to climate change.

AI-supported climate solutions which spring from biased data are in danger of harming under-represented communities, particularly those in the Global South with scant resources. These are often the same communities who also find themselves most vulnerable to the extreme weather events caused by climate change such as floods, fires, heatwaves and drought.

That is a combination which could lead to “societal tipping events”, the paper warns.

However, these “data holes” can be filled by human knowledge. The authors advocate for a human-in-the-loop design that provides AI climate change programmes with a sense check on which data is used and the context in which it is used, in an effort to improve the accuracy of predictions and the usefulness of any conclusions.

The potential role of ChatGPT

The authors mention the popular AI chatbot ChatGPT, which has recently taken the world by storm for its ability to communicate conversationally with human users. Through ChatGPT, the AI can ask its human users follow-up questions, admit mistakes, challenge incorrect premises and reject inappropriate requests.

This ‘human-in-the-loop’ style of AI allows bias to be noticed and corrected, the authors said. Users can input critical social information, such as existing infrastructure and market systems, to allow the AI to better anticipate any unintended socio-political and economic consequences of climate action.
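As a rough sketch of what such a human-in-the-loop check might look like in code (a hypothetical illustration, not the paper’s or ChatGPT’s implementation), the snippet below has a reviewer flag missing regional coverage and attach social context before a model’s output is accepted; the `ModelOutput`, `REQUIRED_REGIONS` and `human_review` names are invented for the example.

```python
# Hypothetical human-in-the-loop gate: a reviewer records context and the
# output is only approved if the training data covered the required regions.
from dataclasses import dataclass, field

@dataclass
class ModelOutput:
    prediction: str
    regions_in_training_data: set
    reviewer_notes: list = field(default_factory=list)

REQUIRED_REGIONS = {"Global North", "Global South"}  # assumed coverage requirement

def human_review(output: ModelOutput, reviewer_comment: str | None = None) -> bool:
    """Return True only if data coverage is adequate; record the human's context."""
    if reviewer_comment:
        output.reviewer_notes.append(reviewer_comment)
    missing = REQUIRED_REGIONS - output.regions_in_training_data
    if missing:
        output.reviewer_notes.append(f"Data holes flagged: {', '.join(sorted(missing))}")
        return False
    return True

result = ModelOutput("Emissions policy X is sufficient", {"Global North"})
approved = human_review(result, "No local market or infrastructure data included.")
print(approved, result.reviewer_notes)
```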

“No data is clean or without prejudice, and this is particularly problematic for AI which relies entirely on digital information,” co-author, Cambridge Zero Director and climate scientist Professor Emily Shuckburgh said.

In highlighting the importance of globally inclusive datasets, the paper also promotes broadband internet access as a public necessity, rather than a private commodity, to engage as many users as possible in the design of AI for contemporary conversations about climate action.

The paper concludes that human-guided technology remains instrumental in the development of socially responsible AI.

Less-biased AI will be critical to our understanding of how the climate is changing, and consequently in guiding realistic solutions to mitigate and adapt to the on-going climate crisis, the authors said.

Professor Shuckburgh, who also leads the UK national research funding body UKRI’s Centre for Doctoral Training on the Application of AI to the study of Environmental Risks (AI4ER), said that recognising the issue of data justice is the first step to better outcomes.

“Only with an active awareness of this data injustice can we begin to tackle it, and consequently, to build better and more trustworthy AI-led climate solutions,” she said.
