Is filling data holes the key to climate justice?

August 17, 2023

Ramit Debnath is the primary author of a paper warning that holes in the data may bias climate change modelling


Bias in the collection of data on which Artificial Intelligence (AI) computer programmes depend can limit the usefulness of this rapidly growing tool for climate scientists predicting future scenarios and guiding global action, according to a new paper by researchers at the University of Cambridge published in Nature's npj Climate Action series.

AI computer programmes used for climate science are trained to trawl through complex datasets looking for patterns and insightful information. However, missing information about certain locations, time periods or societal dynamics creates “holes” in the data that can lead to unreliable climate predictions and misleading conclusions.
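The effect of such a data hole can be illustrated with a toy calculation (the figures below are hypothetical, not drawn from the paper): an estimate built only from well-monitored regions can differ markedly from one that includes sparsely monitored regions.

```python
import statistics

# Hypothetical warming observations (degrees C per decade) by region.
# These numbers are illustrative only, not real measurements.
observations = {
    "global_north": [0.20, 0.22, 0.19, 0.21],  # densely monitored
    "global_south": [0.35, 0.40, 0.38],        # sparsely monitored
}

# A "data hole": a dataset drawn only from well-sampled regions.
biased_sample = observations["global_north"]
full_sample = observations["global_north"] + observations["global_south"]

biased_estimate = statistics.mean(biased_sample)
full_estimate = statistics.mean(full_sample)

print(f"Estimate from data with a hole: {biased_estimate:.2f} C/decade")
print(f"Estimate from full data:        {full_estimate:.2f} C/decade")
```

With these made-up numbers, the dataset missing the Global South understates the average trend, which is the kind of distortion the paper warns can propagate into AI-driven conclusions.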

Primary author and Cambridge Zero Fellow Dr Ramit Debnath [2018], a Gates Cambridge Scholar, said that individuals with access to technology, such as scientists, teachers, professionals and businesses in the Global North are more likely to see their climate priorities and perceptions reflected in the digital information widely available for AI use.

By contrast, those without the same access to technology, such as indigenous communities in the Global South, are more likely to find their experiences, perceptions and priorities missing from those same digital sources.

“When the information on climate change is over-represented by the work of well-educated individuals at high-ranking institutions within the Global North, AI will only see climate change and climate solutions through their eyes,” Debnath said.

The researchers said “biased” AI has the potential to misrepresent climate information. For example, it could produce inaccurate weather predictions or underestimate carbon emissions from certain industries, which could then misguide governments trying to create policy and regulations aimed at mitigating or adapting to climate change.

AI-supported climate solutions which spring from biased data are in danger of harming under-represented communities, particularly those in the Global South with scant resources. These are often the same communities who also find themselves most vulnerable to the extreme weather events caused by climate change such as floods, fires, heatwaves and drought.

That is a combination which could lead to “societal tipping events”, the paper warns.

However, these “data holes” can be filled by human knowledge. The authors advocate a human-in-the-loop design that gives AI climate change programmes a sense check on which data is used and the context in which it is used, in an effort to improve the accuracy of predictions and the usefulness of any conclusions.

The potential role of ChatGPT

The authors mention the popular AI chatbot ChatGPT, which has recently taken the world by storm for its ability to communicate conversationally with human users. ChatGPT can ask its human users follow-up questions, admit mistakes, challenge incorrect premises and reject inappropriate requests.

This ‘human-in-the-loop’ style AI allows bias to be noticed and corrected, the authors said. Users can input critical social information, such as existing infrastructure and market systems, to allow the AI to better anticipate any unintended socio-political and economic consequences of climate action.
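One way such a human-in-the-loop check might look in practice is a coverage audit that flags under-represented regions in a training set for a human reviewer. This is a minimal sketch with hypothetical function names and thresholds, not code from the paper:

```python
# Hypothetical human-in-the-loop audit: report each region's share of
# a training set and flag regions that fall below a minimum share.
REGIONS = ["global_north", "global_south"]

def coverage_report(records, min_share=0.3):
    """Return per-region data share, flagging under-represented regions."""
    counts = {region: 0 for region in REGIONS}
    for record in records:
        counts[record["region"]] += 1
    total = len(records)
    report = {}
    for region, n in counts.items():
        share = n / total if total else 0.0
        report[region] = {"share": share, "flagged": share < min_share}
    return report

# Toy training set: four Global North records, one Global South record.
records = [{"region": "global_north"}] * 4 + [{"region": "global_south"}]
report = coverage_report(records)
for region, info in report.items():
    status = "NEEDS REVIEW" if info["flagged"] else "ok"
    print(f"{region}: {info['share']:.0%} of data ({status})")
```

The flag does not fix the bias by itself; it surfaces the gap so a human can decide how to respond, for example by gathering more data or re-weighting what exists, which is the spirit of the design the authors describe.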

“No data is clean or without prejudice, and this is particularly problematic for AI which relies entirely on digital information,” co-author, Cambridge Zero Director and climate scientist Professor Emily Shuckburgh said.

In highlighting the importance of globally inclusive datasets, the paper also promotes broadband internet access as a public necessity, rather than a private commodity, to engage as many users as possible in the design of AI for contemporary conversations about climate action.

The paper concludes that human-guided technology remains instrumental in the development of socially responsible AI.

Less-biased AI will be critical to our understanding of how the climate is changing, and consequently in guiding realistic solutions to mitigate and adapt to the on-going climate crisis, the authors said.

Professor Shuckburgh, who also leads UK Research and Innovation’s (UKRI) Centre for Doctoral Training on the Application of AI to the study of Environmental Risks (AI4ER), said that recognising the issue of data justice is the first step to better outcomes.

“Only with an active awareness of this data injustice can we begin to tackle it, and consequently, to build better and more trustworthy AI-led climate solutions,” she said.
