$75K grant for tech to aid people with disabilities

  • January 27, 2021

Dr Pradipta Biswas will use the grant from Facebook Reality Labs to develop AR/VR for people with disabilities.

A Gates Cambridge Scholar and his colleague have been awarded a prestigious grant from Facebook Reality Labs to investigate the barriers people with disabilities may face with regard to accessing augmented and virtual reality.

Dr Pradipta Biswas and Professor Yogesh Simmhan have been awarded a $75,000 grant from Facebook Reality Labs for their proposal on ‘privacy-respecting augmented reality [AR]/virtual reality [VR] to enable differently abled people in multi-cultural societies’.

The grant was the result of an international call for projects centred on responsible innovation in AR/VR; fewer than 10 researchers worldwide received awards. The proposed project will take forward Dr Biswas’ research on AR/VR-based assistive technology and human–robot interaction and Professor Simmhan’s research on video analytics.

They will investigate current issues which prevent the widespread adoption of AR/VR technology, in particular among people with a differing range of abilities in the rich cultural context of India. The project will develop a video see-through AR display with personalised content for education, communication and rehabilitation purposes for users with disabilities. In the diverse cultural, ethnic and economic context of India, the project will use qualitative and quantitative techniques to analyse end users’, carers’, parents’ and teachers’ acceptance criteria for AR products, including privacy and security issues.

Pradipta [2006] is Assistant Professor at the Indian Institute of Science’s Centre for Product Design and Manufacturing and did his PhD in Computer Science at the University of Cambridge. He received a Microsoft AI for Accessibility Grant in 2018 and is also organising a virtual workshop on Inclusive AR/VR systems from 7-9 May at the Association for Computing Machinery’s [ACM] Special Interest Group on Computer–Human Interaction Conference on Human Factors in Computing Systems. This is a premier conference in Computer Science, and the workshop’s participants will include eminent senior researchers from Microsoft, Google, Verizon, British Telecom and the universities of Cambridge, Barcelona and Maryland, Baltimore County. Participants will be invited to submit extended versions of accepted manuscripts to a special issue of ACM Transactions on Accessible Computing (TACCESS).
