Opening technology up to all

  • April 16, 2015

Pradipta Biswas leads a research team looking to help people with physical impairments access technology.

Researchers led by a Gates Cambridge Scholar have devised a computer control interface which will help people with physical impairments, and others who cannot use a mouse or touchscreen, to perform complex computing tasks at speed.

The team of researchers at the Department of Engineering, led by Dr Pradipta Biswas, has developed a computer control interface that uses a combination of eye-gaze tracking and other inputs. The team’s research was recently published in a paper, ‘Multimodal Intelligent Eye-Gaze Tracking System,’ in the International Journal of Human-Computer Interaction.

The researchers provided two major enhancements to a standalone gaze-tracking system. First, sophisticated software interprets factors such as velocity, acceleration and bearing to provide a prediction of the user’s intended target. Next, a second mode of input is employed, such as a joystick.
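The idea of inferring the intended target from the gaze pointer's motion can be sketched in code. The following is a minimal illustration only, not the published system's model: it assumes hypothetical `samples` (timestamped gaze points) and `targets` (candidate on-screen locations), estimates velocity and bearing from the last two samples, and favours targets lying near the current position and along the direction of travel.

```python
import math

def predict_target(samples, targets):
    """Score candidate on-screen targets from recent gaze samples.

    Illustrative sketch: `samples` is a list of (t, x, y) gaze points,
    `targets` a list of (x, y) target centres. We estimate the pointer's
    velocity and bearing from the last two samples, then prefer targets
    that lie close by and along the current direction of travel.
    """
    (t0, x0, y0), (t1, x1, y1) = samples[-2], samples[-1]
    dt = max(t1 - t0, 1e-6)
    vx, vy = (x1 - x0) / dt, (y1 - y0) / dt
    speed = math.hypot(vx, vy)

    def score(target):
        tx, ty = target
        dist = math.hypot(tx - x1, ty - y1)
        if speed < 1e-6 or dist < 1e-6:
            return -dist  # gaze is stationary: nearest target wins
        # cosine between the movement bearing and the direction to the target
        align = ((tx - x1) * vx + (ty - y1) * vy) / (dist * speed)
        return align - 0.01 * dist  # trade alignment off against distance

    return max(targets, key=score)
```

A real predictor would smooth noisy gaze data and weigh acceleration as well, but even this toy scorer shows how bearing narrows down the likely target before the user's gaze arrives.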

“We hope that our eye-gaze tracking system can be used as an assistive technology for people with severe mobility impairment,” said Pradipta, a Senior Research Associate in the Department’s Engineering Design Group. “We are also exploring the potential applications in military aviation and automotive environments where operators’ hands are engaged with controlling an aircraft or vehicle.”

One challenge that arises when designing such a system is, once the target is identified, how does the user confirm the selection? On a typical personal computer, this is accomplished with a click of the mouse; with a phone or tablet, a tap on the screen.

Basic eye-gaze tracking systems often use a signal such as blinking to indicate this choice. However, blinking is often not ideal. In combat situations, for example, pilots' eyes might dry up, precluding their ability to blink at the right time.

Pradipta’s team experimented with several ways to solve the selection problem, including manipulating joystick axes, enlarging predicted targets, and using a spoken keyword such as ‘fire’ to indicate a target.
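The multimodal pattern described above, gaze for pointing and a second channel for confirmation, can be sketched as a small event dispatcher. The event shapes here (`'gaze'`, `'joystick_button'`, `'speech'`) are hypothetical names chosen for illustration; the paper's actual interface differs.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class GazeState:
    predicted_target: Optional[str] = None

def handle_event(state, event):
    """Route a second-modality confirmation onto the gaze-predicted target.

    Illustrative only: gaze events update the current candidate target,
    while a joystick button press or the spoken keyword 'fire' acts as
    the 'click' that selects it.
    """
    kind, value = event
    if kind == 'gaze':
        state.predicted_target = value        # pointing channel updates the candidate
        return None
    if kind == 'joystick_button' and value:   # confirmation channel fires the click
        return state.predicted_target
    if kind == 'speech' and value == 'fire':  # spoken keyword as alternative confirm
        return state.predicted_target
    return None
```

Separating pointing from confirmation in this way is what lets the same gaze front end work with whichever second modality a given user, or cockpit, can accommodate.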

Unsurprisingly, they found that a mouse remains the fastest and least cognitively stressful method of selecting a target – possibly helped by the fact that most computer users are already comfortable with it. But a multimodal approach combining eye-gaze tracking, predictive modelling and a joystick can almost match a mouse in terms of accuracy and cognitive load. Further, for computer novices given sufficient training in the system, the intelligent multimodal approach can even be faster.

The hope is that these findings will lead to systems that perform as well as, or better than, a mouse. “I am very excited by the prospects of this research,” Pradipta said. “When clicking a mouse isn’t possible for everyone, we need something else that’s just as good.”
