Microsoft’s Eye Control feature heralds a new era in computer accessibility

For a young disabled person growing up in the 1990s and early 2000s, just as the personal computer was launching its invasion of homes, schools and workplaces, technological developments offered a tantalising glimpse of future liberation from the restrictions imposed by my impairment. I have cerebral palsy, which means that my condition is unlikely to change very much, but, from early on, the running presumption was that technological progress would alleviate some of the resulting disadvantages.

During my teenage years, I could see how word processing, email, instant messaging, web browsing and any number of other software applications might revolutionise the way I worked and played. Yet, at the same time, I was frustrated that my lack of fine motor control meant that I was unable to use a mouse and keyboard.

The gap between what I imagined might one day be possible, once computers eventually became more accessible to someone with my impairments, and the limited access options actually available narrowed only gradually. I soon found that I could control a cursor reasonably well with a tracker ball or joystick placed under my chin. However, typing with a chin-operated device and an onscreen keyboard was slow and tiring.

Initially, I thought that the next breakthrough in my personal technological evolution would come from voice recognition software, although my trials of successive versions of Dragon NaturallySpeaking demonstrated that speech-based solutions would not be able to cope with my slurred speech for some time to come. That was before I encountered eye tracking technology, which allows users to interact with their machines simply by looking at the screen. I was soon convinced that eye tracking was the solution to years of toiling with my chin.

The eyes have it

Microsoft’s recent announcement that native support for eye tracking will be included in a future update of Windows 10 is a watershed moment in operating system accessibility. While most of the major operating systems (Windows, macOS, iOS and Android) offer a range of inbuilt support to help disabled people operate their devices, they rarely address the needs of the minority of users with the most profound physical disabilities.

The new Eye Control feature was inspired by a hackathon event held in 2014, which awarded its top prize to a project that enabled a participant with amyotrophic lateral sclerosis to drive his wheelchair with an eye tracking device. The Windows development team joined an internal eye tracking study group formed after the project and partnered with Swedish firm Tobii to develop the new function (a beta version of which can be downloaded from Tobii’s website).

Windows 10 Eye Control lets users launch applications by glancing at icons and menus, input text using a dedicated virtual keyboard and communicate via a text-to-speech function. Like existing eye tracking software packages that allow users to take control of the cursor, the mouse feature requires users to select a region of the screen, which is then magnified so that fine adjustments can be made. Eye Control also includes a shape-writing feature whereby users look at the first and last letters of a word and skim over the intervening characters – a method that can be faster than gazing at each individual character in sequence.
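
For readers curious about the mechanics, the coarse-to-fine mouse idea can be sketched in a few lines of code. This is purely illustrative and is not Microsoft’s or Tobii’s implementation: the screen size, zoom factor and function names below are my own assumptions.

```python
# Illustrative sketch only: a simplified two-stage "select, then magnify" gaze-to-click
# mapping of the kind described above. The resolution, zoom factor and names are
# assumptions for illustration, not the actual Eye Control implementation.

SCREEN_W, SCREEN_H = 1920, 1080
ZOOM = 4  # how much the selected region is magnified for the fine-adjustment pass


def coarse_region(gaze_x, gaze_y, zoom=ZOOM):
    """Return the rectangle (left, top, width, height) around the first gaze point.

    The region is 1/zoom of the screen in each dimension, clamped to the screen edges.
    """
    w, h = SCREEN_W / zoom, SCREEN_H / zoom
    left = min(max(gaze_x - w / 2, 0), SCREEN_W - w)
    top = min(max(gaze_y - h / 2, 0), SCREEN_H - h)
    return left, top, w, h


def fine_point(region, gaze_x, gaze_y):
    """Map a gaze point made on the magnified view back to true screen coordinates."""
    left, top, w, h = region
    # While magnified, the small region fills the whole screen, so scale the
    # on-screen gaze position down by the zoom factor and offset it into the region.
    return left + gaze_x * (w / SCREEN_W), top + gaze_y * (h / SCREEN_H)


if __name__ == "__main__":
    # First glance lands roughly on a toolbar button; a second glance at the centre
    # of the magnified view refines it to an exact pixel.
    region = coarse_region(1500, 200)
    click_x, click_y = fine_point(region, 960, 540)
    print(f"Click at ({click_x:.0f}, {click_y:.0f})")
```

The point of the two stages is that a gaze estimate is only accurate to within a centimetre or two, so the magnification step trades one extra glance for pixel-level precision.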

Eye tracking requires specialist cameras that can map out where the user is looking in relation to the screen. Tobii’s cameras do this by emitting infrared light to illuminate the user’s corneas. Image sensors and image processing software then identify the eyes and detect the positions of the iris and pupil. The direction of the user’s gaze is then pinpointed using 3D modelling.
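
A heavily simplified sketch of the underlying idea – the offset between the pupil centre and the infrared glint on the cornea, mapped to the screen through a short calibration – might look like the following. Real trackers, including Tobii’s, use full 3D eye models; the 2D reduction, the affine calibration and the sample numbers here are assumptions for illustration only.

```python
# Minimal 2D sketch of the pupil-centre / corneal-reflection principle described above.
# Not Tobii's algorithm: the affine mapping and the example measurements are assumptions.

import numpy as np


def gaze_vector(pupil_centre, glint_centre):
    """Offset between the pupil centre and the infrared glint, in camera pixels.

    This vector changes as the eye rotates, which is what makes it useful for
    estimating gaze direction.
    """
    return np.asarray(pupil_centre, float) - np.asarray(glint_centre, float)


def fit_calibration(gaze_vectors, screen_points):
    """Fit an affine map from gaze vectors to screen coordinates by least squares."""
    g = np.hstack([np.asarray(gaze_vectors, float), np.ones((len(gaze_vectors), 1))])
    s = np.asarray(screen_points, float)
    coeffs, *_ = np.linalg.lstsq(g, s, rcond=None)
    return coeffs  # 3x2 matrix mapping (vx, vy, 1) to (screen_x, screen_y)


def estimate_gaze(coeffs, pupil_centre, glint_centre):
    """Map one new measurement to an estimated on-screen point."""
    v = gaze_vector(pupil_centre, glint_centre)
    return np.append(v, 1.0) @ coeffs


if __name__ == "__main__":
    # During calibration the user looks at known targets (the screen corners and centre)
    # while the camera records the corresponding pupil-glint vectors.
    vectors = [(-8, -5), (8, -5), (-8, 5), (8, 5), (0, 0)]
    targets = [(0, 0), (1920, 0), (0, 1080), (1920, 1080), (960, 540)]
    coeffs = fit_calibration(vectors, targets)
    print(estimate_gaze(coeffs, pupil_centre=(104, 62), glint_centre=(100, 64)))
```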

Coming into focus

The history of modern eye tracking stretches back to studies of saccadic eye movements conducted by Russian psychologist Dr Alfred Yarbus in the 1950s. The technology is useful for measuring certain human behaviours like attention, interest and arousal and has been employed by researchers operating in disciplines such as education, medicine, engineering, computer science and marketing.

A combination of cheaper and more powerful hardware, new open source software packages and innovations in the way software can be programmed and trained has enticed consumer brands into the field. Eye tracking is set to become ubiquitous, with big-name players such as Apple, Google and Samsung joining Microsoft in incorporating it into everything from smartphones, laptops and tablets to video games and healthcare diagnostic equipment.

Microsoft’s Eye Control represents a milestone, making a future version of Windows accessible in principle to users with severe physical disabilities without requiring additional software. The need to purchase a camera is likely to remain a sticking point for many, with prices for most models reaching into the thousands of pounds, although they are continuing to tumble. The feature can currently be used only with Tobii’s Eye Tracker 4C, although additional options have been promised. However, these developments are likely to be a staging post on the way to a near future in which the hardware, along with the software, is built into devices of all shapes and sizes.

This article was originally published in the September 2017 edition of the dispATches newsletter about technologies that empower disabled and older people to be more independent – circulated by Designability.