Microsoft is taking important steps to make Windows 10 more accessible for everyone, adding a built-in eye tracking function to the OS for users who can’t control a computer through more traditional input methods.
Microsoft CEO Satya Nadella announced the upcoming new feature, dubbed Eye Control, at the company’s One Week Hackathon. The native eye tracking support will allow users to launch programs, type, and scroll through documents — just about everything you can do on a PC — by just looking at the screen.
Eye Control is especially significant because of its potential to let people with diseases like amyotrophic lateral sclerosis, aka ALS, communicate with those around them through text-to-speech programs and other methods.
The idea for Eye Control stemmed from a previous Hackathon project challenge issued by Steve Gleason, a former NFL player turned accessibility advocate who has ALS. The degenerative neurological disorder, which was the cause behind the Ice Bucket Challenge that swept the internet in 2014, results in the progressive loss of motor function, with the eye muscles commonly being the last that patients can still control.
Eye Control was designed in conjunction with Tobii, whose Eye Tracker 4C hardware will be the first to support the feature, since users will need to add an eye tracker to their PC to take advantage of the functionality.
There’s no timeline for when Eye Control will roll out to all Windows 10 computers, but the feature is currently being tested in beta form, and Windows Insiders can try it out ahead of an official launch.