It’s not easy to use a PC if you have ALS or another neuromuscular disease that prevents you from using your hands. Eye tracking is an option, but it often requires specialized software and delivers an imperfect experience. Microsoft thinks it can do better. It’s adding built-in eye tracking to Windows 10, a feature called Eye Control, that will let anyone navigate using their gaze. You can launch apps, type and otherwise perform common tasks just by focusing your eyes on the right part of the screen.
Microsoft partnered with Tobii on Eye Control, so it won’t surprise you to hear that Tobii’s trackers have the broadest compatibility with the new feature. The upgrade is available in beta as part of a Windows Insider preview if you’re eager to try it right away, although there’s no firm timetable for when it’ll reach stable Windows releases.
The addition represents the next big step in making PCs truly accessible. Both Apple and Microsoft already offer accessibility features, but those are usually focused on vision and hearing issues. Eye Control opens the door to people who need an entirely different control scheme. Don’t be surprised if you see eye tracking interfaces (and eventually, other alternative input methods) come to other platforms and mobile devices.