The following blog post was written by Larry Weiss – Accessible Technology Strategist at Microsoft. He has worked on making platforms accessible for more than 26 years.
In a previous post, Alex Li challenged the assertion that a keyboard must control all of a computer’s software.
So, if a keyboard is not required, what alternatives could meet the needs of users with disabilities?
Consider Microsoft’s Surface running Windows 8, which ships with built-in assistive technology and a touchscreen, and can be configured without a keyboard. Could a user with no vision or limited hand mobility perform keyboard functions on a Surface?
A person with no vision could start Narrator by pressing Start + Volume Up, explore screen content by hearing what is displayed under their fingers, and activate the control they last heard by double-tapping anywhere on the screen. They could also navigate sequentially with a quick swipe left or right. This covers moving user focus and manipulating controls. Surface also has a simplified text-input control that a user could navigate like any other control. Yes, a keyboard would be more efficient, which is why one is available as an option for the Surface, but for light-duty text entry the simplified input would work.
With a Surface, a person with limited hand mobility has several options. Let’s set aside the On-Screen Keyboard, since it is a simulated keyboard. For someone with a minor mobility impairment, the touchscreen’s user interface (UI) may be sufficient because it has relatively large touch targets. Users with more severe impairments may find that Narrator’s gross-motor gestures work better. And users with still more severe impairments can use Surface’s built-in speech recognition, which provides not only command and control but also voice entry of text.
A keyboard provides buttons in a standard layout with a standard connection. With all of the other options available to users with disabilities, a keyboard may no longer be required.
And again, if the apps you're exploring with touch don't work with keyboard access, they probably won't work when you try to explore content by hearing what is displayed under your fingers either, most likely because the developer chose non-standard controls that don't support the accessibility APIs that tools like Narrator need.
But if you enforce the "must work with a keyboard!" rule, then the developer will probably select controls that support UI Automation (UIA), whether they know what that means or not, and you are more likely to get an accessible user interface.
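The point about standard versus custom controls can be sketched with a web analogy (ARIA playing the role that UIA plays on Windows). The check below is a hypothetical illustration written for this post, not a real accessibility API: a native control exposes its role, name, and keyboard focusability to assistive technology for free, while a custom-drawn control exposes nothing unless the developer adds those semantics by hand.

```typescript
// Hypothetical sketch: does this markup expose the affordances that both
// keyboards and screen readers rely on? (Names and heuristics here are
// assumptions for illustration, not any real UIA or ARIA API.)
function isProgrammaticallyAccessible(markup: string): boolean {
  // A standard control (native <button>) gets role, name, and focus for free.
  const isNativeButton = /<button\b/i.test(markup);
  // A custom control must declare its role and make itself focusable.
  const hasRole = /role="button"/i.test(markup);
  const isFocusable = /tabindex="0"/i.test(markup);
  return isNativeButton || (hasRole && isFocusable);
}

// Standard control: accessible without the developer doing anything extra.
console.log(isProgrammaticallyAccessible('<button>Save</button>')); // → true
// Custom-drawn control with only a click handler: invisible to the
// accessibility API, and not keyboard-operable either.
console.log(isProgrammaticallyAccessible('<div onclick="save()">Save</div>')); // → false
```

The correlation the post describes falls out of the same mechanism: making the custom `<div>` keyboard-operable forces the developer to add the very semantics (role, focusability) that the screen reader needs.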
In real-world applications, mandating keyboard access contributes directly to an accessible user interface.