A visual interface expresses information about the state of a web page, or application, through a visual language. For example, links are a different colour from the rest of the text, indicating that, if clicked, another page will load. A link's colour may also change when the cursor hovers over it, and the cursor itself may change from an arrow into a pointer. The problem is that users who cannot see a visual interface receive none of these visual cues. One way to overcome this problem is to express this basic language through speech: aural interaction.
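In CSS terms, these cues might be expressed something like this (a minimal sketch; the colours are illustrative, not taken from any particular site):

```css
/* Unvisited links stand out from the surrounding body text */
a:link {
  color: blue;
  text-decoration: underline;
}

/* On hover, the colour change and pointer cursor signal interactivity */
a:hover {
  color: darkred;
  cursor: pointer;
}
```

None of these declarations produce anything a screen reader can announce, which is exactly the gap aural interaction has to fill.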
A screen reader does precisely this. It attempts to communicate the content, structure, and state of a page by reading it to the user. Some also play audio cues, called earcons, to indicate what type of interface element is in focus.
I use the ChromeVox screen reader. It's a free extension for Chrome and is controlled via keyboard shortcuts. Other screen readers include VoiceOver, NVDA, and JAWS. If, like me, you decide to install ChromeVox, you'll probably want to switch it off at some point. To make ChromeVox inactive, on a PC keyboard, press Ctrl+Alt+A+A. To disable ChromeVox entirely:
- drop down the browser menu and select Tools > Extensions
- uncheck the Enabled box.
In the future, very small devices with limited screen space, such as a wrist watch or glasses, may prove better suited to aural interaction than visual interaction. Apple's operating system already employs aural interaction through Siri, and Google has something similar. So we may find that aural interaction has a bigger future as something that benefits all users.