Integrating core accessibility into Android app development is relatively straightforward, and should be considered business as usual for every project.
While many people know iOS comes with built-in voice output support in the form of VoiceOver, fewer seem aware that Android is catching up. Since version 2.2 the Talkback screen reader has been available to download from the Android Market, and since version 4 it has been bundled with the operating system.
To get started with Talkback go to Settings > Accessibility > Talkback and select On. It’s worth doing the tutorial that can be found in the Talkback settings tab. You can also try alternatives to Talkback such as Spiel, Mobile Accessibility or Web Reader.
Once done you’re free to roam your apps; however, Talkback can’t access web content (either within hybrid apps or browsers) without the addition of two further add-ons:
- The Eyes-Free Keyboard – once downloaded from the Android Market and activated via Settings, the Eyes-Free Keyboard enables you to pull up a virtual keyboard or navigation panel over the bottom third of your screen that you can use to input text or navigate content.
- Web scripts – activated via Settings > Accessibility and selecting Enhance Web Accessibility, the web scripts allow Talkback to interact more smoothly with web content.
Now you should be all set.

Build
The easiest way to guarantee voice output support is to use Android's built-in user interface controls. These are natively accessible, as each already has a contentDescription assigned; this is where the label or alternative is housed. If you are using standard controls, all you need to do is customise the labels to reflect the meaning of your content.
If you are building your own controls you need to add the contentDescription yourself and assign an appropriate alternative. This applies to all user interface components, including ImageButton, CheckBox and ImageView. Anything without a contentDescription will be ignored by voice output, so do not assign one to decorative images.
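As a sketch, labelling might look like this in layout XML (the IDs, drawable names and label text are placeholders):

```xml
<!-- Labelled control: voice output reads the contentDescription
     as the control's label -->
<ImageButton
    android:id="@+id/add_favourite"
    android:layout_width="wrap_content"
    android:layout_height="wrap_content"
    android:src="@drawable/ic_favourite"
    android:contentDescription="Add to favourites" />

<!-- Decorative image: no contentDescription, so voice output skips it -->
<ImageView
    android:layout_width="wrap_content"
    android:layout_height="wrap_content"
    android:src="@drawable/divider" />
```

In a real project the label would normally live in a string resource (android:contentDescription="@string/add_favourite") so it can be localised.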
There is an exception for EditText components: these should have an android:hint instead of a contentDescription. This allows the field label to be read, as well as the contents of the field once it has been filled in.
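For example (a sketch; the field name and hint text are placeholders):

```xml
<!-- android:hint acts as the spoken label; once text is entered,
     voice output reads the contents of the field as well -->
<EditText
    android:id="@+id/email"
    android:layout_width="match_parent"
    android:layout_height="wrap_content"
    android:inputType="textEmailAddress"
    android:hint="Email address" />
```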
If you have controls that change state, such as Play/Pause or Add/Remove buttons, be sure to update the contentDescription in line with their changes. Voice output also announces a control's trait, identifying whether it is a button, link, form element and so on. It's important to think carefully about what a control does: users will generally expect a button to perform an action, such as adding an item to favourites, and a link to open a new screen. It can be quite disorientating to activate a button and be taken to a new screen.
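A Play/Pause toggle might keep its label in step like this (a sketch; playButton and isPlaying are hypothetical names):

```java
// Sketch: update the spoken label whenever the control's state changes
playButton.setOnClickListener(new View.OnClickListener() {
    @Override
    public void onClick(View v) {
        isPlaying = !isPlaying;
        // Voice output announces the new label the next time
        // the control receives focus
        playButton.setContentDescription(isPlaying ? "Pause" : "Play");
    }
});
```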
All elements must be focusable via touch, keyboard and directional controllers. This is done by setting android:focusable="true" in XML, or by using methods such as setFocusable() and requestFocus() in code. Focus order is determined by proximity to neighbouring content, moving from top left to bottom right. If this isn't logical you can manage focus order by assigning android:nextFocusUp, android:nextFocusDown, android:nextFocusLeft and android:nextFocusRight to components.
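A minimal sketch of explicit focus order (the IDs are placeholders):

```xml
<!-- From the search field, pressing down moves to the search button
     rather than whatever happens to sit nearest on screen -->
<EditText
    android:id="@+id/search_field"
    android:layout_width="match_parent"
    android:layout_height="wrap_content"
    android:focusable="true"
    android:nextFocusDown="@+id/search_button"
    android:hint="Search" />

<Button
    android:id="@+id/search_button"
    android:layout_width="wrap_content"
    android:layout_height="wrap_content"
    android:focusable="true"
    android:text="Search" />
```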
Occasionally it might be appropriate to hide content from voice output users or deliver alternative content. For example, you may want to hide instructions on how to use the app that are targeted exclusively at sighted users and replace them with instructions for voice output users. By checking isScreenReaderActive you can detect whether voice output is running and serve the appropriate content. A word of caution, however: use this technique judiciously, because 99 per cent of the time all content should be fully accessible to all users.
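One way to make this check in native code is to query AccessibilityManager (a sketch; context is a hypothetical variable, and the check reports that some accessibility service with touch exploration is running, not Talkback specifically):

```java
import android.content.Context;
import android.view.accessibility.AccessibilityManager;

// Sketch: likely true when a screen reader such as Talkback is active
AccessibilityManager am = (AccessibilityManager)
        context.getSystemService(Context.ACCESSIBILITY_SERVICE);
boolean voiceOutputLikely = am.isEnabled() && am.isTouchExplorationEnabled();
```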
Input via the keyboard is very fiddly for voice output users. As such, avoid free input fields as much as possible and replace them with radio buttons, checkboxes and toggle buttons, which are easier to use and leave less margin for input error.
Just as individual controls must be made accessible so too must custom views.
In addition, if you are building a hybrid app, make sure web content follows desktop accessibility best practice: provide alternatives for images, good semantics, a logical focus order and labels for form fields.

Test
A useful development tool for day-to-day testing is Lint. This automated tool checks for a subset of accessibility issues, such as missing contentDescription attributes and missing input types. You can also write custom scripts to test for other issues you might want to flag.
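From the command line, the SDK's lint tool can be scoped to individual checks; for example (the project path is a placeholder):

```
# Run only the ContentDescription check (missing contentDescription
# on image views) against a project
lint --check ContentDescription /path/to/project
```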
While useful for day-to-day development, Lint is not a thorough test for accessibility. To really know whether the app is accessible you need to test it with disabled users, or test content yourself using Talkback. By doing this you will be able to assess not just if objects and form inputs are labelled but also whether the labels adequately describe the context and make sense. In other words, you test not just the accessibility of the app but how usable it is for voice output users.
Ideally you should test without looking at the screen. Unlike iOS, which uses Screen Curtain to switch the screen off while VoiceOver is running, Android has no built-in functionality to do this. A workaround when testing web content is to set the screen brightness to zero in the browser accessibility settings. To test apps you can download Shads, which lowers your screen brightness to the bare minimum.
Once done you can then test voice output support by running the following tests:
- Are all objects labelled?
- Are decorative objects ignored?
- Are all form inputs labelled?
- Are changes of state announced?
- Are the correct input types used?
- Is content order logical?
If testing web content, it's worth testing not just in the native browser but also in Chrome and Firefox Nightly, both of which have excellent voice output support. Firefox Nightly also provides keyboard shortcuts for common actions, such as navigating between headings, form elements, links and so on.