Trends in mobile device use by people with disabilities

  • Author: Jonathan Avila
  • Date: 12 Mar 2014
The increasing use of screen readers on mobile devices means organisations must address the accessibility of their mobile content, writes Jonathan Avila, Chief Accessibility Officer at SSB BART Group.

As the use of mobile devices such as smartphones and tablets grows, the number and complexity of mobile websites and apps has increased. Mobile devices have also become more accessible to many users with disabilities. The most recent WebAIM Screen Reader User Survey, from January 2014, indicates that mobile device usage among people who are blind or visually impaired and who use screen readers continues to increase: 82 per cent of respondents reported using a screen reader on a mobile device. Close to half (44 per cent of respondents) indicated that they use mobile screen readers as much as or more than desktop or laptop screen readers. It is time for organisations to seriously address the accessibility of their mobile content.

Accessibility standards and mobile devices

While all mobile content should be accessible to the widest possible audience, including people with disabilities, accessibility standards have to date focused on particular areas of mobile phone use. Examples include hearing aid compatibility and TTY support for people who are deaf or hard of hearing, or, in the case of the Twenty-First Century Communications and Video Accessibility Act in the US, advanced communications services, video programming, and mobile browser access for people who are blind or visually impaired. These laws, as well as other directives and policies, help countries meet their obligations for accessible information and communications technology (ICT) under the United Nations Convention on the Rights of Persons with Disabilities.

To address the gap in mobile guidelines, the World Wide Web Consortium (W3C) has created a task force under the Web Content Accessibility Guidelines (WCAG) Working Group to study how well current WCAG techniques apply to mobile and to propose additional mobile techniques that will assist organisations in implementing WCAG on mobile devices. These efforts will not update WCAG itself or create additional guidelines; instead they will provide non-normative techniques and documented failures for accessibility-supported methods of meeting WCAG with web content.

Addressing mobile accessibility in an organisation

In making content more accessible, the first step is to understand what mobile resources an organisation has. Mobile accessibility is not unlike the accessibility of other web content: accessibility is best implemented from the requirements and design stages of a product, site, or app. Organisations should include mobile requirements such as standards and guidelines in their development lifecycle. In the absence of other specific mobile guidelines, organisations should use WCAG with some additional techniques discussed in this article, along with an evaluation process that verifies techniques are used in an accessibility-supported manner. The BBC Mobile Standards and Guidelines are also a good place to start when organisations consider adopting or drafting requirements.

Understanding the types of mobile content

There are two primary types of mobile content: mobile web content and native mobile apps. Mobile web content is designed to be used in a browser on a mobile device, and generally uses the same markup and similar JavaScript frameworks, such as jQuery, as desktop content. Native mobile apps are created for a specific platform using environments such as Google's Android Software Development Kit or Apple's Cocoa Touch framework. The distinction is blurred by the use of embedded web views in native mobile apps and by frameworks such as PhoneGap, which use web-based user interfaces within a native app shell that provides cross-device support and access to device resources.

Based on the WebAIM Screen Reader User Survey, it is clear that the iOS and Android platforms should be the area of concentration for mobile app testing. While screen readers exist to some extent on other platforms, the support may not be in the application programming interface (API). For example, Windows Phone 8 can be bundled with a screen reader created by a third party; however, that screen reader works only with bundled apps and certain screens, as there is currently no accessibility API for Windows Phone 8. BlackBerry has developed a screen reader and API under version 10.2 of its OS, which can be found on the Z30 model. These platforms do have other assistive technologies, such as large text, magnification, and high contrast, although many of these accessibility features work independently of the accessibility API.

Testing mobile content

One option is to test mobile web content for technical compliance using the standard desktop process for web content (with modification), and then to test the mobile web content for functional aspects, such as use case testing and accessibility-supported methods, using the mobile device, assistive technology, and users with disabilities. This approach leverages the tools available on the desktop for testing, since mobile testing tools are limited. While there are many non-accessibility-related solutions for testing mobile web content in the mobile environment, there are few solutions for testing the accessibility of mobile content in that environment. Most organisations already have a well-defined desktop process for testing web content and a solid set of tools for evaluating and inspecting accessibility.

In particular, mobile content may be generated differently depending on the device. This is typically done by two broad methods: 1) responsive web design, with techniques such as breakpoints and feature detection, and 2) user agent (browser) detection. There are some mobile device emulators, and there are also methods of obtaining the live document object model (DOM). These methods may be useful for manual code inspection or for visual inspection of colour and contrast, but they do not provide inspection tools similar to the Web Accessibility Toolbar or the AMP Toolbar for Firefox. For example, Adobe's Edge Inspect allows access to the mobile browser's DOM on iOS and Android and allows screenshots to be sent from the mobile device to the desktop for testing of contrast and colour. Additionally, on iOS, Safari's developer tools combined with Safari on the Mac can be used in a similar manner to access the live mobile DOM. This works well for mobile sites that are not embedded in apps such as PhoneGap, although the HTML content of PhoneGap apps can be tested using these methods prior to packaging.
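
When inspecting colours sampled from a mobile screenshot, the WCAG 2.0 contrast ratio can be computed directly from the RGB values. A minimal sketch in JavaScript (the function names are illustrative and not part of any tool mentioned above):

```javascript
// Convert an 8-bit sRGB channel (0-255) to its linearised value,
// as defined by WCAG 2.0's relative luminance formula.
function linearise(channel) {
  const c = channel / 255;
  return c <= 0.03928 ? c / 12.92 : Math.pow((c + 0.055) / 1.055, 2.4);
}

// Relative luminance of an [r, g, b] colour (WCAG 2.0 definition).
function relativeLuminance([r, g, b]) {
  return 0.2126 * linearise(r) + 0.7152 * linearise(g) + 0.0722 * linearise(b);
}

// Contrast ratio between two colours: (L1 + 0.05) / (L2 + 0.05),
// where L1 is the luminance of the lighter colour.
function contrastRatio(colour1, colour2) {
  const l1 = relativeLuminance(colour1);
  const l2 = relativeLuminance(colour2);
  const [lighter, darker] = l1 >= l2 ? [l1, l2] : [l2, l1];
  return (lighter + 0.05) / (darker + 0.05);
}
```

WCAG 2.0 success criterion 1.4.3 requires a ratio of at least 4.5:1 for normal text (3:1 for large text); black on white yields the maximum possible ratio of 21:1.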

Additionally, mobile web requirements should be tested at this point. One example is the need for users to be able to resize mobile content up to 200 per cent without the use of assistive technology. Many mobile websites block the user from pinch zooming by setting the viewport metadata to disable user scaling or to cap the maximum scale. Below is an example of viewport metadata that blocks pinch zoom:

<meta name="viewport" content="height=device-height, width=device-width, initial-scale=1.0, maximum-scale=1.0, user-scalable=no" />
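
To restore pinch zoom, the scaling restrictions should simply be removed from the same viewport metadata, leaving only the device dimensions and initial scale:

```html
<meta name="viewport" content="height=device-height, width=device-width, initial-scale=1.0" />
```

With maximum-scale and user-scalable=no omitted, mobile browsers allow users to zoom the page themselves, which supports the 200 per cent resize requirement.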

Also of note is that mobile browser support is not consistent. For example, on the Android platform the Firefox browser is the most accessible to users who are blind or visually impaired, and it is available at no cost. On iOS, the Safari browser is the most accessible to users who are blind or visually impaired, and it is provided with the OS. It is important to consider the technology stack, including browsers, when determining what is accessibility supported for mobile web content.

Native apps can generally be tested against the WCAG success criteria with the provisions outlined in the guidance document Applying WCAG to non-web ICT. There are, however, certain considerations for mobile app content. For example, some mobile app creation tools, such as AIR, do not implement accessibility, and other cross-platform tools have limitations in their accessibility support. Generally, the platform frameworks provided by Google and Apple are best for their respective environments. Some considerations for mobile apps include how mobile content is accessed by people with disabilities on mobile devices, along with other mobile-specific techniques.

Google v. Apple considerations

For example, the Android platform provides native external keyboard support, while the iOS platform natively supports an external keyboard only for input field navigation, text entry, and some specific functions. When the VoiceOver screen reader is running on iOS, however, the keyboard can be used to perform many other tasks. Additionally, the way screen readers work on touch devices requires that the screen reader take control of the touch interface and intercept touch events. This allows users who are blind or visually impaired to explore the screen by touch without performing an unintended interaction. This has at least two consequences: apps must correctly interpret instructions from the screen reader to perform interactions such as scrolling or switching pages, and apps that require direct touch techniques, such as signing a name, must allow direct touch access to that interface element while the assistive technology is running.

Similarly, touch gestures must be accessible to people with motor disabilities. For example, gestures that require two fingers may not be achievable by users with limited dexterity or by those who use a stylus; alternative gestures or mechanisms must be available. On iOS, a suite of accessibility features, such as AssistiveTouch and Switch Control, is available to users with motor disabilities. On the Android platform, however, alternative input access may only be achievable via an external keyboard or alternative gestures built into the app. While Android apps can build in switch control or dictation access to controls, no system-wide accessibility features of this kind currently exist for the platform as a whole.

Another difference between Android and iOS is that Android allows different on-screen keyboards to be used across all apps as well as custom in-app keyboards, while iOS currently permits only custom in-app keyboards. There are two aspects to consider here: test with the expected standard on-screen keyboard on Android, and make sure the on-screen keyboard is also accessible with the assistive technology, including touch access via the assistive technology. These on-screen keyboards often include dictation buttons, which also need to be accessible; dictation is particularly useful for people with disabilities, as typing on on-screen keyboards can be difficult. Additionally, an area that is not well addressed by current WCAG techniques is features that require speech; apps that use speech input must provide an alternative.

Different WCAG success criteria should be evaluated against the support of the platform as well as the accessibility-supported features of the assistive technology. In the case of iOS, the assistive technology is closely controlled and tied to the OS version, so the Apple-provided accessibility features and assistive technology can be used for the versions of iOS supported by the app. Android allows third-party assistive technology, although the primary screen reader is TalkBack; the open source Spiel screen reader is also available. This is complicated by the fact that TalkBack offers varying levels of support on different versions of Android, and TalkBack can be updated independently of the OS version, although certain features are not supported on older versions of the platform. Also, the Android OS can be customised, and custom user interfaces can be placed on top of it by manufacturers, though Google appears to be trying to rein in these actions. Similarly, some manufacturers provide their own accessibility features on Android, such as inverse colours on Samsung phones. Both platforms offer options for displaying captions (when available) in the system media player, so it is important to make sure these settings are enabled at the platform level before testing criteria that rely on them.

To verify that user interface elements expose the required name, role, state, and value properties for each OS control used in the app, a table of required properties by platform and control should be used to evaluate the controls. During testing, events sent from the app or system should also be monitored to ensure they are provided in a way that is perceivable by assistive technology. For example, when a layout change occurs, an event must be sent from the app or OS alerting the assistive technology that the screen has changed from portrait to landscape. This is an important notification because touch gestures work differently depending on the orientation, and a user who cannot see the orientation may perform the incorrect gesture. Because the mobile platforms lack accessibility tools equivalent to those on the Windows desktop, such as Microsoft's Inspect and Accessible Event Watcher, testing will likely require the use of platform assistive technologies such as the manufacturer-provided screen readers.
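
The table-driven check described above can be sketched as a small script. The property table here is a hypothetical example for illustration; a real table would be built per platform and control set:

```javascript
// Hypothetical table of accessibility properties each control type is
// expected to expose (illustrative only, not an official platform list).
const REQUIRED_PROPERTIES = {
  button:   ['name', 'role'],
  checkbox: ['name', 'role', 'state'],
  slider:   ['name', 'role', 'value'],
};

// Return the list of required properties a control fails to expose.
function missingProperties(control) {
  const required = REQUIRED_PROPERTIES[control.type] || [];
  return required.filter((prop) => control[prop] === undefined);
}
```

For example, a checkbox reported to the assistive technology with only a name and a role would be flagged as missing its state.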

Conclusion

In summary, organisations must build mobile accessibility design and requirements into their processes. Testing of mobile content can be broken down into web content and native content, but it ultimately requires accessibility-supported methods on the platforms where the content will be consumed. Mobile-specific techniques and failures should be documented, as should the expected properties for user interface controls. Assistive technology will likely be a larger factor in testing due to the dearth of testing tools for native apps. Finally, users with disabilities should be involved in testing to ensure functional use.

Jonathan Avila is Chief Accessibility Officer at SSB BART Group, a provider of IT and web accessibility consulting, tools, and solutions. Jonathan will be presenting on the mobile accessibility landscape at the International Technology & Persons With Disabilities Conference, March 17 to March 22, 2014, Manchester Grand Hyatt Hotel, San Diego, CA.