Tips for testing iOS app accessibility using VoiceOver

I really enjoy testing iOS app accessibility using VoiceOver, but for newcomers it can be a little tricky to get started. Here are some testing tips:

Use VoiceOver on a real iOS device to test accessibility

You can’t use VoiceOver on the iOS simulator, so you’ll need to test your app on a real iOS device. That’s actually a good thing: VoiceOver relies heavily on input gestures, which are best mastered on a physical device. If you are creating an iPhone application and don’t have access to an iPhone, a cheaper iPod touch works fine for VoiceOver accessibility testing.

Use triple click home button to enable/disable VoiceOver

The first time you use VoiceOver can be quite confusing, as it fundamentally changes the way the operating system behaves. You can set an Accessibility Shortcut in the Accessibility menu of iOS so that triple-clicking the home button toggles VoiceOver on/off. This is handy when you get stuck, because you don’t have to navigate the menus with VoiceOver on just to turn it off.

[Image: Triple-click Home VoiceOver setting in iOS 7]

Master the gestures

VoiceOver uses a completely different set of gestures from standard iOS, so you’ll want to practice them. The most common gesture is a swipe left/right to select different elements, which then require a double tap to activate (the equivalent of a standard tap). A two-finger swipe up reads all elements from the top of the screen, and a two-finger swipe down reads from the current position onwards. There’s a useful guide to VoiceOver gestures available on the iOS developer site.

Use Screen Curtain

If you want to test whether your app is truly accessible you can simply close your eyes, but if you are like me and might peek, use Screen Curtain (a three-finger triple tap), which blanks the screen entirely while leaving VoiceOver running, so you can use your app without any visual display. Neat.

Summary

The best way to learn iOS accessibility testing is to dive in and get started. If you’re like me, you’ll pick it up quickly and find it fun to do.

Tips for making your iOS app accessible

I’ve been doing some work recently on native iOS app accessibility and have started to see the same accessibility issues appear repeatedly, most of which can be resolved with a common approach.

It’s important to note that accessibility is enabled by default in iOS development, so if you don’t do anything crazy your app should be mostly accessible out of the box. It is still worth checking, though, and keeping an eye out for common mistakes.
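As a quick illustration, here’s a minimal Swift sketch (the control names and image assets are hypothetical) of the most common kind of fix: standard text-based controls work with VoiceOver out of the box, but an image-only control needs an explicit label.

```swift
import UIKit

// Hypothetical example: an icon-only button has no text for VoiceOver to read,
// so give it an accessibility label yourself.
let settingsButton = UIButton(type: .custom)
settingsButton.setImage(UIImage(named: "gear-icon"), for: .normal)
settingsButton.accessibilityLabel = "Settings"   // spoken by VoiceOver instead of nothing useful

// Purely decorative images can be explicitly hidden from VoiceOver so they add no noise.
let backgroundImageView = UIImageView(image: UIImage(named: "header-texture"))
backgroundImageView.isAccessibilityElement = false
```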

I do all my testing on a device with VoiceOver enabled, but I won’t detail that here; instead, I’ll write a separate blog post with some tips on VoiceOver testing.

Here are some tips on making your iOS app accessible:

Enable form field tabbing

A VoiceOver user moves between elements using gestures, so on a form you should make it easy to move between fields by giving each field a ‘return’ action that moves focus to the next field, or submits the form on the last field. Apple goes above and beyond this in its own native apps by putting accessible “Previous”, “Next”, and “Done” elements above the keyboard on forms; these are recognized by VoiceOver and make it super easy to navigate and submit a form.

[Image: Apple form accessibility example]
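Here’s a rough Swift sketch of the ‘return moves to the next field’ approach using UITextFieldDelegate; the view controller, outlets, and submitForm() helper are hypothetical names used for illustration.

```swift
import UIKit

class SignUpViewController: UIViewController, UITextFieldDelegate {

    // Hypothetical form fields wired up in a storyboard.
    @IBOutlet var nameField: UITextField!
    @IBOutlet var emailField: UITextField!
    @IBOutlet var passwordField: UITextField!

    override func viewDidLoad() {
        super.viewDidLoad()
        [nameField, emailField, passwordField].forEach { $0?.delegate = self }

        // The return key label is announced by VoiceOver, so make it meaningful.
        nameField.returnKeyType = .next
        emailField.returnKeyType = .next
        passwordField.returnKeyType = .done
    }

    // Return moves focus to the next field, or submits the form from the last one.
    func textFieldShouldReturn(_ textField: UITextField) -> Bool {
        if textField == nameField {
            emailField.becomeFirstResponder()
        } else if textField == emailField {
            passwordField.becomeFirstResponder()
        } else {
            textField.resignFirstResponder()
            submitForm()   // hypothetical submit handler
        }
        return true
    }

    private func submitForm() {
        // Validate and send the form here.
    }
}
```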

Make embedded UIWebViews accessible

If you embed any HTML content in your native app, you should mark the UIWebView as an accessible element but, very importantly, not mark its parent view as accessible, otherwise the UIWebView won’t be reachable via VoiceOver. Once you’ve done this, VoiceOver reads the embedded content the same way it reads a web page in Safari.
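A sketch of that pattern in Swift (UIWebView is the pre-WKWebView API referred to here; the view hierarchy and URL are hypothetical):

```swift
import UIKit

class HelpViewController: UIViewController {

    // A plain container view holding the embedded web view (hypothetical layout).
    @IBOutlet var webContainerView: UIView!
    @IBOutlet var webView: UIWebView!

    override func viewDidLoad() {
        super.viewDidLoad()

        // Mark the web view itself as accessible so VoiceOver reads its HTML content.
        webView.isAccessibilityElement = true

        // Crucially, do NOT mark the containing view as an accessibility element,
        // otherwise it swallows the web view and VoiceOver can't reach the content.
        webContainerView.isAccessibilityElement = false

        if let url = URL(string: "https://example.com/help") {
            webView.loadRequest(URLRequest(url: url))
        }
    }
}
```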

Make UIPageControl page indicators accessible

When you have multiple horizontal pages, the page indicators are the dots of a UIPageControl that appear at the bottom of the screen and show which page you are on as you swipe left/right through the pages. VoiceOver reserves the left/right swipe gestures for its own navigation, so a VoiceOver user won’t be able to switch between your pages unless they can use the page indicators.

[Image: page indicators in iOS]

Page indicators aren’t automatically accessible; you must do two things. First, set the accessibility trait to UIAccessibilityTraitAdjustable, then implement accessibilityIncrement and accessibilityDecrement methods that change the page. A VoiceOver user can then focus on the element and swipe up/down with one finger, as they would to adjust a slider, to navigate between pages.
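A rough Swift sketch of that pattern (the Objective-C constant UIAccessibilityTraitAdjustable is the .adjustable trait in Swift; the class name below is mine):

```swift
import UIKit

final class AccessiblePageControl: UIPageControl {

    // Expose the adjustable trait so VoiceOver treats the control like a slider.
    override var accessibilityTraits: UIAccessibilityTraits {
        get { return super.accessibilityTraits.union(.adjustable) }
        set { super.accessibilityTraits = newValue }
    }

    // VoiceOver calls this on a one-finger swipe up while the control is focused.
    override func accessibilityIncrement() {
        guard currentPage < numberOfPages - 1 else { return }
        currentPage += 1
        sendActions(for: .valueChanged)
    }

    // VoiceOver calls this on a one-finger swipe down while the control is focused.
    override func accessibilityDecrement() {
        guard currentPage > 0 else { return }
        currentPage -= 1
        sendActions(for: .valueChanged)
    }
}
```

Firing .valueChanged is just one way to let whatever owns the page control scroll the matching page into view.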

Ensure appropriate color contrast

This one isn’t specific to VoiceOver; it’s about ensuring your application is accessible to users who are visually impaired or color blind by providing sufficient color contrast. An example I have seen recently is a black ‘copy/paste’ popup appearing over a black form.

[Image: black popup on a black form]
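VoiceOver won’t catch contrast problems for you, but a quick check against the WCAG 2.0 contrast formula can be handy during development. Here’s a small Swift sketch (the helper names are mine, not a UIKit API); WCAG AA expects a ratio of at least 4.5:1 for normal-size text and 3:1 for large text.

```swift
import UIKit

// Relative luminance of a colour as defined by WCAG 2.0.
func relativeLuminance(of color: UIColor) -> Double {
    var r: CGFloat = 0, g: CGFloat = 0, b: CGFloat = 0, a: CGFloat = 0
    // Fall back to 0 if the colour isn't in an RGB-compatible colour space.
    guard color.getRed(&r, green: &g, blue: &b, alpha: &a) else { return 0 }

    // Linearise each sRGB channel.
    func linearise(_ channel: CGFloat) -> Double {
        let c = Double(channel)
        return c <= 0.03928 ? c / 12.92 : pow((c + 0.055) / 1.055, 2.4)
    }
    return 0.2126 * linearise(r) + 0.7152 * linearise(g) + 0.0722 * linearise(b)
}

// Contrast ratio between two colours, from 1:1 up to 21:1.
func contrastRatio(between first: UIColor, and second: UIColor) -> Double {
    let l1 = relativeLuminance(of: first)
    let l2 = relativeLuminance(of: second)
    return (max(l1, l2) + 0.05) / (min(l1, l2) + 0.05)
}

// Example: a very dark popup over a black background fails comfortably.
let popupBackground = UIColor(red: 0.1, green: 0.1, blue: 0.1, alpha: 1.0)
let formBackground = UIColor(red: 0.0, green: 0.0, blue: 0.0, alpha: 1.0)
let ratio = contrastRatio(between: popupBackground, and: formBackground)
print(ratio >= 4.5 ? "Contrast OK" : String(format: "Contrast too low: %.2f:1", ratio))
```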

Summary

Making an iOS app accessible isn’t difficult, because Apple has done a lot of work to ensure accessibility is built into the development platform. The key is to build these features in as you go and to continually test on a device with VoiceOver enabled.

More Information

Apple publishes some great information about iOS accessibility in its developer documentation.

iPhone app of the week: Yahoo! Weather

Today’s featured iPhone app is one of the best looking apps I have seen in a long time: Yahoo! Weather. It makes the default Weather app look drab (at least until iOS 7).

The background images regularly change to match the city shown, time of day and weather.

My favorite part is how it balances minimalism with detail: slide up to show more info. The animated wind turbines that spin faster as the wind picks up, and the sun projection that shows the sun’s progress, are particularly pleasant. Well done.

Link


Touch Driven Computer Human Interfaces

Our third child, Winston, was born last week. He came bearing gifts for our other two sons in the form of a second-hand MacBook Pro I purchased for a couple of hundred dollars through an auction at work (our three year old son is still trying to work out how the MacBook fit in Mama’s tummy!).

I bought an external Apple mouse (I thought it would be easier than the trackpad) and have started to teach my older children (aged 1 & 3) how to use the mouse, but it’s much harder than I thought. They are both very proficient at using iOS (they both own iPod touches and use my wife’s iPad mini rather frequently), but struggle to use the mouse. They also try to touch the MacBook screen thinking it’ll do something even though it doesn’t.

Whilst I want them to learn the mouse (and trackpad eventually), I have started to ask myself whether we should even bother, since the future is clearly touch.

Different Computer Human Interfaces

The computer human interface on my first computer was a mouse driven graphical interface known as Windows 95 (yes I got a computer late, and yes I am that young).

To this day, I am often the odd one out in software development as I prefer a graphical interface to a command line (textual) interface. That’s probably because most people who work in software development today had their first experience with a textual interface (before GUIs were invented).

My two sons’ first experience is with a touch-driven graphical interface, which is well and truly the most intuitive to them. By the time they enter the workforce I can’t imagine them using anything else, just as the large majority of workers today, bar people in software development, don’t use textual interfaces. I’ve witnessed first-hand the usability issues that younger Generation Y staff have using ‘green screens’ on mainframes, as they much prefer a click-driven, drag-and-drop user interface.

The other day I had to visit an Australian Government shopfront, and the first thing that happened when I walked in the door was that I was greeted by a staff member with an iPad who brought up my details and guided me to a lounge area, where I waited for another staff member who welcomed me by name. That wasn’t an Apple Store I was visiting; it was a government shopfront. The future of business technology is well and truly past using mice.

And it’s not just business where touch makes sense. My sixty-something mother had never used a computer in her life; we did try teaching her a few years ago, but it was just too hard (you double-click a file, but single-click a web link?!?). I took a risk early last year and bought her an iPad for her birthday. The moment she picked it up she could use it immediately without help, and since she’s retired she now doesn’t go a day without using it to email friends, Skype the grandchildren and play ‘Words with Friends’ with my brothers. She has since upgraded her old Nokia to an iPhone, which she was again able to use immediately without training.

Children born now are growing up with intuitive touch-driven interfaces, so we need to ensure that business progresses sufficiently to support this whole new generation. Just as it is hard today to support ‘green screens’ with Generation Y staff, it will be difficult to support older mouse-driven interfaces with the generations being born, and growing up on iPads, as we speak.