We download mobile apps from stores, tap their icons, try out their features and use them to their full potential. But has it ever occurred to you that some people cannot see the phone screen well, or at all? We think about such users, too, when we develop an app, so that it is accessible to everyone. Our helper in this case is the VoiceOver feature in iOS and iPadOS.
One of the pillars of human communication with a mobile app is vision. In a world of digital technology, it is often taken for granted, but not all of us can fully rely on this sense. A significant part of our society cannot use their eyesight. According to the Czech Statistical Office (ČSÚ), there were 62.4 thousand people with severe visual impairment and blind people in the Czech Republic in 2018. According to the European Union, there were 2.5 million blind people and 23.8 million people with severe visual impairment in Europe in 2010.
Fortunately, mobile phones offer a set of accessibility options for adapting mobile apps so that blind and visually impaired people are able to use them. Different mobile phone manufacturers offer different options. In this article we will focus on the iOS/iPadOS platform - i.e., iPhones and iPads. We'll delve into the Android platform in the next part of this series.
"The main element of visual accessibility on the iOS platform is VoiceOver. It is a screen reader that the user controls with special gestures. In the phone's settings, the user can set how they want VoiceOver to turn on, how fast they want the reader to read, what language to use, what gestures to use and other things that will make communication easier," explains Mobile team leader Ondřej Franek.
Typically, VoiceOver can be turned on/off with Siri, with a triple press of the side button (on iPhones with Face ID), or with a triple press of the Home button (on older iPhones). Browsing items then works by touching the display and sliding your finger across the screen: VoiceOver reads and highlights the item the user is currently touching. To move to the next item, the user swipes right; for the previous item, they swipe left. To select the current item, the user double-taps.
On the Apple Watch, VoiceOver can be turned on in the settings of the paired phone (using the Watch app), via the Siri voice assistant, or by triple-pressing the Digital Crown if that shortcut is enabled. On the watch, VoiceOver works, for example, by automatically announcing the time when you lift your wrist and by providing navigation feedback as you move through the interface with the crown. Touch gestures are similar to those on iPhone or iPad.
In order for VoiceOver to function properly, the application must be fully compatible with it.
To ensure this, we need to pay close attention to the following four topics.
Mobile apps often contain various specific UI elements that VoiceOver does not recognise on its own. Often these are multiple elements that form a meaningful whole - e.g., a video player that consists of several buttons (show title, play/pause, settings, close, timeline, etc.).
"These special features, such as the video player mentioned above, must be tagged by the app to work properly. On the other hand, elements that are not important for the interaction and would hinder the user should be marked as unwanted. These can be, for example, additional visual elements such as a horizontal separator that support readability," describes developer Adam Pečev, adding that the isAccessibilityElement attribute is used to mark the elements.
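A minimal sketch of this tagging in UIKit (the view and property names are illustrative, not taken from the app described above):

```swift
import UIKit

final class VideoPlayerView: UIView {
    let playButton = UIButton(type: .system)
    let closeButton = UIButton(type: .system)
    let separator = UIView() // purely decorative horizontal line

    func configureAccessibility() {
        // Decorative elements should be invisible to VoiceOver,
        // otherwise they become meaningless extra swipe stops.
        separator.isAccessibilityElement = false

        // Interactive controls must be exposed to VoiceOver.
        // (UIButton is accessible by default; a custom-drawn control
        // would need this set explicitly.)
        playButton.isAccessibilityElement = true
        closeButton.isAccessibilityElement = true
    }
}
```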
To move to the next or previous element, the user must swipe right or left. When moving to the next element, VoiceOver UI reads the items from left to right and top to bottom. This applies to Czech. Depending on the language, the direction may be reversed (right to left, top to bottom).
In mobile applications, related elements are often grouped vertically or arranged in non-standard layouts. For screen readers to interpret these groups and read them in the correct order, the application must be prepared for it. By representing each grouping with a UIAccessibilityElement, the app can ensure that screen readers read it correctly.
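One way to sketch such a grouping: a card view whose three labels are merged into a single UIAccessibilityElement, so VoiceOver reads them as one stop, in an order the app controls (the class and label names are hypothetical):

```swift
import UIKit

final class ArticleCardView: UIView {
    let titleLabel = UILabel()
    let authorLabel = UILabel()
    let dateLabel = UILabel()

    override var accessibilityElements: [Any]? {
        get {
            // One synthetic element replaces three separate swipe stops
            // and fixes the reading order (title, author, date).
            let element = UIAccessibilityElement(accessibilityContainer: self)
            element.accessibilityLabel = [titleLabel.text, authorLabel.text, dateLabel.text]
                .compactMap { $0 }
                .joined(separator: ", ")
            // The element covers the whole card on screen.
            element.accessibilityFrameInContainerSpace = bounds
            return [element]
        }
        set {}
    }
}
```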
The user interface often includes a variety of visual elements by which the user can tell what action will be triggered. A user icon that shows the user profile, or whether the user is logged in or not. Different coloured buttons, where the colour indicates the type of action - reject/confirm. Animation of a text box during user login, indicating that the user has entered an incorrect password. All these visual elements, as well as some other elements, need to be converted to text form using the accessibilityLabel attribute and, as the case may be, accessibilityHint.
The first specifies the text to be read by VoiceOver and should be short and concise. If accessibilityLabel is not set, VoiceOver falls back to the text of the element itself (icon name, button text, title text, ...). If accessibilityLabel alone is not descriptive enough, accessibilityHint should be set as well. It describes the result of activating the item - e.g., "Show all items" - and is read by VoiceOver only after the accessibilityLabel.
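A short sketch of both attributes on the kinds of elements mentioned above (the labels and button names are illustrative):

```swift
import UIKit

let profileButton = UIButton(type: .custom)
// An icon-only button has no useful text of its own,
// so the label must be supplied explicitly.
profileButton.accessibilityLabel = "Profile"                   // short and concise
profileButton.accessibilityHint = "Shows your account details" // read after the label

let confirmButton = UIButton(type: .system)
// Colour alone (green confirm vs. red reject) is invisible
// to VoiceOver users, so the action type belongs in the text.
confirmButton.accessibilityLabel = "Confirm order"
```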
"In addition to the usual simple elements, more complex elements such as the carousel (horizontal list) appear quite often in mobile applications. These can pose a problem. Specifically, for example, for Czech, where VoiceOver counts on text being written from left to right, it starts reading all the items in the carousel one by one until it reaches the end of the carousel. Only then does it read the next item below," says Adam.
"This ensures that the user can move up and down in the carousel by swiping up and down. But if the user swipes left or right, the whole carousel will behave as a single element and the user can simply skip it if they want to. For more detail, I recommend the Apple documentation," adds Adam.
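One common way to get this behaviour, assuming the carousel follows Apple's pattern for adjustable elements (the class is a sketch, not the app's actual code), is the `.adjustable` trait: the whole carousel becomes a single VoiceOver element, swiping up/down calls accessibilityIncrement()/accessibilityDecrement() to move between items, and swiping left/right skips past the carousel entirely:

```swift
import UIKit

final class CarouselView: UIView {
    var items: [String] = []
    private var selectedIndex = 0 {
        // VoiceOver announces the value whenever it changes.
        didSet { accessibilityValue = items[selectedIndex] }
    }

    // The whole carousel is one element to VoiceOver...
    override var isAccessibilityElement: Bool {
        get { true }
        set {}
    }

    // ...with the adjustable trait, so vertical swipes adjust it.
    override var accessibilityTraits: UIAccessibilityTraits {
        get { .adjustable }
        set {}
    }

    // Swipe up: move to the next item in the carousel.
    override func accessibilityIncrement() {
        guard selectedIndex < items.count - 1 else { return }
        selectedIndex += 1
    }

    // Swipe down: move to the previous item.
    override func accessibilityDecrement() {
        guard selectedIndex > 0 else { return }
        selectedIndex -= 1
    }
}
```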
For a mobile app to be usable by truly anyone, VoiceOver support is essential. Once the app has been checked and the points above have been implemented, it should be largely ready for user testing by a visually impaired user. Only then will it become clear whether the app is really usable in real life.