
 Apps iOS

[Narrator:] Native iOS apps are typically written in Swift or Objective-C. Apple’s development environment is called Xcode. Even though other companies offer different solutions, the majority of native apps are written with these languages and tools, which offer the best integration with iOS. If an app requires iOS-specific features or intensive computational power, for example when writing a 3D game, then the only option is to use these tools.

In general, a developer does not need to do much to make an app accessible. Many features work out of the box without changing a single line of program code. They are implemented at the operating-system level. These are zoom, colour inversion, greyscale mode, balanced audio and assistive touch. If these features are new to you, please consult the Assistive technology – iOS chapter. There is one exception: VoiceOver.

VoiceOver

VoiceOver is the screen reader built into iOS. A screen reader is designed for users with visual impairments. If you do not know what a screen reader is, you can learn more about it in the Screen readers – iOS chapter.

A screen reader reads text aloud to the user, but what if the text on the screen is not enough to make the app usable? There might also be images, which would require alternative descriptions. The screen reader also reads the alternative text of the graphical user-interface elements. This is why we will be focusing on making an app compatible with the VoiceOver screen reader.

VoiceOver architecture: Here is a quick overview of how VoiceOver works. An app interacts with the user-interface elements on the screen via the UIKit framework. UIKit provides the window and view architecture for implementing the user interface, the event-handling infrastructure for accepting multiple types of input, like multi-touch, and the main run loop needed to manage interactions between the user, the system and the app. The UIAccessibility protocol provides accessibility information about an app’s user-interface elements. VoiceOver conveys this information to users with disabilities to help them use the app and handles the text-to-speech conversion, making the result audible to the user.

If the user is working with a tactile braille display, they can also read text output with their fingers. Standard UIKit controls and views implement the UIAccessibility methods and are therefore accessible to assistive apps by default. This means that if an app uses only standard controls and views, such as UIButton or UITableView, a programmer only needs to supply app-specific details when the default values are incomplete.
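For example, a standard UIButton is already exposed to VoiceOver; the developer only has to supply a detail the control cannot derive itself, such as a spoken label for an icon-only button. A minimal Swift sketch, as an illustration only (the icon and label are assumptions, not code from the video):

    import UIKit

    // A standard control showing only an icon: UIButton already implements the
    // UIAccessibility methods, so the one app-specific detail missing is a spoken label.
    let playButton = UIButton(type: .system)
    playButton.setImage(UIImage(systemName: "play.fill"), for: .normal)
    playButton.accessibilityLabel = "Play"   // without this, VoiceOver has little more than "button" to announce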

In the next section, we will see a brief example of how to make an app accessible. Please note that this chapter is not a substitute for a complete iOS programming course; refer to Apple’s documentation for full details on how to make an app accessible.

Accessibility attributes of the UIAccessibility protocol: The accessibility attributes of the UIAccessibility protocol convey information relating to user-interface elements and must be implemented by the app programmer. The attributes are as follows:

  • The isAccessibilityElement attribute describes a Boolean value indicating whether the element is readable by VoiceOver.
  • The accessibilityLabel attribute returns a text description of the element.
  • The accessibilityTraits attribute determines the behaviour of a user-interface element with the help of a bitmask of integers.
  • The accessibilityHint attribute provides additional help beyond the label; supplying it is optional.
  • The accessibilityValue attribute describes a dynamically changing value of a user-interface element.

An important point for understanding the necessity of accessibility attributes is that when users operate the app through VoiceOver, they move the VoiceOver cursor around the screen themselves. A programmer has no influence over where the VoiceOver cursor will be next and is therefore responsible for ensuring that all necessary information is provided to VoiceOver once the cursor hits a user-interface element on the screen. The information is sent to VoiceOver using the accessibility attributes.

The Hello world! of VoiceOver – static alternative text: The most basic application scenario for an accessibility attribute is the labelling of an image. These code lines generate a UIImageView whose image is loaded from an image file. If the VoiceOver cursor hits this image, VoiceOver will try to figure out what to tell the user. The user will be informed that there is an image but, if no additional information can be found, will have no information about its content.

By adding the following line of code, VoiceOver can offer a much better description of the image. Please note that you do not have to mention the element type in the description, as in “Photo of a finger”; VoiceOver will always tell the user what kind of user-interface element has been found.
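The code shown in the video is not reproduced in this transcript. A minimal Swift sketch of the same idea (the asset name and the description are assumptions made for illustration):

    import UIKit

    // An image view whose content is loaded from an image file.
    // On its own, VoiceOver only announces that there is an image.
    let imageView = UIImageView(image: UIImage(named: "pointing-finger"))

    // The additional line: describe the content, without naming the element type.
    imageView.accessibilityLabel = "A finger pointing to the right"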

Dynamic values: We have now seen simple labelling, but how do we communicate with VoiceOver that the value of an element can change? The first line of the program code makes the user-interface element visible to VoiceOver.

For UIKit classes that present information, such as UILabel, UIButton or UIImageView, the default is “true”; for elements whose default is “false”, we need to set isAccessibilityElement ourselves. As our temperature value offers no possibility for interaction, we set the accessibilityTraits attribute to “none”.

The accessibilityLabel attribute tells the user what the value describes, whereas the accessibilityValue attribute holds the actual value information. As the text was set as an accessibilityValue attribute, VoiceOver knows that it has to update the value every time the VoiceOver cursor is focused on it.

If we had only used an accessibilityLabel attribute, then VoiceOver would only buffer the initial value, presenting it again when the cursor next reaches it. The temperature value would never change for a VoiceOver user as long as the app was running.
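The program code discussed here is likewise not part of the transcript. A sketch of the pattern being described, assuming a simple temperature label (names and strings are illustrative only):

    import UIKit

    let temperatureLabel = UILabel()

    // First line: make the element visible to VoiceOver
    // (only necessary if the default for this element is "false").
    temperatureLabel.isAccessibilityElement = true

    // The value offers no possibility for interaction.
    temperatureLabel.accessibilityTraits = .none

    // The label says what the value describes ...
    temperatureLabel.accessibilityLabel = "Outside temperature"

    // ... while the value holds the actual, changing information. Because it is
    // stored in accessibilityValue, VoiceOver re-reads it whenever the cursor
    // lands on the element again.
    func update(temperature: Double) {
        temperatureLabel.text = "\(temperature) °C"
        temperatureLabel.accessibilityValue = "\(temperature) degrees Celsius"
    }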

accessibilityTraits

Using the accessibilityTraits attribute, an app tells VoiceOver what type of user-interface element it is dealing with. Is it a button, an image, static text or something else? Depending on these settings, VoiceOver will behave differently for each element. The accessibilityTraits attribute also allows us to tell VoiceOver that the user can interact with the user-interface element. A user-interface element can use more than one trait; for example, it could be an image and a button at the same time.

Let’s assume we were to write a music player app. The interface could look like this. The text at the top could be static text; the cover art, an image; the play, pause, forward and back icons would be buttons; and the volume bar would be adjustable. Traits can be configured in two ways: in the source code, as seen before, or using the Interface Builder in Xcode, in the Identity Inspector panel.
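A Swift sketch of how such traits might be assigned in code for the music player just described (all element names are assumptions):

    import UIKit

    let titleLabel = UILabel()
    titleLabel.accessibilityTraits = .staticText      // the text at the top

    let coverArtView = UIImageView()
    coverArtView.accessibilityTraits = .image         // the cover art

    let playButton = UIButton(type: .custom)
    playButton.accessibilityTraits = .button          // play, pause, forward, back

    let volumeSlider = UISlider()
    volumeSlider.accessibilityTraits = .adjustable    // the volume bar

    // More than one trait at a time: an element that is an image and a button.
    let coverArtButton = UIButton(type: .custom)
    coverArtButton.accessibilityTraits = [.image, .button]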

iOS offers many more possibilities for improving accessibility, but for now, we will continue with a basic example.

A basic example

Here we have a little demo app. It shows a table of recipes. Each cell shows an image, a title and an evaluation indicator displayed as a set of hearts. Once we touch a cell, we see the name of the recipe, shown as its title, together with a large image and a slider. We can use the slider to mark how much we like a recipe.

In Xcode, we have created a storyboard using a navigation controller and two view controllers. The first shows the table and the second, the evaluation screen.

In the source code, we have the root view controller. For demo purposes, we have created an array of recipes as a data source with images, titles and labels. The rest of the code contains standard table-handling code. We customised the table cell layout. The code for handling the cell content is here.

We created a class for our recipes, so we can pass data between the different view controllers more easily. Finally, we have code for showing the large image and slider handling.
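The project’s source code is not included in this transcript; a plausible shape for the recipe class just mentioned (its name and properties are assumptions) could be:

    import UIKit

    // A simple model object so recipe data can be passed between view controllers.
    class Recipe {
        let title: String              // e.g. "Curry rice"
        let image: UIImage?            // the photo shown in the cell and on the detail screen
        let imageDescription: String   // alternative text for the photo
        var rating: Int                // number of hearts, 0-5

        init(title: String, image: UIImage?, imageDescription: String, rating: Int) {
            self.title = title
            self.image = image
            self.imageDescription = imageDescription
            self.rating = rating
        }
    }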

We start VoiceOver on the iPhone to see how the app works for a visually impaired user.

[Screen reader:] Curry rice. 3 orange hearts emojis.

[Narrator:] The number of hearts indicates how much we like a recipe, but VoiceOver does not present its meaning.

[Screen reader:] Image.

[Narrator:] VoiceOver detects an image, but cannot tell us anything about its content.

[Screen reader:] Hamburger. Orange heart emoji. Orange heart emoji. Image. Hotdog. 5 orange hearts emojis.

[Narrator:] We do not like the sequence in which VoiceOver speaks these elements. We would like the title to be spoken first, then the image description and finally the evaluation indicator.

[Screen reader:] Image. Image.

[Narrator:] There is no description for the large image.

[Screen reader:] How do you like this recipe? 25%. Adjustable. Swipe up or down with one finger to adjust the value. 35%. 45%. 55%. 65%.

[Narrator:] The functionality of the slider is not easy to understand.

[Screen reader:] Demo Back button. Demo.

[Narrator:] We need to fix this. We start with the table cell. First, we give our image an accessibilityLabel. We do not like the counting of heart symbols, so we set an accessibilityLabel for each of the different presentations. We create a function that sets the values of an array named accessibilityElements to define the sequence in which VoiceOver will read the elements.

The accessibilityElements array is a defined property of all UIKit objects. We need to call this function in the root view controller whenever a new table cell is generated. Additionally, we want to pass the label of the selected recipe to the recipe view controller.
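A sketch of what such a cell configuration might look like, using the Recipe class sketched above (the cell class, its outlets and the rating wording are assumptions, not the video’s exact code):

    import UIKit

    class RecipeTableViewCell: UITableViewCell {
        @IBOutlet weak var titleLabel: UILabel!
        @IBOutlet weak var recipeImageView: UIImageView!
        @IBOutlet weak var heartsLabel: UILabel!

        // Called from the root view controller whenever a new table cell is generated.
        func configureAccessibility(for recipe: Recipe) {
            // Alternative text for the image.
            recipeImageView.accessibilityLabel = recipe.imageDescription

            // Replace the counted heart symbols with a spoken evaluation.
            let evaluations = ["Not rated", "Terrible", "Bad", "So-so", "Good", "Excellent"]
            heartsLabel.accessibilityLabel = evaluations[recipe.rating]

            // Define the sequence in which VoiceOver reads the elements:
            // title first, then the image description, then the evaluation.
            accessibilityElements = [titleLabel!, recipeImageView!, heartsLabel!]
        }
    }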

Let’s continue with the view controller showing the recipe. As we have seen previously, we can add an accessibilityLabel to the image. We do the same for the slider and, to offer the user additional help on what to do, we add an accessibilityHint as well.
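Again, the exact code is not part of this transcript; a sketch of these changes in the recipe view controller (outlet names and wording are assumptions) might be:

    import UIKit

    class RecipeViewController: UIViewController {
        @IBOutlet weak var largeImageView: UIImageView!
        @IBOutlet weak var ratingSlider: UISlider!
        var recipe: Recipe!    // passed in from the root view controller

        override func viewDidLoad() {
            super.viewDidLoad()

            // Alternative text for the large image.
            largeImageView.accessibilityLabel = recipe.imageDescription

            // A label for the slider and an additional hint on how to use it.
            ratingSlider.accessibilityLabel = "Evaluate how much you like this recipe"
            ratingSlider.accessibilityHint = "Select the percentage to indicate how much you like this recipe."
        }
    }

With these changes in place, let’s try this with VoiceOver.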

[Screen reader:] Curry rice. A plate of curry rice with carrots and peas, image. So-so. Hamburger. A hamburger with tomato, cheese and salad, image. Bad. Hotdog. A hotdog with bread, sausage and mustard, image. Excellent. A hotdog with bread… Hotdog. Bad. A hamburger with tomato, cheese and salad, image. How do you like this recipe? Evaluate how much you like this recipe. 24%. Adjustable. Select the percentage to indicate how much you like this recipe. Swipe up or down with one finger to adjust the value. 34%. 44%. 54%. 64%. 74%. Demo Back button. Demo. Demo, heading.

[Narrator:] This looks much better now.

Testing using the simulator

When running the app on an iPhone, we can use VoiceOver for our practical tests. Unfortunately, the simulator does not offer the VoiceOver functionality. But Xcode offers an additional utility called the Accessibility Inspector. The Accessibility Inspector allows us to check the accessibility attributes of user-interface elements in Inspection Mode, get a preview of all accessibility elements without leaving the app, and analyse the app to find common accessibility issues.

For our demo, we are going to use the app before the fixes have been applied. We start our app in the simulator. We go back to Xcode and start the Accessibility Inspector via Xcode, Open Developer Tool, Accessibility Inspector. In the target pop-up, we select the simulator. Please note that the inspector can also be used to check the accessibility of macOS, iPadOS, watchOS and tvOS apps.

After activating the Inspection Pointer, we can move the mouse over all elements in the simulator to see which ones will trigger VoiceOver and how it will react. If we move the mouse over the image, we see the element type, which is an image, but no further information. We try to move the mouse over the ranking hearts, but we cannot select them.

Beyond this basic information, we can trigger actions for the elements, get some technical information about the element in focus and see the view hierarchy of the element in the app view hierarchy.

We can also choose to audit the app. The Inspector Audit will run through the app to find all elements on a screen that might cause accessibility issues. Let’s do this with our app. The audit reports a set of issues. Each issue has a short title, which can be expanded into a longer description. When we click on an issue, the audited element will be highlighted in the simulator. By clicking on the eye, we can open a new and larger window, framing the element in question. The question mark symbols behind each issue will offer us a hint on how to fix the problem.

We want to take a closer look at the last two warnings. We are using a fixed font size for our texts, which should be avoided, as users will want to resize the text themselves. We can simulate this using the Inspector Settings. We see here a slider for the font size. We move it, and the Ingredients texts do not change their size. We definitely need to fix this.

We enter the storyboard in the Interface Builder in Xcode, select the labels and activate the Automatically adjust font size switch. For the font, we will not use a fixed font size but instead select Headline and Body. We recompile our app and start it again in the simulator. When we move the font-size slider, we can immediately see the result.
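The same result can also be achieved in code rather than in the storyboard; a brief sketch using Dynamic Type (the label names are assumptions):

    import UIKit

    let titleLabel = UILabel()
    let ingredientsLabel = UILabel()

    // Use a text style instead of a fixed font size, and let each label
    // adjust automatically when the user changes the preferred text size.
    titleLabel.font = UIFont.preferredFont(forTextStyle: .headline)
    titleLabel.adjustsFontForContentSizeCategory = true

    ingredientsLabel.font = UIFont.preferredFont(forTextStyle: .body)
    ingredientsLabel.adjustsFontForContentSizeCategory = true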

The Inspector Settings can do more for us. We can invert the colours, reduce transparency or reduce motion if our app uses these features. As you can see, the simulator and its Accessibility Inspector can be powerful tools.

Testing using a device

As the final goal is for the app to run on a real device, we should always test on real devices. Here we can evaluate all the different settings and operations our app offers by running them as the users will experience them. This is the only way of testing whether your app can access and use device-specific functions like the accelerometer or geolocation.

Be sure to test the app on more than one device. The screen size or the operating-system version may make a difference.

If you are able to install your app on real devices, why not invite potential users to test it? Along with the other testing methods, user testing can provide specific and valuable insights about the usability of your app.

Apple’s App Store environment offers the possibility of distributing pre-release versions of the app to a specific user group. For practical tests, you need to know how the accessibility features of the operating system work. If you are not sure, please refer to the Assistive technology – iOS chapter.

If you do not know how to operate the screen reader, please consider studying the Screen readers – iOS chapter.

Where to continue?

You have now been introduced to ways of improving the accessibility of iOS apps. In this chapter, we have just scratched the surface of iOS accessibility programming. Please refer to Apple’s documentation for a more detailed explanation. We highly recommend watching the session videos from Apple’s yearly Worldwide Developers Conference, which offer many practical examples demonstrating how to improve the accessibility of any app.

Depending on your personal interests, you could continue with one of the following chapters:

  • Apps Android

  • Apps cross-platform

[Automated voice:] Accessibility. For more information visit: op.europa.eu/web/accessibility.
