Inclusive XR: accessible augmented reality experiences
Posted by Joe Lamyman in Design and development
In our second post about creating accessible experiences within Extended Reality (XR), we highlight some key considerations for designing accessible augmented reality (AR) experiences with our AR TetraLogical principles cube.
You can also explore Inclusive XR: accessible 3D experiences.
What is AR?
Augmented reality blends digital content with the real world. People tend to use it on their mobile devices by scanning their environment; the device then adds interactive content on top of the camera feed, and the combined result is presented on the device's screen. By merging the real and the digital, AR can provide people with contextual information, as well as interesting and engaging experiences.
A common example of an AR experience is digital face filters, where effects are digitally applied to a person's face. When viewed on the device's screen, the face appears to have different effects applied to it.
Another example is viewing the scale of animals in a physical space. When searching for an animal on Google using an Android device, you can view the animal with AR and see the size of the animal in relation to your surroundings.
IKEA Place offers similar functionality, allowing you to place furniture in your room with AR. You can use this to understand the size and positioning of furniture, without having to move and manipulate physical furniture.
There's also the use of AR in Google Maps to overlay directions and help with wayfinding in the physical world. During the Covid-19 pandemic, the NHS used the HoloLens headset, which uses AR to overlay important information on top of the physical world, providing staff with the information they needed to treat patients.
Technical information about AR
The WebXR Device API provides a standardised way to develop AR experiences for the web. The API allows you to create an AR or virtual reality (VR) session and to get information from devices. The WebXR samples page provides examples of the functionality possible with the WebXR API.
At the time of writing, the WebXR Device API specification is a Candidate Recommendation, so it may be subject to change. It's also worth highlighting that browser support for the WebXR Device API is around 73%, most of which comes from Chrome for Android. The API is not supported on iOS devices, but that may change in the future, with Apple adding partial WebXR support in beta releases of iOS 16.
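As a rough sketch of how this uneven support can be handled (this is illustrative, not the prototype's code), you can check whether immersive AR sessions are available before offering the AR option, and keep a non-AR alternative as the default otherwise:

```js
// Check for WebXR and immersive AR support before offering the AR option.
// If AR isn't available, keep the non-AR alternative (for example, an
// accessible 3D model viewer) as the primary experience.
// The "view-in-ar" control ID is hypothetical.
async function offerARIfSupported(arControl) {
  if (!('xr' in navigator)) {
    return; // WebXR Device API is not available in this browser
  }

  const supported = await navigator.xr.isSessionSupported('immersive-ar');
  if (supported) {
    arControl.hidden = false; // reveal the "View in AR" control
  }
}

offerARIfSupported(document.getElementById('view-in-ar'));
```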
Please note that the API only manages the session and information about the device. To render any graphics, you can use WebGL or a framework such as three.js, which we used in our TetraLogical principles cube prototype. Frameworks can make rendering graphics simpler than writing WebGL yourself.
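As a minimal sketch of the rendering side, assuming three.js is loaded via an import map or bundler, the ARButton helper from the three.js add-ons requests an immersive-ar session and hands it to the renderer. The cube below is a simple stand-in, not the prototype's actual model:

```js
import * as THREE from 'three';
import { ARButton } from 'three/addons/webxr/ARButton.js';

const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera(
  70, window.innerWidth / window.innerHeight, 0.01, 20
);

const renderer = new THREE.WebGLRenderer({ antialias: true, alpha: true });
renderer.setSize(window.innerWidth, window.innerHeight);
renderer.xr.enabled = true; // let the renderer manage the WebXR session
document.body.appendChild(renderer.domElement);

// ARButton requests an 'immersive-ar' session when activated
document.body.appendChild(ARButton.createButton(renderer));

// A simple cube standing in for the TetraLogical principles cube
const cube = new THREE.Mesh(
  new THREE.BoxGeometry(0.2, 0.2, 0.2),
  new THREE.MeshNormalMaterial()
);
cube.position.set(0, 0, -0.5); // half a metre in front of the viewer
scene.add(cube);

// setAnimationLoop is required for WebXR; requestAnimationFrame is not used
renderer.setAnimationLoop(() => renderer.render(scene, camera));
```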
Accessible AR experiences
AR experiences tend to rely on people being able to see content on-screen and manipulate their devices to scan their environment. These interactions may not be possible for everyone.
The BBC's Barriers Browser research documents barriers encountered by people with disabilities when using VR environments. Depending on how an AR experience is designed, these same barriers may exclude people from it.
How can we use the information about barriers to create inclusive, accessible AR experiences?
The Inclusive Design Principles (IDP) include key considerations for creating inclusive and accessible experiences. To resolve the barriers encountered by people in AR experiences, we will focus on three principles:
- Provide a comparable experience: Ensure your interface provides a comparable experience for all so people can accomplish tasks in a way that suits their needs without undermining the quality of the content.
- Give control: Ensure people are in control. People should be able to access and interact with content in their preferred way.
- Offer choice: Consider providing different ways for people to complete tasks, especially those that are complex or non standard.
In summary, people must be able to interact with the experience in their preferred way and there must be accessible alternatives to access the same information.
As described in the previous post in this series, Inclusive XR: accessible 3D experiences, using an accessible 3D model viewer alongside an AR experience is a great way to provide an alternative for people who cannot or do not want to use AR:
Fundamentally, being able to view 3D models also provides people with control. Due to environment, hardware, situation, or preferences, people may not be able to view an object using augmented or virtual reality methods.
Providing alternative ways of interacting can allow more people to access AR experiences. In our AR prototype, the primary method of interaction allows people to place the virtual model anywhere they like. Because placement is not restricted to detected surfaces, people who cannot move their devices are still able to use the demo. If they would like to, people can change the placement method so that the model is placed on detected surfaces.
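To illustrate the difference between the two placement methods, here is a hedged sketch (not the prototype's implementation): the default method positions the model relative to the viewer, so no device movement is needed, while the optional method uses the WebXR hit-test feature to find surfaces:

```js
// Default placement: position the model relative to the camera,
// so no device movement or surface scanning is required.
function placeInFrontOfViewer(model, camera) {
  model.position.set(0, 0, -0.5).applyMatrix4(camera.matrixWorld);
  model.visible = true;
}

// Optional placement: use the WebXR hit-test feature to find surfaces.
// The session must have been requested with requiredFeatures: ['hit-test'].
async function createHitTestSource(session) {
  const viewerSpace = await session.requestReferenceSpace('viewer');
  return session.requestHitTestSource({ space: viewerSpace });
}

// In the render loop, place the model on the first detected surface.
function placeOnSurface(model, frame, hitTestSource, referenceSpace) {
  const hits = frame.getHitTestResults(hitTestSource);
  if (hits.length === 0) {
    return; // no surface found yet
  }
  const pose = hits[0].getPose(referenceSpace);
  if (pose) {
    const { x, y, z } = pose.transform.position;
    model.position.set(x, y, z);
    model.visible = true;
  }
}
```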
When designing AR experiences, the primary experience must be accessible to all.
Using the Barriers Browser research and the Inclusive Design Principles, the following considerations have been made in our prototype:
- Allowing for different controls, to address motor barriers
- Designing a clear user interface (UI) that makes interactions and possible actions obvious, to address cognitive barriers
- Using different models, curated viewpoints, and status messages, to address visual barriers
Allow for different controls
As described earlier, people may interact with AR experiences in a number of different ways, and they must not be excluded because of the interaction method they use. Instead, people should be given control to interact with the experience in a way that works for them.
AR experiences on smartphones tend to rely heavily on touch controls to manipulate virtual models. This can exclude people who use a switch device or a Bluetooth keyboard. AR experiences must support these methods and not presume that everyone is using the same devices or controllers.
AR experiences must include controls that can be used by people browsing with assistive technologies. With DOM Overlays, you can use semantic HTML elements to create an interface that everyone can use. HTML elements such as <button> are usable with keyboards and switches by default, and can be used in AR experiences. Use these controls: they can be operated with a number of different devices and interaction methods, and they do not rely on people being able to interact in a specific way, such as by dragging and dropping. You can read more about drag and drop accessibility in our post on pointer gestures.
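Below is a hedged sketch of how the dom-overlay feature can expose native HTML controls during an AR session; the element IDs are hypothetical and not taken from the prototype:

```js
// The overlay root is plain HTML inside the page, for example a container
// with native controls such as <button id="place-model">Place model</button>.
async function startAR() {
  const overlay = document.getElementById('ar-overlay');

  // Requesting the dom-overlay feature keeps the HTML controls available
  // (and announced by assistive technologies) while the AR session runs.
  const session = await navigator.xr.requestSession('immersive-ar', {
    requiredFeatures: ['hit-test'],
    optionalFeatures: ['dom-overlay'],
    domOverlay: { root: overlay }
  });
  return session;
}

// Native buttons keep their default keyboard and switch support in AR,
// so no drag and drop or other pointer-specific gestures are needed.
document.getElementById('place-model').addEventListener('click', () => {
  // place the model using whichever placement method is currently selected
});
```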
Additionally, there may be a temptation to recreate physical experiences in AR. An example of this would be having to physically walk around a virtual cube to view all of its different panels. An interaction like this may not be possible for people who cannot move their device or cannot easily move. Instead, it's important to remember that virtual objects do not have physical limitations. Using scripting, virtual objects can easily be moved around, as evidenced with the IKEA Place furniture example. Providing controls for viewing different viewpoints, without having to move around the object, makes the experience usable for more people.
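For example, a control can rotate the model so that the next panel faces the viewer, rather than asking the person to walk around it. This is an illustrative three.js sketch, with a hypothetical control ID:

```js
import * as THREE from 'three';

// Rotate the model a quarter turn so the next panel faces the viewer,
// instead of requiring the person to physically move around it.
function showNextPanel(model) {
  model.rotation.y += THREE.MathUtils.degToRad(90);
}

// Wired to an HTML control in the DOM overlay; "cube" is the model
// object created elsewhere in the scene.
document.getElementById('next-panel')
  .addEventListener('click', () => showNextPanel(cube));
```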
Design a clear user interface
When creating controls, the UI can make it clear what is possible, what is interactive, and what the available actions are.
In our prototype, the UI updates to provide people with available actions. When a model has not been placed, the UI options are only:
- Place model
- Change placement method
- Exit AR
These UI controls present the available actions to people using the demo. People are only given controls and information that are relevant at that point in the experience.
Once the model has been placed, the UI updates to provide people with controls that can animate the model or change the viewpoints offered. Updating the UI ensures that the actions offered are relevant.
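One straightforward way to do this is to toggle the hidden attribute on groups of controls as the experience changes state. The control IDs below are hypothetical, and the approach is a sketch rather than the prototype's implementation:

```js
// Controls offered before and after the model is placed (IDs hypothetical).
// The "Exit AR" control sits outside these groups so it is always available.
const prePlacementControls = ['place-model', 'change-placement'];
const postPlacementControls = ['animate-model', 'change-viewpoint'];

function updateControls(modelPlaced) {
  // Only offer the actions that are relevant at this point in the experience.
  prePlacementControls.forEach((id) => {
    document.getElementById(id).hidden = modelPlaced;
  });
  postPlacementControls.forEach((id) => {
    document.getElementById(id).hidden = !modelPlaced;
  });
}

// Example: once the model has been placed, switch to the post-placement UI.
updateControls(true);
```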
The presence of the UI helps with comprehension, as well as helping to manage a person's expectations. People should be able to understand the outcome of the interaction before even selecting the control.
Finally, allow people to exit the experience at any time. Do not constrain people's usage based on planned user journeys. If somebody finds the experience unsafe, or it causes them anxiety, they must be able to leave whenever they want to, for their own safety. In our prototype, the "Exit AR" control is always available and can be used at any time.
Provide alternatives and communicate clearly
Colour contrast is an important consideration when overlaying virtual models on the physical space. We cannot predict the environment in which somebody may be using AR. If they are in a dark room with low light levels, there may be enough contrast to easily see the prototype's white TetraLogical cube. However, if they are in a well-lit white room, it may be difficult to see the white cube. To make this easier, the prototype provides an option to add a black outline to the model, which provides enough contrast between the model and any light backgrounds.
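One way to achieve an outline like this in three.js is to draw the model's edges as black lines and toggle their visibility from a control. This is a sketch under that assumption, not the prototype's implementation, and the control ID is hypothetical:

```js
import * as THREE from 'three';

// Build a black edge outline for a mesh so it stays visible against
// light backgrounds, and attach it to the model so they move together.
function createOutline(mesh) {
  const edges = new THREE.EdgesGeometry(mesh.geometry);
  const outline = new THREE.LineSegments(
    edges,
    new THREE.LineBasicMaterial({ color: 0x000000 })
  );
  outline.visible = false; // off until the person chooses to enable it
  mesh.add(outline);
  return outline;
}

// "cube" is the model mesh created elsewhere in the scene.
const outline = createOutline(cube);
document.getElementById('toggle-outline').addEventListener('click', () => {
  outline.visible = !outline.visible;
});
```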
Earlier, we discussed ensuring people can use the controls they want when using the AR experience. As part of this, people browsing with a desktop or mobile screen reader must be included. While not covered in the BBC's Barriers Browser research, you must provide a comparable experience for those using a screen reader.
Our demo offers viewpoints so that people can understand the information present in the model. The status messages used in the prototype use ARIA live regions to confirm a person's actions. When placing a model, the status message provides feedback, confirming that the model has been placed in case the person using the demo is unable to see it.
The status messages also remain visible until dismissed. This approach allows people to access the information for as long as they need to, rather than hiding it after a short period of time. This is important as people may not be familiar with AR experiences. When interactions take a while, such as scanning surfaces to place a model on, the status message provides feedback.
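As a hedged sketch of how such a status message might be wired up (the element IDs are hypothetical), the container can use role="status" so updates are announced politely by screen readers, and it stays visible until explicitly dismissed:

```js
// The status container is assumed to exist in the DOM overlay, for example:
// <div id="status" role="status"></div> with a <button id="dismiss-status">.
// role="status" gives the element an implicit aria-live="polite", so screen
// readers announce updates without interrupting the person.
const status = document.getElementById('status');

function announce(message) {
  status.textContent = message;
  status.hidden = false; // remains visible until explicitly dismissed
}

document.getElementById('dismiss-status').addEventListener('click', () => {
  status.hidden = true;
  status.textContent = '';
});

// Example: confirm placement in case the person cannot see the model appear.
announce('Model placed. Use the viewpoint controls to explore it.');
```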
Summary
AR allows people to view virtual models in physical spaces but must be designed and created in ways that give control, offer choice and provide a comparable experience.
By allowing for interaction with different devices, offering clear controls and by allowing people to use the experience in a way that meets their needs, we can create inclusive and accessible AR experiences.
More information
- Build an augmented reality (AR) app using the WebXR Device API, Google Codelabs
- Fundamentals of WebXR, MDN
- WebXR Standards and Accessibility Architecture Issues, W3C
Next steps
If you're currently designing an XR product, our design review service will provide you with our accessibility expertise and guidance to build an accessible experience. If you already have an XR product, our assessments can help you to understand whether your product meets accessibility standards.
We like to listen
Wherever you are in your accessibility journey, get in touch if you have a project or idea.