XR Accessibility: for people with moving disabilities
Posted by Joe Lamyman in Design and development
Extended Reality (XR) experiences are immersive and tend to rely on movement-based interactions. But we need to consider alternative input methods for people who can't make those movements.
In this post, we’ll explore considerations for designing and developing inclusive XR experiences for people with moving disabilities.
If you haven't already, you can also explore other articles in this series including:
- Introduction to XR accessibility
- XR Accessibility: for people with seeing disabilities
- XR Accessibility: for people with hearing disabilities
- XR Accessibility: for people with thinking disabilities
You can also watch my Introduction to XR accessibility talk from Inclusive Design 24 (#id24).
Who does this affect?
People with moving disabilities might include:
- People who have multiple sclerosis and experience muscle weakness and spasms
- People who have arthritis and find some movements difficult or painful
- People with Parkinson's disease who experience tremors
- People who use a mobility aid like a wheelchair or walking stick
- People who have a broken arm and are unable to use it
As we mentioned in Introduction to XR accessibility, there are different types of disability that might be permanent, temporary, or situational. Regardless, we need to create experiences that are usable by everyone.
Provide alternative navigation methods
Augmented Reality (AR) and Virtual Reality (VR) experiences often require people to move around their physical environment in order to interact with the experience.
In an AR experience, locating an object might involve moving your phone around. In a VR experience, it could involve walking up to objects or manipulating them. However, these interactions may not be possible if people are unable to make the required movements. As a result, we need to think about how we can provide alternatives that allow people to use our experiences.
Some user stories to help us think about this need could be:
As someone with a moving disability, I want to be able to navigate through an XR experience without having to physically move.
As someone who uses a wheelchair with their phone mounted in a stand, I want to be able to navigate around an AR experience without having to move my device.
In the case of an AR experience, this could be achieved by providing a control that centres the object in the device's field of view. Instead of requiring people to move around the object to view it, you could provide a control that automatically rotates the object. In addition, you could curate a few interesting viewpoints and present these as controls; when one is selected, snap the object to the relevant viewpoint so that people can view it without moving.
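To make the curated viewpoints idea more concrete, here's a rough sketch in TypeScript. The `Viewpoint` and `PlacedObject` types are illustrative assumptions, not the API of any particular AR framework:

```typescript
// Hypothetical viewpoint data: each entry pairs a label with a rotation
// (in degrees) that frames an interesting side of the AR object.
interface Viewpoint {
  label: string;
  rotationY: number;
}

// Assumed minimal interface for the placed AR object; a real framework
// will expose equivalent operations under different names.
interface PlacedObject {
  setRotationY(degrees: number): void;
  centreInView(): void;
}

const viewpoints: Viewpoint[] = [
  { label: "Front", rotationY: 0 },
  { label: "Side detail", rotationY: 90 },
  { label: "Back", rotationY: 180 },
];

// Build one on-screen button per curated viewpoint, so people can inspect
// the object without moving themselves or their device.
function buildViewpointControls(object: PlacedObject, container: HTMLElement): void {
  for (const viewpoint of viewpoints) {
    const button = document.createElement("button");
    button.textContent = `View: ${viewpoint.label}`;
    button.addEventListener("click", () => {
      object.centreInView();                     // bring the object back in front of the camera
      object.setRotationY(viewpoint.rotationY);  // snap to the curated angle
    });
    container.appendChild(button);
  }
}
```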
For a VR experience, instead of requiring people to walk around, we can provide teleportation functionality. This would allow people to instantly navigate to where they want to be. We would need to make sure that we clearly convey the location that a user is about to teleport to. This way, people have enough information to know whether they want to navigate to that location. Providing this information helps to prevent people from having to teleport multiple times to find the location that they are looking for.
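A minimal sketch of teleportation with a destination preview might look like the following. The `Player` and `Destination` types, and the `announce` and `confirm` callbacks, are assumptions for illustration rather than any engine's real API:

```typescript
// Hypothetical types: a named destination and a player that can be moved
// instantly. Real engines expose equivalents under other names.
interface Destination {
  name: string;        // e.g. "Gallery entrance"
  description: string; // what someone will find there
  position: { x: number; y: number; z: number };
}

interface Player {
  moveTo(position: { x: number; y: number; z: number }): void;
}

// Announce where the person is about to go, and only teleport once they
// confirm, so they don't have to teleport repeatedly to find the right spot.
function teleportWithPreview(
  player: Player,
  destination: Destination,
  announce: (message: string) => void,
  confirm: () => Promise<boolean>
): Promise<void> {
  announce(`Teleport to ${destination.name}: ${destination.description}`);
  return confirm().then((accepted) => {
    if (accepted) {
      player.moveTo(destination.position);
      announce(`You are now at ${destination.name}`);
    }
  });
}
```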
Include motion agnostic interactions
We need to make sure that our interactions don't require a specific physical movement, or fine motor control. People may not be able to make specific movements or gestures, which could pose a barrier. If people cannot complete the interaction, they may be unable to use the experience.
To help summarise this consideration, here are some user needs:
As someone without arms, I want to be able to use all of an XR experience's functionality.
As someone who experiences tremors, I want to be able to target objects without having to hold a VR controller stationary.
For a VR experience, being able to hold motion controllers, perform gestures, and target objects requires a high level of dexterity. People who experience tremors or spasms, or who are unable to move their arms, may find that these interactions pose a barrier. Instead, think about different input methods, such as button presses or an external keyboard. Instead of using a motion controller to target an item, people could use a single button press to interact with the object directly in front of them. Alternatively, people could use a shortcut to access a list of objects around them, in order to navigate to and interact with them.
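As a sketch of the "list of nearby objects" idea, the TypeScript below assumes a hypothetical `Interactable` interface; real engines will expose scene objects differently:

```typescript
// Hypothetical interactable object in the scene.
interface Interactable {
  name: string;
  distanceFromPlayer(): number;
  activate(): void;
}

// Rather than requiring precise pointing with a motion controller, a single
// shortcut could list nearby objects and let people pick one from a menu.
function listNearbyInteractables(
  all: Interactable[],
  maxDistance: number
): Interactable[] {
  return all
    .filter((item) => item.distanceFromPlayer() <= maxDistance)
    .sort((a, b) => a.distanceFromPlayer() - b.distanceFromPlayer());
}

// Example: activating the closest object with one button press.
function activateClosest(all: Interactable[]): void {
  const nearby = listNearbyInteractables(all, 3); // within 3 metres, an arbitrary choice
  if (nearby.length > 0) {
    nearby[0].activate();
  }
}
```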
For example, imagine an AR experience that allows people to play rock, paper, scissors. By default, people might place their hand in front of the camera and make a rock, paper, or scissors gesture. This might work really well and be intuitive for some people. But for people who are unable to make the gesture, provide an alternative method of input. An example of this could be selecting a corresponding rock, paper, or scissors control on-screen.
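One way to structure this is to keep the game logic input-agnostic, so that a recognised gesture and an on-screen button both feed the same action. The sketch below is illustrative; the gesture recogniser itself is assumed to exist elsewhere:

```typescript
type Move = "rock" | "paper" | "scissors";

// The game only cares about the chosen move, not how it was input.
function playMove(move: Move): void {
  console.log(`Player chose ${move}`);
  // ...resolve the round here
}

// Input path 1: a (hypothetical) gesture recogniser calls playMove when it
// detects a hand shape in front of the camera.
function onGestureRecognised(move: Move): void {
  playMove(move);
}

// Input path 2: on-screen buttons provide the same moves for people who
// can't make the gestures.
function buildMoveButtons(container: HTMLElement): void {
  const moves: Move[] = ["rock", "paper", "scissors"];
  for (const move of moves) {
    const button = document.createElement("button");
    button.textContent = move;
    button.addEventListener("click", () => playMove(move));
    container.appendChild(button);
  }
}
```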
Another approach could be to implement a sticky keys-like feature. When enabled, this would allow people to press keys one after another, rather than having to press multiple keys at the same time. It could also be applied to gestures, allowing people to make smaller, single gestures rather than multiple complex movements.
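Here's a rough sketch of what a sticky keys-like feature for gestures might look like, assuming single gestures arrive as named events; the class name and timing window are illustrative:

```typescript
// Sticky keys-style gesture input: instead of requiring two gestures to be
// held at the same time (e.g. "grip" + "point"), accept them one after the
// other within a time window and treat them as a combination.
class StickyGestureInput {
  private held = new Set<string>();
  private timer: ReturnType<typeof setTimeout> | undefined;

  constructor(
    private onCombination: (gestures: string[]) => void,
    private windowMs = 2000 // how long earlier gestures stay "held"; arbitrary default
  ) {}

  gesture(name: string): void {
    this.held.add(name);
    this.onCombination(Array.from(this.held));
    // Clear the held set after the window, rather than requiring a release gesture.
    if (this.timer) clearTimeout(this.timer);
    this.timer = setTimeout(() => this.held.clear(), this.windowMs);
  }
}
```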
At its core, this consideration is about offering choice and providing people with multiple ways to interact with the experience. These alternatives allow people to continue to use all of the functionality in a way that works best and is most comfortable for them.
A key part of designing XR experiences for people with moving disabilities is making sure that we don't try to recreate inaccessible physical environments and interactions. For example, people shouldn't have to open and close doors in a VR experience when this could be done with a single button press. Searching for items can be handled through search functionality, and manipulating objects can be done with single button presses. While we may design interactions based on what we're familiar with from our physical environments, XR experiences don't share those physical constraints. This frees us to be more creative when considering the needs of people with moving disabilities, offering choices and alternatives so that everyone can interact in ways that work best for them.
In summary, experiences can still have unique or interesting interactions. What matters is that alternative methods of input are provided so that people can still use your experience.
Customise interaction target sizes
To help create usable interactions, make target sizes customisable.
By default, interactive objects and controls should be sized appropriately. This helps to make content easy to interact with, and prevents people accidentally selecting the wrong controls. For experiences on the Meta Quest devices, their documentation on User Interface Components recommends that:
Buttons should be at least 6 centimeters or more on each axis. Targets smaller than 5 centimeters quickly drop off in accuracy and speed, and there’s somewhat of a plateau above 6 centimeters.
For the Apple Vision Pro, it's recommended that targets have an area of at least 60 points (pt). This can be achieved by sizing elements so that their area is 60pt or greater, or by using a combination of sizing and spacing to ensure that controls do not overlap within a 60pt area.
However, people may find it easier if content is sized larger than this. As a result, allow people to change the target area size for interactive controls. This could be done through settings that allow people to change the size of user interface (UI) elements, alongside things like text size. Alternatively, it could be achieved through techniques such as making all interactive objects larger. There may also be settings from desktop experiences that could apply to XR: when using VR motion controllers, allowing people to customise the pointer size and the sensitivity of the controllers might also help to make the experience more usable.
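As a sketch, a single UI scale setting could be applied to every interactive element. The `AccessibilitySettings` and `InteractiveElement` types below are assumptions for illustration:

```typescript
// A simple settings object: one multiplier scales all interactive targets,
// alongside other preferences like text size. Names here are illustrative.
interface AccessibilitySettings {
  uiScale: number; // 1 = default size, 1.5 = 50% larger targets, etc.
}

interface InteractiveElement {
  setScale(scale: number): void;
}

// Apply the person's chosen scale to every interactive element, so that
// target sizes can grow beyond the platform minimums when needed.
function applyTargetScale(
  settings: AccessibilitySettings,
  elements: InteractiveElement[]
): void {
  for (const element of elements) {
    element.setScale(settings.uiScale);
  }
}
```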
Given that people's needs change and are dependent on the context of use, customisation ensures that our experiences continue to be easy to use.
Customise the speed of interactions
Another way to create usable interactions is to allow people to change the speed of those interactions.
If the experience requires people to interact at certain points, then allowing people to adjust the speed of the experience, or the window of opportunity for interactions, may help make this more usable.
Similarly, in VR experiences where movement is required, you can allow people to change the speed at which they move. This could be done through different modifiers, or by allowing people to change the sensitivity of their controls.
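A simple sketch of this could be a pair of multipliers stored with the person's other settings; the names and default values below are illustrative:

```typescript
// Hypothetical timing settings: multipliers stretch both the window of
// opportunity for timed interactions and the movement speed.
interface TimingSettings {
  interactionWindowMultiplier: number; // e.g. 2 doubles how long a prompt stays active
  movementSpeedMultiplier: number;     // e.g. 0.5 halves locomotion speed
}

const defaultWindowMs = 3000;

// How long a timed interaction (catching an object, responding to a prompt)
// remains available, adjusted by the person's settings.
function interactionWindowMs(settings: TimingSettings): number {
  return defaultWindowMs * settings.interactionWindowMultiplier;
}

// Movement speed in metres per second, adjusted the same way.
function movementSpeed(baseSpeed: number, settings: TimingSettings): number {
  return baseSpeed * settings.movementSpeedMultiplier;
}
```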
Support speech recognition
At the start of this XR accessibility series, we discussed different input methods like external keyboards. Another input method to consider is speech recognition. This is supported by default on desktop operating systems and on both Android and iOS devices. Speech recognition allows people to use standardised spoken commands, which are interpreted by their device and used to operate an interface. You can read our post, Browsing with speech recognition, for more information about how this works when browsing the web.
For example, imagine a multiplayer VR experience which allows people to move around the interactive environment, interact with important objects, navigate the UI, and talk to other people, all using voice commands. Voice commands should be available for all interactions that someone may wish to complete using any other input method.
To help people use the experience, you could provide a list of the available voice commands in the settings menu. Alongside this, optional functionality that overlays the commands on top of the corresponding controls can help people learn them. Similar functionality is available on mobile devices and allows people to better understand the names of controls and how to interact with them.
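As an illustrative sketch, voice commands could be kept in a single registry that both the recogniser and the settings menu read from. The command phrases and actions below are placeholders, and the speech recognition itself is assumed to be provided by the platform:

```typescript
// A simple registry mapping spoken phrases to actions, plus a helper that
// lists them so a settings menu (or on-screen overlay) can show people
// which commands are available. All names are illustrative.
type Action = () => void;

const voiceCommands = new Map<string, Action>([
  ["open menu", () => console.log("Opening the menu")],
  ["teleport forward", () => console.log("Teleporting forward")],
  ["select object", () => console.log("Selecting the focused object")],
]);

// Called with the phrase produced by the platform's speech recognition.
function handleSpokenPhrase(phrase: string): boolean {
  const action = voiceCommands.get(phrase.toLowerCase().trim());
  if (action) {
    action();
    return true;
  }
  return false; // unrecognised phrase; the UI could suggest similar commands
}

// Used by the settings menu to list every available command.
function listVoiceCommands(): string[] {
  return Array.from(voiceCommands.keys());
}
```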
Summary
XR allows people to experience immersive environments. We must create these environments in a way that includes the needs of people with moving disabilities. This can be done by ensuring that the experiences are easy to navigate, don't rely on people making physical movements or gestures, and provide alternative ways of interacting with the experience.
More information
- Accessibility Options in Virtual Reality Gaming: A Case Study with Myst, Equal Entry
- Adapting VR Games for People with Disabilities, Equal Entry
- Barriers Browser, BBC
- Inclusive XR: accessible 3D experiences, Joe Lamyman
- Inclusive XR: accessible augmented reality experiences, Joe Lamyman
- Multiple Locomotion Styles in Virtual Reality for People with Disabilities, Shivam Sharma at A11yVR
- XR Accessibility User Requirements, W3C
Next steps
If you're currently designing an XR product, our design review service will provide you with our accessibility expertise and guidance to build an accessible experience. If you already have an XR product, our assessments can help you to understand whether your product meets accessibility standards.
We like to listen
Wherever you are in your accessibility journey, get in touch if you have a project or idea.