How Can We Build an Accessible Metaverse for All?

The Metaverse offers a unique opportunity for engagement and interaction — but only if everyone can use it.

  • Article
  • 7 MIN READ
  • Nov 24, 2021
Illustration of a disabled woman trying to access the Metaverse

Summary

There’s no denying that Facebook’s Metaverse signals the advent of a new, exciting virtual era, but one important question is missing from the press coverage and social media buzz — is this brave new digital world built for every type of user?

The answer is no, it’s not. Beyond the limitations of finances and internet access, 15% of the global population — those living with disabilities — is already excluded from accessing the VR space.

Bye, bye browser

It all comes down to technology. The Metaverse is the next step in an accelerating trend that has left the browser wars behind in the history books as apps and software take over the digital experience space. At the same time, the slow, hard-won gains in digital accessibility made possible by our reliance on browsers are being lost much more quickly, leaving people living with disabilities unable to socialize, work, shop, or participate in the Metaverse.

Creating a new virtual space that welcomes everyone means building the Metaverse on an effective blueprint we’re already familiar with — the Web Content Accessibility Guidelines (WCAG) and their four key principles: perceivable, operable, understandable, and robust.

Illustration of VR goggles.

Perceivable: Information and user interface components must be presentable to users in ways they can perceive, and cannot be invisible to all of their senses

Imagine two scenarios. In both, you’re unboxing Facebook’s Oculus headset for the first time.

In the first scenario, you have full vision. You slide open the polished white box, read the instructions, and download the mobile app. Next, you put on your headset and are greeted with an image of your hands on the controllers. These become your pointers, allowing you to make menu selections and set up your Wi-Fi connection. You watch an animated safety video, and then set up the boundaries of your virtual play space. Finally, you land on your home page, featuring a layout you’ve never seen before — but that’s okay, because written labels guide you through.

In the second scenario, you have no vision. You rely on tactile cues to unbox your Oculus, but need a friend to read the instructions out loud. You’re able to download your mobile app, but things go off course as soon as you put your headset on. You can’t perceive your hands in the VR environment, and so can’t direct the pointers to set up menu items, provide Wi-Fi info, or set a safe boundary play area. Maybe a friend can set up the Oculus space for you, but what happens when it’s time to explore? The interface doesn’t recognize voice commands, there’s no built-in screen reader to navigate menu options, and the magnifier feature doesn’t help people without any vision. Third-party tooling isn’t possible without shared standards, and Facebook isn’t talking.
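
To make that gap concrete, here’s a minimal sketch of the kind of built-in narration a non-sighted user would need, using the browser’s standard Web Speech API. It assumes a browser-based layer rather than the headset’s actual interface, and the menu labels are hypothetical:

```typescript
// A minimal sketch of spoken menu navigation using the Web Speech API.
// Assumes a browser context; the menu labels below are hypothetical.
function announce(text: string): void {
  window.speechSynthesis.cancel(); // don't let announcements queue up
  window.speechSynthesis.speak(new SpeechSynthesisUtterance(text));
}

const menuItems = ["Store", "Library", "Settings"]; // hypothetical labels
let focusedIndex = 0;

// Call this whenever focus moves (via controller, keyboard, or voice).
function moveFocus(delta: number): void {
  focusedIndex = (focusedIndex + delta + menuItems.length) % menuItems.length;
  announce(menuItems[focusedIndex]);
}

moveFocus(1); // speaks "Library"
```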

So what solutions are available? In the short term, we can mirror the VR environment in a browser-based experience, though that is a workaround rather than truly equal perceivability. In the long term, ensuring perceivable experiences for all relies on the three additional, interwoven WCAG 2.0 principles: operable, understandable, and robust.
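
As a sketch of that short-term mitigation, a web experience can feature-detect WebXR and fall back to a DOM-rendered mirror where existing assistive technologies still work. The minimal typing and render functions below are stand-ins, not any vendor’s API:

```typescript
// Feature-detect WebXR and fall back to an accessible browser mirror.
// XRSystemLike is a minimal local typing; the render functions are stubs.
interface XRSystemLike {
  isSessionSupported(mode: string): Promise<boolean>;
}

function renderImmersiveScene(): void {
  console.log("Launching full VR experience"); // placeholder
}

function renderAccessibleMirror(): void {
  // In the 2D mirror, screen readers, zoom, and other browser-based
  // assistive tools keep working.
  console.log("Rendering DOM-based mirror"); // placeholder
}

async function launchExperience(): Promise<void> {
  const xr = (navigator as unknown as { xr?: XRSystemLike }).xr;
  if (xr && (await xr.isSessionSupported("immersive-vr"))) {
    renderImmersiveScene();
  } else {
    renderAccessibleMirror();
  }
}

launchExperience();
```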

Illustration of a finger touching a laptop screen.

Operable: User interface components and navigation must be operable, and not require an interaction a user cannot perform

One of the biggest barriers to operability in the VR space takes the form of the hardware used to explore it. For example, the Oculus is currently navigable using two handheld controllers with buttons and a joystick. For people living with physical disabilities, this is one of the biggest challenges to accessing VR.
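
One remedy is an input-abstraction layer: the interface consumes device-neutral actions, so a switch device, voice command, or eye tracker can drive the same UI as the standard two-controller setup. The sketch below is illustrative and not part of any headset SDK:

```typescript
// Sketch of an input-abstraction layer: the app consumes device-neutral
// actions, so any input device can drive the same UI. Names are illustrative.
type Action = "select" | "back" | "moveNext" | "movePrev";

interface InputSource {
  onAction(handler: (action: Action) => void): void;
}

// One possible source; a switch device, voice recognizer, or eye tracker
// would implement the same interface.
class KeyboardSource implements InputSource {
  onAction(handler: (action: Action) => void): void {
    window.addEventListener("keydown", (e) => {
      if (e.key === "Enter") handler("select");
      if (e.key === "Escape") handler("back");
      if (e.key === "ArrowDown") handler("moveNext");
      if (e.key === "ArrowUp") handler("movePrev");
    });
  }
}

// The UI never needs to know which physical device produced the action.
function registerSources(sources: InputSource[], handle: (a: Action) => void): void {
  sources.forEach((s) => s.onAction(handle));
}

registerSources([new KeyboardSource()], (action) => console.log(action));
```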

Another major barrier arises in the technical interface, where ensuring operable digital experiences for all is intrinsically linked to interoperability.

When it comes to digital accessibility, interoperability means building a structural or hierarchical model into a system so that multiple assistive technologies can use it, letting an assistive tool function effectively across different systems, software, and platforms. This guarantees an operable digital experience for users wherever they go, and in today’s digital ecosystem it is only possible thanks to publicly shared W3C standards.
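
The browser’s version of that shared model is the accessibility tree, populated through semantic HTML and the W3C’s WAI-ARIA roles. Here’s a minimal sketch, with hypothetical menu labels:

```typescript
// Sketch of the shared structural model browsers already provide: W3C
// WAI-ARIA roles populate the accessibility tree that every screen reader
// consumes. The labels below are illustrative.
function buildMenu(labels: string[]): HTMLElement {
  const menu = document.createElement("ul");
  menu.setAttribute("role", "menu");
  menu.setAttribute("aria-label", "Main menu");

  for (const label of labels) {
    const item = document.createElement("li");
    item.setAttribute("role", "menuitem");
    item.tabIndex = 0;        // reachable by keyboard and switch devices
    item.textContent = label; // exposed as the item's accessible name
    menu.appendChild(item);
  }
  return menu;
}

document.body.appendChild(buildMenu(["Store", "Library", "Settings"]));
```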

Currently, there are no shared standards for developing third-party tools for the Metaverse. Existing third-party accessibility tools don’t work in that space, and even if they did, the interface doesn’t offer the menus and settings required to install and open them.

In a recent announcement, Facebook promised to continue growing its assistive feature offerings inside the Metaverse and to set out related standards for interface developers. Critics were quick to point out flaws: the proposed auto-captioning feature, for example, would never be as fluent or accurate as captions reviewed by a person and then uploaded alongside the content. This speaks to a larger problem — the Metaverse has already grown too quickly to be accessible for all.
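
For comparison, the human-reviewed workflow critics point to is already standardized on the web: captions are authored as WebVTT files and attached to media as text tracks. A brief sketch, with hypothetical file names:

```typescript
// Sketch of attaching human-reviewed captions as a standard WebVTT text
// track. Both file URLs are hypothetical placeholders.
const video = document.createElement("video");
video.src = "keynote.mp4"; // hypothetical media file
video.controls = true;

const track = document.createElement("track");
track.kind = "captions";
track.srclang = "en";
track.label = "English (reviewed)";
track.src = "keynote.en.vtt"; // captions written and checked by a person
track.default = true;

video.appendChild(track);
document.body.appendChild(video);
```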

So what can be done? As digital product makers, we’ll ensure the Metaverse products we make meet the four WCAG 2.0 principles. As accessibility advocates, we’ll continue to speak up for an interoperable and universal VR space.

Illustration of a computer with a code icon.

Understandable: Information and the operation of the user interface must be understandable to all users

A key part of a product’s success is user testing. In browser environments, established practices ensure the underlying code provides a universally understandable and accessible experience through assistive technology. Those established practices will be lost as we migrate to browser-free VR experiences, leaving some users unable to interact with interfaces and tools.
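
Some of those practices can even be automated in the browser. As one example, the open-source axe-core library audits a rendered page against WCAG rules; this sketch assumes axe-core is installed as a dependency:

```typescript
// Sketch of an automated WCAG audit using the open-source axe-core library
// (assumed installed, e.g. via `npm install axe-core`).
import axe from "axe-core";

async function auditPage(): Promise<void> {
  const results = await axe.run(document);
  for (const violation of results.violations) {
    // Each violation names the failed rule and the offending DOM nodes.
    console.warn(violation.id, violation.help, violation.nodes.length);
  }
}

auditPage();
```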

Without third-party tooling, developers building for the Metaverse must include users with disabilities at every stage of the product development lifecycle. By doing so, you’ll build a brave new world for everyone — and we’ll be there with you.

Illustration of a hand, coin, and music note. A wheelchair/accessibility icon and a listening icon.

Robust: Content must be robust enough that it can be interpreted reliably by a wide variety of user agents, including assistive technologies

Building a Metaverse for all isn’t just about enabling assistive technologies — it’s a chance to celebrate new ways to create content that entertains, educates, and inspires everyone who interacts with it. With multiple sensory ways to engage, there’s no reason VR content can’t be robust or interesting enough to provide a memorable experience for all.

Now we can manipulate three-dimensional art pieces, feel the haptic vibrations of a plane’s yoke as it takes off, hear our favorite musician perform live, and converse with people from around the globe. With so many ways to interact with our users, there’s no reason we can’t build better ways for our audience to ‘see’, ‘hear’, and ‘touch’ content.
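
As a hedged sketch of pairing those channels, the snippet below mirrors an audio cue with a gamepad rumble. Haptic support varies widely across browsers and devices, so the actuator typing here is a local assumption rather than a guaranteed API:

```typescript
// Sketch: mirror an audio cue with a haptic pulse so an event can be both
// heard and felt. RumbleActuator is a local typing for the (non-universal)
// Gamepad vibration API; support varies by browser and device.
interface RumbleActuator {
  playEffect(
    type: "dual-rumble",
    params: { duration: number; strongMagnitude: number; weakMagnitude: number }
  ): Promise<string>;
}

function cueEvent(): void {
  // Audio channel: a short beep via the Web Audio API.
  const ctx = new AudioContext();
  const osc = ctx.createOscillator();
  osc.connect(ctx.destination);
  osc.start();
  osc.stop(ctx.currentTime + 0.2);

  // Haptic channel: a rumble on the first connected gamepad, if any.
  const pad = navigator.getGamepads()[0];
  const actuator = (pad as unknown as { vibrationActuator?: RumbleActuator } | null)
    ?.vibrationActuator;
  actuator?.playEffect("dual-rumble", {
    duration: 200,
    strongMagnitude: 0.8,
    weakMagnitude: 0.4,
  });
}

cueEvent(); // call on a user gesture so the AudioContext is allowed to start
```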

Puzzle pieces with the Apply Digital logo on them.

Building an accessible Metaverse for all

Today’s Metaverse isn’t a very accessible place for those with disabilities, but tomorrow’s could be. It all comes down to the next steps brands, developers, content creators, and users take as we come together to explore and shape our new virtual world. Will we advocate for a perceivable, operable, understandable, and robust experience for every user? Will we push for open standards and interoperable tools to make this happen?

We will. And we’ll work alongside you to create and grow digital products that inspire users of every ability as we build the Metaverse and the wide world beyond.

Reach out to us at hello@applydigital.com to get started.


Written by Liz Goode with the support of Scott Michaels, Christina Ung, and Rabiya Samji.
