
User Safety in AR/VR

Overview

Over the past decade, augmented reality (AR) and virtual reality (VR)—immersive technologies that enable users to experience digitally rendered content in both physical and virtual space—have seen continuous growth and maturation, evolving from niche products for special applications into general-purpose technologies with significant potential to transform the way consumers shop, businesses manufacture goods, and people socialize with each other.

As AR/VR technologies enter the mainstream, one priority is addressing user safety: ensuring AR/VR products and services do not cause injury, harm, or loss to users. Because AR/VR technologies provoke physical and psychological sensations that touch many aspects of a user’s life, user safety encompasses more than physical threats, extending to psychological and financial well-being.

However, the level of risk varies by age. Teenagers are particularly susceptible to safety threats because of their relative immaturity and naivete, while children, though among the main drivers of the metaverse, are often engaging in immersive experiences not designed with them in mind.

Our report series explores user safety challenges in AR/VR technologies among three demographics:

  1. Adults (ages 18 and older);
  2. Teenagers (ages 13 to 17); and,
  3. Children (ages 12 and below).

Recommendations

ITIF concludes that comprehensive federal privacy legislation that addresses actual privacy harms and preempts state laws, creating a single set of protections for all Americans, is crucial. In addition, ITIF suggests policymakers do the following:

  • Consider the impact of laws and regulations for internet platforms on AR/VR. For example, proposals that aim to increase intermediary liability for user-generated content, as most reforms to Section 230 of the Communications Decency Act would do, could have significant consequences for AR/VR platforms.
  • Criminalize harmful behaviors that impact AR/VR safety at the federal level, such as virtual sexual violence, and train law enforcement to respond to these crimes.
  • Avoid age verification legislation due to privacy and free speech concerns.
  • Fund further user safety research.
  • Require device makers and online platforms hosting age-restricted content to establish a “child flag” system, where platforms assume everyone is an adult unless they have been marked as a child, on an opt-in basis that parents can control.
  • Direct educators to add AR/VR literacy to digital literacy criteria for adults and children.

Platforms can address systemic issues through content moderation, design tools, and industry best practices. In the meantime, policymakers should be wary of regulations that could stifle the industry’s ability to experiment freely.
