ITIF Series on User Safety in AR/VR: Protecting Adults, Teens, and Kids
Overview
Over the past decade, augmented reality (AR) and virtual reality (VR)—immersive technologies that enable users to experience digitally rendered content in both physical and virtual space—have seen continuous growth and maturation, evolving from niche products for special applications into general-purpose technologies with significant potential to transform the way consumers shop, businesses manufacture goods, and people socialize with each other.
As AR/VR technologies enter the mainstream, one priority is addressing user safety by ensuring AR/VR products and services do not cause people injury, harm, or loss. Because AR/VR technologies can affect multiple aspects of a user’s life by provoking physical and psychological sensations, user safety covers a range of issues beyond physical threats, including psychological and financial well-being.
However, the level of risk varies by age. Teenagers, for example, are particularly susceptible to safety threats because their judgment and impulse control are still developing. Meanwhile, children are among the main drivers of metaverse adoption, mostly through gaming, yet they often engage in immersive experiences that were not designed with them in mind.
To explore AR/VR safety issues in detail, this series includes three reports:
- “User Safety in AR/VR: Protecting Adults” focuses on users who are 18 and older.
- “User Safety in AR/VR: Protecting Teens” focuses on users who are 13 to 17.
- “User Safety in AR/VR: Protecting Kids” focuses on users who are 12 and under.
Overall Recommendations
ITIF’s investigation concludes that comprehensive federal privacy legislation is crucial to address actual privacy harms and preempt state laws with a single set of protections for all Americans. In addition, ITIF offers the following recommendations to ensure user safety in AR/VR:
- Consider the impact of laws and regulations for internet platforms on AR/VR, such as proposals that would increase intermediary liability for user-generated content, as most reforms to Section 230 of the Communications Decency Act would do.
- Criminalize harmful behaviors that impact AR/VR safety at the federal level, such as virtual sexual violence, and train law enforcement to respond to these crimes.
- Avoid age verification legislation due to privacy and free speech concerns.
- Fund further user safety research.
- Require device makers and online platforms hosting age-restricted content to establish a “child flag” system, where platforms assume everyone is an adult unless they have been marked as a child, on an opt-in basis that parents can control.
- Direct educators to add AR/VR literacy to digital literacy criteria for adults and children.
Platforms can address systemic issues through content moderation, design tools, and industry best practices. In the meantime, policymakers should be wary of regulations that could stifle the industry’s ability to experiment freely.