Child safety concerns raised over the Metaverse

The Metaverse - an immersive virtual-reality platform developed by Meta, formerly Facebook - has proved to be unsafe for children, who may be targeted by adult players and pressured into graphic virtual sex acts.

What is Virtual Reality?

Virtual reality (VR) apps immerse users in a simulated world through a headset. Inside, players can walk around, enter different rooms, and interact with other avatars.

Mark Zuckerberg described the virtual world as ‘an interconnected virtual space’, which he deems to be the future of the internet.

Facebook has invested billions of dollars in its rebranding and in developing the metaverse industry, and the app VRChat is now downloadable from the app store on Meta’s Quest virtual reality headset.

Although it can be a fun and dreamlike experience, entering the Metaverse poses serious safety risks for both children and adults.

When people immerse themselves deeply in the virtual world, the line between actual and simulated reality is easily blurred. According to critics, becoming entranced by a false reality may cause people to do things they perhaps wouldn’t in the real world.

Dr David Reid, Professor of AI and Spatial Computing at Liverpool Hope University, is convinced that virtual reality will become mainstream and be as transformative to our lives as the emergence of the internet. And while he acknowledges the benefits of the virtual world, he claims that it also has ‘terrifying dangers’.

‘Dangerous by design’

The NSPCC have warned that Metaverse apps are ‘dangerous by design because of oversight and neglect’. The only requirement for downloading VRChat is a Facebook account, which has a minimum age of 13. There are no further age verification checks, so adults can interact with children on the app however they’d like.

While some of the rooms in VRChat are innocent and used only for friendly meet-ups, others have been created to resemble strip clubs. Children are not restricted from entering any of these rooms, so they can be exposed to age-inappropriate activity very easily, even by accident.

In an undercover investigation, BBC researcher Jess Sherwood revealed some extremely unsettling findings. For the purposes of the research, she posed as a 13-year-old child and entered a number of different rooms in VRChat.

Ms Sherwood told the BBC that it was ‘more like an adult’s playground than a child’s’. The rooms had been made to appear explicitly sexual - vibrant colours, inappropriate music, and sex toys and condoms openly displayed. Avatars were often naked and engaging in sexual acts, usually in large groups.

Some of the adult male players were aware that she was a minor and asked why she wasn’t in school - but this didn’t stop them from asking her to participate in sexual activities.

Because of the highly immersive nature of VR, performing virtual sex acts requires some physical movement. So when children are pressured into this activity, they have to act out the sexual movements with their own bodies - making it all the more sinister.

Ms Sherwood also witnessed other forms of harassment first-hand, including grooming, rape threats, and racist abuse.

In another account, Catherine Allen, head of the consultancy Limina Immersive, encountered a nine-year-old girl on VRChat. Both she and the girl received rape threats from a group of men, prompting Ms Allen to intervene and stop them. She stated that VR had ‘no moderation’ to protect children.

Protecting the safety of children

The BBC’s findings confirm that the metaverse is not a safe space for children. Responding to the investigation, Children’s Commissioner Dame Rachel de Souza criticised Meta and said she was horrified by the lack of safety regulations on the app, which has inevitably led to the grooming and sexual harassment of children.

The Children’s Commissioner claims that age verification should be a requirement not just on Metaverse apps, but also on other social media platforms. She is pushing technology companies to put more safety measures in place to ensure that children are only exposed to age-appropriate content online. 

Meta commented: "We provide tools that allow players to report and block users. We will continue to make improvements as we learn more about how people interact in these spaces."

Child safety charities have now urged parents to monitor their children’s online activity and, in particular, to try out VR headsets themselves before their children use them. These simple precautionary checks could reduce the risk of children encountering age-inappropriate online activity and grooming.

If you think that you or someone you know may have been affected by the issues raised in this article, please visit the NSPCC website to learn more about grooming and online abuse. You may be able to help a child in danger.


If you feel strongly about this story, or about any other issue that needs highlighting, start a campaign, petition or legal case at Find Others today.