
The Metaverse Needs to Figure Out How to Deal With Sexual Assault

Moderating social media is hard. Moderating the metaverse will be harder.

Daniel Van Boom, Senior Writer
A woman tries a VR headset on for the first time at a tech convention. (Getty/Xinhua News Agency)

Watching a friend or family member use a VR headset is a strange experience. They gesticulate wildly, flailing at empty space. But they're reacting to what feels like a very real set of stimuli: bad guys that need to be shot down, or haunted hallways with ghosts around every corner. It looks preposterous to you. It feels real to them.

Which is why sexual assault in the metaverse is a problem we can't afford to wait on.

On Tuesday, a researcher from SumOfUs, a nonprofit, spent time in Horizon Worlds, Meta's flagship VR world. It took her less than an hour to be "raped," according to a report from the organization.

What does it mean to be sexually assaulted in the metaverse? In this instance, the researcher was cornered in a room by two male avatars. One of those avatars got face-to-face with the researcher and made sexually abusive comments, while the other stood back and seemingly drank from a bottle of virtual vodka.

The report was mocked on social media, with many contending that "rape" is too strong a word to apply to what happened in that clip. But these semantics obscure the point: virtual interactions can cause real-life trauma. Despite the metaverse's relative youth, there are already countless cases of boundaries being crossed in virtual space.

Back in February, VR researcher Nina Jane Patel said she was "gang raped" by four male avatars in Horizon Worlds. They crowded around her, capturing screencaps as they groped her character while making lewd comments, including "don't pretend you didn't love it." Last June, a woman was playing the sports game Echo VR with strangers when one said he'd recorded her speaking so he could "jerk off" to it later.

Abuse in the metaverse is likely to be as endemic as it is on social media. But these incidents illustrate how much more traumatizing it could be, given the immersive experience these worlds offer. The idea of living in a virtual world, once a selling point for VR, is flipped on its head in the darkest way possible.

"It was surreal," Patel said of her experience in a blog post. "Virtual reality has essentially been designed so the mind and body can't differentiate virtual/digital experiences from real. In some capacity, my physiological and psychological response was as though it happened in reality."

VR worlds need to offer better protection and tools for their users. Social media moderators already have a hard but crucial job, but in the metaverse, they will likely need to act more like a police force patrolling the streets of a big city. Instead of taking down content after the fact, they'll need to find abuse as it's happening. 

Promo art for Meta's Horizon Worlds VR app. (Meta)

But that's asking a lot of moderators, and it's unclear whether any company is prepared for that kind of proactive response.

Meta didn't respond to requests for comment for this story.

Metaverses are sprawling, open worlds in which hundreds or thousands of people socialize. This can take place in the form of a game, like Fortnite or World of Warcraft, or a social simulator like Second Life. It's an old concept. The reason you've heard the phrase so often over the past year is that metaverses are evolving into their next phase.

What that next phase looks like depends on who you talk to. When blockchain enthusiasts talk about the metaverse, they're referring to big, open-world games infused with NFTs and cryptocurrencies. CNET Editor Scott Stein says it isn't necessarily a specific thing, but more a new way for us to interact.

Then there's Meta's vision. When the company formerly known as Facebook says "metaverse," it means a huge virtual reality world that simulates the real one. Blockchain metaverses will live in PC browsers. Meta's metaverse lives in a VR headset. (Meta CEO Mark Zuckerberg has spoken about potentially integrating NFTs and other tokens into his company's metaverse, but how that will look is as yet unknown.)

The advantage of VR metaverses is that they're more immersive. Unfortunately, that makes the abuse more overwhelming, too. This is especially true when users wear haptic vests, which allow them to actually feel an unwanted touch.

"It was a nightmare," Patel said.

Moderating the metaverse

The metaverse will undoubtedly be harder to moderate than existing social media. On social media, you can block people who give you grief, and moderators can take down malicious content. Even with these advantages, platforms like Facebook and Twitter are rife with harmful content. 

In addition to extensive moderation, companies will need systems in place to restrict abuse in the first place. This will be challenging enough in browser-based metaverses where, as in MMO games today, damage can be done via voice chat. It'll be harder still in VR worlds where you can be virtually touched and have your space invaded. 

Meta's CTO, Andrew Bosworth, has said that moderating users "at any meaningful scale is practically impossible," according to a memo seen by the Financial Times. But he also called widespread harassment an "existential threat" to the success of the metaverse. 

Meta has tinkered with safety tools in recent months. As it tweaks Horizon Worlds, which is still in beta, it has added precautions like the boundary bubble feature, which, when toggled on, prevents other avatars from coming within four feet of yours.
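
To make the mechanic concrete, here's a minimal sketch of how a feature like that could work: a server-side distance check that pushes an intruding avatar back to the edge of a protected user's bubble. Every name and number below is an illustrative assumption, not Meta's actual code or API.

```typescript
// Hypothetical sketch of a personal-boundary ("bubble") check.
// Not Meta's implementation; all identifiers are invented for illustration.

interface Vec3 { x: number; y: number; z: number; }

const BOUNDARY_RADIUS_METERS = 1.2; // roughly four feet

function distance(a: Vec3, b: Vec3): number {
  const dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
  return Math.sqrt(dx * dx + dy * dy + dz * dz);
}

// Before applying another avatar's movement, clamp its position so it
// never enters the protected user's bubble when the feature is on.
function enforceBoundary(
  protectedPos: Vec3,
  incomingPos: Vec3,
  boundaryEnabled: boolean
): Vec3 {
  if (!boundaryEnabled) return incomingPos;
  const d = distance(protectedPos, incomingPos);
  if (d >= BOUNDARY_RADIUS_METERS) return incomingPos;
  // Push the intruding avatar back out to the edge of the bubble,
  // along the line connecting the two avatars.
  const scale = BOUNDARY_RADIUS_METERS / Math.max(d, 1e-6);
  return {
    x: protectedPos.x + (incomingPos.x - protectedPos.x) * scale,
    y: protectedPos.y + (incomingPos.y - protectedPos.y) * scale,
    z: protectedPos.z + (incomingPos.z - protectedPos.z) * scale,
  };
}
```

The design point is that the check runs before movement is applied, rather than reporting a violation after the fact, which is exactly the shift from reactive to proactive moderation the metaverse seems to demand.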

"We have far more tools than we've touched," said Aaron Stanton, co-creator of Oculus archery game QuiVR. After being alerted that a woman was groped by another avatar in the game, Stanton and his co-designer implemented a gesture that allowed users to, if threatened, push abusers away.

Now director of the VR Health Institute, Stanton believes developers in these worlds should focus more on features that empower users. His reasoning isn't that victims are responsible for their own protection. Rather, he says, protective tools are often inadequate and can make the abused feel powerless. But he believes VR worlds open the door to better moderation than social media platforms allow.

He uses the example of a gesture that turns you into a giant, able to swat away harassers. To the abuser, your avatar would simply disappear. But in your own view, you'd have the power to get out of a bad situation.

"The problem with purely protective tools is that they leave the threat still inside the headset," he said. Protective tools "don't really remove the threat, they just trap it online. I think we need solutions that actually deal with the issue without forcing players to abandon the virtual space."

Many uncertainties swirl around the metaverse. Arguably the most important is ensuring it's built in a way that doesn't allow abuse to thrive like it does on the internet right now.

"Over the past twenty plus years we have integrated the internet into our daily lives," Patel wrote. "The non-negotiable, this time round, is ignoring the dark side."