The dark side of the metaverse, part I

By Mark Jamison and Matthew Glavish

In the metaverse and virtual reality (VR), seemingly anything is
possible. You can work, build mansions, visit museums with friends scattered
across the globe, or engage in immersive video games. But this space is rapidly
evolving and has outpaced controls and social norms that regulate our demeanor
in the physical world. As such, in some areas of the metaverse, there is a
growing presence of unacceptable behaviors with harassment, strip clubs,
unchecked gambling, and data exploitation becoming well established.

Antisocial behavior

Antisocial behavior includes assault, bullying, harassment, and hate speech. In its study of the metaverse, the Center for Countering Digital Hate found that “VRChat — the most reviewed social app in Facebook’s VR Metaverse — is rife with abuse, harassment, racism and pornographic content.” In fact, researchers identified one conduct violation every seven minutes. It is also important to note that VRChat has been aware of these problems since 2018, when the platform announced it would address widespread racism and sexism, but these behaviors have persisted. Moreover, this conduct is not exclusive to VRChat as a study out of Denmark found such issues in other social VR applications (apps).


Pornographic content

The problem of pornographic content in the metaverse can be broken down into two subcomponents: sexually explicit material and sexually explicit activity. Regarding the former, the BBC reported that anyone older than 13 could enter a strip club in the metaverse through two main apps: VRChat and Roblox. Within these digital clubs, there are several sexually explicit items including dancing poles and contraception. But the more sinister element is sexually explicit activity. Not only is it possible for avatars to remove their clothes and simulate sex, but the BBC report uncovered what appeared to be the avatars of adult men approaching, groping, and attempting to groom an avatar that belonged to a researcher posing as a 13-year-old girl.

Gambling

Gambling is lucrative in the metaverse. In fact, Markets Insider reported that the ICE Poker Virtual Casino “has made over $7.5 million in the last three months, and accounts for a third of the daily traffic to Decentraland” (a popular metaverse location). Yet, as with the digital strip clubs, it is easy for metaverse-dwelling minors to gamble, even with age restrictions in place. Moreover, metaverse casinos occupy a legal gray area: Users gamble tokens, as opposed to actual money, and courts have routinely dismissed lawsuits against this type of gambling, deeming that such tokens are not “something of value.”

Information security

Lastly, there is the issue of data collection. The types of data collected in the current Web 2.0 (the internet of social networking and user-generated content) include contact, financial, and engagement data. But in the metaverse, apps can collect and utilize far more types of data. The Wall Street Journal quotes and elaborates on Kavya Pearlman, founder of the XR Safety Initiative:

‘At any given time, the way you move, the way your gait is, the way you’re gazing, your pupil dilation, is giving away information to developers.’ . . . All these tidbits could give companies greater ability to deduce users’ traits. . . . For instance, [Pearlman] said, an insurance company might obtain information that suggests a user has a health problem before the person noticed any physical changes or saw a doctor.

This issue will expand as the hardware we use to enjoy the metaverse becomes more interconnected with our physical bodies.

Why do these problems exist?

There are several reasons why these problems can manifest. First, being online lowers people’s inhibitions. For example, the New York Times quoted a different undercover researcher who asked a user to stop groping her avatar: In response, the user “shrugged as if to say: ‘I don’t know what to tell you. It’s the metaverse — I’ll do what I want.’”

Second, studies suggest large-scale content moderation may be impossible. While many apps have mechanisms to report misbehavior, context is hard to understand, and content moderators can be overwhelmed. Facebook alone deals with three million posts requiring moderation each day. But the WeProtect Global Alliance believes tech companies can do better, arguing that they don’t often utilize “advanced classifiers to detect video and livestream content . . . despite the fact this category is becoming more prevalent.”

Third, age-verification measures are very easy to circumvent. While the United States imposes strict age-verification laws on online purchases of tobacco and alcohol, most other websites are guarded only by an “age gate.” Most of these gates require nothing more than entering a birthday or checking a box, both of which are weak safeguards.

The metaverse could revolutionize the way humans interact, socialize, and enjoy leisure, but like any innovation that opens new opportunities, it also opens the door to the darker side of human nature. We will address what to do about these problems in a subsequent piece.

The post The dark side of the metaverse, part I appeared first on American Enterprise Institute – AEI.