These norms reveal how moderation is complicated by trying to map the social conventions of the physical world onto virtual reality. If you covered yourself in purple body paint and showed up at an in-person medical conference, you’d probably be asked to leave. At a metaverse medical conference, the other attendees wouldn’t even bat an eye. This relaxing of certain social norms leads people to test the bounds of acceptable behavior, and moderators in some cases have to decide what crosses the line. For instance, new users will often pat strangers on the head, an interaction that would be strange and a little invasive in real life. Educators in VR tries to discourage people from doing this, though it seems to fall into a gray area of rude but not totally offensive. The same goes for pacing excessively around a room, walking through other users, or fidgeting too much with your controllers, which can cause your VR hands to distractingly bounce around. “People don’t get it at first because a lot of people come into VR from a gaming platform, so they don’t fully grasp the fact that behind every avatar is a person,” said Myer. During one of the events I moderated, VanFossen asked me to message an attendee to step back because he was a little too close to the speaker and invading her personal space. I needed the nudge: It’s hard to tell how close is too close in the metaverse. It’s not like you can feel the other person breathe.
To account for these gray areas, Educators in VR calibrates the strictness of the moderation based on the type of event. Parties are a bit more laissez-faire, while group meditation sessions have a zero tolerance policy where you might be removed simply for moving around the room too much. “I was very much against zero tolerance until I started witnessing what that meant,” said VanFossen of meditation events. “People are there for a reason, whether this is their daily thing, they have a crap stressful job, they need a break, or they have mental health issues.” Moderation levels also differ by platform—AltspaceVR tends to be stricter because it’s targeted at professionals, while VRChat is known for anarchy.
It remains to be seen how moderation will work at scale as the metaverse continues to expand. At the moment, developers don’t seem to have a good answer. AltspaceVR has been trying to put moderation tools into the hands of its users and also keeps staff on hand to help with particularly volatile situations. Meta has similarly relied on users themselves to block and report troublemakers in Horizon Worlds. Yet if tech companies succeed in their grand ambitions to get billions of people to inhabit the metaverse, maintaining it will take an immense amount of time and energy from countless people making tough, nuanced decisions minute by minute. As VanFossen said, “It’s the most disgusting, frustrating, stress-inducing, headache-inducing, mental health–depleting job on the planet.”
Social interactions and spaces require social norms. People appreciate knowing how to act and how they will be treated. Without norms, chaos, anarchy, or worse ensues.
Enforcing social norms matters. In many situations, norms are communicated and enforced informally: people see what is happening and respond in kind, or they have a general sense of how to behave. In other situations, norms need to be addressed explicitly, perhaps through formal guidelines or enforcers who step in when needed.
What sounds unique about the situation discussed above is that (1) the social space is relatively new, (2) it is unfamiliar to many people, and (3) it is still in flux because of the first two factors plus ongoing changes. The moderators are stepping in and creating the norms as they go. If the metaverse becomes more popular, the norms will solidify as the space and its proper behavior become better known.