Making the metaverse safe | VentureBeat

One of the questions about the metaverse that still needs answering is how to make it safe for kids. Every new generation grows up alongside technology that is itself growing and changing, and mass adoption of today’s metaverse is likely to be driven by those younger users.

This means the issue of safety needs solving, and it’s the topic of our Tackling Trust & Safety for Kids in the Metaverse panel. At GamesBeat Summit Next 2022, ActiveFence’s Tomer Poran, Roblox’s Tami Bhaumik and EA’s Chris Norris chatted with ConnectSafely’s Larry Magid about child safety in the metaverse.

It’s something of a daunting task, one reminiscent of the pre-ESRB days. In the very early days of gaming, anyone could make a game, and it could contain any kind of content.

In the mid-’90s the ESRB launched a ratings system that helped parents know what was acceptable for their children. It helped developers, too: being able to aim for a rating gave them a target for content creation. Those ratings are still around today, and safety technology has grown more robust alongside them. People just don’t seem to know about it, or how to use it.

“One of the things I absolutely know, because I talk to parents and I talk to our community, day in and day out, is that parents and teachers don’t even know that account restrictions exist,” said Roblox’s Tami Bhaumik. “If there’s not a basic education level, then there’s always going to be a problem.”


These days the lines are a little less clear. Take Roblox, for example: it’s arguably the ultimate example of metaverse interaction right now. Roblox is both a game and a platform, and it has existed since 2006.

Roblox introduced a user-generated content (UGC) feature in 2019. That UGC is curated reasonably well, but issues still slip through. The question, then, is where the blame falls when they do.

Is it on the Roblox platform to deal with any and all problems as they arise? Is it on the content creators? How do brand deals fit into the equation? Television ads have guidelines advertisers must adhere to. We see some amount of that on the internet — sponsored streamers have to make it clear they’re advertising.

Does that change when it’s happening in virtual spaces? Seeing a 30-second advertisement on a screen or a stream and existing inside one giant interactive ad are two different things.

The answer isn’t very clear-cut in these early days. Keeping kids safe and secure in the metaverse will probably fall on everyone. Individual platforms need to actively police the content they host, as always, but content creators need to do their part as well.

“I think that there has to be responsibility from a platform’s standpoint,” said Bhaumik, about Roblox. “We provide the safety tools. We’re continually innovating, and reassessing and making things better. We want to empower our creators and our developers to be able to … have the ability and the tools to moderate their experiences, because one size does not fit all.”

Ideally, content creators wouldn’t cause problems, but bad actors are everywhere. The community needs to report these people rather than ignore them.
