The metaverse: A safe space for all?

The collective internet is grappling with misinformation, toxicity, and censorship. In countries all over the world, this is exacerbated by social media networks being restricted or even controlled by governments.

Not only does this erode the foundations of free speech and collaboration that the internet was built on, it also excludes entire demographics from participating in global dialogue and understanding.

While the early iterations of the metaverse have started to face similar challenges, the metaverse also holds the promise of a more decentralized web architecture that could help mitigate some of these issues. On the legacy internet, content can be easily tracked, controlled and censored, but Web3 could enable more secure forms of authentication and thus make truly protected digital speech possible.
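For a concrete (if simplified) picture of what such authentication could look like, the sketch below shows wallet-based sign-in: a user proves control of an address by signing a challenge, and anyone can verify that signature without a central identity provider. It assumes the ethers v6 JavaScript library, and the challenge format is purely illustrative.

```typescript
// Minimal sketch of wallet-based authentication, assuming ethers v6.
// The user signs a challenge with their private key; a verifier recovers the
// signing address from the signature, with no central identity provider involved.
import { Wallet, verifyMessage } from "ethers";

async function demo(): Promise<void> {
  // Hypothetical user wallet (in practice this lives in the user's browser wallet).
  const wallet = Wallet.createRandom();

  // A one-time challenge issued by the service the user wants to access.
  const challenge = `Login request at ${new Date().toISOString()}`;

  // The user signs the challenge locally; the private key never leaves their device.
  const signature = await wallet.signMessage(challenge);

  // The verifier recovers the signing address from the message and signature alone.
  const recovered = verifyMessage(challenge, signature);
  console.log("authenticated:", recovered === wallet.address);
}

demo().catch(console.error);
```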

Censorship in the centralized web

Much of the centralized web is controlled by a few actors who can manipulate what users see. This has led to rampant censorship and control of information, as well as harassment and abuse of vulnerable groups. In some countries, governments restrict or even control social media networks, which means people cannot make their voices heard or express their opinions freely.

For instance, as China faces heightened criticism of its zero-COVID policy, the country is censoring social media platforms, search engines, and even individual posts. In fact, a recent proposal by the Chinese government would require all online comments to be reviewed to ensure alignment with the party narrative.

Further, China’s handling of the Uyghur minority has drawn the ire of the international community. The Chinese government has been accused of mass surveillance, forced labor and even genocide against the Uyghur people. In response, China has censored social media platforms, significantly restricted internet access in the Xinjiang region, and used artificial intelligence to monitor and control the Uyghur population.

China is not alone. As Russia faces a populace questioning its foreign policy decisions, the government is pressuring platforms to censor content and restrict user access. Russia has a long history of internet censorship, and it’s now reaching new heights.

These are just a few examples of how social media networks are used to control and censor information. The centralized nature of the web makes it easy for those in power to manipulate what users see and silence dissenting voices.

The metaverse has no single definition, but the broad vision is a community-based, distributed, 3D internet where users create avatars and interact with one another in shared digital spaces.

Censorship in Web2 is possible because of its centralized nature: traffic and content flow through a handful of chokepoints, and those in power can easily shut off the taps. A decentralized metaverse could mitigate this problem because it is architecturally more censorship-resistant. With the underlying data stored across a distributed network of nodes, it is much more difficult for authorities to censor or control information.
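The sketch below illustrates the underlying idea of content addressing, which decentralized storage networks such as IPFS rely on: data is identified by the hash of its bytes, so any node holding a copy can serve it and any reader can verify it. This is a toy model of the concept, not the actual IPFS protocol.

```typescript
// Toy illustration of content addressing: content is identified by the hash
// of its bytes, so any node can serve a copy and any reader can verify that
// the copy was not altered. (A simplified sketch, not the IPFS protocol.)
import { createHash } from "node:crypto";

// Derive a content identifier from the data itself.
function contentId(data: string): string {
  return createHash("sha256").update(data).digest("hex");
}

// A toy "network" of independent nodes, each a simple key-value store.
const nodes: Array<Map<string, string>> = [new Map(), new Map(), new Map()];

function publish(data: string): string {
  const id = contentId(data);
  // Replicate across every node; no single party holds the only copy.
  for (const node of nodes) node.set(id, data);
  return id;
}

function retrieve(id: string): string | undefined {
  for (const node of nodes) {
    const data = node.get(id);
    // Verify integrity: the data must still hash to the requested id.
    if (data !== undefined && contentId(data) === id) return data;
  }
  return undefined; // only unavailable if every node withholds or alters it
}

const id = publish("A post the author wants to keep online");
nodes[0].delete(id);      // one node censors the content...
console.log(retrieve(id)); // ...but the remaining replicas still serve it
```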

Further, the metaverse can be used to create “safe spaces” for vulnerable groups. For example, NFTs can be used as gateways to protected areas where only certain people have access. This would allow marginalized groups to interact with each other in a safe and secure environment.
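A minimal sketch of such NFT "token gating" is shown below: access is granted only if the visitor's address holds at least one token from the gating collection. It assumes the ethers v6 library, and the RPC endpoint and contract address are placeholders rather than real deployments.

```typescript
// Minimal sketch of NFT token gating, assuming ethers v6: entry to a protected
// space is granted only if the visitor's address holds at least one token from
// a given ERC-721 collection. Addresses and URLs below are placeholders.
import { Contract, JsonRpcProvider } from "ethers";

// Standard ERC-721 read-only function, expressed as a human-readable ABI.
const ERC721_ABI = ["function balanceOf(address owner) view returns (uint256)"];

async function canEnter(
  rpcUrl: string,     // a JSON-RPC endpoint (placeholder)
  collection: string, // the gating NFT contract address (placeholder)
  visitor: string     // the visitor's wallet address
): Promise<boolean> {
  const provider = new JsonRpcProvider(rpcUrl);
  const nft = new Contract(collection, ERC721_ABI, provider);
  const balance: bigint = await nft.balanceOf(visitor);
  return balance > 0n; // holding at least one token unlocks the space
}

// Usage sketch (all values are illustrative placeholders):
// canEnter("https://rpc.example.org", "0xCollection...", "0xVisitor...")
//   .then((ok) => console.log(ok ? "welcome in" : "access denied"));
```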

To give another example, Meta’s Horizon Worlds has introduced a “Personal Boundary” feature that enforces a roughly four-foot zone of personal space around users’ avatars, and Roblox has strong safety features in place, including machine detection of unsafe content and rigorous chat filters. However, Roblox has not been immune to safety issues: there have been reports of children being groomed by extremists on the platform, which highlights the importance of designing comprehensive safety features into virtual worlds.
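To make the personal-space idea concrete, here is an illustrative sketch of how such a boundary might be enforced: a movement request is rejected if it would bring one avatar within a minimum distance of another. This is a toy model with assumed units and thresholds, not how Horizon Worlds or Roblox actually implement their features.

```typescript
// Toy model of a "personal boundary": a requested move is rejected if it
// would bring an avatar within a minimum distance of any other avatar.
// Units and thresholds are assumptions for illustration only.
type Position = { x: number; y: number; z: number };

const PERSONAL_BOUNDARY_METERS = 1.2; // roughly a four-foot zone

function distance(a: Position, b: Position): number {
  return Math.hypot(a.x - b.x, a.y - b.y, a.z - b.z);
}

// Returns the position the avatar is allowed to occupy after the move.
function tryMove(current: Position, target: Position, others: Position[]): Position {
  const tooClose = others.some(
    (o) => distance(target, o) < PERSONAL_BOUNDARY_METERS
  );
  return tooClose ? current : target; // block moves that violate the boundary
}

// Usage: an avatar tries to step right next to another avatar and is held back.
const me = { x: 0, y: 0, z: 0 };
const other = { x: 2, y: 0, z: 0 };
console.log(tryMove(me, { x: 1.5, y: 0, z: 0 }, [other])); // stays at the origin
```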

The venture builder and consultancy newkinco initiated anitya space, a white-label solution that enables brands and influencers to create their own metaverse experiences. Newkinco works with organizations like the Goethe-Institut, a German cultural association active in 158 locations worldwide, and Tales of Us, a multimedia production company, to create cultural experiences in the metaverse. Tales of Us works with kids, teenagers and their guardians to reach a global audience and to share narratives at the intersection of culture, community and nature. Through these partnerships, newkinco has been exploring how to gamify immersive learning experiences while safeguarding diverse digital safe spaces.

This is just the beginning. As the metaverse becomes more popular, it will become a space for cultural sharing and exchange, leading to a more inclusive and diverse web.

The metaverse is not a panacea for the ills of the centralized web. In fact, we’ve already seen some of the worst aspects of the web play out in virtual worlds.

One major issue is “griefing,” in which users harass or abuse others in digital spaces. This can take the form of flaming, trolling, doxxing and even virtual assault. Griefing has long been a problem in virtual worlds and multiplayer games like Second Life and Roblox, and it has also been an issue in more conventional online spaces.

Another issue is the “Wild West” feeling of the metaverse. There are few established rules or regulations governing what can and cannot be done in virtual worlds, and this lack of governance can create a sense of lawlessness that puts users at risk.

Finally, the metaverse is still very much in its infancy. The technical infrastructure is still being built, and there are very few “killer apps” that would make the metaverse essential for users.

Looking ahead

The metaverse is a promising solution to the problems of the centralized web. However, it is still in its early stages and faces many challenges.

As the metaverse grows and matures, we need to be mindful of the issues that have arisen in virtual worlds. We must learn from our mistakes and build an inclusive, safe and secure metaverse for all. This means designing comprehensive safety features into virtual worlds, moderating griefing and harassment, and establishing clear user rules and guidelines.

Without these safeguards, the metaverse risks becoming an echo chamber for the worst aspects of the web. With them, it has the potential to become a safe space for all. In practice, implementing these steps will require ongoing dialogue between metaverse developers, platform holders and civil society organizations to create a more inclusive and diverse digital future for everyone.

While these conversations are ongoing, we can all help to create a better metaverse by using safety features on existing platforms, reporting harassment and abuse, and being respectful of other users.

Valerias Bangert is a strategy and innovation consultant, founder of three media outlets and published author.

Originally appeared on: TheSpuzz
