

Safeguarding the metaverse: why regulators and tech firms must step up

May 6, 2022

Innovation

Abby McGovern

Account Executive

The metaverse was one of the key trends for 2022 at our annual social media briefing at the end of last year. Since then, the emerging technology has been discussed constantly, with businesses and users alike keen to learn more. However, it has also been dubbed ‘the online Wild West’ due to its current lack of regulation and the potential for online harms.

Last month, however, we saw some positive moves and a potential framework for how future regulations may look. The Institution of Engineering and Technology (IET) published a report – ‘Safeguarding the metaverse’ – that examined the potential harms associated with virtual reality and the metaverse.

While the report labelled the metaverse as a milestone in the history of digital media, it conceded that without sufficient regulation, the new digital environment posed a considerable threat to users’ data and privacy.

What does ‘safeguarding’ mean in the metaverse?

Future-proofing existing social media regulations was a key recommendation in the report. Using the UK’s Online Safety Bill as an example, the report noted that most social media legislation is overly focused on the platforms in use today, with little thought given to the technologies that will dominate in the future.

Thankfully, the UK government is not oblivious to the risks posed by the metaverse. Lorna Woods and William Perrin, two academics responsible for the framework of the Online Safety Bill, recently commented that “technology companies can’t use the metaverse to escape regulation”.

How can technology companies protect users in the metaverse?

The report was clear that safeguarding in the metaverse is not solely the responsibility of legislators.

Brands running metaverse experiences will need to step up and protect users from harm. Addressing toxic online culture is a starting point, with the report recommending that brands take a proactive approach and that technology companies avoid relying on “block and mute” features that place the responsibility on users.

This proactive approach is particularly important within virtual realities. Safeguarding in these environments is essential given the blending of online worlds with real-life forms of harassment, dissociation and desensitisation. Addressing the culture of these spaces, rather than placing the onus on users or victims of harassment, is a must if technology companies and any brand running metaverse experiences are to encourage a positive, healthy environment.

Unlocking the full value of the metaverse hinges on the digital environment being a safe space where users’ privacy and data are protected. As the tech capabilities that underpin the metaverse evolve and more brands begin to explore the space, we expect the conversation about regulation to ramp up.

Regulation from governments will be key to upholding any safety standards, but on a day-to-day basis the onus will fall on technology companies to ensure that the metaverse is not only a creative space but, most importantly, a safe one for users.

Find out more about ways Battenhall is working with brands in the metaverse, or email metaverse@battenhall.com.