Picture: Microsoft
Meta has taken a lot of criticism for the moderation of its own Metaverse platform. Microsoft wants to avoid similar controversies and is now taking action.
Microsoft VR and AR boss Alex Kipman today announced significant changes to the company's Metaverse platform AltspaceVR in a bid to increase user safety.
The following changes apply immediately:
- Social hubs will be shut down entirely. Social hubs are the public and largely unmoderated playgrounds of the Metaverse platform, where users can meet and get to know each other without a prior appointment.
- The safety bubble, which prevents other avatars from invading a user's personal space, is now enabled by default.
- Anyone who joins an event will now be automatically muted.
In the coming weeks, Microsoft also intends to implement the following:
- Age ratings for events will be improved and moderation strengthened overall.
- In the future, users will need to log in with a Microsoft account.
- The company plans to integrate age-related control mechanisms that let parents grant or restrict access to AltspaceVR for children and teenagers over 13.
Microsoft wants to establish metaverse guidelines
On the AltspaceVR blog, Kipman explains the radical move: “As platforms like AltspaceVR evolve, it’s important that we look at what’s in place and see if it meets the needs of customers today and in the future,” writes Kipman. “This includes helping people connect better with like-minded people, while ensuring that the spaces they access are free from inappropriate behavior and harassment.”
Everyone should feel safe on platforms like AltspaceVR, which is why Microsoft has a responsibility to “put safeguards in place,” Kipman says. He describes AltspaceVR as a “building block for the future of the metaverse”.
Microsoft responds proactively
The security measures are a response to criticism and scandals surrounding Meta’s Metaverse platform, Horizon Worlds, which launched in the United States and Canada in late 2021. Users have reported physical harassment and toxic behavior, some of which went unpunished.
Meta responded to the former by introducing a personal safety zone. However, it’s unclear how Meta intends to prevent inappropriate behavior, as 3D spaces pose completely new moderation challenges. Bans and monitoring can only partially solve the problem. According to user reports, Meta currently seems to take a hands-off approach, hoping the issue will resolve itself through users muting or blocking one another.
Microsoft aims to at least mitigate these issues by disabling public Metaverse rooms and enabling the safety bubble by default.
The introduction of control mechanisms to protect children and young people is also a response to negative press: a few weeks ago, the Guardian reported that the British data protection authority ICO wanted to subpoena Meta. The reason: the Meta Quest 2 VR headset is said to offer insufficient protection and expose children to dangerous content, especially in the Metaverse.