Brands that host and run virtual worlds or MMOGs (Massively Multiplayer Online Games) for children have a pretty big task on their hands.
Of course, the first priority of the moderators of these games is to ensure that children playing in the worlds they have created are kept safe from harm.
But they also have to make sure that children are engaging with the virtual world and having a positive experience.
There are two types of moderators. First, the silent moderator (the more traditional role), who stays in the background of a virtual world, unseen, monitoring and deleting offensive material, reporting abusive behaviour and defusing any difficult situations that might arise between players.
The second and increasingly popular type is the in-game moderator, who actively participates as a character or avatar on the site, helping other players engage with the various activities within the game.
This type of moderator may also act as an in-game host – ie visible to the children – and can be compared to the host of a children’s party: the role is about encouraging children to explore, try new things and have as positive an experience as possible, while staying safe and secure.
A visible presence can help to reassure children and parents alike. If parents know that a site is safe, they are more likely to let their child return to it.
Work with parents. Reassure them and draft clear user guidelines
Brands should work with parents to ensure that they really understand what the site is about, and what measures are in place to keep children safe.
One of the first steps a brand should take to reassure parents and children is to draft clear user guidelines – and enforce them – on what is, and what isn’t, acceptable on the site.
These should be easy to understand, and written in the tone of the site; and it should be very simple for a child to report inappropriate behaviour.
These guidelines help to set the tone of the site right from the outset, and we often find that children will help enforce rules among themselves if they understand them clearly.
It’s also important to ensure that the moderators themselves have clear guidelines to work to, so that every moderator is consistent in the way they operate and interact with the children. Children are very quick to spot any inconsistency in treatment or behaviour.
Act as hosts to encourage interaction
Once the tone and boundaries of the site are set, moderators can play a role in helping children get involved, by acting as visible hosts for the site, or becoming an interactive character within the game.
Humour, quirkiness and a dollop of fun are great ways to get kids involved. We’ve worked with brands that have created in-game characters for the moderators that talk in rhyme, for example, or use animal characters to interact with the children, helping them move through the game, or try new areas of the world.
Build elements in at the design stage rather than ‘add on’
Finally, build these elements into the world before, not after, it is built. A huge amount of the filtering process can be automated with software such as that from Crisp, and, as with all these things, it is easier to integrate it at the design stage than as an ‘add-on’ to the site.
Done properly, moderation can improve game play and make virtual worlds an even more enriching experience, increasing player loyalty in the process.
Tamara Littleton is CEO of community management and moderation company, eModeration.