A new white paper from moderation and community-management company eModeration highlights the need for brands engaged in social networks to protect both their users and their own reputations.
The paper, Moderation in Social Networks, examines the impact of user-generated content within social networks on a brand’s reputation, and gives a clear guide to best practice on moderating that content within the ‘big four’ social networks (MySpace, Facebook, Bebo and YouTube). Moderation in Social Networks is free to download from emoderation.com.
The paper, written by eModeration’s CEO, Tamara Littleton, guides brands through the following issues:
• The status quo: is content safe on social networks?
This includes a brief look at some of the complex legal issues surrounding the responsibility for areas including users’ safety and defamation.
• Who is responsible for keeping users safe? This lays out what brands should know about what the big four networks are doing to ensure safety on their sites.
• What is the risk to a brand? This section takes brands through the potential risks and pitfalls of a social network campaign (focusing on user safety and brand reputation).
• The rules on social networks. Each of the big four social networks has a different set of rules and processes to follow when engaging with users. This section gives a detailed breakdown of what each of the four sites does and doesn’t allow; and a guide to best practice for brands on moderating content on each.
• Should a brand moderate a third-party site? This is the big question for many brands. This section of the paper lays out questions brands should ask themselves, such as how well they know (and trust) their audience; what the risk might be to a brand’s reputation of being associated with negative content; and how to protect users.
• Can brands stop people saying negative things about them? There is a very clear difference between moderation and censorship. Brands must be prepared to take negative comments on the chin, but they don’t have to put up with abusive posts.
• What should a brand look for when moderating content? The obvious areas are bullying, abuse or illegal content. But there are other, less obvious pitfalls such as avatar images, swear-words in user names, harassment messages, spam and off-topic posts.
Tamara Littleton, CEO of eModeration, says, “There is little or no consistency between the processes of the major social networks and this can be a minefield for brands. Marketing to consumers on social networks is a fast-growing area for brands. The aim of this guide is to arm brands with the knowledge they need to deal with user-generated content on social network pages.”