One of the best parts of our job is getting to know new and established businesses alike. We love finding out what their hopes are for their online community, and then working together to make those hopes a reality.

Even more exciting is the number of companies that treat trust, safety and online engagement as integral to their social media strategy – however, we find that very few have heard of, let alone understand, Moderation and Community Management. Those of us who have spent our professional lives developing and working in the field know just how important the role is, and the consequences of not taking it seriously.

A few months ago we highlighted one of our rockstar moderators – Becca Brinkworth. As a professional content moderator and community manager, she knows first-hand the work involved in reviewing online commentary from all walks of life. Rarely does the public get a view into the ‘behind the scenes’ reality of what it means to review every bit of online content and scrub out the bad stuff before it reaches the end user.

It’s important to note here that content removal is only a small part of the job. In addition to moderating content, Becca’s (and all of our moderators’) time is spent crafting responses (in the client’s tone of voice) to ensure that outgoing messages are appropriate, on point and serve to cultivate their online communities. It’s exciting but challenging work, which requires extremely talented individuals capable of ‘reading the room’, understanding context and responding AS our clients.


We recently celebrated another year as a Living Wage Employer, and thought it a good opportunity to share more of what we hear when speaking on the topic of moderation and moderators – and, hopefully, to squash some of the common misconceptions about the profession.


Read on:

1. Moderation is a mindless job. They don’t think, they just do.

While the profession does rely on a client (or perhaps a legal department) to dictate certain rules about what is allowed in a community, it’s the Moderator’s job to interpret each piece of content submitted to that community and make intelligent decisions about it. Moderation is a profession of ‘grey areas’ – and it requires a LOT of thinking, studying, interpreting and understanding of the nuances of language, context and intent.

2. Anybody can moderate.

Sadly, many companies believe that moderation is as simple as removing a few naughty words here and there, and are happy to have just anyone speak on behalf of their carefully thought-out brand voice. That view rarely takes into account culture, slang, tone of voice, nuance and the many other subtleties of language. Besides, most companies already have an automated mechanism in place to remove the simple naughty stuff (see the sketch below); what remains is the more complex linguistic interpretation that a highly skilled moderation team is there to manage.
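To make that concrete, here’s a minimal sketch – a hypothetical example in Python, not any particular platform’s tooling – of the kind of simple wordlist mechanism we mean. The names `BANNED_WORDS` and `auto_filter` are illustrative only:

```python
# A minimal, hypothetical sketch of the kind of automated wordlist
# filter many platforms already run. BANNED_WORDS and auto_filter are
# illustrative names only; real deployments use far larger lists.
import re

BANNED_WORDS = {"scam", "idiot"}  # hypothetical examples

def auto_filter(comment: str) -> bool:
    """Return True if the comment should be auto-removed."""
    words = re.findall(r"[a-z']+", comment.lower())
    return any(word in BANNED_WORDS for word in words)

print(auto_filter("What a scam!"))   # True – exact match caught
print(auto_filter("What a sc4m!"))   # False – obfuscation slips through
print(auto_filter("Great service. Two hours on hold."))  # False – sarcasm needs a human
```

Obfuscated spellings, sarcasm and context-dependent abuse all sail straight through a filter like this – which is exactly the grey area left to human moderators.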

Plus – we’ve all seen what happens when someone ill-equipped to handle social media is put in charge of a brand’s presence. Twitter, especially, loves to sit back, watch the horror unfold and pile onto the damage being done to the brand’s reputation. Nobody wants one of these very public online nightmares to happen to them.

3. Moderation is ‘just’ a starter job, a means to an end.

It’s true that many people begin their careers in Moderation and then move into other social roles in the same wheelhouse (e.g. Trust and Safety, Social Media Management or Marketing). However, plenty of others choose to remain in Moderation and enjoy the day-to-day challenges that come with the role. Sociologists, linguists, care workers, criminal law experts, and people fascinated by cultural change or with a special interest in a particular charity all continue to hone their understanding of the human experience through their work as Professional Moderators.

4. All Moderators do is ‘play’ on social media and in video games.

It is true that Moderators spend much of their job logged into [insert your favourite platform]. However, what they’re doing with this time is quite the opposite of ‘playing’. The team is combing your brand’s pages (or reviewing content via a third-party tool) for any content that requires intervention – be it a question from your community, a complaint or, at worst, a suicide threat by a user or a violent threat against your organisation (or one of its employees). The job can cause compassion fatigue, PTSD, trauma, depression and general work stress.

Every company wants to care for the wellbeing of its team, regardless of the role. However, moderators have rarely received the respect they deserve for having the strength of character to handle what is, at times, extremely disturbing content with patience and professionalism. And if morality and ethics ain’t your bag, think about this: Facebook paid out millions in a settlement with former and current moderators over the lack of support they were given in the role.

5. Moderation doesn’t pay.

As we said above, we pride ourselves on paying a Living Wage to our Moderators. The job is extremely challenging, can be stressful, and requires extremely talented, dedicated (and robust) people if you want it done right. And recently, the Department for Digital, Culture, Media and Sport released its latest report, which found that the UK’s trust and safety sector has grown by 30%.

That said, a cautionary note from our CEO:

“The role of covering online (child) safety, content moderation, crisis and brand engagement continues to be denigrated and misunderstood. Every day our team (and others out there) are exposed to distressing material in the ongoing battle to help keep the general public from seeing it as well as reaching out to people who need help. But, hourly rates haven’t been able to increase for about ten years – due to organisations not prioritising the importance of the service and therefore being unwilling to pay what it’s worth. Big budgets are allocated to marketing, paid ads, PR, etc. but where is the line item for online ‘insurance’ – child safety, combating fake news, dangerous posts, damaging information…? I can only hope that, by talking about it more, we can raise awareness of the many ‘invisible’ people who really care about doing the work of a Moderator. They’re superheroes.”


What kinds of things can a moderation agency do for you? Here’s a (non-exhaustive) list:

  • Look after your online platforms in AND OUT of hours – social media is 24/7!
  • Defuse and report back on potential PR crises before they get started
  • Reduce the likelihood of fraud
  • Remove violent and threatening content
  • Reduce the chance of illegal behaviour
  • Spot where people may be in danger, need help or support (or are suffering)
  • Ensure online safety when it comes to children and vulnerable adults
  • Carry out regular community engagement to ensure your customers keep a positive view of your brand…
  • …resulting in increased time spent on your sites and a healthy community of fans (and advocates who will often fight your corner).

Why wouldn’t you pay a premium for the people who are closest to your customers, taking care of your online reputation?

We hope the above clears up a few questions you may have had about Moderation and Moderators. We’re always excited when we get the opportunity to share more about the role with anyone who’s keen to learn and sees the value in Trust, Moderation and Reputation Management as a profession.

For more information about how content moderation, user engagement and community management can serve your brand in a positive way, reach out to us anytime here.