Today we’re grabbing a cuppa to chinwag with one of the incredible people here at StrawberrySocial. Someone with a wealth of experience in the industry, who knows what it takes to be an outstanding moderator. Someone who knows how to hire, train, mentor and grow a moderation engagement team.
Meet Shaz Collier, our Chief Operations Officer (& Online Safety Mastermind).
Shaz began her career online in 2000 at AOL. “We were hosting a chat room for 6-8 year olds,” she tells me on Zoom, “we thought it was innovative, but now I see it as naive. In those days we had no idea how the internet would go on to be used.” Shaz moved on in the noughties to work for Emoderation, took the CEOP Ambassador course, and studied digital safety.
“I gained a new awareness doing the CEOP Ambassador course. It was certainly eye-opening; I didn’t realise the depths that some people would sink to in an online environment.”
I asked her how the online experience for young people has changed in the intervening years. “In those days there were no resources that kids could use to advise them on keeping safe,” she says, “how to ask for help, how to recognise people who aren’t honest and trustworthy. Nowadays no one would have a chat room for that age group,” she adds. “Now it’s all games with pre-moderated phrases that they type in.”
Shaz then went on to train as an educator on the “Thinkuknow” programme for online safety. She points out how far rules and regulations, training and education have come on over the years, something she has also seen through her work as a CEOP Ambassador.
Shaz also brings a parent’s perspective on children’s safety. An avid gamer with four children, she has guided them through their teens over the last decade and seen many issues of child safety and parental involvement firsthand. She has watched the opportunities for children online grow, alongside the dangers.
“What,” I asked her, “makes a good moderator?”
“Everyone likes to think they’re an expert. But it’s such a fluid industry – things change so quickly. It only takes a couple of hours for things to snowball. So, attention to detail, ability to work without supervision, empathy and resilience are essential. Guidelines and policies and procedures can be learnt, but if anything unusual pops up it’s experience and natural ability that really matters.”
“Only certain people are natural moderators. On an emotional level you’ve got to have good empathy, but also resilience because some of the things you see can be very distressing.”
“One of the biggest challenges moderators face today is the subject of gender issues,” she says, and we agree that attitudes towards gender in social media content moderation can have far-reaching repercussions, making it a thorny and difficult subject for everyone. The problems range from bullying and intimidation (think of the flak female celebrities, politicians and others are subjected to online) to enforced stereotyping and threats of assault. Then there’s sexting and the ever-mounting pressure on young girls to look like they come from ‘planet porn’ – even while they’re being ‘slut-shamed’.
We can only wait and see how this plays out for both boys and girls – the first truly digitally native generation. What effect will the things they have access to online, and the environment on social channels, have on them, their growth, their attitudes, their respect for others?
We then naturally move on to the subject of risk-taking – and the documented links between taking risks, seizing opportunities and personal safety. I ask her how she feels about young people taking risks on the internet. What strategies would she suggest to parents and schools?
On this Shaz is firm: “The words internet and risk-taking shouldn’t be in the same sentence. So much can go wrong. The Thinkuknow series has resources for children of all ages - the tools are there, but whether parents know about them is another question. It was around when my children were in primary school but no one came and mentioned it. This does seem to be changing now with dedicated safety awareness training for youth groups and schools.”
We discuss the NSPCC (one of our clients) and their excellent campaigns. They educate teachers and parents on guiding kids as they navigate the online world, making sure they know how to keep themselves safe (definitely check out their ‘Pants’ initiative).
“The companies I’ve come across follow the guidelines for keeping kids safe online in corporate terms,” Shaz says. “They limit the ability to chat freely. They offer human moderation rather than machine moderation. And, importantly, they make kids aware of how to report a problem if they come across one.”
So what, I ask her, is the role of the agency in supporting the moderator?
“It is vital that agencies offer their workers flexibility, extra resources, and an open door policy. Working remotely carries its own challenges and homeworkers need support.”
At StrawberrySocial, moderator support is a priority. Team leaders reach out to moderators when they have had to escalate something of concern. “Moderators don’t always know how much they have been affected by distressing content, and can feel anxious about asking for support.” Shaz is available most of the time for moderators wanting to talk something through. They can book time with her directly via Calendly, without having to contact her first, and some have done so in these difficult months. “A counselling service is becoming a vital part of the service offered to remote workers,” she tells me. “We have this for our moderators – all they have to do is come to me and we can organise sessions for them.”
The online world is growing up; it is now addressing safety concerns more cohesively than in those first mad years of innovation and excitement.
- Children need to be educated in online safety continually; online risks, opportunities and tech change all the time.
- Parents and teachers must learn alongside the children.
- Companies offering access to children have a responsibility to be transparent regarding safety practices.
- Thinkuknow and the NSPCC offer excellent resources for families and educators alike.
- The role of human moderation remains a vital part of protecting children online.
- Organisations must support their moderators’ mental health.