FAQs

Browse our frequently asked questions section for helpful definitions, explanations and examples of everything from AI moderation to GDPR.

Moderation

StrawberrySocial moderators receive a range of training, depending on the requirements of the projects they are assigned to. We combine our own customised training materials with the latest industry-standard certifications and specialised educational courses. This combination of internal training and external certifications means our moderators are fully prepared and capable of delivering the highest quality service in any situation.    

  • Escalations and crisis management training
  • Moderation Gateway certification. Our moderators are seasoned professionals; however, a grounding in and certification on the basics of the business is key to their progress.
  • NSPCC Online Safety certification, required for any moderator working on Family Friendly projects.

Content moderation is the practice of reviewing and managing the visibility of user-generated content to ensure it complies with platform or community guidelines and legal requirements. It plays a vital role in maintaining a safe and positive online environment by filtering out harmful or inappropriate content.

Read more about this topic in our blog article: What is Content Moderation? 

AI content moderation is the use of artificial intelligence to automate and assist in monitoring and moderating user-generated content. It helps identify and filter out potentially problematic or inappropriate content, but human oversight is still critical for contextual analysis and accurate decision-making.
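As a rough illustration of how automated scoring and human oversight can fit together, here is a minimal sketch in Python. The classifier score, thresholds and routing labels are hypothetical placeholders for the example, not any specific moderation product or API.

```python
# Minimal sketch of an AI-assisted moderation pass. The score is assumed to
# come from some toxicity classifier; thresholds are illustrative only.

REMOVE_THRESHOLD = 0.90   # confident enough to remove automatically
APPROVE_THRESHOLD = 0.10  # confident enough to publish automatically

def triage(post_text: str, toxicity_score: float) -> str:
    """Route a post based on a model's toxicity score (0.0 - 1.0)."""
    if toxicity_score >= REMOVE_THRESHOLD:
        return "auto_remove"       # clear violation, act immediately
    if toxicity_score <= APPROVE_THRESHOLD:
        return "auto_approve"      # clearly benign, publish
    return "human_review"          # ambiguous: context matters, escalate

# A sarcastic or context-dependent post typically lands in the middle band,
# which is exactly where human oversight remains critical.
print(triage("great job, genius...", toxicity_score=0.55))  # human_review
```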

There are different types of content moderation practices employed to manage user-generated content. Here are a few common types:

Pre-moderation: In this approach, content is reviewed and approved by moderators before it is publicly displayed. It allows for strict control over what content is published but can cause delays in content visibility.

Post-moderation: Content is published immediately and then reviewed by moderators after being made public. This approach enables faster content visibility but may require removal if it violates guidelines.

When taking a post-moderation approach, there are two subtypes of moderation:

Reactive moderation: Moderators intervene when content is reported or flagged by users. They review the reported content and take appropriate action based on the platform’s guidelines and policies.

Proactive moderation: Here, moderators actively scan and monitor content without relying solely on user reports. They use automated tools, AI, or keyword filters to identify and remove potentially inappropriate or harmful content.
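As a simple illustration of the proactive approach, the sketch below shows a basic keyword filter in Python. The blocked terms and function name are invented for the example; in practice this kind of filter is combined with AI classifiers and human review rather than used on its own.

```python
import re

# Illustrative proactive keyword filter: scan messages as they arrive and
# hold anything containing a blocked term for moderator review.
BLOCKED_TERMS = {"spamword", "scamlink"}

def flag_proactively(message: str) -> bool:
    """Return True if the message should be held for moderator review."""
    words = re.findall(r"[a-z']+", message.lower())
    return any(word in BLOCKED_TERMS for word in words)

print(flag_proactively("Check out this scamlink now!"))        # True
print(flag_proactively("Hope everyone is having a good day"))  # False
```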

Chat moderation involves monitoring and managing messages and conversations in chat-based platforms to ensure compliance with guidelines and prevent inappropriate or harmful content. 

Chat moderation can involve AI tools, simple filters that block certain words, and other automated approaches in combination with human review. Moderators can remove messages and also manage a user’s ability to participate through warnings, timeouts and bans.
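The escalation pattern of warnings, timeouts and bans can be pictured with a short Python sketch. The strike thresholds and timeout duration here are assumptions chosen only for the example, not a prescribed policy.

```python
from dataclasses import dataclass

# Sketch of escalating chat sanctions: warn first, then time out, then ban.
@dataclass
class ChatUser:
    name: str
    strikes: int = 0

def apply_sanction(user: ChatUser) -> str:
    """Escalate the response each time a user breaks the chat rules."""
    user.strikes += 1
    if user.strikes == 1:
        return f"warning issued to {user.name}"
    if user.strikes == 2:
        return f"{user.name} timed out for 10 minutes"
    return f"{user.name} banned from the chat"

u = ChatUser("example_user")
print(apply_sanction(u))  # warning issued to example_user
print(apply_sanction(u))  # example_user timed out for 10 minutes
print(apply_sanction(u))  # example_user banned from the chat
```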

Live moderation, also known as real-time moderation, refers to the practice of monitoring and moderating content or interactions in real-time as they occur in a live setting, such as live streams, live chats, or real-time discussions on social media platforms.

During live moderation, moderators actively monitor the content being generated or the conversations happening in real-time to ensure compliance with platform guidelines, community standards, and legal regulations. They may review and remove inappropriate, offensive, or harmful content, address user concerns or questions, and enforce rules to maintain a safe and positive environment.

Social media moderation is the process of monitoring and regulating user-generated content on social media platforms to maintain a safe and respectful online environment, respond to incoming inquiries and protect the reputation of the brand or organisation that owns the accounts. Moderators enforce brand and community guidelines, review and respond to posts and comments, and remove any content that violates the platform’s rules.

To succeed as a moderator, one needs a range of skills such as strong communication abilities, excellent judgement, attention to detail, critical thinking, empathy, and a thorough understanding of platform and/or community guidelines and policies.

Online Safety

COPPA is the Children’s Online Privacy Protection Act, a U.S. law that safeguards the online privacy of children under 13. It requires businesses to obtain parental consent before collecting their personal information. Compliance with COPPA is essential for businesses targeting or collecting data from children to avoid legal issues.

GDPR stands for General Data Protection Regulation. It is a comprehensive data protection law that applies to businesses operating in the European Union (EU) or handling the personal data of EU residents. Compliance with GDPR requires meeting strict obligations regarding the collection, storage, and processing of personal data.

CARU refers to the Children’s Advertising Review Unit. It is a self-regulatory program in the U.S. that promotes responsible advertising to children. By following CARU guidelines, businesses can ensure their ads aimed at children are truthful, fair, and appropriate.

Online safety refers to the measures and practices aimed at protecting individuals and their personal information when using the internet and engaging in online activities. It encompasses various aspects of digital security, privacy, and responsible online behaviour.

Online safety involves safeguarding against potential risks, such as cyberbullying, identity theft, scams, malware, and unauthorised access to personal data. It includes implementing strong passwords, using secure and updated software, being cautious of sharing sensitive information, and understanding the privacy settings of online platforms.

Additionally, online safety encompasses promoting digital literacy and educating users about online threats, recognising and avoiding online scams, and fostering responsible online behaviour, especially for children and young individuals. 

This is when criminals (terrorists, thieves, fraudsters, child abusers) gather personal information about potential victims from a variety of online sources (gaming, social media and online community accounts). Through online conversations, posted images, publicly shared information and geo-location, the criminal can piece together a full picture of the target (often a vulnerable person or child) in order to find and cause harm to that person in the real world.

Community Management

Social media management focuses primarily on content for, and engagement on, social media platforms. This generally includes curating and/or creating content, working on paid advertising campaigns, analysing metrics, and engaging with followers. 

Community management is concerned with nurturing and fostering relationships within online communities. While these communities are sometimes gathered around social media platforms, they are just as likely to occur on dedicated community platforms or forums.

Community managers focus on building a sense of belonging, facilitating discussions, resolving conflicts, and maintaining a positive and supportive environment. They often act as the liaison between a brand and its audience.  

While social media management deals with the overall presence and strategy for how a brand or organisation makes use of social media, community management concentrates on fostering relationships and creating a sense of community in a broader range of online spaces. 

Social media management is more focused on the content that is published on social media, while community management is more focused on the people who interact with that content.

Community management involves building relationships and engaging within a community of individuals who share common interests. Community managers facilitate discussions, address concerns, and foster a sense of belonging within the community to promote loyalty and advocacy for the brand or organisation.