If you’ve spent any time reading the news lately (and quite frankly, if you’ve been avoiding it like the plague, we get it…), you’ll have noticed that a lot has been happening around law, policy, youth and digital safety. Both here in the UK and in the United States, groups and governments are taking further steps to hold social media platforms responsible for the content they allow on their sites. 

In the US, 41 states have banded together to file a lawsuit accusing Meta of knowingly serving harmful content to minors (such as dieting and eating-disorder posts) through its algorithms, with no regard for their safety and in the name of profit. These states are seeking financial damages, to be invested in schools and mental health services.

As New York Attorney General Letitia James put it: 

“Meta has profited from children’s pain by intentionally designing its platforms with manipulative features that make children addicted to their platforms while lowering their self-esteem.”

Meanwhile, the UK has passed the Online Safety Act, which focuses on protecting children through a zero-tolerance approach to inappropriate content. Its action points require platforms to:   

  • Remove illegal content quickly, or prevent it from appearing in the first place (note: it’s not yet clear what “quickly” means, and we’ll share more details once they’re available)
  • Prevent children from accessing harmful and age-inappropriate content
  • Strongly enforce age limits and use age-checking measures on platforms where content harmful to children is published
  • Ensure social media platforms are more upfront about the risks and dangers posed to children on their sites
  • Provide parents and children with clear and accessible reporting mechanisms, making it easier to report and remove inappropriate behaviour

Both the UK legislation and the US lawsuit will eventually require changes to how platforms present, monitor and build their sites. We’ll need to wait and see how platform developers deliver on these requirements. In the meantime, we offer the following recommendations to help ensure your content (and your audience) aren’t unduly penalised or manipulated.

5 key tips for safer online spaces

  1. Age-gate your closed groups and regularly monitor the ages members display (we all know young people circumvent age restrictions at sign-up and may end up becoming part of your community). 
  2. Carefully review the content that you and your marketing team are sharing. This is especially important if your charity deals with sensitive or polarising subject matter: controversial content risks being reported as inappropriate for younger audiences, or may not reach as many people as it did in the past. 
  3. Create your own ‘zero tolerance’ policies for your pages, forums and groups. Make these policies clear and display them front and centre. Then take immediate action if something is uploaded that goes against them.
  4. Moderate your social and community spaces 24/7 where possible. If your team isn’t big enough, hire an outside agency to assist with moderation, or close comments for set periods. 
  5. If you operate in the US, follow the Meta lawsuit closely, and make sure your organisation can benefit from any financial settlement to which you’re entitled. 

We continue to keep a close eye on how the social media landscape is evolving and will keep you updated as we learn more about the changes, and what they mean for organisations. 

In the meantime, check out our free online safety resources hub, download our free online safety checklist or read more about the Online Safety Act.