Your brand, our protection

It’s time for StrawberrySocial’s annual look ahead. 2025 kept social media managers on their toes: the UK’s Online Safety Act moved from concept to compliance, and platforms wrestled with generative AI, authenticity and the proliferation of niche communities. With those shifts in mind, here are the trends we expect to shape social media, and your planning, in 2026.

1. Safe‑by‑design moves from best practice to mandatory

The Online Safety Act is now fully in force (as of July this year), with Ofcom’s Protection of Children Codes placing legally binding duties on any service likely to be accessed by children. Platforms must carry out risk assessments, embed measures such as age assurance, safe search and blocking/muting controls, and keep clear records of how risks are identified and mitigated.

At the same time, the Internet Watch Foundation (IWF) is sounding the alarm on the proliferation of AI‑generated child sexual abuse material (CSAM). In the first six months of 2025 the IWF discovered AI‑generated CSAM on 210 webpages, a 400% increase on the same period in 2024, and confirmed 1,286 AI‑generated videos, up from just two the year before. The IWF warns that criminals are now creating highly realistic videos at scale. The Guardian also reports that most of these AI‑generated videos fall into the most severe category of abuse and that the UK government is making it illegal to possess or create AI tools designed to produce CSAM.

What this means for you: safety features, guardrails and moderation processes are no longer “nice to haves”. If your organisation offers user‑generated content, chatbots or generative tools, you will need to demonstrate compliance with Ofcom’s codes and have clear reporting lines for harmful content. Evaluate your AI suppliers and ensure that safety‑by‑design is embedded from the outset. Use human moderators to train AI tools and keep them in the loop. AI can assist, but protecting children requires human oversight at every critical point.

2. Raw authenticity and community‑driven engagement

Consumers are turning away from algorithmic feeds and polished corporate posts in favour of genuine, human‑scale interaction. The Verge and Vox Media’s 2024 survey of more than 2,000 US adults found that big platforms are losing trust: 42% of respondents said search engines like Google are becoming less useful, and 60% view the state of social media negatively. People crave meaningful connections – 90% of members of digital communities simply lurk yet still feel connected, and nearly half of consumers would prefer to join communities that ban AI‑generated content. The study concludes that smaller, purpose‑driven communities built around shared values are the future of digital engagement.

Sprout Social’s 2025 Index underscores the appeal of authenticity: 49% of consumers say the originality of a brand’s content matters most, and 36% say how a brand engages with followers is what catches their attention. But the broader research explains why: people are tired of feeling like a number in a “giant algorithmic machine” (The Verge). Taboola’s analysis of niche communities found that 53% of surveyed users believe online communities should have fewer than 200 people, and micro‑influencers (around 1,000 followers) have an average engagement rate of 9.7%, compared with 1.7% for creators with 100,000 followers. These metrics illustrate why intimate groups and human voices outperform mass broadcasting.

What this means for you: 

  • Invest in human‑centred storytelling and community building. 
  • Encourage your social teams to show the people behind your organisation, share behind‑the‑scenes content and emphasise your mission. This is especially important for charities – people want to connect with, and be seen and supported by, other humans.
  • Build invite‑only groups or forums where supporters can connect safely. Moderate these spaces carefully – smaller communities thrive on empathy and trust. 
  • Partner with micro‑influencers and community leaders whose values align with yours; their engagement levels and authenticity can often beat a big follower count. 
  • Use AI as a tool for moderation or summarisation, but be transparent about its role – nearly half of consumers want AI‑free spaces according to The Verge, so human engagement must remain central.

3. Social search and AI‑powered discovery become core to strategy

The shift in search behaviour has been well documented over the past year: consumers increasingly turn to YouTube, TikTok, Instagram, and now AI tools to find answers, products and communities. Networks are embracing AI‑driven search; features similar to Google’s AI Overviews summarise content and return direct, conversational answers. This trend is accelerating as social platforms experiment with generative search results and recommendation engines.

What this means for you: optimise your content for social search (social SEO). Write captions, titles and alt text that include the keywords your audience is searching for. Think about how AI might summarise your posts – short, clear descriptions and structured information help AI systems serve your content to the right people. Monitor emerging features such as TikTok’s search ads and AI‑generated topic summaries. As AI search becomes mainstream, your organisation’s discovery strategy will depend as much on understanding these algorithms as it does on your content plan.

4. AI moderation will assist – but not replace – humans

Large language models are being adopted by platforms as moderation tools. Meta has begun using AI models to give “a second opinion” on moderation decisions, and TikTok has laid off moderators as part of a reorganisation aimed at deploying automated systems. The incentive is clear: automation reduces costs and speeds response times. But researchers and policy experts warn of significant limitations. Off‑the‑shelf models struggle to distinguish prohibited speech from counterspeech in 15–20% of cases, and users see AI‑only moderation as less trustworthy than decisions made by humans. Scholars at the Trust and Safety Research Conference advocate using LLMs to explain moderation decisions and support community‑led moderation rather than replacing human moderators.

What this means for you: expect a hybrid future. AI can help flag problematic content and provide transparency around policies, but trained moderators will still be critical for nuanced decisions, appeals and building trust. When procuring AI moderation tools, ask providers about accuracy rates, false positive/negative handling and the ability to incorporate your own policies. Make sure there are clear, scheduled escalation paths for human review – AI learning and actions need to be checked and informed by humans. Transparency about how your organisation moderates content – and the role AI plays – will be a differentiator in 2026.

5. Accessibility becomes a baseline expectation

With 16.8 million disabled people living in the UK – roughly one in four of the population – inclusive design is no longer optional. Worldwide, an estimated 2.2 billion people live with some form of vision impairment, and social platforms are responding by embedding captioning, alt text and assistive tools. Meta reports that up to 85% of Facebook video plays are muted, and accessibility studies show that three‑quarters of viewers prefer captions when watching online. These numbers show that accessible content benefits everyone, not just people with hearing impairments.

What this means for you: bake accessibility into every post. Add descriptive alt text and closed captions, use clear language and high‑contrast visuals, and capitalise multi‑word hashtags (e.g. #StrawberrySocial) to aid screen‑reader parsing. Consider how people consume content on the go – many scroll without sound or have temporary accessibility needs. Beyond the clear moral imperative, inclusive design broadens your audience, improves SEO and reduces legal risk under the UK Equality Act, the Americans with Disabilities Act and the European Accessibility Act.

6. Video and short‑form content remain king – but context matters

Video is still the dominant storytelling format, and short clips are the star performer. Metricool reports that short‑form video posts on its platform grew by 70% in 2025 and reached nearly six million uploads. Taboola found that short videos generate 2.5 times the engagement of long‑form videos and that 72% of consumers prefer watching a video to reading text. Wyzowl’s 2025 video marketing survey asked consumers how they most like to learn about a product or service; 78% said they’d rather watch a short video than read an article (9%) or e‑book (5%). Wistia notes that while viewers expect concise messaging – they will abandon videos that fail to deliver quickly – longer videos can still deliver higher total watch time when they teach or demonstrate.

Platform dynamics matter. Metricool observed that Instagram and YouTube are becoming saturated: content volume is rising but reach and views are declining. TikTok still delivers strong engagement, but that engagement is now shared across more accounts. You can’t simply churn out videos and expect results; you must tailor format and storytelling to each platform – and prioritise content that genuinely resonates and holds attention.

What this means for you: prioritise vertical, 15–60 second clips for quick hits, but use longer explainer or documentary videos when you need to build knowledge or trust. Make sure every video is accessible with captions and on‑screen text and experiment with interactive features such as polls or Q&As. Review your analytics by platform and format; double down where you see organic reach and refine or pause where performance dips. Above all, use video to tell human stories, not just to follow a trend.

7. The backlash against ‘AI slop’ and the enshittification of feeds

2025 saw a dramatic surge in concern about low‑quality, machine‑generated content. Brandwatch reports that online mentions of “slop” grew by more than 200% in 2025, with 82% of sentiment‑categorised mentions being negative. Conversations about slop in art and culture jumped 125%, and discussions of digital detoxing – people deliberately curating or pausing their social media use – were up 10%, according to the same report. The Guardian notes that technology writer Cory Doctorow’s concept of “enshittification” – the degradation of services as platforms prioritise profit over user experience – is accelerating. Matt Navarra summed up the growing frustration nicely in his Geekout newsletter, calling Facebook “a broken mess” in need of “a hard reset”. His comment reflects a broader user pushback against low‑quality, AI‑generated content and overwhelming recommendations on major platforms.

What this means for you: audiences are increasingly wary of generative “slop” that feels bland, repetitive or deceptive. Many users are turning away from feeds cluttered with ads and AI‑generated content and seeking spaces that feel genuinely human.

To stay trusted, prioritise quality over quantity: invest in original storytelling, take time to craft posts that offer value, and be transparent about any AI assistance. Encourage your community to flag low‑quality content and consider running moderated spaces or AI‑free groups for those who want them. As digital fatigue grows, the organisations that thrive will be those who respect their audience’s time and intelligence.

What to keep in mind for social in 2026

2026 will be a year of accountability, authenticity, and quality. Regulations like the Online Safety Act and Digital Services Act are forcing platforms and the organisations that use them to take safety, transparency and user control seriously. At the same time, audiences are rebelling against slop and “enshittified” feeds; they will reward brands that are real, responsive, inclusive, community‑focused and respectful of their attention. Brands and organisations that prioritise safety, community, accessibility, quality content and human oversight will have a clear advantage in 2026.