Ofcom to Name and Shame Platforms Over Online Sexism

London, November 27, 2025

Ofcom has announced it will publicly name online platforms that fail to protect women and girls from sexism and gender-based harms, following the publication of its final guidance under the UK Online Safety Act 2023. This move aims to hold tech companies accountable for misogynistic abuse and other targeted harms across digital services in the United Kingdom.

Ofcom’s Final Online Safety Guidance

The UK communications regulator’s guidance addresses four principal categories of gender-based online harm: misogynistic abuse and sexual violence, stalking and coercive control, image-based sexual abuse, and coordinated harassment. For each, it sets out measures platforms are expected to adopt to deter and mitigate the risk.

Recommended measures include prompts asking users to reconsider before posting harmful content, timeouts for repeat offenders, and demonetisation of sexist or violent material. Platforms are also expected to use automated technology to swiftly detect and remove intimate images shared without consent, to strengthen security and privacy controls against stalking, to improve blocking functionality, and to track reports of abuse more rigorously.
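
The guidance does not prescribe a particular detection technique, but automated removal of known non-consensual intimate images is commonly built on perceptual hash-matching against a database of previously reported material. The sketch below illustrates that idea with a simple average hash; the hash scheme, function names, and matching threshold are illustrative assumptions rather than anything specified by Ofcom.

```python
# Illustrative sketch only: a simple perceptual "average hash" check of the kind
# hash-matching systems use to flag known non-consensual intimate images.
# Real deployments rely on more robust hashes and vetted, shared databases;
# the threshold and helper names here are assumptions for illustration.
from PIL import Image


def average_hash(path: str, size: int = 8) -> int:
    """Shrink the image to a small greyscale grid and encode each pixel as one
    bit: 1 if it is brighter than the grid's mean brightness, 0 otherwise."""
    pixels = list(Image.open(path).convert("L").resize((size, size)).getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits


def hamming_distance(a: int, b: int) -> int:
    """Count the bits that differ between two hashes."""
    return bin(a ^ b).count("1")


def matches_known_hashes(path: str, known_hashes: set[int], threshold: int = 5) -> bool:
    """Flag an upload whose hash is within `threshold` bits of any known hash."""
    candidate = average_hash(path)
    return any(hamming_distance(candidate, h) <= threshold for h in known_hashes)
```

Using a distance threshold rather than exact equality lets the check tolerate minor edits such as resizing or recompression, which cryptographic hashes would miss; the trade-off is a small risk of false matches that human review would need to catch.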

Official Responses and Regulatory Intent

Dame Melanie Dawes, Ofcom’s chief executive, highlighted the alarming scale of online abuse directed at women and girls and urged tech firms to go beyond the legal minimum to keep users safe. UK Technology Secretary Liz Kendall added that inaction by digital platforms effectively allows sexism to spread unchecked online.

By committing to name and shame non-compliant platforms, Ofcom signals a firm intent to enforce stricter oversight and accountability. This approach is poised to increase pressure on technology companies to prioritise the protection of female users and deliver safer digital spaces.

Broader Regulatory Context

The publication of these guidelines coincides with heightened global attention to violence against women and girls, and with growing demands for regulatory frameworks that push tech companies to take responsibility beyond traditional content moderation. Ofcom’s framework goes beyond reactive measures by promoting proactive interventions aimed at preventing gender-based harms online.

Future Outlook

This initiative is expected to reshape how online platforms manage and respond to misogynistic and gendered abuse in the UK. As Ofcom begins assessing platforms against the guidance, business leaders, policymakers, and other digital stakeholders will need to monitor compliance closely and adapt their strategies to build safer, more inclusive digital communities for women and girls.