Ofcom Vows to Expose Platforms Over Online Sexism

London, November 27, 2025

Ofcom has issued final guidance holding technology platforms accountable for online gender-based harms, effective immediately in the UK. The guidance, published on November 25 to coincide with the International Day for the Elimination of Violence against Women, aims to compel social media and tech firms to implement robust protections for women and girls online.

Key Requirements for Tech Companies
The guidance requires tech firms to adopt governance frameworks and accountability processes aimed at mitigating online gender-based harms. Platforms must conduct comprehensive risk assessments focused on women’s and girls’ safety, maintain transparency about their protective measures, and build abusability evaluations and product testing into their development processes. They must also implement safer default settings and actively limit content that promotes or encourages gender-based harm.

Four Priority Areas of Harm
Ofcom identifies four principal categories of harm requiring urgent attention:

First, misogynistic abuse and sexual violence: platforms should display warning prompts before abusive content is posted, impose timeouts on repeat offenders, diversify content recommendations, and de-monetize misogynistic material.

Second, stalking and coercive control: platforms should bundle privacy features, strengthen account security, tighten visibility restrictions, and remove default geolocation tracking.

Third, image-based sexual abuse: platforms should deploy automated hash-matching technologies to detect non-consensual intimate images, blur nudity with user-controlled override options, and direct affected users to support resources and crime-reporting channels.

Fourth, coordinated harassment: platforms should enable users to block multiple accounts swiftly and deploy advanced tools for managing abuse reports effectively.

Political and Regulatory Endorsement
Technology Secretary Liz Kendall emphasized the technical capacity of tech companies to combat online misogyny and warned that their failure to act would make them complicit in fostering hostile environments. Ofcom’s Chief Executive Dame Melanie Dawes highlighted the devastating impact on survivors, recounting how a single non-consensual image can profoundly damage an individual’s sense of safety and identity.

Enhanced Regulatory Oversight
This guidance marks a significant step beyond the statutory requirements set out in the Online Safety Act 2023, positioning Ofcom to rigorously monitor and publicly evaluate platforms’ adherence. By extending past the legal minimum, the regulator aims to drive systemic change in how technology companies address online harms targeting women and girls.

The initiative signals an urgent call for the tech industry to prioritize user safety and accountability, reflecting growing societal and political demands for safer digital spaces. As implementation proceeds, the effectiveness of these measures will be closely scrutinized by stakeholders and the public alike.