
London, November 27, 2025
Ofcom has committed to enforcing the Online Safety Act 2023 by publicly naming platforms that fail to protect women and girls from online misogyny and gender-based harms in the UK. This new regulatory approach aims to hold tech firms accountable and drive improved safety standards.
Ofcom’s regulatory guidance demands that technology companies adopt practical and robust measures to tackle misogynistic abuse, stalking, image-based sexual abuse, and coordinated harassment on their platforms. These measures include introducing prompts that encourage users to reconsider posting harmful content, imposing timeouts on repeat offenders, and removing financial incentives for misogynistic materials. Additionally, platforms are urged to improve their algorithms to promote diverse, non-toxic content and reduce harmful echo chambers.
Addressing stalking and coercive control, Ofcom highlights the need for enhanced privacy controls such as default private account settings and the removal of features like geolocation, combined with stronger account security safeguards. To combat image-based sexual abuse, companies are expected to deploy automated tools that detect and remove non-consensual intimate images promptly, alongside blurring nudity by default and providing clear victim-reporting pathways.
Central to Ofcom’s approach is a five-point action plan focusing on rigorous legal enforcement, amplifying survivors’ experiences, and increasing platform transparency and accountability. This includes publicly “naming and shaming” platforms that do not comply with the mandated safety standards. Such transparency is intended to pressure companies into prioritizing user protection, particularly for women and girls, who are disproportionately affected by online harms.
The government and advocacy groups have voiced strong support for Ofcom’s steps, emphasizing that technology firms possess the technical capabilities and a responsibility to curb online sexism and abuse. There is an increasing consensus that failure to act effectively amounts to complicity, contributing to the normalization of misogynistic behaviours and hostile digital environments.
This regulatory development sets a new standard in online safety for women and girls in the UK, reflecting the experiences and needs expressed by survivors of online abuse. With a combination of transparency, accountability, and enforcement underpinned by the Online Safety Act, Ofcom aims to make the digital landscape a safer space, compelling tech companies to adopt proactive and sustained measures for harm prevention.
As platforms adjust to these requirements and Ofcom enforces this framework, the initiative could serve as a model for international efforts addressing online gender-based abuse, highlighting the crucial intersection of regulatory power, technological innovation, and human rights protections.

