Ofcom has published the final version of its industry guidance aimed at creating a safer online experience for millions of women and girls in the UK. It sets out a range of practical safety measures, and Ofcom has written to sites and apps with its expectation that they take immediate action in line with the guidance. It will also publish a future report on how individual companies respond. It is worth noting that the guidance is a voluntary code and does not have statutory force. The guidance includes case studies and focuses on four main areas of harm:
Misogynistic abuse and sexual violence
This includes content that spreads hate or violence against women, or normalises sexual violence, including some types of pornography. Such content can be illegal, or harmful to children, and is often pushed by algorithms towards young men and boys. Under the guidance, tech firms should consider:
- introducing “prompts” asking users to reconsider before posting harmful content;
- imposing “timeouts” on users who repeatedly attempt to abuse a platform or its functionality to target victims;
- promoting diverse content and perspectives through their recommender ‘for you’ systems to help prevent toxic echo chambers; and
- de-monetising posts or videos which promote misogynistic abuse and sexual violence.
Pile-ons and coordinated harassment
This happens when groups gang up to target a specific woman or group of women with abuse, threats, or hate. Such content may be illegal or harmful to children and often affects women in public life. Tech firms should consider:
- setting volume limits on posts (“rate limiting”) to help prevent mass-posting of abuse in pile-ons;
- allowing users to quickly block or mute multiple accounts at once; and
- introducing more sophisticated tools for users to make multiple reports and track their progress.
Stalking and coercive control
This covers criminal offences where a perpetrator uses technology to stalk an individual or control a partner or family member. Tech firms should consider:
- ‘bundling’ safety features to make it easier to set accounts to private;
- introducing enhanced visibility restrictions to control who can see past and present content;
- ensuring stronger account security; and
- disabling geolocation by default.
Image-based sexual abuse
This refers to criminal offences involving the non-consensual sharing of intimate images and cyberflashing. Under the guidance, tech firms should consider:
- using automated technology known as ‘hash-matching’ to detect and remove non-consensual intimate images;
- blurring nudity, giving adults the option to override; and
- signposting users to supportive information, including how to report a potential crime.
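At its core, the hash-matching technique mentioned above compares a fingerprint of each uploaded image against a database of fingerprints of known abusive images. The sketch below is purely illustrative: it uses an exact cryptographic hash (SHA-256) for simplicity, whereas production systems typically use perceptual hashes that tolerate resizing and re-encoding, and the function names and the in-memory `KNOWN_HASHES` set are assumptions, not any real service's API:

```python
import hashlib

# Hypothetical in-memory database of hashes of known non-consensual
# intimate images (real systems source these from dedicated services).
KNOWN_HASHES: set[str] = set()


def register_known_image(image_bytes: bytes) -> str:
    """Hash a known abusive image and store its fingerprint."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    KNOWN_HASHES.add(digest)
    return digest


def matches_known_image(upload: bytes) -> bool:
    """True if the upload's hash matches a stored fingerprint,
    signalling that it should be blocked or removed."""
    return hashlib.sha256(upload).hexdigest() in KNOWN_HASHES
```

The key property is that the platform never needs to store or transmit the image itself, only its hash, which is why the technique is attractive for detecting this category of content.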
More broadly, Ofcom expects tech firms to subject new services or features to ‘abusability’ testing before they roll them out, to identify from the outset how they might be misused by perpetrators. Moderation teams should also receive specialised training on online gender-based harms. In addition, Ofcom expects companies to consult with experts to design policies and safety features that work effectively for women and girls, while continually listening and learning from survivors’ and victims’ real-life experiences, for example, through user surveys.
What happens now?
Ofcom is setting out a five-point action plan to drive change and hold tech firms to account in creating a safer life online for women and girls. It will:
- Enforce services’ legal requirements under the Online Safety Act: it will use its powers to make sure that platforms meet their duties in tackling illegal content, such as intimate image abuse or material which encourages unlawful hate and violence.
- Strengthen industry codes: as the law changes, Ofcom will strengthen the measures in its illegal harms industry Codes. For example, it is already consulting on measures requiring the use of hash-matching technology to detect intimate image abuse, and the Codes will also be updated next year to reflect cyberflashing becoming a priority offence.
- Drive change through close supervision: Ofcom has written an open letter to tech firms as the first step in a period of engagement to ensure they take practical action in response to the guidance. It plans to meet with companies in the coming months to emphasise its expectations and will convene an industry roundtable in 2026.
- Publicly report on industry progress to reduce gender-based harms: Ofcom will report in the summer of 2027 on progress made by individual providers, and the industry as a whole, in reducing online harms to women and girls. If their action falls short, it will consider making formal recommendations to the government about where the Online Safety Act may need to be strengthened.
- Champion lived experience: Ofcom will continue its ongoing research and engagement programme.
EU developments
Separately, the European Commission has called for greater efforts to end all forms of sexual and gender-based violence, specifically mentioning its 2024 Directive on combating violence against women and domestic violence. The Directive explicitly criminalises the most widespread forms of gender-based online violence, such as the non-consensual sharing of intimate images, deepfakes, cyberstalking, online harassment, and online incitement to violence and hatred based on gender. It also aims to ensure that publicly accessible illegal content online is promptly removed, and it provides stronger protection and support for survivors of any form of violence against women and domestic violence. EU Member States are urged to transpose the Directive into national law swiftly.