By Charlotte Webster
Ofcom will “name and shame” online platforms that fail to adequately protect women and girls from online abuse, including sexism and misogyny. The regulator is issuing guidance rather than an outright “ban” on sexism itself, but it can use its significant enforcement powers under the Online Safety Act against related illegal content.
The guidance was developed in consultation with victims, survivors, safety experts, women’s advocacy groups and organisations working with men and boys.
Research found female footballers were 29 per cent more likely to be targeted by online abuse than their male counterparts during recent World Cups.
The measures regarding sexism and misogynistic content are part of new industry guidance and not, for the most part, new laws in themselves (much of the content is not illegal). However, the guidance sets a new standard for online safety that platforms are expected to follow.
Ofcom Chief Executive Dame Melanie Dawes has stated that the regulator will publicly report on individual companies’ progress (or lack thereof) starting in summer 2027, which will effectively “name and shame” those falling short.
The regulator has urged technology firms to “step up” in their efforts to tackle trolling and toxic online abuse in new guidance – but some organisations working to protect women and girls have complained it might not work unless it is made mandatory.
While the guidance on misogyny is not a legal requirement, it complements the legally binding duties under the Online Safety Act 2023. Ofcom has the power to issue substantial fines (up to 10% of global annual revenue) or potentially block access to services in the UK for non-compliance with the Act’s legal requirements, such as tackling illegal content (e.g., intimate image abuse, stalking, and harassment). The guidance focuses on specific harms disproportionately affecting women and girls like online misogyny and content that normalises sexual violence.
These harms include intimate image abuse, covering both non-consensual sharing and AI-generated explicit content.
Ofcom said the guidance goes beyond legal duties under the Online Safety Act. It encourages firms to introduce prompts asking users to reconsider before posting harmful content; to impose “timeouts” on users who repeatedly target victims; to limit the number of comments or posts a person can make from one account, helping prevent the mass posting of abuse in so-called pile-ons; and to allow users to quickly block or mute multiple accounts at once.
Nearly 70 per cent of boys aged 11-14 have been exposed to online content that promotes misogyny and other harmful views, and 73 per cent of Gen Z social media users have witnessed misogynistic content online. The Revenge Porn Helpline found that 98 per cent of intimate images reported to it were of women, and that 99 per cent of deepfake intimate image abuse depicted women. The fresh measures urge tech companies to enforce limits on the number of responses to posts on platforms such as X, a move Ofcom hopes will reduce pile-ons, where individual users are deluged with abusive replies to their posts.
Liz Kendall has urged the UK’s internet regulator to fully use its powers, warning that Ofcom is at risk of losing public trust over online harms. “People have got to feel their kids are safe,” she said. Other measures raised by Ofcom include platforms using a database of images to protect women and girls from the sharing of intimate images without the subject’s consent – often referred to as “revenge porn”.
Sport England and WSL Football have welcomed the guidance, calling for better protection for sportswomen on social media.
Dame Melanie Dawes, Ofcom’s chief executive, said: “When I listen to women and girls who’ve experienced online abuse, their stories are deeply shocking. Survivors describe how a single image shared without their consent shattered their sense of self and safety. Journalists, politicians and athletes face relentless trolling while simply doing their jobs.
“No woman should have to think twice before expressing herself online, or worry about an abuser tracking her location.
“That’s why today we are sending a clear message to tech firms to step up and act in line with our practical industry guidance, to protect their female users against the very real online risks they face today.
“With the continued support of campaigners, advocacy groups and expert partners, we will hold companies to account and set a new standard for women’s and girls’ online safety in the UK.”