Boys are being targeted with harmful content
Demand a safer internet now
Online content promoting sexist views is on the rise. Often packaged up with aspirational lifestyle goals and pseudo self-help material, it amplifies a negative and narrow view of masculinity that promotes being rich, powerful, in great shape and in control of others.
And it’s being aimed at young teen and tween boys.
One in 10 boys aged 11-14 encounter harmful content within as little as 60 seconds of going online, and over half (52%) are familiar with content from influencers with ties to the manosphere (a term for the network of online communities promoting negative, often misogynistic content). Worryingly, over a third of these boys (39%) say they have a positive view of these influencers.
This isn’t what we raise our boys to believe. When tolerant, compassionate boys begin endorsing offensive, sexist views, something fundamental is going wrong.
"69% of boys have been served content that is negative about women and girls without looking for it"
It’s all too easy to blame ourselves as parents, or blame the boys, but the root of the problem is the lack of safety built into the online platforms that boys frequently engage with.
Algorithms powered by Artificial Intelligence (AI) predict what each of us finds most ‘engaging’ and keep serving it up to keep us hooked, maximising social media ad revenue in the process. What’s most ‘engaging’ is often what’s controversial, conspiratorial or even violent. Manosphere influencers know this and hone their extreme content so that algorithms put it in front of a young and impressionable audience, desensitising them over time to the negative views they’re witnessing.
Being online is a vital part of everyday life and every child deserves to enjoy this in a safe and happy way, exploring the exciting opportunities and benefiting from everything the internet has to offer.
Not enough is being done to protect our children and young people from toxic online content.
This has to change.
The only way to dial down misogyny and negative masculinity online is to rein in and reform the systems that recommend it. The opportunity to build safety into the design of platforms that use AI-powered algorithms has not passed. It is vital we take steps to ensure these are ‘safe by design’.
The next few months will determine how new online safety laws are interpreted, but the companies who profit from these algorithms will be lobbying to maintain the status quo.
That’s why we’re calling on digital regulators to stay strong and compel companies to make their products safe by design.
We can change the online experience for our children and for generations to come.
What are we asking Ofcom to do?
In October 2023, The Online Safety Bill became law as The Online Safety Act. It's now up to Ofcom to draw up codes and guidance to inform the new child safety duties under the Act. We're urging them to prioritise compelling online platforms to assess and mitigate the risk of harm to children caused by their features and functionalities.
Dear Ofcom
The rise of openly misogynistic influencers online and their hold over young boys and men is a deeply worrying trend. More than half of boys (52%) who have heard of influencers with ties to the manosphere have watched or liked content from them. Worryingly, over a third (39%) state they have a positive view of these influencers.
Social media and other online platforms incentivise this type of extreme content because it is ‘engaging’, just like conspiratorial, abusive, or violent content. Influencers in the ‘manosphere’ know this and exploit algorithmic and other design features to ensure their content gets vast reach.
The root of the problem is the lack of safety built into the online platforms that boys frequently engage with. Not enough is being done to protect our children and young people from toxic online content.
The only way to dial down misogyny and negative masculinity online is to rein in and reform the systems that recommend it. The Online Safety Act is a huge opportunity to address this vicious cycle.
As you draw up codes and guidance to inform the new child safety duties under the Act, we urge you to prioritise compelling online platforms to assess and mitigate the risk of harm to children caused by the features and functionalities of a service, as set out in sections 11(6)(b) and 12(8)(b) of the Act.
There will always be unsafe content online, and unsafe content will always be ‘engaging’. That’s why regulators must ensure platforms design their services to prioritise safety ahead of engagement, in line with the ‘safety by design’ aspiration on the face of the Online Safety Act.
Yours sincerely,
The undersigned

Vodafone is supporting Global Action Plan's petition, which asks regulators to ensure tech platforms prioritise user safety in the design of their products and services. Vodafone has created a hub of online resources to support parents and children in their conversations about online safety.
Find out more about why Global Action Plan are campaigning to create Safer Socials.