How the Big Tech business model turbo-charges the misogynistic manosphere
(06/12/23)

Natasha Parker, Director of Online Climate

Estimated reading time 5 minutes

It’s a year since online influencer Andrew Tate was first arrested in Romania. Since then, there’s been a shocking surge in the volume of insidious and outright misogynistic content that children are exposed to online. Misogyny is nothing new, but the power of Big Tech algorithms to spread extreme content is.


Andrew Tate is an ex-Big Brother contestant turned online influencer, recently charged with rape and human trafficking in Romania, who is famous for his overtly misogynistic views. But you might not realise just how far-reaching his influence has become. A 2023 survey by Hope Not Hate found that 8 in 10 boys aged 16 to 17 had seen material from Andrew Tate. That’s more than the proportion who had heard of Prime Minister Rishi Sunak.


Andrew Tate isn’t just well known, he’s also well liked. Almost half (45%) of young men aged 16 to 24 had a positive opinion of him, and when asked why they like him, most said they thought Mr Tate “wants men to be real men” or that “he gives good advice”. Shockingly, a survey by Internet Matters found that over half of younger dads have a positive impression of the influencer too.


Tate is part of a wider “manosphere” of influencers proliferating across social media. Emboldened by Tate’s success, a slew of podcasts and influencers have sprung up, united by a misogynistic, male supremacist worldview, in which men must defend their right to power against the feminisation of society. This content ranges from old-fashioned opinions on masculinity and gender roles, to outright calls for violence against women and often strays into racist, homophobic, and anti-trans messages.


Boys and young men don’t usually enter the manosphere because of a hatred of women. They find it searching for advice about fitness, nutrition, finances, or dating. They click on eye-catching content promoting an aspirational lifestyle of money, muscles, girls, and fast cars – but algorithms quickly start serving them more harmful content.

Tate-filled feeds


A recent Observer investigation set up a new account as a teenage boy, and revealed how TikTok’s algorithms push misogynistic content to young men. Without “liking” or searching for any content proactively, the account was served videos of Andrew Tate after simply watching content aimed at men, including videos about men’s mental health and comedy clips.


After watching two of Tate’s less extreme videos, the account was recommended more, including clips of him expressing misogynistic views. The next time the account was opened, the first four posts were by Tate. When opening the app again a week later, the account was flooded with Tate content, with eight out of the first 20 videos being of him. This active promotion of his content through algorithms helps to explain why videos of Tate have been watched over 11.6 billion times.


Tate’s content, and content like it in the manosphere, poses profound risks to the boys and young men who consume it. For example, Tate’s dangerous claims that ‘real men don’t cry’, that mental illness makes people ‘weak’, that depression ‘isn’t real’, and that real men don’t eat pose a real threat to boys’ mental and physical health.

Wider harms


The proliferation of this content is harming girls and women too. The 2023 Girlguiding survey found that the number of 13–21-year-old girls who have received sexist comments online has more than doubled since 2018 (57% compared to 24%). And online misogyny is spilling offline too. At school, 69% of girls said boys have made comments about girls and women that they would describe as ‘toxic’. More than two in five girls (44%) revealed boys at their school have made comments about girls and women that have made them feel scared for their safety.


Female teachers are also feeling the consequences. A recent survey of more than 1,500 female teachers showed that 72% had been a victim of misogyny in their school.


In a House of Lords debate about the new Online Safety Act, Baroness Kidron said that the impact of algorithmic recommendations had “rocked our schools as female teachers and girls struggle with the attitudes and actions of young boys, and has torn through families, who no longer recognise their sons and brothers”. She added: “To push hundreds of thousands of children towards Andrew Tate for no reason other than to benefit commercially from the network effect is a travesty for children and it undermines parents.”


So, what can we do?


Ofcom has recently been granted new powers to enforce the Online Safety Act. In spring 2024 it will consult on measures to protect children from encountering harmful content.


Global Action Plan will argue that such measures must include protection from the algorithmic recommender systems that are driving young people to engage with content, such as misogynistic content, that is causing real harms both on- and offline. Join our fight against the harms of social media to protect ourselves and our children.

Why Global Action Plan is working on misogyny


Global Action Plan works to make the connection between what is good for our lives and what is good for the planet. Misogyny is not only harmful to people, but harmful to the planet through the consumerist and unequal values it espouses. Social media needs to be safer by design for the sake of our lives and our planet.


Sign up to hear more about Global Action Plan’s Safer Socials campaign, and get involved in collective action to hold Big Tech to account. 

We want to hear from families whose children have been affected by the harms of online misogyny. Our campaigning work in this area is gathering pace, and we want decision makers to hear how the lives of young people are being impacted today. Please help us change the system by sharing your story: email us at [email protected]