Andrew Tate has been banned from social media. But his harmful content still reaches young men


Last month, controversial influencer Andrew Tate was banned from several social media platforms for violating their policies.

But nearly two weeks after those bans began, platforms are still awash with clips of Tate making derogatory comments about women – underscoring what some media pundits describe as a dangerous system whose algorithms can be manipulated to radicalize young men into holding prejudiced views against women and the LGBTQ community.

And as the Tate case shows, banning controversial figures may actually make the problem worse.

Tate, a former kickboxer, rose to prominence after appearing on the British reality show Big Brother in 2016. He was removed from the show when a video of him appearing to assault a woman with a belt was made public. Tate said the incident was consensual.

More recently, he has gone viral through sound clips shared on platforms like TikTok. These clips feature Tate, often shirtless and wearing sunglasses, making offensive comments about women. In one notable example, Tate says that if a woman dates a man, she “belongs” to him. In another clip, Tate suggests that women in relationships who have their own social media accounts are cheating.

In a video posted to Vimeo on August 23, Tate responded to the bans by saying he had been “unfairly vilified” and that his comments had been taken out of context.

Tate did not respond to a request for comment from CBC News.

From harmless memes to full-fledged misogyny

Content like Tate’s often starts out in a way that seems relatively innocuous, but then slowly becomes more harmful, says Joanna Schroeder, a writer whose work focuses on gender and media representation.

For example, she says, young boys often visit sites like YouTube to search for videos related to Minecraft, a very popular video game. But YouTube’s algorithm will often guess their age and gender – and Schroeder says it could then send them harmful content.

WATCH | Algorithms and their agenda:

How Algorithms Target Young Men

Joanna Schroeder, a writer who focuses on gender and media, explains why social media algorithms target young men and how that targeting can affect what they watch online.

“There are people who want to target that demographic who are starting to show them content that’s getting more and more racy.”

Schroeder says Tate’s appeal is partly due to the way his opinions are framed. The idea that what he says is an “unpopular opinion that no one else will say out loud” might suggest to a young person that it has value, she says.

And since “edgy” content often comes across as something a younger demographic should consider normal – or even find funny – problematic ideas slowly take hold.

An example of this is the Pepe the Frog meme, something that started out as a harmless cartoon frog and turned into a symbol of hate.

Pepe the Frog started out as an apolitical meme but was later adopted by the alt-right movement. (Wikipedia)

It started out as a popular apolitical meme on sites like Myspace and 4chan in the 2000s. But as its popularity grew, it was appropriated by the alt-right movement.

Schroeder says Pepe came to represent “anti-gay” and “anti-women” sentiments. Teenagers might initially perceive such memes as jokes, she says, but over time they can influence how and what young people think.

And clips like Tate’s are a common way to radicalize people, says Ellen Chloë Bateman, a documentary and podcast producer who has studied online radicalization among young men and the incel subculture.

Violence against women is being normalized, she says, becoming embedded in young men’s psyches through images and memes, in what she calls a “culture of intense competition and one-upmanship.”

Schroeder says this can often be seen on TikTok, where videos containing clips from creators like Tate frequently split the screen with footage from games like Minecraft or Call of Duty to keep teenagers engaged.

This screenshot features a TikTok video from controversial creator Sneako paired with Minecraft gameplay. The creators try to attract the attention of young men and teenagers by combining their clips with video games. (@sneako.talks/TikTok)

At this point, she says, some social media algorithms notice the high levels of user engagement — and then start feeding them more “overtly racist” content.

“Algorithms push content that is often extreme. Extreme views, hateful views get a lot of traction on places like YouTube…and TikTok,” Schroeder says.

Enter the “manosphere”

The corner of the internet where these memes – and often outright racist or misogynistic content – circulate is what Bateman calls the “manosphere.”

She describes it as a space where “men’s rights activists, male separatists, nihilists, sexual predators and trolls – who often share their membership with neo-Nazi and alt-right groups – congregate.”

WATCH | The ‘manosphere’: Where incels, trolls and neo-Nazis meet:

What is the “manosphere”?

Ellen Chloë Bateman, a documentary and podcast producer, breaks down what’s known as the “manosphere,” an area of the internet where extremist groups often congregate and target young men.

“What unites them all is an extreme anti-feminist worldview,” says Bateman.

And alt-right groups often use this space to target young, impressionable men, she says.

Where the social media bans come in

Social media companies say they are actively working to remove this type of content – as studies have shown that online hate speech is correlated with an increase in physical violence and hate crimes.

In Tate’s case, TikTok, Facebook and Instagram have removed his content.

A TikTok spokesperson said “misogyny is a hateful ideology that is not tolerated on TikTok,” and it continues to investigate other accounts and videos that violate its policies.

The spokesperson also said that TikTok was looking for ways to “strengthen enforcement” against this type of harmful content.

This includes partnering with UN Women and other non-governmental organizations seeking to end violence against women and girls to launch a new in-app hub to educate users about gender-based violence.

Bateman says such partnerships are essential for social media spaces to become safer and more educational, especially for young people.

Twitter has also taken action against controversial creators. The platform has issued temporary bans to creators like Jordan Peterson, Matt Walsh and Steven Crowder. (Each creator was then allowed to return to the app.)

But Schroeder says bans can sometimes be counterproductive. In Tate’s case, they may actually have helped him in some ways.

“The bans only draw more attention to him,” she said. “It gave him a really big microphone.”

Moving to other platforms

Bateman agrees, pointing out that these creators often move to other platforms, such as Reddit, Gab, Telegram and Discord, to post their content.

She says some of these platforms are actually harder to monitor because of their closed group structures or registration requirements, which make it more difficult to study and track content. An incel subculture site, which promotes misogyny and violence, has more than 17,000 users, she found.

“It’s such a complicated online world. It’s fluid…it’s moving. It’s spreading and these groups are basically interconnecting in one big cesspool of hate.”
