This musician says his pro-Palestinian posts were banned. Is social media being censored?

As the conversation around the Hamas-Israel war dominates social media, some content creators suspect they’re being deprioritised. But Meta and TikTok say they’re just trying to stop the spread of misinformation.

A man holds a smartphone to his face, staring into it. To his left is a crowd of protestors holding Palestinian flags; to his right, an error message that reads 'Video Removed'.

Some social media users suspect they are being shadowbanned for making pro-Palestinian content. Source: SBS

Venus Bleeds stands in his living room, gesturing wildly at his phone screen while images and headlines about Gaza flash behind his head.

A Lebanese musician based in Paris, he has gone viral on TikTok with green screen-style videos commenting on the Hamas-Israel war. Several have amassed between half a million and 800,000 views.

“I'm angry, distressed, terrified…and I'm just sharing that,” he told The Feed.
A young man in a houndstooth-patterned vest stares solemnly into the camera
Venus Bleeds is a Lebanese musician whose videos about the Hamas-Israel war are gaining traction on TikTok. Source: Supplied
“I am not promoting violence…just stating and giving my honest feelings about it.”

A few of his videos have been banned on TikTok – leading him to suspect social media companies may be hiding his pro-Palestinian content.

“I see a lot of people telling me in the comments, ‘Your videos are not showing up on my For You page’,” he said.
A screenshot of three notification messages on a phone. Two of them state "new strike added".
Venus Bleeds says a few of his videos were automatically removed after receiving strikes from TikTok, but most have been restored. Source: Supplied
Venus Bleeds believes he and other content creators are also being 'shadowbanned' on Instagram – with far fewer people viewing their stories than usual.

“What's terrifying for us on the internet is that as much as we thought we had decentralised power, we are just puppets in their algorithms.”

What is shadowbanning?

Shadowbanning is when social media companies hide people or content, or reduce the ability to find them – without notifying the affected person.

If you’ve ever typed the name of a user into a search bar and couldn’t find them in the results, or their posts are showing up less frequently in your feed, they may have been shadowbanned.

Marten Risius, an expert in content moderation at the University of Queensland, said very few platforms will admit to this practice.

“It's a very politically charged term, such as ‘fake news’…what they will say is they do some type of…‘visibility reduction’.”
A man in a blue suit smiles into the camera, behind him is an out-of-focus sandstone building
Marten Risius researches content moderation. Source: Supplied
Risius said there are good reasons behind this type of moderation – including protecting users from spam, misinformation and harmful communities.

He said social media companies regularly tweak their algorithms during times of crisis. The algorithms detect unwanted content by scanning posts for keywords and comparing images against a database of known problematic material, such as terrorist content.
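To picture how this kind of automated detection works, here is a minimal sketch in Python. It is illustrative only, not any platform's actual system: the keyword list, hash database and function names are hypothetical, and real moderation pipelines rely on large curated term lists, machine-learning classifiers and perceptual (rather than exact) image hashing.

```python
# Illustrative sketch only; not Meta's or TikTok's actual moderation code.
# It demonstrates the two techniques described above: keyword scanning of
# post text and comparing image hashes against a database of known content.
import hashlib

# Hypothetical placeholder data for illustration.
FLAGGED_KEYWORDS = {"example_banned_phrase", "another_flagged_term"}
KNOWN_BAD_IMAGE_HASHES = {"0" * 64}  # placeholder SHA-256 digest

def text_is_flagged(post_text: str) -> bool:
    """Return True if the post text contains any flagged keyword."""
    lowered = post_text.lower()
    return any(keyword in lowered for keyword in FLAGGED_KEYWORDS)

def image_is_known_bad(image_bytes: bytes) -> bool:
    """Return True if the image's hash matches the known-content database."""
    return hashlib.sha256(image_bytes).hexdigest() in KNOWN_BAD_IMAGE_HASHES

def should_restrict(post_text: str, images: list[bytes]) -> bool:
    """Flag a post for removal or reduced visibility if either check fires."""
    return text_is_flagged(post_text) or any(image_is_known_bad(img) for img in images)
```

A blunt filter like this has no sense of context, which is one way lawful commentary can be swept up alongside genuinely harmful material – the kind of demotion by association Risius describes below.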

Venus Bleeds, who maintains his Instagram stories did not breach community guidelines, has noticed a drop-off in engagement ever since he started speaking out about the conflict.

His Stories, which usually get between 200 and 300 views, fell to 100.

Other users have spoken out about their Stories potentially being deprioritised – like this makeup influencer with over 250,000 followers.
A screenshot showing a person's Instagram story with 14.1k views and another with 16.5k views.
This popular influencer noticed their Instagram Stories were receiving fewer views than usual after posting about the war instead of makeup content. Credit: Instagram
Venus Bleeds has appealed his banned TikTok videos and most have been restored, after TikTok agreed they did not breach community guidelines.

As social media companies are forced to make rapid decisions with huge amounts of data, Risius said the algorithms sometimes don’t work as intended.

“They might go ahead and just say, ‘Okay, well what are posts that oftentimes are associated with harmful…content with victims and hate and whatnot?’ And then they might say, well, those are…pro-Palestinian content.”
A screenshot of a message stating "content restored", with details of a TikTok video below.
This video was restored after TikTok agreed it did not violate community standards. Source: Supplied
“They might not make that explicit decision, but just by virtue of them tweaking the algorithm, that kind of content by association might be systematically demoted.”

He said content moderation is imperfect by nature.

“If you give them two months to make that decision…then they might be able to come up with a very, very bulletproof and equal opportunity algorithm.”

Venus Bleeds is concerned social media algorithms may be skewing online discourse.

“[There is a] delusion that creators online have the freedom to speak about whatever they want, when in reality it's a biased algorithm,” he said.

“It's making people believe that they're getting real information from the internet, from credible sources. But in reality, it's not really that.”

Meta and TikTok deny censorship

A spokesperson for Meta, which owns and operates Facebook and Instagram, said the company is trying to stop the spread of harmful content, not targeting pro-Palestinian viewpoints.

“Our policies are designed to keep people safe on our apps while giving everyone a voice. We apply these policies equally around the world and there is no truth to the suggestion that we are deliberately suppressing voice,” the spokesperson told The Feed.

“However, content containing praise for Hamas, which is designated by Meta as a Dangerous Organisation, or violent and graphic content, for example, is not allowed on our platforms.”

Meta has blamed lower Instagram Stories views on a recent bug, which stopped Stories from showing up properly.

“This bug affected accounts equally around the globe – not only people trying to post about [the conflict] – and it had nothing to do with the subject matter of the content,” the company states on its website.

TikTok has also rejected the idea that it is censoring pro-Palestinian content.

“We absolutely deny this, we moderate based on our Community Guidelines,” a spokesperson told The Feed.

“Since the brutal attack on October 7, we've continued working diligently to remove content that violates our guidelines. To date, we've removed over 500,000 videos and closed 8,000 livestreams in the impacted region for violating our guidelines.”

Both Meta and TikTok said they’ve recently stepped up measures to remove content that promotes violence, hate and misinformation.

Political bias on social media

Tim Graham is an associate professor in digital media at the Queensland University of Technology, who’s studied online disinformation during the Russia-Ukraine war.

He’s been observing the social media discourse on the Hamas-Israel war (though he’s yet to do an in-depth study).

“I get the sense that Israel is being privileged in the digital discourse, but it's not clear to what extent that is just public sentiment and the fact that I look at Western platforms,” he said.

Graham said a skew is particularly evident on X (formerly known as Twitter), where users who pay for a premium subscription are given a boost in engagement.

“So many of these verified accounts often happen to be pro-US,” he said.

“Much of what gets to the top of the list, what gets filtered into the ‘For You’ feed on X…does tend to be pro-Israel, possibly partly because there's so many hundreds of thousands of pro-MAGA Republican boutique accounts.”

Graham doesn’t think the US government actively interferes with US-based platforms like X.

But he said the discourse on X has changed ever since it was bought last year by Elon Musk, who has “unilateral, non-transparent, unaccountable control” over the platform.

“Even though I don't think there is regulatory or state intervention in moderation, there is a political intervention.”

Venus Bleeds says he won’t stop posting videos about the situation in Gaza.

“This isn't some…game on virality. This is really about a really serious issue that a lot of people are getting damage from,” he said.

“This will affect the entire world for the future.”

Published 21 October 2023 6:36am
Updated 25 October 2023 11:41am
By Jennifer Luu
Source: SBS

