Venus Bleeds stands in his living room, gesturing wildly at his phone screen while images and headlines about Gaza flash behind his head.
A Lebanese musician based in Paris, he has gone viral on TikTok with green screen-style videos commenting on the Hamas-Israel war. Several have amassed between half a million and 800,000 views.
“I'm angry, distressed, terrified…and I'm just sharing that,” he told The Feed.
Venus Bleeds is a Lebanese musician whose videos about the Hamas-Israel war are gaining traction on TikTok. Source: Supplied
A few of his videos have been removed from TikTok – leading him to suspect social media companies may be hiding his pro-Palestinian content.
“I see a lot of people telling me in the comments, ‘Your videos are not showing up on my For You page’,” he said.
Venus Bleeds says a few of his videos were automatically removed after receiving strikes from TikTok, but most have been restored. Source: Supplied
“What's terrifying for us on the internet is that as much as we thought we had decentralised power, we are just puppets in their algorithms.”
What is shadowbanning?
Shadowbanning is when social media companies hide people or content, or reduce the ability to find them – without notifying the affected person.
If you’ve ever typed a user’s name into a search bar and couldn’t find them in the results, or noticed their posts appearing less often in your feed, they may have been shadowbanned.
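Mechanically, this kind of soft suppression is easiest to picture as a ranking penalty rather than outright removal. The sketch below is a minimal illustration, not any platform’s actual code – the account set, field names and penalty factor are all assumptions.

```python
from dataclasses import dataclass

@dataclass
class Post:
    author_id: str
    base_score: float  # relevance score from the feed-ranking model

# Hypothetical set of accounts flagged for "visibility reduction".
SHADOWBANNED_AUTHORS = {"user_123"}

def feed_score(post: Post) -> float:
    """Rank a post for the feed; flagged accounts are downranked,
    not removed, and the author is never notified."""
    penalty = 0.1 if post.author_id in SHADOWBANNED_AUTHORS else 1.0
    return post.base_score * penalty

# A shadowbanned author's posts still exist, but they sort far below
# everyone else's and quietly vanish from most feeds.
posts = [Post("user_123", 0.9), Post("user_456", 0.5)]
ranked = sorted(posts, key=feed_score, reverse=True)
```

Because nothing is deleted and no notice is sent, the only visible symptom is the one Venus Bleeds’ followers describe: posts that quietly stop appearing in feeds and search results.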
Marten Risius, an expert in content moderation at the University of Queensland, said very few platforms will admit to this practice.
“It's a very politically charged term, such as ‘fake news’…what they will say is they do some type of…’visibility reduction’.”
Marten Risius researches content moderation. Source: Supplied
He said social media companies regularly tweak their algorithms during times of crisis. The algorithms detect unwanted content by scanning posts for keywords and comparing images against a database of known problematic material, such as terrorist content.
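In rough terms, that two-pronged screen can be sketched in a few lines of Python. The keyword list, hash database and function below are illustrative assumptions only – production systems rely on machine-learned classifiers and perceptual image hashes that survive re-encoding, not hard-coded terms or exact digests.

```python
import hashlib

# Illustrative placeholders: real platforms use large, frequently
# updated keyword lists and shared industry hash databases.
FLAGGED_KEYWORDS = {"example_banned_term", "another_flagged_phrase"}
KNOWN_BAD_IMAGE_HASHES = {"0" * 64}  # placeholder digest

def scan_post(text: str, image_bytes: bytes | None = None) -> bool:
    """Return True if a post should be routed for review or removal."""
    # 1. Keyword scan: flag posts containing any listed term.
    lowered = text.lower()
    if any(keyword in lowered for keyword in FLAGGED_KEYWORDS):
        return True

    # 2. Image matching: compare a digest of the image against a
    # database of known problematic content. An exact SHA-256 digest
    # is used here only for simplicity.
    if image_bytes is not None:
        digest = hashlib.sha256(image_bytes).hexdigest()
        if digest in KNOWN_BAD_IMAGE_HASHES:
            return True

    return False
```

Even in this toy form, the failure mode Risius describes is visible: a keyword list cannot distinguish a post praising violence from one reporting on it, so legitimate commentary that uses the same words can be swept up.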
Venus Bleeds, who maintains his Instagram Stories did not breach community guidelines, has noticed a drop-off in engagement since he started speaking out about the conflict.
His Stories, which usually get between 200 and 300 views, fell to 100 views.
Other users have spoken out about their Stories potentially being deprioritised – like this makeup influencer with over 250,000 followers.
This popular influencer noticed their Instagram Stories were receiving fewer views than usual after posting about the war instead of makeup content. Credit: Instagram
As social media companies are forced to make rapid decisions with huge amounts of data, Risius said the algorithms sometimes don’t work as intended.
“They might go ahead and just say, ‘Okay, well what are posts that oftentimes are associated with harmful…content with victims and hate and whatnot?’ And then they might say, well, those are…pro-Palestinian content.”
This video was restored after TikTok agreed it did not violate community standards. Source: Supplied
He said content moderation is imperfect by nature.
“If you give them two months to make that decision…then they might be able to come up with a very, very bulletproof and equal opportunity algorithm.”
Venus Bleeds is concerned social media algorithms may be skewing online discourse.
“[There is a] delusion that creators online have the freedom to speak about whatever they want, when in reality it's a biased algorithm,” he said.
“It's making people believe that they're getting real information from the internet, from credible sources. But in reality, it's not really that.”
Meta and TikTok deny censorship
A spokesperson for Meta, which owns and operates Facebook and Instagram, said the company is trying to stop the spread of harmful content, not targeting pro-Palestinian viewpoints.
“Our policies are designed to keep people safe on our apps while giving everyone a voice. We apply these policies equally around the world and there is no truth to the suggestion that we are deliberately suppressing voice,” the spokesperson told The Feed.
“However, content containing praise for Hamas, which is designated by Meta as a Dangerous Organisation, or violent and graphic content, for example, is not allowed on our platforms.”
Meta has blamed lower Instagram Stories views on a recent bug, which stopped Stories from showing up properly.
“This bug affected accounts equally around the globe – not only people trying to post about Israel and Gaza – and it had nothing to do with the subject matter of the content,” the company states on its website.
TikTok has also rejected the idea that it is censoring pro-Palestinian content.
“We absolutely deny this, we moderate based on our Community Guidelines,” a spokesperson told The Feed.
“Since the brutal attack on October 7, we've continued working diligently to remove content that violates our guidelines. To date, we've removed over 500,000 videos and closed 8,000 livestreams in the impacted region for violating our guidelines.”
Both Meta and TikTok said they’ve recently stepped up measures to remove content that promotes violence, hate and misinformation.
Political bias on social media
Tim Graham is an associate professor in digital media at the Queensland University of Technology, who’s studied online disinformation during the Russia-Ukraine war.
He’s been observing the social media discourse on the Hamas-Israel war (though he’s yet to conduct an in-depth study).
“I get the sense that Israel is being privileged in the digital discourse, but it's not clear to what extent that is just public sentiment and the fact that I look at Western platforms,” he said.
Graham said a skew is particularly evident on X (formerly known as Twitter), where users who pay for a premium subscription are given a boost in engagement.
“So many of these verified accounts often happen to be pro-US,” he said.
“Much of what gets to the top of the list, what gets filtered into the ‘For You’ feed on X…does tend to be pro-Israel, possibly partly because there's so many hundreds of thousands of pro-MAGA Republican boutique accounts.”
Graham doesn’t think the US government actively interferes with US-based platforms like X.
But he said the discourse on X has changed ever since it was bought last year by Elon Musk, who has “unilateral, non-transparent, unaccountable control” over the platform.
“Even though I don't think there is regulatory or state intervention in moderation, there is a political intervention.”
Venus Bleeds says he won’t stop posting videos about the situation in Gaza.
“This isn't some…game on virality. This is really about a really serious issue that a lot of people are getting damage from,” he said.
“This will affect the entire world for the future.”