
'Extremely politicised': How 'very worrying' Voice misinformation spreads online

There are concerns about a breakdown in public debate around the Voice, as X faces questions over removing a reporting function for misinformation. How did we get here?

Published 8 October 2023 3:57pm
By Emma Brancatisano
Source: SBS News
Image: An expert has warned of a "large volume" of misinformation and disinformation circulating during the Voice referendum campaign. (SBS News)
Timothy Graham has analysed mis- and disinformation around the world - from Russia to the United States and Australia - on topics ranging from vaccines to politics.

For Graham, an associate professor in digital media at Queensland University of Technology, the way the discourse surrounding the Voice has evolved is “very worrying”.

"It is extremely politicised in a way that I haven’t seen for other events like this," he told SBS News.

On 14 October, Australians will vote in a referendum on whether to enshrine a Voice in the constitution.
In Graham’s view, what sets this campaign apart from other events such as elections is a "clear focus" on one issue.

"There’s such a complex and long-storied history behind this. It’s one of those events that really brings up all of the cracks and fissures, all of the injustices, inequalities that we see today," he said.

"Everything that is bubbling up historically behind this comes out all at once."
At the centre of it all, he says, citing critical disinformation studies, is race, which “is always a flash point for the spread of false and misleading information”.

"Unfortunately, for Australia and liberal democracies in general who contend with issues around First Nations representation, I think this is a really worrying sign that we cannot seem to be having a reasoned, informed, civil public debate about how we address these issues - and that it seems to be breaking down so quickly."

So, how did we get here?

Misinformation, disinformation and 'a grey zone'

Misinformation is generally defined as false or misleading content or assertions that can be unintentionally spread.

Disinformation, on the other hand, is false or misleading content that is deliberately spread.

Ed Coper is the director of Populares, a communications agency that ran advertising during the last federal election. He has also worked with GetUp and Change.org.

Coper recently advised the federal government on dealing with mis- and disinformation in the lead up to the referendum.
While it’s difficult to quantify in real terms, Coper said, "there’s no doubt that on the Voice, we’re seeing an incredibly large volume of mis- and disinformation".

"It doesn’t take long, if you open up any social media platform and start looking for content on the Voice, to see that it appears the majority of it all looks very similar.

"It is all around the same types of lies and myths about the Voice."

Similar and repeated types of misinformation are a "telltale sign that these are organised networks," Coper said. "We do know that overall, the amount of sentiment on social media platforms around No is significantly higher than Yes."

Graham argues the online space becomes even more muddied by a “grey zone” of information that would not necessarily be called mis- or disinformation, but has a similar effect.

"It is opening up a really big, complex grey area that is ultimately producing doubt, fear and uncertainty," he said.

How Voice misinformation has spread - and been amplified - online

Reset.Tech Australia is a not-for-profit lobby group that pushes for "better policy" to address digital threats to democracy.

The organisation set out to test social media platforms' responses to mis- and disinformation in the lead-up to the referendum by looking at content moderation and advertising approvals.

Under the Australian Code of Practice on Disinformation and Misinformation, signatories - including Meta, TikTok and X - are obligated to "develop and implement measures which aim to reduce the propagation of, and potential exposure of users of digital platforms to, disinformation and misinformation".

According to a Reset.Tech Australia report, published in early September, content that is misleading around electoral processes presents a credible and serious threat to the integrity of Australia’s democratic processes and is considered misinformation under the Code.

Its mid-year "rapid" investigation found a sharp rise in electoral misinformation, according to its executive director, Alice Dawkins.

In July/August, the group analysed approximately 100 pieces of content that it says included false or misleading claims about Australia’s electoral process. The posts had accrued around 30,000 views across TikTok, X (formerly Twitter), and Facebook.

Just a month later, audience views of 50 posts with a "very specific narrative on electoral misinformation" had jumped to around 300,000.

"There has been a really concerning boom in this kind of content, and also the virality of its distribution, especially on platforms like Twitter/X and TikTok," Dawkins said.

Reset.Tech Australia will be publishing an updated content moderation report ahead of voting day.
Graham analysed more than 246,000 tweets about the Voice between March and May this year.

In a preprint paper published online last month, which is yet to be peer-reviewed, he found that activity on X was mainly driven by a "comparatively small but very active core of participants", with the top 100 accounts sending one in ten of all Voice-related tweets.

There were about five times as many 'Yes' tweets as 'No' tweets. Graham found that Yes supporters made a "concerted push early on", with a higher volume of tweets compared to the No camp. But the discourse was "marked by misinformation and conspiracy theories stemming from Vote No campaigners and further amplified by attempts to criticise and fact-check it from the Vote Yes camp".

He found top tweets shared by the No campaign were "characterised by confusion and misinformation about the details of the proposed constitutional amendment" and a "focus on race and racial division to spread fear and uncertainty about the Voice".
Graham observed a "vicious feedback loop" whereby Yes proponents moved from their own campaigning for the referendum to "fighting the No campaign on their own front".

Politicians and media outlets played a role in shaping this discourse, he said, with platforms including X reinforcing and evolving narratives that "these elite actors promote".

'Urgent concerns' over Australians' ability to report misinformation on X

X is also facing questions over appearing to remove a function allowing its users to report election misinformation.

On 27 September, Reset.Tech Australia published an open letter to the platform over "urgent concerns" the change had occurred in recent weeks.

The letter said the reporting categories in Australia include hate speech, abuse, spam, and impersonation - but users are no longer able to select a 'politics' category to flag misleading posts.

"It is extremely concerning that Australians would lose the ability to report serious misinformation weeks away from a major referendum," it said.

"The platform is just letting any sort of election misinformation run wild," Dawkins claimed.

X has been contacted for comment.
On TikTok, Dawkins said the group is noticing "potential signs" of unusual content distribution.

"There’s a small number of accounts we’ve tracked that are very new accounts sharing repurposed misinformation content, and that is getting huge spikes in shares in the very early days of that account," she said.

"It’s unusual, particularly in an Australian social media environment, for new accounts with new content to be hitting an organic reach of that proportion.

"It does lead to questions of possible inauthentic behaviour."
While Graham’s analysis on X found little evidence of bot accounts, he found a "significantly larger proportion" of newly created accounts sharing No content. There is no suggestion that these accounts are fake or inauthentic, but Graham's research paper said "the stark differences in the number of suspicious accounts between the Yes and No campaigns is notable".

Billionaire Elon Musk acquired the platform, then known as Twitter, after the 2022 federal election, and concerns have since been raised over misinformation and hate speech on the platform.

Graham speculates these "suspicious accounts" were recently created due to lack of moderation on the platform since Musk's takeover.

The Australian Electoral Commission (AEC) has set up an online disinformation register, which it says lists the main pieces of disinformation it has discovered regarding the Voice - and the actions it has taken in response.

Commissioner Tom Rogers said a key issue was the response of social media companies, noting there had been a "reduction in platforms’ overall willingness to act".
TikTok's Australian director of public policy Ella Woods-Joyce said its focus during the referendum is to "keep our community safe and protect the integrity of the process, and our platform, while maintaining a neutral position".

"We’ve worked closely with the AEC, and have promptly addressed all of their referendum-related requests to date."

Woods-Joyce said the platform's team of 40,000 trust and safety specialists "exercise vigilance around our community guidelines, which clearly state that we do not allow hate speech or harmful misinformation about civic and electoral processes, regardless of intent".

A spokesperson for Meta, which owns Facebook, told SBS News it has taken "extensive steps to combat misinformation" on its services in the lead-up to the referendum, including "providing one-off grants to support our local fact-checking partners and working with partners to promote education campaigns to help people recognise misinformation and report it".

"We have internal teams from across the company responding in real-time to potentially violating content we see emerging," they said.

"Alongside our internal teams, we appreciate the close working relationship with the AEC and have established channels for the AEC to report content to Meta that may breach our policies or breach Australian electoral laws."

Meta's misinformation policy outlines different categories of misinformation and how each is treated. Under the policy, which is not specific to the Voice, misinformation is removed "that is likely to directly contribute to interference with the functioning of political processes and certain highly deceptive manipulated media". Meta's community standards prohibit fake accounts, fraud and coordinated inauthentic behaviour.

Claims major platforms approved false paid-for ads

Last week, Reset.Tech Australia published the findings of a "small research piece" which found the major social media platforms approved a range of paid-for ads containing what it called explicit electoral misinformation.

The group created dozens of ads under accounts run by its staff and submitted them for approval on Facebook, TikTok and X. This included content suggesting that the date of the Voice referendum was, incorrectly, 31 November, and falsely claiming that the Voice referendum is voluntary and/or a postal survey.

The referendum will be held on 14 October. As stated in the AEC’s disinformation register, referendums are compulsory, like a federal election. The AEC did not run the 2017 same-sex marriage postal survey, which was neither a referendum nor a plebiscite.
Of the 10 mock ads submitted to test TikTok’s approval system, the group said seven were approved, one was rejected and the final two were not reviewed.

Of the 20 ads submitted to Meta, which owns Facebook, all but one were approved. Reset.Tech Australia said these 19 ads were not self-identified as 'political ads,' claiming "Facebook’s system appeared entirely dependent on an advertiser’s self-declarations regarding the nature of advertising, which evidently offers insufficient protection against bad actors".

"X’s (Twitter’s) system did not request self-identification for political ads, nor did their system detect or reject it," the group said. All of the 15 ads submitted to X were approved and scheduled to run, it said.

Reset.Tech Australia stressed it cancelled its mock ads after gaining approval and none of them went live.

"It is simply too easy to propagate electoral misinformation via paid-for ads," the group said.
Meta's spokesperson said the report was "based on a very small sample of ads and is not representative of the number of ads we review daily across the world".

"Our ads review process has several layers of analysis and detection, both before and after an ad goes live. As noted in the report, these ads never went live, and therefore our full enforcement detection technology did not have an opportunity to pick up these ads."

TikTok's Woods-Joyce said the platform "is taking the claims made by Reset Australia seriously and is conducting our own inquiries into the matter".

Dawkins acknowledged the "experiment" had a small sample size and caveats in its approach. But she said it revealed all tech platforms need to do more.

"The ads experiment is revealing that there are issues in their systems and processes. We revealed them in a small, modest, basic way," she said.

"The simple answer is yes, platforms need to do more … but platforms need to be regulatory incentivised to do more."

Reset.Tech Australia argues the Code provides "unclear direction" when it comes to electoral misinformation propagated by paid-for advertising.

SBS News put multiple questions to X on claims included in this story, and received an automated response: "Busy now, please check back later."
