A proposed new law designed to fight online misinformation is little comfort to ‘Yes’ campaigners who are horrified by what is being spread on social media. By Martin McKenzie-Murray.
The fight against disinformation on the Voice

A fortnight ago, a senior “Yes” campaigner expressed to The Saturday Paper their acute anger at the volumes of misinformation and disinformation about the Voice. What they described – and it was a recurring complaint among others – was a kind of endlessly replenishing swamp of lies, distortions and racist abuse propagated on social media. It was harmful, exhausting and publicly confounding, they said, and it obscured their own message and arguments. There seemed, they added, to be precious little will to regulate it.
It was a timely complaint – the federal government is collecting feedback on the draft of the Communications Legislation Amendment (Combatting Misinformation and Disinformation) Bill, which would enhance the powers of the media watchdog, the Australian Communications and Media Authority. The bill proposes empowering ACMA to design and enforce an industry standard for the regulation of misinformation and disinformation on digital platforms, to request data pertaining to misinformation and disinformation from media companies, and to impose substantial fines on companies for code violations.
While Communications Minister Michelle Rowland said the proposed laws would make Australia “safer”, the opposition warned against overreach. Naturally, defining precisely what constitutes harmful expression will be difficult and contentious.
To the senior “Yes” campaigner battling the swamp, the proposed legislation was sensible, but, even if passed, they said it would unfortunately come too late for the referendum and its public debate.
This week, Meta – the parent company of Facebook, Instagram and the recently launched Threads – announced it was improving its capacity to detect maliciously false information, as well as partnering with fact-checking organisations. In a statement, the director of public policy for Meta Australia, Mia Garlick, said: “Meta has been preparing for this year’s Voice to Parliament Referendum for a long time, leaning into expertise from previous elections. We will be using a comprehensive strategy to combat misinformation, voter interference, and other forms of abuse on our platforms … We’ve also improved our AI so that we can more effectively detect and block fake accounts, which are often behind this activity.”
Andre Oboler is a former senior lecturer at the La Trobe Law School, and the founder of the Online Hate Prevention Institute. He says Meta is “going some way in responding to an obvious problem” but there are qualifications.
“It’s a positive thing, and of all the platforms, Meta has the most experience handling similar issues. The concern is the content that is sitting in a grey area between political commentary and racism, and that’s very hard to make a call on for anybody. The overt racism is easier to deal with.
“Regarding the Voice, what concerns me is the link between media content and its promotion on social media and the [public] responses to those. There’s a responsibility not just for tech companies, but mainstream media, to moderate.”
He also noted the “steep rise in hate speech” on Twitter, which reflects “a permissiveness there, and a lack of effectiveness in automated removal. Some work we’ve done showed a reasonable response rate [at Twitter] when things were reported, but it’s relying on the public to take that action.
“My main takeaway is that there’s insufficient investment in monitoring what’s going on. And there are bigger consequences for democracy. A democracy requires its citizens to be able to make a decision on real information – and not a whole ecosystem filled with disinformation.”
One of Meta’s fact-checking partners is RMIT FactLab, of which Sushi Das is assistant director. FactLab’s mission is the debunking of online misinformation and disinformation. “What we have seen, in the lead-up to the Voice referendum, is an uptick in misinformation and disinformation,” Das says. “We generally tend to see an uptick at what I call ‘democratic moments’ – so that’s federal elections, state elections, and now the referendum. So we were expecting this uptick, and it’s exactly what’s happening. There is a lot of it.”
Das says the most popular disinformation – the stuff that is shared most quickly and broadly – is content designed to provoke outrage. “They all contain words or pictures that are designed to create outrage, or provoke a strong emotional response like anger or disgust,” she says. “And also, the worst types of disinformation tend to be very polarising. It’s either this or it’s that – they don’t engage in nuance or subtlety. That kind of disinformation tends to spread very fast, because it’s easy to digest.”
Das says Meta provides funding and software tools but doesn’t dictate what kinds of content the fact-checkers scrutinise. She says that, given the profound volume of misinformation and disinformation online, her team is guided by four central questions to help prioritise its work – a kind of triaging of the internet’s sprawling content.
“The first question is, is this content verifiable? Secondly, is this content spreading far and spreading fast? Thirdly, is the content of importance – national importance or local importance? And fourthly, and a really important question: what is the community harm if this piece of content is left unchecked?
“Those are probably the four crucial things we make an assessment about. And what we’re finding at the moment is that there is a lot of misinformation and disinformation that is coming from ‘No’ supporters of the referendum. That’s what we’ve seen and we’re seeing less of it from the ‘Yes’ side at the moment.”
It was RMIT’s FactLab that exposed a curious bit of misinformation propagated by the “No” campaign group Fair Australia, and repeated in posts by prominent “No” advocates Nyunggai Warren Mundine and shadow minister for Indigenous Australians Jacinta Nampijinpa Price. In May, Fair Australia’s website, as well as the Mundine and Price posts, carried an image of a Millwarparra man from the Northern Territory named Stewart Lingiari, and words attributed to him stating that he was opposed to the Voice. The Mundine and Price posts also stated the man was the grandson of the famed Indigenous activist Vincent Lingiari.
This apparent coup de grâce quickly fell apart. Stewart Lingiari was neither Vincent Lingiari’s grandson nor an opponent of the Voice – he had yet to make up his mind. The line attributed to him, Lingiari said, was given to him by a cameraman who asked him to read it. Lingiari also said the whole episode was humiliating, and that he had not given permission for his image to be used.
Das says Stewart Lingiari told her team of fact-checkers he was asked to come to Canberra and meet Senator Price, Mundine and Peter Dutton. “He was part of a group of I think nine people from his community who went to Canberra to meet these politicians. And he was under the impression that he was going up there to talk about community issues ... the conversation – as he tells it to us – quickly turned to the Voice,” Das tells The Saturday Paper. “He tells us that he was provided with a quote and asked to say that quote. And he then said that quote and was photographed and videoed. That quote was, ‘I don’t want you to look at me differently. That’s why I’m voting No’.”
Fair Australia is run by the conservative lobby group Advance, formerly Advance Australia, which was established in 2018 as a kind of counterweight to GetUp!, and it’s not the first time RMIT’s FactLab has caught the group propagating misinformation. FactLab found that a slew of Facebook advertisements Advance paid for late last year – in which it argued the Voice would create “one race of people with special rights and privileges” – contained “false information”.
Advance’s advisory board includes former prime minister and high-profile “No” campaigner Tony Abbott. Guardian Australia reported this week that Advance had ties to Australian and United States-based conservative Christian organisations. It has become notorious for aggressively mischievous and misleading campaigns, and has been found on more than one occasion by the Australian Electoral Commission to have breached electoral rules. During last year’s federal campaign, Advance blitzed the ACT with placards suggesting the independent candidate, now senator, David Pocock was a secret Greens member – a kind of Trojan candidate. The AEC rebuked Advance for this. “The dealings with Advance during the campaign were a nightmare, really highlighting the glaring gaps in our regulation of misinformation and we see the exact same thing happening now with the Voice,” a spokesperson for Pocock told The Saturday Paper.
And last month Advance was behind a now-infamous advertisement that ran in The Australian Financial Review featuring a cartoon of Wesfarmers chairman Michael Chaney, his daughter, member of federal parliament Kate Chaney, and “Yes” campaigner Thomas Mayo – seemingly dancing before the chairman for money. Arguably, it invoked the grotesque Jim Crow parody – certainly the former New South Wales treasurer, state Liberal MP Matt Kean, thought so. “The racist trope of Thomas Mayo in today’s full page AFR ad has no place in Australian politics,” he tweeted. “It’s a throwback to the Jim Crow era of the Deep South.” Nine Entertainment apologised for running the ad, though a week later the AFR ran an article defending it, quoting Mundine as saying, “There is nothing racist about it. It is factual. Where is the racism? Every time someone disagrees with the left, they always whinge that it is racist.”
And there was no contrition from Advance. Their signature is to double down. “There it is again – the Yes campaign elites playing the race card straight off the top of the deck. Matt Kean can keep his elitist Sydney views to himself,” they said.
Sushi Das says that, given the volumes of misinformation and disinformation, the measure of success is humble. “It’s just the tip of the iceberg,” she says. “There’s a lot more out there than we can actually deal with. But that’s not necessarily our primary concern, to try and take down or point out every single bit of misinformation out there, and try and change public opinion en masse. We’re not trying to do that. What we’re trying to do is raise awareness in the community that social media platforms are often used to spread misinformation and disinformation.
“Misinformation and disinformation can be really harmful. It’s not just an online thing, it can create real-world harm. It can create harm in terms of health – you saw what happened with vaccinations and Covid. It can create harm in terms of finances – there are financial scams out there, that are pushed about by social media. And it can create harm to democratic institutions.”
This article was first published in the print edition of The Saturday Paper on July 15, 2023 as "The truth is out there".