Deepfakes and falsehoods are legal in political advertising. Not everyone is on board with fixing it
The contrasting outcomes of two misleading assertions in 2022 reinforce what many in Australia have long suspected: that when it comes to the facts, there is one set of rules for politicians and another for everyone else.
Two years ago, the Federal Court found electronics giant Samsung Australia had engaged in misleading and deceptive conduct over its advertising blitz implying its flagship Galaxy mobile handsets were more water-resistant than they actually were.
In April that same year, while the federal election campaign was in full swing, an ad from a Labor Party politician baselessly claimed that the Liberal Party would expand the Cashless Debit Card scheme to restrict the purchases of pensioners. It was part of a long-running campaign of similar ads.
Samsung was fined $14 million. The Labor Party was never prosecuted for the misleading ads because they weren't against the law.
In Parliament House, there is a push to legislate more integrity into political discourse ahead of the next federal election.
But some of those who could be held accountable under these proposed reforms have remained noticeably silent.
Falsehoods totally legal in political advertising
Labor is not the only party to mislead voters in Australian political advertising. Before the 2019 election, for example, the Coalition ran a campaign about a Labor "death tax", with no evidence it was actually party policy.
Outside of the Commonwealth Electoral Act's protections against misleading statements about the actual voting process, no legislation at a federal level prohibits misleading claims like these in political advertising.
This wasn't always the case. In 1984, the Hawke government legislated to ban electoral advertisements during an election period "containing any statement that was untrue or was, or likely to be, misleading or deceptive". The penalty was $1,000 for an individual or $5,000 for a corporation.
After eight months, it was repealed before ever being tested at an election, following concerns voiced by a parliamentary committee.
The idea was canvassed once again after the 2022 election, with a parliamentary committee recommending that laws of this kind be enacted and a new division within the Australian Electoral Commission (AEC) established to administer the laws.
The Coalition members of the committee did not agree with that idea, writing in their dissenting report that it would be impossible for the government or its bureaucracy to adjudicate the truth and implying that the proposal would run counter to free speech.
The AEC itself is not keen on being the body that administers such laws, preferring that a different organisation, or an entirely new one, be given the job.
Electoral Commissioner Tom Rogers told ABC NEWS Verify that "our involvement in that process will damage our neutrality, and it will therefore have an impact on people's perception of the integrity of the vote".
"The issue is that people trust the AEC," Mr Rogers said.
"The first time we have to make a pronouncement that a major party or a candidate is effectively not telling the truth, the supporters of that party will immediately start to potentially distrust the AEC in our actions."
But he added that if parliament were to give the AEC the power, the organisation would carry out its duties "to the very best of our ability".
South Australian model works, expert says
In South Australia, truth in political advertising laws have been in effect for 39 years, and the Electoral Commission of South Australia (ECSA), which decides what is accurate or misleading, has wielded them to force politicians into embarrassing backdowns.
In July 2021, SA Labor leader Peter Malinauskas, then in opposition, was made to take down a Facebook post and concede his office "did not have sufficient evidence to support the statement that the Liberal Government ha[d] a 'secret plan to cut more doctors and nurses from our hospitals'".
Yee-Fui Ng, a law academic at Monash University and author of a recent report on truth in political advertising laws, said the laws enjoyed broad support from across the political spectrum and had changed the face of electoral campaigning there for the better.
"Political parties are extremely careful to make sure that each and every word of their political advertising is accurate and not misleading," she said.
Despite protestations at a federal level about restrictions on free speech, Dr Ng's interviews with political participants in SA found no "chilling" effect from truth in political advertising laws.
"They can still have their heated political debate and the attack ads and all those things that go on in the heated kind of dirty electoral campaign," she said.
The AEC's Mr Rogers said he believed ECSA was doing a great job, but questioned whether the AEC could do the same job at a federal level.
"The stakes are huge. The number of people involved, the sub-jurisdictions involved, the topics involved are very complex," he said.
"Therefore it becomes … a logistically more complex event than a state event."
Can legislation meet the rise of the machines?
Fast forward to 2024 — the rise of machine-generated content is now outrunning these regulations. Dr Ng's report found that even SA's laws were not fit for the age of AI-generated content.
Independent senator David Pocock has experienced first-hand the impact of misleading images in political advertising.
During the 2022 election, conservative lobby group Advance posted photoshopped advertisements around Canberra that showed the then-candidate smiling and opening his shirt to reveal a Greens logo.
The AEC found this breached an existing section of the electoral act that made it an offence to "mislead or deceive an elector in relation to the casting of a vote".
Senator Pocock argues generative AI now offers more convincing opportunities to deceive, ones that will not be covered by existing legislation.
To demonstrate this, he recently commissioned two deepfake videos of Prime Minister Anthony Albanese and Opposition Leader Peter Dutton. He says they cost him only "a couple of thousand dollars" and took around a week to make.
"It's a very real threat to our democracy … I don't think there is any case for them at the moment," he told ABC NEWS Verify.
"Another one that we really need to act on, and we've seen this happen in the US, is the use of AI robocalls where you have a voice clone hooked up to a [large language model], and that's actually calling people and interacting.
"I think that has real potential to again affect our democracy and I just can't see it being used for good."
What is the government doing about this?
Special Minister of State Don Farrell, who has carriage of electoral reform, was reportedly aiming to legislate truth in political advertising ahead of the next election.
ABC NEWS Verify understands several independents have been briefed by the government, but have not yet seen an exposure draft and remain in the dark about the government's plans.
An interim report from a parliamentary inquiry released this month did not inspire confidence.
Rather than proposing legislation to ban AI-generated content, it recommends the adoption (without specifying by whom) of voluntary codes prohibiting the use of deepfakes during election periods.
The Labor senators on the committee (including Tony Sheldon, who chairs the inquiry) were the only ones not to offer dissenting reports or additional comments, a stance that suggests AI-generated manipulation will be fair game at the upcoming federal election.
Senator Pocock said the delay in legislating was "not good enough".
"They've known about these issues for years and here we are in October before, at the latest, a May election next year and we still haven't even seen a draft."
He described the threat of generative AI to democracy as "urgent" and noted that the parliament was quick to change the electoral act during the pandemic to allow COVID-positive Australians to vote over the phone, for example.
"If they're willing to put the resources into it and make it happen it can happen but unfortunately, I think the status quo, it works for them and I don't think they've quite clocked just how quickly AI is moving.
"Which is frankly ridiculous because we're being told by experts and we're obviously either not listening or just choosing not to act."
A spokeswoman for Senator Farrell did not comment directly on the delay, instead pointing ABC NEWS Verify to a recent speech the minister gave to the McKell Institute.
In the speech, the minister stated that "fundamental reforms to our electoral system" would be introduced "in the coming weeks".
But the speech only addressed reforms in relation to political donations, with truth in political advertising and AI conspicuously absent.
The ACT recently legislated for truth in political advertising. Those laws are being tested for the first time right now, with the ballot closing this Saturday, October 19.
But in larger jurisdictions, these reforms remain elusive.
Dr Ng said a key factor in the long delay in legislating was that politicians were effectively being asked to regulate themselves.
"Every single integrity reform, you will find that reluctance. And that's because we're asking politicians to regulate themselves. It's actually against their incentive to do that.
"Anything that shackles the politicians or the political parties, we will find some level of resistance and that's, I think, explaining the glacial pace of reform that we see in this area."