Emma Collins

In the 2020s, deepfake technology has become an issue of international concern. In the run-up to the Australian federal election, it is critical to recognise that such technology has the capacity to distort electoral communication and change how the public engages with political information online.

Australia’s legislative response to deepfakes has limped behind this developing technology despite ongoing calls to regulate truth in political advertising and online misinformation. Without proactive measures to counter the proliferation of deepfakes, Australia’s upcoming federal election may suffer the same fate as other democracies whose elections have been disrupted by deepfakes.

A deepfake is a digital photo, video or sound file created with artificial intelligence (AI) to produce a convincing but false representation of a real person. These representations not only cause serious harm to a person’s reputation, but can erode the public’s trust in public figures, elections, and our democratic institutions. In particular, deepfakes of politicians can hinder voters’ ability to make informed decisions, which is integral to a functioning democracy.

With Australia’s federal election on May 3, politicians’ concerns about deepfakes emerging in political messaging are deepening. In September 2024, Senator David Pocock commissioned and shared online a deepfake of Prime Minister Anthony Albanese announcing a ban on gambling advertisements. This was followed by another deepfake of Opposition Leader Peter Dutton expressing bipartisan support for the ban.

Senator Pocock sought to showcase how easily content can be manipulated to deceive the public. These concerns are justified, given that the Australian Electoral Commission (AEC) has no power to regulate or ban political deepfakes. Instead, the AEC places responsibility on voters to ‘stop and consider’ the source of political information. This approach is problematic when applied to deepfakes, as equipping people with detection strategies does not meaningfully improve their ability to identify them.

In 2024, the federal government came close to passing the Communications Legislation Amendment (Combatting Misinformation and Disinformation) Bill, but the Bill was ultimately abandoned amid backlash over its potential impact on freedom of expression. The Australian Communications and Media Authority (ACMA) defines misinformation as ‘false, misleading, or deceptive information that can cause harm’; disinformation, in turn, is misinformation intentionally spread to deceive another person or cause serious harm. If passed, the Bill would have empowered ACMA to hold digital platforms accountable for disseminating misinformation that causes serious harm to Australians. This would have included requiring platforms to curb the spread of deepfakes – a form of misinformation – with civil penalties (for example, under ss 26 and 62 of the Bill) for platforms that failed to do so.

The Bill was criticised for encouraging platforms to over-censor content to avoid penalties. An effective misinformation bill would need to balance freedom of expression against the genuine harm deepfakes pose to our democracy; without this balance, injury to the public’s freedom of expression could inadvertently harm our democracy as well. Even so, the Bill would not have limited the operation of the Commonwealth Electoral Act 1918, which lacks strong safeguards against false or misleading political advertisements.

Currently, Australia has no federal protections against false or misleading political advertisements. In 2010, the Federal Court held in Peebles v Honourable Tony Burke that the criminal offence under s 329 of the Electoral Act is limited to misleading or deceptive conduct that affects ‘the process of casting a vote rather than the formation of the political judgment about how the vote will be cast.’ In other words, misleading or deceptive political advertisements that affect a voter’s judgment of a candidate are not covered by s 329 of the Act. In addition, s 52 of the Trade Practices Act 1974, as demonstrated in Durant v Greiner, cannot be relied upon to regulate political advertisements containing deceptive or misleading information, because such advertisements are not made in ‘trade or commerce’. Though the Trade Practices Act was replaced in 2010 by the Competition and Consumer Act, the Australian Competition and Consumer Commission’s position likely remains the same.

Only South Australia (SA) and the Australian Capital Territory (ACT) have truth in political advertising laws (TPALs). Under s 113(2) of South Australia’s Electoral Act 1985, a person or corporation can be fined for authorising, causing or permitting the publication of an advertisement containing a statement that purports to be a statement of fact but is inaccurate and misleading. Section 297A of the ACT’s Electoral Act 1992 is similar. The effectiveness of SA’s laws was demonstrated in 2022, when Labor had to withdraw a campaign about ambulance ramping after SA Health data showed the campaign was inaccurate.

However, as these TPALs cover only paid advertisements, they do not apply to deepfakes created and shared online by individuals outside a political campaign. This legislative gap shows that state provisions need further consideration of the impact of deepfakes. Reforming these TPALs to cover deepfakes would be a clear signal that Australia is committed to protecting online users from misinformation and protecting the electoral process from harm.

As with the failed misinformation bill, the federal parliament’s hesitance to introduce TPALs boils down to fear of the ‘chilling effect’ that prohibiting certain communications would have on Australia’s implied freedom of political communication. Separately, under Article 19 of the International Covenant on Civil and Political Rights (ICCPR), Australia has an obligation to support ‘[a] free, uncensored and unhindered press or other media… to ensure freedom of opinion and expression and enjoyment of other Covenant rights.’ However, such fears may overlook the benefits these reforms offer, including for our democratic system. In a 2024 study of South Australia’s truth in political advertising laws conducted by Yee-Fui Ng, participants unanimously agreed that the laws did not have a ‘chilling effect’ on their freedom of speech.

The myriad ways in which deepfakes can be weaponised to undermine the integrity of our electoral process are easily foreseeable. The eSafety Commissioner’s position statement on deepfakes noted that evolving deepfake technology has made it cheap and easy to create fake news targeting politicians. This is only amplified by the speed of trending hashtags and algorithmic amplification, and by how difficult deepfakes are to detect in the digital landscape. Following the Voice to Parliament referendum in 2023, in which misinformation flooded social media platforms, there has been strong support from the Australian public for federal TPALs: a poll conducted by the Australia Institute found almost nine in ten Australians favour truth in political advertising laws.

In June 2023, the Joint Standing Committee on Electoral Matters stated, ‘the Australian community has an expectation that the political communication that they receive is credible and factual.’ While these words ring true, no urgent action has been taken to give them tangible effect. Implementing truth in political advertising laws at the federal level is one robust way to combat the harm caused by deepfakes while ensuring that freedom of speech is not unduly impeded. Further, a misinformation bill that addresses the harms posed by deepfakes while leaving room for freedom of expression would help ensure that the public is well-informed during electoral periods and confident in the integrity of information and elections. With our next federal election to be held this year, there is no better time to enact legislative change.

Emma Collins was an intern with the Australian Human Rights Institute in 2025.