Navigating Australia’s federal elections in the era of AI

12 May 2025 Consultancy.com.au
Aaron Momin is Chief Information Security Officer at Synechron

The recent Australian Federal Election marked a turning point, not just for politics but also for technology. This was the first national vote held since generative artificial intelligence became widely adopted across the country, and it brought with it both exciting possibilities and serious concerns.

Globally, we’ve already seen how quickly generative AI can be misused. In the United States, a deepfake robocall impersonating President Joe Biden went viral ahead of the 2024 primaries. In India, AI-generated holograms and digital voices were used to influence voter perceptions in one of the world’s largest elections.

The UK, Nigeria and Slovakia, meanwhile, have all had to deal with viral deepfakes that undermined trust in political leaders and electoral processes.

The rise in sophisticated cyberattacks and what it means for Australia’s digital resilience

Australia is not immune to these threats. While the election didn’t see any large-scale misuse of AI in political campaigning, the risks were real and no longer merely theoretical. The combination of cheap, easy-to-use AI tools and a highly digital media landscape provides fertile ground for interference that could manipulate public debate.

We already struggle with disinformation and online trolling, and AI-generated audio, images and video threaten to amplify and worsen the problem.

As a cybersecurity leader, I saw this election as a crucial test of Australia’s resilience. AI-generated videos, voices and images can impersonate political candidates, spread fake news or fabricate scandals to mislead the public. The biggest threat isn’t that people believe the fakes, but that they stop trusting what’s real altogether.

Gaps in our cybersecurity talent pipeline and the need to build a stronger local workforce

Addressing such threats will require more than strong policies and good intentions. It demands a robust security program, adequate technical expertise and a capable cybersecurity workforce, and Australia has long struggled with a shortage of all three. The issue also goes beyond technical skills: the entire ecosystem, not just practitioners, needs proper awareness and education.

There is a widening gap between the pace of cyber-related threats, both at election time and beyond, and the rate at which our education system and industry training pathways can produce skilled cyber professionals. Cybersecurity roles remain vacant in both the private and public sectors, and many organisations struggle to recruit people with the right combination of technical skills, regulatory knowledge and hands-on experience.

The importance of national policy in supporting Australian-made tech to mitigate cyber risk and safeguard national security

The federal election has brought renewed focus to the importance of national policy, and we have already seen some early signs of political awareness.

The Australian Government is reviewing AI regulations, and the Australian Electoral Commission has begun revising its misinformation policies. These are important steps, but more is needed. Just as Florida and Colorado in the US now require disclosures for AI-generated political content, Australia should consider clear and enforceable transparency guidelines.

A key part of that answer is local investment in solutions made in Australia, ensuring our technology aligns with national interests, complies with local standards, and builds economic resilience. It also gives us a real opportunity to lead by example, sharing our approach to ethical AI and regulation with other democracies facing similar challenges.

Despite the risks, the election showed AI’s potential to strengthen our democratic processes. It could verify voter registrations, detect anomalies in voting patterns in real time, help voters understand policy differences more clearly so they can make better-informed decisions, and improve the running of election day through better management.

Just as the federal election taught us about the potential risks, Australia has a chance to learn from international examples and build safeguards before it’s too late. Industry, government and media platforms must collaborate to set ethical standards and technological guardrails.

Australians must be encouraged to rely on credible news sources and to learn how to identify and understand the growing influence of AI in the democratic process. This should be a collective effort, led by government education initiatives but also promoted by influential figures in business, community groups and the education system. Without such guardrails, trust in the democratic process could erode very quickly.

AI does not have to undermine the integrity of elections. With the right supervision, it could even help to make the process easier. But we all must stay abreast of the latest updates and risks, because once trust is lost, it’s nearly impossible to restore.
