There is one sentence in a recent presentation made by a media company to advertisers that frankly terrified me: “by 2030, every ad and integration served by Seven West Media will be personalised, optimised and addressable”.
The implications of this are clear.
If every single advertisement that you see can be tailored for you, and directly targeted to you, then so can every other piece of content that you see online.
(If you thought Seven West Media was only talking about the ads you see on TV, think again. The big media companies in Australia are no longer really in the business of journalism; their profits come from trading in our data. Media companies collect, collate, share and use our data, across different brands, services, platforms and apps, in order to track, profile and target us, without our active agreement to do so.)
So as well as ads, what can be tailored and targeted to us personally?
Every news story, every fluff piece. Every meme, every social media post, every piece of recommended content, every video. Every snippet of misinformation, every campaign of disinformation. Every feed, of everything.
This is a propagandist’s dream. For the rest of us, it is a nightmare.
Individually tailored and targeted messaging has profound and irreversible consequences for the quality of our public discourse, and the health of our democracy.
We already live in a world of filter bubbles and echo chambers. There are fissures in our community now: what we believe and who we trust, where we stand and what action we are willing to take, are a product of what we see and hear. On issues from COVID lockdowns to an Indigenous Voice, and from climate change to conflict in the Middle East, we are splintering.
Add generative AI into the mix, and those fissures in our community will deepen into impassable crevasses, unless governments around the world act to prohibit targeted messaging.
Forget the tech bro predictions of existential risks to humanity posed by AI-built killer robots. We don’t need to wait for robots to kill us. If we don’t rein in technology and business processes now, we will kill ourselves, through war, through the breakdown of democracy, and through inaction on climate change. Those are the existential risks we should be worried about, fuelled by disinformation spreading like wildfire, by algorithms designed to amplify and reward conflict.
The multi-national Bletchley Declaration last week recognised this, stating “frontier AI systems may amplify risks such as disinformation” which could lead to “serious, even catastrophic, harm”.
This week the European Union’s Parliament and Council reached a provisional agreement on a new regulation on the transparency and targeting of political advertising. Even Meta is attempting to stop the use of generative AI for some types of personalised messaging.
The Australian Government also has an opportunity to act now. The Attorney-General has already agreed, or agreed in-principle, to the vast majority of recommendations made in the long-running review of the Privacy Act. One of the fundamental reforms to which the Government has agreed in-principle is to reform the definition of ‘personal information’ to include when individuals can be tracked, profiled and targeted at the individual level, even if their identity is not known.
Individuation online is the diesel that fuels the algorithmic engines, amplifying the voices of influencers and powering the trains of online hate, misinformation and extremism, which lead to everything from the explosion in mood disorders and pro-anorexia content to Holocaust denial, false claims of stolen elections, and genocide.
So by bringing individuated data clearly within the scope of the Privacy Act, we can better set rules for when data about us can be collected, used or disclosed.
The Government has also agreed in-principle to introduce a requirement that all data collection and use must be ‘fair and reasonable in the circumstances’. These two reforms alone will make a critical contribution to modernise our law to reflect the risks of the digital economy, and the wishes of the Australian people.
However, that won’t be enough to save our democracy if the exemption for political parties and politicians from the Privacy Act remains.
There is near-universal desire to abolish the exemption for political parties: the Department noted in its Privacy Act Review Report that “Almost all submitters that commented on the exemption considered that it was not justifiable and should be narrowed or removed”. Yet the Attorney-General’s response did not agree to abolish the exemption, instead simply repeating the claim that the exemption is there “to encourage freedom of political communication and enhance the operation of the electoral and political process in Australia”.
My initial response was to shrug with cynicism, thinking: well, of course politicians don’t want to regulate themselves. Why would we expect anything better?
But if the Government really wants to preserve freedom of political communication and enhance our democracy, it must pass laws to protect public discourse from the corrosive effects of tailored propaganda. That means requiring all players – including publishers and AdTech players, social media platforms, data brokers, activists, lobby groups, politicians and political parties alike – to submit to the same rules limiting their use of our personal information, and the same scrutiny of their conduct, to determine when it is ‘fair and reasonable’ to track, profile and target us at the individual level.
Otherwise, political actors will still be able to track us across everything we do online, profile us, tailor messages to us and then target us as individuals – with ads, with memes, with social media posts, with videos, with fake news and false claims – with impunity.
In the past week alone, we have seen stories of fake scientific journal articles containing outright fabrications circulated in a community consultation process over wind turbines in NSW, and fake ‘case studies’ generated by Google Bard AI alleging misconduct by named companies included in a submission to Parliament about those companies. Both examples have the potential to corrupt our democratic policy and law-making processes.
This feels like only the tip of an iceberg of mistruths.
In the 2023 Deakin Indigenous Oration “My lament for my country – where is truth?”, Wiradjuri man and award-winning journalist and academic Stan Grant said: “we are all betrayed by an age of division … in an age of media that thinks debate is finding the point of difference, and then widening it, to stoke the fires of a toxic social media… We live in an age of prosecution without process; no truth, but our own truths.”
Expanding on this theme in his address to the AISA conference last month, Grant drew a link between advances in technology, the diminishing quality of public discourse, and global threats to peace and democratic institutions. He noted that in 2017, Russian President Vladimir Putin said: “Whoever becomes the leader in (the sphere of AI) will become the ruler of the world”.
I am increasingly of the view that we cannot hope to stem the tide of misinformation generated and circulated at scale, unless we prohibit targeted messaging based on behavioural tracking and profiling in all its forms. Opting out from seeing ads is not enough. Focussing on truth in political messaging alone is not enough. Enforcement will never be able to scale the way online targeted messaging does; if messaging can be tailored to every individual differently, no-one can check the truth or impact of every piece of content online.
Privacy reforms to tackle online behavioural tracking, profiling and targeting are necessary to protect us from digital harms as individuals. But even more urgently, privacy reforms are needed to protect us collectively: as a society and as a democracy.
We need to rip the engine out of the mistruth amplification and polarisation business model.
If we value truth, if we value peace, our governments should prohibit all online behavioural tracking, profiling and targeting of individuals – including by politicians.
Photograph (c) AJ Colores on Unsplash