Have you ever felt fired up with moral indignation after reading a controversial tweet, or after watching a YouTube video about a political topic?
You are not alone, and these online experiences are likely by design. Our emotional landscapes have increasingly become the battleground where politicians compete for votes and power.
When political parties hire big data companies to help their candidates develop digital campaign strategies, like geofencing or programming social bots to inject key messaging into online political discussions, they could be treading on our universal rights and freedoms.
With our digital footprints — information about who we are — big data companies can group our likes, shares, retweets and purchases into virtual personality profiles and create content to match them.
This helps make what you read or watch seem personal and familiar, prompting the social parts of our brains to feel really good. And this, in turn, can cause us to lower our guard and trust information posted by social bots without even knowing it’s happening.
Social strengths make us vulnerable online
Historically, our social nature prepared us for the kind of large-scale co-operation that makes political institutions work. It enabled us to create the complex, modern societies we live in today. Understanding and sharing intentions were central to this evolution.
But there’s a catch: the very social strengths that make us successful as a species can now make us vulnerable online. With the use of AI technologies, our social connections can be simulated in highly realistic ways, manipulating our perceptions in the process.
AI-generated photos, videos and texts are being used in political advertising informed by augmented analytics, producing ads personalized to our individual traits. This strategy is called microtargeting.
The situation gets more complicated when the content we see on social media isn’t created by humans but by social bots.
These are automated social media accounts designed to imitate us. By slipping naturally into our conversations, social bots make it nearly impossible to tell them apart from real human beings.
Bots increase exposure to negative and inflammatory content in online social systems and affect how we feel about the other side of the proverbial political divide. They can also give us the impression that our way of thinking aligns with the consensus.
What Bill C-4 has to do with online data
In Canada, political parties use third-party firms that can program bots to post this kind of content on social media during elections. For instance, political parties can use bots to dampen or suppress some messages while amplifying others, and this remains a legal practice in Canada.
In June 2025, the federal government introduced a bill on affordability — Bill C-4, the Making Life More Affordable for Canadians Act — that also touched on political parties’ use of personal data.
The bill requires each party to develop a privacy policy, but it doesn't set clear guidelines for how our data can be used. This means parties can still collect and use traces of our digital behaviour to inform how they use AI to strategically communicate with us online, as long as they follow the self-regulated policies they themselves create.
Though part of the bill proposes changes to the Canada Elections Act, focusing on how parties can use our personal information, it takes only a small step toward protecting privacy. While political parties will be obligated to create a policy about the use of citizens' private data, they will not be required to stop collecting our online data or using it to make predictions about our voting behaviour.
It also fails to give Elections Canada or the federal privacy commissioner the power to enforce meaningful limitations on the use of our online engagement for such purposes. Even if parties publish their privacy policies, any third-party consultancies they hire are likely to remain beyond enforcement, creating significant gaps in accountability.
Why does this matter? Misused data could harm our rights to privacy and to participate in democratic elections free from manipulation.
Why political parties need universal guardrails
While Bill C-4 does force political parties to create privacy policies and assign someone to oversee them, the way those policies are handled is left up to the parties themselves.
Without clear and universal guardrails for how parties are allowed to collect and use our online data, Canadians remain at the mercy of whatever political parties decide to do.
Instead, Canada could follow the European Union’s example by prohibiting the use of data for online microtargeting purposes. We could also adopt a framework for the ethical use of citizens’ online data like the one currently being implemented in the United Kingdom, which would require political parties to obtain voters’ consent before using their data for campaign purposes.
Allowing parties to define their own rules and decide who enforces them leaves far too much open to interpretation and even potential abuse. And under the Canadian Charter of Rights and Freedoms, Canadians should be free to form political thoughts and opinions without interference from those who wield political power.
If we are not able to do so as voting citizens, can we genuinely say that our elections are free and fair? What, then, does this mean for democracy in Canada?
This article is republished from The Conversation, a nonprofit, independent news organization bringing you facts and trustworthy analysis to help you make sense of our complex world. It was written by: Sophia Melanson Ricciardone, McMaster University
Sophia Melanson Ricciardone does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.