
Safeguarding Canadian democracy in the age of growing digital disinformation

BY BESSMA MOMANI AND SHELLY GHAI-BAJAJ

A host of measures are needed to educate the Canadian public and improve the country's ability to respond to information threats.

The stakes over the next year are higher than ever when it comes to security and defence challenges in the Canadian information ecosystem.

With monumental elections on the horizon, numerous global risk reports have highlighted mis/disinformation among the top threats facing our interconnected societies.

These threats are multiplied by a shifting global arena and rapid, constant technological advancement: growing multipolarity, security and defence threats, and the outbreak of armed conflicts across the globe that test Canada's middle-power paradigm within a strained rules-based world order.

Deliberate and coordinated attacks on politicians and the fallout from bilateral diplomatic conflicts further demonstrate that Canada is not immune to digital disinformation operations.

Moreover, a recent report by the Communications Security Establishment (CSE) offers a stark warning: Canada can expect unprecedented activity by foreign actors in our cyber and information space in the next federal election cycle, especially through the use of AI-generated content, including deepfake videos and other sophisticated tools of deception.

To meet this challenge, Canada must prioritize understanding this complex and evolving landscape and adopt a whole-of-government approach.

In broad terms, mitigation strategies must address immediate defence, national security and intelligence threats while simultaneously investing in societal resiliency strategies that reinforce democratic institutions and processes over the longue durée.

The disinformation threats we know

Digital disinformation is expanding in two ways: the range of actors and the nature of activities. This constant state of flux and expansion means there are existing threats, threats on the horizon, and unknown threats.

Developing an understanding of each is key to anticipating and preparing for security challenges.

Familiar state actors like Russia, China and Iran have the motivation and the strategic advantage of experience to disrupt Western elections. They have grown increasingly savvy and elusive, working through proxies and intermediaries to make it difficult to trace tactics back to the original source.

Elements of these campaigns are outsourced to content farms in other states, which produce digital disinformation content while offering the benefit of plausible deniability.

The nature of the disinformation ecosystem can transform actors with fewer resources and less power into formidable threats. Production and peddling of digital disinformation are incentivized by a low barrier to entry and a high rate of return for those seeking to incite disruption.

In other words, for these state actors, disinformation campaigns represent a win-win strategy with little to no cost, because they don't need to sway the outcome of an election to successfully pollute liberal democratic information environments. They only need to sow doubt and diminish trust in the legitimacy and efficacy of elections.

Liberal democracies like Canada may therefore present an easy target for foreign disinformation campaigns.

That being said, while foreign interference during elections has received much of the attention in academic and policy research as well as in media coverage, other disinformation activities are also being pursued by threat actors.

Foreign influence campaigns and computational propaganda often shape public opinion and perceptions. Strategic distraction tries to prime individuals to pay attention to certain issues and ignore others in a bid to provoke decision paralysis.

What requires more attention, however, is the slow drip of polarizing and illiberal narratives that exacerbates ideological and partisan fault lines, chips away at our social fabric and fosters a trust deficit between citizens and democracy.

A diversifying threat landscape

There are also indirect ways in which disinformation spreads within the Canadian information environment. We have long shared a unique, often embedded connection with the information space of the United States. As the convoy experience demonstrates, right-wing ideologies and narratives south of the border can influence perceptions and mobilization in Canada.

At the same time, Canada's diasporic communities are also embedded in multiple information environments that include narratives and content circulating in home countries. Places in the Global South, including Brazil, India, Nigeria and the Philippines, have seen digital disinformation used to influence domestic audiences, especially during elections.

The digital platforms on which disinformation spreads – including Facebook, X and TikTok, but also direct messaging apps like Telegram, WeChat and WhatsApp – reach global audiences at a scale and speed previously unimaginable. Ethnocultural communities in Canada are often doubly exposed to disinformation emanating from their home countries and from within Canada.

Encrypted messaging platforms provide a high level of privacy and security to individual users, but they present a different set of challenges for identifying disinformation threats. There is also the question of how information is perceived among groups of users who share some level of interpersonal communication, connection and trust.

AI on the new front line of disinformation

The emerging use of generative AI to produce disinformation content quickly, cheaply and in abundance will have potentially calamitous implications, in formats ranging from simple text messages to deepfakes.

In the Global South, AI will allow states to engage in microtargeting, translate disinformation content into different languages in multilingual contexts, and wage more coordinated propaganda campaigns.

For example, in Bangladesh, the dominant Awami League no longer needs to rely on autocratic rule alone to sway public opinion. It can now deploy easily accessible and inexpensive deepfake videos to discredit and delegitimize the opposition.

In India, political parties across the board have also deployed generative AI in state-level elections. India's 2024 general elections may turn out to be the world's largest democratic experiment with the use of generative AI as a campaigning strategy. And it is bound to have spillover effects in increasingly globally connected digital information environments.

Domestic challenges on the horizon

Canada needs to act quickly to develop the collective capacity to meet and prevent the information threats at our doorstep:

• Accelerate and deepen investment in multi-sector partnerships for knowledge building and the development of technological resources, tools and capacity. This needs to begin with a deeper understanding of the diverse range of digital spaces in which Canadians are receiving and engaging with information and disinformation, and with mitigation strategies that are evidence-based while remaining aligned with civil liberties and individual rights.

• Leverage and invest in Canadian digital technological capabilities to identify existing disinformation threats while providing tools to forecast emergent threats.

• Establish partnerships between key stakeholders in academia, government, industry and civil society to address the full range of threats and account for the inherently interdisciplinary nature of the risks.

• Expand cross-border co-operation and coordination. Doubling down on existing coordinated efforts with international allies in sharing information and institutionalizing rapid alert mechanisms, like the G7 Rapid Response Mechanism, can help to expedite the identification of disinformation campaigns.

Democratic allies should also be mobilized around ways to leverage AI to help detect disinformation and its spread, along with how to use open-source intelligence and information to identify potential threats on the digital information landscape.

Building social capacity

These steps must be undertaken alongside a longer-term commitment to building social capacity and resiliency among individual Canadians.

This needs to start with education, beginning in primary school, that promotes critical thinking and the ability to identify disinformation, akin to work being done in countries like Finland.

The long game to counter and mitigate disinformation should focus on resiliency building by leveraging some of Canada's innate strengths as a diverse and pluralistic liberal democratic society.

This whole-of-society approach stands to benefit by drawing on the rich network of civil society and grassroots organizations that function as trusted intermediaries between Canadians, especially those belonging to marginalized and underrepresented communities, and the government.

Many of these community and civil society organizations are already doing the critical work of pre-bunking, de-bunking, counter-messaging and correcting within their community-based digital spaces.

In a diverse and multilingual setting like Canada, this also requires investment in third-language digital information resources and tools that promote accurate information and digital and media literacy.

Over the long term, targeted efforts rooted in preserving and building trust while protecting individual rights and civil liberties can foster lively and productive debate, the exchange of information, and the space for dissent necessary in a healthy liberal democracy.
