Canada Should Demand More From Platforms: How to Mitigate the Impact of Disinformation and Cyber Operations in the Context of Russia’s War Against Ukraine

Social Media Lab Directors Anatoliy Gruzd and Philip Mai Visit Ottawa. Featuring opening remarks by Dr. Anatoliy Gruzd, Professor and Canada Research Chair in Privacy-Preserving Digital Technologies, Toronto Metropolitan University (Witness, Senate Committee on National Security, Defence and Veterans Affairs)

Introduction

Mr. Chair and Committee Members, thank you for the opportunity to discuss the threat of disinformation and foreign interference in Canada in the context of Russia’s war against Ukraine.

I’m Anatoliy Gruzd, a Canada Research Chair and a Professor at Toronto Metropolitan University. While my comments today are my own, they are grounded in research conducted by my colleague Philip Mai and other collaborators at the Social Media Lab, where we study the spread of misinformation and propaganda, information privacy, and the impact of social media on society.

Russia’s Information Operations

The Kremlin has a long history of using information operations domestically and internationally. In recent years, Russia has expanded such efforts by deploying bots, trolls, hackers, and other proxies to create a more favourable information environment for its agenda.

Their influence campaigns are often deployed across multiple platforms and rely on techniques such as creating fake personas or websites, impersonating politicians, journalists, and public agencies, attacking activists’ accounts, and amplifying polarizing topics.

Exposure to Pro-Kremlin Narratives in Canada

Our research shows Canadians are not immune to Russian disinformation and propaganda. According to our 2022 national survey, 51% of Canadians reported encountering pro-Kremlin narratives about the Russia-Ukraine war on social media.

Our research shows a strong link between exposure to pro-Kremlin narratives and belief in them. It also shows that a person’s prior beliefs and politically motivated reasoning make them more susceptible to disinformation. For example, Canadians with right-leaning views, or those who trust partisan media, are more likely to believe pro-Kremlin narratives.

Left unchallenged, state-sponsored information operations can undermine Canadian democracy. So, what can be done to mitigate such risks?

Recommendations – Demand More from Digital Platforms

Blocking state-run or state-supported propaganda media outlets like RT News is only partially effective, as the Kremlin circumvents such sanctions by cloning content and disseminating it through other channels such as pseudo-news websites.

The Kremlin also relies on the social media accounts of its diplomatic services, such as the Russian Embassy in Ottawa, and enlists sympathetic media personalities in the West, directly or indirectly.

To fight state-sponsored disinformation, digital platforms should be mandated to bolster their Trust and Safety teams in Canada, expand their partnerships with fact-checking organizations, and facilitate access to credible news.

Unfortunately, platforms have retreated from these areas in recent years. For instance, Facebook and Instagram no longer provide access to news in Canada, and X slashed its global Trust and Safety team by 30%.

And with newsrooms closing or downsizing nationwide, more Canadians will turn to social media influencers rather than journalists to stay informed.

This is especially concerning because our research indicates that individuals who trust mainstream media are less susceptible to pro-Kremlin disinformation. Thus, investing in a strong journalistic community and enhancing trust in mainstream media outlets could effectively combat foreign information operations.

Recommendations – Pre-Bunking Strategies

Another line of defence is implementing proactive or pre-bunking strategies to inoculate Canadians against future disinformation campaigns. For instance, public service announcements (PSAs) and educational games that incorporate known false claims, tactics, and sources used by foreign adversaries can reduce the perceived persuasiveness of disinformation and limit its spread.

Considering the need to reach a broader audience and the relatively high level of trust in government institutions in Canada, national security agencies could take a more proactive role in countering disinformation by publicly debunking false claims more promptly and transparently.

Recommendations – Generative AI and Disinformation

We’ve also seen an increasing use of Generative AI to create disinformation about the Russia-Ukraine war. While most recent AI fakes were quickly debunked, I expect an increase in the frequency, scale, and sophistication of phishing, social engineering, and reputational attacks enhanced by Generative AI.

Thus, we must educate not only the general public on the dangers of disinformation and the importance of cybersecurity but also policymakers and civil servants, who are often the targets of such attacks.

Conducting readiness assessments for these groups would provide a better understanding of their preparedness and identify existing vulnerabilities.

Conclusions

In conclusion, we must not underestimate the potential of the Kremlin’s information operations to impact public perception over time.

Deplatforming individual sources may not be effective, as it could also undermine trust in the government and legitimize censorship.

A more nuanced approach should consider the various forms of interference and develop strategies to address them directly.

This could include requiring large social media platforms to expand their Trust and Safety teams in Canada, share data with researchers to promote transparency, and allow for independent audits.

Schools must update digital literacy programs to address emerging challenges such as Generative AI.

For the general public, the government should develop PSAs and pre-bunking campaigns to educate Canadians about foreign interference. Such educational campaigns must be informed by, and partly focused on, diaspora communities in Canada, as they are more likely to be targeted by foreign states.