“The supreme art of war is to subdue the enemy without fighting.” – Sun Tzu.
Canadian businesses have been weathering a seemingly endless series of crises that erode prospects for prosperity. In addition to railway blockades, pandemic restrictions, and supply chain disruptions, we now face political uncertainty, freedom convoys, and the Russian invasion of Ukraine, all of which exacerbate our economic woes.
One means of adjusting to these crises has been a dramatic increase in online business practices, but our growing dependency upon information technology leaves us even more vulnerable to threats of service disruption.
In addition to the ever-present threats of computer viruses, malicious hackers, and criminal cyber activity, recent events have exposed a growing and equally sinister threat: Russian cyber attacks in the form of disinformation campaigns. Disinformation is defined as “false information deliberately and often covertly spread (as by the planting of rumors) in order to influence public opinion or obscure the truth” (Merriam-Webster Dictionary, 2022).
The Russian strategy is simply the latest approach to an age-old concept: influencing one’s enemy to concede defeat by convincing it of the futility of fighting. By using information on an adversary population “…to confuse, mislead and ultimately influence the actions that the targeted population makes” (Lin, 2018, para. 1), it is a type of conflict in which there are no non-combatants, and everyone is a legitimate target. Campaigns of disinformation seek “…through manipulation to turn much of the target’s population into unwitting accomplices: They serve the adversary’s interests but do not know that they are being duped” (Lin, 2018, para. 6).
In 2016, Russia employed targeted advertisements, intentionally falsified news articles, self-generated content, social media platform tools, and covertly operated botnet command-and-control servers directing networks of malware-infected machines (bots) to disseminate disinformation rapidly (Government of the USA, 2018; Spamhaus Project, 2020). The successful campaign “…sought to polarize Americans on the basis of societal, ideological, and racial differences, provoked real world events, and was part of a foreign government’s covert support of Russia’s favored candidate in the U.S. presidential election” (Government of the USA, 2018, p. 3). In the case of the USA, “…the Russian bots did not need to create the polemic surrounding the mail-in vote or the Black Lives Matter protests: they only had to massively share the news exacerbating tensions created by the Americans themselves” (Yaffa, as cited in Marineau, 2020, para. 10). This approach succeeds because, quite simply, we are susceptible to the ‘illusion of truth’ effect: the more often we hear something said, the more we believe it is true (Hassan & Barber, 2021).
Canada has not been immune to similar disinformation attacks. Many Canadians wonder how the so-called freedom convoys, occupations, and blockades suddenly arose in Canada, a nation that has traditionally valued peace, order, and good government. Media attention focused on US financial supporters of the protests, but a more insidious form of foreign interference has been the Russian disinformation campaign directed specifically against our unsuspecting population with the express intention of fanning the flames of discontent to create chaos and disruption. Whether or not one agrees with the protesters’ viewpoints, the adverse economic effects of the protests are evident, and the rapid development, organization, and support of the so-called freedom convoys can be attributed, at least in part, to the success of the Russian disinformation campaign.
With Canada’s growing condemnation of Russia for its invasion of Ukraine, the financial sanctions we have imposed, and our supply of military hardware to Ukraine, we are now among the many nations subjected to even greater disinformation campaigns and aggressive cyber attacks, potentially including service disruptions of our critical infrastructure. If deterrence is what kept the peace during the Cold War and the Nuclear Age, resilience (weathering an attack) may be the governing principle of the Digital Age (Moens et al., 2015).
Some Practical Solutions:
So, what does all this mean for Canadian businesses? Since Russia’s aim is to disrupt, distract, and create chaos, our challenge will be determining how best to detect, deter, and fight the disinformation campaigns being directed against us. To help enhance our own resilience, Canada should conduct a vigorous public education and awareness campaign to encourage proactive measures against disinformation and the more immediate effects of cyber attacks. But even though the primary responsibility falls upon specific government agencies, we all have a role to play.
The current health and vitality of your own business enterprise, and how well it is surviving this continuous onslaught of crises, is not simply a matter of luck. Rather, it likely reflects the nature of your products, the flexibility of your business model, the resilience of your work team, and the quality of your organization’s leadership. For those of you engaged in conducting commercial enterprises, there are some actions you can take that can help.
To begin, you can learn to recognize disinformation and avoid becoming its victim. In that regard, the following list neatly summarizes the basic rhetorical tricks and often used strategies employed to manipulate public opinion. Many are hard to detect unless you know what to watch for, so, with thanks to Skeptical Science (2020), here is a list of common disinformation techniques:
- Fake experts – presenting unqualified individuals or institutions as sources of credible information. Common techniques include Bulk fake experts (citing large numbers of seeming experts to argue there is no scientific consensus on a topic); Magnified minority (magnifying the significance of a handful of dissenting scientists to cast doubt on an overwhelming scientific consensus); and Fake debate (presenting science and pseudoscience in an adversarial format to give the false impression of an ongoing scientific debate).
- Logical fallacies – using arguments in which the conclusions do not logically follow from the premises (also known as non-sequiturs). Techniques include Ad hominem (attacking a person or group instead of addressing their arguments); False analogy (assuming that because two things are alike in some ways, they are alike in some other respect); Red herring (deliberately diverting attention to an irrelevant point to distract from a more important one); Ambiguity (using ambiguous language to lead to a misleading conclusion); and Jumping to conclusions (making a wrong claim look logical by ignoring relevant information).
- Impossible expectations – demanding unrealistic standards of certainty before acting on the science. Techniques include Moving goalposts (demanding ever-higher levels of evidence after requested evidence has been provided); Misrepresentation (distorting a situation or an opponent’s position to undermine understanding); and Strawman (misrepresenting or exaggerating an opponent’s position to make it easier to attack).
- Cherry picking – selectively choosing data that appear to confirm one position while ignoring data that contradict it. Techniques include Anecdote (using isolated examples or personal experience instead of sound arguments or compelling evidence); and Slothful induction (ignoring relevant evidence when coming to a conclusion).
- Conspiracy theories – proposing that a secret plan exists to implement a nefarious scheme, such as hiding the truth.
Next, as responsible and proactive business leaders (and with the help of your expert IT specialists) you should seize the initiative now and ensure your company is extra vigilant in protecting its digital assets. A good place to start would be a professionally conducted risk assessment of your IT security, to set the stage for the development of enhanced processes and procedures. A comprehensive, company-wide training and awareness program should follow to introduce those newly updated procedures. By alerting everyone to the increased risk levels and the indicators to watch for, and by insisting that everyone practice strict cyber hygiene, your leadership will set the conditions for success in achieving a secure cyber environment. Careful supervision and regular IT audits will help maintain cyber security as an essential, everyday requirement within your company. Perhaps most important of all, if you do suffer a crippling cyber attack, having a robust business continuity plan already in place will be essential to your business’s survival.
Considering the many crises and cyber challenges Canadian businesses are already facing, this undeclared information war, waged in the shadows by a hostile power employing advanced techniques and technologies, might seem an overwhelming challenge. But just as the Ukrainians are proving that Russian soldiers are not 10 feet tall and invincible, so too can Russian cyber attacks and disinformation campaigns be thwarted. Their approach is to identify nascent sources of discontent, find soft targets and their vulnerable pressure points, and fan the flames of discontent by planting disinformation and manipulating open-source social media. Our response, therefore, must be to harden the targets, remove the vulnerabilities, and learn, through education and training, to recognize when and how someone is attempting to manipulate and victimize us. To be successful and to continue to thrive in this hostile environment, business leaders must equip their people, their systems, and their companies to detect, deter, deflect, and protect against cyber attacks and disinformation campaigns.
About the author: Brigadier-General (Ret’d) Gregory B. Mitchell has significant international peacekeeping experience, including his final posting to Denmark where he commanded the Multinational Stand-by High Readiness Brigade for United Nations Operations (SHIRBRIG), an organization he led on deployment to the Sudan. Upon retirement from the Army, Greg worked on behalf of the UN’s Department of Peacekeeping Operations, was Director of Roméo Dallaire’s Child Soldiers Initiative, Director of Exercises and Simulations at the Pearson Peacekeeping Centre (working primarily with the African Union), and most recently was Executive Director of the Royal Military Colleges Club of Canada. He currently resides in Kingston, Ontario, where he works as Executive Director of Peace Operations Consulting, volunteers with the Canadian Peacekeeping Veterans’ Association as its Special Advisor on Peacekeeping, and will soon complete his second graduate degree, this time in Public Safety. Greg can be reached at (705) 930-9230 or email@example.com
References:

Skeptical Science. (2020). Disinformation 101.
Hassan, A., & Barber, S. J. (2021). The effects of repetition frequency on the illusory truth effect. Cognitive Research, 6, 38. Retrieved from https://doi.org/10.1186/s41235-021-00301-5
Glicker, M., & Watts, C. (2022). Russia’s Propaganda & Disinformation Ecosystem – 2022 Update & New Disclosures. Retrieved from https://miburo.substack.com/p/russias-propaganda-and-disinformation?r=1dxl2
Government of Canada. (2020). CSIS Public Report 2020. Canadian Security Intelligence Service.
Government of the USA. (2018). Report 116-xx of the Select Committee on Intelligence, United States Senate, on Russian active measures campaigns and interference in the 2016 U.S. election, Volume 2: Russia’s use of social media with additional views. Retrieved from https://www.intelligence.senate.gov/sites/default/files/documents/Report_Volume2.pdf
Lin, H. (2018). Developing Responses to Cyber-Enabled Information Warfare and Influence Operations. Lawfare. Retrieved from https://www.lawfareblog.com/developing-responses-cyber-enabled-information-warfare-and-influence-operations
Marineau, S. (2020). Fact check US: What is the impact of Russian interference in the US presidential election? The Conversation. Retrieved from https://theconversation.com/fact-check-us-what-is-the-impact-of-russian-interference-in-the-us-presidential-election-146711
Merriam-Webster Dictionary. (2022). Disinformation.
Moens, A., Cushing, S., & Dowd, A. W. (2015). Cybersecurity challenges for Canada and the United States. Fraser Institute. Retrieved from https://www.fraserinstitute.org/studies/emerging-cybersecurity-threats-require-increased-vigilance-cooperation-between-canadian-and
Myre, G., & Bond, S. (2020). ‘Russia Doesn’t Have to Make Fake News’: Biggest Election Threat Is Closer to Home. NPR. Retrieved from https://www.npr.org/2020/09/29/917725209/russia-doesn-t-have-to-make-fake-news-biggest-election-threat-is-closer-to-home
Public Safety Canada. (2016). Fundamentals of cyber security for Canada’s critical infrastructure community.
Public Safety Canada. (2018). Cyber Security in the Canadian Federal Government.
Public Safety Canada. (2019). National Cyber Security Action Plan (2019-2024).
Public Safety Canada. (2019). National Cyber Security Strategy: Canada’s Vision for Security and Prosperity in the Digital Age. Retrieved from https://www.publicsafety.gc.ca/cnt/rsrcs/pblctns/ntnl-cbr-scrt-strtg/index-en.aspx
Sun Tzu. (2022). Brainy Quote.
The Spamhaus Project. (2020). Spamhaus Botnet Threat Report 2019.