Strategic communication and countering disinformation – A very brief guide

Basic concepts

In the vocabulary of international institutions and other actors engaged in the fight against disinformation, definitions of the phenomenon are calibrated to each actor’s purposes and context:

  • The EU sees it as “verifiably false or misleading information that is created, presented and disseminated for economic gain or to intentionally deceive the public, and may cause public harm” [1].
  • NATO sees it as “the deliberate creation and dissemination of false or manipulative information with the intent to deceive and/or mislead”, but adds that “disinformation seeks to deepen divisions within and between partner states, and to undermine people’s trust in elected governments”. In NATO’s Strategic Concept 2022, disinformation is treated as part of the arsenal of hybrid actions through which the Alliance’s strategic competitors influence the current strategic environment, challenging member states’ interests, values and democratic way of life [2].
  • Romanian law places disinformation on the list of threats to national security (see the National Defence Strategy 2020-2024 [3]), but does not propose a working definition. The existing references in Romanian law address the act (“communication or dissemination”) and its implications, without clarifying the definition further.

Structure of disinformation

Actors running disinformation operations build their manipulative actions around concepts, most often targeting the very values promoted by strategic communication (democracy, equality, loyalty, dedication and the spirit of sacrifice, etc.). These concepts are encoded in narratives, linguistic labels or meta-messages that aim to replace objective reality with a fabricated one serving the interests of their promoter.

Narratives are simple linguistic constructs, easy to understand and assimilate, shaped by the context in which they are used for disinformation and carefully chosen to serve the interests of the promoter of the disinformation campaign. They exploit social fractures, prejudices and the grievances of those parts of society un(der)-represented by the political elite.

Example: NATO is an aggressive organisation.

Messages that support narratives mix easily verifiable truths with lies and misrepresentations, exploiting the mind’s propensity to recognise patterns and fill in missing information. Disinformation messages are most often superimposed on a strong emotional context, exploiting the individual’s natural inclination to complete unfinished statements with their own perceptions, in line with their existing beliefs.

Example: NATO provoked Russia to attack Ukraine through illegitimate involvement in Kiev’s political life.

Target audiences. In the successful implementation of a disinformation campaign, considerable effort is invested in identifying and getting to know the target audiences. These are analysed on broad criteria (prevailing values, exhibited vulnerabilities, etc.), but also on niche criteria (group jargon, emotional triggers, etc.).

Example: a very young audience, aged 18-25, without satisfying career prospects, that holds a good opinion of communism but, at the same time, also of interwar fascism.

Complex information operations adjust their messages to the features identified through audience segmentation: inactive, unrepresented majorities and satisfied, active minorities will be approached with different messages.

Channels for disseminating disinformation

By definition, effective strategic communication is a proactive tool aimed at inserting messages that support the communicators’ assumed values into all communication spaces frequented by the members of a society.

It is essential to be aware that disinformation operations are conducted simultaneously, alternately or selectively across all information transfer channels, from the traditional ones (TV, radio, print media) to social media platforms (Facebook, Twitter, Instagram, etc.) and video sharing platforms (YouTube).

The choice of media is adjusted to the preferences of disinformation’s target audience, but also to the specific vulnerabilities of each medium. Thus:

  • TV and radio are still the preferred means of information for most Romanians [4], but suffer from partisan bias, excessive polarisation, editorial pressures and opaque ownership.
  • Online media is easy to access in a variety of formats, but can foster echo chambers by drawing users into networks that recycle the same content and ideas.
  • Social networks can amplify the power of the powerless (see the Arab Spring), but can also offer cheap and easy means of multiplying disinformation.

Before you (re)act…

Even before specific disinformation manifests itself, strategic communicators must be prepared on several fronts.

  • They must be familiar with their organisation’s communication strategy, as well as the communication strategies of coordinated departments/organisations.
  • They must identify their own target audience and the target audience of the disinformation.
  • They must identify vulnerabilities targeted by disinformation, which may include internal vulnerabilities, relevant society-level vulnerabilities and specific vulnerabilities of the target groups.

Based on this analysis, the communicator must create and maintain a trusted network of contacts among the target audience, one willing to receive and trust their message on the basis of confidence built over time. This may include media organisations, NGOs, professional communities, opinion leaders or online influencers.

In addition, continuous monitoring of the informational space using quantitative and qualitative methods is necessary to forecast the likelihood and nature of disinformation events.

Monitoring must be conducted systematically and presented through regular deliverables: reports, meetings, conferences, etc. Simply tasking formal structures with monitoring, without operationalising these deliverables and without establishing mechanisms for rapid response, will likely lead to misunderstandings or delayed responses.

Forecasting future disinformation allows prebunking: promoting, in advance, information that counters the falsehood and reflects the institution’s perspective. Prebunking allows citizens to conduct fact-checking on their own, leading to greater credibility for institutions and more effective control of disinformation.

Decision to act

In certain situations, inaction can be the optimal communication strategy: where disinformation has a negligible impact on target groups, where reacting would do more harm than good (i.e. would give even more visibility to the false information or narrative), or where it would draw excessive resources away from other priorities. The decision to initiate proactive communication is based on several elements, including:

  • Disinformation, or the danger of disinformation, has been identified.
  • The disinformation matches vulnerabilities already identified within the target audience. In the absence of such a vulnerability (one that could produce a high negative impact), communication is a waste of resources. For example, states do not normally prioritise convincing the population that the world is not run by extraterrestrials.
  • The intervention is legitimate. Strategic communication must not infringe on citizens’ right to free speech. In particular, some forms of communication are especially protected (parliamentary discourse, the media). In such cases, communicators must present their organisation’s position without undermining the right of citizens, politicians and journalists to robust critique.
  • Communication is aligned with the general communication strategy.

How to communicate

Disinformation messages may use factual statements, but they appeal mainly to emotions. Therefore, messages that combat disinformation also need to appeal not just to reason, but to emotions as well.

Note: There may sometimes be discrepancies between the emotions invoked declaratively in messages of disinformation and the emotional vulnerability that is actually targeted.

Example: An apparently xenophobic message may in fact seek to exploit social discontent or dissatisfaction with the standard of living within the target group. Conversely, a socially-oriented message may actually mask appeals to the xenophobic sentiments of the audience.

Trust is an essential asset of strategic communication. To reinforce it, it is important to use communicators trusted by the wider audience and to appeal to influencers trusted within the target groups.

Segmentation. Each target audience may have different interests, emotions and knowledge; at the same time, audiences may trust different personalities and receive information from different channels. Therefore, anti-disinformation communication will need to be tailored to the target audience.

  • The message can be varied according to the concerns and emotions of the target groups.
  • Messages will be transmitted through channels that the target audience actually accesses.
  • Communicators will try to use influencers within the target groups.

Good practice example: During the COVID-19 pandemic, the British government identified a vulnerability within the Muslim community concerning the belief that some vaccine ingredients were forbidden from a religious perspective (haram). To counter this problem, imams were contacted and involved in discussions about the list of ingredients that make up the vaccine so that they could verify that it is indeed allowed from a religious perspective (halal) and then disseminate the information further.

This communication effort did not engage the alarmist messages circulating on the Internet and did not use classical channels such as television or outdoor advertising; instead, it went directly to the people who were trusted within the target group and, at the same time, open to a fact-based discussion about vaccines. Significantly, the government implicitly recognised the community’s religious concerns as legitimate and did not seek to arouse negative emotions.

No vacuum! The information space should not be left uncovered. Once the decision to communicate has been made and the strategy designed, the communicator(s) will proactively try to cover, to the best of their abilities, all target audiences and all relevant channels. Any vacuum will offer ample space for manoeuvre to disinformation.

Inequality of resources and the whole-of-society approach

A major problem faced by those involved in strategic communication is the resource inequality between promoters of disinformation and communicators who combat it. Disinformation relies on simple and emotionally appealing messages that are easy to remember. By its nature, preventing and countering disinformation requires relying more heavily on dry facts and data with low viral potential. Fortunately, there are several ways to compensate for this.

Prioritisation: Based on the communication strategy, communicators at every level should be trained and able to distinguish disinformation events that deserve priority attention and channel efforts towards them.

Example: in a crisis situation, responding to citizens’ petitions on issues unrelated to the crisis can be deprioritised.

Use of monitoring tools: Optimal monitoring of the information space combines qualitative analysis, which establishes the monitoring areas, specific language use, etc.; quantitative analysis, which extracts and measures relevant content; and a further round of qualitative analysis to produce conclusions and recommendations. Relying on good quantitative tools helps the analyst sift through large quantities of data and identify topics and positions of real interest; a minimal sketch of such sifting follows below.
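To illustrate the quantitative step, the sketch below (in Python; the keyword watchlist and sample texts are invented for illustration, not drawn from any real monitoring setup) counts how often narrative keywords chosen in the qualitative phase appear in a batch of collected article texts, letting an analyst flag spiking topics before reading them in depth.

```python
from collections import Counter
import re

# Hypothetical watchlist of narrative keywords, defined in the qualitative phase.
NARRATIVE_KEYWORDS = {"nato", "aggression", "sovereignty", "vaccine"}

def keyword_frequencies(texts):
    """Count occurrences of watched keywords across a batch of article texts."""
    counts = Counter()
    for text in texts:
        # Crude tokeniser; the character class also covers Romanian diacritics.
        for token in re.findall(r"[a-zăâîșț]+", text.lower()):
            if token in NARRATIVE_KEYWORDS:
                counts[token] += 1
    return counts

# Illustrative input; in practice, texts would come from a media-monitoring feed.
sample_articles = [
    "Commentators claim NATO aggression is the real cause of the war.",
    "A viral post questions vaccine ingredients and national sovereignty.",
]
for keyword, count in keyword_frequencies(sample_articles).most_common():
    print(f"{keyword}: {count}")
```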

Examples of quantitative monitoring tools include Pulsar (for online media and, to a lesser extent, Facebook), CrowdTangle (Facebook, Twitter and Instagram), Google Trends (spontaneous online interest in a subject, expressed in Google searches) and Newswhip (online media).
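Of these, Google Trends data can also be pulled programmatically. The sketch below uses the unofficial pytrends library (an assumption: it is not an official Google product, must be installed separately with `pip install pytrends`, and can break when Google changes its endpoints) to retrieve relative search interest for a monitored term in Romania.

```python
from pytrends.request import TrendReq

# Unofficial Google Trends client; hl sets the locale, tz the timezone offset.
pytrends = TrendReq(hl="en-US", tz=120)

# Relative search interest for an illustrative term: Romania, last three months.
pytrends.build_payload(kw_list=["NATO"], timeframe="today 3-m", geo="RO")
interest = pytrends.interest_over_time()

if not interest.empty:
    # Scores are normalised by Google to a 0-100 scale within the query window.
    print(interest["NATO"].tail())
```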

Many of the available tools are intended for commercial purposes (e.g. monitoring a company’s reputation). They use legally accessible data and are bound by GDPR rules.

Multiplication: Communicators must explore all relevant channels and platforms (newspapers, television, social media, etc.) and choose those that are both widely circulated and trusted by the target audience.

Whole-of-society approach

An approach that builds trust and is conducive to the efficient multiplication of efforts is the whole-of-society approach. It recognizes that disinformation is a complex issue that cannot be solved by one organisation or individual alone and promotes collaboration and coordinated efforts from various stakeholders across different sectors of society, including:

Horizontal communication between disparate structures, depending on the nature of the topic. Partners to include in communication are state organisations and departments, civil society organisations (including religious and professional organisations) and the media.

Vertical communication. In Romania, this is a less significant issue for civil society organisations, which are relatively small, focused and tightly integrated. However, in the larger CSOs and state structures, it is essential to maintain constant vertical communication with territorial actors, including local directors, mayors, priests, local organisations, etc.

The end purpose is to involve as many segments of society as possible in counteracting disinformation, so as to (1) increase the inflow of intelligence about disinformation, allowing for a better response, (2) extend the reach and credibility of counter-disinformation messages, and (3) build ground-level resilience against disinformation.

The whole-of-society approach is useful at all stages of combating disinformation: strategy-building, identification of vulnerabilities, monitoring of the information space, prebunking and resilience-building, proactive communication, etc.

This report is part of an international research project financed by USAID, coordinated by the International Republican Institute’s (IRI) Beacon Project on countering Russian Disinformation and Propaganda. The opinions expressed are solely those of the authors and do not reflect those of IRI.


[1] https://merlin.obs.coe.int/article/8271

[2] https://www.nato.int/strategic-concept/

[3] https://www.presidency.ro/files/userfiles/Documente/Strategia_Nationala_de_Aparare_a_Tarii_2020_2024.pdf

[4] https://www.agerpres.ro/social/2021/10/18/sondaj-inscop-majoritatea-romanilor-se-informeaza-de-la-televizor-aproape-jumatate-cred-ca-posturile-tv-sunt-expuse-dezinformarii--798237