5. Evaluating counter-disinformation programmes
Summary of the presentations and discussions on "Evaluating counter-disinformation programs" from the June 2021 GFMD IMPACT donor-practitioner-academic meeting on disinformation.

Theories of change: correlation and causation

Problem: In the breakout group discussions, the issue of attribution (correlation versus causation) was raised as an area where more collaborative work is needed.
Suggestion: Evaluators and researchers should be encouraged to use qualitative research and convenings for discussion, which can also support outcome harvesting and deepen understanding.
Methodological, project-design, and research-design questions, along with how media development projects can measure their effectiveness within theories of change tied to governance objectives, were discussed at the GFMD IMPACT March event on theories of change and impact measurement:

Draft working paper

A draft working paper was created for the purposes of discussion at the June 2021 GFMD IMPACT donor-practitioner-academic meeting on disinformation.
Please leave your comments and suggestions directly in the draft document.
Susan Abbott, consultant, co-chair of the Media Sector Development Working Group of The International Association for Media and Communication Research (IAMCR)
Katerina Tsetsura, Ph.D., Gaylord Family Professor of Strategic Communication, University of Oklahoma, Independent Consultant, Research, Measurement, Monitoring, and Evaluation
The following is a summary of the presentation of the draft working paper.

5 key challenges to measuring and evaluating countering disinformation programmes

1. Disinformation is a big, all-encompassing concept.

From an evaluator’s perspective, one of the challenges is knowing what to measure: are we measuring changes in the level of disinformation?
  • Which disinformation is it that we are seeking to counter?
  • What has changed as a result of the programmes being implemented?
  • What is the unit of analysis?
  • What or who is supposed to change?
  • Is it an individual person or an institution?
  • What must happen at the societal level?
The “disinformation” that a programme is meant to counter is often lost in the course of implementation.

2. What is a programme’s overall theory of change?

Theories of change are largely underdeveloped. A clear articulation of the change that is supposed to happen as a result of specific inputs is essential.
A simple example:
If we (1) support better journalism and reporting on disinformation, (2) improve newsroom and social media production processes including fact checking and moderation, and (3) train audiences to be literate in assessing media, then we can give the public better tools to interpret and reject disinformation.
In turn, these efforts will help to reduce the impact of disinformation (e.g., on particular topics which were the programme’s focus) and to strengthen disinformation resilience among individuals and communities, a characteristic vital to any democracy.
For more information and recommendations on theories of change, see the report from the previous meeting:

3. Media transparency

Hidden influences in the media breed distrust and make it hard to obtain honest feedback.
Media transparency is a building block for professional media development based on trust between the media and the audience.
Non-transparent practices can be found worldwide (Tsetsura & Kruckeberg, 2017): media bribery, envelope journalism, paid news, and media opacity. (Many practitioners around the world also use slang words for this phenomenon: zakazukha in Russia, jinsa in Ukraine, mermelada in Peru, and pay-for-play in the USA.)
These non-transparent practices by journalists and editors have contributed to a loss of trust in the media as a social institution in many societies.

4. Trust and truth are hard to measure

Trust is central to any evaluation of countering disinformation efforts.
Countering disinformation is challenging in societies with a fragmented concept of truth, in which polarization and fragmentation, as well as purposely destructive deliberation, permeate the fabric of civic discourse.
The notion of truth is increasingly contested, and, as a result, the levels of societal and institutional trust are at an all-time low (Edelman Trust Barometer, 2021).
To what extent can one trust a researcher, an evaluator, a programme implementer in a society of distrust?
How can malign actors exploit a culture of distrust to their advantage and turn countering disinformation efforts on their head by diminishing the very nature of societal trust?

5. Making space for non-Western approaches

Disinformation cannot be viewed solely through the prism of democratic versus autocratic governments.
What is the extent of our knowledge about countering disinformation in non-Western societies?
Who has the right to identify what counts as disinformation and what are the “appropriate ways” to counter disinformation efforts?

Recommendations

1. Build a community of practice around countering disinformation

What? Create a shared vision of what constitutes change.
How?
  1. Set up an open collaborative collective of disinformation researchers.
  2. Develop a shared lexicon of disinformation terminology.
  3. Create shared indicator banks.
  4. Craft common theories of change for countering disinformation programmes.

2. Carry out an impact evaluation/study

What? The media and civil society sector should carry out a comparative impact study that looks at a variety of countering disinformation efforts to establish what has been done thus far and what has been achieved.
How? The study should be external and independent. The data collected can inform baselines for future work in this space.

3. Develop a countering-disinformation diagnostic tool for programme design

What? Produce a diagnostic tool that can be used by the wider community of practice seeking to design civil society and independent media-led countering disinformation programmes.
How? Leverage collective wisdom and understanding as a framework for assessments that inform the development and design of countering disinformation programmes, covering pressure points, contextual factors, and key areas of concern.