Disinformation

Policy briefs and working papers on media development and donor support to journalism, with a focus on addressing disinformation and information disorder.

This page is regularly updated. If you would like to suggest a resource, please get in touch.

For those advocating against "false news" laws and similar legislation around the world, this page features reference papers and analyses that can be drawn upon:

Centre for Law and Democracy

In May 2021, the Centre for Law and Democracy prepared analyses of Mauritania’s ICT Act and proposed amendments to it. CLD also released an analysis of amendments to Myanmar’s Penal Code, which includes some commentary on the new provisions on false news. Both analyses referred to the problem of banning “false news” and referenced international standards in this area.

Fondation Hirondelle

This report from Fondation Hirondelle outlines their approach to disinformation, which “centres on the fundamental principles of journalism and on the lessons learned from over 25 years of applying these principles in highly fragile contexts, where access to reliable information for the majority is not a given, and where rumours, hate speech and propaganda undermine peace building and development.”

It features recommendations for governments and development aid donors; policymakers and institutions; media owners; and companies, web and social media organisations.

Office of the United Nations High Commissioner for Human Rights - IRENE KHAN

Summary

In the present report, the Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression examines the threats posed by disinformation to human rights, democratic institutions and development processes. While acknowledging the complexities and challenges posed by disinformation in the digital age, the Special Rapporteur finds that the responses by States and companies have been problematic, inadequate and detrimental to human rights. She calls for multidimensional and multi-stakeholder responses that are well grounded in the international human rights framework and urges companies to review their business model and States to recalibrate their responses to disinformation, enhancing the role of free, independent and diverse media, investing in media and digital literacy, empowering individuals and rebuilding public trust.

SUBMISSIONS

  • DW Akademie, submission to the report on disinformation by the UN Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression (Feb 2021)

Cyber Secure Policy Exchange - AVIV OVADYA

Search engines transformed the first decade of the millennium. Recommendation engines revolutionized the second decade. Neither, in its current form, is sufficient for addressing misinformation. Both focus on discovery and rely primarily on relevance, but they are not particularly helpful for many other important information tasks, especially contextualization.

We need better tools to help people quickly contextualize the media they come across online. This is especially important for busy, everyday people who need to rapidly make sense of the misinformation-laden text, images, and videos shared in group chats and on online platforms.

Arab Reporters for Investigative Journalism (ARIJ)

The policy paper is based on a poll of 229 Arab journalists and dialogue with 17 experts, including journalists, representatives of local and international institutions concerned with training and supporting journalists, and representatives of Facebook, Twitter and Google.

The paper comprises three chapters. The first discusses what fake news is, distinguishing it from other forms of misinformation and refuting common assumptions surrounding it; the second unpacks the state of fake news in the Arab world during the COVID-19 pandemic, specifically up to August 2020; the third examines the most important methods used in the region to confront such news, discussing their effectiveness and shortcomings. Finally, the paper puts forward recommendations to prevent and combat the regional spread of misleading news.

Freedom House - ANASTASIA NANI, LOLITA BUKA & GINA LENTINE

This paper argues that a government's approach to media policy, including its adherence to fundamental media freedoms, is a key factor in the overall effectiveness of its pandemic response. The brief compares government policies relating to media during the pandemic, considers how these policies affected the media and the effectiveness of the coronavirus response, and presents actionable recommendations for Moldova and other countries in the region to develop more effective media policy in times of crisis.

Forum on Information and Democracy

In June 2020, the Forum on Information and Democracy launched its inaugural working group. The Forum asked experts, academics and jurists from all over the world to define a policy framework (a set of recommendations) to respond to the infodemic, addressing four structural challenges:

1 - Meta regulation of content moderation

To evolve from content regulation to meta-regulation (regulation of the corporate actors that dictate the moderation rules), we need to develop a set of principles that platforms and social media companies will have to accept, in accordance with international standards on freedom of opinion and expression.

2 - Platforms’ design and reliability of information

The pandemic has demonstrated the need to reverse the amplification of sensational content and rumour by promoting reliable news and information in a structural manner. Mechanisms and policies to promote the authenticity, reliability and findability of content should be determined on the basis of established criteria.

3 - Mixed private and public spaces on private messaging systems

The virality of fake news shared on messaging apps is reinforced by groups that sometimes have thousands of members. It is important to define minimum rules for messaging apps that exploit the possibilities of the online public space, while complying with international standards on freedom of opinion and expression.

4 - Transparency of digital platforms

Access to the leading digital platforms' qualitative and quantitative data, and to their algorithms, is a prerequisite for evaluating them. Transparency requirements must therefore be imposed on the platforms in order to determine whether they are meeting their responsibilities in the aforementioned areas and, more generally, with regard to their business models and algorithmic choices.

As false or manipulated information continued to proliferate online during the Covid-19 pandemic, the Forum on Information and Democracy published a report entitled How to end infodemics. Based on more than 100 contributions from international experts, it offers 250 recommendations on how to rein in a phenomenon that threatens democracies and human rights, including the right to health.

Launched in 2019 by 11 non-governmental organizations and research centres, the Forum on Information and Democracy created a working group on infodemics in June 2020 to devise a "regulatory framework" to respond to the information chaos on online platforms and social media. After five months of work, this group, whose steering committee is co-chaired by Maria Ressa and Marietje Schaake, published a detailed report with 250 recommendations for governments and digital platforms.

The twelve main recommendations of the working group

Public regulation is needed to impose transparency requirements on online service providers.

1. Transparency requirements should relate to all platforms' core functions in the public information ecosystem: content moderation, content ranking, content targeting, and social influence building.

2. Regulators in charge of enforcing transparency requirements should have strong democratic oversight and audit processes.

3. Sanctions for non-compliance could include large fines, mandatory publicity in the form of banners, liability of the CEO, and administrative sanctions such as closing access to a country's market.

A new model of meta-regulation with regard to content moderation is required.

4. Platforms should follow a set of Human Rights Principles for Content Moderation based on international human rights law: legality, necessity and proportionality, legitimacy, and equality and non-discrimination.

5. Platforms should assume the same kinds of obligations in terms of pluralism that broadcasters have in the different jurisdictions where they operate. An example would be the voluntary fairness doctrine.

6. Platforms should expand the number of moderators and spend a minimal percentage of their income on improving the quality of content review, particularly in at-risk countries.

New approaches to the design of platforms have to be initiated.

7. Safety and quality standards of digital architecture and software engineering should be enforced by a Digital Standards Enforcement Agency. The Forum on Information and Democracy could launch a feasibility study on how such an agency would operate.

8. Conflicts of interest of platforms should be prohibited, in order to avoid the information and communication space being governed or influenced by commercial, political or any other interests.

9. A co-regulatory framework for the promotion of public interest journalistic content should be defined, based on self-regulatory standards such as the Journalism Trust Initiative; friction should be added to slow down the spread of potentially harmful viral content.

Safeguards should be established in closed messaging services when they take on the characteristics of public spaces.

10. Measures that limit the virality of misleading content should be implemented through limitations on certain functionalities, opt-in features for receiving group messages, and measures to combat bulk messaging and automated behavior.

11. Online service providers should be required to better inform users about the origin of the messages they receive, especially by labelling those which have been forwarded.

12. Mechanisms for users to notify illegal content, and appeal mechanisms for users banned from services, should be reinforced.

Read the full report here.
