Healthy Information Space

Effective democracies require that citizens have access to accurate and impartial electoral and political information. Five key components of a healthy information space – civil society, election management bodies, governments, mass media, technology companies – must work in tandem to fight back against disinformation campaigns that seek to spread cynicism, distort political processes and hinder citizens’ ability to make sound political decisions.

Who Does What

These five sectors of society play important roles in defending the integrity of information in democracies. The analysis presented in this guide focuses on the responsibilities and best practices of these actors.

1. Civil Society

Civil society organizations combat disinformation and promote information integrity through programs and other initiatives, including fact checking, media literacy, online research and a host of other methods.

2. Election Management Bodies

Election Management Bodies play an essential role in protecting the integrity of elections and countering disinformation through proactive, reactive and collaborative strategies.

3. Governments

The public sector, including government administrations, legislatures and other executive bodies, plays a critical role in the governance of the online space, from the legal and regulatory frameworks these institutions build and maintain, to the norms and standards they seek to promote, to their engagement with other stakeholders in sustaining a healthy information ecosystem.

4. Technology Companies

Technology companies, particularly social media platforms, play a critical role as the hosts, infrastructure and networks for much of our online space. They also bear a great responsibility to build and maintain these spaces in a way that promotes democratic ideals such as freedom of expression, privacy and open access to information.

5. Media Organizations

Whether on social media platforms or through traditional journalism, media organizations play an essential role in protecting the information environment from false narratives through fact checking, moderation, media literacy, and research.

By The Numbers

The guide includes the first-ever global database of organizations and initiatives working to counter disinformation, with more than 270 entries from more than 80 countries.

282+ total interventions worldwide
112 media literacy interventions
144 fact checking interventions
58 tools
8 involving election management bodies
14 engaging with platforms

9 Big Takeaways

In conducting this analysis and examining these critical aspects of the problem, the research team identified key takeaways that should drive counter-disinformation efforts going forward.

1. Disinformation exists in every information ecosystem in the world. No actor can address this alone. For this reason, a whole-of-society approach is needed that encourages actors from governments, civil society, and industry to work together to counter disinformation and strengthen societal response.
2. Countering disinformation is not the top priority for most institutions, governments, political parties, or civil society groups, and some of these actors themselves spread both disinformation and misinformation. Until a shared sense of urgency drives a collective effort to address the problem, lasting change cannot be achieved.
3. Efforts to combat disinformation in elections and to address existing societal cleavages are distinct but overlapping challenges. Donors and implementers should not let a bias toward technologically innovative programming undercut continued investment in building the types of durable capacity that make democratic stakeholders more resilient when disinformation challenges arise.
4. Public and private institutions such as Election Management Bodies and platforms are often well equipped to address disinformation challenges but lack credibility. By contrast, civil society is credible, nimble, and essential, but chronically under-resourced.
5. No single approach (media literacy, fact checking, research and monitoring, social media takedowns, etc.) is sufficient. A holistic approach to countering disinformation is essential.
6. Focusing on major events, such as the outcomes of elections and referendums, is effective in creating safe political processes. This contributes to, but does not by itself achieve, a healthy information ecosystem.
7. Understanding the impact of gendered disinformation and the role gender plays in information integrity is critical. Interventions must therefore include a gender component and be localized to their context, from program design through implementation, in order to increase effectiveness and minimize potential harm.
8. Efforts to counter disinformation that rely on content moderation alone are not sufficient. Creating a healthy information ecosystem also requires developing norms and standards, legal and regulatory frameworks, and better content moderation practices on social media platforms. This is especially important for strengthening complex information environments in the Global South.
9. Political parties play a critical role both in political systems and in the creation and dissemination of online campaigns that often propagate disinformation and other harmful forms of content. It is important to put frameworks in place that discourage political parties from engaging in disinformation.