Programs on Disinformation Content and Tactics
Programs examining disinformation content and tactics take a wide variety of forms, from simply collecting and analyzing disinformation to infiltrating the groups that produce it in order to study their methods. These approaches also play an important accountability function with respect to political parties. A focus on the content of disinformation may help citizens and CSOs clarify complex policy issues, reducing the space for parties and candidates to muddy the waters. In this approach, independent journalists, volunteers, or CSOs check the veracity of content, issue corrections, and, in some instances, work with social media companies to flag misleading content, limit its spread, and post the fact-checker's correction alongside a post. Some of these initiatives target political party or candidate content explicitly, while others look at the broader information ecosystem and fact-check stories based on their likely impact, their spread, or a specific area of interest.
Programs to develop fact-checking and verification outlets are rarely implemented in direct partnership with political party actors, since the approach requires political neutrality to be effective. These programs can nonetheless serve an important accountability function by acting on the incentives of political actors. The theory of change underlying these approaches is that if political actors, especially elected officials, know that false statements will be identified and corrected in a public forum, they may be less likely to make such statements in the first place. Furthermore, fact-checking and verification outlets can provide accurate information to voters, who may then more effectively punish purveyors of disinformation at the ballot box. In Ukraine, for example, a program funded by the British Embassy and implemented by CASE Ukraine developed a set of information technology (IT) tools that enable citizens to analyze state budgets, with the aim of building the critical thinking needed to counter politicians’ populist rhetoric on complex economic issues.14 Similarly, support for “explainer journalism” modeled on outlets like Vox.com in the United States has emerged as an approach to counter parties’ attempts to confuse citizens on complex policy issues. VoxUkraine, for example, supported by several international donors and implementing partners, provides fact-checking, explainers, and analytical articles, especially on issues of economic reform in Ukraine.15
Program approaches have also drawn on pop culture, using satire and humor to encourage critical thinking about disinformation on complex issues. For example, Toronto TV, supported by the National Endowment for Democracy, Internews, and Pact, and inspired by the American satirical takes on news and current events of Jon Stewart, John Oliver, Hasan Minhaj, and others, uses social media platforms and short video segments to challenge disinformation narratives propagated by prominent politicians.
A number of interventions aimed at this issue have focused on countering disinformation ahead of election cycles and on understanding the role of social media in spreading information during modern political campaigns, such as International IDEA’s roundtable on “Protecting Tunisian Elections,” held in 2019. Similarly, the Belfer Center’s Cybersecurity Campaign Handbook, developed in partnership with NDI and IRI, provides context and clear guidance for campaigns facing a variety of cybersecurity issues, including disinformation and hacking. In terms of more concrete activities, DRG practitioners are building media monitoring into existing programs, including election observation. Grafting media monitoring onto existing program models and activities is a promising approach that could allow DRG programs to counter disinformation at scale. A potential drawback, however, is that it concentrates intervention on election cycles, while disinformation content and tactics transcend elections and operate over long periods of time.16 With this in mind, program designers and funders should consider supporting efforts that bridge elections and that often extend beyond the life of a standard DRG program.
Ultimately, the real-world effects of content awareness and fact-checking programs are unclear. Academic research suggests that while fact-checking can change individual attitudes under very specific circumstances, it also has the potential to cause blowback or retrenchment, that is, increased belief in the material that was fact-checked in the first place.17 Furthermore, there appears to be relatively little research on whether fact-checking deters the proliferation of disinformation among political elites. Anecdotally, fact-checking may lead politicians to attempt to discredit the source rather than change their behavior.18 Establishing whether fact-checking approaches have any deterrent effect will require donors and implementers to evaluate the impact of these programs more rigorously.
In any case, the existence of fact-checking and verification outlets or awareness-building efforts alone is likely not sufficient to change political actors’ behavior regarding false statements or disinformation. In Ukraine, for example, research suggests that audiences for prominent fact-checking outlets were constrained geographically and demographically: primary audiences tended to be younger, more urban, internet-connected, educated, and wealthy, and already inclined to monitor and sanction disinformation on their own.20 Fact-checking and verification programs should therefore pay close attention to deliberately expanding audiences to include populations that might otherwise lack the opportunity or resources to access high-quality information. These programs should also consider efforts to make elected officials themselves aware of the outlets’ monitoring mechanisms and audience reach. If candidates or elected officials are confident that the products of these outlets are not accessible to, or used by, their specific constituencies, the programs will be less effective in serving an accountability function.