Written by Bret Barrowman, Senior Specialist for Research and Evaluation, Evidence and Learning Practice at the International Republican Institute, and Amy Studdart, Senior Advisor for Digital Democracy at the International Republican Institute
Conceptual Framework
Even in relatively democratic, competitive political party environments, two related dilemmas make countering disinformation difficult. First, competitive parties face a “tragedy of the commons” with respect to disinformation: a healthy information environment produces the best social outcomes, but each individual actor has an incentive to gain a marginal electoral advantage by muddying the waters. Second, parties are not unitary; they are collections of distinct candidates, members, supporters, and associated interest groups, each with its own interests and incentives. As a result, even when party organizations are committed to information integrity, they face a “principal-agent” dilemma in monitoring and sanctioning co-partisans. Together, these dilemmas give political parties and candidates an incentive to avoid engaging with or implementing programmatic responses to disinformation. Democracy, human rights, and governance (DRG) funders and implementing partners can mitigate these dilemmas by using their networking and convening power to help parties maintain commitments to information integrity, both within and between parties.
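To make this collective-action logic concrete, the sketch below models the two-party case as a one-shot prisoner’s dilemma. The payoff numbers are illustrative assumptions, not empirical estimates; only their ordering matters. It shows that each party gains a marginal edge by “muddying the waters” regardless of what its competitor does, even though mutual integrity would leave both better off.

```python
# A minimal sketch of the disinformation "tragedy of the commons" as a
# one-shot two-party game. All payoff values are hypothetical; only their
# ordering (temptation > mutual integrity > mutual disinformation > exploited)
# matters for the argument.

COOPERATE, DEFECT = "integrity", "disinform"

# payoffs[(my_choice, their_choice)] = my payoff
payoffs = {
    (COOPERATE, COOPERATE): 3,  # healthy information environment
    (COOPERATE, DEFECT):    0,  # exploited: rival gains a marginal edge
    (DEFECT,    COOPERATE): 4,  # temptation: marginal electoral advantage
    (DEFECT,    DEFECT):    1,  # degraded information environment
}

def best_response(their_choice: str) -> str:
    """Return the payoff-maximizing reply to a fixed competitor choice."""
    return max((COOPERATE, DEFECT), key=lambda mine: payoffs[(mine, their_choice)])

for theirs in (COOPERATE, DEFECT):
    print(f"If the rival plays {theirs!r}, best response: {best_response(theirs)!r}")
# Prints 'disinform' in both cases: defection is dominant, so the unique
# equilibrium is mutual disinformation, even though mutual integrity (3, 3)
# is better for both parties and for the information environment.
```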
Programmatic Responses
DRG practitioners have implemented a wide range of programmatic approaches to reduce both the impact and the use of disinformation and related tactics by political parties during elections. These approaches are summarized in the table below according to the “core party function(s)”, the functions that parties perform in an ideal-type democratic party system, upon which each program approach might be expected to operate. This typology is intended to give DRG practitioners a tool with which to analyze party systems and programmatic approaches, with the goal of designing programs tailored to the challenges facing political party partners.
The four core party functions are:

- Interest articulation: expressing citizen interests through electoral campaigns or the implementation of policy.
- Interest aggregation: bundling many disparate, and occasionally conflicting, citizen interests into a single branded policy package or platform.
- Mobilization: activating citizens, usually party supporters, for political engagement, including attending rallies or events, taking discrete actions like signing petitions or contacting representatives, and, especially, voting.
- Persuasion: parties’ or candidates’ attempts to change the opinions of voters, particularly undecided voters or opposition supporters, on candidates or policy issues.

Program Approaches | Interest Articulation | Interest Aggregation | Mobilization | Persuasion
---|---|---|---|---
Programs on Digital Media Literacy | * | * | * | 
Programs on AI and Disinformation | * | * | | 
Programs for Closed Online Spaces and Messaging Apps | * | * | | 
Programs on Data Harvesting, Ad Tech & Microtargeting | * | * | * | 
Programs on Disinformation Content and Tactics | * | * | | 
Research Programs on Disinformation Vulnerability and Resilience | * | * | | 
Programs for Understanding the Spread of Disinformation Online | * | * | | 
Programs Combating Hate Speech, Incitement, and Polarization | * | * | | 
Policy Recommendations and Reform / Sharing and Scaling Good Practice in Programmatic Responses | * | * | * | *
Recommendations
- When implementing these programmatic approaches, consider political incentives in addition to technical solutions.
- Programmatic interventions should account for diverging interests within parties. Parties are composed of functionaries, elected officials, interest groups, formal members, supporters, and voters, each of whom may have unique incentives to propagate or take advantage of disinformation.
- The collective action problem of disinformation makes one-off interactions with single partners difficult. Consider implementing technical programs with regular, ongoing interaction among all relevant parties to increase confidence that competitors are not “cheating” (see the sketch after this list).
- Relatedly, use the convening power of donors or implementing organizations to bring relevant actors to the table.
- Consider pacts or pledges, especially in pre-election periods, in which all major parties commit to mitigating disinformation. Importantly, the agreement itself is cheap talk; pay careful attention to the design of institutions, both within the pact and external to it, for monitoring compliance.
- There is limited evidence for the effectiveness of common counter-disinformation program approaches that focus on political parties and political competition, including media literacy, fact-checking, and content labeling. Limited evidence does not necessarily mean these programs do not work; it means that DRG funders and implementing partners should invest in rigorous evaluation to determine these programs’ impact on key outcomes, such as political knowledge, attitudes and beliefs, polarization, propensity to engage in hate speech or harassment, and political behavior like voting, and to identify the design elements that distinguish effective programs from ineffective ones.
- DRG program responses have tended to lag behind political parties’ use of sophisticated technologies like data harvesting, microtargeting, deepfakes, and AI-generated content. Funders and implementing partners should consider using innovation funds to generate concepts for responses that mitigate the potentially harmful effects of these tools, and to rigorously evaluate their impact.
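As a complement to the recommendation on regular, ongoing interaction above, the sketch below extends the one-shot game from the conceptual framework to repeated play. Again, the payoff values and number of rounds are hypothetical assumptions; the point is that when parties expect to interact over many rounds and defection is observable and sanctioned (here, by a simple grim-trigger rule), keeping an integrity pact can pay better than a one-time gain from cheating.

```python
# Why repeated interaction and monitoring can sustain an integrity pact:
# compare payoff streams over repeated rounds (hypothetical values, matching
# the one-shot sketch above).

ROUNDS = 10                      # assumed number of future interactions
MUTUAL_INTEGRITY = 3             # per-round payoff when both keep the pact
TEMPTATION = 4                   # one-time gain from cheating first
PUNISHMENT = 1                   # per-round payoff once the pact collapses

# Payoff stream if a party keeps the pact for all rounds:
keep_pact = MUTUAL_INTEGRITY * ROUNDS

# Payoff stream if it cheats in round 1 and rivals respond with a grim
# trigger (defecting in every subsequent round):
cheat_once = TEMPTATION + PUNISHMENT * (ROUNDS - 1)

print(f"keep pact: {keep_pact}, cheat once: {cheat_once}")
# keep pact: 30, cheat once: 13 -- with enough expected future rounds,
# compliance dominates; with one-off interaction (ROUNDS = 1), it does not.
```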