1. Political Parties and the Tragedy of the Information Commons

Updated on Mar 31, 2021

Definition of Political Parties

Political parties are organized groups of individuals with similar political ideas or interests who try to make policy by getting candidates elected to office.1 This electoral function – advancing candidates for office and securing votes for those candidates – distinguishes political parties from other organizations, including civil society organizations (CSOs) or interest groups. This electoral role creates unique incentives for political party actors with respect to disinformation and programmatic responses. 


Political Parties, Information, and Democracy: An Overview for Developing Context Analysis, Problem Statements, and Theories of Change

 

How Parties Connect Citizens with their Representatives

The ability of party systems to constructively shape electoral competition depends on the exchange of high-quality information. Conceptually, parties connect citizens to elected officials through a market mechanism. In democratic multiparty systems, political parties bundle many disparate, and occasionally conflicting, interests into a single branded package (interest aggregation), which they in turn “sell” to voters during elections (interest articulation).2 Importantly, this process represents an ideal model of democratic competition between programmatic political parties – one that political scientists expect to produce the best democratic outcomes for citizens, including high-quality public goods and services and high levels of accountability. In practice, however, no party or party system fully realizes this model, and many fall far short of it.

Indeed, in many cases, parties fail to effectively aggregate or articulate citizen preferences. Disinformation, by creating fractured, isolated epistemic communities, makes the processes of interest aggregation and articulation more difficult, although it is ultimately unclear whether disinformation is a cause or a consequence of these failures. For these processes to operate effectively, political parties and elected officials must have good information about the preferences of their constituents, and voters must have good information about the performance of their representatives. Party brands facilitate this accountability by providing a yardstick: citizens can judge their representatives against what the party brand promises. This transmission of information between elites and voters is a necessary (but not sufficient) condition for democratic party systems to function. Without good information, parties and elected officials cannot ascertain constituent preferences, and voters cannot associate performance or policy outcomes with a party brand in order to hold elected officials accountable.

These processes are particularly important for political inclusion. Clear information about constituent preferences and representatives’ performance improves the likelihood that the interests of marginalized groups are heard and perceived as legitimate, and thereby gives political leaders an electoral incentive to address those interests. Conversely, disinformation can influence whose voices are heard and which interests are seen as legitimate; political elites may thus have an incentive to use disinformation to further marginalize under-represented groups.

Excludability and Attribution: Why it is hard for citizens to hold representatives accountable for public policies without functioning parties and good information

This problem of exchanging good information is compounded by the nature of public policies. In economic terms, public goods and services are non-excludable – it is difficult to prevent individual citizens from enjoying them once they are provided. For example, national defense protects all citizens, even those who have not paid their taxes; it is neither practical nor cost-effective for a state to withhold national defense from specific citizens. Private goods – money in exchange for a vote, for example – can be delivered directly to specific individuals, who know exactly who provided them.

Public policies, on the other hand, suffer from a problem of attribution. Since these goods are provided collectively, citizens may be less sure which specific officials or parties are responsible for them (and, conversely, who is responsible for unintended consequences or for the absence of policy altogether). Public policies are also complicated: both the policies themselves and their observable outcomes for citizens are the products of complex interactions of interests, context, policymaking processes, and implementation. Furthermore, observable outcomes, like a good economy or a healthy population, may significantly lag the policies most directly responsible for them. As such, citizens may find it difficult to attribute policy outcomes to specific representatives.3 Political parties can help simplify complex policy issues for voters – again assuming an exchange of good information between elites and voters.

The Tragedy of the Information Commons: Accounting for Incentives in Countering Disinformation Programs

These interrelated concepts – the interest aggregation and articulation functions of parties, the role of information in democratic political competition, and the attribution problems of public policies – have important implications for the design and implementation of counter-disinformation programs. Like national defense or functioning transportation infrastructure, a healthy information environment benefits everyone, and it is impractical to exclude individuals or groups from that benefit. For parties, this nature of the information environment creates a collective action or “free-rider” problem.4 While the best collective outcomes occur when all actors refrain from engaging in disinformation, each individual actor has an incentive to “free-ride” – to enjoy the healthy information environment while gaining a marginal competitive advantage by muddying the waters. In this sense, the problem of disinformation for political parties is a tragedy of the commons,5 in which small transgressions by multiple actors end up spoiling the information environment.6 This can occur even in ideal circumstances – relatively open environments with competitive elections. It is compounded in authoritarian or semi-authoritarian systems in which the incumbent exercises significant control over the information environment through repression or control of media outlets, or in which fringe parties or politicians have an incentive to proliferate provocative content to gain attention or visibility.7 Such control of the information environment precludes meaningful electoral competition between parties, further reducing any incentive to cooperate on information integrity. While this situation may create incentives for opposition parties to counter disinformation, especially if they see gains from public perceptions of honesty, it may also lead to vicious cycles of degrading the information environment during alternations of power.

Like other public goods and services, a good information environment benefits everyone. Citizens get accurate information about how their representatives are doing and can reward or sanction them accordingly. Parties get good information about what citizens want. A good information environment depends on every actor committing to this outcome. Yet each party has an electoral incentive to muddy the waters – to let every other competitive party be honest while it misrepresents issues of public policy. Again, this dilemma makes countering disinformation difficult even in the best-case scenario. Where parties and party systems fall short of this ideal type, the dilemma will be more difficult to resolve.
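The free-rider logic described above has the structure of a two-party prisoner’s dilemma, and can be sketched as such. The payoff numbers below are illustrative assumptions chosen only to satisfy the ordering the text implies (free-riding beats mutual honesty, which beats a spoiled commons); they are not empirical estimates.

```python
# A minimal game-theoretic sketch of the disinformation dilemma:
# two parties each choose to stay "honest" or to "muddy" the waters.
# Payoff values are illustrative assumptions, not empirical estimates.

# payoffs[(row_choice, col_choice)] = (row_payoff, col_payoff)
PAYOFFS = {
    ("honest", "honest"): (3, 3),  # healthy information commons: best collective outcome
    ("honest", "muddy"):  (0, 4),  # free-rider gains an edge at the honest party's expense
    ("muddy",  "honest"): (4, 0),
    ("muddy",  "muddy"):  (1, 1),  # spoiled commons: worst collective outcome
}

def best_response(opponent_choice: str) -> str:
    """Return the payoff-maximizing reply to a fixed opponent choice."""
    return max(("honest", "muddy"),
               key=lambda mine: PAYOFFS[(mine, opponent_choice)][0])

# "Muddying the waters" is a dominant strategy: it is the best response
# whether the opponent is honest or not...
assert best_response("honest") == "muddy"
assert best_response("muddy") == "muddy"

# ...so the equilibrium (muddy, muddy) leaves both parties worse off
# than mutual honesty -- the tragedy of the information commons.
equilibrium = PAYOFFS[("muddy", "muddy")]
cooperative = PAYOFFS[("honest", "honest")]
assert equilibrium[0] < cooperative[0] and equilibrium[1] < cooperative[1]
print("equilibrium payoffs:", equilibrium, "vs cooperative:", cooperative)
```

The sketch makes the incentive problem concrete: no party can unilaterally improve its position by committing to honesty, which is why the coordinated, simultaneous commitments discussed below matter.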



Principal-Agent Problem: An organizational problem in which one actor (the principal) has authority to set collective goals and must ensure that one or more other actors (the agents) behave in ways that advance those goals, despite the agents controlling information about their own performance. For an illustration of the principal-agent problem in campaign messaging, see Enos, Ryan D., and Eitan D. Hersh. “Party Activists as Campaign Advertisers: The Ground Campaign as a Principal-Agent Problem.” American Political Science Review 109, no. 2 (May 2015): 252–78. https://doi.org/10.1017/S0003055415000064.

The Principal-Agent Problem of Political Parties: Maintaining commitments to countering disinformation within parties

Furthermore, political parties are not unitary; they are coalitions of varied (and often competing) candidates, constituencies, and interest groups. All parties therefore face an additional challenge of keeping candidates and members accountable to the party’s organizational goals and platform. In the context of disinformation, even democratically inclined or reform parties – or parties that believe they can gain votes by taking a stand against disinformation – confront a principal-agent problem. On the one hand, party leaders may simply be unaware of affiliates’ attempts to generate or exploit disinformation. On the other hand, this problem creates plausible deniability: elites may tacitly encourage supporters to engage in disinformation to help the party’s electoral prospects while the leadership publicly signals a commitment to information integrity. In addition, individual party members often exploit gender or other identity-based cleavages against “competitors” within their own party to gain a competitive edge, which can include promoting hate speech, disinformation, or other harmful content in the public sphere. If this dynamic goes unacknowledged, DRG programming can inadvertently legitimize campaign tactics that undermine democratic accountability. In short, DRG practitioners should not assume political parties are unitary, and technical solutions should include approaches that help party actors ensure all candidates and supporters maintain commitments to information integrity. While these models illustrate important incentives that program designers should be sensitive to, they do not preclude technical solutions. Beyond providing encouragement, support, and training for party leadership in setting tone and expectations, establishing infrastructure for communication and coordination within the party can help hold members and candidates accountable.
The “DRG Program Responses to Disinformation with Political Party Partners” section below provides concrete ideas for programs to support parties’ efforts to protect information integrity.
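The monitoring logic behind this principal-agent problem can be sketched with a simple expected-utility comparison: a candidate (agent) defects from the party leadership’s (principal’s) integrity commitment only when the expected electoral gain exceeds the expected sanction. All numeric values here are illustrative assumptions, not empirical estimates.

```python
# A hedged sketch of the principal-agent logic above: party leadership can
# deter a candidate from using disinformation only if the expected penalty
# (detection probability x sanction) outweighs the expected electoral gain.
# All numbers are illustrative assumptions.

def agent_defects(electoral_gain: float,
                  detection_prob: float,
                  sanction: float) -> bool:
    """Agent uses disinformation when expected gain exceeds expected penalty."""
    return electoral_gain > detection_prob * sanction

# With weak internal monitoring (10% chance of detection), even a large
# sanction fails to deter a modest electoral gain from disinformation.
assert agent_defects(electoral_gain=2.0, detection_prob=0.1, sanction=10.0) is True

# Better internal communication and coordination infrastructure raises the
# detection probability, and the same sanction now deters the same gain.
assert agent_defects(electoral_gain=2.0, detection_prob=0.5, sanction=10.0) is False
```

This is why the text emphasizes intra-party communication and coordination infrastructure: in this framing, such infrastructure raises the probability that defections are detected, which does more to sustain commitments than simply increasing the severity of sanctions.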

Party Functions, Incentives for Abuse, and Program Design

In concrete terms, political parties perform four information-based functions in democratic multi-party systems: interest aggregation, interest articulation, citizen mobilization, and persuasion. Democratic collective outcomes are more likely when parties perform these functions based on good information. However, within each function, parties or individual candidates have incentives to manipulate information to gain an electoral advantage.

Interest aggregation refers to parties’ capacity to solicit information about citizen interests and preferences. To develop responsive policies and compete in elections, parties must have reliable information about the interests and preferences of voters. However, parties may also have an incentive to mischaracterize public sentiment to both their opponents and the public. For instance, to prioritize a policy that is broadly unpopular but important to a key constituency, a party or candidate might mischaracterize a public opinion study, or artificially amplify support for the policy on social media using bot networks.

Interest articulation refers to parties’ ability to promote ideas, platforms, and policies, both in campaigns and in the policymaking process. Interest articulation requires political parties to engage in both mass and targeted communication with voters on issues. This function may also require parties and candidates to persuade citizens of their viewpoints (see below) – particularly to convince voters that specific policies will serve those voters’ interests. Again, there is a social benefit to “true” information about policies and positions: citizens can cast their votes for the parties that best represent their preferences. However, to gain an electoral advantage, individual parties or candidates may have an interest in misrepresenting their policy positions or the potential consequences of preferred policies, whether by fabricating research studies or by scapegoating vulnerable groups.

Mobilization refers to parties’ capacity to activate citizens for political engagement, including attending rallies or events, taking discrete actions like signing petitions or contacting representatives, and especially voting. To produce the most democratic outcomes, mobilization should be based on good information; parties should provide potential voters with accurate information about policies and the electoral process – particularly where and how to vote.  However, individual parties and candidates can gain an electoral advantage by engaging in more nefarious mobilization tactics. Mobilization can involve coercion – the use of disinformation to “scare” voters about the consequences of opponents’ policies, to activate voters by inflaming prejudices or political cleavages, or to demobilize opposition candidates or supporters through harassment or by generating apathy. 

Persuasion refers to parties’ or candidates’ attempt to change voters’ opinions on candidates or policy issues. In contrast to mobilization, which often focuses on known party supporters or apathetic voters, persuasion is usually targeted to moderates, “undecideds” or weak supporters of opposing parties. 



Key Agents: Trolls, bots, fake news websites, conspiracy theorists, politicians, partisan media outlets, mainstream media outlets, and foreign governments9

Key Messages10: causing offense, affective polarization, racism/sexism/misogyny, “social proof” (artificial inflation of indicators that a belief is widely held), harassment, deterrence (use of harassment or intimidation to discourage an actor from taking an action, like running for office or advocating for a policy), entertainment, conspiracy theories, fomenting fear or anxiety of a preferred in-group, logical fallacies, misrepresentations of public policy, factually false statements 

Key Interpreters: Citizens, party members and supporters, elected officials, members of the media



Notes on Sources of Disinformation:

The information disorder framework suggests identifying disinformation tactics and possible responses by thinking systematically about the agent, the message, and the interpreter of the information. Many DRG programs, especially those funded by U.S. government donors, focus on building resilience within a target country to foreign disinformation campaigns, especially those of the governments (or pro-government supporters) of China, Russia, and Iran. It is important to note, however, especially among political parties, that the agents (or perpetrators) of disinformation campaigns may also be domestic actors seeking to affect the behavior of their political opposition or their own supporters. Even in foreign-directed campaigns, interpreters (or targets) are not selected broadly or arbitrarily; rather, these campaigns seek to exacerbate existing social and political cleavages, with the goal of eroding trust in institutions writ large. Furthermore, foreign campaigns often rely on witting or unwitting supporters in target countries. In both foreign and domestic disinformation campaigns, therefore, historically marginalized groups – including women; ethnic, religious, or linguistic minorities; persons with disabilities; and LGBTI individuals – are often disproportionately targeted and harmed by these efforts.8

Implications

These interrelated concepts – the excludability and attribution problems of public policies, the concept of the information space as a tragedy of the commons, and the role of information in the interest aggregation, articulation, and mobilization functions of political parties – have several concrete implications for practitioners designing and implementing counter-disinformation programs:

  1. The best collective democratic outcomes require a healthy information environment.
  2. Each individual actor (a party or candidate) may perceive an incentive to let others behave honestly while they try to gain a competitive advantage through disinformation.
  3. As such, political parties have an incentive both to perpetuate and to mitigate disinformation. Whether they choose to perpetuate or mitigate depends on the context; in short, parties want voters to have true information about things that help them and false information about things that hurt them. The inverse is true for their political opponents. 
  4. Information disorders are a product of many actors perceiving this incentive structure and knowing their competitors are acting according to these incentives. Parties might be willing to commit to information integrity if they could be confident their competitors would do the same simultaneously. However, if they lack that confidence, even honest or democratically inclined parties may be unwilling or unable to forgo disinformation if doing so means losing elections and the opportunity to implement their agenda.
  5. Even when political parties ARE committed to information integrity, they are not unitary, and they face an additional challenge of keeping candidates, members, and supporters accountable to those commitments. An unwillingness or inability to monitor and sanction co-partisans compounds the dilemma in point four above.
  6. Consider these political incentives before designing and implementing technical solutions. Frameworks like Thinking and Working Politically and Applied Political Economy Analysis can help practitioners better understand the unique political incentives facing potential partners and beneficiaries. Key political solutions may include internal and external monitoring and coordination between relevant parties in committing to mitigating disinformation. 
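Point four – that parties might sustain honesty if confident their competitors will reciprocate – has a standard repeated-game interpretation, which the arithmetic below sketches. The payoff values are illustrative assumptions (T = the temptation payoff from free-riding, R = the reward for mutual honesty, P = the punishment payoff of a spoiled commons), not empirical estimates.

```python
# A sketch of why credible mutual commitments can sustain honesty: in a
# repeated version of the disinformation dilemma, a party resists defecting
# if it values future payoffs enough. Payoffs are illustrative assumptions.

T, R, P = 4.0, 3.0, 1.0  # one-shot payoffs, ordered T > R > P

# Under a grim-trigger strategy (any defection triggers permanent mutual
# defection), cooperating forever pays R/(1-delta), while defecting once
# pays T + delta*P/(1-delta). Cooperation is self-enforcing when
#     R/(1-delta) >= T + delta*P/(1-delta),
# which rearranges to delta >= (T - R)/(T - P).
min_delta = (T - R) / (T - P)
print(f"honesty is self-enforcing when delta >= {min_delta:.2f}")
```

The intuition matches the text: the more each party expects to keep competing against the same rivals (a high discount factor, delta), and the more credible mutual monitoring makes the threat of reciprocal defection, the easier it is to sustain a shared commitment to information integrity.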

A note on technical solutions: In some cases, particularly with democratic or reform parties, parties may express a willingness to take concrete steps to mitigate disinformation. In a smaller set of cases, all major competitive parties might be willing to take these steps. Yet even in these best-case scenarios, where parties are engaged in building solutions, technical solutions face challenges of their own, including resource constraints and limited technical capacity. Technical solutions, moreover, are necessarily secondary to more fundamental political solutions.

Footnotes
  1.  ACE Electoral Knowledge Network. “Roles and Definition of Political Parties.” Accessed August 7, 2020. https://aceproject.org/ace-en/topics/pc/pca/pca01/pca01a.
  2.  This market analogy is taken from the seminal political science literature on political parties including Aldrich, John H. Why Parties?: The Origin and Transformation of Political Parties in America. Chicago: University Of Chicago Press, 1995; Kitschelt, Herbert, Zdena Mansfeldova, Radoslaw Markowski, Gabor Toka, and Ellen Comisso. Post-Communist Party Systems. Cambridge University Press, 1999. https://www.cambridge.org/core/books/postcommunist-party-systems/17E6F6F698800A1A6E3E58AEA27A86DC; and others. For a succinct overview of the market analogy, see Hale, Henry E. “Why Not Parties? Electoral Markets, Party Substitutes, and Stalled Democratization in Russia.” Comparative Politics 37, no. 2 (2005): 147–66. https://doi.org/10.2307/20072880.
  3. This attribution problem – the idea that the benefits of public goods are long term and uncertain, and private goods have short term and guaranteed benefits, particularly in electoral support, is central to the “politician’s dilemma” – a concept that explains why political clientelism and corruption retain their appeal even for well-meaning or public-minded elected officials. See Geddes, Barbara. Politician’s Dilemma: Building State Capacity in Latin America. University of California Press, 1994.
  4.  Olson, Mancur. The Logic of Collective Action: Public Goods and the Theory of Groups. 2nd Edition. Cambridge, Mass.: Harvard University Press, 1971.
  5.  Hardin, Garrett. “The Tragedy of the Commons.” Science 162, no. 3859 (December 1968): 1243–48.
  6.  Several academic and popular/policy-oriented analyses have described information disorders as a tragedy of the commons. See, for example, Tierney, John. “The Non-Tragedy of the Commons.” TierneyLab (blog), October 15, 2009. https://tierneylab.blogs.nytimes.com/2009/10/15/the-non-tragedy-of-the-commons/; and Gapper, John. “Facebook Faces the Tragedy of the Commons,” November 29, 2017. https://www.ft.com/content/ec74ce54-d3e1-11e7-8c9a-d9c0a5c8d5c9.
  7.  Bret Schafer, personal communication with the authors, January 2021.
  8.  For examples of how violence against women (VAW) online deters women’s political engagement, see Zeiter, Kirsten, Sandra Pepera, Molly Middlehurst, and Derek Ruths. “Tweets That Chill: Analyzing Online Violence Against Women in Politics.” National Democratic Institute, 2019. https://d4dcoalition.org/sites/default/files/2020-06/NDI%20Tweets%20That%20Chill%20Report.pdf.
  9.  Tucker, Joshua, Andrew Guess, Pablo Barbera, Cristian Vaccari, Alexandra Siegel, Sergey Sanovich, Denis Stukal, and Brendan Nyhan. “Social Media, Political Polarization, and Political Disinformation: A Review of the Scientific Literature.” Prepared for: William + Flora Hewlett Foundation, 2018. https://doi.org/10.2139/ssrn.3144139.
  10.  These mechanisms are drawn from Ibid. pp. 26-27.