Election Monitoring

Written by Julia Brothers, Senior Advisor for Elections and Political Processes at the National Democratic Institute

 

Democratic elections rely on a competitive process, faith in electoral institutions, and informed participation by all citizens. However, the deployment of false, exaggerated, or contradictory information in the electoral environment has been effective in undermining these principles around the world. By interfering with the formation and holding of opinions, disinformation amplifies voter confusion, dampens turnout, deepens social cleavages, advantages or disadvantages certain parties and candidates, and degrades trust in democratic institutions. While anti-democratic disinformation campaigns are not new, modern information technology and the platforms through which citizens get their news, including online and via social media, enable information to spread at speeds, distances, and volumes unprecedented in previous electoral cycles.

International standards for democratic elections call for open, robust, and pluralistic information environments that promote equal and full participation in elections by citizens and contestants alike. These standards are enshrined in international and regional instruments, which reflect pre-existing, globally recognized commitments that pertain to disinformation, including:

  • The rights to hold opinions and to seek and receive information in order to make an informed choice on election day: Everyone has the right to form, hold, and change opinions without interference, which is integral to freely exercising the right to vote.1 Voters also have the right to seek, receive, and impart accurate information that allows them to make informed choices regarding their future, free from intimidation, violence, or manipulation.2 Further, institutions are generally obligated to be transparent regarding electoral information so that voters can be informed and data sources can be held accountable.3 These rights are enshrined for all citizens regardless of race, gender, language, area of origin, political or other opinion, religion, or other status.4 Increasingly, organizations are working to link these standards with principles focused on disinformation and cyberspace. Election-related disinformation efforts subvert these rights because they are designed to overwhelm genuine political debate by intentionally deceiving voters, creating confusion, exacerbating polarization, and undermining public confidence in the electoral process.
  • The right to a level playing field: Universal and equal suffrage includes, in addition to voting rights, the right to seek election to public office without discrimination. Governments' obligations to ensure a level playing field for electoral contestants derive from this norm, on which the UN Human Rights Committee provides guidance in its General Comment 25 to the ICCPR. The norm implies providing security from defamatory attacks and other forms of false information aimed at harming a candidate's or a party's electoral fortunes. The obligations extend to government-controlled media, and the norm applies to professional ethics for journalists and private media.5 Fact-checking, other forms of verification, and traditional and social media monitoring relate to this norm, as well as to voters' rights to receive accurate information upon which to make informed electoral choices. Manipulation of the information environment can undermine equitable competition, particularly for those who are disproportionately impacted by disinformation campaigns, like women and marginalized communities, who already face an uneven playing field.
  • Freedom of expression, the press, and regulation: The aforementioned commitments must be balanced against the freedoms of everyone to hold opinions and to express them, including the need to respect and protect a free press. One aspect of addressing disinformation campaigns is to develop proper legal and regulatory frameworks, including effective sanctions. Gendered, racial, ethnic, religious, and other forms of hate speech and incitement to violence are often diffused throughout disinformation campaigns, affecting candidates and voters alike. Legal regulations in this area, like protection of personal reputation, can be applicable in the disinformation context. However, regulation should not be overemphasized, and care is needed to safeguard freedom of expression while trying to protect the integrity of the information space in elections and beyond. The UN Human Rights Committee provides guidance on this in General Comment 34.

Recognizing these necessary democratic conditions, the existence and impact of disinformation must be considered in any comprehensive assessment of an electoral process. Even if an election is well-organized and transparent, a highly compromised information environment leading up to and on election day can subvert its credibility. Identifying the types, volumes, and patterns of mis- and disinformation that may affect electoral integrity is crucial for mitigating their impact. Political watchdogs should analyze deficiencies in the information environment with an understanding of the social norms and cleavages in the local context when determining the integrity of an election and creating accountability for the universe of stakeholders who engage in or benefit from disinformation tactics.

Traditional electoral safeguards, particularly election observers, are expanding their capacities, activities, relationships, and advocacy efforts to confront disinformation threats to electoral integrity. Debunking fake news through emerging networks of fact-checkers and bolstering media and digital literacy play important roles in building resilience and enhancing the information environment around elections. Those actions, as well as robust efforts to properly inform political debate and provide accurate electoral data, can inoculate against information disorder. All of these efforts can complement each other to safeguard electoral and political processes.

Election monitoring programs broadly serve to promote electoral integrity through enhanced participation, inclusion, transparency, and accountability, thus fostering citizen empowerment and confidence in the democratic process.

Developing the right election observation intervention(s) to respond to disinformation requires first considering the context of each electoral environment.

Design Tip


The nature, vulnerabilities, mitigating factors, and opportunities of the electoral information environment, online and otherwise, vary significantly from country to country, and successful projects have demonstrated the importance of conducting a preliminary assessment to identify these factors before designing a program. Subsequently, monitoring methodologies and approaches should be shaped and driven by objectives and organizational capacity, not by available tools.

Decisions to use technologies and methodologies should be made through an inclusive process, with consideration of the accessibility and technology gaps among different groups of observers and citizens, including along gender, age, geographic, and other lines. In addition, identifying and exposing online barriers for women and marginalized groups in electoral processes necessarily requires an inclusive, gender-sensitive approach and may require observers to incorporate and balance specialized methodologies into their overall effort to create an accurate picture of how the electoral landscape affects specific populations.

There are several options to address the specific threats that disinformation poses to electoral integrity in an individual country context:

  • Citizen election observation to identify and expose disinformation as it relates to electoral integrity, including monitoring online and traditional media around an electoral process
  • International election observation of the electoral information environment, including disinformation, in the short and long term by credible international and regional observation missions and in line with the Declaration of Principles for International Election Observation
  • Advocacy for norms, standards, and policies to address disinformation in elections, including efforts by civil society and/or other groups to advocate for a range of appropriate responses from social media platforms and other private sector actors, legal reforms, policies and resource allocation from governments or legislatures, and support for norms-building and standards from regional and international instruments to combat disinformation during elections
  • Building more effective partnerships between election observers and other key stakeholders, such as civic tech groups, fact-checkers, journalists, media monitors, electoral management bodies, women's rights organizations, and other CSOs that are composed of and represent marginalized groups
  • Knowledge-sharing and developing best practices around combating disinformation in elections through workshops, online exchanges, guidance notes, and other forms of information sharing

These interventions are explored in more detail below and demonstrate how focused electoral observation and analysis can enhance accountability and neutralize disinformation threats. Election monitoring is ideally conducted throughout the pre-election, election day, and post-election periods to evaluate all relevant aspects of the electoral process. Many of the case studies highlighted in this chapter are not standalone projects, but are part of broader election monitoring efforts that include online monitoring as a distinct component.

Election observers frequently adjust their methodologies to meet evolving tactics that undercut credible electoral processes, often in the pre-election period. Citizen election monitors, who are often viewed as trusted, politically impartial voices, are well equipped to investigate, expose, and mitigate the effects of information manipulation around elections. They understand online vernacular and the significance of slang and other terms that are key to identifying disinformation and its connections to hate speech, incitement, and other means of fanning social divisions. That understanding can be helpful to international election observers and foreign researchers.

Highlight


While fact-checking groups and other media integrity initiatives serve critical functions in weeding out false and misleading narratives, social media monitoring by citizen election observers tends to have different goals and timelines. The objective is not to quickly verify and/or invalidate individual stories but rather to identify and evaluate the impact information trends may have on electoral integrity, build accountability around a variety of actors participating in the electoral process, and provide actionable recommendations.

For example, in Georgia, the citizen election observer group International Society for Fair Elections And Democracy (ISFED) developed a multi-pronged approach to identify disinformation tactics designed to influence voters and subvert fact-based discourse ahead of the 2018 presidential election and subsequent run-off. Using an NDI-designed tool (the Fact-a-lyzer) created specifically for citizen observers to monitor platforms like Facebook and Twitter, ISFED monitored a range of electoral integrity issues on social media, including abuse of state resources, campaign finance, the strategic spread of disinformation and divisive narratives, and the use of official and unofficial campaign pages to discredit candidates and, in some cases, civil society organizations. Some of their findings, including clear campaign finance violations, were flagged for government oversight institutions, which subsequently levied fines on the violators. In addition, through social media monitoring ISFED identified a number of suspicious fake media pages that Facebook eventually removed in a high-profile takedown for coordinated inauthentic behavior. The group continued to monitor social media between the 2018 presidential elections and the 2020 parliamentary polls, identifying a series of Kremlin-backed disinformation campaigns.

Problematic pages highlighted by ISFED were all in Georgian, a language not widely spoken outside of the country and even less common among tech platform content moderators. The prevalence of disinformation in local-language content reinforced the importance of citizen monitoring to appreciate linguistic subtext and more easily interpret social media content and behavior within the electoral context. ISFED's effort has been rooted in long-term monitoring with well-trained staff and access to advanced data collection tools like Fact-a-lyzer and Facebook's CrowdTangle, which have improved their capacity to perform more advanced research. Their social media monitoring continues between elections to capture inter-election trends and to identify how narratives developed online well in advance of an election become weaponized for electoral advantage or disadvantage. Such an ambitious approach requires long-term resources and access to bulk public content.
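
The bulk public-content collection that tools like CrowdTangle enabled can be illustrated with a short sketch. This is not ISFED's actual tooling: the token and list ID below are placeholders, the engagement threshold is arbitrary, and the endpoint and field names follow CrowdTangle's published API documentation (the service has since been retired by Meta) and should be verified before any use.

```python
# Minimal sketch: pull recent public posts from a CrowdTangle dashboard list
# and flag unusually high-engagement posts for manual review by monitors.
# Token, list ID, and threshold are hypothetical placeholders.
import requests

API_TOKEN = "YOUR_CROWDTANGLE_TOKEN"  # placeholder credential

def fetch_posts(list_id, start_date, end_date, count=100):
    resp = requests.get(
        "https://api.crowdtangle.com/posts",
        params={
            "token": API_TOKEN,
            "listIds": list_id,
            "startDate": start_date,
            "endDate": end_date,
            "sortBy": "overperforming",  # surface unusually viral content first
            "count": count,
        },
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["result"]["posts"]

posts = fetch_posts("12345", "2020-09-01", "2020-10-31")
for post in posts:
    shares = post.get("statistics", {}).get("actual", {}).get("shareCount", 0)
    if shares > 500:  # arbitrary review threshold for human analysts
        print(post["account"]["name"], shares, post.get("message", "")[:80])
```

Flagged posts would then go to trained analysts for qualitative coding, since engagement metrics alone cannot establish that content is disinformation.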

In Nigeria, election monitors broadened traditional fact-checking efforts to conduct more nuanced research to identify underlying information trends ahead of the 2019 General Elections. NDI partnered with the Centre for Democracy and Development — West Africa (CDD-West Africa), which was already undertaking a robust media literacy and fact-checking campaign to quantitatively analyze the information environment in the weeks leading up to the elections. NDI hired Graphika, a private research firm that conducts data collection and analysis on online platforms such as Facebook and Twitter, to provide much of the research support. Through the combination of Graphika’s analysis and the manual data collection of the fact-checkers, CDD-West Africa was able to highlight the depth and scope of certain narratives around the elections, particularly related to Islamophobia and foreign influence. It also uncovered coordinated fake news networks and signs of inauthentic automated accounts. 

These efforts were complemented by research CDD-West Africa conducted in partnership with the University of Birmingham examining the use of WhatsApp ahead of the elections. CDD-West Africa briefed a number of international election observation missions on their findings, which contributed to election day statements and further analysis. By augmenting their fact-checking efforts with sophisticated data analysis, CDD-West Africa was able to spot broad trends impacting the electoral process while still providing updates on the online environment in real time. 

Penplusbytes, a local NGO in Ghana, developed a Social Media Tracking Center (SMTC) for the 2012 Ghanaian presidential elections and revived it for the 2016 presidential elections to identify electoral malpractices as they occurred and to quickly warn relevant institutions and stakeholders. The Penplusbytes teams used the Aggie social media tracking software, developed by the Georgia Institute of Technology and the United Nations University, to monitor and verify instances of misinformation on Facebook and Twitter. They passed relevant information on to the National Elections Security Task Force (NESTF), which took action based on their findings.

In Colombia, the civic group Electoral Observation Mission (Misión de Observación Electoral or MOE) has been monitoring online aspects of electoral processes since the referendum on the country's peace agreement held in 2016. In many ways, the peace process has helped define Colombian society in recent years, as the country fights to consolidate its progress democratically, reconcile various combatants in the war, integrate rebels back into society, and ultimately avoid regression into the conflict that ravaged it for decades. According to MOE's Director of Communications, Fabian Hernandez: "Just at that moment, MOE made the first analysis of social media. Our focus at that time was to look at how much electoral crimes were talked about online, what were the arguments with which people spoke of a referendum, [and was it to be] an endorsement of peace? We did not foresee, we did not envision that misinformation was going to be such a serious problem. Therefore it was not the object of our study, but we had a tool to give us alerts and others...that the great risk to the referendum was misinformation, how it was circulating through WhatsApp and text messages, and through Instagram, but also Twitter, a lot of false information, misinformation or exaggerated or decontextualized information that ended up being false."6

Subsequently, MOE developed more sophisticated, data-driven social media research plans, linkages to platforms for reporting, and other advanced forms of coordination. During the 2018 presidential election, MOE worked to develop online data collection methods and mechanisms for reporting to the platforms and electoral authorities. As Hernandez noted: "After Brexit, Colombia was a very interesting pilot for the world of how disinformation could change elections. And that made our approach for the study of social media by 2018 characterizing disinformation. That is why we came to the study of who produces misinformation and how misinformation becomes viral."7 With the help of social listening platforms, MOE collected data around keywords from Facebook, Instagram, Twitter, YouTube, blogs, and other media, recording nearly 45 million pieces of content. This content was analyzed with natural language processing software to contribute to a final report covering both rounds of the election, as well as congressional and intra-party discussion rounds.
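
MOE's actual pipeline is not public, but the basic step of keyword-driven volume tracking over collected content can be illustrated. The sketch below is a minimal illustration, not MOE's methodology: the file name, column names, and example Spanish keywords are all hypothetical.

```python
# Illustrative sketch of keyword-volume trending over collected social media
# posts, assuming a hypothetical CSV with 'date' and 'text' columns.
import pandas as pd

KEYWORDS = ["fraude electoral", "voto", "trasteo"]  # example terms only

df = pd.read_csv("collected_posts.csv", parse_dates=["date"])
for kw in KEYWORDS:
    # Boolean column: does this post mention the keyword?
    df[kw] = df["text"].str.contains(kw, case=False, na=False)

# Daily mention counts; sudden spikes flag narratives for qualitative review.
daily = df.set_index("date")[KEYWORDS].resample("D").sum()
print(daily.tail(14))
```

Volume trends of this kind only point analysts toward where to look; characterizing whether a spike reflects disinformation still requires human review of the underlying content.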

Local elections are similarly vulnerable to misinformation and disinformation campaigns, but often receive less scrutiny and attention from international actors, the media, and researchers, further elevating the importance of citizen watchdog organizations. As noted by Hernandez: "In local elections we had the same exercise of looking at social media, and today our analysis focuses on: Disinformation, hate speech, intolerance or aggressiveness; and finally xenophobia, immigration, and Venezuela. From the traditional media it was understood that people with less education were more vulnerable to manipulation, which are barriers placed by the type of education, because of the little education they receive, that is why misinformation was easier.”8 

Citizen observation groups are more likely to capture digital threats at the local level than their international counterparts. They have a stronger understanding of what is said and what is meant on social media and insight into the particular experiences of women, members of other marginalized groups, and other populations online at the local, regional and national level. Integration of these perspectives is essential to informing the monitoring process. Moreover, national organizations can provide ongoing monitoring not only during elections, but also during major legislative votes, national plebiscites like Colombia's over the peace process, and the period between elections when the manipulation of online political narratives tends to take root. Linking traditional observers such as MOE with other kinds of online monitoring organizations, digital rights groups, fact checkers, civil society representing women and marginalized groups, and civic technologists becomes critical to understand the complete picture of a country's social media landscape over time.

International election observation missions are committed to assessing the quality of an electoral process in its entirety, including in the pre-election, election day, and post-election periods. This commitment is rooted in the Declaration of Principles for International Election Observation (Declaration of Principles or DoP). Therefore, consideration of the information environment, including the role of disinformation, hate speech, and other forms of online content where they play a significant role, represents a critical part of any mission assessment. Additionally, according to the DoP, gender considerations must be emphasized not only at the individual mission level but also at the international and normative level. In the context of the information environment, this would include an understanding of the dimensions of Violence Against Women in Politics (VAW-P) and in elections, including their online manifestations such as gendered disinformation. This may involve incorporating analysis and recommendations concerning the information environment into pre-election and election day statements. Missions should strive to expand the pool of key informants and interlocutors from whom long- and short-term observers collect information, such as social media experts, academics, tech industry representatives, women's rights activists, and media monitors, both in-country and from outside. Observation missions may also want to diversify the profiles of pre-election and election day analysts and delegates to include civic technologists, digital communications experts, or others with particular knowledge of gendered digital manipulation techniques. Where needed, missions may seek to influence social media firms if analysis reveals serious challenges to electoral integrity, whether through disinformation, hate speech, or other influences.

In some cases, particularly for missions in countries experiencing acute disinformation campaigns around elections, a core team member or analyst can be designated to concentrate on analyzing the dimensions of disinformation in the electoral context. For instance, the European Union deployed a media and digital communications analyst to cover the online space for the 2019 Nigerian presidential election, and has deployed other media monitors in different contexts globally.

Similarly, for its international election observation missions for Ukraine's 2019 presidential and parliamentary elections, NDI hired a long-term information environment analyst as part of the mission's core team. The mission recognized the role that the information environment, including disinformation in traditional and social media, was likely to play in those high-profile elections. Like the mission's other thematic experts, such as the gender and legal framework analysts, the information environment analyst provided a clear focal point on the issue, ensuring that information disorder was taken into account as an electoral integrity issue across all aspects of the mission.

The long-term analyst (LTA) collected data from key interlocutors and pre-existing data sets and monitored 26 regional and national Telegram channels, which revealed a pattern of disproportionately negative posts regarding the electoral process and the two major presidential candidates. This and other analyses by the LTA contributed substantially to the findings of the observation mission. In particular, they framed the extent to which foreign and domestic online campaigns influenced the electoral process and how political parties, candidates, and less transparent third-party supporting accounts utilized online campaigning to shape the digital landscape. This builds on NDI's experience from its 2017 observation mission in Georgia, during which it deployed a long-term information environment analyst for the first time.
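
The source does not describe the mission's tooling, but posts from public Telegram channels of the kind the LTA monitored can be collected with the open-source Telethon library for later manual tone-coding. The sketch below is a minimal, hypothetical example: the API credentials would come from my.telegram.org, and the channel names are placeholders.

```python
# Illustrative sketch: archive text posts from public Telegram channels to a
# CSV for later manual coding (e.g., tone toward candidates). Not the NDI
# mission's actual tooling; credentials and channel names are placeholders.
import csv
from telethon.sync import TelegramClient

API_ID, API_HASH = 12345, "your_api_hash"  # placeholder credentials
CHANNELS = ["example_channel_one", "example_channel_two"]

with TelegramClient("monitor_session", API_ID, API_HASH) as client:
    with open("telegram_posts.csv", "w", newline="", encoding="utf-8") as f:
        writer = csv.writer(f)
        writer.writerow(["channel", "date", "text"])
        for channel in CHANNELS:
            for msg in client.iter_messages(channel, limit=500):
                if msg.text:  # skip media-only posts
                    writer.writerow([channel, msg.date.isoformat(), msg.text])
```

An analyst would then code the archived posts (for example, as positive, neutral, or negative toward the process and candidates) to surface patterns like the disproportionate negativity the Ukraine mission reported.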

Other international and intergovernmental observer organizations, such as the Carter Center, Democracy Reporting International, the Organization of American States (OAS), and the OSCE/Office for Democratic Institutions and Human Rights (ODIHR), have also been integrating social media monitoring into their broader observation missions over the last several years, and the international observation community continues to work together to strengthen capacity and harmonize norms in this area. In some cases, they collaborate with civil society organizations such as Slovakia's Memo 98, which has developed media monitoring programs since the 1990s. The linkage between traditional media monitoring and social media monitoring is important to note, and Memo 98, like many organizations, has shifted its focus from traditional media to social networking platforms in the last five years. Since its initial forays examining the online reach of Russian outlets such as RT and Sputnik in 2015, Memo 98 has broadened its social media focus, supporting the European Union, OSCE, and other election observation missions in Europe and elsewhere.

Memo 98 media monitoring activists deployed online analysis through OSCE monitoring missions in Georgia in 2017 and for the European Parliament elections. In the latter case, they worked to determine the extent to which messages on Facebook impacted the issues presented by political parties during the election. Memo 98 found that the parties did not attack each other significantly in their posts but rather unified against extremism. Memo 98 also monitored the Belarus 2020 election in collaboration with the Belarusian NGOs Linked Media and the EAST Center. They developed reporting focused on social media and contrasted how the country's skewed national traditional media gave President Lukashenko 97 percent of coverage while opposition candidates were able to post and garner some attention on social media such as Facebook.

As with other groups in Eastern Europe, Memo 98 is uniquely positioned to understand the potential of foreign influence operations, particularly emanating from Russia. As its director, Rasto Kuzel, notes:

"Obviously we could not ignore [the online space] any more after 2016. And that's why we started working on some kind of methodological approach. We saw that...understanding the basics of content analysis, understanding what the data shows us, understanding the larger picture, you show some of these infinitives but do we get a sense? Like how big a problem this is in the whole election environment. I mean, what is the real impact of social media in a particular country? And how does it correlate with traditional media and so on and so forth."9

Weighing the impact of social media against that of traditional media is a challenge in understanding online conversations, where traditional media also plays a role. While traditional media monitoring is limited to officially licensed media, television, radio, and print, social media is difficult, if not impossible, to observe completely. Yet, as Kuzel notes, it is important to include it in any observation, and groups are working collectively to develop new methodologies for the online environment. With the Council of Europe, Kuzel has recently published a guide on media monitoring in elections that includes a section on social media methods based on his experience. New tools like CrowdTangle, a social media research application owned by Facebook that collects publicly available data about groups and pages on Facebook, Instagram, Twitter, and Reddit, form a critical component. As Kuzel notes: "With CrowdTangle for Facebook and Instagram we can get the historical data, which makes a big difference. We feel more comfortable when we can analyze bigger periods and more data and that was not always the case."10 Tools such as CrowdTangle increase the field of view for observers but hide comments and other private information about users. Observers and other researchers should be aware of any tool's blind spots (e.g., private Facebook groups that the tool does not cover).

Electoral monitoring and electoral reform initiatives present a number of opportunities for advocacy at the local, national, and international level. Election observers are in a strong position to provide clear and actionable recommendations through observation statements as well as long-term electoral reform projects to enhance transparency and promote a healthy electoral information environment. Election observation statements by international election observers can draw international attention to particular challenges, and recommendations within those statements often serve as benchmarks for democratic actors to pursue advances and create accountability for their relevant targets. For instance, international election observation missions in Ukraine noted ongoing shortcomings of the tech industry in online political advertising transparency and limitations in their ability to manage electoral disinformation at the local level. International organizations that observe elections also can draw attention to normative issues to be addressed by technology companies and can help gain the attention of intergovernmental organizations and other sectors concerning those issues. 

Meanwhile, citizen election observers already play effective roles in highlighting deficiencies in regulations and enforcement in their own countries and advocating for reforms. Amidst ongoing attempts by political groups and foreign actors to undermine the election environment in Georgia, ISFED coordinated with 48 other leading Georgian civil society and media organizations to successfully pressure Facebook to increase transparency and accountability measures ahead of the 2020 parliamentary elections. Citizen observer groups in Sri Lanka worked together to pressure the government to provide stronger campaign finance oversight mechanisms for political ads online.

Supporting electoral reform efforts and dialogue between election management bodies (EMBs) and observers to expand the availability of election information and encourage transparency of political data, such as voting results (from the polling station to the national level), voter registries and related population numbers, procurement processes, complaints adjudication, and political advertisements on social media, can be central to addressing misinformation and disinformation. Transparent, accessible data can inoculate EMBs against conspiracy theories and misinformation while increasing citizens' ability to fact-check information they may receive from third parties. Constructive engagement on this front can help build public confidence in otherwise vulnerable electoral institutions and encourage EMBs to develop their own strategies for mitigating and responding to disinformation attempts that seek to undermine their credibility.

Disinformation can manifest in complex ways and may require a range of actors to address. Observer groups that lack the time, resources, or skills to launch their own social media monitoring efforts may also collaborate, formally or informally, with media monitoring groups, academics, tech advocates, journalists' associations, women's rights organizations, organizations that are composed of and represent marginalized groups, conflict prevention organizations, or other actors that may already be examining disinformation issues. Such partnerships can ensure that election observers give due consideration to the quality of the electoral information space in their overall electoral analysis without conducting direct data collection themselves.

Observers may also consider partnerships with nontraditional monitoring groups, such as fact-checkers and other research organizations with experience in social media and broader online monitoring. A report coauthored by the Open Society European Policy Institute and Democracy Reporting International highlights how groups ranging from academic projects (e.g., the Oxford Internet Institute's Computational Propaganda Project and the Brazilian Getúlio Vargas Foundation's Digital Democracy Room) and think tanks (e.g., the Atlantic Council's Digital Forensic Research Lab) to fact-checking organizations (e.g., debunk.eu) and the private sector (e.g., Bakamo.Social) have all contributed to election monitoring in various forms.11 Multi-stakeholder collaboration forms one potential basis for developing the next generation of election observation and monitoring, allowing election observers to incorporate the findings of credible partners into electoral assessments rather than duplicate their work, thus expanding the potential leverage for advocacy around norms and standards. This may be a particularly useful approach for international election observers, who are outsiders by definition and who conduct analysis over a relatively short timeframe.

Design Tip


Groups can also leverage partnerships to convene multi-stakeholder roundtables about countering disinformation, or to expand the agenda of pre-existing fora for sharing information around elections to also discuss how parties, media, election management bodies (EMBs), observers, and others can help one another spread that information to the broader public and create accountability for maintaining information integrity.

Relationships with credible EMBs are particularly important both for addressing disinformation through voter education and for encouraging EMBs to enhance their abilities to rapidly respond to electoral disinformation. For example, NDI co-hosted an event with Mexico's election commission (INE) that focused on responding to disinformation threats in Mexico's July 1, 2018 elections. This brought together a diverse mix of electoral stakeholders, including representatives from major tech platforms, academics, election monitors, and other civic activists in addition to election administrators. To facilitate further collaboration between electoral stakeholders, NDI organized workshops and regular coordination meetings among civic tech groups, fact-checkers, and citizen election observer groups to collaborate on combating electoral disinformation. This approach was particularly helpful in merging Mexico's civic tech expertise with the electoral analysis lens that observer groups could provide.

Following a similar model, the Taiwan Foundation for Democracy (TFD), under the Global Cooperation and Training Framework (GCTF) mechanism, organized a conference in September 2019 entitled "Defending Democracy Through Promoting Media Literacy II."12 Its purpose was to examine the different ways disinformation influences elections around the world, the implementation of media literacy education in curricula, how government and civil society initiatives have evolved to combat disinformation, and the challenges they face. The conference recognized that Taiwan and other countries in the Asia-Pacific region faced presidential and general elections in 2020.

NDI partnered with the TFD for the GCTF event and identified an opportunity to combine the conference with a more hands-on training event for civic groups from across the region. Following the GCTF, NDI organized and led a one-day workshop, "Defending Electoral Integrity Against Disinformation," attended by 13 NDI-funded civil society participants, mostly representing citizen election observer groups and fact-checking organizations in Asian countries with upcoming elections, as well as guests from the Taiwanese civic tech movement. Building on the information presented during the GCTF, the workshop explored social media monitoring in greater depth. The workshop shared strategies and tools for assessing information environments, navigating social media platforms, collecting and analyzing social media data, developing approaches for countering anti-democratic speech, and holding various stakeholders accountable. This workshop provided citizen observer groups and fact-checkers from the same country the opportunity to work together on mutual support, advocacy, and coordination approaches leading up to their respective elections.

Related efforts have been designed to build consensus among a broader universe of actors for international observers. For instance, the Carter Center has developed a partnership with the grassroots journalism organization Hacks/Hackers to conduct a series of workshops among international election observer groups, other electoral assistance practitioners, international fact-checking networks, academics, and technologists to strengthen interventions and best practices for verifying elections in the face of misinformation and disinformation on social media.

In addition to building new partnerships to confront the challenge of disinformation in elections, pre-existing election networks, such as the Global Network for Domestic Election Monitors (GNDEM) or the Declaration of Principles for International Election Observation community, can elevate the issue of disinformation, build consensus around defining the challenges that it poses to electoral integrity, and develop best practices to counter it. 

As more election monitoring organizations begin to incorporate disinformation monitoring into their broader observation efforts, there are abundant opportunities for peer-to-peer learning and improvement of monitoring methodologies. In September 2019 in Belgrade, Serbia, NDI conducted an intensive academy for citizen observers from 20 different organizations around the world on detecting, exposing, and countering malign disinformation. Participants in the academy learned how disinformation affects electoral integrity, undermines democratic principles, and weakens citizens' trust in elections. The participants shared strategies and methods to monitor disinformation in their own contexts. They walked through exercises on assessing information environments in their countries and practiced using various tools for tracking and analyzing disinformation online. The academy structure encouraged participants to share their organizations' experiences and highlighted lessons learned from working with various social media monitoring tools. For example, ISFED and CDD-West Africa facilitated discussions and presented on the methods and tools their organizations used to monitor disinformation in their respective contexts.

Participants also explored methods for advocating for greater transparency in online platforms and elevating fact-based political discourse. This included working together to identify ways to hold institutions accountable, build advocacy networks, and create effective messaging to thwart toxic narratives, rooted in each group's local experience. 

Knowledge-sharing initiatives have resulted in concrete guidance documents and resources. Over a series of meetings and drafting consultations in the spring of 2019, a small working group representing a mix of international election observers, including NDI, citizen election monitors, academics, fact-checking groups, and civic technologists developed a guide for social media monitoring by civil society, spearheaded by Democracy Reporting International (DRI). The guide includes sections on methodology, legal considerations, and tools for social media monitoring in elections by civic groups, working toward collective standards and best practices for the field.

Similar efforts are underway in the international election observation community as part of the continued implementation of the Declaration of Principles. A working group under the DoP is currently building consensus around a framework to observe and assess online campaigns, with recommendations grounded in international standards and best practices. As mentioned in the previous section on international election observation, many participating organizations have already begun incorporating this work into their observation missions. The working group presents a chance to identify a set of approaches, rooted in international standards (freedom of expression, transparency, right to political participation, right to privacy, equality and freedom from discrimination, effective remedy) and the respective mandates of endorsers of the DoP, to assess online campaigns and to seek agreement on a common set of guidelines for the observation of online campaigns by international election observation missions. These guiding principles will be reviewed and endorsed at the DoP annual implementation meeting in Brussels in Spring 2021.

As technology advances, digital disinformation efforts and computational propaganda present new and unique challenges to election observation. Identifying the networks and connections behind the creation, spread, and amplification of disinformation and hate speech in elections is particularly challenging. Online sources lack transparency, with content often spread via fake media houses, phony websites, or social media accounts animated by "farms" of hired users and boosted by automated "bot" accounts.

This is compounded by the fact that the popularity of certain social media platforms and messaging applications varies dramatically by country and platform, as does access to underlying data, while disinformation techniques and content are constantly evolving. The growing popularity of closed messaging services presents serious ethical considerations for election observers monitoring their influence. Moreover, the attention, on-the-ground engagement, and implementation of new transparency and content moderation measures provided by online platforms remain inconsistent across national lines. Therefore, monitoring tools and methodologies that may be effective in one context may be irrelevant in another.

The incorporation of social media and other forms of online observation into electoral assessments is in an experimental phase, and monitors are still confronting nascent challenges and identifying lessons learned. These include new technical and political factors that can complicate observations, which may require flexible methodologies to build a more inclusive and comprehensive election assessment. 

Observing Closed Messaging Services

In many countries, campaigning, voter education, and general political discourse around elections are moving to closed messaging services like WhatsApp or Telegram. These networks create serious challenges in terms of what is acceptable to monitor and how to monitor it. Even private channels on public networks (such as closed Facebook Groups) create serious ethical considerations for any potential study of disinformation. Researchers can consider declaring that they are joining closed groups, an approach the research group at CDD-West Africa followed in its study. This has the potential, however, to change the nature of conversation within those groups. Another solution is to invite users already in closed groups to submit examples of problematic content, though this approach introduces selection bias and provides an extremely limited view of the closed portion of the online environment. Some civic activists (using tactics that CEPPS does not endorse) have exposed insidious closed syndicates such as hate groups through impersonation or fabricated accounts. This approach violates the platforms' terms of service and presents serious ethical questions for researchers. Observers must wrestle with these issues to identify an appropriate way to monitor closed platforms, in addition to other methodological challenges, particularly as observers play a different role than traditional academic researchers. Michael Baldassaro, the Digital Threats Lead for the Carter Center, notes: "We do need some consideration that takes into account the law, and ethical considerations that are different from what academic standards might be. I'm not comfortable with going into a WhatsApp group and saying I'm here as a researcher. So, we need to develop modalities for what is appropriate to monitor...and how do we do that?"13

Exposing Online Barriers for Women and Marginalized Groups in the Electoral Process

Information disorder often disproportionately impacts women and marginalized populations as both contestants and voters, further disadvantaging female candidates and fomenting unsafe online spaces where women and marginalized groups are dissuaded from participating in, or altogether forced out of, the political discourse. Additionally, many content moderation systems, whether driven by machine learning and artificial intelligence or by direct oversight from human actors, are gender-blind and poorly versed in the local context, including the patterns and dimensions of socio-cultural norms and the vulnerabilities of marginalized populations.

However, the domestic and international organizations surveyed in this research noted that while this was an area of concern, it was not one to which they generally dedicated specific resources. In some cases, the methods, units of analysis, and tools for monitoring hate speech or violence against women online may differ from the broader social media monitoring methodology. For instance, hate speech monitoring may be driven by lexicons of dangerous language, as explored in the methodology developed by NDI and its partners and set forth in Tweets that Chill: Examining Online Violence Against Women in Politics, an approach that relies on examining keywords and content (see the sketch below). Election monitors may need to balance multiple approaches to derive a real picture of the electoral information landscape and how it affects particular groups. Observer groups should hire gender experts to examine these issues to better understand how existing gender norms function in the local disinformation context, as well as coordinate with groups focused on the impact of disinformation on women and marginalized groups in elections and other critical political contexts. International and domestic observer groups should also review their own implicit biases and cultures of masculinity that can hinder inclusive election observation, particularly as the online space presents new threats to women and marginalized individuals and can reinforce regressive norms.
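
To make the lexicon-driven approach concrete, the sketch below shows a minimal keyword-matching step. It is illustrative only: the categories and terms are hypothetical placeholders, not the actual lexicon from Tweets that Chill, and real lexicons are developed with local-language experts.

```python
# Minimal sketch of lexicon-driven flagging of posts, loosely modeled on the
# keyword-based approach described in "Tweets that Chill". Categories and
# terms below are hypothetical placeholders, not NDI's actual lexicon.
import re

LEXICON = {
    "gendered_insult": ["example_slur_a", "example_slur_b"],
    "threat": ["example_threat_phrase"],
}

def flag_post(text):
    """Return the lexicon categories whose terms appear in the post."""
    lowered = text.lower()
    hits = []
    for category, terms in LEXICON.items():
        # Whole-word matching to reduce false positives on substrings.
        if any(re.search(r"\b" + re.escape(t) + r"\b", lowered) for t in terms):
            hits.append(category)
    return hits

sample = "An example post containing example_slur_a aimed at a candidate."
print(flag_post(sample))  # -> ['gendered_insult']
```

Flagged posts are a starting point, not a verdict: human coders fluent in the local language and context would still review matches, since lexicon methods miss coded language and over-flag reclaimed or benign uses of terms.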

Hernandez, the Communications Director of MOE, noted that past missions had not focused on this in any systematic way but that MOE was interested in developing this capacity in the future, and pointed to groups such as Chicas Poderosas that have successfully integrated monitoring for hate speech into recent elections in Brazil, Colombia, and Mexico. In Colombia, Chicas Poderosas developed workshops to train local researchers and activists to track hateful political speech in closed messaging groups ahead of the 2018 presidential elections.14 Methodologies such as these, which study the content, networks, and impact of disinformation and hate speech targeting women and marginalized groups, should be more broadly and systematically integrated into election monitoring projects going forward.

Navigating Interventions by Social Media Platforms

Social media and other technology companies are increasingly responding to the threats that occur on their platforms. In some cases, this has meant providing more transparency about political advertisements on their platforms, more information about group moderators or pages, enhanced responsiveness to flagged content, and specific policies for managing content that can undermine electoral integrity. However, how and where these initiatives are applied varies drastically from country to country and often lacks the level of granularity necessary for robust analysis. In addition, many platforms lack representatives and content moderation in smaller countries outside of their major markets. It can be a challenge for observers to gain information about whether, when, and how platforms will respond to any single election, which hampers their ability to develop cogent observation strategies involving those platforms. Monitoring groups should advocate for enhanced transparency from platforms and work to maintain open lines of communication with these companies, particularly around elections, to enhance corporate accountability and responsibility for safeguarding the online election environment.

Developing Appropriate and Context-Specific Methodologies

Variations in how and where citizens consume election information, and the dynamic nature of digital threats around elections, mean there is no "one size fits all" monitoring methodology. Domestic and international groups should consider innovative ways of partnering with each other, as well as with fact-checkers and advocates for the political inclusion of marginalized populations, in order to gain greater insight into local contexts. Social media monitoring may feel overwhelming in scale and scope for election observer groups, with almost limitless numbers of pages, profiles, channels, and volumes of data to potentially collect and analyze. To manage this scope, observers should develop objectives that are clear, realistic, narrow in scope, and derived from a preliminary assessment of the information environment. Subsequent methodologies should seek to achieve these objectives. Only after discrete areas for observation are clarified should groups begin to identify tools that fit the needs of the project and the organization's technical and human resources. In addition, groups should be transparent about the limits of their data and thoughtful when drawing conclusions.

Observers must consider a range of potential approaches to understand the online election environment. The information age presents new opportunities for research into how conversations flow online, as well as new challenges to electoral integrity, as trends in discourse are hidden from view in ways that were not possible when the majority of conversations were carried out in traditional media. This is a dynamic and important time for the field to consider the implications of its work in the online space, including the examples and practices analyzed and presented here. Continuous discussion and knowledge exchange, online and off, will form a key element of countering disinformation through election monitoring. The ability to engage with non-traditional partners such as tech platforms, fact-checkers, and others in elections is also crucial. With the considerations reviewed here, observers will be better prepared to address the online environment and integrate it into their planning and recommendations for elections going forward.