Election observers frequently adjust their methodologies to meet evolving tactics that undercut credible electoral processes, often in the pre-election period. Citizen election monitors, who are often viewed as trusted, politically impartial voices, are well-equipped to investigate, expose, and mitigate the effects of information manipulation around elections. They understand online vernacular and the significance of slang and other terms that are key to identifying disinformation and its connections to hate speech, incitement, and other means of fanning social divisions. That understanding can also be valuable to international election observers and foreign researchers.
For example, in Georgia, the citizen election observer group International Society for Fair Elections And Democracy (ISFED) developed a multi-pronged approach to identify disinformation tactics designed to influence voters and subvert fact-based discourse ahead of the 2018 presidential election and subsequent run-off. Using an NDI-designed tool (the Fact-a-lyzer) created specifically for citizen observers to monitor platforms like Facebook and Twitter, ISFED tracked a range of electoral integrity issues on social media, including abuse of state resources, campaign finance violations, the strategic spread of disinformation and divisive narratives, and the use of official and unofficial campaign pages to discredit candidates and, in some cases, civil society organizations. Some of their findings, including clear campaign finance violations, were flagged for government oversight institutions, which subsequently levied fines on the violators. In addition, through social media monitoring ISFED identified a number of suspicious fake media pages that Facebook eventually removed in a high-profile takedown for coordinated inauthentic behavior. The group continued to monitor social media between the 2018 presidential elections and the 2020 parliamentary polls, identifying a series of Kremlin-backed disinformation campaigns.
Problematic pages highlighted by ISFED were all in Georgian, a language not widely spoken outside the country and even less common among tech platform content moderators. The prevalence of disinformation in local-language content reinforced the importance of citizen monitoring, which can appreciate linguistic subtext and more easily interpret social media content and behavior within the electoral context. ISFED's effort has been rooted in long-term monitoring by well-trained staff with access to advanced data collection tools like Fact-a-lyzer and Facebook's CrowdTangle, which have strengthened the group's capacity to perform more advanced research. Their social media monitoring continues between elections, capturing inter-election trends and identifying how narratives developed online well in advance of an election become weaponized for electoral advantage or disadvantage. Such an ambitious approach requires long-term resources and access to bulk public content.
In Nigeria, election monitors broadened traditional fact-checking efforts to conduct more nuanced research to identify underlying information trends ahead of the 2019 General Elections. NDI partnered with the Centre for Democracy and Development — West Africa (CDD-West Africa), which was already undertaking a robust media literacy and fact-checking campaign to quantitatively analyze the information environment in the weeks leading up to the elections. NDI hired Graphika, a private research firm that conducts data collection and analysis on online platforms such as Facebook and Twitter, to provide much of the research support. Through the combination of Graphika’s analysis and the manual data collection of the fact-checkers, CDD-West Africa was able to highlight the depth and scope of certain narratives around the elections, particularly related to Islamophobia and foreign influence. It also uncovered coordinated fake news networks and signs of inauthentic automated accounts.
These efforts were complemented by research CDD-West Africa conducted in partnership with the University of Birmingham examining the use of WhatsApp ahead of the elections. CDD-West Africa briefed a number of international election observation missions on their findings, which contributed to election day statements and further analysis. By augmenting their fact-checking efforts with sophisticated data analysis, CDD-West Africa was able to spot broad trends impacting the electoral process while still providing updates on the online environment in real time.
Penplusbytes, a local NGO in Ghana, developed a Social Media Tracking Center (SMTC) for the 2012 Ghanaian presidential elections and revived it for the 2016 presidential elections to identify electoral malpractices as they occurred and quickly alert relevant institutions and stakeholders. The Penplusbytes teams used the Aggie social media tracking software, developed by the Georgia Institute of Technology and the United Nations University, to monitor and verify instances of misinformation on Facebook and Twitter. They passed relevant information on to the National Elections Security Task Force (NESTF), which took action based on their findings.
In Colombia, the civic group Electoral Observation Mission (Misión de Observación Electoral or MOE) has been monitoring online aspects of electoral processes since the referendum on the country's peace agreement held in 2016. In many ways, the peace process has helped define Colombian society in recent years, as the country fights to consolidate its progress democratically, reconcile various combatants in the war, integrate rebels back into society, and ultimately avoid regression into the conflict that ravaged it for decades. According to MOE's Director of Communications, Fabian Hernandez: "Just at that moment, MOE made the first analysis of social media. Our focus at that time was to look at how much electoral crimes were talked about online, what were the arguments with which people spoke of a referendum, [and was it to be] an endorsement of peace? We did not foresee, we did not envision that misinformation was going to be such a serious problem. Therefore it was not the object of our study, but we had a tool to give us alerts and others...that the great risk to the referendum was misinformation, how it was circulating through WhatsApp and text messages, and through Instagram, but also Twitter, a lot of false information, misinformation or exaggerated or decontextualized information that ended up being false.”6
Subsequently, MOE developed more sophisticated, data-driven social media research plans, linkages to platforms for reporting, and other advanced forms of coordination. During the 2018 presidential election, MOE worked to develop online data collection methods and mechanisms for reporting to the platforms and electoral authorities. As Hernandez noted: "After Brexit, Colombia was a very interesting pilot for the world of how disinformation could change elections. And that made our approach for the study of social media by 2018 characterizing disinformation. That is why we came to the study of who produces misinformation and how misinformation becomes viral.”7 With the help of social listening platforms, MOE collected data around keywords from Facebook, Instagram, Twitter, YouTube, blogs, and other media, recording nearly 45 million pieces of content. This content was analyzed with natural language processing software to contribute to a final report covering both rounds of the election, as well as congressional and intra-party discussion rounds.
Local elections are similarly vulnerable to misinformation and disinformation campaigns, but often receive less scrutiny and attention from international actors, the media, and researchers, further elevating the importance of citizen watchdog organizations. As noted by Hernandez: "In local elections we had the same exercise of looking at social media, and today our analysis focuses on: Disinformation, hate speech, intolerance or aggressiveness; and finally xenophobia, immigration, and Venezuela. From the traditional media it was understood that people with less education were more vulnerable to manipulation, which are barriers placed by the type of education, because of the little education they receive, that is why misinformation was easier.”8
Citizen observation groups are more likely than their international counterparts to capture digital threats at the local level. They have a stronger understanding of what is said and what is meant on social media, and insight into the particular experiences of women, members of marginalized groups, and other populations online at the local, regional, and national levels. Integrating these perspectives is essential to informing the monitoring process. Moreover, national organizations can provide ongoing monitoring not only during elections, but also during major legislative votes, national plebiscites like Colombia's over the peace process, and the period between elections when the manipulation of online political narratives tends to take root. Linking traditional observers such as MOE with other kinds of online monitoring organizations, digital rights groups, fact-checkers, civil society representing women and marginalized groups, and civic technologists becomes critical to understanding the complete picture of a country's social media landscape over time.
6. Interview by Daniel Arnaudo (National Democratic Institute) with Fabian Hernandez, Misión de Observación Electoral (MOE), February 17, 2020.
7. Interview by Daniel Arnaudo (National Democratic Institute) with Fabian Hernandez, Misión de Observación Electoral (MOE), February 17, 2020.
8. Interview by Daniel Arnaudo (National Democratic Institute) with Fabian Hernandez, Misión de Observación Electoral (MOE), February 17, 2020.