7. Challenges and ongoing considerations for monitoring digital threats in elections

Updated on: Apr 19, 2021

Unfortunately, technological advances in digital disinformation and computational propaganda present new and unique challenges to election observation. Identifying the networks and connections behind the creation, spread, and amplification of disinformation and hate speech in elections is particularly challenging. Online sources lack transparency, with content often spread via fake media houses, phony websites, or social media accounts animated by “farms” of hired users and boosted by automated "bot" accounts.

This is compounded by the fact that the popularity of particular social media platforms and messaging applications varies dramatically by country, as does access to underlying data, while disinformation techniques and content are constantly evolving. The growing popularity of closed messaging services presents serious ethical considerations for election observers monitoring their influence. Moreover, the attention, on-the-ground engagement, and implementation of new transparency and content moderation measures provided by online platforms remain inconsistent across national lines. Monitoring tools and methodologies that are effective in one context may therefore be irrelevant in another.

The incorporation of social media and other forms of online observation into electoral assessments is still in an experimental phase, and monitors are confronting nascent challenges and identifying lessons learned. These include new technical and political factors that can complicate observation and that may require flexible methodologies to build a more inclusive and comprehensive election assessment.

Observing Closed Messaging Services

In many countries, campaigning, voter education, and general political discourse around elections are moving to closed messaging services like WhatsApp or Telegram. These networks create serious challenges in terms of what is acceptable to monitor and how to monitor it. Even private channels on public networks (such as closed Facebook Groups) create serious ethical considerations for any potential study of disinformation. Researchers can consider declaring that they are joining closed groups, an approach the research group at CDD-West Africa followed in its study. This has the potential, however, to change the nature of conversation within those groups. Another solution is to invite users already in closed groups to submit examples of problematic content, though this approach introduces selection bias and provides an extremely limited view of the closed portion of the online environment. Some civic activists (using tactics that CEPPS does not endorse) have exposed insidious closed syndicates such as hate groups through impersonation or fabricated accounts. This approach violates the terms of service of the platforms and presents serious ethical questions for researchers. Observers must wrestle with these issues to identify an appropriate way to monitor closed platforms, in addition to other methodological challenges, particularly as observers play a different role than traditional academic researchers. Michael Baldassaro, the Digital Threats Lead for the Carter Center, notes: "We do need some consideration that takes into account the law, and ethical considerations that are different from what academic standards might be. I'm not comfortable with going into a WhatsApp group and saying I'm here as a researcher. So, we need to develop modalities for what is appropriate to monitor...and how do we do that?”13

Exposing Online Barriers to the Electoral Process for Women and Marginalized Groups

Information disorder often disproportionately impacts women and marginalized populations as both contestants and voters, further disadvantaging female candidates and fomenting unsafe online spaces where women and marginalized groups are dissuaded from participating in, or are altogether forced out of, the political discourse. Additionally, many content moderation systems, whether driven by machine learning and artificial intelligence or by direct human oversight, are gender-blind and poorly versed in the local context, including the patterns and dimensions of socio-cultural norms and the vulnerabilities of marginalized populations.

However, domestic and international organizations surveyed in this research noted that while this was an area of concern, it was not one to which they generally dedicated specific resources. In some cases, the methods, units of analysis, and tools for monitoring hate speech or violence against women online may differ from the broader social media monitoring methodology. For instance, hate speech monitoring may be driven by lexicons of dangerous language, as explored in the methodology developed by NDI and its partners and set forth in Tweets that Chill: Examining Online Violence Against Women in Politics, which relies on examining key words and content. Election monitors may need to balance multiple approaches to derive a real picture of the electoral information landscape and how it affects particular groups. Observer groups should hire gender experts to examine these issues and better understand how existing gender norms function in the local disinformation context, as well as coordinate with groups focused on the impact of disinformation on women and marginalized groups in elections and other critical political contexts. International and domestic observer groups should also review their own implicit bias and cultures of masculinity that can hinder inclusive election observation, particularly as the online space presents new threats to women and marginalized individuals and can reinforce regressive norms.
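The lexicon-based approach described above can be illustrated with a minimal sketch. This is not the NDI methodology itself, only a simplified Python illustration of the general technique: the lexicon terms below are hypothetical placeholders, and real lexicons are built with local partners to capture language- and context-specific slurs and coded phrases. The sketch assumes post texts have already been collected in compliance with platform terms and ethical guidelines.

```python
import re
from collections import Counter

# Hypothetical placeholder lexicon; a real lexicon of dangerous or abusive
# language would be developed with local gender and language experts.
LEXICON = {"harpy", "shrill", "go back to the kitchen"}

def compile_lexicon(terms):
    """Build one case-insensitive regex, with word boundaries, for all terms."""
    escaped = (re.escape(t) for t in sorted(terms, key=len, reverse=True))
    return re.compile(r"\b(" + "|".join(escaped) + r")\b", re.IGNORECASE)

def flag_posts(posts, pattern):
    """Return posts matching lexicon terms, plus term frequency counts."""
    flagged, counts = [], Counter()
    for post in posts:
        hits = [match.lower() for match in pattern.findall(post)]
        if hits:
            flagged.append((post, hits))
            counts.update(hits)
    return flagged, counts

pattern = compile_lexicon(LEXICON)
flagged, counts = flag_posts(
    ["She is so shrill!", "Great rally today."], pattern
)
# flagged now holds the first post with its matched term "shrill";
# counts tallies how often each lexicon term appeared across posts.
```

Keyword matching of this kind is only a first pass: it surfaces candidate content at scale but misses sarcasm, misspellings, and coded language, which is one reason the text above stresses balancing multiple approaches and human review.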

Hernandez, the Communications Director of MOE, noted that past missions had not focused on this in any systematic way but that the organization was interested in developing this capacity in the future, pointing to groups such as Chicas Poderosas that have successfully integrated monitoring for hate speech in recent elections in Brazil, Colombia, and Mexico. In Colombia, Chicas Poderosas developed workshops to train local researchers and activists to track hateful political speech in closed messaging groups ahead of the 2018 presidential elections.14 Methodologies such as these for studying the content, networks, and impact of disinformation and hate speech targeting women and marginalized groups should be more broadly and systematically integrated into election monitoring projects going forward.

Navigating Interventions by Social Media Platforms

Social media and other technology companies are increasingly responding to the threats that occur on their platforms. In some cases, this has meant providing more transparency about political advertisements, more information about group moderators or pages, enhanced responsiveness to flagged content, and specific policies for managing content that can undermine electoral integrity. However, how and where these initiatives are applied varies drastically from country to country and lacks the level of granularity necessary for robust analysis. In addition, many platforms lack representatives and content moderation in smaller contexts and in countries outside of their major markets. It can be a challenge for observers to learn whether, when, and how platforms will respond to any single election, which hampers their ability to develop cogent observation strategies that involve those platforms. Monitoring groups should advocate for enhanced transparency from platforms and work to maintain open lines of communication with these companies, particularly around elections, to enhance corporate accountability and responsibility for safeguarding the online election environment.

Developing Appropriate and Context-Specific Methodologies

Variations in how and where citizens consume election information and the dynamic nature of digital threats around elections mean there is no “one size fits all” monitoring methodology. Domestic and international groups should consider innovative ways of partnering with each other, as well as with fact checkers and advocates for the political inclusion of marginalized populations, in order to gain greater insight into local contexts. Social media monitoring may feel overwhelming in scale and scope for election observer groups, with almost limitless numbers of pages, profiles, channels, and volumes of data to potentially collect and analyze. To manage this, observers should develop objectives that are clear, realistic, narrow in scope, and derived from a preliminary assessment of the information environment; subsequent methodologies should seek to achieve those objectives. Only after discrete areas for observation are clarified should groups begin to identify tools that fit the needs of the project and the organization’s technical and human resources. In addition, groups should be transparent about the limits of their data and be thoughtful when drawing conclusions.

Observers must consider a range of potential approaches to understand the online election environment. The information age presents new opportunities for research into how conversations flow online, as well as new challenges to electoral integrity, as trends in discourse are hidden from view in ways that were not possible when the majority of conversations were carried out in traditional media. This is a dynamic and important time for the field to consider the implications of its work in the online space, including the examples and practices analyzed and presented here. Continuous discussion and knowledge exchanges, online and off, will form a key element of countering disinformation through election monitoring, as will the ability to engage with non-traditional partners such as tech platforms and fact checkers. With the considerations reviewed here, observers will be better prepared to address the online environment and integrate it into their planning and recommendations for elections going forward.

Footnotes

13. Interview by Daniel Arnaudo (National Democratic Institute) with Michael Baldassaro, Carter Center, August 28, 2020.

14. Interview by Daniel Arnaudo (National Democratic Institute) with Fabian Hernandez, Misión de Observación Electoral (MOE), February 17, 2020.