Complete Document - EMB Approaches

Written by Lisa Reppell, Global Social Media and Disinformation Specialist at the International Foundation for Electoral Systems Center for Applied Research and Learning 

Digital disinformation is a real and immediate threat for election management bodies (EMBs) around the world. However, election authorities in different countries vary in the degree to which they embrace the expectation that they have a substantive role to play in countering disinformation related to electoral processes. Some EMBs have sophisticated social media monitoring capabilities and dedicated teams to track and respond to disinformation; others do not have any social media presence at all. For all of them, disinformation is an unwieldy threat that is being brought to their door, while the immense, primary task of the EMB – administering credible elections – continues to be just as complex an endeavor as ever.

An EMB’s resistance to taking up a role in countering disinformation may be based on an assumption that any response would require the institution to invest in a wholly new technical approach that pushes them beyond their legal, budgetary or human capacity. Though technology and social media have heightened the urgency and awareness of disinformation as a challenge to democratic processes and institutions, it is important to recognize that responses do not necessarily have to be technological in nature. In addition to technology-forward responses that some EMBs may be equipped to adopt, there are also a range of responses that EMBs can take that build on existing core functions of public relations, communication and voter education. Finding an alternate way to frame an EMB’s counter-disinformation efforts, such as investment in election authorities’ crisis and strategic communication capacity, may also be a way to gain institutional support for new initiatives.

An EMB’s specific role in contributing to the integrity of the information space around elections will vary based on its institutional mandate, resources, and capacity. Nonetheless, EMBs around the world are independently developing responses to counter disinformation in the electoral process and sharing lessons learned with peers. This section of the guidebook presents a global overview and preliminary analysis of the various EMB responses taken to counter electoral disinformation. The purpose of this analysis is to support election authorities as well as donors and implementers to combine, scale and adapt approaches based on an EMB’s capacity and the unique context in which it is working.

“We manage not just the election – but there is another thing we have to be concerned about. This is the social media issue. This makes a very big noise, but it’s not directly an election issue.” – Commissioner Fritz Edward Siregar, The General Election Supervisory Agency of Indonesia (Bawaslu)

Informative vs. Restrictive Approaches to Countering Disinformation

A fundamental tension at the heart of how EMBs choose to respond to electoral disinformation is whether to focus on increasing dissemination of credible information or on restricting or sanctioning content or behaviors deemed problematic. While it may be possible to do both with adequate resources, for some EMBs it is a question of what the guiding principle behind their approach will be. In a report summarizing their disinformation efforts in 2018 and 2019, the National Electoral Institute of Mexico (INE) sums up this choice, and the philosophy behind their approach:

“Disinformation strategies challenged INE with the need to find a way to counter them. One alternative could have been undertaking a regulatory stance … and punish[ing] pernicious practices; although it might have resulted in undue restrictions on freedom of expression. The other was to counter disinformation with its contrary: detailed, timely, and truthful explanation of the electoral process, its stages, tempos, stakeholders, and those in charge…. It was always clear for INE that this second option was the most adequate….”1

Other EMBs, often in concert with a broader intra-governmental approach, err on the side of restricting content and behaviors as a means to prevent harms.

Proactive, Reactive and Collaborative Strategies

The EMB strategies to combat disinformation discussed in this section of the guidebook are grouped into three categories: proactive, reactive and collaborative. Users can click on each strategy in the table below to explore global examples as well as analysis of the considerations that should inform the choice of approach.

EMBs can adopt proactive strategies in advance of electoral periods to promote trust and understanding of electoral processes, put contingency plans in place for when challenges emerge, and establish norms and standards for conduct during elections. Proactive strategies are more likely to build on pre-existing functions within an EMB. In designing a counter-disinformation strategy, EMBs and partners should acknowledge that reactive approaches that attempt to mitigate the impacts of disinformation once it is already in circulation can only address part of the problem. Election authorities, donors and implementers should not let a bias toward technologically innovative programming undercut continued investment in building the types of durable capacity that make EMBs more resilient when disinformation challenges arise.

      Explore Proactive Strategies:
  1. Strategic Communication and Voter Education to Mitigate Disinformation Threats: Building resilience to misinformation and disinformation by ensuring voters receive credible information early, often, and in ways that resonate with them.
  2. Crisis Communication Planning for Disinformation Threats: Putting systems and processes in place so that an EMB is prepared to rapidly and authoritatively respond to misinformation and disinformation in high-pressure situations.
  3. EMB Codes of Conduct or Declarations of Principle for the Electoral Period: Creating norms and standards for political parties, candidates, media and the electorate at large that promote the integrity of the information environment around elections.


Reactive strategies to track and respond to messages in circulation that have the potential to disrupt electoral processes, generate distrust in elections, or illegitimately shift electoral outcomes are an important aspect of countering disinformation. Reactive interventions may be the first to come to mind in designing a counter-disinformation approach, but they can be the most technologically difficult for EMBs to implement and the most resource intensive. While reactive interventions are an integral part of a multifaceted response to disinformation, combining them with proactive strategies – and ensuring that an EMB has the capacity and appetite to implement them effectively – is critical to an effective approach.

      Explore Reactive Strategies:
  1. Social Media Monitoring for Legal and Regulatory Compliance: Monitoring social media during electoral periods to provide oversight of the social media use of candidates, campaigns and the media.
  2. Social Listening to Understand Disinformation Threats: Distilling meaning from conversations happening online in order to inform EMB messaging and responses to misinformation and disinformation during electoral periods.
  3. Disinformation Complaints Referral and Adjudication Process: Establishing a mechanism or process by which election authorities or election arbiters can adjudicate and remedy instances of disinformation.


Regardless of how narrowly or broadly an EMB interprets its mandate to engage in counter-disinformation work, its efforts will achieve maximum impact only when coordinated with those of other stakeholders, including social media and technology companies, civil society and traditional media actors, and other state agencies and institutions. There will always be aspects of the disinformation problem that fall outside the mandate of the EMB. Allocating responsibilities appropriately – so that the EMB can concentrate its counter-disinformation efforts on electoral integrity considerations while coordinating with stakeholders better equipped to handle other facets of the problem – enables a more focused effort on the part of the EMB.

      Explore Coordination Strategies:
  1. EMB Coordination with Social Media and Technology Companies: Coordination between EMBs and technology and social media companies to enhance the dissemination of credible information or restrict the spread of problematic content during electoral periods.
  2. EMB Coordination with Civil Society and Media: Partnerships with civil society and media to build coalitions to counter disinformation and enhance an EMB’s ability to monitor and respond to misinformation and disinformation.
  3. EMB Coordination with Other State Agencies: Partnerships with other state entities to distribute responsibilities and coordinate responses to misinformation and disinformation.
  4. Peer Exchange Among EMBs on Counter-Disinformation Strategies: Creating opportunities for exchange of lessons learned and good practice among election authorities.

Should EMBs have a responsibility to counter disinformation? 

This is a question on which EMBs are not in agreement. Differences in legal mandates, political context, availability of resources, and technical capacity all influence the degree to which an EMB might be willing and able to adopt a substantive role in countering disinformation. 

Different EMBs highlight various aspects of their legal mandate to justify their role in counter-disinformation work. Oversight of the conduct of political candidates or the media during the electoral period, or a voter education or voter information mandate, are some of the avenues that EMBs might use to define the parameters of their role in countering disinformation. A broad responsibility for EMBs to maintain the fundamental right of citizens to vote can also be grounds for an EMB to take an active role. Differing legal mandates will inform what programming is possible to implement with an EMB. For instance, some EMBs’ mandates to monitor traditional media during electoral periods might naturally be extended to monitoring social media as well. For other EMBs, the monitoring of social media during electoral periods would be an inappropriate overstep of their legal mandate. Any programming to bolster EMB responses to disinformation must be grounded in a thorough understanding of the bounds of what is legally permissible.

From a resource perspective, strained budgets or limited control by the EMB over how to use allocated funds can make it difficult to dedicate resources to counter-disinformation activities, particularly if they are seen to divert resources from other essential aspects of election administration. If EMBs struggle to muster the resources to conduct their core mandate of delivering elections, the investment of resources to build out a significant capacity to address disinformation is likely to be untenable. 


While disinformation responses can be housed within different departments of an EMB, many EMBs have chosen to give this mandate to public relations or communications staff. The Independent National Electoral Commission (INEC) of Nigeria, with 90 full-time communications staff, can enact and consider counter-disinformation approaches that are unlikely to be practicable for an EMB with a communications staff of only a few people.

From a technical and human capacity perspective, EMBs may also lack the human resources to contemplate responses to disinformation that are time intensive or technologically sophisticated. Recruiting and retaining staff that have knowledge of social media and technology more broadly can be difficult, particularly if the EMB is attempting to build out an entirely new capacity, as opposed to strengthening or investing additional resources in a capacity that already exists. 

As a final consideration, the political context in which an EMB operates may also impact the institutional independence of the EMB in ways that limit its efficacy as a counter-disinformation actor. In instances where an EMB’s actions are subject to or constrained by the political influence of domestic actors, extending an EMB’s mandate to counter disinformation may be ineffective and the EMB may be reluctant to take on such a role. If the EMB is already perceived to be partial, its efforts to counter disinformation may also further damage its credibility in the eyes of the public. 

In an era of information overload and digital disinformation, it is critical that EMBs are able to cut through the noise with proactive and focused messaging. As credible information can easily be lost in a sea of distracting, problematic and misleading messages, the onus is on authoritative actors – such as EMBs – to ensure credible messages are reaching the right audiences in ways that resonate with them. Proactive counter-disinformation messaging can be embedded within an EMB’s larger communication strategy, or can be one of the outputs of a dedicated counter-disinformation planning process. Either way, an effective communication strategy requires planning and refinement in advance of an election. Depending on patterns of social media use in their country, an EMB may also have the opportunity to use social media to reach specific audiences susceptible to or likely to be targeted by disinformation, such as women, people with disabilities, and people with lower levels of education, among other groups.

Like all of the measures in this section of the guidebook, proactive communication strategies can and should be combined with other proactive, reactive or coordinated responses to form a comprehensive and inclusive approach to improving the integrity of the information environment around elections. The balance or combination of these measures is likely to vary from one election to the next. 

Proactive messaging should not be confused with the more limited idea of messaging that raises public awareness about the existence of disinformation. Awareness of disinformation as a threat is already on the rise, with a 2018 Pew Research Center survey showing that almost two-thirds of adults across 11 surveyed countries believe “people should be very concerned about exposure to false or incorrect information.” This finding is further supported by CEPPS public opinion research.1 Messaging can and should seek to raise awareness of the need to be critical of information sources, think before sharing content, and other basic tenets of digital literacy. However, messaging should also focus on the broader goal of communicating in ways that build trust in the EMB and faith in the integrity of electoral processes.

1.1 EMB visibility has value

Building a track record of consistent communication can help an EMB message with authority during times of confusion or heightened tension that might stem from mis- or disinformation. As a new wave of digital disinformation has made clear, investment in an EMB’s capacity to deliver its core communication mandate through new and established channels is increasingly vital.

The INE of Mexico provides one model to consider for EMBs developing their own organizational approaches to countering disinformation. INE designed a robust digital media strategy ahead of the 2018 elections, aiming to increase the volume of credible, engaging content contending for user attention on social media. During the 2018 electoral period, INE produced and disseminated over six thousand pieces of digital content, which were also available through a centralized website focused on public outreach.2

 “We bet on a different strategy – to confront disinformation with information.” — Dr. Lorenzo Córdova Vianello, Councilor President of the National Electoral Institute of Mexico

The INEC of Nigeria deploys its longstanding institutional investment in public communication as a bulwark against disinformation. During electoral periods, the INEC provides daily televised briefings, participates in live TV interviews, issues regular press statements to explain the policies and decisions of the commission, and runs the INEC Citizens Contact Centre (ICCC) to provide the public with access to the commission and communicate with critical stakeholders. INEC has also had an active social media presence for more than a decade, using it as a channel to disseminate information and interact with voters. As the INEC confronts digital disinformation, their existing communication capacities are being reconsidered and adapted to enhance INEC’s transparency, credibility and perceived integrity in order to sustain public trust and confidence.

The Brazilian Superior Electoral Court (TSE) augmented its traditional public outreach strategies through investment in widely adopted mobile applications that allow election authorities to communicate rapidly and directly with voters and poll workers. The “e-Título” mobile app works as a virtual voter ID card, helps voters identify their polling stations and provides an avenue for direct communication between the TSE and voters. The “Mesários” application provides information and training to poll workers. During the 2020 electoral period, more than 300 million messages were sent to the almost 17 million users of these apps with timely and reliable information on election organization, health protocols amid COVID-19, and tips to fight fake news.

“We want to prevent the dissemination of fake news not with content control, but with clarification, critical consciousness and quality information.” — Justice Luís Roberto Barroso, President of the Brazilian Superior Electoral Court (TSE)

1.2 Counter the objectives of the propaganda, rather than the propaganda itself

A proactive communication strategy will attempt to anticipate what categories of false or problematic messages are likely to gain traction and be damaging during a specific election, and will then aim to build resilience in those areas. The Harvard Belfer Center’s Handbook on National Counter Information Operations Strategy emphasizes that communicators should seek to counter the objectives of propaganda, rather than the propaganda itself. A proactive communications campaign that builds public understanding of election procedures and public trust in the integrity of the EMB is likely more effective preparation than trying to anticipate each false narrative malign actors might choose to employ, particularly since these actors can change and adapt strategies quickly. If one false narrative is not gaining traction, they can simply switch to another.

“Given the volume and content of information operations that competitors can spew out through social and traditional media, [authorities] cannot and should not respond to each false narrative individually. Addressing the content directly adds fuel to the narrative’s fire.” — Belfer Center Handbook on National Counter Information Operation Strategy

Electoral disinformation within an EMB’s purview might seek to undermine faith in the value or integrity of elections or election authorities, incite electoral violence, or seed suspicions of fraud that lay the groundwork for post-electoral legal challenges. As a proactive approach, EMBs and other stakeholders could design a communication strategy in advance of the election around the goals of enhancing transparency and building understanding of electoral processes, highlighting election security measures, or explaining the election dispute resolution process. 

Electoral disinformation might also seek to prevent specific groups from participating in the electoral process by spreading false information about the rights of certain groups and by targeting specific groups with false election information. Disinformation campaigns frequently manipulate and amplify hate speech and identity-based social divisions, allowing malign actors to heighten social polarization for personal or political gain. EMBs can proactively combat these efforts by ensuring that their communications strategies target both majority and minority groups with messages that highlight the rights of women, people with disabilities, and other marginalized groups to participate equally in the electoral process, as well as with other targeted voter information. To reach different groups, information might need dissemination strategies that differ from general voter education efforts – person-to-person; in markets, churches, and other common places; in simple language, in images, or in minority languages – to take into account barriers these groups face when accessing information.

Given changes to the administration of elections introduced through electoral reform in Mexico in 2014, misunderstanding of the new processes was a potential source of misinformation and disinformation during the 2018 election process, the first under the new reforms. A key push of INE’s public communications strategy ahead of the elections was to build understanding of the mechanics of voting, counting and results transmission by explaining new processes clearly and simply so that people knew what to expect at every moment during the election. Communicating in a way that reinforced INE’s political neutrality was also key, as the authorities knew that partisan or bad actors might attempt to politicize the institution.


Indonesia has two distinct election management bodies: the Komisi Pemilihan Umum (KPU), which administers elections, and the election supervisory body, Badan Pengawas Pemilihan Umum (Bawaslu), which is charged with monitoring and oversight of the electoral process.

CEPPS conducted fieldwork in Jakarta in late 2019 to inform the development of this guidebook.

In Indonesia, where intercommunal fault lines are ripe for exploitation, the election oversight body, Bawaslu, created PSAs warning against incitement to violence and hate speech and promoting digital literacy in advance of the 2018 elections. The PSAs were developed with IFES support and disseminated via YouTube, Instagram, Facebook, Twitter, WhatsApp, and Bawaslu’s websites, as well as on digital billboards throughout Jakarta. These PSAs were followed by a second round focusing on the role of participative public election monitoring and tutorials on election violation reporting tools, as well as cautions against incitement to violence and disinformation.3 The strategy and storylines for the PSAs were developed through a consultative workshop facilitated by IFES with both election management bodies and key civil society partners.


To counter hate speech and the spread of disinformation, the Union Election Commission (UEC) of Myanmar, in partnership with CEPPS/IFES, developed animated public service announcements that were shared on the UEC and partner social media channels and websites. The content was also adapted into a comic book and translated into 20 ethnic languages.



The KPU in Indonesia created 3,000 anti-hoax memes, which consisted of infographics and other branded social media content in advance of the election. Content created by the central KPU would be modified by regional offices in response to local context and translated into local languages.




Premier Su Tseng-chang shared this image on Facebook showing him as a young man with a full head of hair, as a means to dispel online misinformation about new government regulations on hair salons. It includes the mock caution: ‘Dyeing and perming within seven days really damages your hair, and in severe cases you'd end up like me.’



1.3 Effective Messaging to Promote Information Integrity

Make Messages Engaging

In the face of constant innovation in communication methods, EMBs must respond to the evolving nature of communication. By no means does this mean that EMBs should abandon traditional communication channels; radio, television and newspapers still directly reach a larger share of the population than social media in most countries, and traditional media is still a part of the information ecosystem that amplifies false and misleading information that originates online. However, revising and innovating within their communication approaches can help EMBs meet their key audiences with messages they will more readily consume and remember. Explicitly identifying ways to create engaging content can be an important part of an EMB counter-disinformation strategy. 

Even if an EMB is already using social media to some degree, strategic consideration should be given to the value of engaging with voters on new platforms or utilizing new features on the platforms where they have an established presence. For example, though the South African IEC has been present on Facebook and Twitter for nearly a decade, during 2019 they made use of a voter registration Snapchat feature for the first time. This in-app feature connected Snapchat users to voter registration resources, and the number of South African users taking advantage of this feature to register exceeded averages from other countries.4 Brazil’s TSE, while continuing to expand their use of Instagram, Facebook and Twitter, established a TikTok presence less than two months before 2020 local elections. Given that content on TikTok can organically reach large audiences without needing to build a follower base first, in those two months, the TSE’s TikTok account gained 20,000 followers and millions of views for their library of approximately 80 videos; a TikTok video outlining health protocols to be followed on Election Day achieved over 1.2 million views alone.

In Taiwan, counter-messages from official channels are encouraged to be funny and “memetic” to increase the likelihood that counter-messaging can organically go viral via the same channels through which disinformation proliferates. For example, to prevent the transmission of misinformation and disinformation during the COVID-19 pandemic, Digital Minister Audrey Tang has established the Taiwan FactCheck Center, which includes Meme Engineering teams that partner with national comedians to clarify online rumors to the public in an expedient, humorous, and effective way. This ‘humor over rumor’ strategy is credited with helping curb the spread of COVID-19 in Taiwan, and the approach can be adapted to countering disinformation beyond the pandemic.

To make content both engaging and credible, EMBs can also identify trusted messengers with the ability to reach specific audiences. Establishing lines of communication with leaders or members of religious groups, sports clubs, libraries, professional networks or other traditionally apolitical spaces might be a means to reach new audiences with proactive messaging. Nigeria’s INEC, for example, works with actors and other celebrities to visit college campuses and build enthusiasm among youth voters. Brazil’s TSE partnered with football clubs as part of their #NaoTransmitaFakeNews campaign urging users to not spread fake news. Eighteen football clubs participated in the campaign, which garnered more than 80 million Twitter impressions across the first and second round of the election.

These networks of trusted messengers, when built in advance, can also be used as dissemination channels and amplifiers in instances where false information needs to be debunked, an approach that is discussed further in the subsection on crisis communication. While these networks can be built by national election authorities, regional EMBs might also benefit from building their own subnational networks of trusted messengers. 

Make Messages Inclusive and Accessible

Ensuring that messages are inclusive and accessible to all people, and in particular to groups that have been historically marginalized, is a key consideration for EMBs. EMBs should consider the diverse ways people access voter information. For example, men in a given country might be more likely to rely on television for voter information, while women might rely on radio messages or conversations with neighbors. EMBs can conduct surveys or polls, or consult with organizations that represent people from different marginalized groups, in order to understand how different voters access information and then be responsive to those needs.

In addition, many social media platforms offer ways for users to easily add accessibility features, such as alternative (alt) text5 to describe photos for screen readers, transcripts for audio files such as radio recordings, or subtitles or captions6 for videos. EMBs that use these features, and that include actors and images of people with disabilities and other diverse identities in their campaigns, make their content more inclusive and accessible. EMBs can help ensure that the content they produce is accessible by distributing messages in multiple formats, such as sign language, easy-to-read, and local languages, and consulting with civil society organizations on the most commonly used platforms, pages and handles.

For example, ahead of the August 2020 elections in Sri Lanka, the EMB collaborated with a group of disabled persons’ organizations (DPOs) to create a social media campaign to ensure people with psychosocial disabilities knew they had the right to vote and to raise awareness among political parties of the need to eliminate derogatory language from their campaigns. The campaign, produced in both Sinhala and Tamil, reached nearly 50,000 people and resulted in the EMB Chairman releasing a public statement acknowledging the political rights of people with psychosocial disabilities.

Another key point in accessibility is considering the gap in access to and knowledge of certain technologies for certain populations. The gender gap in access to technical tools and the internet, for example, is well-documented and underscores the need to continue to disseminate messages in ways that are accessible to those who might not have consistent access to technology. 

Make Messages Memorable

To make a proactive counter-narrative memorable in the face of an onslaught of repetitive and reinforcing disinformation, it must have a clear point and be repeated many times. Research suggests that for both true and false claims, information that is repeated feels more true, even when it contradicts what a person thinks they know.

In response to a fraudulent campaign in which bad actors were using the Central Election Commission’s (CEC) name to knock on doors and collect personal information, the Georgian EMB widely disseminated the message that it does not collect information in this manner, and then repeated the message via multiple communication channels. The EMB’s response did not simply contradict the message that was being spread, but it used the initial incident to raise public awareness about the methods that were being used to deceive and to share a clear message on how to get credible information if voters were faced with similar uncertainty in the future. 

Messaging does not need to be technologically groundbreaking to be effective, but adapting approaches to fit new needs is critical. Stretched resources and staff, outdated or nonexistent strategic communication plans, and a belief that the truth of a message should speak for itself can undermine the communication effectiveness of EMBs. Reflecting on a press conference that her institution had held to debunk false information circulating about upcoming elections, a staff member of an East African election commission observed that it had only served to increase the virality of the rumors and encourage the disinformation. Reactive, static and unengaging counter-messages are less likely to achieve the desired result of building trust in the process.

1.4 Take advantage of unique aspects of social media for EMB use 

Social media can provide EMBs that are equipped to use it with a potent tool for increasing institutional transparency, building trust, and executing their voter education mandates. While institutional use of social media is no longer a cutting-edge idea, there are EMBs that still do not use social media at all, and many that are working to keep their approaches current as patterns of social media use evolve. For countries with high rates of social media penetration, investment in an EMB’s social media capacity is a moderate cost, high impact way of reaching key audiences and providing a counter narrative on the same channels where digital disinformation is originating and spreading. 

Use social media for two-way communication

Social media has the potential to provide a direct channel of dialogue between EMBs and voters. Training and empowering designated EMB staff to take advantage of this two-way channel for communication is therefore very important. Because of the informality of the medium, social media has the potential to be a more authentic, open, timely and responsive means of communication. An EMB’s willingness to directly engage with voters through their social media channels to provide quick, personal communication can build trust and provide an authoritative source where voters can seek or verify information. In deciding to adopt a more robust social media presence, EMBs should be resourced and prepared to follow through on this potential. Once an EMB opens this channel for conversation, they must be ready to sustain it.

 “The deployment of Social Media as a communication strategy employed by INEC has had a profound impact on electoral processes, changing the channels used by citizens and voters to obtain information from the traditional media or one-way communication channel to the mobile-based platforms that allow for two-way interactions through user-generated content and communication.” — Dr. Sa’ad Umar Idris, Director-General INEC Electoral Institute, Nigeria

Segment audiences and reach target audiences

Social media also allows for the potential to segment and reach audiences with messages more uniquely calibrated to resonate with them. This is a powerful strategy already employed by disinformation actors. 

There are two lenses to use when identifying audiences that an EMB may want to target with specialized counter-disinformation messaging. Considering both of these at the outset of developing a counter-disinformation communication strategy can yield different insights into which audiences to reach and how to reach them.

The first lens is to consider audiences that are likely to be consumers of misinformation and disinformation that might impact their willingness or ability to participate. For example, an EMB might identify first time voters, voters with disabilities, voters from an ethnic or linguistic minority -- or any other group of voters -- as particularly at risk of encountering disinformation designed to suppress their democratic participation. By identifying tactics that might be used to inhibit the participation of these groups, EMBs can design and target content that dispels misunderstanding about voter registration, builds understanding of the accessibility of polling stations, or outlines the steps taken to ensure the secrecy and safety of casting a ballot. It is important, of course, for the EMB to understand if these targeted populations are actually using social media platforms (and which ones) before employing this strategy. For example, certain marginalized groups might be more likely to use specific social media platforms because of different individual, institutional, and cultural barriers. 

The IEC of South Africa identifies youth, special voters and those voting abroad in their communication planning as distinct audiences they are trying to reach with specific messages. Furthermore, they build discrete communication campaigns into their overall communication strategy, including messaging around registration, applications for special voting and voting abroad, voting procedures and awareness building about digital disinformation. Integrating this segmentation into a cohesive communication strategy that includes social media can be an important way of proactively using social media to provide information, create a feedback loop, and reach audiences that might need more information as a precursor to participation. 

The second lens is to think about audiences that might be the subject of a disinformation campaign. This might take the form of a disinformation campaign that evokes existing currents of hate against marginalized populations to suppress participation, allege electoral fraud, or promote outrage among dominant identity groups. For example, a disinformation campaign may be designed to intimidate women candidates into dropping out of a race or to allege that immigrant populations are engaging in large-scale voter fraud. 


Equipping EMBs to use social media more effectively to reach different audiences could include:

  • Using social media analytics to determine what types of content are performing well and which audiences are and are not being reached. 
  • A/B message testing, which enables the content creator to compare the performance of different pieces of content so they can quickly pivot toward high performing messaging strategies while jettisoning underperforming content. 
  • Using the targeted advertising features of social media to reach defined audiences.

The complexity of this task can be tailored to match the needs and capacity of an individual EMB, recognizing that for some EMBs only very basic approaches will be possible or advisable, while for others advanced techniques would be entirely appropriate.
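The A/B message testing mentioned above can be grounded in simple arithmetic: compare the engagement rates of two message variants and check whether the difference is larger than chance alone would explain. The sketch below is illustrative only – `ab_compare` is a hypothetical helper implementing a standard two-proportion z-test, the engagement numbers are invented, and nothing here reflects any platform's actual analytics API. It uses only the Python standard library.

```python
from math import sqrt, erf

def ab_compare(clicks_a, imps_a, clicks_b, imps_b):
    """Two-proportion z-test comparing engagement on two message variants.

    Returns (rate_a, rate_b, p_value); a small p-value suggests the
    observed difference in engagement is unlikely to be chance.
    """
    rate_a = clicks_a / imps_a
    rate_b = clicks_b / imps_b
    # Pooled engagement rate under the null hypothesis of equal rates
    pooled = (clicks_a + clicks_b) / (imps_a + imps_b)
    se = sqrt(pooled * (1 - pooled) * (1 / imps_a + 1 / imps_b))
    z = (rate_a - rate_b) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 1 - erf(abs(z) / sqrt(2))
    return rate_a, rate_b, p_value

# Hypothetical engagement numbers for two voter-information posts
rate_a, rate_b, p = ab_compare(420, 10_000, 510, 10_000)
print(f"Variant A: {rate_a:.1%}  Variant B: {rate_b:.1%}  p = {p:.3f}")
```

A communications team might treat a small p-value (say, below 0.05) as a signal to shift resources toward the better-performing variant, while remembering that engagement is a proxy for reach, not for trust.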

It should be noted that in the hands of commercial entities and malign actors, tactics such as those above have been understandably treated with suspicion. While EMBs should always adhere to a high standard of data privacy and data protection, these widely available tools are largely value neutral – it is the uses to which they are put that determine their ethical implications. If EMBs and other democratic institutions do not take advantage of the ways in which social media tools can be leveraged to promote democratic goals and the integrity of their institutions, they cannot hope to compete in shaping the information space around elections, or around democracy more broadly, and will continue to be outmatched by bad actors on the messaging front.

“You have to use social media to engage proactively. If you only use it to react, control or limit social media then that is a losing wicket” — Vice-Chairperson Janet Love, Electoral Commission of South Africa 


EMBs face a potent mix of pressures, including: heightened public perception of disinformation as a threat to elections; pressure to be seen actively countering disinformation; differing levels of understanding of the nature of the problem among EMBs; and the time-sensitive nature of effective responses. Given this context, an EMB’s reaction in the moment might be informed by a perception of immediate need rather than by a larger strategy best suited to promoting electoral integrity. Crisis communication planning can create a roadmap for EMBs to respond to electoral disinformation during sensitive stages of the electoral process. In instances where an EMB has historically relied on ad hoc communication strategies during a crisis, programmatic investment can help the EMB formalize a crisis communication strategy to improve the speed and accuracy with which it is able to respond to mis- and disinformation.

One tactic of disinformation campaigns – whether led by foreign or domestic actors – can be the deliberate attempt to create a crisis mentality in order to sow distrust or confusion and undermine faith in democratic institutions and the electoral process. Not all misinformation or disinformation is indicative of a crisis, and determining the timing and form of an EMB’s response, and in which circumstances it will make a response, is part of the preparatory work that can help an EMB focus resources and decision making when needed.

“Know the way you will react if a problem presents itself, if fake news comes out. The EMB can’t just receive hits.” — Dr. Lorenzo Córdova Vianello, Councilor President of the National Electoral Institute of Mexico

2.1 Don’t create your own crisis

Due to the added attention placed on digital disinformation, EMBs may feel compelled to respond to any and all items of election-specific misinformation or disinformation that they encounter. Crisis communication planning can help establish criteria for what circumstances will warrant a response from the EMB, and in what form. Highlighting the existence of a piece of false or misleading content for the purposes of rebutting it may not always be the best course. In explaining their thinking behind whether to debunk or ignore such content, nonprofit First Draft writes that, “[i]f certain stories, rumours or visual content, however problematic, were not gaining traction, a decision was made not to provide additional oxygen to that information. The media needs to consider that publishing debunks can cause more harm than good, especially as agents behind disinformation campaigns see media amplification as a key technique for success.” This same consideration is important for EMBs. Crisis communication planning allows the time and space for EMBs to develop best practices for how they will provide clarifications or rebuttals so that they do not inadvertently exacerbate the problem. Good practice on how to provide effective fact checks continues to evolve, a topic that is explored further in the guidebook subcategory on fact checking.  

2.2 Create clarity on lines of communication and authority

An integral part of crisis communication planning is ensuring that information flows and hierarchies are delineated in advance. This can be particularly relevant in instances when the EMB is called upon to confirm facts or issue clarifying statements and counter-narratives quickly. A clear and direct communication protocol to coordinate responses can be essential to restoring public confidence, as vague or conflicting clarifications can exacerbate the problem. It is imperative to have clarity on who has the right to issue statements and through what means those statements will be made – not only within the EMB, but also in consultation with other state agencies that may be called upon to clarify. Failure to do so can add fuel to the very mis- and disinformation a public statement is intended to quell. For example, in response to alleged out-of-country voting fraud taking place in Malaysia, the two Indonesian electoral bodies initially issued contradictory clarifying statements (one claiming that the allegation was entirely fabricated, the other stating that the issue was real but minor and had already been detected) – which created more confusion and potentially undermined the credibility of both bodies. Having a clear and expeditious protocol in place for how the two agencies would coordinate messaging could have helped avert this misstep.

Crisis communication protocols should strike a balance between expedience and internal checks for accuracy. An EMB should avoid communications choke points that prevent requests for clarification from being answered with speed. In the case of Indonesia, the third-party fact-checking civil society organization MAFINDO would frequently call on the KPU and Bawaslu to issue rebuttals of false or misleading information in circulation, but the CSO reported that response times varied significantly, at times taking several days, if a response came at all. Because the speed with which a false rumor is rebutted or removed once it has started to gain traction has a significant impact on the ultimate reach of that information, in cases where there is a clear-cut answer, the right individuals should be empowered to issue a clarification.

2.3 Balancing multiple priorities

Crisis communication planning can also help to establish institutional guidelines on balancing communication priorities with other electoral priorities. “While it can be important for the public to see leaders pitching in during a crisis response, there is a limit.”1 For example, Indonesian electoral authorities were very active in investigating cases of viral misinformation and disinformation in the run-up to the 2019 elections. When a rumor spread that cargo ships full of pre-voted ballots had arrived in Jakarta, the commissioners themselves mobilized to go to the port late in the evening on the day the rumor gained traction in order to investigate and issue a public statement. Similarly, a few days before the election, commissioners were deployed to Malaysia on short notice to rebut claims that fraudulent out-of-country voting was taking place. For a severe case in which a false claim has the potential to derail the election, this response was transparent and visible; still, a careful calculation should be made about the best investment of commissioners’ and staff time, particularly in close proximity to elections when demands compete.

2.4 Coordination with Other State Entities

The EMB may be the lead agency in the crisis response, or it may be one member of a network. Misinformation and disinformation are rarely siloed and clear-cut, and will often combine aspects that are within an EMB’s purview to rebut, such as false or misleading information directly related to the electoral process, with issues on which another government agency might be better positioned to comment, such as public health concerns or rumors of violence on Election Day. The subcategory addressing EMB Coordination with Other State Entities provides additional considerations on this topic.

Crisis communication planning must also include the post-electoral period. While an EMB may be active during the campaign period and on Election Day in monitoring and responding to problematic content, the period immediately following Election Day is one of the most at-risk periods for false and misleading information. Misinformation and disinformation that emerge during this period can have implications for public acceptance of the results or for post-electoral violence if, for example, narratives of fraud or malpractice in polling, counting or results transmission gain popular traction in ways that leave citizens feeling disenfranchised. An interlocutor involved in media monitoring in South Africa noted that bad actors may modify their behavior for the better during campaign periods when they are aware of enhanced media monitoring and enforcement efforts, but observed that in South Africa “vile” content ramped up as soon as enhanced monitoring efforts ended.

The immediate post-electoral period can be a particularly strenuous time for EMBs as polls close and results are counted, aggregated and certified. Furthermore, the EMB is likely to be called on to resolve post-election complaints and may also be a party in electoral cases that go to the courts. The coinciding of this exceptionally busy period for the EMB with a window of time that is particularly ripe for misinformation and disinformation means that a clear plan for communication protocol during the post-electoral period is essential, including clarity on communication with, and shared responsibilities among, state entities. Indonesian electoral authorities experienced this firsthand following the 2019 elections, when rumors of electoral fraud led to protests that resulted in the deaths of nine people and the restriction of social media access by government authorities in an attempt to stem the spread of misinformation and unrest. In the run-up to Election Day, Bawaslu had been playing a leading role in coordinating responses to electoral disinformation and flagging problematic content for removal by the social media platforms. However, strained capacity in the days following the election forced them to step back from this role with the expectation that other government agencies would be able to step in. Crisis communication planning can help facilitate a smooth transition of responsibility from the EMB to other agencies when that is appropriate. A plan can help determine in advance how and at what point responsibilities might shift – both public communication responsibilities and communication with the social media platforms.

2.5 Trusted messengers to amplify messages

In anticipating how they will counter electoral misinformation and disinformation, EMBs should consider which messengers they can call on for rapid amplification of crisis communications and who can credibly reach the audiences at greatest risk:

  • Who are the most effective messengers to reach supporters of different political parties? 
  • Who has credibility with groups that might be susceptible to violence or extremism in an instance where false information was rampant? 
  • Who can reach women, youth, or different religious communities? 


Verificado Brazil poster

The Brazilian football club partnership disseminated content that used sports analogies, including this example referring to the VAR football verification system. In this instance, the content seeks to build confidence in election integrity and counter rumors of security flaws in Brazil’s electronic voting machines by noting the widespread use of similar machines in other countries.  

Proactively identifying the right messengers can be a key preparatory step that allows an EMB to disseminate factual information most effectively under pressure. Preparing these networks to disseminate information to their communities in times of information crisis can be an essential way to ensure that an EMB’s message is amplified by credible sources in periods when a flood of information might drown out authoritative actors speaking from their own, more limited, platforms. For example, ahead of 2020 local elections, the Brazilian Superior Electoral Court (TSE), partnered with one of the country’s most popular soccer clubs to counter fake news.

In 2018, the office of the Prime Minister of Finland created an initiative to work with social media influencers to disseminate credible information in a crisis scenario. The network of 1,500 influencers that was established through that initiative was first activated to disseminate credible health information during the COVID-19 pandemic. Working with social media influencers enabled the government to reach audiences that are not consumers of mainstream media. For example, a video of an influential YouTube personality interviewing a government minister and health experts received more than 100,000 views within two days. A similar model could be employed by an EMB, for example, engaging social media influencers in advance of an election to sign a peace pledge that committed them to disseminating credible information and promoting peace on and directly after Election Day in a country where electoral violence is a concern.

While EMBs generally lack authority to sanction or deter the behavior of foreign disinformation actors, they may have a mandate to set standards and norms for domestic actors. Codes of conduct are a tool used by some EMBs to define how political parties, candidates, media or the electorate at large should behave during the electoral period. In recent years, some EMBs have moved to fill the normative and regulatory gap that exists around the use of social media in elections by creating codes of conduct, codes of ethics or declarations of principles (for the purposes of this subcategory, these are collectively referred to as codes of conduct, meaning documents outlining normative behaviors for the electoral period). 

Codes of conduct can either be voluntary, non-binding agreements that result from a consensus among the parties, or they can be part of the legislative and regulatory framework that is binding and enforced. Codes of conduct for the use of social media in elections include examples of both types. Voluntary, non-binding agreements tend to be shorter in length, committing signatories to broad principles. Those that have some weight of enforcement, of necessity, contain provisions that have greater specificity. 

“[The Principles allow] us to say that our political parties agree on a set of rules and it is a first step in moving towards developed democracy where political opponents respect one another and demonstrate issue-based discussions. In the long term, having a culture of dialogue instead of negative campaigning and defamation of political candidates is the goal of this document.” — IFES Interlocutor at the Central Election Commission of the Republic of Georgia


The Election Commission of India created a “Voluntary Code of Ethics for the 2019 General Election,” developed in consultation with representatives of social media platforms to govern the behavior of these entities during the 2019 elections. Additional details can be found in the subcategory addressing EMB cooperation with social media and technology companies.

The guidebook section on Norms and Standards discusses regional frameworks and other transnational examples of norm setting around disinformation. The guidebook section on Legal and Regulatory approaches to countering disinformation discusses a larger array of legal approaches governing the use of social media in elections. This subsection is limited to codes of conduct that address disinformation (exclusively or in combination with other problematic electoral behaviors) and are created and promulgated by EMBs to govern the conduct of political parties, candidates and their supporters, or the media during elections.


EMB codes of conduct intended to limit disinformation can be directed toward various electoral stakeholders and can be limited to a specific election or exist as a standing document. The Central Election Commission of the Republic of Georgia, for example, narrowly tailored the counter-disinformation guidance in its “Ethical Principles of Candidates of 28 October 2018 Presidential Elections” to presidential candidates in the specified election. Panama’s Digital Ethical Pact broadly addresses “users of digital media” in the context of elections. South Africa’s “Code of Conduct: Measures to Address Disinformation Intended to Cause Harm During the Election Period” (in draft form as of December 2020) is aimed at “every registered party and every candidate,” with additional obligations under the code for how those parties and candidates must take appropriate recourse against any member, representative or supporter who behaves in violation of the code. Nepal’s “Code of Conduct to be followed by Mass Media, Non-Governmental Organizations and Observers”1 has chapters addressing different audiences.

Internal codes of conduct that political parties voluntarily adopt to govern the behavior of their candidates and members are discussed in the guidebook section on Political Parties.  


Particularly in the case of codes of conduct that rely on the voluntary commitment of signatories, a consultative development process can increase the legitimacy of the document. In its 2015 guide on developing social media codes of conduct, International IDEA recommends that EMBs “engage in a consultation process with a broad range of electoral stakeholders, especially journalists, bloggers, government agencies, and political commentators, that begins in the pre-electoral phase of an electoral cycle.” Consultations with civil society actors who represent different marginalized groups are also encouraged.

In Indonesia, Bawaslu conducted a highly consultative process in the development of its declaration to “Reject and Counter Vote Buying, Insults, Incitements, and Divisive Conflict in the 2018 Pilkada and 2019 General Election.” The pledge was signed by 102 participating organizations after a three-day consultative event that included CSOs, universities, religious organizations, and youth groups.2 Signatories committed to a seven-point declaration rejecting intimidation and disinformation. This consultative process created a network of known and trusted actors that Bawaslu continued to work with on issues of disinformation and incitement throughout the 2018 and 2019 electoral periods. In this instance, the process of creating the declaration and the network of actors that came out of it was of equal if not greater value than the substance of the code itself. Bawaslu’s coordinated, multi-stakeholder responses to disinformation are explored in more detail in the subsection on EMB Coordination with Civil Society.


Codes of conduct that address disinformation can take many different forms. In some countries, a commitment to refraining from sharing disinformation is included as part of a broader code of conduct that covers all forms of conduct during an electoral period. In others, a code to deter disinformation is created to stand on its own. Some codes are only a few hundred words in length; others are much longer. Despite these differences, there are several common elements that could be considered by other electoral authorities looking to develop their own standards:



Define the scope of violations

Because the array of content that can be considered disinformation is relatively broad, it is necessary for electoral authorities to define the scope of violations that they view as falling under their authority. Particularly for codes of conduct that have some element of enforceability, the provision of clear and specific definitions is essential for enforcement.

South Africa’s code is drawn narrowly to limit its application to the electoral period and ground it firmly in the broader legal and regulatory framework in South Africa. Disinformation is defined as “any false information that is published with the intention of causing public harm.” That reference to public harm is based in the 1998 Electoral Act, which defines “public harm” as “(a) disrupting or preventing elections; (b) creating hostility or fear in order to influence the conduct or outcome of an election; or (c) influencing the outcome or conduct of an election.” This narrowly drawn definition creates gates around the types of disinformation that fall under the responsibility of the EMB; the EMB’s code addresses false information, published with intent to threaten the integrity of the electoral process.

Commitment to Freedom of Expression

Any code of conduct designed to deter disinformation will place bounds on what speech is allowable in an electoral context. As outlined in international human rights declarations and many national constitutions, any limitations on freedom of speech must meet a strict degree of scrutiny. As such, multiple EMBs have opted to include explicit recognition of the commitment to freedom of expression in the text of the code itself.

South Africa’s code, for example, includes an affirmation that efforts to curb disinformation must “tak[e] into consideration the right to freedom of expression” contained in the national Constitution.3 The introductory language to Panama’s Digital Ethical Pact outlines the challenges of disinformation and social media while noting that “it is important to remember that freedom of expression and respect for the civil and political rights that have been so difficult to achieve in a democracy, are and should continue to be, the guide for us to have a better Panama in the future.”4

Ban deliberate sharing of fake news

A core element across codes of conduct intended to limit disinformation is a provision exhorting signatory parties to refrain from knowingly sharing false information. This is drawn more or less narrowly and is framed differently in each code. The Georgian Ethical Principles include broad guidance to “abstain from dissemination of false information with prior knowledge,”5 but provide no additional details. Panama’s Digital Ethical Pact calls on signatories to be vigilant in the face of ‘fake news’ or false information that may endanger the electoral process, and assigns them a proactive responsibility to seek reliable sources of information before sharing messages that may be false.6

This prohibition against the intentional sharing of false information may have precedent in broader national electoral law and general codes of conduct, and may extend existing principles that cover traditional media or campaigning to the realm of social media more specifically. In South Africa, the (draft) disinformation code of conduct is meant “to give effect to the prohibition against intentionally false statements contained in section 89(2) of the Electoral Act [73 of 1998].” Nepal’s code, which covers all aspects of the electoral period, calls on the mass media “not to publish, broadcast or disseminate the baseless information in favor of or against [a] candidate or political party on electronically used social networks such as S.M.S. [sic], Facebook, Twitter, and Viber.”7

Restricting deceptive online behaviors used to promote campaign content

In addition to guidance or limitations on the type or quality of content that signatories can use during campaign periods, codes of conduct can also provide restrictions related to what online behaviors are outside the bounds of ethical campaigning. This most often takes the form of exhortations to refrain from using specific techniques of artificial or manufactured amplification in ways the EMB perceives to be unethical or deceptive.

Panama’s Digital Ethical Pact, for example, instructs signatories to refrain from using false accounts and bots to misinform or promote electoral propaganda.8 Provisions of this nature must strike a difficult balance, given that the disinformation tactics of malign actors continue to evolve. Defining the discouraged online behaviors too narrowly leaves the door open to a range of other tactics already in use; defining them too broadly produces measures with little meaning or deterrent effect. Tying these tools to their potential deceptive uses, as Panama’s Pact does, is one way to strike that balance. A blanket ban on tools like bots would likely be overly onerous and prevent their legitimate use as, for example, part of an effort that provides information to voters on how to cast their ballot.

Prohibitions against incitement to violence and hate speech

In addition to discouraging the dissemination of false information, codes of conduct might also establish the expectation that candidates, parties or other signatories will refrain from incitement to violence or hate speech in campaigning.

Panama’s Digital Ethical Pact instructs digital media users to avoid “dirty campaigns” that “offend human dignity through the use of insults, incursions into privacy, discrimination” or “promote violence and lack of tolerance.”9 Georgia’s Ethical Principles instructed the presidential candidates to “refuse to use any hate speech, or statements that involve xenophobia or intimidation.” South Africa’s code does not explicitly prohibit hate speech, but its definition of “public harm” includes content that “create[s] hostility or fear in order to influence the conduct or outcome of an election.”10

Some codes of conduct also prohibit hate speech based on particular identity categories, including gender, and specifically prohibit violence against women in politics. Codes of conduct must include specific reference to gender-related hate speech and online violence and harassment against women in politics so that actors are held accountable for these specific acts. For example, Guyana’s 2017 Code of Conduct for Media – developed through the election commission’s engagement with leading media representatives – enjoined the media “to refrain from ridiculing, stigmatising or demonising people on the basis of gender, race, class, ethnicity, language, sexual orientation and physical or mental ability” in their coverage of campaigns and elections.11

Application of a social media ban to the campaign period
It is also possible to use a code of conduct as an opportunity to set standards for the behavior of signatories during the defined campaign period, which may include limitations on social media use during a silence or blackout period directly before Election Day. Panama’s Digital Ethical Pact requires signatories to “collaborate with the Electoral Tribunal so that the electoral ban is respected and electoral campaigning is only carried out during the allowed period 45 days before the internal elections of the political parties and 60 days before the general election.”12 Nepal’s code stipulates that during the electoral silence period votes cannot be solicited through campaigning via social media or other electronic means.13 As discussed in the legal and regulatory section of this guidebook, the specifics about what types of content are restricted outside of the campaign period should be clearly defined. For example, authorities may choose to disallow paid advertising, while allowing organic posts on the personal accounts of candidates and parties.
Proactive obligation to share correct information
Codes of conduct may require signatories to not only refrain from sharing false information, but to actively work to correct false and problematic narratives that do circulate. South Africa’s draft code obligates parties and candidates to address disinformation, “including by working in consultation with the Commission to correct any disinformation and remedy any public harm caused by a statement made by one of their candidates, office-bearers, representatives, members or supporters….”14 While not yet observed in practice, including a proactive responsibility for parties and candidates to work with the election commission to counter false or problematic electoral narratives does provide the Commission with an additional avenue for disseminating corrections, counter-narratives or voter information messages as part of a crisis communication strategy. South Africa’s code also requires signatories to publicize the code and educate voters about it.15

3.4 Enforcement

Codes of conduct, as noted above, can be voluntary and nonbinding agreements or they can operate in conjunction with the legal and regulatory framework, allowing some degree of enforcement. Both voluntary and enforceable codes establish normative standards for signatories of the document. For voluntary codes, establishing norms through the public commitment of candidates, political parties and other relevant electoral stakeholders might be the sole purpose of the code. 

EMBs have varying degrees of legal authority and capacity to enforce codes of conduct. In the Georgian case, the decision to adopt a declaration of principles rather than a code of ethics was made, in part, out of a recognition that the CEC lacked an existing mechanism for implementation or enforcement.16 In the case of South Africa, the EMB’s enforcement mandate predates the code on disinformation, as it also has enforcement capabilities under the general Electoral Code of Conduct and the broader legal framework. The South African code defines the boundaries of the EMB’s enforcement capacity, noting, for example, that if the EMB considers any content brought to it under the code of conduct to be a violation of existing criminal laws, it will be duly referred to the appropriate law enforcement agency.17 Similarly, the commission stipulates that it will refer complaints against members of the media to existing bodies that have oversight of the press.18

Even when codes are situated in a clear legal framework, they are less weighty than other types of legal or regulatory deterrents. Vice-Chairperson of the South African IEC, Janet Love, characterized the IEC’s enforcement of the Digital Disinformation Code as “measured” rather than “aggressive.”

“We can’t pretend to have a bazooka when in reality we have a firm stick.” -- Vice-Chairperson Janet Love, Electoral Commission of South Africa

Though codes of conduct carry less legal weight, they provide a flexibility that may be very attractive to EMBs. An enforceable code of conduct may be adopted more easily and expeditiously, and led by the EMB, than a regulatory or legislative reform process. An enforceable code can give EMBs a “firm stick” with which to strongly encourage compliance without resorting to legal proceedings that may drag on too long to allow for timely remedy. Codes of conduct can also sidestep the serious harms that could stem from revising the criminal code as an alternate approach. The potential harms of criminalizing disinformation are discussed further in the guidebook section on Legal and Regulatory approaches to countering disinformation.



4. Social Media Monitoring for Legal and Regulatory Compliance

A limited number of EMBs have a mandate to monitor the social media use of candidates, parties, media outlets or other designated electoral stakeholders to ensure compliance with the legal and regulatory framework. Monitoring might seek to enforce legal limits on campaign spending on political advertising on social media or on campaigning outside of a designated campaign period, or to enforce restrictions on content that has been deemed illegal in the context of an election. For many EMBs, this responsibility is not part of their legal mandate. In these instances, a mandate to monitor and enforce may rest with another entity, such as a media or political finance oversight body or anti-corruption agency, and the guidance outlined in this subcategory would be applicable to their work. Developing a means to monitor social media for compliance must go hand-in-hand with the development of legal and regulatory frameworks that govern the use of social media during campaigns and elections. Without establishing a capacity to monitor, audit or otherwise effectively provide oversight, laws and regulation governing the use of social media during elections are unenforceable. 

In truth, developing effective mechanisms to monitor electoral stakeholders’ online conduct is a challenge without ready solutions for many oversight bodies. Efforts to conduct effective monitoring are often highly dependent on the transparency tools made available by social media platforms. Facebook is ahead of other platforms in rolling out political advertising transparency tools and expanding them to more countries, but many countries still lack access and local users have criticized the Facebook Ad Library for not being comprehensive. Google’s political advertising transparency tools are only available in the EU and a handful of other consolidated democracies, with no observable efforts in 2020 to expand the availability of these tools to more countries. Other platforms offer even more limited tools for political advertising transparency. 

While a range of commercial tools do exist for aggregating social media content to aid in the analysis of the online messages and conduct of political actors, the lack of customization of these tools for use by oversight bodies remains a challenge. Commercial tools are also often costly. Anecdotally, multiple oversight bodies that are starting new efforts to monitor social media during elections have shared that at present their approach is largely manual, consisting of staff members visiting the individual pages and accounts of political actors or other electoral stakeholders to analyze the content that has been posted.

Some EMBs with an oversight mandate are, however, innovating and expanding their ability to monitor social media for legal and regulatory compliance. Bawaslu, for example, monitored the official social media accounts of political candidates during the 2019 Indonesian elections, though it acknowledged the limitations of this effort, observing that candidates kept their official pages free from controversy while misleading or divisive content was disseminated and amplified through social media accounts not officially associated with the campaign. This effort was part of a larger approach to monitoring social media in collaboration with the Ministry of Information and Communication and the Cyber Crime Police Unit, which included efforts to detect deceptive coordinated campaigns on social media that might have links to candidates or political parties.

The efforts of the High Independent Election Authority (HIEA) of Tunisia to establish a capacity to monitor social media during the electoral period are illustrative of some of the approaches and challenges that such an effort may employ or encounter. Tunisia’s legal framework does not have explicit provisions governing the use of social media during electoral campaigns. However, during the 2019 electoral cycle, the election commission decided to monitor online content and social media to ensure that parties and candidates were respecting the principles and rules of the campaign. This work was undertaken as an extension of the work being done by the HIEA’s Media Monitoring Unit (MMU), which looks at electronic and print media during electoral periods. While the MMU was able to surface insights into the use of social media during the election, it also ran into a challenge common to social media monitoring efforts – defining the boundaries of which accounts are subject to monitoring. In Tunisia, as in other countries, the vast majority of offenses were observed to come from undeclared pages and accounts rather than the official accounts of the candidates. This creates challenges, as in most cases there is insufficient evidence to definitively attribute these violations to the candidate or campaign benefiting from the problematic content.


What is meant by "Social Media Monitoring"?

An increasing number of EMBs are identifying the ability to monitor social media as a skill that would aid them in fulfilling a counter-disinformation mandate. However, there are two different functions that are commonly implied by the phrase "social media monitoring."

The first function is monitoring the social media use of candidates, parties, media outlets or other designated electoral stakeholders for the purposes of ensuring compliance with legal and regulatory guidance. This function is intimately linked to the detection of violations and is necessary for enforcement of the legal and regulatory framework as it applies to social media.

The second function that is often implied by the phrase "social media monitoring" might more accurately be described as "social listening." Rather than monitoring the behavior of certain actors, social listening is an attempt to distill meaning from the broad universe of conversations that are happening on social media and other online sources in order to inform appropriate action.

These two functions are explored under separate subcategories: (4) Social Media Monitoring for Legal and Regulatory Compliance and (5) Social Listening and Incident Response.

4.1 Defining a monitoring approach

Does the EMB have a legal mandate to monitor social media?

Prior to launching an EMB-led social media monitoring effort, the legal and regulatory framework must be consulted to ascertain the following:

  • Does the EMB have a legal mandate to undertake monitoring activities? 
  • If not, does this mandate lie with a different government entity that would prevent the EMB from conducting their own monitoring efforts?
  • What legal and regulatory guidance exists, if any, for the use of social media during election periods?
  • If there are no specific provisions related to the use of social media, are there general principles for the conduct of candidates, parties or other electoral stakeholders during the campaign that can be reasonably applied or extended to social media? 

What is the goal of the monitoring effort?

After consulting the legal and regulatory framework, an EMB must establish an objective for their monitoring effort. For example, is the objective:

  • To detect political advertising on social media that takes place outside of the designated campaign period? 
  • To identify instances in which online behaviors violate the legal framework governing the abuse of state resources?
  • To monitor the content posted by candidates and parties to ensure compliance with any legal guidance on refraining from hate speech directed at women or other marginalized groups, incitement to violence, disinformation about the election, or other prohibited messages? 
  • To verify that reported spending on social media political advertising is accurate? 

If there is little or no current legal or regulatory guidance on the use of social media in elections, is the effort:

  • To gather information and evidence to inform future law reform conversations or the development of a code of conduct? 
  • To raise public awareness about problematic content and behaviors that parties and candidates are engaging in on social media, including online harassment and violence against women and other marginalized groups?

What is the time-period for social media monitoring?

Based on the goal that is identified, the EMB should define how far in advance of elections social media monitoring efforts will begin, and whether they will extend to any part of the post-electoral period.

Will monitoring be an internal operation or will the EMB coordinate with other entities?

An EMB will need to ascertain whether it has sufficient capacity to conduct a media monitoring effort independently:

  • Does the EMB have the human capacity and financial resources to conduct their own monitoring effort?
  • Are there other state agencies or oversight bodies monitoring social media during elections that should be consulted or partnered with before an EMB launches their own effort? 
  • Are there any restrictions or prohibitions that would limit the EMB’s ability to procure outside services from the private sector to augment the EMB’s capacity?
  • If the objective of the monitoring effort is to gather information and evidence to inform future law reform conversations or to understand how certain marginalized groups are targeted by disinformation, is there a role for credible civil society actors focused on advocating for legal reform or representing marginalized groups to provide the EMB with additional information and analysis?   

What kinds of social media ad transparency tools are available in-country?

Understanding what is feasible for an EMB to do is in part contingent on the tools that technology and social media companies have made available in their country:

  • Will the Facebook Ad Library1 be enforcing disclosure for political and issue advertising in the relevant country? 
  • Will a Google Transparency Report2 covering political advertising be available for the country?
  • Do any other widely-used social media platforms offer transparency reports or features of any kind?
  • If yes to any of the above, is the EMB equipped to use these tools to execute their mandate or is training necessary? 
  • Does the EMB have the authority to make legally binding requests of the social media platforms for information as part of an enforcement effort?

Further discussion of the definitional considerations necessary for establishing a social media monitoring approach is found in the guidebook section on Legal and Regulatory responses to disinformation.

4.2 Tying social media monitoring to a response

Based on the identified goal of the social media monitoring effort, the EMB should identify how they will make use of the insights they gain through their monitoring efforts. 

  • If a legal and regulatory framework defining violations is in place, the EMB should identify how cases will be referred to appropriate enforcement agencies for further investigation and possible sanctions, if they do not have the ability to issue sanctions themselves. 
  • If the identified goal is to inform future regulation or the development of a code of conduct, a plan should be made for how content or behaviors that might constitute a violation under a revised legal framework is documented and explained in a way that would be compelling for the necessary audience of regulators or lawmakers. 
  • If the goal is to deter bad behavior by raising public awareness about the questionable or illegal conduct of parties and candidates, the public relations or communications department of the EMB should be involved in developing a plan for communicating findings to the general public. 

Responses must take into account gender considerations and, in particular, should ensure that violations targeting marginalized groups or exploiting stereotypes about marginalized groups are specifically addressed so that these groups are not further marginalized by responses that are blind to their concerns and experiences.

5. Social Listening and Incident Response

Rather than monitoring the behavior of certain actors, social listening is an attempt to gain insights into the sentiment, misperceptions, or dominant narratives circulating on social media and other online forums in order to inform appropriate action. An EMB may wish to set up social listening to inform a rapid incident response system or to inform strategic and communication planning. Understanding which narratives are circulating and gaining popularity in online spaces can help an EMB effectively counter narratives that threaten election integrity.

If electoral authorities wish to monitor political parties or other electoral stakeholders for compliance with the legal and regulatory framework, please refer to the prior subsection on Social Media Monitoring for Legal and Regulatory Compliance.

5.1 Understand EMB capacity and purpose

Setting up a social listening and response capacity is not a one-size-fits-all effort. Some EMBs have the staff capacity, recognized mandate, and financial resources to set up comprehensive efforts. For other EMBs, the barriers to entry to establishing social listening capacity may seem (or be) insurmountable and may divert attention from more essential activities. If donors and international assistance providers are assisting an EMB to establish or strengthen a social listening capacity, it is essential to tailor the monitoring effort to fit the EMB’s needs and capacity.

EMBs will have different purposes for setting up social listening capacity. This subsection focuses primarily on EMBs that wish to build a real-time monitoring effort that allows them to identify and respond to disinformation or other problematic content swiftly. Other EMBs may wish to use social listening earlier in the electoral cycle to inform communication strategies. This proactive strategy is briefly discussed in this subcategory for ease of comparison with other reactive applications of social listening. These efforts are not mutually exclusive and an EMB may choose to pursue both.

5.2 Social listening to inform rapid incident response

The social listening and incident response efforts of Mexico’s National Electoral Institute (INE) illustrate what a fully staffed and resourced social listening effort can look like. INE designed and deployed “Project Certeza” in the days prior to and on Election Day in 2018, and implemented the same system for the 2019 elections. Project Certeza’s purpose was to “identify and deal with false information disseminated, particularly through social networks but also through any other media, that could produce uncertainty or distrust in the citizenry about the electoral authority’s responsibilities as the election is happening.”1 The effort included a technological monitoring system developed by INE, which screened millions of pieces of social media content and other sources for potentially problematic words and phrases associated with elections. Flagged content was then referred to human moderators for verification and a determination of whether the content required action. In addition to this remote monitoring, INE hired a network of temporary field operators to gather real-world information and document first-hand evidence that could be used to refute false and inaccurate claims.2 Evidence and analysis from the remote monitoring and field teams were then shared with INE’s social outreach division, which disseminated specifically tailored refutations or voter information content via social networks and media outlets. The team working on Project Certeza included senior officials from eight different divisions at INE, which meant that immediate decisions could be made on appropriate responses.3 An effort as comprehensive as Mexico’s will be beyond the reach of most EMBs, but elements may still be illustrative for EMBs designing their own social listening efforts.
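The first, automated stage of a pipeline like the one described above – screening content for watch-listed phrases before handing it to human moderators – can be sketched in a few lines. This is a minimal illustration under assumed inputs, not INE's actual implementation; the trigger phrases and post structure are hypothetical.

```python
# Hypothetical watch list of election-related phrases worth a human look.
TRIGGER_PHRASES = [
    "ballots stolen",
    "polling station closed",
    "votes changed",
]

def screen_posts(posts):
    """Return posts mentioning any trigger phrase, queued for human review."""
    flagged = []
    for post in posts:
        text = post["text"].lower()
        if any(phrase in text for phrase in TRIGGER_PHRASES):
            flagged.append(post)
    return flagged

# Illustrative input: one benign post, one that should be flagged.
posts = [
    {"id": 1, "text": "Great turnout at my polling place today!"},
    {"id": 2, "text": "I heard ballots stolen from a truck overnight"},
]
review_queue = screen_posts(posts)
print([p["id"] for p in review_queue])  # → [2]
```

The keyword match only narrows the stream; as in Project Certeza, a human reviewer still decides whether flagged content requires a response.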

As an alternative to such an approach, election authorities might consider interventions that help voters encounter reliable information when they seek more details about a piece of disinformation they have encountered. Election authorities in the U.S. State of Colorado monitored social media to identify trending misinformation and disinformation about the 2020 U.S. elections and then purchased Google ads tied to relevant search terms. This was an attempt to ensure that information seekers using the search engine to look up the disinformation they encountered were directed to credible sources, rather than to search results that further fed conspiracy theories. Placing Google ads to ensure credible results appear at the top of a search page can be one approach to combating disinformation that emerges through “data voids,” which occur when obscure search queries have few results associated with them, making it easier for disinformation actors to optimize their content so that information seekers encounter content that confirms rather than rebuts the disinformation.

Another prospective area for social listening that might be better suited to EMBs that lack internal capacity to set up an independent effort is partnering with a technical assistance provider, working with civil society or contracting a credible private entity that specializes in social listening to set up an early warning system of alerts that could be monitored by EMB staff. Alerts could be built around key phrases, such as the name of the EMB, that would be triggered when social media content containing those phrases starts to go viral. The alerts could be designed based on high likelihood, high impact scenario planning that might be included as part of the development of a crisis communication strategy. For example, an EMB might determine that voter registration in a particular region or the integrity of overseas voting are topics at high risk of being the subject of damaging mis- or disinformation. By anticipating these scenarios, the EMB could tailor alerts that would flag potentially problematic content as it starts to gain popularity.
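A simple version of such an alert can be built around mention counts per time window, triggering when the current window spikes above the recent baseline. The sketch below is illustrative only: the watched phrase, thresholds, and sample posts are assumptions, and a real deployment would pull posts from a platform API or a monitoring vendor rather than hard-coded lists.

```python
from collections import deque

class PhraseAlert:
    """Flag when mentions of a watched phrase spike above a recent baseline."""

    def __init__(self, phrase, window=5, spike_factor=3.0):
        self.phrase = phrase.lower()
        self.history = deque(maxlen=window)  # mention counts from prior windows
        self.spike_factor = spike_factor

    def observe(self, posts):
        """Count mentions in this window; return True when they spike."""
        count = sum(self.phrase in p.lower() for p in posts)
        baseline = sum(self.history) / len(self.history) if self.history else 0.0
        self.history.append(count)
        # Treat a baseline below one mention per window as one, so quiet
        # periods do not make every stray mention look like a spike.
        return count >= self.spike_factor * max(baseline, 1.0)

# Hypothetical scenario: quiet chatter, then a burst of registration rumors.
alert = PhraseAlert("voter registration")
quiet_window = ["lovely weather at the polls"]
spike_window = ["they deleted my voter registration?!"] * 6
first = alert.observe(quiet_window)   # no mentions → no alert
second = alert.observe(spike_window)  # 6 mentions vs. quiet baseline → alert
```

As with the scenario planning described above, the value of such an alert lies less in the mechanics than in choosing phrases tied to the high-likelihood, high-impact risks the EMB has identified in advance.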

This approach would be considerably less comprehensive than a well-staffed internal monitoring effort, but for EMBs that lack more robust options, limited solutions may still have value. This research has not surfaced any examples of EMBs using this strategy currently, but a network of civil society actors in Slovakia, including media monitoring and elections CSO Memo98, used a similar model to set up a series of alerts for the Slovak Health Ministry to notify them of trending misinformation and disinformation about COVID-19. The ability of their counterparts at the Ministry to use the alerts in actionable ways was limited, suggesting that any initiative of this nature must be carefully planned to meet the needs and capacity of the EMB. 

Existing methodologies for detecting online violence against women in elections could be adapted to assist EMBs in understanding the ways in which gendered messages are contributing to distortions of the information environment around elections and to craft more impactful responses based on these insights. CEPPS has used AI-informed social listening to monitor online violence against women in elections, and findings and lessons learned from this work could be used to inform disinformation programming. Lessons learned from this work confirm that automated data mining techniques only go so far in distinguishing problematic content, and that the combination of automated techniques and human coders is essential to having accurate insights. 

5.3 Social listening to inform strategic and crisis communication planning

Social listening can be integrated into the development of communication strategies, providing insights into how electoral processes, the information environment and the EMB are perceived among different demographic groups. This understanding can in turn help an EMB craft evidence-based communication strategies to reach different audiences.

To inform its strategic and crisis communication planning, the Independent Electoral and Boundaries Commission (IEBC) of Kenya worked with a social listening firm to receive an overview of the social and digital media landscape in Kenya prior to the 2017 elections. Insights gained through social listening are made more valuable through further analysis; the outside firm combined insights from its social media analysis with findings from a series of focus group discussions that explored awareness and perception of various digital platforms, how different sources of information were used by voters, and the motivations behind sharing “fake news,” misinformation, and hate speech. Focus group participants also shared perceptions of the IEBC and provided feedback on the persuasiveness of sample messaging strategies. Engaging outside experts to conduct this analysis can supplement the EMB's capacity.




An increasing number of EMBs are identifying the ability to monitor social media as a skill that would aid them in fulfilling a counter-disinformation mandate. However, there are two different functions that are commonly implied by the phrase “social media monitoring”: 

  • Monitoring the social media use of candidates, parties, media outlets, or other designated electoral stakeholders to ensure compliance with legal and regulatory guidance
  • Engaging in “social listening”, or the attempt to distill meaning from the broad universe of conversations that are happening on social media and other online sources to inform appropriate action.

Full descriptions of these functions can be found in the prior subsection.

5.4 Defining a monitoring approach

Given the variations in need and capacity, each monitoring approach must be calibrated to suit the institution that uses it. 

What is the goal of the social listening approach?

Examples of insights an EMB can gain through social listening include:

How the EMB is being talked about on social media. 

Given that one goal of anti-democratic influence operations is to undermine trust in electoral processes and institutions, social listening can help an EMB engage in some degree of “reputation management.” Social listening can give insights into where EMB performance may be seen as lacking, can help explain any accusations directed toward the EMB, or can help EMBs understand where a lack of transparency in their operations might generate distrust.

Whether false or problematic narratives about elections are gaining traction on social media.  

As part of an Election Day incident response plan, an EMB can monitor social media for allegations of malpractice, fraud, or violence in certain regions or at particular polling stations that need to be corrected or acknowledged. They can also use this information to determine how to distribute resources or support to districts or polling stations that are experiencing difficulties.

Whether misinformation or disinformation is circulating that might suppress voter turnout or otherwise impact the integrity of the election.

Based on their crisis communication planning, EMBs can determine when and how they will respond to voter interference messages that they might detect circulating on social media. If social listening reveals ways in which certain populations are being targeted as subjects or consumers of disinformation, for example, an EMB could use that information to focus counter messages toward impacted populations. 

What is the time period for social listening?

An EMB must determine how far in advance of elections social listening efforts will begin. Depending on resources and the goals of the social listening exercise, EMBs may choose to monitor only a narrow window of time around Election Day, or they may choose to monitor the entirety of the campaign period. EMBs using social listening for rapid incident response should also plan to continue efforts through the immediate post-electoral period, when false and misleading information with the potential to incite violence or delegitimize results may be at its highest.

EMBs using social listening to inform their strategic or crisis communication plans must strike a balance: completing this work far enough in advance to have strategies in place in time for the election, but not so far in advance that insights into voters’ perceptions of the information environment are outdated by Election Day.

Will the social listening effort be an internal operation or will the EMB partner with other entities?

An EMB will need to ascertain whether it has sufficient capacity to conduct a social listening effort independently:

  • Does the EMB have the capacity and resources to conduct their own social listening effort?
  • Are there other state agencies, civil society organizations or academics conducting similar work that might be able to partner with the EMB to do this work? 
  • Are there any restrictions or prohibitions that would limit the EMB’s ability to procure outside services from the private sector to augment the EMB’s capacity?

Which tools will the EMB use to monitor social media platforms or other online sources? 

If an EMB does not have the capacity to develop their own system, as Mexico’s INE did, a range of social listening tools are available. Those that are most comprehensive are available through paid subscription. Many of these tools and possible applications are discussed in NDI’s publication Data Analytics for Social Media Monitoring.

5.5 Tying social listening to action

The purpose of engaging in social listening is to inform more effective responses from the EMB. To that end, social listening for the purpose of rapid incident response should be closely aligned with an EMB’s crisis communication planning. Based on scenario planning done during crisis communication planning, the EMB should map out its process for responding to any problematic or misleading content identified through social listening. There should be clear lines of internal communication for verifying suspect content. This process may include receiving rapid input from regional election commissions or individual polling stations. Communication channels, including traditional media actors or identified trusted messengers, should also be established in advance.

Additionally, social listening may surface cases that warrant referral to another government entity. For credible reports of activities in violation of the criminal code, the EMB should be prepared to refer reports to the appropriate actor. For example, INE’s social listening efforts in the 2019 Mexican elections surfaced three credible reports of vote buying that were referred to the Special Attorney on Electoral Crimes.4

Given controversy and a lack of consensus over the standards by which social media platforms determine what content is allowable on their services, many countries are interested in increasing national sovereignty over content decisions. The IEC in South Africa and Bawaslu in Indonesia have adopted disinformation complaints and adjudication processes to increase national decision-making power over the removal of certain types of content from social media platforms during electoral periods.

"If you submit a complaint to Twitter or Google – your complaint [is] adjudicated according to the terms of a private company. If you don’t like the outcome, there is nothing you can do about it, there is no transparency. It means a foreign entity is determining things of national importance." William Bird – Director of Media Monitoring Africa

Rather than flagging content via native reporting functions within the social media platform and leaving it up to the company’s discretion and community standards to remove or downrank that content, Bawaslu and the IEC developed processes that allowed them to issue a decision with the force of law to compel the platforms to remove content. For this approach to be given consideration, EMBs must be independent and credible institutions. If an EMB is not sufficiently independent of political pressures, a process such as this could be easily abused for political advantage.

South Africa: Ahead of May 2019 Elections, the IEC worked with civil society to establish a complaints referral and adjudication process. An online portal was launched that allowed the public to lodge complaints about specific pieces of content. Complaints were received by the IEC’s Directorate of Electoral Offences, which worked with a Digital Disinformation Commission (DDC) composed of outside media, legal and technology experts to assess the complaints and make recommendations to the Commission for action. Decisions were communicated to the public by the IEC through regular reports to the media and the status of complaints as they made their way through the process was publicly tracked on the IEC’s website.

Indonesia: Ahead of April 2019 Elections, Bawaslu established a complaints referral and adjudication process. In addition to receiving complaints directly from the public, Bawaslu also received a weekly compilation of complaints that had been received by the Ministry of Communication and Information Technology. A part-time task force within Bawaslu would then assess and categorize the content to determine if it was in violation of national standards. Content that was determined to be in violation was sent to the social media platforms via an accelerated channel for review and removal.  Of the 3,500 complaints received by Bawaslu, 174 were determined to be within their jurisdiction (related to elections) and processed for further review by the platforms. 

Setting up a complaints referral and adjudication process is a labor- and resource-intensive undertaking. An EMB contemplating this approach should weigh several factors in deciding whether to set up such a system and, if it does, how to design the system effectively.

6.1 Determine who has standing to bring a complaint

As with any complaints adjudication process, the designers of the system must consider who has standing to bring a complaint for consideration. Should anyone have the ability to flag a piece of content for review by the EMB? Only political parties and candidates? Should the EMB aggregate complaints received by other government agencies or bodies?

The South African system provides that “the Commission shall receive complaints of disinformation during the election period from any person.”1 This is operationalized through the Real411 system, which includes a web portal where any member of the public, regardless of whether they are eligible voters, can flag content for review. The portal now receives complaints year-round, not only during the electoral period, and is maintained by the civil society group Media Monitoring Africa, which organizes the three-person review teams of outside experts that make up the DDC. During elections, the DDC makes recommendations to the IEC’s Directorate of Electoral Offences on actions to be considered.

Systems open to reporting from any member of the public create opportunities for brigading, in which actors wishing to overwhelm or discredit the system flood the reporting channel with disingenuous or inaccurate reports. A non-disinformation example took place ahead of 2020 Serbian elections, when a party boycotting the elections created a viral Facebook campaign encouraging supporters to submit claims via the EMB’s election complaints process, apparently to overwhelm the EMB’s dispute resolution capacity. Though the cases were dismissed, they reportedly caused administrative delays that weighed on the effectiveness of the complaints process. South Africa attempts to mitigate this risk by requiring complainants to confidentially submit their names and email addresses along with their complaints.

Bawaslu, recognizing that overly formal reporting mechanisms can significantly slow collaboration, maintained informal WhatsApp channels with counterparts at the Ministry of Communication and Information Technology, the police, and the army to refer and share intelligence about complaints, in addition to receiving complaints directly from the public. On a weekly basis, the Ministry of Communication and Information Technology would compile and send the content reports it had received to Bawaslu for review, classification and a determination on further action. Interlocutors at Bawaslu estimate that they received an average of 300 to 400 reports per week.

6.2 Clearly and narrowly define what content constitutes a violation

The definitions that a disinformation complaints referral and adjudication body uses to determine what content constitutes a violation that requires remedy or redress must be clearly and narrowly drawn and fit within the country’s constitutional, legal and regulatory framework.

In South Africa, the complaints process is integrated into the draft Code of Conduct for Measures to Address Disinformation Intended to Cause Harm During the Election Period. The code itself draws clear definitions of what constitutes disinformation – specifically, content intended to cause public harm, which includes disrupting or preventing elections or influencing the conduct or outcome of an election. As discussed in the subsection on codes of conduct, the code's definitions are firmly grounded in the South African Constitution and electoral legal framework. The same standards are used in each phase of the complaints process: by the DDC, which is external to the IEC, as well as by the Electoral Offences Office and the Commissioners within the IEC.

Arriving at standardized definitions presents an opportunity for EMBs to engage in consultation and relationship building with potential allies in the fight against electoral disinformation. In Indonesia, Bawaslu created standardized definitions for unlawful content in electoral campaigns. Prior to 2018 local elections, existing laws outlined categories of prohibited content, such as hate speech, slander, and hoaxes, but these categories lacked clear definitions. To arrive at definitions, IFES supported Bawaslu in conducting a series of roundtable discussions engaging more than 40 stakeholders from government, civil society and religious organizations to discuss definitions for the types of content prohibited in electoral campaigns. This feedback was then taken into consideration in the formulation of Bawaslu’s Regulation on Prohibited Electoral Campaign Content.2 Consultation can be more narrowly drawn as well; in the run-up to the launch of their complaints process, definitions in South Africa were discussed by a working group that included IEC members, media lawyers and members of the press.

Definitions are also likely to evolve over time as the complaints process is put to the test. In South Africa, initial discussions included whether hate speech and attacks on journalists should be covered by the complaints process. Though both were excluded from the definitions used during 2019 elections, Media Monitoring Africa and their partners developed definitions and reporting processes for these additional categories of complaints after the elections. Complaints on these additional topics can now be submitted via the complaints portal and may be considered by the IEC in future elections.

It might also be useful to examine existing legal and regulatory frameworks around gender-based violence, violence against women, or gender equality that can be used to create definitions for online content that may violate these laws and regulations. Including definitions specific to violations that disproportionately affect women and other marginalized groups is key in making sure their concerns and experiences are addressed through this effort.

6.3 Provide a range of remedies and sanctions

An adjudication process should provide for a variety of remedies and sanctions that can be adapted to fit the violation identified. It may be desirable for a complaints adjudication process to have more remedies at its disposal than the referral of content to the platforms for removal.

In both South Africa and Indonesia, the most common judgment regarding referred complaints was to take no action – either because the content was not deemed to rise to the threshold of constituting public harm or because it fell outside the narrow focus on election-related content and was therefore outside the jurisdiction of the EMB.

The South African IEC has discretion to determine appropriate avenues for recourse. These include:3

  • A determination that no action is necessary
  • Engagement with the party or candidate that committed the violation to urge compliance with the disinformation code of conduct, which stipulates that signatories must act to correct disinformation and remedy public harm in consultation with the IEC, including disinformation that originates with the signatories’ representatives and supporters
  • Referral to the appropriate regulatory or industry body that has jurisdiction, including the Press Council of South Africa or the Independent Communications Authority of South Africa
  • Referral to a relevant public body, such as the police, for further investigation or action
  • Referral to the Electoral Court for appropriate penalty or sanction
  • Use of IEC communication channels to correct disinformation and remedy public harm

The remedies envisioned through Bawaslu’s process were narrower than those of their South African counterparts. Bawaslu had authority to observe social media during the campaign period, but not to take action against violators. The primary focus of their process was to elevate content for review and removal by the platforms. Where content violated platform community standards, it was removed or its distribution limited in accordance with platform policies. Where content violated Indonesian law but not Facebook’s community standards, Facebook ‘geoblocked’ the post – meaning it was inaccessible from within Indonesia, but still accessible outside of the country.

In addition to content removal or restriction, Bawaslu also used the content they collected to identify voter education and voter information themes to emphasize in their public messaging. They also referred cases to the criminal court system in instances where content violated the criminal code. Bawaslu reported that there were no instances of sanctions against political parties under the criminal code, though actions were taken against individuals. Notably, the highly publicized “seven containers hoax,” which alleged that cargo ships full of pre-voted ballots had been sent to Jakarta, led to criminal charges against the individuals who started and spread the hoax.

6.4 Consider the timeline for adjudication and action

The timeline for adjudication and action is a significant challenge for complaints referral and adjudication processes. Though some content identified as high priority was expeditiously addressed, the systems that Indonesia and South Africa used during their respective elections had multiple stages that at times took weeks to clear before a decision on an individual piece of content could be issued. Given the volume of posts, the quick iteration of messages and tactics, and the speed with which problematic content can go viral, a slow process for removing individual pieces of content is unlikely to have a measurable impact on the integrity of the information environment. By the time a piece of content has been in circulation for a day or two – much less a week or two – it is likely to have done the majority of its damage, and the churn of content will ensure that new narratives have emerged to occupy public attention.

In instances where the remedy or sanction sought goes beyond content removal, as in the case of South Africa, a slower timeline may not reduce the effectiveness of the remedy. Media Monitoring Africa would at times pursue a dual track, referring content to the IEC and to the platforms simultaneously: to the IEC for consideration of the array of remedies in its power to issue, and to the platforms for review and expeditious removal. However, if the primary remedy sought by a complaints adjudication process is the removal of content from a social media platform, a multi-step complaints referral process may not be an efficient way to achieve it. Indeed, content removal may not be a goal that EMBs should pursue at all.

6.5 Communicate the process and its outcomes to the public

The goals of a complaints referral and adjudication process should be twofold: to remedy the harms of disinformation and to build confidence among the electorate that authorities are effectively addressing the challenges of disinformation in ways that protect the integrity of the electoral process. Thinking through a communication strategy to publicize the efforts and successes of the process is therefore a critical component of making the most of a complaints system. The very existence of the complaints system, if compellingly communicated to the public, can help rebuild public perception of the trustworthiness of democratic processes and election results.

In the case of South Africa, a subsidiary benefit of running the complaints process was that the IEC could offer reassurances to the public after the election that the integrity of the election had not been undermined by coordinated malign actors seeking to distort outcomes or disrupt election processes. The referral mechanism in some ways also served as a crowdsourced media monitoring effort, and contributed to the conclusion that there was no evidence of foreign or state-linked influence operations that were operating at scale. The IEC concluded that there were instances of misinformation and disinformation, but no evidence of a coordinated disinformation campaign. 

The complaints process developed by the IEC included planning for how to keep the public informed about the decisions that were made. As a way to build public awareness and interest in the complaints system, the IEC provided regular reports to the media that summarized the complaints that were received and how they were handled. Though the complaints process was only active for a brief period before elections, the IEC’s communication efforts helped build support for the complaints process, leading to calls for the system to continue even after the election. 

6.6 Provide adequate time to develop and review the complaints process   

Launching an effective complaints adjudication process and securing institutional buy-in for it is a complex endeavor, and such a system may take time for implementers to learn to use. The ramp-up time for a system may be extensive, particularly if it involves consultative elements and the development of common definitions. Any existing system should be reviewed in advance of each election to ensure it is suited to the evolving threat of electoral disinformation.

Plans and consultations for South Africa’s Real411 System began in the fall of 2018 and the system was only able to begin operation in April 2019 ahead of May elections. Consultations on definitions of violating content in Indonesia, though at the time unconnected with the complaints referral system, took place a year in advance of 2019 elections.


Coordination between EMBs and technology and social media companies to enhance the dissemination of credible information or restrict the spread of problematic content during electoral periods.

Technology and social media companies – including but not limited to Facebook, Google, Twitter, TikTok, and their subsidiary companies including Instagram, WhatsApp, and YouTube – have a role to play in ensuring that elections take place in a credible information environment and that their platforms are not used to undermine the integrity of elections. While companies must be held to account for harms that may stem from their platforms and services, progress towards alleviating these harms can be enhanced through direct engagement with these companies.

“Of course technology companies have more tools for how to regulate what happens on their platforms. There are things governments can’t do, but technology companies can – we need their help, but you need boldness from government to say so.” – Commissioner Fritz Edward Siregar, General Election Supervisory Agency of Indonesia (Bawaslu)

The market size of a country matters when it comes to how many resources social media companies are willing to invest and how available they are to assist and coordinate with EMBs. Before the 2019 elections, the Election Commission of India was able to convene representatives from top social media companies for a two-day brainstorming session on approaches to problematic social media content in elections, gaining a commitment from those companies to abide by a code of ethics. Conversely, EMBs of smaller countries have reported difficulty getting company representatives to respond to their messages, even after establishing a point of contact within the company. There is significant variation in EMBs’ experiences working with social media and technology companies, as platforms may dedicate variable levels of support to specific countries based on factors including market size, geopolitical significance, potential for electoral violence, or international visibility.


 “We aren’t naïve – these are profit-driven companies.” – Dr. Lorenzo Córdova Vianello, Councilor President of the National Electoral Institute of Mexico


There is also variation among social media platforms in terms of how willing they are to engage and how many resources they have put behind working with local election authorities. Indonesian electoral authorities, for example, reported that Facebook and YouTube had local representatives who made working with them easier, but that Twitter lacked comparable capacity on the ground, making recurrent engagement more difficult. Based on conversations with more than two dozen EMBs globally, it appears that Facebook has invested more dedicated attention and resources than other platforms in establishing connections with election authorities in a wider range of countries.

The style and formality of agreements between tech companies and EMBs also varies from country to country. Ahead of 2018 Mexican elections, INE in many ways piloted what coordination with social media companies could look like, signing cooperative agreements with Facebook, Twitter and Google, for example.1 The Brazilian TSE, building on INE’s experience, signed formal agreements with WhatsApp, Facebook and Instagram, Twitter, Google and TikTok ahead of 2020 elections. Brazilian electoral authorities pushed to include more concrete measures and actions to be adopted by the social media platforms, getting commitments from the platforms to use their features and architecture to react to malicious and inauthentic behavior, as well as to promote the dissemination of official information. The majority of arrangements, however, are less formal, and social media companies seem less willing to sign formal MoUs in some countries and regions than in others. For smaller countries, engagement is even more likely to be ad hoc.

A few lessons learned that EMBs and social media company representatives have shared with regard to establishing productive relationships:

  • Both sides should establish clear communication channels and designated points of contact.
  • Companies should establish relationships early in the electoral cycle when election authorities have capacity to engage and there is sufficient time to build trust. 
  • EMBs should take ownership by having an idea of what they want from social media companies and how they want to collaborate.
  • EMBs should situate their coordination with social media companies within larger multi-stakeholder efforts as appropriate.  For example, if an EMB is working with both social media companies and international implementers to optimize their use of social media, ensuring that these efforts reinforce one another can increase their value and reduce duplicate efforts.
  • When desired, international implementers may facilitate or provide structure to the collaboration between an EMB and social media companies. In some instances, having a third party that understands how EMBs operate and what types of collaboration are more feasible for the social media company can increase the utility of these interactions and help EMBs feel confident that their interests are well represented.

Though there are similar services or types of coordination that social media companies provide across countries, the exact nature of coordination differs from country to country. As discussed, one fundamental distinction in EMB approaches to electoral disinformation is whether focus is on enhanced dissemination of credible information or on sanctions for problematic content. This distinction informs the types of collaboration that an EMB is likely to engage in with social media and technology companies, though many EMBs will coordinate in ways that fall under both categories. 

7.1 Work with EMBs to enhance dissemination of credible information

EMBs may partner with social media and technology companies on a range of initiatives that expand the reach of an EMB’s public messaging or connect voters with credible electoral information.

Platform-embedded Voter Information

A common offering from Google and Facebook is Election Day reminders that direct users to EMB websites for additional details about how to participate in elections. In an increasing number of countries, Facebook will include Election Day notifications at the top of users’ news feeds, which may include the ability for users to indicate to their friends that they have voted. In some countries, Google will alter the Google “doodle” (the changing image on the search engine’s homepage) to an election-themed image that links to country-specific voter information resources. In addition to Election Day notifications, Facebook and Google may also integrate notifications around voter registration deadlines, candidate information or details on how to vote. Google enabled an Informed Voting button one week before 2018 Mexican elections that redirected users to an INE microsite with information designed for first-time voters.2 While platforms may run these notifications independently, in some countries companies will engage the EMB to verify that the information being provided is correct. Both Google and Facebook have also debuted tools, such as a Google Maps integrated feature, to help voters find their polling locations; these either direct voters to EMB resources or rely on detailed data provided or verified by the EMB.

In addition to working with Facebook and Google, the TSE in Brazil pioneered a number of avenues for working with additional platforms. For example, the TSE partnered with WhatsApp to develop a chatbot that answered election-related questions asked by users and helped them identify whether information was accurate. The chatbot also provided information on candidates and on when and where to vote. More than 1.4 million WhatsApp users queried the chatbot during the election period, and 350,000 accounts exchanged 8 million messages with it on Election Day alone. For the 2020 Brazilian elections, Instagram created stickers to reinforce the importance of voting that automatically redirected users to the TSE official website. Twitter created a notification for users with a link to the TSE webpage and promoted the dissemination of official TSE content on the platform. TikTok launched a page to centralize reliable information about the election.

It is important for companies to work with EMBs to ensure that they are prepared for the extra traffic to their site that may result from these notifications. A Facebook notification that urged Indonesians to check their voter registration status resulted in so much traffic to the election authority’s website that it crashed. 

Civic Engagement and Voter Education Support

In some countries, the platforms will engage in more complex civic engagement efforts that aim to extend the reach of credible and informative content. In Mexico, technology companies partnered with INE to expand the reach of civic and electoral information. Facebook amplified INE’s call for citizens to choose the topic of the third presidential debate, and all three debates were streamed on the platform. INE also collaborated with Twitter using Periscope, Twitter's live video streaming application, to broadcast the three presidential debates, and encouraged national engagement around the debates with a series of customized hashtags. INE was also able to use a Tweet-to-Reply tool, which allowed users who retweeted INE messages on Election Day to opt-in to receive preliminary election results in real time.

Training on how electoral authorities can optimize their use of Facebook for voter education and voter information is another avenue for collaboration with EMBs. In Indonesia, Facebook provided these trainings to public relations departments in provincial and regional election offices. While Facebook provided guidance on topics such as how to make compelling videos, the value of identifying the right messenger for content, and other ways to use the platform effectively, the company made clear that it does not advise on what content should be shared, merely on how to share content effectively.

Depending on the mandate of the EMB and the specifics of collaborative agreements, social media and technology companies may also engage with election authorities to deploy news literacy ad campaigns or trainings for electoral stakeholders on understanding and detecting disinformation. Similar efforts might also be organized with other national stakeholders outside of the EMB; additional details about these types of interventions can be found in the guidebook section on Platform Responses.


Cyberhygiene and Information Integrity

One area where an EMB's cybersecurity and cyber hygiene practices have implications for information integrity is the protection of official EMB social media accounts and other online channels of communication. When EMB communication channels are hacked and used to disseminate false information, the impact is not only the immediate confusion that may result; such an incident can also undermine the EMB's ability to serve as a trusted communication channel in the future and erode faith in the credibility and professionalism of the EMB more broadly.

7.2 Work with EMBs to restrict or sanction problematic content

Social media companies also provide various avenues for election authorities to identify content that should be restricted or removed from social media platforms. 

Account Verification and Security 

An important, uncontroversial avenue of collaboration is providing election authorities with support for the expeditious removal of social media accounts that falsely claim to be or speak for the EMB. The existence of imitation accounts can be highly problematic, discrediting the electoral process and possibly sparking violence. For example, in the context of the highly contentious 2017 Kenyan elections, a fake Twitter account declared Uhuru Kenyatta president prior to the official release of presidential results, an incident that IFES field staff identified as a trigger for sporadic violence in opposition areas. Several fake accounts used the image of the Chairman of the Election Commission to announce incorrect electoral results or threaten violence against other members of the Election Commission.

EMB imitation accounts are common, and the identification and removal of these accounts is a service that major platforms are able to provide to EMBs of any size with relative ease, provided a trusted communication channel exists between the company and the EMB. A secretariat member of the EMB of Malawi reported that Facebook had been of assistance in taking down fake accounts ahead of elections. The Central Election Commission of Georgia has reported the same. In Georgia, several fake CEC Pages were discovered. Though the CEC judged that their impact was minimal, the Commission acted expeditiously to have the accounts taken down, both by contacting Facebook and by directly writing to the page administrators to desist, which was successful in several cases. The imitation pages had the potential to erode the credibility of the CEC, prompting decisive action.


“The fake CEC page discovered during the pre-election period, titled ‘Election Administration (CEC)’ using the same profile and background pictures, would give unserious answers to people asking relevant questions…Our reputation and credibility were at stake as [this] is the goal of the disinformation itself.” – Interlocutor at the CEC of Georgia

Facebook, and possibly other platforms, expresses an active desire to have all official EMB Pages “blue check” verified on the platform. At a gathering of EMB commissioners and staff in South Africa in early 2020, Facebook set up a booth that EMB representatives could visit throughout the conference to have their accounts verified, call attention to imitation accounts, and discuss other account security issues. Facebook reiterates basic account security protocols as part of account verification, including enabling two-factor authentication to make EMB social media accounts more secure.

Whitelisting to flag problematic content 

Social media companies might also provide EMBs with an accelerated channel for reporting content that violates platform community standards. The major U.S.-based platforms maintain provisions that prohibit content that constitutes election interference, voter suppression and hate speech. In some instances, establishing a reporting channel is done through a more formal process. In others, it can happen on a more ad hoc basis.

Indonesia’s reporting process with Facebook, for example, was a formal arrangement, discussed and designed to fit Bawaslu’s needs. Facebook trained EMB staff on the platform’s community standards and content review process and provided Bawaslu with a dedicated channel through which to report violations. Facebook and Bawaslu held a series of meetings to clarify Facebook’s content review policies in relation to local law and to establish a procedure for Bawaslu’s reporting during the electoral period. Under this process, Bawaslu classified the content it identified as problematic, noted which local law the content breached, and made the argument for why the content violated that law; this was then submitted to Facebook in an Excel spreadsheet on a weekly basis. The complaints referral and adjudication subcategory contains more details on this process. Although this formal process was carefully designed and adopted, a Bawaslu representative indicated that reporting to Facebook was not as expeditious as reporting to other platforms; their formal reporting process with YouTube, for example, resulted in quicker removal of violating content.

In India, the election commission convened social media platforms ahead of the election and, as part of the voluntary code of ethics, platforms “agreed to create a high-priority dedicated reporting mechanism for the ECI and appoint dedicated teams during the period of General Elections for taking expeditious action on any reported violations.” 

Channels for flagging content to the platforms often form on a more ad hoc basis, particularly in countries with smaller populations in which the platforms lack a physical presence. If relationships with the platforms do not already exist, it may be too late to establish a clear process by the time elections are called. Merely establishing contact may be insufficient to lay a foundation for a productive exchange of information that benefits EMBs. A representative from the EMB of Mauritius reported that Facebook had sent representatives to meet with them ahead of the 2019 elections and encouraged the EMB to report voter interference content for removal. However, when the EMB did identify content during the election that directed voters to the wrong polling locations and falsely alleged that ballots were being tampered with (clear violations of Facebook’s community standards related to voter interference), the EMB was unable to reach anyone at Facebook to remove the content.

For EMBs that have at present only ad hoc communication with the platforms, greater systemization of the process for elevating concerns to the platforms would be valuable. The platforms should ensure that they have sufficient staff redundancies and reporting channels that a response is not contingent on the one or two individuals who initially established contact with the EMB. 

Pre-certification of political advertisers

An unprecedented arrangement that the Election Commission of India made for the 2019 elections was to require political advertisements to be pre-certified by the Media Certification and Monitoring Committee before they ran on social media. Candidates provided the details of their social media accounts to the election commission as part of filing their nominations, and platforms were required to allow those accounts to run only advertisements that had been certified. In addition, certification was required for all election advertisements that featured the names of political parties or candidates for the 2019 general elections. The platforms were also obligated to remove political advertisements lacking certification upon notification by the ECI. It is hard to imagine the platforms complying with measures such as this in a country with a smaller market audience than India, or one in which the company was not physically present. This intervention is further discussed in the section on Legal and Regulatory Responses.

Enforcement of the Silence Period

The enforcement of a campaign silence or cooling period immediately prior to Election Day (as defined in local law) is another area where some EMBs have coordinated with social media platforms. Both Indonesia and India successfully gained compliance from social media companies that they would enforce the silence period. Other election authorities that express interest in having a similar arrangement have been less successful in gaining the platforms’ compliance. 

During the 48-hour silence period before Election Day, India’s Voluntary Code of Ethics compels platforms to remove objectionable content within three hours of it being reported to them by the Election Commission.

The ban in Indonesia applied only to paid advertising, not to posts disseminated organically. Indonesia adopted an assertive enforcement approach, issuing letters to each of the platforms outlining the provisions of the ban on campaign advertising during the blackout period. The letters indicated a willingness to use existing criminal provisions in the law to enforce platform compliance. Facebook initially argued that the boundary between regular advertising and political advertising would be hard to discern. Bawaslu responded that it was not their responsibility to resolve that tension and that it was incumbent on the platforms to ensure they were in compliance with the law. Bawaslu speculated that the force of this edict led the platforms to interpret political advertising conservatively, restricting a larger array of borderline advertising during the three-day silence period than they might otherwise have done. Based on reports received from the platforms, Bawaslu estimated that the ban led to the rejection of approximately 2,000 ads across all of the platforms during the three-day silence period.



Featured Intervention:

The MoU between Brazil's TSE and WhatsApp established a dedicated communication channel through which the TSE could directly report WhatsApp accounts suspected of bulk messaging. The TSE then provided citizens with an online form to report illegal bulk messaging; upon receiving those reports, WhatsApp would promptly launch an internal investigation to verify whether the reported accounts had violated WhatsApp terms and policies on bulk messaging and auto-messaging services. Accounts found to be engaging in prohibited behaviors were banned. During the 2020 electoral period, the TSE received 5,022 reports of illegal bulk messaging related to elections, which led to the banning of 1,042 accounts.

“Does a country have the boldness to threaten Facebook and YouTube to follow the guidelines? If they have that boldness, tech companies will consider the position.” – Commissioner Fritz Edward Siregar, The General Election Supervisory Agency of Indonesia (Bawaslu)

The enforcement of a silence period is not something that the platforms have acted upon without being compelled by local authorities, and smaller countries are unlikely to have the clout to demand compliance. Other dimensions of campaign silence periods are discussed in the legal and regulatory section of this guidebook.

Election Management Bodies can coordinate with civil society to enhance the reach of their messaging or extend their capacity to engage in time and labor-intensive activities such as fact-checking or social listening. The ability to forge these types of partnerships will vary significantly based on the credibility, independence, and capacity of both EMBs and CSOs in a given country. 

EMB-CSO collaboration can be formalized to varying degrees. For example, in advance of the 2019 Indonesian elections, Bawaslu signed a Memorandum of Action (MoA) with fact-checking CSO Mafindo and election oversight CSO Perludem, outlining the parameters of their planned coordination to counter disinformation and online incitement. In South Africa, the coordination between CSO Media Monitoring Africa and the IEC in the development of their disinformation complaints referral and adjudication process included a close working relationship but was not formalized. Though partnerships should be reviewed regularly to ensure they are still serving their intended goals, collaborative relationships can also be long-standing as opposed to being re-invented every electoral cycle; Perludem has had a cooperative agreement in place with the KPU since 2015 to aid with voter information efforts, among other things.

Collaboration between EMBs and CSOs requires a careful balancing act to maintain the credibility and perceived independence of both entities. For CSOs, a visible relationship with an EMB can legitimize and raise the profile of the work that they are doing, but it can also open them up to accusations of partiality or abdication of their role as watchdogs of government institutions. 

In the case of Media Monitoring Africa, which played a critical role in the development and delivery of the IEC’s disinformation complaints referral and adjudication process in South Africa, the involvement of the IEC in the effort gave the project visibility and credibility with donors and with the social media companies that were initially skeptical of the idea. This credibility in turn allowed MMA to raise sufficient funds to develop the project and provide their assistance to the IEC at no cost to the institution, which removed any financial relationship that could have called into question their impartiality. Perludem likewise has a policy of not receiving money from EMBs, and its executive director, who formerly worked for Bawaslu, is careful to ensure that communication between her office and the election authorities is conducted transparently through formal channels.

At the same time, a visible relationship with an EMB can call into question the impartiality of a CSO. For example, Mafindo’s fact-checking work includes addressing disinformation about Bawaslu and the KPU, which has opened them up to criticism for relying too heavily on official rebuttals from those institutions rather than independently verifying the claims being investigated. Perludem reports that the media will come to them for clarification on some election-related stories because they provide more expeditious responses than official sources, which has opened them up to accusations that they serve as a public relations department for the KPU.


EMB coordination with CSOs can simultaneously serve several goals, including building consensus about disinformation as a threat to elections, coordinating and amplifying rebuttals and counter-narratives, and promoting transparency and accountability.

As discussed in the section on Codes of Conduct and Codes of Ethics and the section on Disinformation Complaints Referral and Adjudication Processes, the act of consultation can create a foundation whereby an EMB begins to build a network of actors that can work together to combat electoral disinformation.


In 2019, Brazil's TSE launched its "Combatting Disinformation Program" focused on the November 2020 elections. The program brought together approximately 60 organizations, including fact-checking organizations, political parties, education and research institutions, and social media platforms.

The program organized efforts around six themes: TSE internal organization; training and capacity building; containment of disinformation; identification and fact-checking of disinformation; revision of the legal and regulatory framework; and improvement of technological resources.

Bawaslu’s engagement with CSOs, universities, religious organizations, and youth groups to establish their Declaration of Principles and consult on the definitions of prohibited content in electoral campaigns provided a foundation for Bawaslu’s multi-stakeholder intervention strategies.  The inclusion of religious leaders early, for example, meant a foundation for a relationship that could then help bolster the credibility of the EMB down the line, particularly in the context of the Hoax Crisis Centers. Building broad coalitions of this nature is also something that INE in Mexico did ahead of the 2018 elections, bringing together civil society representatives, media, academics, political leaders as well as social media company representatives for a conference to discuss countering the influence of disinformation. This initial conference was then followed by coordination meetings between civic tech groups, fact-checkers, and citizen election observer groups to collaborate on their efforts to combat disinformation in the elections. In August 2019, Brazil’s TSE launched its "Combatting Disinformation Program," which emphasized media literacy, after securing more than 40 institutional partners including media outlets, fact-checking agencies, and technology and social media company representatives.

The establishment of networks and coalitions can also help the EMB amplify voter information messages and messages countering misinformation or incitement. For example, part of the MoA outlining cooperation among Bawaslu, Mafindo, and Perludem included a joint information dissemination strategy to maximize each organization’s network for better outreach. In addition, Perludem undertook voter information efforts in cooperation with the KPU to promote understanding of each phase of the voting process and the role of the EMB – a proactive communication tactic that can make it more difficult for voters to be deceived by misinformation and disinformation about the electoral process. They also worked with both election management bodies to integrate website features that networked information among the EMBs, their own work, and the work of journalists. As part of this effort, they worked with the KPU to develop an API that they could use to directly pull official data from the KPU to populate the Perludem website. They also allowed disinformation reports from the public to be channeled to Bawaslu by integrating the Perludem website with CekFakta – a journalist fact-checking network.
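An integration of the kind Perludem built – pulling official data from an EMB endpoint to populate a partner website – can be sketched as below. The endpoint URL and JSON field names are hypothetical illustrations, not the actual KPU API; the point is that displaying data fetched directly from the authoritative source leaves no manual copying step where errors (or tampering accusations) could creep in.

```python
import json
from urllib.request import urlopen

# Hypothetical endpoint standing in for an official EMB data API.
KPU_API_URL = "https://api.kpu.example/v1/candidates"

def parse_candidates(raw_json):
    """Normalize official candidate records for display on a partner site."""
    records = json.loads(raw_json)
    return [
        {"name": r["name"], "party": r["party"], "district": r["district"]}
        for r in records
    ]

def fetch_candidates(url=KPU_API_URL):
    """Pull the latest official data directly from the EMB's API."""
    with urlopen(url) as resp:
        return parse_candidates(resp.read().decode("utf-8"))

# Offline demonstration with a sample payload (illustrative fields).
sample = '[{"name": "A. Example", "party": "Party X", "district": "Jakarta I"}]'
candidates = parse_candidates(sample)
```

In practice such a feed would be refreshed on a schedule, so the partner site always mirrors the EMB's current official record rather than a stale copy.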

Coordination with CSOs can also help promote the accountability of Election Management Bodies. For example, Perludem, in addition to providing a portal through which individuals could report disinformation complaints to Bawaslu, also monitored the progress of the reports submitted through their system, adding a level of transparency about how reports were being handled.


An EMB is unlikely to have the capacity or need to run its own fact-checking operation. However, having the EMB as an external contributor to a fact-checking operation can enhance the effectiveness of those efforts surrounding an election. 

Establishing communication links with the EMB can enable fact-checking organizations to receive quick clarification in an instance where the EMB can authoritatively weigh in on the accuracy of a piece of false or misleading information in circulation. 

INE had a role to play in the #Verificado2018 fact-checking effort in Mexico, which is discussed in greater detail in the chapter on civil society responses. The collaboration was particularly valuable on Election Day, as INE was able to quickly clarify several situations. For example, INE quickly filmed and shared a video explaining why special polling sites were running out of ballot papers in response to complaints coming from those polling sites. #Verificado2018 journalists also consulted INE to verify or rebut reports of election-related violence, with that information then widely disseminated via the media. INE’s agreement with the #Verificado2018 team of journalists was that election authorities would provide clarification on every issue brought to them as soon as possible and that the Verificado team of journalists, in turn, would consult INE before publishing allegations, in addition to seeking confirmation through independent sources.

The arrangement between MAFINDO and Indonesian election authorities – both Bawaslu and the KPU – was also designed to facilitate quick clarification in instances where electoral misinformation or disinformation was brought to them by the fact-checking network. In practice, it was difficult at times to get speedy clarifications, an issue that MAFINDO attributed to inefficiencies in the internal flow of information that could result in receiving conflicting information from different individuals inside the EMBs.

Fact-checking organizations in Brazil were also dissatisfied with the speed and comprehensiveness of responses to requests for clarification that they directed to the TSE during the 2018 elections. The TSE reported that the volume of requests for clarification they received exceeded expectations and surpassed the capacity of their staff to respond. 

From a programming support perspective, helping EMBs coordinate with external actors and clarify internal lines of communication as part of strategic and crisis communication planning would be a worthwhile investment. EMBs should be ready to field requests from fact-checking organizations, with a recognition that speed matters in responding. A communication protocol should clarify who receives, processes, and tracks the response to requests for information; who within the EMB has the authority to issue a clarification; and what the internal process for verifying the accuracy of information will be.
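The elements of such a protocol can be sketched as a minimal request-tracking structure. The fields, statuses, and role names below are illustrative assumptions, not a prescribed format; the point is that every request has a recorded owner, an authorizing official, and a visible status.

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class ClarificationRequest:
    """One incoming request for clarification, with its processing trail."""
    requester: str            # e.g. a fact-checking organization
    question: str
    received_at: datetime
    assigned_to: str = ""     # staff member who processes the request
    approved_by: str = ""     # official with authority to issue the clarification
    status: str = "received"  # illustrative lifecycle: received -> verifying -> answered

log: list = []  # shared log so progress is visible across the team

def receive(requester, question):
    """Register a new request so it cannot be lost between desks."""
    req = ClarificationRequest(requester, question, datetime.now())
    log.append(req)
    return req

# Example lifecycle (hypothetical roles):
req = receive("Fact-check network", "Were ballots moved from warehouse X?")
req.assigned_to = "press office duty officer"
req.status = "verifying"
```

Even a spreadsheet implementing this same structure would serve; what matters is that the protocol answers, in advance, who owns each request and who may speak for the institution.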

8.3 Outsourcing Social Listening

Like fact-checking, social listening to inform rapid incident response is another labor-intensive endeavor that EMBs may lack the capacity to conduct on their own. Civil society may be able to fill this gap through partnerships with EMBs.

In 2012 and 2016, independent media organization Penplusbytes established Social Media Tracking Centers (SMTC) to monitor social media during Ghanaian elections. The SMTCs used open-source software to surface trends in voting logistics, violence, political parties, and other topics, which Penplusbytes staff and university students monitored continuously over a 72-hour period. The process included a tracking team that monitored the social media environment and passed suspect content on to a verification team, which checked the accuracy of the content forwarded to it. Problematic content was then sent to an escalation team that passed information on to the National Elections Security Task Force. Members of the SMTC were also embedded within the National Election Commission.

If setting up a dedicated effort like the SMTC is not feasible, an EMB may be able to achieve many of the objectives of social listening for incident response through existing partnerships. Exchanging intelligence on trending narratives related to the election with fact-checking networks can be a means for EMBs to achieve the goals of social listening without the investment to build internal capacity for this work. Similarly, EMB-established portals that allow the public to report problematic content for review, such as the Real 411 initiative in South Africa, can provide a crowdsourced approach to gaining insight into problematic narratives circulating on social media.


A detailed case study of the State Electoral office of Estonia's establishment of an ad hoc interagency task force for countering disinformation in elections demonstrates the ways in which an election management body with limited staff and a restricted mandate can mount a comprehensive counter-disinformation response.

EMB Coordination with Other State Entities

Elections are a flashpoint for misinformation and disinformation, but they are certainly not the only target of disinformation campaigns launched against democratic actors. Ensuring that state entities beyond the EMB have an active interest in monitoring, deterring, and sanctioning disinformation is crucial. Coordinating with these other state entities during electoral periods can enhance an EMB’s ability to preserve electoral integrity in the face of misinformation and disinformation; it can align efforts and messaging, improving efficiency and preventing the confusion of uncoordinated approaches; and it can be a valuable strategy for EMBs with limited resources to dedicate to counter-disinformation efforts.

“The election management body in a small-scale system cannot rely on its own capability and has to gather other specialist institutions. This does not mean that the different nodes of expertise should act on their own but, rather, through the election management body as the main focal point.” – Dr. Priit Vinkel, head of Estonia’s State Electoral Office 

9.1 Establishing areas of responsibility and lines of authority

An EMB’s counter-disinformation mandate must be considered in conjunction with the efforts of other state entities to promote information integrity. Ministries of Information, Digital Ministries, and Foreign Ministries, for example, might all have counter-disinformation mandates. State intelligence agencies, the police, courts, media and communication oversight bodies, anti-corruption bodies, human rights commissions, parliamentary oversight commissions, and others may also have a role to play. 

Given how many entities may be involved, it is valuable in the electoral context to understand what different state entities are doing and what effective collaboration might look like. In some cases, the EMB may want to step into an authoritative role during the electoral period. This happened in Bawaslu’s case: in advance of elections, it became clear that no institution in Indonesia had the authority to supervise hate speech and disinformation on social media during the electoral period.


“We asked ourselves a question – are we as Bawaslu brave enough to jump in, to supervise everything? …We put ourselves in the hot seat.” – Commissioner Fritz Edward Siregar, The General Election Supervisory Agency of Indonesia (Bawaslu)

Clarifying lines of authority can ensure that there is an authoritative voice in dealing with non-state entities, such as social media companies or political parties. Social media companies in particular are more likely to engage if the expectations and guidance they receive from state entities are aligned.

Coordination might take the form of a task force, a formal cooperative agreement, or a more ad hoc and flexible arrangement. The role of the EMB may be different depending on whether that arrangement is a standing body that takes special actions during elections, or whether it is a group convened specifically for the purpose of countering disinformation during elections. In the case of the former, an EMB may be seen as more of a resource partner to an existing body. In the case of the latter, the EMB may be leading the response. 

In Denmark, efforts to organize a coordinated government response to online misinformation and disinformation included the establishment of an inter-ministerial task force, which had a special but not exclusive focus on elections. In Indonesia, Bawaslu, the KPU, and the Ministry of Communications and Information Technology signed a Memorandum of Action before the 2018 elections and continued their cooperation during the 2019 elections. The agreement focused on coordinating efforts to supervise and manage internet content, coordinating information exchange among institutions, organizing educational campaigns, and promoting voter participation. 

9.2 Facilitating communication 

Once institutions agree on a working arrangement, they must take steps to operationalize it. Focused discussions that delineate responsibilities and procedures for coordination can lay the groundwork for flexible and responsive communications, enabling rapid alignment and action when needed.

In Indonesia, Bawaslu held a series of face-to-face meetings not only with the Ministry of Communications and Information Technology but also with the intelligence community, the army, and the police to discuss guidelines, procedures, and the relationships among their institutions. Part of those discussions included identifying which entity, and which individuals within those entities, had the authority to issue clarifications on which issues. After the formal relationship was established, the agencies communicated via a WhatsApp group that enabled quick responses and minimized formality that could hamper effective coordination.

To illustrate how communication worked, Bawaslu shared an example in which they encountered a social media post alleging that official army vehicles were being used as part of campaign activities. The Ministry of Communications and Information Technology had the tools to find the content and bring it to the group’s attention but lacked the authority to take action. The social media platforms’ ability to moderate was limited, as they could not determine whether the claim was true. The army had the information to prove the claim false but no authority to flag the content for removal. By coordinating through their established WhatsApp group, all of the relevant parties were able to expeditiously identify and act on the issue – a feat that Bawaslu indicates would have taken more than a day had communication been routed through formal channels.

The existing plan also enables institutions to speak with a joint voice in the case of serious allegations that might impact electoral integrity. In the case of the highly-publicized “seven containers hoax” which alleged that cargo ships full of pre-voted ballots had been sent to Jakarta, Bawaslu, the KPU, and the police held a joint press conference to clarify the situation. The process projected a united front in the effort to counter disinformation in the election. 

9.3 Maintaining Independence

In coordinating with other state entities, maintaining the independence of the EMB will be of paramount importance.  In countries where government ministries, intelligence agencies or other potential collaborators are aligned with a governing party or political faction, the EMB must make a judgement call on whether and how to collaborate with these institutions. 

This decision may be made on a case-by-case basis. Though strong coordination existed among a number of entities in Indonesia, Bawaslu deliberately chose not to make use of the government’s LAPOR! system, a platform that facilitates communication between the public and the government, including functionality for receiving reports and complaints from the public that could have been adapted for Bawaslu’s disinformation complaints referral process. Though the platform was judged to be a technologically sophisticated tool that would have been of great use, after multiple discussions, Bawaslu ultimately decided not to use the channel given that using a tool associated with the ruling party might jeopardize their perceived independence.


“One of our considerations when we work with others is our impartiality” – Commissioner Fritz Edward Siregar, The General Election Supervisory Agency of Indonesia (Bawaslu)

9.4 Integrate into Proactive and Reactive Programming Approaches

Coordination with state agencies can be naturally integrated into the proactive and reactive counter-disinformation strategies explored in other subcategories of this chapter.

Proactive Strategies

Proactive Communication and Voter Education Strategies to Mitigate Disinformation Threats – coordination with other state agencies can be a useful way to amplify messages to larger audiences. For example, in instances where countries have credible public health agencies, partnering to communicate messages about how voting processes are changing as a result of COVID-19 can mitigate the risk that changes to election procedures could be subjects of disinformation. 

Crisis Communication Planning for Disinformation Threats – Including other state agencies in crisis communication planning can build trust and working relationships that enable EMBs to get clarification and align messaging with other state entities in a crisis scenario.  

EMB Codes of Conduct or Declarations of Principle for the Electoral Period – If codes of conduct are consultatively developed, the involvement of other state agencies may be beneficial to include from the outset. If codes of conduct are binding and enforceable, coordination as described under Disinformation Complaints Referral and Adjudication may be necessary.

Reactive Strategies

Social Media Monitoring for Legal and Regulatory Compliance – EMBs may or may not have authority to monitor social media for compliance or to enforce violations. In instances where the EMB shares this mandate with other institutions, clarifying the comparative mandates of each body and establishing how those entities will work together is essential.  

Social Listening to Understand Disinformation Threats – A minority of EMBs will be positioned to establish their own social listening and incident response system. Ministries of Information, intelligence agencies, or campaign oversight bodies may, however, already have the capacity to conduct social listening. An EMB may not be able to coordinate with these entities while preserving its independence, but where it is possible, the EMB should consider establishing a channel through which information can be effectively relayed, or having staff from another state agency embed with the EMB during sensitive electoral periods.

Disinformation Complaints Referral and Adjudication Process – For enforcement, an EMB will need to coordinate with relevant entities that may have jurisdiction over different complaints. This may include media oversight or regulatory agencies, human rights commissions, law enforcement, or the courts.

With few precedents to emulate, dialogue and exchange among EMBs that are developing counter-disinformation approaches are particularly important. Exchange enables election authorities to learn from peers making similarly difficult decisions and adjustments. 

IFES’ Regional Europe program has established a working group for EMBs dedicated to tackling the challenges presented by social media and disinformation in elections. The virtual launch of the working group in May 2020 gathered nearly 50 election officials from 13 countries in the Eastern Partnership and Western Balkans and provided a forum to discuss the challenge of electoral misinformation and disinformation during the COVID-19 pandemic. The working group provides EMBs with a platform for continued peer learning, skill-building, and developing best practices. The effort complements the launch of a global working group that brings together election authorities and social media companies planned by the Design 4 Democracy Coalition.

EMBs that have been leaders in developing counter-disinformation strategies are also passing on lessons to peer institutions in other countries. INE has held exchanges with electoral authorities from Tunisia and Guatemala seeking to learn from Mexico’s counter-disinformation approach during elections. The Election Commission of South Africa hosted global experts and EMB representatives from across Africa in March 2020 to share experiences mitigating the impact of social media on electoral integrity.

“We perceive the danger of disinformation, but a lack of information leaves us feeling like we don’t have sufficient information, and the result is fear…. We need more information about the problem and to map credible sources of resources so that we don’t have fear to use those resources.” – Southern African EMB Representative