7. EMB Coordination with Technology and Social Media Companies

EMBs coordinate with technology and social media companies to enhance the dissemination of credible information or to restrict the spread of problematic content during electoral periods.

Technology and social media companies – including but not limited to Facebook, Google, Twitter, TikTok, and their subsidiaries, such as Instagram, WhatsApp, and YouTube – have a role to play in ensuring that elections take place in a credible information environment and that their platforms are not used to undermine the integrity of elections. While companies must be held to account for harms that may stem from their platforms and services, progress toward alleviating these harms can be enhanced through direct engagement with these companies.

“Of course technology companies have more tools for how to regulate what happens on their platforms. There are things governments can’t do, but technology companies can – we need their help, but you need boldness from government to say so.” – Commissioner Fritz Edward Siregar, General Election Supervisory Agency of Indonesia (Bawaslu)

The market size of a country matters when it comes to how many resources social media companies are willing to invest and how available they are to assist and coordinate with EMBs. Before the 2019 elections, the Election Commission of India was able to convene representatives from top social media companies for a two-day brainstorming session on approaches to problematic social media content in elections, gaining a commitment from those companies to abide by a code of ethics. Conversely, EMBs of smaller countries have reported difficulty getting company representatives to respond to their messages, even after establishing a point of contact within the company. There is significant variation in EMBs’ experiences working with social media and technology companies, as platforms may dedicate variable levels of support to specific countries based on factors including market size, geopolitical significance, potential for electoral violence, or international visibility.

“We aren’t naïve – these are profit-driven companies.” – Dr. Lorenzo Córdova Vianello, Councilor President of the National Electoral Institute of Mexico

There is also variation among social media platforms in how willing they are to engage and how many resources they put behind working with local election authorities. Indonesian electoral authorities, for example, reported that Facebook and YouTube had local representatives who made working with them easier, but that Twitter lacked capacity on the ground, making recurrent engagement more difficult. Based on conversations with more than two dozen EMBs globally, it appears that Facebook has invested more dedicated attention and resources than other platforms in establishing connections with election authorities in a wider range of countries.

The style and formality of agreements between tech companies and EMBs also varies from country to country. Ahead of the 2018 Mexican elections, INE in many ways piloted what coordination with social media companies could look like, signing cooperative agreements with Facebook, Twitter, and Google, for example.1 The Brazilian TSE, building on INE’s experience, signed formal agreements with WhatsApp, Facebook, Instagram, Twitter, Google, and TikTok ahead of the 2020 elections. Brazilian electoral authorities pushed to include more concrete measures and actions to be adopted by the social media platforms, securing commitments from the platforms to use their features and architecture to react to malicious and inauthentic behavior, as well as to promote the dissemination of official information. The majority of arrangements, however, are less formal, and social media companies seem less willing to sign formal MoUs in some countries and regions than in others. For smaller countries, engagement is even more likely to be ad hoc.

EMBs and social media company representatives have shared a few lessons learned with regard to establishing productive relationships:

  • Both sides should establish clear communication channels and designated points of contact.
  • Companies should establish relationships early in the electoral cycle when election authorities have capacity to engage and there is sufficient time to build trust. 
  • EMBs should take ownership by having an idea of what they want from social media companies and how they want to collaborate.
  • EMBs should situate their coordination with social media companies within larger multi-stakeholder efforts as appropriate.  For example, if an EMB is working with both social media companies and international implementers to optimize their use of social media, ensuring that these efforts reinforce one another can increase their value and reduce duplicate efforts.
  • When desired, international implementers may facilitate or provide structure to the collaboration between an EMB and social media companies. In some instances, having a third party that understands how EMBs operate and what types of collaboration are more feasible for the social media company can increase the utility of these interactions and help EMBs feel confident that their interests are well represented.

Though there are similar services or types of coordination that social media companies provide across countries, the exact nature of coordination differs from country to country. As discussed, one fundamental distinction in EMB approaches to electoral disinformation is whether focus is on enhanced dissemination of credible information or on sanctions for problematic content. This distinction informs the types of collaboration that an EMB is likely to engage in with social media and technology companies, though many EMBs will coordinate in ways that fall under both categories. 

7.1 Work to help EMBs enhance the dissemination of credible information

EMBs may partner with social media and technology companies on a range of initiatives that expand the reach of EMBs’ public messaging or connect voters with credible electoral information.

Platform-embedded Voter Information

A common offering from Google and Facebook is Election Day reminders that direct users to EMB websites for additional details about how to participate in elections. In an increasing number of countries, Facebook will include Election Day notifications at the top of users’ news feeds, which may allow users to indicate that they have voted in a way that is visible to their friends. In some countries, Google will alter the Google “doodle” (the changing image on the search engine’s homepage) to an election-themed image that links to country-specific voter information resources. In addition to Election Day notifications, Facebook and Google may also integrate notifications around voter registration deadlines, candidate information, or details on how to vote. Google enabled an Informed Voting button one week before the 2018 Mexican elections that redirected users to an INE microsite with information designed for first-time voters.2 While platforms may run these notifications independently, in some countries companies will engage the EMB to verify that the information being provided is correct. Both Google and Facebook have also debuted tools to help voters find their polling locations – such as a Google Maps integrated feature – which either direct voters to EMB resources or rely on detailed data provided or verified by the EMB.

In addition to working with Facebook and Google, the TSE in Brazil pioneered a number of avenues for working with additional platforms. For example, the TSE partnered with WhatsApp to develop a chatbot that answered election-related questions asked by users and helped them identify whether information was accurate. The chatbot also provided information on candidates and on when and where to vote. More than 1.4 million WhatsApp users queried the chatbot during the election period, and 350,000 accounts exchanged 8 million messages with it on Election Day alone. For the 2020 Brazilian elections, Instagram created stickers to reinforce the importance of voting that automatically redirected users to the TSE official website. Twitter created a notification for users with a link to the TSE webpage and promoted the dissemination of official TSE content on the platform. TikTok launched a page to centralize reliable information about the election.

It is important for companies to work with EMBs to ensure that they are prepared for the extra traffic to their site that may result from these notifications. A Facebook notification that urged Indonesians to check their voter registration status resulted in so much traffic to the election authority’s website that it crashed. 

Civic Engagement and Voter Education Support

In some countries, the platforms will engage in more complex civic engagement efforts that aim to extend the reach of credible and informative content. In Mexico, technology companies partnered with INE to expand the reach of civic and electoral information. Facebook amplified INE’s call for citizens to choose the topic of the third presidential debate, and all three debates were streamed on the platform. INE also collaborated with Twitter using Periscope, Twitter's live video streaming application, to broadcast the three presidential debates, and encouraged national engagement around the debates with a series of customized hashtags. INE was also able to use a Tweet-to-Reply tool, which allowed users who retweeted INE messages on Election Day to opt-in to receive preliminary election results in real time.

Training electoral authorities to optimize their use of Facebook for voter education and voter information is another avenue for collaboration with EMBs. In Indonesia, Facebook provided these trainings to public relations departments in provincial and regional election offices. While Facebook provided guidance on topics such as how to make compelling videos, the value of identifying the right messenger for content, and other ways to use the platform for their goals, the company made clear that it does not provide guidance on what content should be shared, merely on how to share content effectively.


Depending on the mandate of the EMB and the specifics of collaborative agreements, social media and technology companies may also engage with election authorities to deploy news literacy ad campaigns or trainings for electoral stakeholders on understanding and detecting disinformation. Similar efforts might also be organized with other national stakeholders outside of the EMB; additional details about these types of interventions can be found in the guidebook section on Platform Responses.


Cyberhygiene and Information Integrity

One area where EMBs’ cybersecurity and cyber hygiene practices have implications for information integrity is the protection of official EMB social media accounts and other online channels of communication. When EMB communication channels are hacked and then used to disseminate false information, the impact is not only the immediate confusion this might cause; such incidents can also undermine the EMB’s ability to serve as a trusted communication channel in the future and erode faith in the credibility and professionalism of the EMB more broadly.

7.2 Work with EMBs to restrict or sanction problematic content

Social media companies also provide various avenues for election authorities to identify content that should be restricted or removed from social media platforms. 

Account Verification and Security 

An important, uncontroversial avenue of collaboration is providing election authorities with support for the expeditious removal of social media accounts that falsely claim to be or speak for the EMB. The existence of imitation accounts can be highly problematic, discrediting the electoral process and possibly sparking violence. For example, in the context of the highly contentious 2017 Kenyan elections, a fake Twitter account declared Uhuru Kenyatta president prior to the official release of presidential results, an incident that IFES field staff identified as a trigger for sporadic violence in opposition areas. Several fake accounts used the image of the Chairman of the Election Commission to announce incorrect electoral results or threaten violence against other members of the Election Commission.

EMB imitation accounts are common, and the identification and removal of these accounts is a service that major platforms are able to provide to EMBs of any size with relative ease, provided a trusted communication channel exists between the company and the EMB. A secretariat member of the EMB of Malawi reported that Facebook had been of assistance in taking down fake accounts ahead of elections. The Central Election Commission of Georgia has reported the same. In Georgia, several fake CEC Pages were discovered. Though the CEC judged that their impact was minimal, the Commission acted expeditiously to have the accounts taken down, both by contacting Facebook and by directly writing to the page administrators to desist, which was successful in several cases. The imitation pages had the potential to erode the credibility of the CEC, prompting decisive action.

 

“The fake CEC page discovered during the pre-election period, titled ‘Election Administration (CEC)’ using the same profile and background pictures, would give unserious answers to people asking relevant questions…Our reputation and credibility were at stake as [this] is the goal of the disinformation itself.” – Interlocutor at the CEC of Georgia

Facebook, and possibly other platforms, expresses an active desire to have all EMB official Pages “blue check” verified on the platform. At a gathering of EMB commissioners and staff in South Africa in early 2020, Facebook set up a booth that EMB representatives could visit throughout the conference to have their accounts verified, call attention to imitation accounts, and discuss other account security issues. Facebook reiterates basic account security protocols as part of account verification, including enabling two-factor authentication to make EMB social media accounts more secure.

Whitelisting to flag problematic content 

Social media companies might also provide EMBs with an accelerated channel for reporting content that violates platform community standards. The major U.S.-based platforms maintain provisions that prohibit content that constitutes election interference, voter suppression and hate speech. In some instances, establishing a reporting channel is done through a more formal process. In others, it can happen on a more ad hoc basis.

Indonesia’s reporting process with Facebook, for example, was a formal arrangement, with a reporting process discussed and designed to fit Bawaslu’s needs. Facebook trained EMB staff on the platform’s community standards and content review process and provided Bawaslu with a dedicated channel through which they could report violations. Facebook and Bawaslu held a series of meetings to clarify Facebook’s content review policies in relation to local law and to establish a procedure for Bawaslu’s reporting process during the electoral period. Under this process, Bawaslu classified the content it identified as problematic, noted which local law the content breached, and explained why the content was in violation of that law; these reports were then submitted to Facebook as an Excel spreadsheet on a weekly basis. The complaints referral and adjudication subcategory contains more details on this process. Although this formal process was carefully designed and adopted, a Bawaslu representative indicated that their reporting process with Facebook was not as expeditious as with other platforms; their formal reporting process with YouTube, for example, resulted in quicker removal of violating content.

In India, the election commission convened social media platforms ahead of the election and, as part of the voluntary code of ethics, platforms “agreed to create a high-priority dedicated reporting mechanism for the ECI and appoint dedicated teams during the period of General Elections for taking expeditious action on any reported violations.” 

Channels for flagging content to the platforms often form on a more ad hoc basis, particularly in countries with smaller populations in which the platforms lack a physical presence. If pre-existing relationships with the platforms do not exist, it may be too late to establish a clear process by the time elections are called. Merely establishing contact may be insufficient to lay a foundation for a productive exchange of information that benefits EMBs. A representative from the EMB of Mauritius reported that Facebook had sent representatives to meet with them ahead of 2019 elections and encouraged the EMB to report voter interference content for removal. However, when the EMB did identify content during the election that directed voters to the wrong polling locations and falsely alleged that ballots were being tampered with (clear violations of Facebook’s community standards related to voter interference), the EMB was unable to reach anyone at Facebook to remove the content. 

For EMBs that at present have only ad hoc communication with the platforms, greater systemization of the process for elevating concerns to the platforms would be valuable. The platforms should ensure that they have sufficient staff redundancies and reporting channels so that a response is not contingent on the one or two individuals who initially established contact with the EMB.

Pre-certification of political advertisers

An unprecedented arrangement that the Election Commission of India made for the 2019 elections was to require political advertisements to be pre-certified by the Media Certification and Monitoring Committee before they ran on social media. Candidates provided the details of their social media accounts to the election commission as part of the process of filing their nominations, and platforms were required to allow those accounts to run only advertisements that had been certified. In addition, certification was required for all election advertisements that featured the names of political parties or candidates for the 2019 general elections. The platforms were also obligated to remove political advertisements that lacked certification upon notification by the ECI. It is hard to imagine the platforms complying with measures such as this in a country with a smaller market than India, or one in which the company was not physically present. This intervention is further discussed in the section on Legal and Regulatory Responses.

Enforcement of the Silence Period

The enforcement of a campaign silence or cooling period immediately prior to Election Day (as defined in local law) is another area where some EMBs have coordinated with social media platforms. Both Indonesia and India successfully gained compliance from social media companies that they would enforce the silence period. Other election authorities that express interest in having a similar arrangement have been less successful in gaining the platforms’ compliance. 

During the 48-hour silence period before Election Day, India’s Voluntary Code of Ethics compels platforms to remove objectionable content within three hours of it being reported to them by the Election Commission.

The ban in Indonesia applied only to paid advertising, not to posts that were organically disseminated. Indonesia adopted an assertive enforcement approach to the silence period by issuing letters to each of the platforms outlining the provisions of the ban on campaign advertising during the blackout period. The letters indicated a willingness to use existing criminal provisions in the law to enforce platform compliance. Facebook initially argued that the boundary between regular advertising and political advertising would be hard to discern. Bawaslu responded that it was not their responsibility to resolve that tension and that it was incumbent on the platforms to ensure that they were in compliance with the law. Bawaslu speculated that the force of this edict led the platforms to a conservative interpretation of what constituted political advertising, leading them to restrict a larger array of borderline advertising during the three-day silence period than they might otherwise have done. Bawaslu estimated, based on reports it received from the platforms, that the ban led to the rejection of approximately 2,000 ads across all of the platforms during the three-day silence period.

 

Featured Intervention:

The MoU between Brazil's TSE and WhatsApp established a dedicated communication channel to allow the TSE to directly report WhatsApp accounts suspected of bulk messaging. The TSE then provided citizens with an online form to report illegal bulk messaging, and upon receiving those reports, WhatsApp would promptly launch an internal investigation to verify whether the reported accounts had violated WhatsApp terms and policies on bulk messaging and auto-messaging services; accounts found to be engaging in prohibited behaviors would be banned. During the 2020 electoral period, the TSE received 5,022 reports of illegal bulk messaging related to elections, which led to the banning of 1,042 accounts.

“Does a country have the boldness to threaten Facebook and YouTube to follow the guidelines? If they have that boldness, tech companies will consider the position.” – Commissioner Fritz Edward Siregar, The General Election Supervisory Agency of Indonesia (Bawaslu)

The enforcement of a silence period is not something that the platforms have acted upon without being compelled by local authorities, and smaller countries are unlikely to have the clout to demand compliance. Other dimensions of campaign silence periods are discussed in the legal and regulatory section of this guidebook.

Footnotes

1. “New challenges for democracy: Elections in times of disinformation,” Instituto Nacional Electoral (2019): 7.

2. “New challenges for democracy: Elections in times of disinformation,” Instituto Nacional Electoral (2019): 9.