3. Measures to promote transparency during campaigning and elections

Updated on Apr 21, 2021

Measures that promote transparency can include obligations for domestic actors to disclose the designated political activities they engage in on social media, as well as obligations for digital platforms to disclose information on the designated political activities that take place on their platforms or to label certain types of content that may otherwise be misleading. These measures are part of the regulatory pushback against disinformation: they allow insight into potentially problematic practices used by domestic political or foreign actors and build public understanding of the origins of the content people consume. Transparency creates the opportunity for the public to make better-informed decisions about their political information.

i. Promote Transparency: Measures directed at domestic actors
a. Require the declaration of social media advertising as a campaign expenditure

One of the most common approaches to promoting increased transparency by domestic actors is to expand the definition of “media” or “advertising” that is subject to existing disclosure requirements to include online and social media advertising. Expansions of this nature should take into account the definitional considerations at the beginning of this section of the guidebook. Detailed provisions may be needed to delineate which types of expenditures constitute social media advertisements, including, for example, payments to third parties to post supportive content or attack opponents. While this approach extends established principles of transparency to new media, crafting meaningful disclosure requirements necessitates careful consideration of the ways in which social media and online advertising differ from non-digital forms of political advertising.

To offer illustrative examples, section 349 of Canada’s Elections Act extensively regulates third-party expenditure and the use of foreign funding, capturing paid online advertising. In Colombia, a draft resolution has been put forth with the aim of categorizing paid advertising on social media as a campaign expenditure subject to spending limits. The resolution would empower Colombian electoral authorities to investigate these expenditures, given that they are often incurred by third parties rather than by the campaign itself. It would establish a register of online media platforms that sell political advertising space and subject political advertising on social media to the same framework as political campaigning in public spaces.

b. Require registration of party and candidate social media accounts 

While monitoring the official accounts of parties and candidates provides only a narrow glimpse into political advertising and political messages circulating on social media, having a record of official social media accounts is a first step toward transparency. This could be achieved by requiring candidates and parties to declare the accounts that are administered by or financially linked to their campaigns. This approach can provide a starting point for oversight bodies to monitor compliance with local laws and regulations governing campaigning. Such a requirement could be paired with a regulation that stipulates that candidates and campaigns may only engage in certain campaign activities through registered social media accounts, such as paying to promote political content or issue ads. This combination of measures can create an avenue for enforcement in instances where parties or candidates are found to be using social media accounts in prohibited ways or concealing financial relationships with nominally independent accounts. Enforcement would necessitate monitoring for compliance, which is discussed in the Enforcement subsection at the end of this topical section of the guidebook.

This approach has been taken in Tunisia, where a directive issued by the country’s election commission requires candidates and parties to register their official social media accounts with the commission.38 Mongolia’s draft election laws would also require candidate, party, and coalition websites and social media accounts to be registered with the Communications Regulatory Commission (for parliamentary and presidential elections) and with the respective election commission (for local elections).39 The Mongolian law in its entirety should not, however, be taken as a model: it raises freedom of expression concerns, and its definitional vagueness limits enforceability.

c. Require disclosure and labeling of bots or automated accounts

“Bots” or “Social Bots,” which can perform automated actions online that mimic human behaviors, have been used as a part of disinformation campaigns in the past, though the degree to which they have impacted electoral outcomes is disputed.40 When deployed by malign actors in the information space, these lines of code can, for example, power artificial social media personas, generate and amplify social media content in large quantities, and be mobilized to harass legitimate social media users. 

As public awareness of this tactic has grown, lawmakers have attempted to legislate in this area to mitigate the problem. Legislative approaches that seek to ban the use of bots have largely failed to gain traction. A measure to criminalize bots or software used for online manipulation was proposed in South Korea, for example, but ultimately was not enacted. A proposed bill in Ireland to criminalize the use of a bot to post political content through multiple fake accounts also failed to become law. 

Opinion is divided on the efficacy and freedom of expression implications of such measures. Detractors suggest that such legislation can inhibit political speech and that overly broad measures can undermine legitimate political uses for bots, such as promoting a voter registration drive or enabling an electoral authority to use a chatbot to respond to common voter questions. Detractors also suggest that legislating against specific disinformation tactics is a losing battle given how quickly tactics evolve. Moreover, because removing networks of automated bots aligns with social media platforms’ reputational self-interest, legislation against such operations may not be necessary.

Efforts to add transparency and disclosure to the use of bots may be a less controversial approach than criminalizing their use. California passed a law in 2019 making it illegal to “use a bot to communicate or interact with another person in California online with the intent to mislead the other person about its artificial identity.” Germany’s Interstate Media Treaty (Medienstaatsvertrag – “MStV”) also includes provisions that promote transparency around bots by obligating platforms to identify and label content that is disseminated by bots. Measures that criminalize or require disclosure of the use of bots do, however, present enforcement challenges given the difficulty of reliably identifying bots.
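
That difficulty is worth underscoring: bot detection typically rests on behavioral heuristics rather than definitive signals. The sketch below is a purely illustrative, naive scoring heuristic, not any regulator’s or platform’s actual method; the account fields it assumes (created_at, post_count, followers) are hypothetical, and real systems draw on far richer behavioral and network signals.

```python
from datetime import datetime, timezone

def naive_bot_score(account):
    """Toy heuristic: combine posting rate, account age, and audience size
    into a rough bot-likeness score. Purely illustrative; all thresholds
    and field names are invented for this sketch."""
    age_days = max((datetime.now(timezone.utc) - account["created_at"]).days, 1)
    posts_per_day = account["post_count"] / age_days

    score = 0.0
    if posts_per_day > 50:         # sustained high-volume posting
        score += 0.5
    if age_days < 30:              # very new account
        score += 0.3
    if account["followers"] < 10:  # little organic audience
        score += 0.2
    return score  # closer to 1.0 = more bot-like, but easily evaded

# Example: a recently created account that has already posted 500 times
suspect = {
    "created_at": datetime(2021, 4, 14, tzinfo=timezone.utc),
    "post_count": 500,
    "followers": 3,
}
print(naive_bot_score(suspect))
```

A determined operator can trivially stay under any fixed thresholds like these, which is precisely why reliable identification, and therefore enforcement, remains hard.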


“By the time lawmakers get around to passing legislation to neutralize a harmful feature, adversaries will have left it behind.” — Renee DiResta, Research Director at the Stanford Internet Observatory

d. Require disclosure of the use of political funds abroad

Facing tightening regulations in their home countries, political actors might also seek to place political advertisements on social media by coordinating with actors located outside of the country. Foreign funding might also be used to place advertisements that target diaspora communities eligible for out-of-country voting. While platforms with political ad disclosure and identification requirements will in some cases prohibit the purchase of political advertisements in foreign currencies or by accounts operated from another country, these efforts are not yet sufficient to catch all political or issue advertisements placed extraterritorially. 

Drafters of disclosure requirements that address foreign funding may wish to consider the ways in which foreign expenditures on social media advertising differ from those on traditional media. New Zealand, for example, requires full disclosure of any advertising purchased by entities outside of the country, and non-compliance constitutes a campaign finance violation.41 It could, however, be difficult to prove that the beneficiary political party or candidate was aware of campaign funds being spent on its behalf extraterritorially, which could render enforcement futile.

ii. Promote Transparency: Measures directed at platforms
a. Require platforms to maintain ad transparency repositories

Some countries have imposed legal obligations on larger online platforms to maintain repositories of the political advertisements purchased on their platforms. France and Canada, for instance, require large online platforms to maintain a political ad library. India’s Code of Ethics, signed by social media companies operating in the country ahead of the 2019 elections, committed signatories to “facilitating transparency in paid political advertisements, including utilizing their pre-existing labels/disclosure technology for such advertisements.” This measure may have been decisive in compelling these companies to expand coverage of their ad transparency features to India.

Facebook voluntarily introduced a publicly accessible Ad Library in a very limited number of countries in 2018 and, as of early 2021, has expanded coverage to 95 countries and territories. Google maintains political ad transparency disclosures for Australia, the EU and UK, India, Israel, New Zealand, Taiwan, and the United States but has been slower to expand these tools to additional markets. As platforms contemplate where to next expand their advertising transparency tools, it is conceivable that updating national law to require platforms to maintain ad repositories could influence how companies prioritize countries for expansion. Details on the functionality of advertising transparency tools can be found in the guidebook section covering platform responses to disinformation.
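
These repositories are typically accessible programmatically as well as through a web interface, which matters for oversight bodies and researchers who need to monitor at scale. As a minimal sketch, assuming a valid access token and the parameter and field names documented for Facebook’s Ad Library API (the ads_archive endpoint) around 2021, a query for political ads delivered in a given country might look like the following; exact names and versions may have changed since:

```python
import requests

# Minimal sketch of querying Facebook's Ad Library API (ads_archive endpoint).
# Parameter and field names reflect the public documentation circa 2021 and
# may change; a real access token with Ad Library access is required.
ACCESS_TOKEN = "YOUR_ACCESS_TOKEN"  # placeholder

response = requests.get(
    "https://graph.facebook.com/v10.0/ads_archive",
    params={
        "access_token": ACCESS_TOKEN,
        "ad_type": "POLITICAL_AND_ISSUE_ADS",
        "ad_reached_countries": "['CA']",  # ads delivered in Canada
        "search_terms": "election",
        "fields": "page_name,ad_creative_body,ad_delivery_start_time,spend",
        "limit": 25,
    },
)
response.raise_for_status()

for ad in response.json().get("data", []):
    print(ad.get("page_name"), "-", ad.get("ad_delivery_start_time"))
```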

Legal mandates, however, might disadvantage smaller online platforms, since the cost of setting up and maintaining advertising repositories falls disproportionately on smaller platforms. The legal requirement might thereby inadvertently stifle platform plurality and diversity. This side effect can be remedied by creating a user threshold for the obligation. For example, Canada’s ad transparency requirements apply only to platforms with more than three million regular users in Canada,42 though even this threshold might be too low to avoid becoming a barrier to competition. National regulators might also consider a standard whereby a platform is required to provide ad transparency tools if a certain percentage of the country’s population uses its services.

Some countries where the platforms do not maintain ad repositories have experimented with their own. Ahead of the 2019 elections, South Africa tested a new political ad repository, built in partnership with election authorities and maintained by civil society. Compliance was not obligatory and was accordingly minimal among political parties, but the effort showed sufficient promise that the implementers of the ad repository are considering making compliance legally mandatory for future elections.43 

Legal measures that compel, or attempt to compel, platforms to maintain ad repositories might also incorporate provisions requiring the clear labeling of advertisers to distinguish between paid and organic content, as well as labels that distinguish among advertising, editorial, and news content. Requirements to label content originating from state-linked media sources might also be outlined. Measures might also include identity verification requirements for actors or organizations that run political and issue advertisements. However, these provisions would likely require alterations to the functionality of platforms’ ad transparency tools, a change that is more likely with joint pressure from multiple countries.
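
Taken together, such provisions effectively specify the fields each repository record would need to carry. The schema below is a hypothetical sketch of that record, assembled from the elements discussed above rather than drawn from any existing law or platform:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class AdRepositoryEntry:
    """Hypothetical repository record combining the disclosure elements
    discussed above; field names are invented for illustration."""
    advertiser_name: str       # verified identity of the purchasing actor
    advertiser_verified: bool  # whether identity verification was completed
    is_paid: bool              # distinguishes paid placements from organic content
    content_category: str      # e.g. "advertisement", "editorial", "news"
    state_linked_media: bool   # flags content from state-linked media sources
    spend_currency: str        # currency of purchase (relevant to foreign-funding rules)
    spend_amount: float
    targeting_summary: Optional[str] = None  # description of audience targeting
```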

b. Require platforms to provide algorithmic transparency 

Additional measures being explored in France, Germany, and elsewhere focus on compelling platforms to provide greater insight into the algorithms that influence how content, organic and paid, is surfaced to individual users; put another way, they would give users transparency into how their data is used to inform the ads and content that they see.

Germany’s MStV law, for example, introduces new definitions and rules intended to promote transparency across a comprehensive array of online portals and platforms. “Under the transparency provisions, intermediaries will be required to provide information about how their algorithms operate, including: [1] The criteria that determine how content is accessed and found. [2] The central criteria that determine how content is aggregated, selected, presented and weighed.”44 EU law on comparable topics has in the past drawn on German law, suggesting that the MStV may influence EU-level conversations on platform transparency and, in time, extend to the global operations of digital media providers and intermediaries.
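
To make concrete what an intermediary might have to disclose, the toy ranking function below illustrates the kind of “central criteria” at issue. It is entirely hypothetical, with invented weights and signals; it does not represent any platform’s actual algorithm:

```python
def rank_score(item, user):
    """Hypothetical feed-ranking score. The weights and signals below are
    the sort of 'central criteria' a transparency provision would require
    an intermediary to disclose and explain; all values are invented."""
    W_RECENCY, W_ENGAGEMENT, W_AFFINITY = 0.2, 0.5, 0.3  # illustrative weights

    recency = 1.0 / (1.0 + item["age_hours"])             # newer content ranks higher
    engagement = item["interactions"] / max(item["views"], 1)
    affinity = user["topic_affinity"].get(item["topic"], 0.0)

    return W_RECENCY * recency + W_ENGAGEMENT * engagement + W_AFFINITY * affinity

# Example: a five-hour-old political post shown to a politics-interested user
item = {"age_hours": 5, "interactions": 120, "views": 2000, "topic": "politics"}
user = {"topic_affinity": {"politics": 0.8}}
print(round(rank_score(item, user), 3))
```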

The Forum on Information and Democracy’s November 2020 Policy Framework provides a detailed discussion on how algorithmic transparency might be regulated by state actors.45

Footnotes

38. Instance Supérieure Indépendante pour les Élections (ISIE) Decision 22-2019.

39. “ODIHR Opinion on Draft Laws of Mongolia on Presidential, Parliamentary and Local Elections” OSCE Office for Democratic Institutions and Human Rights, November 25, 2019: 10-11. 

40. See for example Bots and Automation over Twitter during the U.S. Election and The Army that Never Existed: The Failure of Social Bots Research.

41. New Zealand Electoral Act, § 3F (1). 

42. Canada Elections Act, § 325.1 (1). The three-million-user threshold applies to English-language platforms; lower thresholds apply to platforms operating in languages other than English.

43. Stakeholder interview, 2019.

44. Mackenzie Nelson and Julian Jaursch, “Germany’s new media treaty demands that platforms explain algorithms and stop discriminating. Can it deliver?,” AlgorithmWatch, July 27, 2020.  

45. See Chapter 1: Transparency of Platforms, § 1.4 Algorithms & Content Moderation, Ranking, Targeting.