6. Enforcement

Thoughtful regulation means little if it is not accompanied by meaningful consideration of how that regulation will be enforced. A lack of realism about enforcement threatens to undercut the authority of the regulatory bodies creating provisions and to set unrealistic expectations for what can be achieved through regulation alone.

The levers of enforcement will change depending on whether provisions are aimed at domestic actors or platforms. In the case of the former, governments and political actors in office are increasingly complicit in, or directly responsible for, the very behaviors that the regulatory actions outlined in this document seek to curb. In these instances, the ability to meaningfully enforce provisions will rely on the independence of enforcement bodies from the executive.

The ability of an individual country to enforce provisions directed at foreign actors is very limited, which is one of the reasons why legal and regulatory approaches directed at foreign actors are not included in this section of the guidebook.

Provisions directed at platforms will vary significantly in how enforceable they may be. Provisions that require alterations to a platform’s engineering or global business practices are highly unlikely to come from national-level laws passed in anything other than the largest-market countries in the world. However, many major social media platforms have thus far been ahead of lawmakers in instituting new provisions and policies to define and restrict problematic content and behaviors or to promote transparency, equity, and/or democratic information. These provisions have not been rolled out equally, though, and where national-level legislation might have an impact is in pushing companies to extend their existing transparency tools to the country in question. Platforms will undoubtedly balance their business interests and the difficulty of implementing a measure against the cost of non-compliance with legal provisions in countries where they operate but do not have a legal presence. Recognizing that many countries in the world have limited ability to enforce legal obligations placed on the platforms, legal and regulatory provisions might instead serve to make a country a higher priority for companies as they globalize their ad transparency policies or promote voter information via their products.

6.1 Establishing which state entities have an enforcement mandate

Different institutions may have the right of oversight and enforcement over laws governing the intersection of social media and campaigning, and – given that provisions pertinent to this discussion might be scattered across a legal framework in several different laws – oversight may sit with multiple bodies or institutions. A few common types of enforcement bodies are noted below.

In many countries, responsibility for oversight and enforcement may sit with an independent oversight body or bodies. This might be an anti-corruption agency, a political finance oversight body, or a media oversight body, for example. As Germany expands its legal and regulatory framework around social media and elections, implementation and enforcement fall to an independent, non-governmental state media authority. This effort expands the mandate of the body, which has pre-existing expertise in media law, including advertising standards, media pluralism, and accessibility. Analysts of this move to expand German media authorities’ scope of work contend that “it is crucial to carefully consider what, if any, provisions could or should be translated to another European context… While Germany’s media regulators enjoy a high level of independence, the same cannot be said of other member states,” citing research that finds more than “half of EU member states lack safeguards for political independence in appointment procedures.”57

Responsibility for oversight will often be spread across multiple independent bodies or agencies, necessitating coordination and the development of joint approaches. In the United Kingdom, for example, the Digital Regulation Cooperation Forum promotes the development of coordinated regulatory efforts in the digital landscape among the UK Information Commissioner’s Office, the Competition and Markets Authority, and the Office of Communications.

Other countries vest election authorities or election oversight bodies with implementation and enforcement capacity of some kind. For election authorities that have political finance, campaign finance, or media oversight mandates, the responsibility to oversee provisions related to social media in elections might, in some instances, be a natural addition to these existing capacities. Election authorities may have a legal mandate to monitor for violations, or they may have adopted this responsibility independently while lacking the authority to enforce. In the latter case, legal and regulatory frameworks will need to include appropriate referral mechanisms to ensure detected violations can be shared with the appropriate body for further action.

In other instances, enforcement sits more directly with the judicial system. In France, judges play a direct role in determining what content constitutes information manipulation. In addition to ordering the removal of manifestly false, widely disseminated, and damaging content, judges may also order “any proportional and necessary measure” to stop the “deliberate, artificial or automatic and massive” dissemination of misleading information online. In Argentina, the electoral court is responsible for sanctioning violations resulting from advertising that takes place outside of the designated campaign period.58 Any model that relies on the judiciary to determine what constitutes a violation requires a fully independent judiciary with the capacity to understand the nuances of information manipulation and to review and respond to cases quickly.59

6.2 Building capacity to monitor for violations

Without establishing a capacity to monitor, audit, or otherwise provide effective oversight, laws and regulations governing the use of social media during elections are unenforceable. The subsection on Social Media Monitoring for Legal and Regulatory Compliance in the guidebook section on Election Management Body Approaches to Countering Disinformation outlines key questions and challenges in defining a monitoring approach. These include:

  • Does the body in question have a legal right to monitor social media?
  • What is the goal of the monitoring effort?
  • What is the time period for social media monitoring?
  • Will the monitoring be an internal operation or conducted in partnership with another entity?
  • Does the body in question have sufficient human and financial resources to carry out the desired monitoring effort?
  • What social media advertising transparency tools are available in the country?

6.3 Considerations for evidence and discovery

The nature of social media and digital content raises new questions for the consideration of evidence and the discovery process. For example, when platforms notify national authorities or make public announcements that they have detected malicious activity on their platforms, the announcement is often accompanied by action to remove the accounts and content in question. Once this material is removed from the platform, it is no longer available to authorities that might, now or in the future, seek to capture the content as evidence of violations of national law.

Highlight

In instances where a case is being brought against an actor for illegal conduct on social media, a legal request to preserve posts and data may be a step that authorities or plaintiffs need to consider. Dominion Voting Systems, for example, has pursued this action in a series of defamation cases against media outlets and others for falsely claiming that the company's voting machines were used to rig the 2020 U.S. elections. Dominion sent letters to Facebook, YouTube, Parler, and Twitter requesting that the companies preserve posts relevant to its ongoing legal action.

At present, there does not appear to be a comprehensive obligation on major platforms to preserve and provide information or evidence in the case of an investigation into the origins or financing of content and actions that may violate local laws. While major U.S.-based platforms have a fairly consistent record of complying with legal requests by governments for data pertinent to violent crimes, human trafficking, and other criminal acts, the same does not seem to be true in the case of political finance or campaign violations. Establishing a means and precedent for making legally binding requests for user data from the platforms when a candidate or party is under credible suspicion of violating the law is an essential route to explore for enforcement.

Granted, the platforms also play a critical role in ensuring user data gathered on their platforms is not handed over to government actors for illegitimate purposes. Determining what does and does not constitute a legitimate purpose necessitates careful deliberation and the establishment of sound principles. There is also likely to be frequent conflict between what platforms deem to be requests for data with the potential for abuse and what the national authorities requesting that data might think. Particularly in countries that have leaned heavily on their criminal codes to sanction problematic speech, the platforms may have legitimate grounds for resisting requests for user data that carry a high potential for abuse.

6.4 Available sanctions and remedies

Countries have used a variety of sanctions and remedies to enforce their legal and regulatory mandates. Most of these sanctions have precedent in existing law as it pertains to analogous offline violations. 

The issuing of fines for political finance or campaign violations has well-established precedent. In the context of violations of digital campaigning rules, fines are also a common sanction. Argentinian law, for example, stipulates that fines will be issued to natural persons or legal entities that do not comply with content and publication limits on advertisements, including those transmitted via the internet. Argentina’s law assesses the fine in relation to the cost of advertising time, space, or internet bandwidth at the time of the violation.60

Fines can also be directed at social media companies or digital service providers that do not meet their obligations. Paraguay, for example, holds social media companies vicariously liable and subject to fines for breach of campaign silence, illicit publication of opinion polls, or biased pricing.61 It is unclear whether Paraguay has successfully levied these fines against any social media company.

Some legal and regulatory frameworks carry the threat of revoking public funding as a means of enforcement. In contrast to the penalty of a fine for individuals in breach of the law, the Argentinian Electoral Code stipulates that political parties that do not comply with limitations placed on political advertising will lose the right to receive contributions, subsidies, and public financing for a period of one to four years.62 The effectiveness of this sanction is heavily dependent on the extent to which parties rely on public funding for their income.

Provisions might seek to remedy harm by requiring entities found to be in violation of the law to issue corrections. As referenced in the section on promoting democratic information, South African regulation stipulates that the election commission can compel parties and candidates to correct electoral disinformation shared by parties, candidates, or their members and supporters. However, mandates to issue corrections can be manipulated to serve partisan interests. Singapore’s 2019 Protection from Online Falsehoods and Manipulation Act, which has been heavily criticized for its use to silence opposition voices, requires internet service providers, social media platforms, search engines, and video-sharing services like YouTube to issue corrections or remove content if the government deems the content false and judges that a correction or removal is in the public interest. The law specifies that a person who has communicated a false statement of fact may be required to correct or remove it even if the person has no reason to believe the statement is false.63 Individuals who do not comply are subject to fines of up to $20,000, imprisonment, or both.64

Another sanction is the banning of a political party or candidate from competing in an election. The Central Election Commission of Bosnia and Herzegovina fined a party and banned it from participating in the 2020 elections for sharing a video that violated a provision against provoking or inciting violence or hatred,65 though this decision was overturned by the courts on appeal. This sanction is at high risk of political manipulation and, if considered, must be accompanied by sufficient due process and a right of appeal.

In some instances, enforcement has resulted in the annulment of election results. The Constitutional Court of Moldova annulled a mayoral election in the city of Chisinau because both candidates had campaigned on social media during the campaign silence period. In the aftermath of this decision, which was viewed by many as disproportionate to the offense, Moldovan regulators introduced a new provision allowing campaign materials placed on the internet before Election Day to remain visible. Election annulment is an extreme remedy that is highly vulnerable to political manipulation and should be considered in the context of international best practice on validating or annulling an election.

Countries have banned or threatened to ban access to social media platforms within their jurisdictions as a means to compel compliance or force concessions from global companies. The Government of India, for example, threatened to ban WhatsApp in 2018 following a string of lynchings that resulted from viral rumors spread via the messaging application. WhatsApp refused to accede to the government’s demands on key privacy provisions but did, in response to government concerns, make alterations to the ways in which messages were labeled and forwarded within the app. India also banned TikTok, WeChat, and a range of other Chinese apps in 2020. In 2018, the Indonesian government banned TikTok for several days on the basis that it was being used to share inappropriate and blasphemous content; in response, TikTok quickly acceded to the government’s demands and began censoring such content. The Trump administration threatened to ban TikTok in the United States over data privacy concerns unless the Chinese-owned company sold its U.S. operations. In 2017, Ukrainian President Petro Poroshenko signed a decree blocking access to a number of Russian social media platforms on national security grounds.

Banning access to entire platforms as a means to force concessions from companies is a blunt-force approach that is only likely to yield results for countries with massive markets of users. Far more frequently, bans on social media platforms have been used as a tool by authoritarian leaders to restrict access to information among their populations. 

The regulation of social media in campaigning, particularly regulation intended to deter or mitigate the impact of disinformation, has yet to coalesce around established and universally accepted good practices. As countries take legal and regulatory steps to address disinformation in the name of protecting democracy, the uncertainty and definitional vagueness of key concepts in this space have the potential to produce downstream implications for political and civil rights. Concerns about free speech, for example, are elevated when content is removed without any judicial review or appeals process. Critics point to the dangers of allowing unaccountable private social media companies and digital platforms to decide what content does or does not comply with the law. Severe sanctions, for example, might incentivize companies to overcorrect by removing permissible content and legitimate speech. The existence of robust appeals mechanisms is essential for preserving rights.

Footnotes

57. Mackenzie Nelson and Julian Jaursch, “Germany’s new media treaty demands that platforms explain algorithms and stop discriminating. Can it deliver?,” AlgorithmWatch, July 27, 2020.

58. National Electoral Code, art. 64. 

59. The 2018 French law provides a 48-hour period for a judge to make her ruling, a time frame that is both ambitious from an administrative perspective and very slow in the lifecycle of online content.

60. National Electoral Code, art. 128 (c) 1-3.

61. Compendium of Election Norms, arts. 329 and 337.

62. National Electoral Code, art. 128 (a).

63. Protection from Online Falsehoods and Manipulation Act, No. 26 (2019), § 11(4).

64. Ibid., § 15(1).

65. Election Law of Bosnia and Herzegovina, No. 23/01 (amended 2016), art. 19.9(j).