| Region | Background |
| --- | --- |
| European Union | The European Union developed a Code of Practice on Disinformation based on the findings of its High-Level Expert Group on the issue. This included recommendations for companies operating in the EU, suggestions for developing media literacy programs for member states responding to the issue, and commitments to develop technology supporting the code. The code rests on five central pillars: transparency of online news; media and information literacy; tools empowering users and journalists; diversity and sustainability of the European news media ecosystem; and continued research on the impact of disinformation in Europe. |
The European Union's Code of Practice on Disinformation is one of the most multinational and well-resourced initiatives currently in practice, as it has the support of the entire bloc and of its member governments behind its framework. The Code was developed by a European Commission-mandated working group on disinformation and contains recommendations for companies and other organizations that want to operate in the European Union. In addition to the Code, the EU provides member governments and countries that want to trade and work with the bloc with guidelines on how to organize their companies online and how to plan responses to disinformation through digital literacy, fact-checking, and support for the media and civil society, among other interventions.
The Code was formulated and informed chiefly by the report of the European High-Level Expert Group on Fake News and Online Disinformation, published in March 2018. The group, composed of representatives from academia, civil society, media, and the technology sector, produced a report with five central recommendations that later became the five pillars under which the Code is organized. They are:
- enhance the transparency of online news, involving an adequate and privacy-compliant sharing of data about the systems that enable their circulation online;
- promote media and information literacy to counter disinformation and help users navigate the digital media environment;
- develop tools for empowering users and journalists to tackle disinformation and foster a positive engagement with fast-evolving information technologies;
- safeguard the diversity and sustainability of the European news media ecosystem; and
- promote continued research on the impact of disinformation in Europe to evaluate the measures taken by different actors and constantly adjust the necessary responses.
These principles were integrated into the Code, published in October 2018, roughly six months after the publication of the expert group's report. The European Union invited technology companies to sign on to the Code, and many engaged, alongside other civil society stakeholders and EU institutions that worked to implement elements of these principles. Signatories included Facebook, Google, Microsoft, Mozilla, and Twitter, as well as the European Association of Communication Agencies and a range of communications and advertising agencies. These groups committed not only to the principles but also to a series of annual reports on their progress in applying them, whether as communications professionals, advertising companies, or technology companies.
As participants in the initiative, the companies agree to a set of voluntary standards aimed at combating the spread of damaging fakes and falsehoods online and submit annual reports on their policies, products, and other initiatives undertaken to conform with its guidelines. The initiative has been a modest success in engaging platforms in dialogue with the EU around these issues and in addressing them with member governments, other private sector actors, and citizens.
The annual reports of these companies and the overall assessment of the implementation of the Code of Practice on Disinformation review the progress that the Code made in its first year of existence, from October 2018 to October 2019. The reports find that while the Code has generally made progress in embedding certain aspects of its five central principles among the private sector signatories, it has been limited by its "self-regulatory nature, the lack of uniformity of implementation and the lack of clarity around its scope and some of the key concepts."
An assessment from September 2020 found that the Code had made modest progress but had fallen short in several ways, and it provided recommendations for improvement. It notes that "[t]he information and findings set out in this assessment will support the Commission’s reflections on pertinent policy initiatives, including the European Democracy Action Plan, as well as the Digital Services Act, which will aim to fix overarching rules applicable to all information society services." This shows how the Code on Disinformation fits within a larger program of European initiatives, linking with similar codes on hate speech moderation; related efforts to ensure user privacy, copyright protection, and cybersecurity; and broader efforts to promote democratic principles in the online space.
Other organizations have made independent assessments that offer their own perspectives on the European Commission's project. The Commission also engaged a consulting firm, Valdani, Vicari, and Associates (VVA), to review the Code, and it found that:
- "The Code of Practice should not be abandoned. It has established a common framework to tackle disinformation, its aims and activities are highly relevant and it has produced positive results. It constitutes a first and crucial step in the fight against disinformation and shows European leadership on an issue that is international in nature.
- Some drawbacks related to its self-regulatory nature, the lack of uniformity of implementation and the lack of clarity around its scope and some of the key concepts.
- The implementation of the Code should continue and its effectiveness could be strengthened by agreeing on terminology and definitions."
The Carnegie Endowment for International Peace published its own assessment in March 2020, covering a similar period at the end of the Code's first year of implementation. The author found that the EU had indeed made progress in areas such as media and information literacy, where several technology signatories, including Facebook, Google, and Twitter, have created programs for users on these concepts.
The EU Code of Practice on Disinformation’s normative framework follows similar, related examples that describe and develop a component of the European Union's position, notably the 2016 EU Code of Conduct on Countering Illegal Hate Speech. That 2016 Code of Conduct builds on the earlier "Framework Decision 2008/913/JHA of 28 November 2008 combating certain forms and expressions of racism and xenophobia by means of criminal law" and national laws transposing it, which define illegal hate speech as "all conduct publicly inciting to violence or hatred directed against a group of persons or a member of such a group defined by reference to race, color, religion, descent or national or ethnic origin." At the same time, organizations such as the Center for Democracy and Technology have criticized the EU's approach, citing its potential for misuse and abuse, particularly with regard to the code on hate speech.
Overall, both the European Commission and Carnegie reports found that there is still much to be done and that the Code on Disinformation would benefit from better shared terminology and structure. To that end, the EU recently adopted its European Democracy Action Plan. Countering disinformation is one of its core pillars, with efforts to improve the EU’s existing tools and impose costs on perpetrators, especially for election interference; to move from the Code of Practice to a co-regulatory framework of obligations and accountability for online platforms, consistent with the Digital Services Act; and to set up a framework for monitoring the Code's implementation.
As can be seen, while companies have signed on to the EU Codes on Disinformation and Hate Speech, and member governments have pledged to follow their principles, oversight and enforcement are separate, more difficult mechanisms to apply. Nonetheless, with the backing of other countries and regions, these codes or similar agreements could provide a framework for collaboration around issues related to disinformation, hate speech, online violent extremism, and a host of other harmful forms of content.