By Riley Galligher and Samantha Tanner

This is part 6 of our series “A Guided Tour of Disinformation Policy”. To read the first part, click here.

Crossover

The structure of our guided tour has, until this point, centered on our three categories of disinformation policy. While we designed these categories to make this policy area accessible, many disinformation policies “cross over,” drawing on concepts from two or more categories to accomplish their objectives. In this section, we examine what we can learn from examples of such policy crossover.

The most common form of crossover is criminalizing information identified as false, imposing sanctions, fines, or charges on individuals or platforms, while also ordering the removal of that information. This approach combines legal action with content moderation. Most policies that criminalize the spread of disinformation through legal action also require its removal, crossing over into content moderation. However, not all content moderation policies criminalize disinformation, and that is where we differentiate the two approaches.

We see many examples of crossover in the co-regulation policy approaches discussed in the content moderation section. Germany illustrates a progression from self-regulation to co-regulation to legal action. At the EU level, the Code of Practice on Disinformation exemplified the self-regulatory approach, but the “inadequacy of voluntary measures taken by social media platforms” bred discontent in some member states.[1] Germany’s 2017 Network Enforcement Act imposes rules on large social media networks regarding the removal of illegal content, along with steep fines for noncompliance.[2] This progression, from EU efforts to inspire self-regulation to the German government’s co-regulation with social media platforms, enforced through fines and other penalties, demonstrates the crossover between policy approaches used to counter disinformation. This line of anti-disinformation efforts suggests that the “quasi-inevitable inconsistency” between the EU’s soft measures and “hard, sanction-driven measures” can be overcome by a co-regulatory framework at a pan-European level.[3]

French Law no. 2018-1202 presents another prime example of crossover between our three categories. The law establishes a “fast-track civil procedure” to curb foreign state disinformation and prevent the online transmission of false information during elections and referenda.[4] It also imposes transparency requirements and content-regulation cooperation on online platform providers, and provides for educational initiatives to promote information literacy.[5] All three categories are represented within the elements of this law: legal action to establish a civil procedure for punishing disinformation actors, content moderation to remove information deemed false and harmful to democracy, and media literacy to enable citizens to identify disinformation. The law takes a distinctive “legal action” approach by using civil rather than criminal courts to create a disincentive to disinformation. A complainant must meet the burden of proof for the “falsity and potential democratic impact of the challenged statement.”[6] “Only the clearest cases of disinformation, contained by free speech guarantees, can trigger controls over further dissemination” or moderation of the content.[7] If there is any doubt, publication will not be restrained, thus preventing the power of the state from creeping toward unlawful censorship. With transparency from online platforms, civil procedure can provide a speedy mechanism to target and remove information that meets the highest standard of falsity and adverse effect on voters and democracy.
Finally, the law charges the Center for Media and Information Literacy, a public operator acting for the French Ministry of National Education, with creating broad public education initiatives to inform the public about disinformation and to support the public’s involvement in civil procedure against disinformation.[8] The law thus joins efforts from across our categories into a holistic disinformation policy and suggests the potential for successful cross-category disinformation policy in the future.

Peter Pomerantsev’s A Cycle of Censorship, a critique of the UK’s Online Harms White Paper, demonstrates that policy responses blending our categories can characterize a country’s broader strategic vision for combating disinformation.[9] It argues that legal action by the government should be directed toward ensuring the transparency and accountability of social media platforms’ terms of service and community standards. Content moderation, in turn, should be directed toward structural changes to the algorithms that currently boost the dissemination of disinformation, rather than toward removing content deemed false and thereby harming freedom of speech, an argument similar to that of Regulating Disinformation Beyond Content.[10] [11] This model, enforcing transparency through legal action and backing content moderation with structural changes, would also teach citizens how their information environment is shaped by algorithms and threatened by disinformation.

These ideas fuse much of our discussion throughout this guided tour: they build a policy approach that combines the strengths of each of our categories and applies them in nontraditional ways, avoiding some of the undesirable effects of disinformation policy. We hope that our guided tour raises the question of whether policies that employ a single approach are effective, and highlights the possibility of a comprehensive disinformation policy that combines the efforts of legal action, content moderation, and media literacy.

  1. Roudik, Peter, et al. Initiatives to Counter Fake News in Selected Countries. Washington, D.C.: The Law Library of Congress, Global Legal Research Directorate, 2019. PDF. Retrieved from the Library of Congress, www.loc.gov/item/2019668145/.

  2. Ibid.

  3. Ibid.

  4. Craufurd-Smith, R. “Fake News, French Law and Democratic Legitimacy: Lessons for the United Kingdom?” Journal of Media Law 11, no. 1 (2019): 52–81. https://doi.org/10.1080/17577632.2019.1679424.

  5. Ibid.

  6. Ibid.

  7. Ibid.

  8. Ibid.

  9. Pomerantsev, Peter. A Cycle of Censorship: The UK White Paper on Online Harms and the Dangers of Regulating Disinformation. London School of Economics, October 1, 2019. https://www.ivir.nl/publicaties/download/Cycle_Censorship_Pomerantsev_Oct_2019.pdf.

  10. Ibid.

  11. Iglesias Keller, C. “Don’t Shoot the Message: Regulating Disinformation Beyond Content.” In The Rule of Law in Cyberspace, edited by C. Blanco de Morais, G. Ferreira Mendes, and T. Vesting. Law, Governance and Technology Series, vol. 49. Cham: Springer, 2022. https://doi.org/10.1007/978-3-031-07377-9_16.