Political figures, numerous organizations, and legal scholars criticized the law, arguing that it endangered freedom of expression, particularly because content removal decisions could be made by a private operator without the intervention of a judicial judge, even though the judiciary is constitutionally the guarantor of individual liberties (Article 66 of the Constitution).
The bill was adopted by the National Assembly on 13 May 2020. The Constitutional Council, seized by opposition senators, ruled that the text was largely unconstitutional, particularly because it disproportionately infringed upon freedom of expression. On 24 June, President Emmanuel Macron promulgated the law, stripped of its provisions deemed unconstitutional.
Drafting of the Law
A Text Inspired by a German Law
Deputy Laetitia Avia took the German law Netzwerkdurchsetzungsgesetz, known as "NetzDG", adopted on 1 September 2017, as a starting point for her work, while claiming to propose a different system.[1] The German law required social networks to remove manifestly hateful content within 24 hours of notification.[2] If the illegal nature was not clear, sites had one week to respond.[2] Failure to meet these deadlines exposed violators to a fine of up to 50 million euros.[2]
France defended similar positions internationally, notably at the G7 Summit in Biarritz in the summer of 2019, where it proposed a charter on online moderation.[3]
Mission Against Online Hate
In March 2018, during the dinner of the Representative Council of French Jewish Institutions (CRIF), President Emmanuel Macron announced that he would entrust a mission to combat hate, racism, and antisemitism more effectively on the internet[4] to Franco-Algerian writer Karim Amellal, Gil Taïeb, vice-president of the CRIF, and Laetitia Avia, deputy of Paris (LREM). They submitted their report[5] to Prime Minister Édouard Philippe on 20 September 2018, containing twenty operational proposals to curb online hate and further regulate platforms in this area.[6]
Among the main measures included in the report were the establishment of a 24-hour deadline to censor hateful content, the implementation of a uniform reporting method for hateful content on the largest platforms, enhanced transparency obligations, better support for victims, a mechanism for measuring hate speech, intensified prevention and awareness campaigns targeting young people, a procedure allowing the blocking of clearly hateful sites, and a dialogue body involving all stakeholders.
In February 2019, Emmanuel Macron announced that the report and proposals co-written by Karim Amellal, Laetitia Avia, and Gil Taïeb would lead to a law to combat online hate.[7]
Concept of Hate in Law
Hate itself does not constitute an offense: it has no definition or existence in French positive law,[8] except when it serves as a motive for committing crimes or offenses. Thus, legal scholars debate the necessity of a new criminal offense, since case law already punishes hateful speech under Article 32 of the law of 29 July 1881 on freedom of the press.[9] In an opinion dated 10 July 2015, the National Consultative Commission on Human Rights stated that "the existing incriminations [...] are sufficient".[10]
However, the hateful content targeted by the bill could be defined, and the Pleven Law (1972) punished not hate itself but incitement to racial hatred. It reiterated the terms of the decree-law of Justice Minister Paul Marchandeau of 21 April 1939, under which the Public Prosecutor's Office was to prosecute, of its own accord (without a complaint), "the defamation or insult directed at a group of people belonging, by their origin, to a specific race or religion, [when it] was intended to incite hatred between citizens or inhabitants". The first condition therefore required that the statements be defamatory or insulting towards a group of people.
Bill Proposal
The bill, supported by the LICRA[11] and other organizations like Respect Zone,[12] SOS Homophobie,[11] and SOS Racisme,[13] was submitted on 20 March 2019 to the National Assembly by Deputy Laetitia Avia. Several dozen people were heard by reporters Laetitia Avia and Fabienne Colboc, including twelve associations,[14] nine independent administrative authorities and public bodies,[15] and twenty-two digital players,[16] along with specialized lawyers and magistrates.[17] These hearings were supplemented by a public consultation organized from 18 April to 12 May, which received one thousand four hundred and sixteen responses.[18][19]
The Council of State, consulted for an opinion, issued a series of recommendations and criticisms. It notably recommended extending the law to search engines and broadening the scope of targeted content.[20] The initial proposal was therefore heavily revised in the Laws Committee of the National Assembly to comply with the Council of State's opinion[21] and thus respect European law and the French Constitution.
In December 2019, senators removed the flagship measure of the text, which required platforms, under penalty of sanctions, to remove any content reported as "hateful" within twenty-four hours, in favor of a "simplification of the notification systems" for this type of content.[22]
The joint committee on 8 January 2020 failed to reach an agreement. The text was therefore returned to the Senate and the National Assembly, the latter having the final say. The text was adopted by the National Assembly on 13 May by 355 votes in favor, 150 against, and 47 abstentions. The majority (LREM and MoDem) and UDI-Agir deputies voted in favor despite some abstentions, while the communists and socialists mostly abstained, and The Republicans, Liberties and Territories, La France Insoumise, and the National Rally opposed it.[23][24] During this final vote, the favorable vote of Jean-François Cesarini, an LREM deputy who had passed away on 29 March 2020, was counted; according to the Assembly, the presence of this deputy was a material error, and the name of his substitute should indeed have been recorded.[25][26]
The law was to come into effect in two phases: on 1 July 2020, and 1 January 2021.
European Commission
The bill was notified to the European Commission on 21 August 2019. The French request to trigger the emergency procedure was initially denied. The Commission then issued observations to France, following a reasoned opinion from the Czech Republic.[27] The European institution expressed reservations about the compatibility of the French text with European law, and Brussels asked France not to vote on the text.[28][29] Despite these criticisms, the government announced that it intended to modify the bill only marginally.[30]
Appeals and Censorship by the Constitutional Council
On 18 May 2020, Republican senators announced that they had filed an appeal with the Constitutional Council against the bill, in defense of freedom of expression.[31]
In its decision rendered on 18 June 2020,[32] the Constitutional Council ruled that the text was largely unconstitutional, finding that it imposed restrictions on freedom of expression that were not appropriate, necessary, or proportionate to the intended purpose.[33] The first article and eighteen other articles of the bill were censored.[34] The Council declared certain provisions unconstitutional due to a disproportionate infringement on freedom of expression.[35] Other provisions were censored because the constitutional judge considered them legislative riders.[36][37][38][39]
Content of the Bill Before the Constitutional Council's Decision
This section describes the content of the bill; because of the Constitutional Council's decision, these provisions were not included in the promulgated law and are therefore not applicable.
Content Covered
The first article of the law targeted several categories of manifestly illegal content that had to be removed. These are offenses already present in French law, which websites had to remove within one hour or twenty-four hours of notification, depending on the category:[40]
incitement to voluntary attacks on life, attacks on the integrity of the person, and sexual assault;
incitement to theft, extortion, and voluntary destruction, damage, or deterioration dangerous to people;
glorification of the above crimes, of war crimes, crimes against humanity, crimes of enslavement or exploitation of an enslaved person, or crimes and misdemeanors of collaboration with the enemy;
incitement to discrimination, hatred, or violence against a person or group of people based on their origin or their membership or non-membership of a specific ethnic group, nation, race, or religion;
insult towards a person or group of people based on their origin or their membership or non-membership of a specific ethnic group, nation, race, or religion;
insult committed under the same conditions towards a person or group of people based on their gender, sexual orientation, gender identity, or disability.
Once hidden, illegal content must be retained "for the purposes of research, establishing and prosecuting criminal offenses, and only to provide information to the judicial authority", allowing its illegal nature to be confirmed or refuted later.
Targeted Sites
The one-hour removal deadline for terrorist and child pornography content applies to all websites. The 24-hour timeframe applies to the following websites and Internet services:
"Online public communication services based on the connection of several parties for the sharing of public content," i.e., social networks (YouTube, Facebook, Twitter, etc.) and collaborative platforms (Le Bon Coin, TripAdvisor, Wikipedia, etc.);
Sites based "on the ranking or referencing, using computer algorithms," i.e., search engines (Google, Yahoo!, Bing, Qwant, etc.).
Notification Procedure
Site operators must implement, for users located on French territory, a directly accessible, uniform, and easy-to-use notification system allowing any person to report illegal content in the language used by the service.
Sites must acknowledge receipt of any notification by informing the notifier, and if possible the user targeted by the report, of the date and time of the notification, the outcome, the reason for the decision taken, and a reminder of the penalties incurred in the event of abusive notification.
Penalties
If the website refuses to delete manifestly illegal content or does so too late, its representative is subject to a fine of 250,000 euros. The Higher Audiovisual Council (CSA) can also impose an administrative penalty that can reach up to 4% of global turnover.[48]
Abusive notification is punishable by one year of imprisonment and a fine of 15,000 euros. This provision, set out in Article 1, II of the bill, would insert new provisions into the Law for Trust in the Digital Economy of 21 June 2004; on the basis of the future Article 6-2 inserted into that law, an applicant could then invoke abusive notification.[citation needed]
Higher Audiovisual Council
As with the law against the manipulation of information, Article 4 assigned the monitoring of the obligations imposed on websites to the Higher Audiovisual Council (CSA). The CSA would assess whether the removal of content was insufficient or excessive, and could issue a formal notice to a website and impose a financial penalty.
The CSA was entrusted, in place of the CNIL, with overseeing requests from the OCLCTIC for Internet service providers to block websites containing child pornography or terrorist content.[49]
Educational Component
Article 3 required operators to inform minors under fifteen and their legal guardians, upon the first use of their services, about the civic and responsible use of the service and the legal risks involved in the dissemination of hateful content by the minor.[50][51]
The authors of hateful messages were scarcely mentioned in the law. Article 6 bis A only provided for the establishment of a specialized digital prosecutor's office within a high court designated by decree to prosecute and judge, under a principle of "concurrent jurisdiction," the authors of illegal hateful content online.[48] This jurisdiction could be located in Nanterre, due to its geographical proximity to the premises of Pharos, the public platform for reporting illegal content.[52] The prosecutor's office would have jurisdiction over both public messages and private communications (WhatsApp, SMS, etc.).[53][54]
Article 7 provided for the creation of an "Online Hate Observatory"[55] tasked with "monitoring and analyzing the evolution of" hateful content targeted by the law. The observatory would be composed of representatives from websites, associations, researchers, and regulatory authorities. It would make proposals regarding awareness, prevention, repression, and victim support. The observatory would be linked to the CSA, which would provide its secretariat, define its missions, and determine its composition.[56]
Designated Contact Person
Paragraph 9 of Article 3 required website operators to designate a contact person, a natural person located on French territory. This contact person was responsible for receiving requests from the judicial authority and the CSA.
Content of the Law After the Constitutional Council's Decision
Following the Constitutional Council's decision, only minor provisions remained from the law:[57]
the creation of a specialized prosecutor's office for online hate messages;
the increase, from €75,000 to €250,000, of the penalty for breaching the obligations imposed on access providers and hosts under the fourth and fifth paragraphs of paragraph 7 of Article 6 I of the LCEN and under Article 6-1 of the LCEN (Law No. 2020-766 of 24 June 2020, Article 1);
the creation by Article 16 of the law of an Online Hate Observatory;
the addition by Article 17 of the law of the term "manifestly illegal" to Article 6-I of the LCEN, thus enshrining the case law of the last fifteen years.
Criticisms of the Bill
In addition to political figures, numerous organizations and individuals criticized the bill:
Lawyer and press law specialist Christophe Bigot,[78] lawyer François Sureau,[79] and law professor Anne-Sophie Choné Grimaldi[80] criticized the possibility that content removal decisions could be made by a private operator without the intervention of a judicial judge, who is constitutionally the guarantor of fundamental freedoms. In an open letter to the Prime Minister and the presidents of parliamentary groups, Memory of Jewish Resistance Fighters of the MOI (MRJ-MOI) and the Union of Jews for Resistance and Mutual Aid (UJRE) lamented the delegation to websites of the removal of hateful content under the pretext of the slowness of the judicial system and were not convinced by the ex-post surveillance planned by the CSA.[69] The National Digital Council (CNNum) made the same observation: "the PPL implies a significant delegation of powers to platforms in the field of regulating hateful content, which could give the impression of a certain privatization of missions historically devolved to the State".[81]
Socialist deputy Hervé Saulignac pointed out that "extremely substantial financial and human resources will be needed, for justice, for the police, for education".[82]
The National Consultative Commission on Human Rights indicated in July 2019 that it supported the objective of the bill against hateful content on the Internet but found the bill to be inadequate and disproportionate and called for a complete review.[83]