Introduction

  The French Constitutional Council released its decision406 regarding the controversial LOPPSI bill on 10 March 2011. Judges held that Article 4 of the bill, which allows the executive branch to censor the internet under the pretext of fighting child pornography, is not contrary to the Constitution. In doing so, the constitutional court has failed to protect fundamental freedoms on the internet, and in particular freedom of expression. Hope now lies with European institutions, which are the only ones with the power to prohibit or at least supervise administrative website blocking and its inherent risks of abuse.

  The LOPPSI law collected many repressive measures on vastly unrelated subjects. The Constitutional Council was caught out by this strategy. While it did strike down some of the most shocking provisions, it left untouched those that seemed less harmful or were proposed in the name of noble goals, despite their highly detrimental impact on civil liberties – such as the provisions relating to the internet.

  According to Jérémie Zimmermann, co-founder and spokesperson for La Quadrature du Net:

  This decision on Article 4 is a great disappointment. It is obvious that internet censorship will not help solve the child pornography problem in any way, as experiments in other countries have shown.407

  After HADOPI’s408 internet access suspension measures, calls to ban WikiLeaks hosting and recent talks against net neutrality, France is siding with the group of countries hostile to a free internet by adopting administrative filtering of the internet.

  The following analysis is based on a legal study of the screening measures published in 2009 by a team of European lawyers.409 It attempts to identify – in light of the European Convention on Human Rights (ECHR) and related case law – a number of safeguards that must govern any measure affecting freedom of communication on the internet. A review of how restrictions on the fundamental freedoms at stake are supervised shows that administrative internet filtering violates basic principles of the rule of law.

  International law and the protection of freedom of expression and communication

  Respect for fundamental freedoms is the legal foundation of democratic societies and the rule of law. Fundamental freedoms are granted the highest legal protections, enshrined not only in statute but also in national constitutions and international instruments, and judges traditionally enforce protection at each of these levels. The foundation of this protection is the idea that the people who enjoy these freedoms must be shielded from interference, especially by the executive and the parliament.410

  Measures to regulate online communications may, depending on the various cases, violate one or more fundamental freedoms protected by constitutions and conventions:

  The first of these is, of course, freedom of expression and communication, as these measures prevent the transmission of information and access to this information by the public.

  The second is the right to respect for one’s private life and correspondence. Whatever techniques are employed to intercept and block the offending content, lawful private communications will be intercepted along with criminal ones.

  In the ECHR, freedom of communication is protected by Article 10, with the second paragraph identifying the cases in which this freedom may be restricted, for instance where its exercise would jeopardise “national security, territorial integrity or public safety.” Restrictions may also be imposed where necessary “for the prevention of disorder or crime, for the protection of health or morals, for the protection of the reputation or rights of others, for preventing the disclosure of information received in confidence, or for maintaining the authority and impartiality of the judiciary.” Article 8 of the ECHR, which asserts the right to respect for one’s private and family life, also provides a framework where this right is at stake.

  Conditions on challenges to freedom of communication in European law

  As evidenced by the second paragraph of Article 10, any interference with the fundamental freedoms protected by the ECHR must meet a number of conditions to be acceptable. With regard to freedom of communication and the right to respect for one’s private life, such interference must, in addition to being prescribed by law, pursue an aim recognised as “legitimate” under the Convention,411 and be “necessary in a democratic society”. This last condition, vague as it may look, seems to be the most important when it comes to interference with freedom of communication, including blocking communications or removing content.

  As the judges of the European Court of Human Rights (ECtHR)412 have had occasion to point out in their case law, in a “society that wants to remain democratic”, the “necessity” of an interference implies that it corresponds to “a pressing social need”413 and is proportionate to the legitimate aim pursued.414

  Let us examine these two aspects:

  First, the restriction ordered must actually answer the pressing social need – a need over which states enjoy some margin of appreciation, subject to review by the Court. In other words, the measure must be effective.

  Second, the measure must be proportionate to the aim pursued. The Court has identified several criteria for assessing the proportionality of a restriction. With regard to screening procedures or the removal of content, it will examine in particular whether the purpose of the interference can be satisfactorily achieved by other means less restrictive of rights.

  Are screening measures “a necessity in a democratic society”?

  Do screening measures meet the criteria of effectiveness and proportionality? Are they necessary in a democratic society? To answer, we must take into account both the purpose pursued (child protection or copyright enforcement, for example) and the technical means used to prevent access to the contested content. Where the aim is to prevent access to child abuse content, which is undoubtedly the most pressing need invoked to date to justify screening measures, these measures can be linked to several of the “legitimate aims” listed in paragraph 2 of Article 10 of the ECHR: the protection of morals, the protection of the rights of others – especially children and vulnerable people who may find such images extremely traumatic – and the prevention and punishment of crime. However, in each of these cases, the technical shortcomings of the screening procedures suggest that they are neither effective nor proportionate.

  The availability of technical means to bypass screening curtails the effectiveness of these measures. One well-known method, often used by political dissidents in authoritarian regimes, is to set up a proxy (or encrypted “tunnel”) to another computer or server connected to the internet. The criminal networks engaged in the trade in child abuse content have long since developed distribution channels that are impermeable to filtering techniques. Whether for prevention or suppression, filtering is totally ineffective in this regard.
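
  To make the point concrete, the following sketch (illustrative only: the proxy address and target URL are hypothetical, and Python is used simply as convenient notation) shows how easily traffic can be relayed through an unfiltered proxy, so that a filter sitting between the user and the destination never sees the blocked address.

  # A minimal sketch, assuming a hypothetical unfiltered HTTP proxy is
  # reachable at proxy.example.org:8080 and that the target URL is blocked
  # only on the local network path (e.g. by DNS or IP blacklisting). Because
  # the request is relayed through the proxy, the local filter never sees the
  # final destination.
  import urllib.request

  PROXY = "http://proxy.example.org:8080"   # hypothetical unfiltered relay
  TARGET = "http://example.org/"            # stand-in for a locally blocked page

  opener = urllib.request.build_opener(
      urllib.request.ProxyHandler({"http": PROXY, "https": PROXY})
  )

  with opener.open(TARGET, timeout=10) as response:
      print(response.status, len(response.read()), "bytes received via the proxy")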

  The proportionality of filtering measures is also highly questionable because of their lack of accuracy in practice. Experts broadly agree that no method of blocking access to content can eliminate the risk of over-blocking perfectly legal sites. Several cases of over-blocking have been identified. In the United Kingdom, Wikipedia, one of the busiest sites in the world, was secretly blacklisted by the Internet Watch Foundation (IWF) and blocked for almost three days in late 2008415 because it published the original album cover of Virgin Killer by the rock band Scorpions, released in 1976, which shows a prepubescent girl posing naked. Because of these unavoidable collateral effects, filtering is a danger out of proportion to its objectives.
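
  The mechanics of over-blocking can be shown with a small sketch (illustrative only; the blacklist entry is hypothetical and merely echoes the IWF incident described above): when a filter matches on hostname rather than on individual pages, one offending URL is enough to make an entire site unreachable.

  # A minimal sketch of hostname-level blocking, a common design in national
  # filters: a single blacklist entry added because of one image page denies
  # access to every other page on the same site.
  from urllib.parse import urlparse

  BLACKLIST_HOSTS = {"en.wikipedia.org"}  # hypothetical entry for one offending page

  def is_blocked(url: str) -> bool:
      return urlparse(url).hostname in BLACKLIST_HOSTS

  print(is_blocked("https://en.wikipedia.org/wiki/Virgin_Killer"))  # True: the targeted page
  print(is_blocked("https://en.wikipedia.org/wiki/Human_rights"))   # True: collateral over-blocking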

  Finally, when the ECtHR assesses whether a measure is necessary, it seeks to determine whether alternative measures that are less restrictive of the fundamental freedoms at stake could meet the pressing social need. From this point of view, other measures are more satisfactory than the screening procedures. The first is the removal of content at the source, from the servers hosting it, accompanied by international cooperation.416 (It should be noted, however, that a study by two United States researchers shows that filtering tends to discourage the use of international cooperation mechanisms that already exist.)417 The second is the possibility for users (parents) to install filtering software on their own computers to block access. These systems, located at the edge of the network and much less intrusive, seem more proportionate to the objective.
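
  As a purely illustrative sketch of such edge filtering (the file path, domain names and behaviour are assumptions, not a description of any particular product), many parental-control tools work by adding local host overrides that affect only the user’s own machine:

  # A minimal sketch of edge filtering through local host overrides: names on
  # a parent-maintained list are mapped to an unroutable address on this
  # machine only, leaving the rest of the network untouched. The path and the
  # entries are hypothetical; the script only prints the lines a tool might add.
  HOSTS_FILE = "/etc/hosts"                               # typical path on Linux/macOS
  BLOCKED = ["adult.example.com", "tracker.example.net"]  # parent-chosen entries

  def hosts_entries(domains: list[str]) -> str:
      # 0.0.0.0 is a conventional "blackhole" target for locally blocked names
      return "\n".join(f"0.0.0.0 {d}" for d in domains)

  print(f"# lines a parental-control tool might append to {HOSTS_FILE}")
  print(hosts_entries(BLOCKED))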

  The procedural framework of attacks on freedom of communication on the internet: The role of ordinary courts

  Despite these factors, the French national legislature decided to address a pressing social need (the fight against child pornography) by restricting the freedom of online communication through content filtering. Article 4 of LOPPSI gives the executive the power to delete information circulating on the internet. Contrary to its decision on HADOPI, the Constitutional Council approved legislation authorising the administrative authority to order measures that conflict with the freedom of online communication. The Constitutional Council’s position seems to be to seek, case by case, a balance between protecting freedom of communication and other fundamental rights.

  However, the role traditionally assigned to the judicial authorities in European law should disqualify non-judicial bodies from imposing restrictions on freedom of communication on the internet, a fortiori when those measures also conflict with other fundamental rights, such as the right to respect for one’s private life.

  Three principles justify the exclusion of non-judicial authorities when it comes to deciding on cases concerning the restriction of freedom of expression:

  The declaration of illegality The jurisdiction of the ordinary courts rests primarily on the fact that only a judge can declare a situation to be an illegal abuse of a freedom. In all liberal democracies, only the judge has jurisdiction to establish the illegality of content, a situation or an action.

  The guarantees attached to any criminal charge Restrictions on freedom of online communication should be accompanied by the guarantees of a fair trial (Article 6 of the ECHR).418 Indeed, an administrative or judicial injunction to filter, remove or block access to content, where it relates to offences of a criminal nature, amounts to a criminal charge and therefore triggers the guarantees attached to a fair trial, including the right to be tried by an independent and impartial court.419

  Control of proportionality Assessing the proportionality of measures intended to respond to an abuse of freedom of communication is a function that traditionally falls to the ordinary courts in democracies.

  The role of prior judicial authority in monitoring violations of freedom of communication on the internet

  Given these different observations (declaration of illegality, the right to due process and control of proportionality), the judge’s role in monitoring violations of freedom of online communication seems essential.

  Because of their ineffectiveness and their disproportionate nature, the screening procedures proposed in LOPPSI do not seem able to meet European standards and should be discarded.

  Regarding the withdrawal of content, it seems more conceivable that the administrative authority may, for very serious offences, order a hosting provider to take down content. However, at this stage the content concerned is only “potentially” illegal, and the alleged offence still needs to be prosecuted.420

  Beyond these considerations, signatories to the ECHR have discretion in defining the serious offences for which the administrative authority may, as a precaution, restrict freedoms. In reality, this is a political choice. In 2009, during the review of the Telecoms Package,421 an amendment (“Amendment 138”) was adopted twice stating that only the judiciary should be able to impose restrictions on freedom of communication on the internet:422

  No restrictions may be imposed on fundamental rights and freedoms of end users without a prior ruling by the judicial authorities, notably under Article 11 of the Charter of Fundamental Rights of the European Union on freedom of expression and information, except when public safety is threatened.

  It is regrettable that this principle has not been enshrined in European Community law. It would have allowed a rigorous defence of freedom of expression and communication in France.

  Action steps

  The freedoms exercised on the internet, such as free communication, and other fundamental rights must be strictly protected by law. The main points to assert in the context of LOPPSI include:

  The presumption of legality of any online publication must be guaranteed.

  We must oppose the requirement for filtering online content because it is disproportionate.

  Citizens must be sufficiently informed of orders to remove content, so that they can legally oppose it.

  Citizens must be sufficiently informed if their access to the internet is blocked, so that they can legally oppose it.

  The right to a fair trial must be guaranteed.

  The government should not be able to impose, without a trial, sanctions that have the effect of restricting freedoms.

  The opportunity to speak anonymously online must be guaranteed.

  INDIA

  The internet and the right to information in India

  Digital Empowerment Foundation

  Ritu Srivastava and Osama Manzar

  defindia.net