Online Intermediaries Research Project: Good Practice Document

Good Practices in Online Intermediary Liability Regimes

Authors: K.S. Park (Open Net, Korea University Law School), Aleksandra Kuczerawy (ICRI, KU Leuven)

Editors:
Prasanth Sugathan, Mishi Choudhary (Software Freedom Law Center of New Delhi)
Felix Krupar, Wolfgang Schulz (Hans-Bredow-Institut für Medienforschung)
Andy Sellars, Adam Holland (Harvard University Berkman Center for Internet and Society)

Table of Contents

Practice 1 (Basis for liability): Any intermediary liability shall require an element of knowledge of, or privity to, an unlawful online act, on the part of the intermediary.

Practice 2 (Function of Intermediaries): Given the social function of the Internet, an intermediary shall not be required, or incentivized through a liability scheme, to pro-actively monitor its services for illegal third party content or otherwise be required or incentivized to obtain knowledge about that content.

Practice 3 (Nature of Intermediaries): In consideration of an intermediary’s natural incentive to err on the side of deleting, a preferred liability scheme should not give an intermediary an adjudicator’s role and should require the intermediary to remove materials only following a court’s judgment.

Practice 4 (Minimum Requirements for Limited Liability): Imposing any prospective or retrospective liability on an intermediary should be conditional upon (a) proper notice of (b) the unlawful content or activity, and should be (c) proportional to the culpability thereof (d) and established by a court of law (e) pursuant to statute.

Practice 4A (Proper Notice): An intermediary should be held liable only when it continues to host or index the content after it was previously given clear notice of the specific content in question and its alleged illegality or has otherwise acquired equivalent knowledge.
Practice 4B (Unlawful Content): Only the contents or actions identified as unlawful in domestic law may form the basis of intermediary liability.
Practice 4C (Proportionality): The magnitude of the intermediary’s liability should be proportional to the harm resulting from the illegal content or action, balanced against the public interest in the content and culpability of the intermediary.
Practice 4D (Judicial Oversight): The liability of an intermediary for illegal content or activity should always be determined by a court of law. Orders to remove content should only be issued by independent courts.
Practice 4E (Legality): The conditions of imposing intermediary liability shall be set out clearly and transparently.

Practice 5 (Specific Good Practices for Safe Harbors): The government should adopt a “safe harbor” that (a) shall be clearly defined in the law, (b) preferably require that the intermediary connect the relevant parties to each other, rather than take content down on its own, (c) should not require the intermediary to adjudicate the legality of the content in question, (d) should allow takedown only if it also adopts a “put-back” provision, and (e) should provide total immunity to claims when the intermediary is fully compliant with the safe harbor’s conditions.

Practice 5A (Legality): Safe harbors for exempting intermediaries and the conditions therefor shall be set out clearly in the law.
Practice 5B (Notice-To-Notice Preferred): Preferably, a safe harbor regime should require the intermediary to connect the relevant parties rather than take content down on its own.
Practice 5C (No Adjudication): An intermediary should not be required to correctly adjudicate the legality of the content in question to qualify for the safe harbor.
Practice 5D (Put-Back): Any safe harbor that requires an intermediary to take down content upon a complainant’s notice should also provide a mechanism for restoration of content upon the poster’s notice.
Practice 5E (Complete Defense): A safe harbor should provide total immunity to claims when the intermediary is fully compliant with its conditions.

Background

This paper on good practices builds on case studies on the governance of online intermediaries conducted in 2014 by a team of international researchers affiliated with the Network of Internet & Society Research Centers (NoC).[1] Those case studies discuss the effects of various approaches to online intermediary (OI) liability, some key characteristics of intermediary liability regimes, the options available to policymakers, and standards that can be identified. This paper also acknowledges that liability regimes affect fundamental rights, especially – but not restricted to – the freedom of communication and information.

In this paper, the authors use those case studies as a foundation from which to make aspirational recommendations for good practices regarding OI liability. The practices described here represent the best judgment of the authors, in light of the NoC studies and other academic literature. Under its own governance principles the NoC itself does not make any policy recommendations but merely serves as a platform to facilitate research.

The practices highlighted in this paper are not intended to serve as strict guidelines to be incorporated into every working OI liability scheme; they represent a compilation of practices extracted from the case studies that we have observed to function well. Wherever the facts in the case studies or related documents indicate that a governance mechanism serves its purpose and balances fundamental rights in an appropriate way, it constitutes a “good practice”. Because there is no empirically sound way to evaluate practices based on the case studies, this paper is offered as a basis for further discussion.

[1] https://publixphere.net/i/noc/page/Online_Intermediaries_Research_Project_Case_Studies

1. Basis for liability of Intermediaries

Practice 1: Any intermediary liability shall require an element of knowledge of, or privity to, an unlawful online act on the part of the intermediary.

Most constitutions of the world give a place of preeminence to general freedom of expression while remaining largely silent about any general freedom of physical action: expressions are the results and vehicles of human perceptions, feelings, information, and opinions, do not directly cause harm to other persons, and are therefore entitled to protection as such.[2] Accordingly, if liability is imposed on anyone with reference to expressions, it is only for the unlawful act of making, or sometimes possessing, expressions of certain content that one should be held liable, not for the contents themselves.

Online infringements most often arise out of making certain contents available to the public (hereinafter, the “posting act” or “act of posting”) in a way that violates the law, rather than out of the illegality of the content per se. An online intermediary that did not itself engage in the posting act should therefore be held liable only as an accomplice to that act.

Intermediaries such as (1) search engines, (2) content hosts, (3) ISPs, and (4) private messengers may access the contents themselves as long as they are posted publicly, but they are usually not directly involved in the decision to post certain content. An intermediary should therefore not be considered an accomplice to the posting act and should not be held liable unless it performed another act that transcends the intermediating function. This line is hard to draw but nevertheless important: an intermediary should be held liable for content only if found to have aided or abetted the unlawful posting act, and should not be held liable for nonfeasance (i.e., for not preventing or not stopping the unlawful posting act).

There are some types of content that governments treat as inherently illegal, such as child pornography, content with Nazi memorabilia in some European countries, or content that offends the monarchy in Thailand, where even mere possession is condemned.[3] In the authors’ opinion, this does not mean that liability may be imposed on intermediaries for unknowingly hosting these contents. Even here, where the act of possession is itself criminal, basic tenets of criminal justice would require a demonstration of some knowledge before punishment. The same rule on the basis of liability shall apply.

[2] Lüth Judgment BVerfGE 7, 198, 208 (1958) (English translation available through D. Currie, The Constitution of the Federal Republic of Germany 175 (1999)); Irwin Toy Ltd. v. Attorney-General (Quebec) [1989] 1 S.C.R. 927, 976.
[3] See Ramasoota, Pirongrong. “Online Intermediaries Case Study Series: Internet Intermediary Liability in Thailand”, The Global Network of Internet & Society Research Centers; and Kuczerawy, Aleksandra, and Ausloos, Jef. “Online Intermediaries Case Study Series: European Union & Google Spain”, The Global Network of Internet & Society Research Centers

2. Social Function of intermediaries

Practice 2: Given the social function of the Internet, an intermediary shall not be required, or incentivized through a liability scheme, to pro-actively monitor its services for illegal third party content or otherwise be required or incentivized to obtain knowledge about that content.

Lawmakers and courts all over the world have begun to realize that intermediaries perform a special function for public communication that is fundamentally different in nature from the mere facilitation of the publication of content.[4] They are also starting to recognize that the role of intermediaries differs in many significant ways from the pure transport of data,[5] in its importance as well as in its technical capabilities and social functions. Understanding this specific function is key to developing an adequate liability system.

The empowering function of the Internet originates from its nature as an extremely distributed communication platform.[6] This high level of distribution allows each individual to participate in mass-scale communication without being subject to scrutiny before making content available to the public at large. Many sites allow individuals to post their views and stories without anyone’s approval, and allow everyone else to see and read those stories. This freedom of unapproved posting and unapproved viewing is the essence of the Internet’s popularity.[7] It is what has turned much of the world’s population into Netizens and what justifies businesses’ and governments’ singular focus on numbers such as page views and clicks.

At the same time, this freedom also means that unlawful content may easily be posted on the Internet without the knowledge of the relevant intermediaries. Holding an unknowing intermediary platform responsible for unlawful content placed on its service by third persons will deter those platforms from accepting content in an unrestricted manner and without prior review. If intermediaries are held liable simply by virtue of having hosted or transmitted unlawful content without their knowledge, they will have to protect themselves by constantly monitoring what gets posted on their services.[8] This will mean that all content is made available upon the tacit approval of the intermediary, which is given by not deleting it. What the users of the Internet see online will no longer be what fellow users posted but what they posted subject to the editorial approval of the intermediaries. The power of the Internet, the ability to post and view content without prior approval, will suffer.

Also, if intermediary service providers were required to actively monitor content for illegality, intermediaries would be forced to switch from a neutral and passive role to a more active one. They would then have to fulfill the duties of editors or publishers, exercising editorial control (see Practice 3). Such a general monitoring obligation could also lead to censorship out of fear of liability.

Intermediary platforms are by their nature open for any kind of content. Even when they apply filtering algorithms (e.g., digital rights management) or limit the types of content permitted to be posted (e.g., only video files on YouTube), they still leave the space open for unapproved uploading and viewing within those clearly defined boundaries to which the platform’s users have implicitly consented by the very act of using the platform. If this openness changes, the massive real-time discussion, resource sharing, and agenda-setting open to all on a wide variety of social and political issues will no longer be possible, and the societal significance of the Internet may be lost.

It is for this reason that none of the countries analyzed in the accompanying case studies imposes, for instance, secondary copyright liability on broadband providers for infringers who use their services, or a related obligation to search the data packets they process for illegal content.[9] The same reasoning should be extended to other types of intermediaries, such as providers of web applications that greatly facilitate the exchange of ideas and contents, e.g., “portals” and “search engines”.[10]

Because intermediaries have a structural function for Internet-based communication, a comparatively small increase in liability could have far-reaching implications. Although the free space created by these intermediaries may be abused by some, requiring the intermediary to monitor the space for unlawful content is in turn equivalent to holding it liable for creating the space, a result too threatening to the future of the Internet, as many countries have already decided.[11]

This is not to suggest, however, that voluntary monitoring should be prohibited in all instances. Some intermediaries offer hybrid services rather than solely facilitating access to third party content, and for them some degree of monitoring would be appropriate. Some countries even provide a liability exemption for removal decisions resulting from such voluntary monitoring activities.[12] This is quite distinct from the harms that flow from mandatory monitoring.

[4] Holland, Adam, Chris Bavitz, Jeff Hermes, Andy Sellars, Ryan Budish, Michael Lambert, and Nick Decoster. “Online Intermediaries Case Study Series: Intermediary Liability in the United States”, The Global Network of Internet & Society Research Centers, p. 12-14; and Lemos, Ronaldo, and Carlos Affonso Pereira De Souza. “Online Intermediaries Case Study Series: Regulatory Framework and Case Law on Liability of Internet Intermediaries in Brazil”, The Global Network of Internet & Society Research Centers; and Zittrain J., A History of Online Gatekeeping, Harvard Journal of Law and Technology, Vol. 19, No. 2, 2006, p.258. CompuServe, 776 F. Supp.; Prodigy, 1995 WL 323710.
[5] OECD, Directorate for Science, Technology and Industry, Committee for Information, Computer and Communication Policy, The Role of Internet Intermediaries In Advancing Public Policy Objectives, Forging partnerships for advancing public policy objectives for the Internet economy, Part II, 22.06.2011, p. 10 – 14.
[6] OECD, The Economic and Social Role of Internet Intermediaries, Directorate for Science Technology and Industry, April 2010, p. 43.
[7] See Recommendation CM/Rec(2014)6 of the Committee of Ministers to member States on a Guide to human rights for Internet users (16.4.2014): https://wcd.coe.int/ViewDoc.jsp?id=2184807 and the Explanatory Memorandum (16.4.2014): https://wcd.coe.int/ViewDoc.jsp?Ref=CM%282014%2931&Language=lanEnglish&Ver=addfinal&Site=CM&BackColorIntern.
[8] CoE Human Rights Violations Online, drafted by European Digital Rights DGI(2014)31 (4.12.2014): https://edri.org/files/EDRI_CoE.pdf, p 8-12.
[9] Section 512(a) of the Digital Millennium Copyright Act; and Holland, Adam, Chris Bavitz, Jeff Hermes, Andy Sellars, Ryan Budish, Michael Lambert, and Nick Decoster. “Online Intermediaries Case Study Series: Intermediary Liability in the United States”, The Global Network of Internet & Society Research Centers.
[10] L. Edwards, The Role and Responsibility of Internet Intermediaries in the Field of Copyright and Related Rights, WIPO, June 22 2011, p. 14.
[11] OECD, Directorate for Science, Technology and Industry, Committee for Information, Computer and Communication Policy, The Role of Internet Intermediaries In Advancing Public Policy Objectives, Forging partnerships for advancing public policy objectives for the Internet economy, Part II, 22.06.2011, p.11.
[12] Holland, Adam, Chris Bavitz, Jeff Hermes, Andy Sellars, Ryan Budish, Michael Lambert, and Nick Decoster. “Online Intermediaries Case Study Series: Intermediary Liability in the United States”, The Global Network of Internet & Society Research Centers.

3. Nature of Intermediaries

Practice 3: In consideration of an intermediary’s natural incentive to err on the side of deleting, a preferred liability scheme should not give an intermediary an adjudicator’s role and should require the intermediary to remove materials only following a court’s judgment.

Even limiting intermediary liability to circumstances where the intermediary has knowledge of illegal content should be done carefully. Experience has shown that intermediaries under such liability schemes have a strong tendency to “err on the side of caution and therefore take down material which may be perfectly legitimate and lawful”.[13] This may result in false positives, where intermediaries block or delete lawful content upon unsubstantiated and possibly unreasonable demands by third parties, further suffocating the essential freedom of the Internet.[14] It is important that intermediaries not be given the additional role of adjudicating disputes over online content, since experience shows that they will most likely wield this power to protect their own interests from potential liability,[15] often at the expense of the posters’ interests, thus exercising over-broad private censorship. Requiring intermediaries to adjudicate disputes between parties has raised questions in many cases:[16] the public may question the legitimacy of such outcomes, and there is a risk of facilitating or enabling private censorship, in addition to the tendency to delete content as the easiest and least costly option.[17]

It is therefore recommended that policymakers explore a liability scheme where intermediaries are not held liable for hosting or indexing actionable content, a scheme similar to CDA Section 230.

This is not to say that, under a scheme similar to CDA Section 230, content hosts and search engines would be absolutely immunized from all liabilities.[18] For example, content hosts and search engines should still be responsible for complying with legitimate data requests ordered by the responsible judicial authorities, so as to facilitate holding a user accountable for his or her unlawful posting act. Also, an intermediary may be required to remove contents that have been finally or provisionally found unlawful by a court of law.

However, under no circumstances should conduits (ISPs) be held liable for delivering actionable content. The end-to-end principle and its corollary, net neutrality, require that mere conduits and private messengers should be agnostic as to the content passing through their systems, and therefore should not be held responsible for its nature.

[13] Article 19, Internet Intermediaries: Dilemma of Liability, http://www.article19.org/data/files/Intermediaries_ENGLISH.pdf, p. 11.
[14] Barceló R. J. and Koelman, K., ‘Intermediary Liability In The E-Commerce Directive: So Far So Good, But It's Not Enough’, Computer Law & Security Report 2000, vol. 4, pp. 231-239; Shielding the Messengers: Protecting Platforms For Expression and Innovation, CDT 2012, p. 21.
[15] L. Edwards, The Role and Responsibility of Internet Intermediaries in the Field of Copyright and Related Rights, WIPO, June 22 2011, p.12.
[16] Council of Europe (Council of Ministers), Declaration on freedom of communications on the Internet, 28.05.2003. Council of Europe, Human rights guidelines for Internet Service Providers – Developed by the Council of Europe in co-operation with the European Internet Service Providers Association (EuroISPA), July 2008, paras 16 and 24.
[17] Study on the liability of Internet Intermediaries, T. Verbiest, Markt/2006/09/E, 12.11.2007, p.15. Art 19 press release, 9 November 2010, at http://www.article19.org/pdfs/press/europeancommission-freedom-of-expression-needs-better-protection-in-digital.pdf.
[18] Holland, Adam, Chris Bavitz, Jeff Hermes, Andy Sellars, Ryan Budish, Michael Lambert, and Nick Decoster. “Online Intermediaries Case Study Series: Intermediary Liability in the United States”, The Global Network of Internet & Society Research Centers.

4. Limited Liability Scheme[19]

Practice 4: Imposing any prospective or retrospective liability on an intermediary should be conditional upon (a) proper notice of (b) the unlawful content or activity, and should be (c) proportional to the culpability thereof (d) and established by a court of law (e) pursuant to statute.

The role of intermediaries in the free flow of information online is different from that of traditional publishers or broadcasters. Given the lack of pre-publication editorial control and the technical nature of intermediary activities, intermediaries should not be held responsible for screening content prior to its distribution and should be responsible only for removing contents upon notice. Under the rule of law, the imposition of liability must follow the principles of legal certainty, proportionality, and due process. As a threshold matter, any form of liability based on the content of speech must be defined by law. In the specific context of intermediary liability, the rules for exemptions, as well as the consequences of failure to comply with them, must also be defined in legal statutes or be clearly set out in precedent.

[19] The Center for Democracy and Technology categorizes intermediary liability regimes into three types: (i) strict liability; (ii) safe harbor (e.g., DMCA notice-and-takedown); and (iii) immunity (e.g., CDA Section 230). Shielding the Messengers: Protecting Platforms For Expression and Innovation, CDT 2012. However, the report omits a fourth category, which might be called a “limited liability” regime and situated between strict liability and safe harbor, under which a general rule of negligence holds an intermediary liable only for known third party contents.

4A. “Proper Notice”

Practice 4A: An intermediary should be held liable only when it continues to host or index the content after it was previously given clear notice of the specific content in question and its alleged illegality or has otherwise acquired equivalent knowledge.

A limited liability regime for Internet intermediaries was introduced in many countries as a compromise.[20] The compromise consisted of two basic elements: (a) no general obligation to monitor content (see Practice 2); and (b) no responsibility of intermediaries for third-party content distributed on the Internet, or for transactions taking place on their platforms, if the intermediary had not modified that content and was not aware of its illegal character (see Practice 1). The second is a corollary of the first: the possibility of being held liable for unmonitored content is equivalent to a requirement to monitor content. The first is also a corollary of the second: holding an intermediary liable for content it is unaware of would force it to seek out illegal content. This form of immunity was meant to stimulate the growth and innovation of the newly born Internet technology and to provide positive incentives for further development.[21]

Therefore, knowledge is an essential requirement for intermediary liability (see Practice 1). Mere storage of illegal content (or facilitation of illegal activity) without the intermediary’s knowledge should not lead to that intermediary’s liability. Intermediaries can obtain knowledge (or awareness) of illegal content or activities in a number of ways. Apart from learning about illegal content or activity through their own voluntary monitoring mechanisms, they can be notified by a third party. Currently, most countries foresee two possible types of notifying party: (a) the parties injured, or threatened with injury, by the specific content, and (b) the administrative or judicial agencies in charge of enforcing the substantive laws that the illegal content or activity is alleged to violate.[22] The possibility of filing such a “private notice” or “administrative notice” has been recognized as the most efficient way of calling upon intermediaries to act. However, this solution often comes hand in hand with concerns about its effect on the balancing of fundamental rights, specifically the risk of private censorship and the impact on freedom of expression. Due to these concerns, a slow shift can be observed in newer international OI liability regimes towards a system where private notifications are directed to courts, which then decide whether to issue a removal order.[23]

Note, however, that administrative or private orders or notices will usually suffice to establish the knowledge required for imposing limited liability under Practice 4A for contents later confirmed to be unlawful, although such notices should not have the compulsory power to force the intermediary to take down the content; that power can come only from the judiciary, as set forth in Practice 4D.

[20] See the Digital Millennium Copyright Act (DMCA) 1998 and the Directive 2000/31/EC of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market (Directive on electronic commerce), OJ L 178, 17.07.2000.
[21] OECD, Directorate for Science, Technology and Industry, Committee for Information, Computer and Communication Policy, The Role of Internet Intermediaries In Advancing Public Policy Objectives, Forging partnerships for advancing public policy objectives for the Internet economy, Part II, 22.06.2011, p. 11.
[22] See Holland, Adam, Chris Bavitz, Jeff Hermes, Andy Sellars, Ryan Budish, Michael Lambert, and Nick Decoster. “Online Intermediaries Case Study Series: Intermediary Liability in the United States”, The Global Network of Internet & Society Research Centers; and Kuczerawy, Aleksandra, and Ausloos, Jef. “Online Intermediaries Case Study Series: European Union & Google Spain”, The Global Network of Internet & Society Research Centers; and Park, Kyung-Sin. “Online Intermediaries Case Study Series: Not Just Backward but Going Back”, The Global Network of Internet & Society Research Centers; and Lemos, Ronaldo, and Carlos Affonso Pereira De Souza. “Online Intermediaries Case Study Series: Regulatory Framework and Case Law on Liability of Internet Intermediaries in Brazil”, The Global Network of Internet & Society Research Centers.
[23] See the discussion of the Marco Civil in Lemos, Ronaldo, and Carlos Affonso Pereira De Souza. “Online Intermediaries Case Study Series: Regulatory Framework and Case Law on Liability of Internet Intermediaries in Brazil”, The Global Network of Internet & Society Research Centers. See also the discussions of the Notice and Action Initiative in Kuczerawy, Aleksandra, and Ausloos, Jef. “Online Intermediaries Case Study Series: European Union & Google Spain”, The Global Network of Internet & Society Research Centers; and Kuczerawy A., Intermediary Liability & Freedom of expression: Recent developments in the EU Notice & Action Initiative, Computer Law and Security Review, Vol 31. Issue 1 2015, pages 46-56.

4B. “Unlawful Content or Activity”

Practice 4B: Only contents or actions identified as unlawful in domestic law may form the basis of intermediary liability.

Intermediaries may as a rule be held liable for third party content only if the posting of the content is unlawful (see Practice 1). Intermediary liability has traditionally been framed as “secondary” liability where “primary” liability lies with the individual or organization from whom the unlawful content or activity originated.

Intermediary liability for third party content should attach only to posted contents that have been found unlawful under domestic law, whether under national data protection laws or copyright laws, to take just two examples. There are jurisdictions where otherwise lawful contents may still become the basis of intermediary liability. For instance, intermediaries are required to abate contents when such abatement is deemed “necessary for nurturing sound communication ethics”[24] or when someone merely claims to be injured by the content.[25] Under these laws, perfectly lawful contents become unlawful only when they go online. In other words, these laws discriminate against the Internet vis-à-vis other media. The rationale for such discrimination lies in the scale and speed at which content travels through the Internet; that rationale, however, does not justify abating otherwise lawful contents.

The recent Google Spain decision of the CJEU ruled that search engines can be required under EU data protection law to delink names used as search terms from results for specific webpages, even if the content of those webpages is lawful. The CJEU seems to have assigned primary liability to Google on account of its own data processing activities: Google, in this instance, was considered not a mere intermediary but an independent data controller responsible for its own actions. However, all intermediaries by definition process data in one way or another, either by hosting or by indexing possibly illegitimate contents, and they often do so with a level of control (or lack of control) similar to that which Google exercises through its search engine. Many intermediaries therefore also risk being considered “data controllers”.

Therefore, regardless of whether the rule of liability in the Google Spain decision is labelled “primary” or “secondary”, its consequence may be similar to imposing secondary, intermediary liability for contents that are themselves lawful. We believe that the only way out of this contradiction is for a data protection liability regime to clearly define which privacy violations justify delisting or removal, i.e., which are unlawful under data protection law for being “inaccurate”, “obsolete”, etc., and then to define the requisite level of control at which an intermediary may be held responsible for being involved in such privacy violations.

Care should be taken to understand that the Google Spain decision did not impose damages on Google for its past processing activities. Still, knowledge can be imputed once a request for delisting has been made. If a search engine provider refuses to delist in cases where the request is justified, it may risk monetary liability.

An application for delinking must be balanced against the other fundamental rights at stake, such as the public’s interest in access to information, so as not to disturb the liberating potential of the Internet described in the discussion of Practice 2. We believe it is possible to define what data is subject to the right to be forgotten or otherwise invades another’s privacy if and when disclosed publicly. We also acknowledge that each processing activity (even for the same data) differs with regard to the context of the processing and the legal ground on which it is based. We propose that, whatever we do, we remain true to the conceptualization of intermediary liability for third party content as secondary liability and, to that end, define which contents are illegitimate and which level of conscious data processing activity justifies holding an intermediary liable as a data controller.

[24] K.S. Park, “Administrative Censorship in Korea”, [http://opennetkorea.org/en/wp/administrative-censorship]
[25] K.S. Park, “Intermediary Liability in Korea: Not Backward but Going Back”, [http://opennetkorea.org/en/wp/main-free-speech/intermediary-liability-korea-2014], also published as “Online Intermediaries Case Study Series: South Korea”, The Global Network of Internet & Society Research Centers.

4C. “Proportionality”

Practice 4C: The magnitude of the intermediary’s liability should be proportional to the harm resulting from the illegal content or action, balanced against the public interest in the content and culpability of the intermediary.

Any type of liability should be proportionate to the harm resulting from the illegal content or activity and the culpability of the intermediary. Civil and administrative liability should be distinguished from criminal liability, which should be applied with extreme caution and concern for the chilling effects that might result from such prosecution. Moreover, the sanction should not be so severe that it subsequently limits the freedom of expression enjoyed by content providers.

Also, if the content is not unlawful but merely undesirable for a specific audience or in a specific context, a lighter form of response might be considered. This rule may be applicable not just to retrospective liability (holding intermediaries liable for past processing of third party content) but also to prospective liability (requiring intermediaries to take down content). For instance, access restrictions by age should be applied in compliance with the applicable laws and with respect for fundamental rights and freedoms.

When assessing whether a response is proportionate, consideration must be given to the rights of the public to impart, receive, and access information, as well as other fundamental rights. The applicable law shall provide guarantees for allowing access to information of public interest.

4D. Judicial Determination

Practice 4D: The liability of an intermediary for illegal content or activity should always be determined by a court of law. Orders to remove or restrict content should only be issued by independent courts.

This statement of Practice consists of two sentences. The first is a restatement of the requirement of the rule of law. It is the second that needs elaboration: independent of whether an intermediary is held liable for third party content, states at times directly require intermediaries to remove certain content in order to prevent what they believe will be future damages or future violations of law, separately from imposing civil or criminal liability on the posters or the intermediaries for past offenses. When such injunctions are not well contained, they can replicate the censorial effects of bad intermediary liability regimes in an even more direct manner, without operating through the intermediaries’ incentives: contents are taken down to the detriment of the social function of the Internet.

In such situations, an ultimate order compelling an intermediary to remove content (ultimate in the sense that any refusal to comply with the order is deemed unlawful regardless of the substantive legitimacy of the intermediary’s action or non-action) should be imposed solely by independent courts of law, and only on an intermediary that has knowledge of the illegal content or activity. In France, HADOPI was one such censorial administrative body, for copyright protection purposes only, but after the 2009 unconstitutionality decision of the Constitutional Council,[26] HADOPI was reorganized so that its decisions attain force only after court approval; the Council held that such an infringement of basic rights may take place only through a court of law. In 2014, the Philippines Supreme Court likewise analogized administrative takedowns to search and seizure and ruled that administrative takedowns violate the warrant doctrine.[27] Fortunately, the number of administrative bodies issuing content takedown orders is encouragingly small, e.g., the Korea Communications Standards Commission, Turkey’s Information and Communication Technologies Authority (ICTA),[28] and the Australian Communications and Media Authority. Significantly, Brazil’s Marco Civil da Internet also states that an intermediary shall not be liable for third party content unless a court has issued a removal order.[29]

Also, orders to remove or block contents that are otherwise lawful violate the principle that intermediary liability should remain secondary liability, that is, secondary to the liability of the content originator. The Google Spain decision is challenging in this regard because it left the original source content intact but mandated action by the intermediary. Such a rule can be justified only by an argument that locates the harm in Google’s data processing rather than in the content itself, but that argument again raises the dilemma, discussed in Practice 4B, that being an intermediary always entails such data processing.

The principle of proportionality dictates that, if contents are not unlawful but only harmful to minors, block/delete injunctions should not be issued; only injunctions restricting minors’ access should be issued. These injunctions should also be proportionate to the effect desired, as in the case where HADOPI lost its power to request the disconnection of users after the three strikes mechanism had failed to benefit authorized services as promised.[30]

[26] http://www.conseil-constitutionnel.fr/conseil-constitutionnel/root/bank_mm/anglais/2009_580dc.pdf [last visited December 30, 2014]
[27] http://www.rappler.com/nation/special-coverage/cybercrime-law/51197-full-text-supreme-court-decision-cybercrime-law [last visited December 30, 2014]
[28] http://eng.btk.gov.tr/ [last visited December 30, 2014]
[29] Article 19: “In order to ensure freedom of expression and prevent censorship, the provider of internet applications can only be subject to civil liability for damages resulting from content generated by third parties if, after a specific court order, it does not take any steps to, within the framework of their service and within the time stated in the order, make unavailable the content that was identified as being unlawful, unless otherwise provided by law.” https://www.publicknowledge.org/documents/marco-civil-english-version
[30] Decree No. 2013-596 of July 8, 2013, removing the Complementary Penalty of Suspending Access to an On-Line Public Communication Service and relating to the Information Transmission Methods provided for in Article L. 331-21 of the Intellectual Property Code: http://www.wipo.int/wipolex/en/details.jsp?id=13838

4E. Legality

Practice 4E: The conditions of imposing intermediary liability shall be set out clearly and transparently.

Legal certainty requires that it be clear to all actors involved what kinds of content and activities are prohibited.[31] Any restriction upon the dissemination of content should be prescribed by law or relevant regulations. This means that a restriction should have a basis in domestic law (e.g., legal statutes or precedents). Moreover, it should be accessible to the actors concerned, who must be able to foresee its consequences. An OI regime based on uncertain requirements or unclear guidelines encourages over-enforcement and hinders participation and innovation.

[31] See Ramasoota, Pirongrong. “Online Intermediaries Case Study Series: Internet Intermediary Liability in Thailand”, The Global Network of Internet & Society Research Centers; and Kuczerawy, Aleksandra, and Ausloos, Jef. “Online Intermediaries Case Study Series: European Union & Google Spain”, The Global Network of Internet & Society Research Centers; and Arun, Chinmayi, and Sarvjeet Singh. “Online Intermediaries Case Study Series: Online Intermediaries in India”, The Global Network of Internet & Society Research Centers.

5. Specific conditions for exemptions from liability (“safe harbors”)

Practice 5: The government should adopt a “safe harbor” that (a) shall be clearly defined in the law, (b) preferably require that the intermediary connect the relevant parties to each other, rather than take content down on its own, (c) should not require the intermediary to adjudicate the legality of the content in question, (d) should allow takedown only if it also adopts a “put-back” provision, and (e) should provide total immunity to claims when the intermediary is fully compliant with the safe harbor’s conditions.

Limited liability of some form is already practiced by courts in many countries that have extended tort doctrines or accomplice liability to online activity; we recommend that such schemes conform to Practice 4 above. However, some countries go beyond that in protecting and promoting the liberating potential of the Internet. They offer safe harbors where intermediaries can shed the risks of even limited liability by following certain clearly defined procedures carefully designed to balance the rights of the posters against the possible harms of the content.

The case studies offer examples of regulatory systems that address intermediary liability by creating safe harbor rules under which intermediaries can enjoy exemptions from liability.[32] Safe harbor rules take various forms, and while establishing such a regime can be considered good practice in general, its effectiveness depends on the specific conditions.

Safe harbors in most jurisdictions[33] are conditional upon the intermediaries complying with requirements such as acting on take-down requests within a set time-frame. These conditions should not require the intermediaries to decide on the legality of the content before acting on a take-down request; the ultimate legality of the content should be decided only by a judicial body, not by intermediaries.

Based on the insights we have gleaned from the case studies,[34] and considering the relevance of procedural guarantees for fundamental rights, we propose the following as good practices.

[32] U. Gasser & W. Schulz, Governance Of Online Intermediaries: Observations From A Series Of National Case Studies. p. 8 et seqq.
[33] See DMCA and EU e-Commerce Directive.
[34] See Online Intermediaries Case Study Series.

5A. Legal certainty

Practice 5A: Safe harbors for exempting intermediaries and the conditions therefor shall be set out clearly in the law.

As is the case for ascribing liability to intermediaries discussed in Section 4E, the requirements for a safe harbor must clearly stipulate the procedure to be followed by all parties, including requirements for a valid notice, duties of the intermediary upon receipt of notice, and benefits conferred by the safe harbor.

5B. Duty of Intermediaries: “Notice-to-notice”

Practice 5B: Preferably, a safe harbor regime should require the intermediary to connect the relevant parties rather than take content down on its own.

A safe harbor’s ultimate goal should be to encourage the intermediary to facilitate communication between the complainant and poster of allegedly unlawful content, so the proper parties may address the dispute.

In the opinion of the authors, a safe harbor regime should require neither that an intermediary disclose the identity of a poster nor that it take down the material in question pending resolution. So long as the intermediary provides a channel of communication between the parties, the safe harbor should be available. This “notice-to-notice” approach[35] is preferable to “notice-and-takedown”: scholars have criticized systems where intermediaries are called upon by interested parties (private or state) to remove or block access to certain content.

In a well-functioning safe harbor regime, the intermediary should internally identify the origin of the concerned content upon receipt of a complaint, and alert its poster that a complaint has been received. Any response made by the poster should likewise be conveyed to the complainant. Should the poster not respond within a reasonable period, the intermediary should be allowed to take down the content at its discretion pending resolution of the matter.

[35] The Canadian system under Copyright Act Sections 41.25 and 41.26 is an existing notice-to-notice system. Care should be taken to understand that it imposes positive obligations on intermediaries to relay the notice to the poster, nonfulfillment of which can result in hefty fines. However, an obligation to notify the poster poses minimal risk to the Internet’s function. Significantly, Section 31.1 grants CDA 230-type immunity to providers from liability “by virtue of” hosting, caching, etc. When it comes to intermediary liability for third party content, Canada is therefore better categorized as an “immunity” regime on top of which “notice-to-notice” obligations are imposed.

5C. Intermediaries not to adjudicate

Practice 5C: The safe harbor should not require the intermediary to correctly adjudicate the legality of the content in question.

As noted above in Practices 2 and 3, intermediaries should not be placed in the position of adjudicating content, and as noted in Practice 5B, the preferred role for intermediaries is to connect the disputing parties and leave them to resolve the underlying issue. In the event that a notice-and-takedown regime is chosen over a notice-to-notice regime, however, intermediaries should not lose the qualification for the safe harbor for negligently or incorrectly deciding on the legality of the content. Intermediaries are not legal functionaries and are therefore ill-equipped to perform adjudicative roles. To the extent a regime requires them to make a judgment, they should not be penalized for reaching a conclusion different from that of a subsequent court: the post hoc determination of the legality or illegality of content by an adjudicatory body should not alter the safe harbor’s protection of intermediaries.

5D. Put-back provision

Practice 5D (Put-Back): Any safe harbor that requires an intermediary to take down content upon a complainant’s notice should also provide a mechanism for restoration of content upon the poster’s notice.

A safe harbor regime that is conditioned upon an intermediary’s take-down of content upon receipt of notice from a complainant must also contain provisions that enable such content to be restored upon receipt of a response from the poster of the content in question. Any take-down of content that is undertaken prior to issuance of a final adjudicatory verdict on its legality should be seen as tentative, and the law must make room for its restoration, should the poster provide a justification. This will again ensure that intermediaries are not made the final adjudicators of content-legality, rightfully leaving such decisions to be made by adjudicatory bodies.

5E. Exemption from legal liability

Practice 5E: A safe harbor should provide total immunity to claims when the intermediary is fully compliant with its conditions.

In the interest of legal certainty, it is essential that there be no leeway for interpretation as regards the conditions for the safe harbor; furthermore, the applicability of a safe harbor should not depend upon a subjective determination of compliance or an equitable test in light of unarticulated factors (see Section 5A). Once it has been proven before the adjudicatory forum concerned that an intermediary has complied with all conditions applicable to safe harbor protection, the intermediary shall be completely exempt from liability arising from the content in question. Otherwise, uncertainty in the scope of the exemption will only strengthen the intermediaries’ tendency to overcensor, and the safe harbor regulation can become a gateway for arbitrary decisions with possible structural effects on public communication.

Under no circumstances should an intermediary that stands in compliance with the conditions for the safe harbor be held liable for content that has been posted by a user and adjudged unlawful. In the overwhelming majority of cases, by complying with the conditions and following the mandated procedures, the intermediary will have had no role in the publication of the unlawful content save the provision of an enabling platform for end-users. Consequently, the intermediary should incur no liability arising from content posted by individual users distinct from the intermediary itself.