India is the world’s largest open Internet society, and the Digital India programme has enabled the empowerment of the common man. The extensive spread of mobile phones and the Internet has also enabled many platforms to expand their footprint in India.
These platforms bring a host of benefits as well as risks, and they give rise to new concerns that have been raised from time to time in various forums, including the Parliament of India and its committees, judicial orders and civil society deliberations in different parts of India. Prime among these concerns is the abuse of social media to circulate morphed images of women and content amounting to revenge porn, which threatens the dignity of women; it is therefore important to prevent the dissemination of such content.
To address this, on February 25, 2021, the Ministry of Electronics and Information Technology, Government of India notified the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 (“IT Rules 2021”). Part I of the Rules lays down the definitions of terms, while Parts II and III lay down the compliance requirements. In the context of the present case, Part II of the Rules is what is relevant, as it deals with the regulation of intermediaries, including social media intermediaries such as messaging platforms (WhatsApp, Signal and Telegram) and media platforms (Facebook, Instagram and Twitter). For the present case, it is relevant to elucidate the following aspects of the IT Rules, 2021:
Classification of Intermediaries
The 2011 Rules regulated “intermediaries” without any classification or distinction between them in terms of their user base or the content hosted on their platforms; the 2021 Rules, however, classify the regulated entities into the following types:
- Social media intermediary with less than 50 lakh registered Indian users;
- Significant social media intermediary (“SSMI”) with more than 50 lakh registered Indian users;
- Publisher of news and current affairs content, including news aggregators;
- Publisher of online curated content which covers all online streaming platforms including Over-the-Top (‘OTT’) platforms.
Due Diligence
Rule 3 of the IT Rules, 2021 prescribes the due diligence to be observed by intermediaries. It is tied to the safe harbour provision articulated in Section 79 of the Information Technology Act, 2000: if intermediaries observe the prescribed due diligence, they remain entitled to protection from liability in relation to any third-party information, data, or communication link made available or hosted by them, so long as they also meet the content-neutrality conditions under the Act. The due diligence to be observed by intermediaries includes:
- informing users about rules and regulations, privacy policy, and terms and conditions for usage of its services;
- blocking access to unlawful information within 36 hours of receiving an order from a court or the government;
- retaining information collected for the registration of a user for 180 days after cancellation or withdrawal of the registration, and reporting cybersecurity incidents and sharing related information with the Indian Computer Emergency Response Team;
- taking down content depicting sexual imagery within 24 hours of receipt of a complaint, without the need for any such order; intermediaries are also required to provide any information under their control or possession, within 72 hours of receipt of an order in this regard, to a government agency for the investigation, detection or prevention of cyber security incidents or offences under any law.
Transparency
An SSMI is subject to a higher standard of transparency and accountability towards its users. It must publish monthly compliance reports outlining how it deals with requests for content removal, how it deploys automated tools to filter offensive content, and so on. Other requirements under this transparency principle include giving notice to users whose content has been disabled and allowing them to contest such removal.
Chief Compliance Officer
An SSMI is further required to comply with additional obligations, including the appointment of a chief compliance officer, who will be liable for any failure of the intermediary to observe due diligence, and of a nodal contact person (available 24x7) to ensure compliance with orders of courts and to coordinate with law enforcement agencies. An SSMI is also required to establish a physical contact address in India.
In accordance with the IT Rules and the provisions discussed above, the Delhi High Court has laid down guidelines to be followed by courts dealing with cases involving the removal of objectionable content from the internet, so as to ensure that such offensive material is removed at the earliest and that access to and redistribution of it is limited.
In X v. Union of India and Others[1], the Court dealt with a matter in which photographs and images that a woman had posted on her private social media accounts on ‘Facebook’ and ‘Instagram’ were taken without her knowledge or consent and unlawfully posted on a pornographic website by miscreants; despite court orders, the content could not be removed in its entirety from the world wide web, and “errant parties merrily continued” to re-post and redirect it to other sites.
Justice Anup Jairam Bhambhani began his judgement with a poignant remark: “The internet never sleeps; and the internet never forgets!” The Court relied on various judgements across different jurisdictions to paint a coherent picture of the state of governance when it comes to the regulation of offensive content. It relied on the judgement in X. vs. Twitter Inc.[2], where the Supreme Court of New South Wales stated the following: “Where a third party such as Twitter comes into possession of confidential information and is put on notice of the character of the information and the circumstances in which it was unlawfully obtained, it becomes subject to an equitable obligation of confidence. It is liable to be restrained from publishing the information.” “…there is a public interest in making the proposed orders; in demonstrating that wrongful conduct will be remedied as effectively as can be achieved; and in ensuring that the plaintiff’s rights are respected to the extent that it is possible to do so. The plaintiff should not be left without a remedy.”
Similarly, in Google Spain SL, Google Inc. vs. Agencia Española de Protección de Datos (AEPD), Mario Costeja González[3], it was held that “the operator of a search engine is obliged to remove from the list of results displayed following a search made on the basis of a person’s name links to web pages, published by third parties and containing information relating to that person, also in a case where that name or information is not erased beforehand or simultaneously from those web pages, and even, as the case may be, when its publication in itself on those pages is lawful.”
And finally, the High Court referred to Eva Glawischnig-Piesczek vs. Facebook Ireland Limited[4], in which it was categorically stated that “in order to ensure that the host provider at issue prevents any further impairment of the interests involved, it is legitimate for the court having jurisdiction to be able to require that host provider to block access to the information stored, the content of which is identical to the content previously declared to be illegal, or to remove that information, irrespective of who requested the storage of that information.”
The Court supplemented these judgements with references to judicial decisions within India itself, including Shreya Singhal vs. Union of India[5], ABC vs. DEF & Ors.[6] and YouTube LLC & Anr. vs. Geeta Shroff[7]. It also relied on the judgement of the Delhi High Court itself in Swami Ramdev & Anr. vs. Facebook, Inc. & Ors.[8]: “The removal and disablement has to be complete in respect of the cause over which this Court has jurisdiction. It cannot be limited or partial in nature, so as to render the order of this Court completely toothless.”
With the help of the aforementioned judicial decisions, the Delhi High Court proceeded to lay down the following guidelines for the removal of offensive content:
- the court may issue a direction to the website or online platform on which the offending content is hosted to remove such content forthwith, and in any event within 24 hours of receipt of the court order, this being the timeframe mandated in Rule 3(2)(b) of the 2021 Rules read with Rule 10 of the 2009 Rules for other similar kinds of offensive content;
- A direction should also be issued to the website or online platform on which the offending content is hosted to preserve all information and associated records relating to the offending content, so that evidence in relation to the offending content is not vitiated;
- A direction should also be issued by the court to the search engine(s) as the court may deem appropriate, to make the offending content non-searchable by ‘de-indexing’ and ‘dereferencing’ the offending content;
- The directions issued must also mandate the concerned intermediaries, whether websites, online platforms or search engine(s), to endeavour to employ pro-active monitoring by using automated tools to identify and remove or disable access to any content which is ‘exactly identical’ to the offending content (an approach illustrated in the sketch after these guidelines);
- Directions should also be issued to the concerned law enforcement agency/agencies, such as the jurisdictional police, to obtain from the concerned website or online platform all information and associated records, including all unique identifiers relating to the offending content, such as the URL (Uniform Resource Locator), account ID, handle name, Internet Protocol address and hash value of the actual offending content, along with the metadata, subscriber information, access logs and such other information;
- The court must direct the aggrieved party to furnish to the law enforcement agency all available information that the aggrieved party possesses relating to the offending content;
- The aggrieved party should also be permitted, on the strength of the court order passed regarding specific offending content, to notify the law enforcement agency to remove the offending content from any other website, online platform or search engine;
- The court may also direct the aggrieved party to make a complaint on the National Cyber-Crime Reporting Portal.
Most importantly, the court must refer to the provisions of section 79(3)(a) and (b) read with section 85 of the IT Act and Rule 7 of the 2021 Rules, whereby an intermediary would forfeit the exemption from liability enjoyed by it under the law if it were to fail to observe its obligations for removal/access disablement of offending content despite a court order to that effect.
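As a brief technical aside: the ‘hash value’ referred to in these guidelines is a fixed-length fingerprint computed from the content itself, and it is what allows automated tools to detect re-uploads that are ‘exactly identical’ to content already covered by a takedown order. The sketch below is only an illustration under simplifying assumptions: it uses a plain SHA-256 digest and a hypothetical in-memory blocklist, whereas platforms may in practice rely on perceptual or proprietary matching to also catch near-duplicates.

```python
# Illustrative sketch only: exact-match detection via a SHA-256 content fingerprint.
# The blocklist below is hypothetical; real intermediaries maintain their own systems.
import hashlib


def sha256_of_file(path: str) -> str:
    """Return the SHA-256 hex digest of a file, read in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()


# Hypothetical set of hashes taken from content already declared unlawful by a court order.
blocked_hashes = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}


def is_exact_match(upload_path: str) -> bool:
    """True if the uploaded file is byte-identical to previously blocked content."""
    return sha256_of_file(upload_path) in blocked_hashes
```

On this view, an intermediary that already holds the hash of content covered by a takedown order can refuse a byte-identical file at upload time, which is the kind of pro-active monitoring by automated tools that the guideline contemplates for exact matches.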
References:
[1] W.P.(CRL) 1082/2020 & Crl. M.A. Nos.9485/2020, 10986-87/2020.
[2] (2017) NSWSC 1300.
[3] Case C-131/12; ECLI:EU:C:2014:317.
[4] Case C-18/18; ECLI:EU:C:2019:821.
[5] (2015) 5 SCC 1.
[6] CS(OS) No.160/2017.
[7] 2018 SCC OnLine Del 9439.
[8] 2019 SCC OnLine Del 10701.
By:
Vijay Pal Dalmia, Advocate
Supreme Court of India & Delhi High Court
Email ID: vpdalmia@gmail.com
Mobile No.: +91 9810081079