Author: Shreya Gupta, Dharmashastra National Law University
TO THE POINT
“Intermediary Liability” refers to holding “intermediaries,” the facilitators of data flow on the internet, responsible for unlawful acts carried out by their users in the virtual world. The “Safe Harbour” provision is granted by the Government of India under Section 79 of the Information Technology Act, 2000. It provides immunity to platforms such as social media (Facebook, X), e-commerce (Amazon, Flipkart), and ISPs (internet service providers) like Airtel and Jio, so that they are not held responsible for unlawful content hosted on their platforms by their users. Of course, there is a “Due Diligence” requirement a platform must satisfy in order to be protected by this immunity. If a platform disregards a court order or a notice from a governing authority about unlawful content it hosts, or if it directly takes part in authoring the content itself, it loses this protection and can be prosecuted as if it had committed the unlawful act itself.
ABSTRACT
The rationale underlying the digital economy is “punish the sinner and not the postman.” In the early days of the internet, owing to the ambiguity of their legal status, platforms habitually operated a regime of “censorship by proxy,” taking down any controversial content pre-emptively because removal was the surest protection from legal liability. Section 79 was designed to remedy this by recognising the platform as a neutral facilitator of information exchange. As the internet moved from simple message boards to complex algorithms and AI-driven curation, however, the legal regime has shifted from “hands off” to “cooperative oversight.”
The operationalization of the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021, is a historic moment in this journey. The Rules classify intermediaries by their number of users and impose heavier obligations on “Significant Social Media Intermediaries.” The object of the law is now premised on a “Duty of Care”: the platform is not liable as the author of a post, but as the proprietor of the “digital infrastructure” it bears, by extension, a social and legal obligation to ensure that the infrastructure is not used to harm national security, public order, or the dignity of the individual. Here lies the thin line between Freedom of Speech under Article 19(1)(a) and State regulation of cyberspace.
USE OF LEGAL JARGON
In examining the legal environment created by the IT Act, it is necessary to traverse a nuanced field of specialized language that circumscribes digital responsibility. The Safe Harbour provision is central to this context, as it grants a legal waiver of liability for third-party information. This safe harbour is not an absolute reprieve but operates within a conditional framework of liability. To qualify for the protection under the Act, an online platform must act as a “mere conduit,” a passive channel for online communication; if it goes beyond this role and exercises editorial control over the content it carries, it falls out of the protected category.
The trigger for this obligation on the intermediary is “Actual Knowledge,” a concept that has seen considerable controversy over whether a private complaint is sufficient to establish such knowledge and thereby render a hosting platform liable for abusive material on its website. The judiciary has since clarified that actual knowledge is established only when a court order or a formal notice from a government agency is received. Upon receipt of such knowledge, the intermediary must carry out an Expeditious Takedown within the prescribed thirty-six-hour period to remain within the boundaries of the law.
Additionally, Due Diligence serves as the functional standard for digital platforms. It is a preventative duty, an ongoing system of compliance rather than a one-off process. Due Diligence mandates setting up a Grievance Redressal mechanism and, for larger platforms, appointing a Chief Compliance Officer who is personally accountable for the organization's legal compliance. The terminology has evolved in recent years to encompass Traceability, which requires an intermediary to identify the First Originator of a message. This requirement has precipitated disputes pitting state security against End-to-End Encryption protocols.
Finally, the law differentiates between classes of entities through the term Significant Social Media Intermediary (SSMI). This classification is based on a specific user threshold and carries a greater burden of Proactive Monitoring and transparency. Whereas smaller intermediaries are held to a standard of “passive” compliance, SSMIs are expected to use automated tools to filter out harmful content, shifting the legal expectation from a purely reactive stance to one of Algorithmic Accountability. Understanding these terms is important because they are the specific legal “hooks” that determine whether a multi-billion-dollar corporation or a small web host finds itself in criminal or civil litigation.
THE PROOF
To avail the immunity under Section 79, the onus of proof rests on the intermediary to demonstrate that it has fulfilled the criteria set out in Section 79(2). First, the intermediary must show that its function was limited to providing access to a communication system over which information made available by third parties is transmitted, temporarily stored, or hosted. This signifies that it is a “neutral” intermediary.
Second, the intermediary must prove three negative covenants: that it did not initiate the transmission, did not select the receiver of the transmission, and did not select or modify the information contained in the transmission. A platform that alters a message posted by a user ceases to be an intermediary; it becomes an “author.”
Thirdly, the intermediary must establish “observance of due diligence.” This requires the periodic publication of compliance reports, the availability of an effective grievance mechanism, and proof of acting expeditiously to remove or disable access to unlawful content upon acquiring “actual knowledge.” Significant Social Media Intermediaries (SSMIs) must additionally show that they have designated a Chief Compliance Officer, a Nodal Contact Person, and a Resident Grievance Officer, all of whom must reside in India.
CASE LAWS
Avnish Bajaj v. State (NCT) of Delhi (2005): Popularly referred to as the Baazee.com case. The CEO of the auction site was arrested because one of its users listed a pornographic clip for sale. In the absence of a clear statutory distinction between the platform and the user, the court initially held the CEO liable. This case became the reason for the 2008 amendment that brought about the present Section 79.
Shreya Singhal v. Union of India (2015): In the most critical judgment in Indian IT law thus far, the Supreme Court shielded intermediaries by ruling that “actual knowledge” of illegal content must come either from a court order or a government notification. This prevented platforms from becoming “private censors” forced to judge the legality of every user complaint.
Myspace Inc. v. Super Cassettes Industries Ltd. (2016): The Delhi High Court extended the “Safe Harbour” principle to copyright. It held that Myspace could not be made liable for copyright infringement by its users unless it had actual knowledge of the infringing content and failed to act on it.
CONCLUSION
Intermediary liability in India is moving from a state of immunity to one of “conditional compliance.” The “Safe Harbour” principle is still the foundation of the online world; it is what allows platforms to give a voice to so many people without the perpetual fear of being dragged to court. Yet platforms are no longer seen as “invisible pipes” but as gatekeepers, as made clear by the 2011 and 2013 changes to the Intermediary Guidelines and, more pointedly, by the 2021 and 2023 IT Rules, because, as is now clear, “with great power comes great responsibility.”
The future of intermediary liability is likely to be shaped by the proposed “Digital India Bill,” which seeks to supersede the long-outdated IT Act of 2000. It is expected not only to tighten the definitions of intermediaries (AI bots, ad-tech networks, et al.) but also to impose harsher penalties for non-compliance. The new law seeks to secure the “Digital Nagrik” (Digital Citizen) by ensuring platforms are safe, accountable, and transparent, while at the same time protecting the “Right to Freedom of Expression.”
FAQS
QUESTION 1: WHAT DOES “SAFE HARBOUR” PROTECTION MEAN FOR INTERMEDIARIES?
Safe Harbour is a shield provided to intermediaries under Section 79 of the Information Technology Act. Under this shield, a platform such as Google or X will not be held liable for an illegal post made by a user, provided it observes the due diligence prescribed by government regulations and takes down the post on receiving a valid court order or government notice.
QUESTION 2: CAN AN INTERMEDIARY BE MADE LIABLE FOR A POST MADE BY A USER IF IT FAILS TO DELETE IT?
The immunity of an intermediary may cease if it fails to “expeditiously” remove or disable access to an “unlawful” post after obtaining “actual knowledge” through a court order or a government notice. In that case, the platform can be taken to court and prosecuted under the IPC for hosting the unlawful content.
QUESTION 3: WHO IS A “SIGNIFICANT SOCIAL MEDIA INTERMEDIARY” (SSMI) UNDER THE 2021 RULES?
An SSMI is a social media intermediary with more than five million registered users in India. SSMIs are subject to more stringent compliance requirements, such as the India-based appointment of a Chief Compliance Officer, a Nodal Contact Person, and a Resident Grievance Officer, and the periodic publication of reports indicating the number of complaints received and the action taken.
QUESTION 4: DOES THE IT ACT ALLOW THE GOVERNMENT TO ASK FOR THE “ORIGINATOR” OF A MESSAGE?
Under the IT Rules, 2021, the government can order a “Significant Social Media Intermediary” providing messaging services to identify the “first originator” of a piece of information. This power, however, is restricted to serious offenses such as threats to national security, public order, or sexual violence, and it remains legally contested on privacy grounds.