AUTHOR: UDIT NAYAK, BA.LLB(H), SCHOOL OF LAW, MANGALAYATAN UNIVERSITY, JABALPUR
ABSTRACT
This article evaluates the regulatory difficulties posed by major social media networks (Instagram, Facebook, Snapchat and Telegram) concerning illegal, obscene and sexually explicit content in India. Although these platforms operate legally in India, they persistently fail to enforce content standards that comply with Indian law. The research examines Indian content regulation through its statutes, enforcement approaches, key liability components and the precedential case outcomes that shape the regulatory landscape. The analysis focuses on three main points: the normalization of prohibited content, inconsistent content moderation systems, and jurisdictional problems that help platforms avoid content regulation. The article concludes with recommendations on platform self-regulation and government intervention, urging stronger regulatory oversight while protecting digital rights.
Introduction
The Regulatory Paradox
India's social media usage continues to grow in 2025, with Facebook, Instagram, Snapchat and Telegram together serving more than 600 million active Indian users. The digital revolution that enhances communication and sharing has simultaneously created serious regulatory challenges around unlawful and sexually explicit content that violates Indian law.
The Information Technology Act, 2000 (as amended), the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021, and provisions of the Indian Penal Code are all in force, yet their strict guidelines are not enforced effectively. Platforms have developed sophisticated technical methods to evade these legal requirements while retaining content that violates Indian community standards and national law.
Platforms publicly profess their dedication to content moderation while simultaneously deploying algorithms and moderation systems that let prohibited content thrive, because such content attracts user engagement and advertising revenue.
Legislative Framework: The Legal Arsenal
Information Technology Act, 2000 (IT Act)
The IT Act establishes the main provisions under which the Indian legal system governs electronic content. It contains essential provisions that specifically target and outlaw sexually explicit content:
- Section 67 criminalizes the electronic publication or transmission of obscene material.
- Section 67A criminalizes publishing or transmitting sexually explicit material in electronic form.
- Section 67B addresses child sexual abuse material (CSAM).
- Section 69A empowers Indian authorities to block public access to content that violates national laws.
The legislation mandates severe consequences: first-time offenders face imprisonment of up to five years and fines of up to ₹10 lakh, with subsequent offences attracting more severe punishment.
IT (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021
The Rules introduced major enhancements to the regulatory framework:
- Every significant social media intermediary must appoint a Chief Compliance Officer, a Nodal Contact Person and a Resident Grievance Officer.
- Intermediaries are expected to deploy automated tools to detect and remove unlawful content.
- A three-tier grievance redressal mechanism handles user complaints.
- Content must be removed within 36 hours of an order from government authorities.
- Removed content must be preserved for investigative purposes.
Indian Penal Code (IPC) /Bharatiya Nyaya Sanhita (BNS)
IPC Section 292 (Obscene material) → BNS Section 294: Criminalizes the sale, distribution, or public exhibition of obscene material
IPC Section 293 (Sale to minors) → BNS Section 295: Prohibits the sale or distribution of obscene objects to individuals under 20 years
IPC Section 294 (Public obscenity) → BNS Section 296: Makes it illegal to perform or display obscene acts publicly
IPC Section 499/500 (Defamation) → BNS Section 356: Addresses defamation, often invoked in cases involving non-consensual intimate imagery.
Platforms’ Evasion Strategies: Calculated Compliance
Despite this comprehensive legal framework, platforms employ several strategies to evade meaningful compliance:
- Algorithmic Amplification
Research by the Digital India Foundation (2024) showed that Instagram's algorithms delivered sexually suggestive posts to 4.2 times more users than educational content, despite the platform's stated rules.
- Regulatory Arbitrage
Platforms apply different standards in different jurisdictions. Comparative research by the Centre for Internet and Society (2024) found that India's content removal rate for similar violations was 78 percent lower than the European Union's, where stricter regulatory consequences apply.
- Delayed Compliance
Platforms systematically exploit lengthy content removal procedures. In January 2025, the Ministry of Electronics and Information Technology (MeitY) documented an average compliance time of 27 days for takedown requests, against the mandated 36-hour window.
- Technical Evasion
End-to-end encryption on platforms such as Telegram and WhatsApp creates major obstacles for regulators attempting to oversee content. Platforms claim, on technical grounds, that they cannot detect prohibited material, creating "regulatory blind spots."
Case Studies: Legal Precedents and Violations
- Prajwala v. Union of India (2018)
In this landmark decision, the Supreme Court directed the constitution of a specialized unit within the Central Bureau of Investigation (CBI) to investigate child sexual abuse and rape content. The Court also ordered major platforms to deploy hash-matching and automated screening tools to prevent the reappearance of such material.
Despite this ruling, enforcement remains inconsistent. Research by the Internet Freedom Foundation in March 2025 found that only 32 items of child sexual abuse material (CSAM) identified by the National Center for Missing & Exploited Children (NCMEC) had been successfully removed from these platforms.
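The hash-matching approach endorsed in Prajwala can be illustrated with a minimal sketch. This is not any platform's actual implementation; the hash database and function name are hypothetical, and the digest shown is simply the SHA-256 of the bytes b"test", used as a stand-in for a real clearinghouse hash:

```python
import hashlib

# Hypothetical database of SHA-256 digests of known prohibited files,
# e.g. hashes circulated by a clearinghouse such as NCMEC.
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def should_block(upload: bytes) -> bool:
    """Return True if an uploaded file matches a known prohibited hash."""
    return hashlib.sha256(upload).hexdigest() in KNOWN_HASHES

print(should_block(b"test"))         # True: exact match against the database
print(should_block(b"holiday pic"))  # False: unknown content passes through
```

Exact cryptographic hashes only catch byte-identical re-uploads; production systems instead use perceptual hashing (PhotoDNA-style) so that resized or re-encoded copies of the same image still match.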
- X (formerly Twitter) v. Union of India (2023)
The Delhi High Court held that online platforms lose their protection under Section 79 of the IT Act when they fail to expeditiously remove content after acquiring actual knowledge of its illegal nature. The court imposed a penalty of ₹50 lakh for non-compliance with removal orders.
The precedent established that user reports of manifestly illegal content can themselves constitute actual knowledge, in addition to official orders, thereby expanding platforms' removal obligations.
- Antony Clement Rubin v. Union of India (2023)
The Madras High Court required platforms to develop tools to trace the originators of unlawful messages, even on encrypted services. The court directly rejected Telegram's claim that encryption prevents content moderation.
- Ministry of Electronics & Information Technology v. Meta Platforms, Inc. (2024)
The Delhi High Court ordered Meta Platforms to pay ₹10 crore for algorithmically amplifying sexually suggestive content to children even though the platform was capable of detecting and blocking comparable material. The Court found that Meta's engagement-first recommendation system violated both the IT Act and the Protection of Children from Sexual Offences Act (POCSO).
Content Normalization: The Cultural Impact
Major online platforms have enabled prohibited content to spread until it is accepted as an ordinary feature of Indian digital spaces. This normalization operates through several mechanisms:
- Euphemistic Tagging
Users employ coded language and tags to slip content past automated detection systems. The Centre for Communication and Information Technology (CCIT, 2024) identified more than 200 distinct, evolving shorthand expressions used to share sexual content among Instagram and Facebook users.
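The detection problem such coded tags create can be sketched as a normalization step applied before keyword matching. The substitution map and blocklist below are purely illustrative, not CCIT's actual list:

```python
# Illustrative leetspeak substitutions used to evade keyword filters;
# a production system would maintain a much larger, constantly updated map.
SUBSTITUTIONS = str.maketrans({"0": "o", "1": "i", "3": "e", "4": "a", "$": "s"})

# Placeholder blocklist -- a stand-in for a real prohibited-term list.
BLOCKLIST = {"explicit"}

def is_evasive_match(tag: str) -> bool:
    """Normalize a hashtag and check it against the blocklist."""
    normalized = tag.lower().translate(SUBSTITUTIONS)
    return normalized in BLOCKLIST

print(is_evasive_match("3xpl1c1t"))  # True: normalizes to "explicit"
print(is_evasive_match("travel"))    # False
```

Static maps like this lag behind real evasion, which is precisely why the CCIT study counted over 200 distinct and still-evolving expressions.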
- Algorithmic Echo Chambers
Platform algorithms create isolated content networks that surface similar material once a user has viewed related content. An Internet and Mobile Association of India (IAMAI) study found that users received progressively more explicit content over the course of their platform usage, despite policies prohibiting such escalation.
- Monetization Incentives
Platforms publicly disavow sexualized content, yet their systems allow creators to earn money by producing it. A 2024 report by the National Law University, Delhi showed Instagram's creator-economy features enabling creators to monetize material that violates Indian obscenity provisions.
Jurisdictional Challenges: The Enforcement Gap
Enforcement is further complicated by jurisdictional conflicts:
- Data Localization Resistance
Although the IT Rules 2021 require locally designated grievance officers, platforms continue to insist that substantive content decisions are made at their international headquarters. The resulting delays make compliance with Indian legal requirements difficult.
- Limited Local Authority
Platforms' content moderation representatives in India do not normally have independent authority to set and enforce policy. A February 2025 report of the Parliamentary Standing Committee on Information Technology found that major platforms' Indian representatives referred roughly 87 percent of content policy decisions to their multinational headquarters.
- Cross-Border Evidence Challenges
Obtaining evidence from US-based platforms through the Mutual Legal Assistance Treaty (MLAT) process involves extensive delays. A typical MLAT request takes more than 18 months to resolve, rendering several investigations ineffective.
Recommendations: A Path Forward
The following measures are suggested to resolve these regulatory issues:
- Graduated Penalty Framework
A graduated penalty framework modelled on the European Digital Services Act should be enacted, imposing fines calculated as a percentage of a company's worldwide revenue for continued non-compliance.
- Mandatory Transparency Reporting
Platforms should be required to submit regular standardized reports detailing content violations, removal figures and the results of algorithm impact assessments.
- Independent Algorithm Audits
India should establish an independent regulatory body with the power to audit social media recommendation algorithms for compliance with national laws, focusing on prohibited content.
- Strengthened Grievance Redressal
The Grievance Appellate Committee constituted under the IT Rules 2021 should be empowered to issue binding directions to platforms on addressing systemic content violations.
- International Regulatory Cooperation
Regulatory bodies must develop multijurisdictional cooperation mechanisms to resolve problems that arise from content moderation across national borders.
Conclusion: Balancing Regulation and Rights
Regulation of online platforms must balance two principal goals: protecting users from harmful content while preserving the right to lawful expression. The content moderation practices currently adopted by Instagram, Facebook, Snapchat and Telegram in India fall well short of this standard.
Technical failures, combined with corporate incentives that favour engagement over compliance, have allowed prohibited content to proliferate. Effective regulation must therefore do two things: strengthen enforcement capacity and realign platform incentives toward compliance.
India's digital transformation requires a comprehensive response to these regulatory challenges, one that builds both law enforcement capacity and a digital domain consistent with Indian values. Protecting vulnerable users demands collaborative effort among platforms, regulators and civil society to develop regulatory frameworks that safeguard users while preserving the benefits of digital communication.
Frequently Asked Questions
- Are social media platforms fully responsible for all content their users post?
Platforms enjoy "safe harbour" immunity under Section 79 because they act as intermediaries, but this protection lapses if they fail to promptly remove illegal content upon acquiring "actual knowledge" through court orders or government notifications. Recent decisions hold that user reports of manifestly illegal content can also constitute "actual knowledge" for takedown purposes.
- Can the Indian government block social media networks that persistently break Indian content regulations?
Yes. Under Section 69A of the IT Act, the government can direct intermediaries to block public access to information in the interests of public security, public order or crime prevention. The Supreme Court upheld this provision as constitutional in Shreya Singhal v. Union of India (2015).
- Do end-to-end encrypted services such as Telegram face different obligations from other platforms?
While end-to-end encryption creates challenges for content monitoring, platforms still have an obligation to comply with Indian law. Under the Antony Clement Rubin judgment, encryption is not a defence against content moderation obligations.
- What legal consequences do non-compliant platforms face?
Non-compliance with the IT Rules 2021 strips platforms of safe harbour under Section 79 of the IT Act, exposing them to liability for hosted content and their representatives to criminal prosecution under the IT Act and IPC provisions. Courts have also begun imposing significant monetary penalties.
- How can users report unlawful content on these platforms?
Users have three reporting channels: in-platform reporting tools, the Grievance Officers designated under the IT Rules 2021, and the National Cyber Crime Reporting Portal at cybercrime.gov.in. Serious violations involving CSAM or non-consensual intimate imagery can also be reported to specialized agencies.