CENTRE MANDATES LABELLING OF PHOTOREALISTIC AI CONTENT
Why in the News?
- Rule Amendment: The Union Government has notified amendments to the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021, mandating the labelling of photorealistic AI-generated content.
- Compliance Deadline: The new provisions take effect from February 20, shortening takedown timelines and warning intermediaries that non-compliance may cost them safe harbour protection.

AI CONTENT LABELLING AND PLATFORM OBLIGATIONS
- Legal Mandate: Platforms must ensure prominent labelling of synthetically generated audio-visual content that appears indistinguishable from real persons or events.
- Definition Scope: The amendment defines synthetically generated content as algorithmically created or modified media that could mislead users about its authenticity.
- User Disclosure: Intermediaries must seek declarations from users uploading AI-generated content and act where such disclosure is absent.
- Proactive Responsibility: In cases of non-consensual deepfakes, platforms must either label the content or remove it promptly.
- Due Diligence Standard: Failure to act on flagged AI-generated misinformation may result in loss of safe harbour protection under Section 79 of the Information Technology Act, 2000.

SHORTENED TAKEDOWN TIMELINES AND ENFORCEMENT
- Accelerated Removal: Content declared illegal by courts or the government must be removed within three hours, a significant reduction from the earlier compliance window.
- Sensitive Content: Non-consensual nudity and deepfake material must be taken down within two hours, reflecting heightened regulatory urgency.
- Administrative Flexibility: States may designate multiple authorised officers to issue takedown orders, improving administrative responsiveness.
- Intermediary Liability: Non-compliance may expose platforms to publisher-like liability, stripping them of the immunity granted by safe harbour provisions.
- Digital Governance Shift: The amendments signal stronger algorithmic accountability, tighter platform regulation, and more proactive digital content moderation.

INFORMATION TECHNOLOGY ACT AND SAFE HARBOUR DOCTRINE
- Statutory Basis: The Information Technology Act, 2000 governs digital platforms, with Section 79 granting intermediaries conditional safe harbour protection.
- Due Diligence Norms: Intermediaries must exercise due diligence, remove unlawful content upon notice, and comply with government directions.
- Safe Harbour Principle: Platforms are shielded from liability for user-generated content unless they fail to act upon knowledge of its illegality.
- Regulatory Evolution: The IT Rules, 2021 and subsequent amendments reflect expanding state oversight of digital intermediaries.
- UPSC Relevance: The issue links to GS Paper II and III, covering governance, cyber regulation, digital rights, and technology policy.