Elon Musk-owned X (formerly Twitter) has challenged the government's use of Section 79(3)(b) of the Information Technology Act, 2000 (IT Act) to moderate and order the removal of content on social media. The company has argued that the government's "misuse" of the provision bypasses the safeguards available under other provisions of the IT Act, namely Section 69A, which are specifically meant for content moderation.

Shreya Singhal & Section 69A

In Shreya Singhal v Union of India (2015), the Supreme Court struck down Section 66A of the IT Act, which criminally punished, among other things, sending false information "for the purpose of causing annoyance or inconvenience". A Bench of Justices R F Nariman and J Chelameswar held that the provision was "unconstitutionally vague", giving the government broad, unchecked powers to restrict the freedom of speech.

After this decision, Section 69A of the IT Act became the primary law governing the matter. This section allows the Centre to issue orders blocking "any information generated, transmitted, received, stored or hosted in any computer resource". Unlike Section 66A, however, it contains safeguards against misuse, as the SC noted in Shreya Singhal.

For blocking content under Section 69A, the Centre must deem it "necessary". This "necessity" is justifiable only on the grounds provided in Article 19(2) of the Constitution, which permits "reasonable restrictions" on the freedom of speech "in the interests of the sovereignty and integrity of India, the security of the State, friendly relations with Foreign States, public order, decency or morality or in relation to contempt of court, defamation or incitement to an offence". The Centre must also record its reasons in the blocking order so that the order can be challenged in court.

Govt's use of Section 79

The SC in Shreya Singhal also clarified the application of another provision, Section 79 of the IT Act. The provision is a "safe harbour" measure that exempts an "intermediary" (such as X) from liability for information published on the platform by a "third party", that is, users of the platform. But Section 79(3)(b) states that the intermediary can be held liable if it does not immediately remove such unlawful information "upon receiving actual knowledge, or on being notified by the appropriate Government or its agency".

The apex court limited the scope of this provision, ruling that the requirement under Section 79(3)(b) will kick in only once a court order has been passed to that effect, or the government issues a notification stating that the content in question relates to the grounds provided in Article 19(2).

But in October 2023, the Ministry of Electronics and Information Technology (MeitY) issued a directive to all ministries, state governments, and the police saying that information blocking orders could be issued under Section 79(3)(b). A year later, in October 2024, MeitY launched a portal called "Sahyog" through which these authorities could issue and upload blocking orders.

What X's challenge says

X's challenge before the Karnataka High Court argues that MeitY's orders are an attempt to "bypass the multiple procedural safeguards" provided under Section 69A. The petition relies on the SC's ruling in Shreya Singhal, and says that content can only be censored through the process given under Section 69A or through a court order. Section 79, X argues, "merely exempts intermediaries from liability for third-party content".
The petition states: "A full 23 years after Section 79 was enacted, and 14 years after the current version went into effect, Respondents (the government) are now attempting to misuse Section 79 to create an unlawful blocking regime without any of the protections".

X approached the Karnataka High Court with the petition on March 17, requesting an interim order against any coercive action. While a single-judge Bench of Justice M Nagaprasanna refused to grant one, it passed an order "reserving liberty to the petitioners to move the court in the event that something is done", and listed the matter for March 27.

X's petition comes at a time when its AI chatbot Grok 3 has been courting controversy for its use of Hindi slang and for responses critical of the government. While X has not received any notice on the issue, the Centre has reportedly been in touch with the company about it.

The Grok controversy introduces a new angle to "safe harbour" provisions like Section 79. While intermediaries like X may not be liable for information published by users, the question of whether X is liable for information published by Grok remains unanswered. Courts will have to determine whether information published by a "third party" includes AI-generated responses.