Countries have grappled with the issue of regulating content hosted by internet intermediaries. As the internet allows anyone the freedom to host content without moderation, intermediaries were granted protection from liability for third-party content through laws such as Section 230 of the Communications Decency Act in the US and the safe harbour provisions in the EU, with certain exceptions for illegal content.

Section 79 of the Information Technology (IT) Act in India likewise exempted intermediaries from liability for third-party content, provided they observed certain due diligence. Content could be removed only on the orders of a court, or of an authorised government agency, subject to the conditions laid down by the Supreme Court in the 2015 ‘Shreya Singhal vs Union of India’ case.

This classical interpretation of the role of intermediaries worked satisfactorily for several years, as the services they provided were predominantly passive in nature. However, the enormous growth of social media over the last decade has made the limitations of this framework starkly evident, as platforms have been unable to check the proliferation of fake news and of illegal and harmful content. The spread of fake accounts and bots has only aggravated the problem. Several countries, including Germany, France, Australia and Singapore, have enacted legislation to deal with unlawful and harmful content on these platforms.

The new Information Technology (Guidelines for Intermediaries and Digital Media Ethics Code) Rules must be seen in the context of the need to make these platforms more responsible and accountable. The rules specify certain due diligence obligations and institute a mechanism for the redressal of grievances. The due diligence includes informing users about the privacy policy and a user agreement not to host any unlawful or harmful content.
The rules envisage removal of content in only three situations: voluntary removal for violation of the privacy policy or user agreement; pursuant to an order by a court or an authorised government agency; or based on grievances received.

The rules also specify additional due diligence to be observed by ‘significant social media intermediaries’, defined by the number of registered users in India (currently specified as 50 lakh). This includes the appointment of a chief compliance officer, a nodal contact person and a resident grievance officer, all of whom must be resident in India. The intermediary must also have a physical contact address in India.

The rules further require the provision of information about the first originator in India of any unlawful message, for the purposes of investigating specified offences punishable with imprisonment of not less than five years. It must be noted that the intermediary is not required to disclose the contents of the message itself.

The digital media ethics code under these rules creates a largely self-regulatory framework for publishers of online news and current affairs, and of online curated content on over-the-top (OTT) platforms. The oversight mechanism of GoI comes into play only after the redressal mechanism at the first two levels has failed to address a grievance satisfactorily. Incidentally, the exemptions to intermediaries under Section 79 remain available, provided they observe the due diligence as specified.

Freedom of expression must come with adequate responsibility and accountability. The 19th-century philosopher John Stuart Mill explicitly recognised the ‘harm principle’ while arguing for placing some limitations on free expression. The new rules seek to strike a fine balance between freedom and responsibility in the online world.

The writer is additional secretary, ministry of electronics and information technology (MeitY).