The Indian Government has notified a new set of rules to regulate social media platforms, messaging services, OTT platforms and news portals.
These regulations are the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 (the "Rules"). The Rules require compliance from tech giants operating in India, such as WhatsApp, Facebook, Twitter, Netflix, Amazon and YouTube.
In India, the growth of digital media platforms has so far been driven largely by a light-touch regulatory framework. Given the mounting concerns about the information and content made available on social media and OTT platforms, comprehensive Government regulation of these platforms, both domestic and foreign, was unavoidable.
For social media platforms like Facebook and Twitter, the Rules focus on issues such as fake news, fake user accounts and the monitoring of illegal content on such platforms.
Heavier compliance obligations are placed on social media platforms with a larger user base. The main provision specific to messaging services is the requirement to identify the first originator of a message in cases of mischief.
Social media platforms with at least 5 million registered users are categorized as significant social media intermediaries and are subject to the most stringent compliance obligations. However, the Government may require any other social media platform to comply with the rules applicable to significant social media intermediaries if that platform's services create a material risk to the sovereignty or integrity of India.
While the actual implementation of all this remains to be seen, at this point it appears that purely as a result of user behaviour, even smaller social media platforms could be subject to stricter compliance under the rules.
Technology-based monitoring of harmful content
In a departure from the previously applicable Information Technology (Intermediaries Guidelines) Rules, 2011 (the 2011 Rules), significant social media intermediaries are now required to implement technology-based measures, including automated tools, to identify information that depicts rape or child sexual abuse or conduct, or information that has previously been removed.
The Rules also require appropriate human oversight and periodic review of such automated tools. This active monitoring by intermediaries dilutes the safe harbour protection they enjoyed under the 2011 Rules.
Compulsory presence in India
Platforms doing business in India must now be established in India.
All significant social media intermediaries are required to appoint:
- Chief Compliance Officer
- Nodal Contact Person
- Resident Grievance Officer
Each of the above must be a resident of India.
The Rules also require significant social media intermediaries to have a physical contact address in India. This compulsory physical presence will have significant ramifications for foreign players in terms of infrastructure, resource deployment and taxation.
However, the lack of a registration or compulsory licensing framework for digital media companies will hopefully continue to attract foreign players’ interest in setting up operations in India.
Identifying the ‘first originator’ of information
Messaging platforms with more than 5 million registered users will be required to identify the first originator of information if directed to do so by a court order or a government order under Section 69 of the IT Act.
Such user identification calls into question the end-to-end encryption offered by services such as WhatsApp, Telegram and Signal, and whether it is practically possible for a platform to identify a user as the "first originator" of mischievous information.
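One approach that has been publicly debated for traceability on encrypted platforms is hash-based message matching: the client attaches a fingerprint (a cryptographic hash) of the plaintext, and the platform later matches a flagged message against stored fingerprints without ever decrypting messages in transit. The minimal sketch below is purely illustrative and assumes a hypothetical `message_fingerprint` helper; it is not how any named platform actually works, but it shows the core tension: identical copies match, while any trivial edit defeats the trace.

```python
import hashlib

def message_fingerprint(plaintext: str) -> str:
    # Hypothetical helper: a SHA-256 digest of the plaintext.
    # Identical messages always produce identical fingerprints, so a platform
    # holding only fingerprints could match a flagged message to the account
    # that first sent it, without reading encrypted traffic.
    return hashlib.sha256(plaintext.encode("utf-8")).hexdigest()

# A forwarded, unmodified copy of a message yields the same fingerprint...
original = message_fingerprint("example forwarded message")
forwarded = message_fingerprint("example forwarded message")
assert original == forwarded

# ...but even a one-character edit produces a completely different digest,
# so reworded, translated or screenshotted copies cannot be traced this way.
edited = message_fingerprint("example forwarded message!")
assert original != edited
```

Even under this scheme, the platform must store fingerprints for every message sent, and the fingerprint itself can leak information about what users are saying, which is why critics argue that any practical "first originator" mechanism weakens the guarantees of end-to-end encryption.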