He suggests self-regulatory mechanisms for intermediaries (social media platforms) and discusses the government's role in censorship. Edited excerpts from an interview:
How important is it to regulate social media platforms?
There is undoubtedly a need to regulate social media platforms in some way, given the power of these platforms, the increasingly important role that they play in society, and the harms that can occur in the digital ecosystem. However, we should be careful to avoid over-regulation, and particularly knee-jerk criminalisation, or methods that could significantly impinge on the architectural or legal structures that have made the Internet a haven for the exchange of ideas on a global scale. Evidence-based, proportionate and targeted regulation is the need of the hour.
Do you think governments globally have kept pace with digital technology and regulation?
It is incredibly difficult for the law to keep pace with the development of technology, and particularly to predict the ways in which technology will interact with society. The digital ecosystem has made it necessary either to rethink issues that we took for granted or to approach them in new ways. Jurisdictions around the world are trying to develop methods to deal with new problems posed by technologies. Europe is one of the leading jurisdictions in this respect – it has recently put in place the General Data Protection Regulation, which sets a fairly high standard for data protection. In India, our IT Act is nearly 20 years old and is arguably no longer sufficient to deal with the present digital ecosystem, whether in terms of the scope and nature of offences in the law, the provisions permitting surveillance and censorship by the State, or even sections such as those pertaining to intermediary liability.
What’s your view on India’s draft social media guidelines?
The draft Information Technology (Intermediary Guidelines) Rules of December 2018 are ill-thought-out, disproportionate and overly broad. Accordingly, the government faced significant pushback from civil society and industry when they were released for public consultation.
One of the primary problems with the draft Rules is that while they are intended to target certain specific types of social media companies, they apply broadly to all intermediaries – ranging from social media companies to telecom service providers and content delivery networks. Imposing similar obligations on all intermediaries makes little sense.
The draft Rules also go against the existing dicta of the Supreme Court in Shreya Singhal, in terms of how they oblige platforms to take down content on request from the public, and in terms of proscribing a long list of content – some of which (such as blasphemy) is not actually illegal in India.
The provisions obliging intermediaries to use automated tools to filter content and the obligation to trace and identify users are particularly problematic. These seek to implement substantive obligations which are not contemplated under the IT Act itself, and which could seriously affect civil liberties (speech and privacy rights in particular).
Would regulation amount to impinging on the right to freedom of speech? Or would it encourage censorship, particularly political censorship?
Excessive regulation or poorly designed interventions could certainly lead to over-censorship. This could occur through direct censorship by the government, or indeed if safe harbour protections were removed altogether, as this would give companies an incentive to censor content so as to avoid lawsuits.
While there are issues with existing self-regulatory mechanisms followed by digital platforms, the problems with the government acting as censor are also clear – particularly in a country with relatively low rule-of-law standards and low state capacity. We have seen numerous instances of overly broad or arbitrary censorship of the Internet in India by the State. The key is trying to figure out what exactly you want to achieve through regulation – what is the specific problem or market failure you are targeting? – and then putting in place the least intrusive measures to achieve that aim in a proportionate manner.
Do you think that social media platforms currently do not take responsibility for content shared on their platforms even as most of them claim to have hired fact-checkers?
Section 79 of the IT Act does not require social media platforms to take responsibility for third-party content shared on their platforms. Changing this system to mandate content removal and censorship by intermediaries themselves would be unfair and disproportionate. The Communications Decency Act in the US establishes a self-regulatory system for platforms. Accordingly, platforms are supposed to police the content shared on their platforms – though they continue not to be responsible for any content shared by third parties.
One of the big issues in this respect relates to the consistency, transparency and accountability of platforms in implementing self-regulatory processes to moderate content. Over the last few years, there have been numerous cases of arbitrary or inconsistent censorship by many of the biggest platforms.
Due to global pressure on platforms, some have proposed new ways to ensure greater transparency and accountability in their practices. For instance, Facebook has begun the process of establishing an oversight board which will independently review content moderation decisions. While such systems may also have problems – such as questions of legitimacy and the degree of independence – this is an interesting attempt to avoid excessive state regulation, and something worth keeping an eye on going forward.
Do you see any structural issues with these technology platforms and digital technology as a whole?
There are numerous structural issues, both with the digital economy and in the context of how society interacts with new technologies. As far as the platform economy is concerned, one of the biggest problems is the centralisation of power in the hands of a few technology companies, caused by a number of factors ranging from network effects to the economies of scale in processing data. The Internet is no longer as "democratic" as it was originally envisioned, given the monopolies that many platforms enjoy and their consequent control over numerous aspects of our lives. In addition, most technology platforms are built on surveillance-based models. We are yet to properly figure out ways to address this, given our addiction to free content and given that privacy harms can often be amorphous or long-term.
Should intermediaries (social media companies) be treated as platforms or publishers?
Intermediaries – which by definition refers to all the mediating entities that bring us the Internet, ranging from cyber cafes to social media platforms – should continue to face liability based on their specific functionality and the role that they play in the digital ecosystem. The current law, which casts obligations on them only if they play an active part in the commission of an offence, should stay in place.
But at the same time, it may not be appropriate to give platforms complete freedom to do as they wish, particularly given the structural problems in the digital economy. In addition to putting in place relevant norms to deal with broader issues such as privacy and competition law, it may be useful, as far as content moderation practices go, to think about establishing procedural norms for platforms to follow.