Is India Alone in Regulating Social Media? No! It Doesn't Affect Freedom of Speech
Intermediary Guidelines, not a violation of Freedom of Speech

The Government of India laid down guidelines for intermediaries operating in India and asked all of them to comply by May 2021. All barring @Twitter have complied. But does that amount to curbing freedom of speech?

In a white paper on “online harms” released in 2019, the British Government set out a proposal for new regulations requiring companies to ensure users were safe online and illegal content was swiftly dealt with.

The paper suggests creating a “statutory duty of care” for companies with their users, with an independent regulator set up to oversee the new rules.

The regulator would have a wide range of enforcement powers, potentially including the ability to issue “substantial fines”, block non-compliant services, and hold individual members of a company’s senior management liable for any breaches.

While a number of countries have started to plan or take action, as New Zealand Prime Minister Jacinda Ardern noted, “none are exactly the same”.

So what can we learn from the work done by other nations so far?

One clear lesson: appoint an independent regulator with a wide range of enforcement powers, potentially including the ability to issue substantial fines, block non-compliant services, and hold individual members of a company’s senior management liable for breaches.

The UK’s proposal comes on the heels of draft regulations released by the European Union in September last year, which would require internet companies to remove or disable illegal terrorist content within an hour and deploy filters to ensure it is not re-uploaded.

The definition of terrorist content is deliberately wide, including information used “to incite and glorify the commission of terrorist offences, encouraging the contribution to and providing instructions for committing terrorist offences as well as promoting participation in terrorist groups”.

EU states would be required to designate a “competent authority” to issue removal orders to companies and impose penalties for those who failed to comply.

While that legislation is still working its way through the European Parliament, Australia has already passed its Sharing of Abhorrent Violent Material Act, drafted following the Christchurch attack. Australia’s Parliament moved quickly to legislate against violent online content in the wake of the attack, but that pace has led to criticism about the effectiveness and unintended consequences of the law.

The law requires service providers and hosting services to “remove abhorrent violent material expeditiously”. Exactly what counts as expeditious is not defined, although Australian Attorney-General Christian Porter suggested “well over an hour” would be unacceptable, with penalties of up to AU$2.1 million or three years’ imprisonment for individuals and up to AU$10.5 million or 10 percent of annual turnover for corporations.

The rushed legislation attracted the attention of two United Nations human rights experts, who wrote to the Australian Government raising concerns about its impact on freedom of expression and the lack of consultation with the public and civil society.

“We are mindful that depictions of egregious violence on the internet provoke legitimate concerns about public order and safety…

“However, it is precisely the gravity of these matters and their potential impact on freedom of expression that demand a thorough and comprehensive review,” the pair wrote.

The Australian Law Council raised concerns about potential unintended consequences, with council president Arthur Moses saying the law could stop whistleblowers from using social media to “shine a light on atrocities being committed around the world”.

Germany leading on social media regulation

Germany is perhaps the most advanced country when it comes to social media regulation – something acknowledged by Ardern, who said its actions appeared to have caused Facebook to change its staffing levels.

The European heavyweight, historically strong on hate speech issues given its Nazi past, created a “NetzDG” law – also known as the Facebook Act – requiring “manifestly unlawful” posts such as hate speech to be removed from social media platforms within 24 hours or incur fines of up to 50 million euros.

Less blatant violations must be reviewed within seven days by the social networks, which must also provide six-monthly reports on the complaints they have received and how they were addressed.
