Making cyberspace safe for kids

Sunday, 29 October 2023 | Archana Jyoti | New Delhi

The National Human Rights Commission (NHRC) has proposed a series of measures aimed at making cyberspace safer for children and cracking down on perpetrators. These measures include replacing the term “child pornography” in the POCSO Act with “Child Sexual Abuse Material” (CSAM), clarifying the definition of “sexually explicit” under the IT Act, and ensuring the removal of online CSAM.

The NHRC has also made a strong case for reviewing the existing punishment for offences pertaining to online CSAM and for giving any revised provisions legislative cover.

The top rights body’s move follows concerns over the National Center for Missing & Exploited Children’s (NCMEC) “CyberTipline Report 2022”, which showed that of the 32 million reports the US-based NCMEC received from across countries, 5.6 million pertained to CSAM uploaded by perpetrators based in India.

Also, 1,505 instances of publishing, storing and transmitting CSAM under provisions of the IT Act and the POCSO Act were reported in 2021. Alarmed by this troubling trend, 71 Governments joined forces at the UN in June to endorse a call-to-action statement, urging swifter and more comprehensive measures to eliminate known instances of child sexual abuse materials from the internet.

The call-to-action statement emphasises a growing global consensus that more needs to be done, and with greater urgency, to keep children safe across communities and online all over the world.

In line with the UN call, the NHRC has also stated that the IT Act should explicitly cover Virtual Private Network (VPN) service providers, Virtual Private Servers (VPS) and cloud service providers to avoid ambiguity and reinforce compliance with the CSAM-related provisions of the Act.

It also recommended that the national database on sex offenders be expanded to include CSAM offenders convicted under the IT Act and the POCSO Act.

“Considering the speed of circulation of online CSAM, the time taken for removal of content by intermediaries after getting information from authorised agencies should not be more than six hours, as against 36 hours under the Intermediary Guidelines, 2021,” the advisory said.

It further suggests setting up at least one specialised police unit in every State and UT and a specialised Central police unit under the Central Government to deal with CSAM-related matters. Indicating that funds should not be an excuse to delay action in the matter, the NHRC called upon the Centre to assist in setting up and equipping these units, for instance, through grants under the Modernisation of State Police Forces (MPF) Scheme, the Police Technology Mission and the Nirbhaya Fund.

In an elaborate advisory comprising four parts, the top human rights body said terms like “use of children in pornographic performances and materials”, “child sexual abuse material” and “child sexual exploitation material” are to be preferred over “child pornography”.

The term “sexually explicit” needs to be defined under Section 67B of the IT Act, 2000 to ensure the prompt identification and removal of online CSAM, the rights panel has said in the advisory.

Considering the gravity of the offence, the current quantum of punishment for offences pertaining to online CSAM under Section 14 of the POCSO Act and Section 67B of the IT Act (seven years or less) “may be relooked at, or exempted from the application of Section 41A CrPC, by making appropriate legislative changes,” it said.

The advisory further stresses that intermediaries, including social media platforms, OTT applications and Cloud Service Providers, must deploy technology, including content moderation algorithms, to proactively detect CSAM on their platforms and remove the same.

“Similarly, platforms using end-to-end encryption services may be mandated to devise additional protocols/ technology to monitor the circulation of CSAM. Failure to do so invites withdrawal of the ‘safe harbour’ clause under Section 79, IT Act, 2000. ISPs, web browsers and OTT players to ensure that pop-up warning messages are displayed for searches related to CSAM,” the NHRC has recommended.

As per the advisory, a national database of CSAM containing hash values of known CSAM should be created and maintained by the proposed specialised Central police unit, so that intermediaries can block the flagged content.
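The hash-matching approach described here can be sketched in a few lines. This is a minimal illustration only, assuming a set of cryptographic (SHA-256) hashes distributed by a central authority; real-world systems typically rely on perceptual hashing (for example, Microsoft's PhotoDNA) so that matches survive re-encoding and resizing. All names and hash entries below are hypothetical.

```python
import hashlib

# Hypothetical hash database, as might be distributed by the proposed
# central police unit to intermediaries. Entries are illustrative.
known_hashes = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def sha256_of(data: bytes) -> str:
    """Compute the SHA-256 hex digest of an uploaded file's bytes."""
    return hashlib.sha256(data).hexdigest()

def should_block(data: bytes) -> bool:
    """Return True if the upload matches a known hash and must be blocked."""
    return sha256_of(data) in known_hashes
```

In practice an intermediary would run such a check at upload time, before content is published, and report matches to the authorities rather than merely rejecting the file.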

“Survivors of CSAM should be provided support services and opportunities for rehabilitation through various means, like partnerships with civil society and other stakeholders. Psycho-social care centres may be established in every district to facilitate need-based support services and organisation of stigma eradication programmes,” says the NHRC advisory.
