
AI FOR SAFER CHILDREN

The Optional Protocol to the Convention on the Rights of the Child on the sale of children, child prostitution and child pornography, which entered into force in 2002, defines child sexual abuse material (CSAM), historically termed “child pornography”, as “any representation, by whatever means, of a child engaged in real or simulated explicit sexual activities or any representation of the sexual parts of a child for primarily sexual purposes” (article 2(c)). In recent years there has been a marked global increase in online CSAM. Despite the profound benefits that come with new technologies, the internet and global connectivity in general, these developments have also played a central role in facilitating the growth of online CSAM, leaving today’s children more vulnerable than ever before.

The United Nations Children’s Fund (UNICEF) has noted that approximately 1.8 billion photos are uploaded to the internet every day, of which some 720,000 are believed to be illegal images of children. According to the United States National Center for Missing and Exploited Children (NCMEC), the number of reported CSAM files has exploded from 450,000 in 2004 to more than 45 million in 2018. At the same time, the number of reports of URLs containing CSAM has increased from only 3,000 in 1998 to 18.4 million today.

The ongoing COVID-19 pandemic has further heightened the vulnerability of minors in this regard, as children find themselves spending increased amounts of time at home and online. The perpetrators of such crimes against children have likewise been confined to their homes and connected during the pandemic. Illustrating this correlation, INTERPOL noted in its September 2020 report on the COVID-19 impact on threats and trends in child sexual exploitation and abuse that “information from multiple sources including INTERPOL member countries indicate a significant increase in the sharing of online CSAM through the use of peer-to-peer networks during the COVID-19 pandemic”.

In November 2020, UNICRI, through its Centre for Artificial Intelligence and Robotics, and the Ministry of Interior of the United Arab Emirates launched the ‘Artificial Intelligence for Safer Children’ Initiative. This new initiative seeks to combat online CSAM by jointly exploring new technological solutions, specifically artificial intelligence (AI) and machine learning, with law enforcement agencies, which play a central role in investigating and prosecuting online CSAM.

The potential of AI to safeguard children has, in fact, already been demonstrated. For instance, Spotlight, an online search platform used by law enforcement agencies in the US and Canada, analyzes web traffic related to sex advertisements on escort sites to identify victims and perpetrators and to monitor potential trafficking networks. This AI-based tool has helped to identify 14,874 child victims of human trafficking in the past four years. In the UK, SafeToNet has developed a machine learning application that helps parents monitor their children’s activity online, combining speech analysis and nudity detection with a direct warning system designed to respect privacy concerns, helping to protect children from the threats of online CSAM.
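One family of techniques underpinning known-image detection tools of this kind is perceptual hashing, which matches re-encoded or lightly altered copies of previously identified images. The following is a minimal, purely illustrative sketch of the idea using a toy “average hash”; the hash scheme, function names and pixel data are hypothetical examples, not the algorithm used by any specific tool:

```python
# Illustrative sketch only: a toy "average hash" of the kind that underpins
# known-image matching. All names and the 4x4 pixel data are hypothetical.

def average_hash(pixels):
    """One bit per pixel: set when the pixel is brighter than the image mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return [1 if p > mean else 0 for p in flat]

def hamming_distance(h1, h2):
    """Number of differing bits; a small distance suggests the same image."""
    return sum(a != b for a, b in zip(h1, h2))

# Two nearly identical 4x4 grayscale "images" (one pixel changed by re-encoding).
original     = [[10, 200, 30, 220], [15, 210, 25, 215],
                [12, 205, 28, 218], [11, 202, 27, 216]]
recompressed = [[10, 200, 30, 220], [15, 210, 25, 215],
                [12, 205, 29, 218], [11, 202, 27, 216]]

d = hamming_distance(average_hash(original), average_hash(recompressed))
print(d)  # a small distance flags a probable match despite re-encoding
```

Because the hash depends only on coarse brightness structure, the re-encoded copy still matches, which is why such fingerprints tolerate compression and resizing where exact cryptographic hashes would not.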

In close cooperation with the law enforcement community, AI for Safer Children will take practical steps to facilitate the prevention, detection and, ultimately, the prosecution of the perpetrators behind CSAM. Network building, awareness-raising and advocacy, as well as the exploration of practical tools to help law enforcement leverage AI against online CSAM, will be the focus of the AI for Safer Children Initiative. A key consideration will also be to build trust in law enforcement’s use of AI and to identify and navigate the red line between the need to ensure the safety of our children and the use of potentially invasive technologies. All of these considerations are essential to combating online CSAM today and to creating a safer world for our children.