Forensic Investigation | 05.03.24
Fighting the Growing Threat of Child Sexual Abuse Material

Elena Colotti

Ensuring the safety and protection of children is a collective responsibility that requires a unified global response. With the rise of Child Sexual Abuse Material (CSAM) online, everyone – individuals, tech companies, parents, online platforms, law enforcement agencies, and educators – can do their part to stop it from spreading.

CSAM: THE ESCALATING NUMBERS

The proliferation of CSAM is evident in the alarming increase in reports to the National Center for Missing & Exploited Children (NCMEC) over the years.

Today, the number of files relating to child sexual abuse contained in NCMEC reports every year has surpassed 88 million.

In 1998, there were just over 3,000 NCMEC reports of child sexual abuse imagery. Over two decades later, yearly reports had soared past 18.4 million, containing millions of images and videos flagged as child sexual abuse. By 2022, the number of reports had grown to 32.1 million, over 99.5% of which concerned incidents of suspected CSAM. Together, these reports contained 88 million individual files relating to child sexual abuse, of which 56% were images and 43% were videos.

Geographical Dynamics

According to the Internet Watch Foundation,

Europe remains the largest source of CSAM hosted online, accounting for 66% of the global total,

followed by Asia at 18% and North America at 16%. The Netherlands alone documented 56,000 referrals in 2022 (AviaTor report 2022), a significant increase from 31,000 the year before, underscoring the ever-growing need for extra effort not only from international but also from national and local law enforcement.

Impact on CSAM Survivors

The impact of CSAM on survivors is deeply concerning. Horrifically, according to the International Survivors’ Survey, for 36% of survivors the abuse continued into adulthood. Survivors frequently spoke of the permanence of the images and videos. Most respondents indicated that they constantly worry about being recognized by someone who has seen material of their abuse, and 30% reported having actually been identified that way.

The Age Factor

When discussing CSAM, children aged 4 to 13 appear to be particularly vulnerable. In a Protect Children study that compiled over 30,000 responses from anonymous offenders, 45% of CSAM users reported searching for material depicting girls aged 4-13, while 18% said they search for material depicting boys of the same age. According to ECPAT, the majority of child sexual exploitation material in 2018 depicted prepubescent children. Similarly, the International Survivors’ Survey reveals that 87% of survivors experienced hands-on abuse before the age of 11. The INHOPE annual report 2022 shows similar indicators: 9 in 10 victims depicted in the processed CSAM reports were prepubescent (aged 3 to 13).

HOW CAN WE ALL HELP?

Addressing the complex issue of CSAM requires a practical approach that involves hotlines, law enforcement, tech companies, hosting platforms, jurisdictions, and individuals.

Facilitate CSAM Reporting

INHOPE leads the fight against CSAM online by connecting and supporting 54 hotlines around the globe. Platforms developed by ZiuZ Forensic Investigation, such as ICCAM and CPORT, secure the exchange of CSAM that needs to be investigated or taken down between jurisdictions, hotline analysts, and law enforcement around the world. INHOPE member hotlines have been successful in prompt content removal, with 67% of illegal content URLs taken down within three days of a Notice and Takedown order.

Anyone who wants to report potential CSAM found online can do so through the Hotlines website.

Implement Deterrence and Perpetration Prevention Measures

The 2024 Protect Children report shows that 77% of respondents have encountered CSAM or links to CSAM somewhere on the surface web: 32% encountered it on a pornography website and 29% on social media platforms. Protect Children offers various suggestions for hosting platforms. Among others, it asks platforms that allow image or video sharing to ban search terms related to child sexual abuse and exploitation, and it strongly recommends displaying warning messages when searches involve these terms.
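
As an illustration of what such a measure could look like in practice, the minimal sketch below checks incoming search queries against a blocklist and returns a deterrence warning instead of results. It is a hypothetical example only: the blocklist entries, warning text, and function names are placeholders, not an actual platform implementation.

```python
# A minimal, hypothetical sketch of a search-term deterrence check of the
# kind Protect Children recommends. The blocklist entries, warning text,
# and function names are placeholders, not a real platform implementation.

# Platforms would source such lists from specialist organisations;
# innocuous placeholders stand in for real terms here.
BLOCKED_TERMS = {"blocked-term-1", "blocked-term-2"}

WARNING_MESSAGE = (
    "Searching for child sexual abuse material is illegal. "
    "Anonymous help and support services are available."
)


def handle_search(query: str) -> str:
    """Return results normally, or a deterrence warning for banned terms."""
    normalized = query.casefold()
    if any(term in normalized for term in BLOCKED_TERMS):
        # Suppress results entirely and surface the warning instead.
        return WARNING_MESSAGE
    return f"Showing results for: {query}"


if __name__ == "__main__":
    print(handle_search("blocked-term-1 example"))  # prints the warning
```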


Detection and Identification

The digital forensic investigation process is often lengthy and complex. Cutting-edge software tools developed by ZiuZ Forensic Investigation, such as AviaTor and Fenoz, are making significant strides in helping law enforcement agencies find perpetrators as quickly as possible. AviaTor prioritizes incoming NCMEC reports, allowing analysts to focus on identifying perpetrators and saving victims. Meanwhile, Fenoz employs filters and AI classifiers to analyze and detect vast volumes of images and videos, addressing the challenge of both known and unknown CSAM.
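
To illustrate the "known versus unknown" distinction, the generic sketch below shows the technique commonly behind known-material detection: hashing incoming files and comparing the digests against a database of previously verified material. This is an assumption-laden illustration, not Fenoz's implementation; production systems typically use perceptual hashes, which survive re-encoding, rather than the plain SHA-256 used here, and the hash set shown is a placeholder.

```python
# A generic sketch of "known material" detection: compare file hashes
# against a database of hashes of previously verified CSAM. This is an
# illustration of the technique only, not Fenoz's implementation; real
# systems typically use perceptual hashes, which survive re-encoding,
# rather than the plain SHA-256 shown here.
import hashlib
from pathlib import Path

# Hypothetical hash set; in practice such sets are supplied by bodies
# like NCMEC or INHOPE member hotlines.
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}


def sha256_of(path: Path) -> str:
    """Compute a file's SHA-256 digest, reading in chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()


def triage(paths: list[Path]) -> tuple[list[Path], list[Path]]:
    """Split files into known matches and unknowns needing classification."""
    known, unknown = [], []
    for path in paths:
        (known if sha256_of(path) in KNOWN_HASHES else unknown).append(path)
    return known, unknown
```

Files that land in the "unknown" bucket are where AI classifiers come in, flagging likely new material for human review; that is the second half of the challenge described above.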


Other recommendations on stopping the spread of CSAM can be found here.
