It is critical to the work of the hotlines that members of the public who stumble upon illegal content report it rather than ignore it. The consequences of not reporting are serious and fall on victims: CSAM that goes unreported remains on the internet and is not taken down, which means that every time the material is viewed, anywhere in the world, the victim depicted is re-victimised. Indeed, survivors of recorded child sexual abuse say that knowing it is online for anyone to see continues to affect their lives for many years after the abuse itself has stopped. Reporting illegal content is therefore vital both in sparing survivors of child sexual abuse this repeated trauma and in keeping the internet safe for all legitimate users.
Members of the public can make a report to the hotline in their country of residence, or visit the INHOPE website for a list of hotlines to which they can submit a report.
"Most hotlines ask a standardised set of questions when illegal content is reported. When a report is submitted to eco (the German hotline), the reporter needs to provide only the location of the content, for example a URL. Additionally, it helps if they can provide general information such as a description of what they saw, the reason for reporting it and, of course, how they came across the content. This information is vital for the analysts at each hotline," said Peter-Paul Urlaub, a consultant at eco's complaints office and INHOPE Board Member.
Once a hotline receives a report, several steps are followed before the CSAM is removed from the internet. The hotline assesses the report and identifies the country in which the content is hosted. It then informs the national law enforcement agency (LEA) and the internet service provider (ISP). The information is also entered into ICCAM (I see Child Abuse Material), a secure software solution used by INHOPE hotline members and INTERPOL to collect, exchange and categorise reports of child sexual abuse material. Within ICCAM, hotlines assess CSAM against several parameters, including the age and gender of the victim. If the material is hosted in a different country from the one in which it was reported, the report is transferred via ICCAM to the hotline in the hosting country for assessment; that hotline takes the appropriate action and notifies its LEA and the relevant ISP accordingly.
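The triage step described above can be sketched as simple routing logic. This is an illustrative sketch only: the function and field names below are hypothetical and do not correspond to any real ICCAM or hotline API.

```python
# Hypothetical sketch of hotline triage: assess where content is hosted,
# then either notify national bodies or forward the report via ICCAM.
from dataclasses import dataclass

@dataclass
class Report:
    url: str
    hosting_country: str  # determined during assessment (e.g. via an IP lookup)

def triage(report: Report, home_country: str) -> str:
    """Return the next action for a report received by a hotline."""
    if report.hosting_country == home_country:
        # Hosted domestically: notify the national LEA and the hosting ISP.
        return "notify_national_lea_and_isp"
    # Hosted abroad: transfer to the hotline in the hosting country via ICCAM.
    return "transfer_via_iccam"
```

In either branch the report is also entered into ICCAM, so the network keeps a shared record regardless of where the content is hosted.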
In 2017, the INHOPE network of hotlines identified over 259,000 images and videos. This alarming and growing number calls for more robust action from stakeholders across the world. Hotlines in Canada (Cybertip.ca) and the UK (IWF) have deployed web crawlers that use hashing technology to scour the internet for known CSAM. Large companies such as Facebook, Google and Twitter also use their own hashing technology to combat illegal content on their networks.
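The core idea behind hash-based detection is to compare a file's fingerprint against a database of hashes of previously identified material. A minimal sketch, assuming a pre-built set of known hashes; note that production systems such as Microsoft's PhotoDNA use perceptual hashes that tolerate resizing and re-encoding, whereas the plain SHA-256 shown here only matches byte-identical files:

```python
import hashlib

# Hypothetical database of hex digests of previously categorised files.
# (This example entry is simply the SHA-256 of the bytes b"test".)
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def sha256_of(data: bytes) -> str:
    """Compute the SHA-256 hex digest of a file's bytes."""
    return hashlib.sha256(data).hexdigest()

def is_known(data: bytes) -> bool:
    """True if this exact file has been seen and categorised before."""
    return sha256_of(data) in KNOWN_HASHES
```

Matching against hashes rather than the images themselves means crawlers and platforms can flag known content without analysts having to view it again.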
"The internet is continually changing and it is important to stay ahead of the curve. Web crawling has been implemented to quickly rid the web of known illegal content. However, something new is always going to come around the corner and we need to adapt quickly to continue the fight against illegal content being distributed. In 2017, 82 per cent of the content identified by the INHOPE network depicted girls and the victims are getting younger" said Fred Langford, Deputy CEO of the Internet Watch Foundation and President of INHOPE.
Combating CSAM requires action from everyone, including the general public. So report it, don't ignore it!
For more information on the work of hotlines, visit the INHOPE website and the "Hotlines" section of the BIK portal.