Recognising child sexual abuse material (CSAM)

With September well underway, many of us are returning to more time spent on laptops and online. The INHOPE network of internet hotlines works hard to remove illegal content and to make sure that you and your children don't come across images and videos of child sexual abuse. To help in this fight, and to provide support in this back-to-school season, INHOPE wants to make sure that everyone knows what child sexual abuse material (CSAM) is and what kind of content should be reported. INHOPE also offers tips on how to prevent photos of children and young people from ending up in the hands of predators.

What is CSAM?

Child sexual abuse material (CSAM) can be described as imagery or video of a person under the age of 18 engaged in or depicted as being engaged in explicit sexual activity.

Understanding the complexities of CSAM

Adult/legal pornographic content is often tagged explicitly with "teen", "barely 18" or similar terms, but on viewing it is clear that the persons shown in the material are 18 years or older. In these cases the content is not CSAM and should not be reported to your national hotline.

If you are in any doubt about any material you find, please report it to your national hotline, where it will be assessed by a professional hotline analyst.

In many countries, images of children who have been instructed to pose in sexualised ways, completely or partially undressed, and images which are focused on children's sexual organs are illegal and should be reported to your national hotline.

Because what is considered CSAM differs from country to country, INHOPE recommends finding out more about what should be reported in your country by visiting your national hotline's website.

If you are still in doubt, then it is always better to report it. Your report can then be checked by a professional who has the opportunity to end the cycle of abuse.

Think before you hashtag

Hashtags such as #cleankids, #splishsplash, #pottytraining and #naptime are being used by sexual predators to find pictures of children, according to a new study by the Child Rescue Coalition.

Hashtagging an image indicates keywords or topics of interest, entering the image into a directory created by the social network and making it discoverable by other users. These hashtags may be used by parents posting entirely innocent photos of their children, or by teenagers posting selfies, but they can also be exploited by people with a sexual interest in children.

Innocent images are often copied, manipulated and misused. As a Dutch analyst explained, some of the CSAM they regularly see includes zoomed-in images of the genitals of children playing on the beach.

Images posted online stay online forever. INHOPE encourages everyone to think before they post. Ask yourself: would your child be happy with this image being online, available to the world, when they are 30 years old?

Check that the privacy settings on all your family members' social media accounts are appropriately configured – ideally at the most restrictive level. And be extremely cautious about applying hashtags to pictures of children, to minimise the chance of those images being found by people with a sexual interest in children.

Help us achieve our vision of an internet free of CSAM and keep your child safe in this back-to-school season. Don't ignore it, report it.

Related news

INHOPE releases 2019 annual report

2019 has been a year of growth in both size and strength for the INHOPE network, which saw the amount of child sexual abuse material (CSAM) processed by hotlines nearly double from 2017 to 2019.

COVID-19 and INHOPE member hotlines

Strong and resilient, INHOPE member hotlines have continued the fight against child sexual abuse material (CSAM) online during the coronavirus pandemic. Like many other sectors, however, COVID-19 has had a significant impact on the work of hotlines. To classify illegal material and send the URLs to law enforcement or a hosting provider, hotlines have agreements with national authorities that allow them to review CSAM reports. To do this in a safe and secure manner, each hotline must have a secure area, available only to authorised staff, and illegal material can only be reviewed on a computer with specific technical and physical security protocols in place. So, what do you do when you can't easily access hotline buildings due to a worldwide pandemic?