The importance of staff welfare in the fight against online CSAM

INHOPE hotlines receive reports from the general public about potentially illegal photos and videos containing child sexual abuse material (CSAM). In order to take the necessary actions to remove CSAM from the internet, the content needs to be reviewed by a hotline analyst. However, viewing this content can be very harmful and disruptive to an individual's mental health. Therefore, INHOPE's network of hotlines created a best practice guide on staff welfare and recently organised a panel discussion in which hotlines outlined their effective methods for safeguarding the wellbeing of analysts.

2019-03-28 INHOPE awareness, hotlines
Colleagues from INHOPE hotlines share good practices on staff welfare
Jennifer Lopes and Andreas Hautz from the German hotline shared that, at their organisation, hotline work is carried out in shifts. Analysts assess content for specific periods of time, agreed upon internally as a team. Hotline analysts work closely together and never work alone, and they are encouraged to speak about their feelings. If the content being assessed has an undue impact on an analyst, the team acts as a backup and the analyst has the option not to work with content for that day.
A similar practice is also in place at the Polish hotline, NASK. "We have rules on the maximum number of reports that can be assessed daily by content analysts, as anything over that number can be very harmful to the individual. Analysts also have the freedom to stop assessing or to take a break for a few days whenever they deem it necessary," said Martyna Różycka, NASK's hotline manager. She adds, "We encourage our analysts to talk about how they feel, and often it's not very easy to discuss this openly. Thus we use different methods to enable them to express their feelings, for example dialogue cards (cards from the Dixit game)."
The French hotline, Point de Contact, recently published a White Paper on staff welfare. Pauline Sêtre, hotline manager at Point de Contact, said, "Illegal content on the internet is reported by the public to a hotline and then by the hotline to LEAs (law enforcement agencies) and ISPs (internet service providers) for investigation and removal respectively. This means that LEA and ISP content analysts are also exposed to illegal content. The question then arises: are all content analysts well protected against it? To address this question, we developed our White Paper: 'Child sexual abuse material and online terrorist propaganda. Tackling illegal content and ensuring staff welfare'. Over the years, Point de Contact has developed a set of staff welfare good practices that we now share with organisations beyond hotlines via this White Paper."
Efforts are currently underway to use technology and artificial intelligence (AI) to train machines to prioritise child sexual abuse material. These efforts are aimed at reducing the human involvement in assessing CSAM. However, while technology will lessen the impact that assessing content has on people, human input will always be necessary, and analysts will therefore continue to be affected by the work they do. INHOPE ensures that its member hotlines protect their employees and continuously update their staff welfare policies to prioritise and safeguard the wellbeing of hotline analysts.
Find out more about the work of INHOPE at
