The importance of staff welfare in the fight against online CSAM

INHOPE hotlines receive reports from the general public containing potentially illegal photos and videos of child sexual abuse material (CSAM). To take the necessary action to remove CSAM from the internet, the content must be reviewed by a hotline analyst. However, viewing this content can be very harmful and disruptive to an individual's mental health. Therefore, INHOPE's network of hotlines created a best practice guide on staff welfare and recently organised a panel discussion in which hotlines outlined their effective methods for safeguarding the wellbeing of analysts.

Colleagues from INHOPE hotlines share good practices on staff welfare
 
Jennifer Lopes and Andreas Hautz from the German hotline jugendschutz.net shared that, at their organisation, hotline work is carried out in shifts. Analysts assess content for specific periods of time, agreed upon internally as a team. Hotline analysts work closely together and never work alone, and they are encouraged to speak about their feelings. If the content being assessed has an undue impact on an analyst, the team acts as a backup and the analyst has the option not to work with content for that day.
 
A similar practice is noted at the Polish hotline, NASK. "We have rules on the maximum number of reports that can be assessed daily by content analysts, as anything over that number can be very harmful to the individual. Analysts also have the freedom to stop assessing or to take a break for a few days whenever they deem it necessary," said Martyna Różycka, NASK's hotline manager. She adds, "We encourage our analysts to talk about how they feel, and often it's not very easy to discuss openly. Thus we use different methods to enable them to express their feelings, for example dialogue cards (cards from the Dixit game)."
 
The French hotline, Point de Contact, recently published a White Paper on staff welfare. Pauline Sêtre, hotline manager at Point de Contact, said, "Illegal content on the internet is reported by the public to a hotline and then by the hotline to LEAs (law enforcement agencies) and ISPs (internet service providers) for investigation and removal respectively. This implies that LEA and ISP content analysts are also exposed to illegal content. The question then arises: are all content analysts well protected against it? To address this question, we developed our White Paper, ‘Child sexual abuse material and online terrorist propaganda. Tackling illegal content and ensuring staff welfare'. Over the years, Point de Contact has developed a set of staff welfare good practices that we now share with organisations beyond hotlines via this White Paper."
 
Efforts are currently underway to use technology and artificial intelligence (AI) to train machines to prioritise child sexual abuse material. These efforts aim to reduce human involvement in assessing CSAM. However, while technology will lessen the impact that assessing content has on people, human input will always be necessary, and thus analysts will continue to be affected by the work they do. INHOPE ensures that its member hotlines protect their employees and continuously update their staff welfare policies to prioritise and ensure the wellbeing of hotline analysts.
 
Find out more about the work of INHOPE at www.inhope.org.

Related news

BeSafeOnline campaign by the Finnish hotline

  • Hotlines
  • 15/01/2019
  • Nettivihje, the Finnish hotline

Nettivihje, the hotline of the Finnish Safer Internet Centre (SIC) joined forces with the National Bureau of Investigation to create the successful #BeSafeOnline campaign, which emphasised the importance of adopting digital safety skills as part of everyday activities. 

Child sexual abuse material: the journey of a report

INHOPE and its extensive network of member hotlines work to eliminate online child sexual abuse material (CSAM). During Safer Internet Forum (SIF) 2018, Fred Langford, Deputy CEO of the Internet Watch Foundation (IWF) and INHOPE President, and Peter-Paul Urlaub, consultant at eco complaints office and INHOPE Board Member, gave a presentation on the journey of a report on CSAM.

Creative learning with "Medien in die Schule"

The FSM (Association for Voluntary Self-Regulation of Digital Media Service Providers) is a publicly accredited self-regulatory body in the field of online child protection in Germany. The association is dedicated to protecting minors from harmful media, in particular by fighting illegal, youth-endangering and development-impairing content in online media. To this end, the FSM operates an internet hotline to which anyone can turn, free of charge, to report online content harmful to minors. The FSM hotline is one of the founding hotlines of the INHOPE network. Extensive educational work and the promotion of media literacy among children, young people and adults are further tasks of the FSM.