Report Harmful Content releases annual report

  • Helplines
  • 02/06/2020
  • UK Safer Internet Centre

Report Harmful Content (RHC) has just released its annual report on harmful content online. The report presents the results of mixed-methods research carried out on all cases dealt with in its first year of operation (January 2019 – December 2019). In the year analysed, the RHC website received 9,282 visitors and practitioners dealt with 164 unique cases. The service's popularity rapidly increased in September, following the official service launch, and continued to grow until the end of the year.

Image of laptop, tablet and smartphone showing Report Harmful Content

© Report Harmful Content

What was found?

Cases involving bullying and harassment were most common, followed by impersonation, abuse and threats. RHC found that online harassment and abuse disproportionately affected women and were often perpetrated by ex-partners.

Three common trends were identified:

  • A combination of impersonation, bullying and harassment and privacy violation. This trend disproportionately affected women and intersected with offline domestic violence and coercive control.
  • A combination of abuse, threats and hate speech. Within this, the most common type of hate speech reported was racism/xenophobia.
  • Clients inadvertently viewing harmful content (such as violence or pornography) rather than being victim to or witnessing targeted, harmful behaviour.

Strengths of the service

  • In most instances, practitioners were able to directly assist clients in reporting harmful content online.
  • In the remaining cases, content was either deemed criminal or was hosted on platforms with which RHC does not have partnerships. In these instances, practitioners provided advice and onward signposting.
  • Of the content escalated to industry, 92 per cent was successfully actioned (content removed or restricted, or account access regained), with 62 per cent actioned within 72 hours, demonstrating a high level of service speed and efficiency.
  • The service offered vital emotional support, alongside signposting to other agencies and services, either for additional emotional support or practical assistance.
  • The report identifies multiple ways in which the RHC service can be developed so as to respond to the growth and diversification of the types of reports received.

Emerging issues for the service

  • Law enforcement action – 19 per cent of RHC clients reported content which was deemed to be criminal and thus referred to law enforcement. Of that 19 per cent, however, 47 per cent got back in touch with RHC, often reporting that the police had dismissed them and incorrectly informed them that their issue was non-criminal.
  • Inconsistency – Responses from industry platforms often showed a lack of clarity around what type of content would be removed. This commonly occurred in cases involving a clash of characteristics protected under the Equality Act 2010, in particular gender reassignment and sex.
  • Cultural and religious context – RHC dealt with a number of clients from particular cultural and religious backgrounds who reported the exposure of private and/or intimate material. This type of content often did not meet legal or platform thresholds for harmful content and, as such, there were issues in securing its removal and safeguarding clients.
  • Mental health – One of the most significant issues to be identified was the widespread impact of online harms on mental health; 32 per cent of RHC clients reported negative mental health impacts as a result of viewing or being the victim of harmful content online, with 13 per cent reporting suicidal ideation.

This article was originally published on the website of South West Grid for Learning, an organisation from the UK Safer Internet Centre, and is reproduced here with permission. To read the article in its original location, visit the South West Grid for Learning website.

Find out more information about the work of the UK Safer Internet Centre (SIC) generally, including its awareness raising, helpline, hotline and youth participation services, or find similar information for Safer Internet Centres throughout Europe.
