Blue Whale Challenge

Over the past few months, the media has been regularly reporting on a new and disturbing "online game of death" – the Blue Whale Challenge (BWC). While there is no concrete evidence that such a game exists, the speculation and buzz it has created are worrying. In April 2017, representatives of the European network of Safer Internet Centres (SICs) and other key stakeholders attended a webinar on the issue. An article was also published on the Better Internet for Kids (BIK) public portal at that time. This article provides further information based on a report recently published by jugendschutz.net*, one of the hotlines administered by the German SIC.

Date: 2017-09-06 | Author: German Safer Internet Centre | Section: awareness, helplines, hotlines

What is the Blue Whale Challenge?

The name is supposedly a reference to blue whales that deliberately cast themselves on land to die. The game follows a format where tasks are assigned to players by an administrator over a period of 50 days. The tasks include self-harming, carrying out secret tasks, and watching videos sent to the player, with all this culminating in suicide. All tasks need to be documented by the player and shared online.

Where do children come across this Blue Whale Challenge?

jugendschutz.net carried out a short research study in May 2017 and discovered that the game was easy to find on social media platforms and other popular internet services by searching for relevant hashtags. WhatsApp groups linked to the Blue Whale Challenge were also found. In addition, there is the possibility of peers encouraging one another to get involved in the challenge.

What actions can platform operators take?

A proactive approach by operators can make a real difference. Some steps that can be taken are outlined below:

  • Terms of service and content guidelines should include a clause banning harmful, threatening and abusive content, or content glorifying or trivialising self-harm.
  • Support teams should be made aware of, and specifically trained for, dealing with such phenomena.
  • In situations of imminent danger, all relevant information must be forwarded to law enforcement.
  • Reports from users on risky content should be prioritised and dealt with quickly, ideally in real time. It is also important to prevent content or profiles already removed from being re-uploaded.
  • Posts that trivialise, glorify and promote the challenge, or express interest in participating, should be removed immediately.
  • Relevant hashtags, as well as possible variations (such as spelling changes), should be blocked.
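
The last point – blocking relevant hashtags and their spelling variations – can be sketched in code. The snippet below is a minimal illustration only, not any platform's actual filtering system: the blocklist, the substitution map and the function names are all hypothetical. It normalises a hashtag (lowercasing, stripping diacritics and separators, undoing simple character substitutions) before matching it against a blocklist, so that trivial spelling changes do not evade the filter.

```python
import unicodedata

# Hypothetical blocklist of normalised hashtag stems (illustrative only).
BLOCKED_STEMS = {"bluewhale"}

# Common character substitutions used to evade keyword filters.
LEET_MAP = str.maketrans({"0": "o", "1": "l", "3": "e", "4": "a", "@": "a", "$": "s"})

def normalize(tag: str) -> str:
    """Reduce a hashtag to a canonical form for blocklist matching."""
    tag = tag.lstrip("#").lower()
    # Strip diacritics (e.g. "blüewhale" -> "bluewhale").
    tag = unicodedata.normalize("NFKD", tag)
    tag = "".join(c for c in tag if not unicodedata.combining(c))
    # Undo simple character substitutions.
    tag = tag.translate(LEET_MAP)
    # Drop separators used to break up the word.
    return tag.replace("_", "").replace("-", "").replace(".", "")

def is_blocked(tag: str) -> bool:
    """Return True if the normalised hashtag contains a blocked stem."""
    norm = normalize(tag)
    return any(stem in norm for stem in BLOCKED_STEMS)
```

In practice a real moderation system would combine such normalisation with broader measures (human review, re-upload prevention), since pure keyword matching can both miss variants and over-block legitimate content.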

How can content regarding this challenge be tackled?

Users should report risky content, either to the platform itself or, where the hosting country has a hotline dealing with illegal material, to that hotline. To avoid promoting the content further, users should not share, re-post or like it. In cases of immediate threat, it can also be reported to law enforcement.

Find out more about the work of the German Safer Internet Centre, including its awareness raising, helpline, hotline and youth participation services.

 

* As the German centre at federal and state level for the protection of minors on the internet, jugendschutz.net looks closely at risks in internet services that specifically attract young people, and urges providers and platform operators to design their services in a way that allows children and young people to use the internet without coming to harm. jugendschutz.net operates a hotline that accepts reports about illegal and harmful content and takes appropriate action to have such content removed as quickly as possible. Its work focuses on risky contacts, self-harming behaviour, political extremism and child sexual exploitation, but jugendschutz.net also aims to enable young users to have safe and positive experiences online. For more information, please visit http://www.jugendschutz.net/en/.