Why suicidal content on social media is dangerous

In addition to all the beautiful, interesting and funny content on social media services, there is unfortunately also negative content. For people in an acute crisis or even at risk of suicide, such content can be life-threatening: for example, when it reinforces the negative world view of those affected, condones suicide or even gives instructions for suicide. The German awareness centre klicksafe explains why social media algorithms reinforce the problem and how we can help those affected.

Date: 2024-06-18
Author: German Safer Internet Centre
Section: awareness
Topic: media literacy/education, potentially harmful content
Audience: children and young people, parents and carers, teachers, educators and professionals

Serious suicidal thoughts or even a suicide attempt are often triggered by an emergency situation: bullying at school, relationship problems, a conflict with friends, family problems, failure at school or a general fear of failure. Depression can also lead to suicidal thoughts. In the adolescent's own assessment, these difficulties and crises are usually seen in an unrealistically negative light, and those affected might feel unable to continue living as they did before. Suicide then appears to be the only solution to a situation that can no longer be overcome.

At this moment, it is important for those affected to receive help and be shown possible solutions to their problems. In this situation, however, content that reinforces their already negative world view is particularly harmful. This could be:

  • Videos in which other sufferers talk about their current poor condition, or content that glorifies a negative world view, self-harm or suicide.
  • Content that presents help options (e.g. counselling centres, therapy, medication) as useless or ineffective.
  • Lastly, all content that openly condones suicide or contains instructions for suicide.

What role do social media services play?

Many children and young people use social media platforms to communicate with friends, for entertainment, and as a source of information. Instagram and TikTok are particularly popular among German adolescents: according to the JIM Study 2023, around 60 per cent of young people use Instagram and TikTok daily or several times a week.

Content that promotes suicide is prohibited on both platforms. TikTok writes on its website on the topic of suicide and self-harm: "We do not allow content depicting, promoting, normalising, or glorifying activities that could lead to suicide or self-harm."

Instagram has also set out clear rules for dealing with suicidal content in its guidelines: "We’ve never allowed people to celebrate or promote self-harm or suicide, and we also remove fictional depictions of suicide and self-harm, as well as content that shows methods or materials."

Despite these clear rules, both platforms are repeatedly criticised. One reason is that people in an acute crisis are more likely to see problematic content: once a platform's algorithm has recognised that someone is interested in content on the topic of mental health, such content is suggested increasingly often.

Amnesty International documented this phenomenon in the report "Driven into the darkness: How TikTok encourages self-harm and suicidal ideation". In test accounts simulating the behaviour of people with mental health problems, every second suggested video contained harmful content after only a short time. These test accounts were also shown mental health content up to ten times more frequently than other accounts.

The same problem can be observed on Instagram. In an internal study entitled "Teen Mental Health Deep Dive", Instagram surveyed users between the ages of 13 and 17 about their mental health. Instagram only published the research a few years later, after its results had been leaked to the press and reported on worldwide. In the study, around 12 per cent of users stated that they had been exposed to suicidal or self-harm content on Instagram within the last month. Particularly problematic: users who were struggling with their mental health were shown such content significantly more often than other users (see page 40 of the report).

How can you help those affected?

  1. Be attentive and use reporting functions: parents and caregivers should always start by talking openly with their children about their online behaviour and, if necessary, about the topic of suicide. As a general rule, if you notice content that promotes suicide, always report it to the social media platform's support team first (e.g. using the report feature). The platform operators are best placed to make it more difficult for children and young people to access the reported content.
  2. Keep calm: if you suspect that a child is at risk of suicide, avoid a confrontational attitude. Try to speak directly to the person concerned and take as neutral and non-judgemental a stance as possible. Express your own worries and fears clearly, and address the topic of suicide directly, without paraphrasing or trivialising it. Signal that you are available to provide support. If there is a willingness to talk, take advantage of it to find out what you can do to help. Ask how serious the suicidal intentions are, and try to identify alternative ways of solving the problems together.
  3. Get professional help: if the person concerned is not willing to talk, or if you are at a loss yourself, you can still take action. Share your observations with other people you trust, and get professional help if possible. Adults in particular who are unsure how to assess the behaviour of affected children or adolescents should seek advice and help from counselling centres. In Germany, counselling services are also offered at school psychological counselling centres, educational counselling centres, and girls' and women's clubs.
  4. Have content checked: you can also have content that is difficult to assess checked by experts for its risk potential. In Germany, you can contact the two hotlines www.jugendschutz.net and www.internet-beschwerdestelle.de.
  5. Check the use of social media: try to assess whether social media content might have played a role in the young person's negative mood recently, and explain that algorithms can lead to more and more negative and problematic content being displayed. If this is the case, consider together whether deliberate social media breaks make sense. If the young person does not want to give up a platform entirely, creating a new account can also help, since its recommendations start without the old interest profile. Note that the use of social media is not necessarily problematic or dangerous per se. On the contrary, social media services can also be helpful and positive for people in difficult situations, for example as a way to keep in touch with friends and find relief.

In Germany, online counselling centres addressing the topic of suicide are:

More information is available about the work of the German Safer Internet Centre, including its awareness raising, helpline, hotline and youth participation services, as well as similar information for other Safer Internet Centres throughout Europe.
