A closer look at online hate speech for the International Day for Tolerance

Saturday, 16 November 2019 marks the United Nations International Day for Tolerance. This observance has been celebrated every year since 1995, when the United Nations Educational, Scientific and Cultural Organization (UNESCO) adopted its Declaration of Principles on Tolerance.

The International Day for Tolerance (or, in short, Tolerance Day) is an opportunity to reaffirm the United Nations' (UN) commitment to strengthening tolerance by fostering mutual understanding among cultures and peoples, which is "more important than ever in this era of rising and violent extremism and widening conflicts that are characterised by a fundamental disregard for human life".

Online, this "rising extremism" notably takes the form of hate speech. According to the Council of Europe (CoE), "hate speech covers many forms of expressions which spread, incite, promote or justify hatred, violence and discrimination against a person or group of persons for a variety of reasons". It is a phenomenon that "poses grave dangers for the cohesion of a democratic society, the protection of human rights and the rule of law".

To prevent and counter the spread of this phenomenon, in May 2016, the European Commission agreed with Facebook, Microsoft, Twitter and YouTube on an EU Code of conduct on countering illegal hate speech online – which has since been joined by Instagram, Google+, Snapchat, Dailymotion and Jeuxvideo.com. These platforms have now put in place terms of service, rules or community standards prohibiting users from posting content that incites violence or hatred against protected groups. They have also significantly increased the number of employees monitoring content. According to the latest estimates, they now remove 89 per cent of flagged content within 24 hours.

For more information about the Code of conduct, visit the European Commission's website and have a look at the factsheet "How the Code of conduct helped countering illegal hate speech online".

SELMA (Social and Emotional Learning for Mutual Awareness) is a two-year project, co-funded by the European Commission, that aims to tackle the problem of online hate speech by promoting mutual awareness, tolerance and respect. On Thursday, 21 November 2019, at Safer Internet Forum, the SELMA partners will host an open space session allowing participants to engage with some of the materials prepared as part of the project and gain a more in-depth understanding of the SELMA approach.

For more information about SELMA, visit hackinghate.eu. To learn about Safer Internet Forum 2019, visit betterinternetforkids.eu/sif.


Related news

Hacking online hate speech with SELMA

Hate speech is increasingly common on social media, but that does not make it any less problematic. A recent study released by the SELMA (Social and Emotional Learning for Mutual Awareness) project shows how online hate speech has become an inevitable part of young people's daily experiences online, with education and awareness-raising efforts on the topic lagging behind. To complement existing initiatives to regulate, monitor or report online hate speech, a more proactive answer is needed.

New code of conduct for IT companies fighting hate speech

In line with the European Commission's recent work on combating hate speech and the spread of terrorist and exploitative material on communication channels, while also protecting freedom of speech, a new code of conduct for IT companies fighting hate speech was announced in May 2016.