Online hate speech

Hateful online content is something that youth increasingly encounter. This module explores the nature of hate speech and how you can support your learners to tackle it positively and safely. 

What is online hate speech? 

There is no single definition of online hate speech – it affects different communities and groups in different ways in different parts of the world. 

Here are two definitions that can help you start to consider the nature of the phenomenon: 

Firstly, drawing on research into online hate speech, the SELMA: Hacking Hate project (2019) created the following short definition: 

"Any online content targeting someone/a group based on protected characteristics with the intent or likely effect of inciting, spreading or promoting hatred or other forms of discrimination."

Alongside this, also consider this definition of hate speech more generally, from the Council of Europe’s Committee of Ministers (2022): 

"…all types of expression that incite, promote, spread or justify violence, hatred or discrimination against a person or group of persons, or that denigrates them, by reason of their real or attributed personal characteristics or status such as race, colour, language, religion, nationality, national or ethnic origin, age, disability, sex, gender identity and sexual orientation."

Activity

Before going any further, take a few moments to reflect on the above definitions and your current understanding of what hate speech is. Do these definitions explain the nature of this behaviour clearly to you? Do they cover the behaviours your learners may experience online (or offline)? 

If you work with younger learners, you may wish to consider how to adapt the above definitions to make them easier to understand for your learners. The following sections of this deep dive may help! 

What do we mean by protected characteristics? 

Protected characteristics are aspects of a person’s identity that are protected from discrimination – that is, from being treated unfairly because of one or more of these characteristics. They include aspects such as: 

  • Race
  • Colour
  • Descent
  • National or ethnic origin
  • Age 
  • Disability
  • Language
  • Religion or belief
  • Gender and gender identity
  • Sexual orientation

(Source: European Commission against Racism and Intolerance (ECRI), 2016).

What forms can hate speech take online? 

Hate speech can take many different forms online. It could be in the form of...  

  • a public message (e.g. a message or comment on social media that anyone could see), 
  • a private message (e.g. a direct message sent to someone on social media), 
  • an image or photo (including memes), 
  • a video or livestream of video content, 
  • a sound or voice recording, 
  • behaviour in a game (e.g. ganging up on a player because of who they are), 
  • or a longer piece of writing such as a blog, website or conversation on a message board. 

Some websites are overtly hateful (often those run by extremist groups) but others are more subtle. ‘Cloaked websites’ are those that appear objective and neutral at first sight but are actually promoting messages of discrimination, hatred and violence through information buried deeper on the site, or through links on the site that take a user to more extreme material elsewhere. 

Hate speech can be expressed by individual users, by groups with a specific agenda, and by social media users with a large following or audience – including public figures such as influencers, politicians, world leaders and celebrities – who can often share their views and opinions far more widely than other users. Seeing prominent public figures express hate online can also influence others to behave in a hateful or even violent way towards a targeted group.  

Mainstream news outlets also play a role in online hate speech. Headlines or stories may be published that display prejudice or discrimination towards groups of people, sometimes based on their protected characteristics. While these stories may report the facts, they may also include negative words and phrases, or commentary from journalists, that show bias against a targeted group. When such stories are posted and shared online, they can encourage some people to adopt these negative terms and ideas and spread them further. 

Who does hate speech target? 

We all possess protected characteristics – parts of our identity such as our gender, ethnicity, religious beliefs and sexual orientation, to name but a few. 

Some individuals and groups are discriminated against online more than others. For example, research into hate speech on X (formerly Twitter) in 2023 found that hate speech was not only increasing, but that the most used hateful terms and phrases targeted women and girls, LGBTQIA+ people, Jewish people, Muslim people and Black people. The Centre for Countering Digital Hate found that, as of September 2023, tweets promoting and glorifying antisemitism, anti-Black racism, neo-Nazism, white supremacy and/or other racism were still present on the platform, even after being reported. 

It is often minority groups that receive hate speech, and which groups are in the minority depends on where you are in the world. Within your school community there may be young people who are more vulnerable than others to being targeted by hate speech. 

Researchers in this area have also noted that online hate speech is posted by a minority of people – most online users do not create or spread hate speech, but there are small, dedicated groups and communities online that seek to harm others. 

How does hate speech make people feel? 

Hate speech is designed to abuse others and provoke strong emotional reactions such as anger, fear and upset. It often attempts to dehumanise the targeted individual or group by using words or phrases that make them sound inferior to other groups in society (e.g. words like ‘scum’, ‘trash’ and ‘rubbish’, or comparisons to animals with negative connotations – ‘rats’ or ‘vermin’, ‘dogs’, ‘pigs’, etc.). 

For someone targeted by hate speech, or who sees hate speech online, it can lead to feelings of worry or anxiety and a loss of confidence and self-worth. It can also lead to feelings of helplessness if their views and opinions are drowned out by hateful voices. Online hate speech often reinforces existing prejudices in a society or community and, if it incites real-world violence or aggression, can make targeted individuals and groups fear for their safety offline. 

Hate speech often involves the in-group and out-group dynamics described by Social Identity Theory – for those spreading hate speech, it is about demeaning people who are different from them, while reinforcing or justifying their own self-worth by casting their own group as ‘superior’ to others. This activity from SELMA: Hacking Hate can help you explore these concepts with learners. 

Understanding hate speech in order to tackle it also relies on understanding emotions – both in yourself and in others – and how they influence people’s views, choices and actions, online and offline. 

Understanding SEL and emotions 

The Collaborative for Academic, Social, and Emotional Learning (CASEL) defines Social and Emotional Learning (SEL) as: 

"…the process through which all young people and adults acquire and apply the knowledge, skills, and attitudes to develop healthy identities, manage emotions and achieve personal and collective goals, feel and show empathy for others, establish and maintain supportive relationships, and make responsible and caring decisions." 

These competencies can be grouped into five key areas and developed across different domains and settings, as pictured below: 

Diagram showing the five SEL competencies: self-awareness, self-management, responsible decision-making, relationship skills and social awareness.


Source: CASEL.org

To develop these competencies, a firm understanding of emotional states is also important. The Mood Meter, part of the RULER programme from Yale’s Centre for Emotional Intelligence, provides a framework for categorising and understanding emotions:

Mood Meter chart categorising a range of emotions into four colour-coded quadrants.

Source: SELMA: Hacking Hate

Helping learners to build emotional vocabulary (by understanding the words in the above chart) empowers them to recognise and explain their own feelings. Once experienced in naming their own emotions, they can develop strategies to shift their emotional state from one emotion to another, or from one quadrant to another. The chart shows four quadrants: blue and red emotions are ‘less pleasant’, while yellow and green emotions are ‘more pleasant’; red and yellow emotions are ‘high energy’, while blue and green emotions are ‘low energy’. The intensity of the emotions is greater towards the edges and corners of each quadrant.

Managing their own emotional state can give an individual more control in a situation – they can decide what to say or do, and when to say or do it. It can also help an individual to understand the emotions of others. This is fundamental to developing empathy, a key skill for combatting online hate speech. Understanding the emotions of others, and how these affect their behaviour, can help an individual develop strategies to ‘nudge’ others into behaving differently – for better and for worse!

Meta-moments and your best online self

Recognising and managing your emotions are not quick or easy skills to learn; for many people it can take a lifetime. However, teaching SEL to children and young people can help equip them with strategies to regulate their emotions. This gives them the foundation to make rational, considered decisions and responses, rather than acting impulsively or out of strong emotion. As hate speech is intended to provoke a strong emotional response in the people it targets, these skills can be a useful defence against such content.

As part of this approach, the concept of a meta-moment is a valuable one in a busy, connected digital world. The video below from Yale’s Centre for Emotional Intelligence explains the idea in more detail:


In order to create a meta-moment, a young person must have strategies for regulating their emotions when encountering online content or communication that could elicit a strong unpleasant emotion such as anger or upset. These could include:

  • Controlled breathing – using breathing techniques to slow their breathing and bring themselves to a state of calm.
  • Private self-talk – discussing a scenario with themselves to consider possible choices. This could be done in their head or out loud.
  • Picturing positive things – thinking about a place, activity, person, pet or object that makes them feel happy.
  • Distraction – switching to a different activity or task to take their mind away from the content that triggered them.
  • Creating distance – putting physical distance between themselves and a device, so that engaging with the content again becomes a conscious choice.
  • Using up energy – pent-up energy can be expended through physical activity (running, walking, boxing) or even screaming or shouting!

Once a young person is emotionally regulated, they can consider what their ‘best online self’ might do to respond to the situation. 

Activity

Take a few moments to consider what your ‘best online self’ would look like. What would you do/not do? What would you say/not say? 

Here is an example of someone’s best self online:

Source: Kid_ACTIONS Educational Toolkit

Using the above example for inspiration, draw a picture of yourself (or an outline of a body) and label the different parts of the picture with the qualities and behaviours of your ‘best online self’.

This activity can be run with children and young people to explore positive online behaviour, and provides a template they can refer to when considering how to act positively and respectfully online.

What advice can I give to young people about hate speech?

Hateful online content can be deeply distressing, even if it isn’t aimed at you. However, there are a number of steps that can be taken to tackle it positively and to protect young people:

  • Report – Use reporting tools on social media apps and platforms whenever you see content that is unacceptable. Encourage others to also report it – the more reports a post receives, the more likely a platform is to take it seriously.
  • Respond with care – Hate speech is often designed to evoke strong emotions. Responding with strong emotions is unlikely to improve the situation. Some users can be called out for their behaviour, but others want to start an argument or fight. Considering when to communicate with a hater and when to stay silent is important.
  • Think positive – Positivity can counteract hate, and provide support to someone being targeted.  One effective strategy is ‘flipping’ negative comments into positive ones by replacing the negative language with positive sentiments and reposting the comment.  
  • You can’t always change other people – Recognise that everyone has their own views and opinions and, even if you think someone is ‘wrong’, it may not be possible to change their view or get them to agree with you. Sometimes you have to accept that you can’t win an argument online!
  • Safety comes first – Always consider your own safety before engaging with an online hater. Sometimes the safer option is to use the reporting tools on social media, rather than getting into a confrontation. 
  • Block and mute – Encourage anyone targeted by hate speech to block/mute the hater. This can help prevent them from seeing further hateful content from that user or similar accounts.
  • Don’t be a hater – Posting when angry or upset can lead us to treat others without respect. Being mindful of your audience, and remembering that people might misunderstand your words/actions (or find them offensive), are important considerations.
  • Understand the laws in your country – Learning about relevant laws around protected characteristics, threatening behaviour and harassment can help young people exercise their rights online and seek help from law enforcement.
  • Seek help – As with other distressing or upsetting online experiences, ensuring young people know they can talk to a trusted adult, helpline or other support service to receive advice, reassurance and support is vital.

Further reading and support

The following provide further opportunities for exploring and understanding hate speech online, and ways to access help and support.

  • Insafe network – Details of the network of Safer Internet Centres across Europe, providing information, advice and support through helplines.
  • SELMA: Hacking Hate – Resources and materials for secondary educators to explore with learners the phenomenon of hate speech and what can be done to positively and safely challenge it.
  • No Hate Speech Movement – Details of resources and youth campaigns from across Europe to tackle and raise awareness of hate speech and hate crimes.
  • Countering hate speech – Resources from UNESCO on understanding and countering hate speech online and offline.