Algorithms

Although this module focuses on computer algorithms, the concept of an algorithm dates back thousands of years to the mathematics of the Ancient Greeks. The name ‘algorithm’ is derived from ‘Algoritmi’, the Latinised name of a ninth-century Persian mathematician, and the modern use of the word emerged around the nineteenth century. Today, algorithms are used constantly in many of the processes and systems that power our daily lives. Awareness has grown in recent years due to the frequent use of algorithms in the online games, sites, and apps that many of us, including young people, use.

This deep dive will provide an overview of what algorithms are and how they are used by online services. Both the positive applications and possible risks of algorithms will be highlighted, as will practical tips for you as an educator on how to support your students to understand the impact that algorithms can have on their identity, wellbeing and safety.

 

What are algorithms?

Algorithms are sets of instructions given to computer programs to perform specific tasks. These range from simple tasks such as classifying or sorting objects based on given criteria (for instance, sorting a set of numbers from smallest to largest or searching data for a particular value), to more complicated tasks such as applying mathematical formulae, through to extremely complex tasks such as those performed by artificial intelligence algorithms that use machine learning to learn from the data sets they are given.
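To make the simplest cases concrete, here is a minimal Python sketch of the two tasks mentioned above: sorting numbers from smallest to largest and searching data for a particular value. It is purely illustrative; real programs would normally use built-in, optimised versions of these routines.

```python
# Two simple algorithms: sorting a set of numbers from smallest to largest,
# and searching data for a particular value. Illustrative only.

def sort_ascending(numbers):
    """Selection sort: repeatedly move the smallest remaining value forward."""
    result = list(numbers)
    for i in range(len(result)):
        smallest = min(range(i, len(result)), key=lambda j: result[j])
        result[i], result[smallest] = result[smallest], result[i]
    return result

def find_value(data, target):
    """Linear search: check each item in turn until the target is found."""
    for position, item in enumerate(data):
        if item == target:
            return position
    return -1  # not found

print(sort_ascending([7, 2, 9, 4]))   # [2, 4, 7, 9]
print(find_value([7, 2, 9, 4], 9))    # 2
```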

How are algorithms used online?

Algorithms in online sites, apps and games can perform various functions. Many of these are based around tailoring a user’s experience to be more relevant and interesting based on their previous behaviour and personal data.

Activity

Before going further in this module, take a few moments to make a list of any online processes you are already aware of that use algorithms to alter your online experience. This could be on social media, websites, search engines or online games.
Don’t worry if you can’t think of many – the following section contains some examples to help demonstrate the depth and breadth of algorithm use online!

So, how are algorithms used online?

 
The video above provides some straightforward examples of social media algorithm use and tips for children and young people about steps to minimise the impact (which we will return to later in this module).

However, algorithms are applied in a wide range of ways across the online services that children and young people use.

These include:

  • Gaming – Algorithms are used in all aspects of video games and online games to perform many functions such as controlling characters, scoring points, pathfinding, goal completion and making non-player characters appear more lifelike, to name just a few.
  • Search engines – When searching keywords on Google, Bing and others, a massive collection of algorithms is working behind the scenes to organise, categorise and rank the possible search results based on relevancy, recency, and interconnectedness. If a user is logged into a search engine, algorithms will take other factors into account, such as the user’s personal data profile, previous/recent search queries and web browsing behaviour. (A simple sketch of this kind of ranking appears after this list.)
  • Websites – Some news websites will use algorithms to sort and classify the news articles you will see on each visit based on personal preferences. Any websites that display advertising (including search engines) will be using collected data about each user to display adverts that are more relevant to the user.
  • Social media – Social media services use algorithms extensively to analyse and learn from a user’s behaviour. This can include content that has been ‘liked’ or ‘favourited’, which accounts the user subscribes to or follows, who they add as friends or followers, what they post, what they share with other users, how they comment or interact with others’ content, and how they interact with online videos.
  • Gambling services – Although these are not for children, it is important to recognise that online gambling sites use a series of complex algorithms to ensure that the games they offer are fair and random, and to prevent players from manipulating or exploiting the system.
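As a deliberately simplified illustration of the ranking idea mentioned in the search-engine point above, the hypothetical Python sketch below blends just two signals, relevance and recency, into a single score. The weights, formulas and example pages are invented for illustration; real search engines combine hundreds of signals in far more sophisticated ways.

```python
from datetime import datetime, timezone

def relevance(page_text, query):
    """Crude relevance signal: fraction of query words appearing in the page."""
    words = query.lower().split()
    return sum(word in page_text.lower() for word in words) / len(words)

def recency(published, now):
    """Crude recency signal: newer pages score closer to 1."""
    age_days = (now - published).days
    return 1 / (1 + age_days / 30)

def rank(pages, query, now):
    """Order pages by a weighted blend of the two signals (weights are arbitrary)."""
    def score(page):
        return 0.7 * relevance(page["text"], query) + 0.3 * recency(page["published"], now)
    return sorted(pages, key=score, reverse=True)

now = datetime(2024, 5, 1, tzinfo=timezone.utc)
pages = [
    {"title": "Old guide", "text": "online safety guide for schools",
     "published": datetime(2022, 1, 10, tzinfo=timezone.utc)},
    {"title": "New post", "text": "online safety tips for parents",
     "published": datetime(2024, 4, 20, tzinfo=timezone.utc)},
]
for page in rank(pages, "online safety", now):
    print(page["title"])  # the newer page ranks first when relevance is equal
```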

What are the benefits of online algorithms?

The main aim of algorithms (particularly on social media) is to help a user experience things that are relevant to them, rather than showing a user every piece of content and requiring them to choose. 

Here are some of the main ways that algorithms can be beneficial to children and young people online:

  • Reduce the ‘noise’ – By selecting content a user is interested in, algorithms can shield them from having to wade through every piece of information, much of it irrelevant. Over time, the algorithms become more effective at recognising the patterns in a user’s behaviour and can more accurately present them with interesting and relevant content.
  • Targeted advertising – As advertising is a key revenue stream for most sites and apps, removing or disabling the ads is often difficult. However, if a user is going to see adverts, the experience is likely to be more enjoyable if they see adverts for products that interest them rather than products that don’t. Algorithms learn which adverts are relevant to each user.
  • Help a user find what they like – Algorithms can help users to find more of what they enjoy. If a child watches a video about a particular video game on YouTube, algorithms will recommend more videos that are similar. (A simple sketch of this ‘more of what you like’ idea appears after this list.)
  • Keep content age-appropriate – If a child has given their age or date of birth in their account, if parental controls or age restrictions are in place, or if the algorithms have accurately predicted their age, the recommended content is more likely to be appropriate for their age group.
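Here is a small, hypothetical Python sketch of the ‘more of what you like’ idea: recommending items that share tags with content a user has already watched. The titles, tags and scoring are invented for illustration; real recommendation systems learn from far richer behavioural signals.

```python
# Watch history and catalogue are made-up example data.
watched = [
    {"title": "Minecraft build tips", "tags": {"gaming", "minecraft"}},
    {"title": "Speedrun highlights", "tags": {"gaming", "speedrun"}},
]

catalogue = [
    {"title": "Minecraft survival guide", "tags": {"gaming", "minecraft"}},
    {"title": "Baking for beginners", "tags": {"cooking"}},
    {"title": "Top 10 speedruns", "tags": {"gaming", "speedrun"}},
]

# Build a simple interest profile: how often each tag appears in the watch history.
interest = {}
for video in watched:
    for tag in video["tags"]:
        interest[tag] = interest.get(tag, 0) + 1

# Score each catalogue item by how strongly its tags match the profile.
def score(video):
    return sum(interest.get(tag, 0) for tag in video["tags"])

for video in sorted(catalogue, key=score, reverse=True):
    print(video["title"], score(video))  # gaming videos rank above the baking video
```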

What are the possible risks to young people?

Although algorithms are designed to improve a user’s experience, they can also introduce possible risks that can affect the safety and wellbeing of young people:

  • Exposure to inappropriate or harmful content – As algorithms will often filter content based on a user’s behaviour and interests (for instance, what they watch most frequently or the words they search for), there is a risk that, if a young person searches for or views harmful content, the algorithm mistakenly believes that this is content they want to see and provides more of it in the future. Even searches for appropriate content can sometimes stray into upsetting or inappropriate territory, particularly if the algorithms decide that a user should see something that produces a stronger emotional response (such as content that contains hate or animal cruelty). In some cases, young people have been exposed to content that can harm them, such as content about self-harm, eating disorders and dangerous exercise/diet regimes.
  • Exposure to misinformation – Algorithms may serve up content to young people that contains misleading or false information but is presented in a way that suggests it can be trusted. This can have a negative impact on their beliefs and behaviour. As content is often selected based on relevance rather than how recent it is, it can also be hard to work out what is ‘new’ and what may have happened days or weeks ago.
  • Exposure to targeted adverts – While targeted adverts might be more interesting to watch, they can also be more persuasive in encouraging young people to purchase or desire products and services.
  • Persuasive design – As algorithms are used as part of persuasive design techniques, the frequent recommendation of relevant content can encourage young people to spend longer on social media or video-sharing platforms than they intend. The ‘just one more…’ effect can be very persuasive!
  • Filter bubbles and echo chambers – These are two possible side effects of algorithm-led experiences on social media. A filter bubble occurs when a platform only recommends content similar to what a user has already experienced, making it hard to see new things or encounter different viewpoints, perspectives and opinions. Echo chambers are similar: when an algorithm recommends other users to friend or follow, a young person may end up interacting only with people who share the same worldview, making it more difficult to encounter new perspectives. It can also lead them to believe that their views and opinions are more widely held than they actually are.
  • Profiling – Alongside the personal data that users share, algorithms will make judgements about a user, some of which may be false or inaccurate. Algorithms may also classify groups of users as being the same when they are not – for example, algorithms might treat all female users aged 13 in the same way, presuming they all have the same interests and opinions and showing them all the same types of content. There are also concerns that, because algorithms and artificial intelligence systems are built and trained by humans, bias can be present in the way algorithms present information to people with different protected characteristics (such as gender, ethnicity, religion, and country).

Some systems of algorithms can profile users very quickly, and it can be difficult to change this profile once it has been established. As an example, the Wall Street Journal conducted a study in 2021 into how TikTok’s algorithms worked out the interests of its users. It found that the algorithms carefully tracked the time spent on, and engagement with, every video a user watched in order to infer their interests. The app could correctly identify the main interests of a new user within 30 minutes of them joining, even if they hadn’t searched for or specified that information in their profile.
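As a hypothetical, simplified sketch of this kind of profiling, the Python snippet below infers an interest profile purely from how long a user spends on each piece of content, without them ever stating those interests. The categories, figures and scoring are invented for illustration and are not based on any real platform’s implementation.

```python
from collections import defaultdict

# Made-up viewing log: (content category, seconds watched) for each video shown.
viewing_log = [
    ("football", 45), ("cooking", 3), ("football", 60),
    ("fitness", 8), ("football", 52), ("cooking", 2),
]

# Accumulate total watch time per category.
watch_time = defaultdict(int)
for category, seconds in viewing_log:
    watch_time[category] += seconds

# Turn the totals into a share-of-attention profile.
total = sum(watch_time.values())
profile = {category: seconds / total for category, seconds in watch_time.items()}

# The inferred profile heavily favours football, even though the user never
# searched for it or listed it in their account details.
for category, share in sorted(profile.items(), key=lambda item: item[1], reverse=True):
    print(f"{category}: {share:.0%}")
```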

How can I support young people in this area?

It is important to discuss with your students their experiences of algorithms online and the steps they can take to help minimise the chances of seeing or experiencing hurtful or harmful content. Things to consider include:

  • Algorithmic literacy – In the same way that media literacy, information literacy and digital literacy are key life skills for young people, algorithmic literacy is also important. This doesn’t require an in-depth understanding of how algorithms operate, but it does mean having an awareness of how algorithms are used to target advertising and content at an individual.
  • Teach critical thinking and reasoning skills – Students need to learn strategies for discerning what is accurate, inaccurate or false in the content they experience online. Developing these skills is key to helping them manage situations where they encounter misleading or potentially hateful or harmful content.
  • Explore safety and privacy settings – Encourage young people to review their settings on their social media accounts, including providing an accurate date of birth. This can help the algorithms select age-appropriate content. Exploring where block/mute, unfollow and report features are can also equip young people with strategies to deal with inappropriate content that may appear in their feeds.
  • Encourage exploration – To ensure that young people aren’t trapped in filter bubbles or echo chambers, encourage them to search for content across a variety of apps/platforms so that they can more readily encounter different views and perspectives.

Further information and resources

The following resources offer further opportunities to extend your knowledge and understanding of algorithms, as well as offer learning opportunities to your students: