Generative AI

Artificial intelligence (AI) is a term you have probably heard of before – many of the online services, devices, and digital tools that we use in our daily lives include some form of artificial intelligence that helps us perform tasks more quickly and easily, be it for work, leisure or learning. Generative AI is one subset of artificial intelligence that can enable anyone to create digital content.

 


This deep dive will provide an overview of generative AI, how it is used, and the opportunities it can present to young people. The issues and risks around using this technology will also be explored, along with practical advice for how you can support your students in using generative AI safely and responsibly.


What is Artificial Intelligence?

Watch this video for a helpful introduction to the features of Artificial Intelligence:

 

In summary, at the broadest level, artificial intelligence involves creating computer systems that can act and think in human-like ways. This includes the ability to process data, solve problems, reason, and recognise patterns or classifications and apply them to different tasks or functions.
Systems that can learn and apply that learning sit under the subfield of AI known as Machine Learning – the ability of computers to learn without being explicitly programmed. Within this area, another subset known as ‘Deep Learning’ uses artificial neural networks: systems that can process more complex patterns and data than traditional machine learning.

 

 

What is Generative AI?

Deep Learning models fall into two types – discriminative and generative.

Discriminative models are trained on labelled data to learn the patterns and the relationships between the data and its labels. These models are used to classify data or predict future data. For example, a discriminative model could be trained on photos of animals to learn to identify the animal in each photo. It could then be given new photos and be able to accurately identify the animal in each one.
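
To illustrate what this means in practice, here is a minimal sketch of a discriminative model in Python using the scikit-learn library. The feature values and labels below are invented purely for illustration – a real photo classifier would be trained on many thousands of labelled images rather than four rows of numbers.

    # A minimal sketch of a discriminative model: learn from labelled examples,
    # then predict labels for new, unseen data.
    from sklearn.ensemble import RandomForestClassifier

    # Labelled training data: each row stands in for one photo's features,
    # and each label names the animal shown in that photo.
    features = [
        [0.9, 0.1, 0.3],
        [0.8, 0.2, 0.4],
        [0.1, 0.9, 0.7],
        [0.2, 0.8, 0.6],
    ]
    labels = ["cat", "cat", "dog", "dog"]

    # The model learns the relationship between the features and their labels...
    model = RandomForestClassifier(random_state=0)
    model.fit(features, labels)

    # ...and can then classify new data it has not seen before.
    new_photo = [[0.15, 0.85, 0.65]]
    print(model.predict(new_photo))  # expected: ['dog']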

Generative models are also trained on data – a small amount of labelled data and a large amount of unlabelled data. These models can not only classify, identify, and predict data, but also generate new data similar to what they have seen before. The content they produce resembles what humans can produce – text or speech in natural language, images, videos, and audio. This is known as Generative AI.

There are a wide number of Generative AI tools available online for users to interact with. 

These include Large Language Models (LLMs) such as ChatGPT and Google’s Bard. These are generative AI tools where users give text inputs (ask questions, set written tasks, request summaries of concepts or information, etc.), and the tool provides a text output. Because they are based on artificial neural networks (complex computer systems that mimic the neuron structure of the human brain), these tools can also respond to follow-up questions/instructions and refine their responses based on further information from the user.

Here's an example: ChatGPT 3.5 was asked to explain what generative AI is using non-complex terms, and this is one result it produced:

“Generative AI can be seen as a digital artist in the computer world. It learns by looking at many pictures, stories, or music pieces. Then, it uses what it learns to create its own pictures, stories, or music that seem real, even though they're actually made by the computer.
Imagine it like a friendly competition: one part of the computer tries to create something, and another part tries to decide if it's real or not. They keep practising and competing until the computer gets really good at making things that seem almost like they were done by a human.
This kind of AI can write new text that sounds like it was written by people, make images that look like real photos, and even compose music that sounds like it was made by humans. It's also useful for making more data to teach other AI, coming up with new ideas, and even helping scientists discover new medicines.
In simple terms, generative AI helps computers become artists by learning from what they've seen and heard before and using that knowledge to create new things that seem real.”

It is worth keeping in mind that the above response was the third answer ChatGPT produced. The first response was far too long and complicated, so it was asked to summarise in 250 words or less. The second response also used a lot of complex language, so it was asked to simplify the language. Users can refine the response based on what they require; the more criteria you give the LLM, the better it can meet your requirements.
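
For those comfortable with a little code, the same refinement process can be seen in a minimal Python sketch using the openai package (this assumes the package is installed and an API key is set; the model name and prompts are illustrative only, not a recommendation):

    # A sketch of multi-turn prompt refinement with the openai Python package.
    # Assumes `pip install openai` and an OPENAI_API_KEY environment variable.
    from openai import OpenAI

    client = OpenAI()  # reads the API key from the environment
    MODEL = "gpt-3.5-turbo"  # illustrative model choice

    # The conversation history is sent with every request, so the model can
    # refine its previous answer rather than starting from scratch.
    messages = [{"role": "user", "content": "Explain what generative AI is."}]
    first = client.chat.completions.create(model=MODEL, messages=messages)
    messages.append({"role": "assistant", "content": first.choices[0].message.content})

    # Follow-up 1: the answer was too long, so ask for a summary.
    messages.append({"role": "user", "content": "Summarise that in 250 words or less."})
    second = client.chat.completions.create(model=MODEL, messages=messages)
    messages.append({"role": "assistant", "content": second.choices[0].message.content})

    # Follow-up 2: still too complex, so ask for simpler language.
    messages.append({"role": "user", "content": "Simplify the language for a young reader."})
    third = client.chat.completions.create(model=MODEL, messages=messages)
    print(third.choices[0].message.content)

As the sketch shows, each follow-up instruction is added to the same conversation, which is why the tool appears to ‘remember’ and refine its earlier answers.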

As ChatGPT mentioned in its response, there are other Generative AI tools that can produce other outputs, such as images/videos and audio/music. Popular image-generation tools include DALL·E 2 and Midjourney, and popular audio-generation tools include Meta’s AudioCraft and Beatoven.ai. A large (and growing) number of tools are available, and many are free to online users, although they do require creating a free account. Most image-generation tools, however, require a paid subscription.

Activity: 

Before going any further in this deep dive module, it is worth exploring some generative AI tools to get a feel for how they work and what they can produce. Here are some suggested ideas to try on a tool, but feel free to be as creative or inventive as you wish – it is likely that your students are already using some of these tools.

  • Ask an LLM like ChatGPT to write lyrics for a new song in the style of your favourite band/artist. What instructions can you give to refine the output? (e.g. more/fewer verses, base the song on a theme/topic, make the lyrics alliterative, etc.).
  • Instruct an image generation tool like Gencraft (free) to create an image of your favourite animal doing something unexpected in an unusual location. Below is what was created with the ‘cartoon style’ selected and the prompt ‘A panda wearing a green suit playing saxophone next to a pink waterfall’ (a short code sketch after this list shows how a similar prompt could be sent to an image-generation API):
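
Gencraft and similar tools run in the browser, but the same kind of prompt can also be sent to an image-generation service programmatically. Here is a minimal sketch using OpenAI's image API as one example (the model choice and settings are illustrative, and an API key is assumed):

    # A sketch of prompt-based image generation via an API.
    # Assumes `pip install openai` and an OPENAI_API_KEY environment variable.
    from openai import OpenAI

    client = OpenAI()

    result = client.images.generate(
        model="dall-e-3",  # illustrative model choice
        prompt="A panda wearing a green suit playing saxophone next to a pink waterfall",
        size="1024x1024",
        n=1,
    )

    # The response contains a temporary URL pointing to the generated image.
    print(result.data[0].url)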

What are the benefits of using generative AI?

As you have no doubt discovered through exploring some generative AI tools, they can be very entertaining and fun to experiment with.

However, there are some very real opportunities and benefits that these types of AI tools can offer to young people:

  • Creativity – While the quality of the output from these tools can vary dramatically, they act as an excellent starting point for developing and exploring ideas or creating a stimulus. As an educator, you might also be able to benefit from using these tools in the classroom. Imagine asking your students to write a story or description based on an AI-generated image or a story idea generated by AI. Or to create AI artwork that mimics the styles of several artists at once.
  • Fast and easy – Many Generative AI tools require nothing more than a text prompt from a user in order to create something, but offer layers of complexity for those who wish to explore and refine the output. As most tools operate in the cloud, no software needs to be downloaded or installed, and the huge power of cloud processing produces results in seconds.
  • Enables creation without abilities/skills – While it is hard to argue that current generative AI content is equal to (or superior to) work produced by humans, these tools allow people to create content even if they lack the skills or abilities usually required. Someone with no artistic ability can use an image generation tool to create a piece of art, and LLMs allow the creation of lyrics and poetry without songwriting or poetic skills. These tools also offer value to those who do possess such skills; an artistic person might be able to refine or enhance their work using generative AI in a way that a non-artistic person cannot.
  • Mostly free – A few tools require a paid subscription, but there are many free alternatives that may be able to meet the creative needs of a young person. Some free tools give users a limited number of generations a day.
  • Perform tasks to free up time – Many tools, particularly LLMs, can be used to quickly perform tasks that would normally require a time investment from a human, but don’t require human thinking or intervention. This can free up people’s time and capacity to focus on more complex or challenging tasks that require human input.
  • Accessibility – A huge benefit of some tools is the access they enable for users with disabilities or learning needs. For example, AI screen readers can accurately describe images to a partially sighted user or convert text to natural-sounding speech. LLMs can convey text information in different ways (such as level of complexity or length) depending on a user’s needs. Some tools can learn a user’s needs and automatically format or present output to meet those needs.

What are the issues and risks for young people in using Generative AI?

While there are many benefits that Generative AI tools can offer to all users, there are a number of issues and risks that young people need to be aware of and understand:

  • Accuracy – LLMs such as ChatGPT are adept at producing responses in natural language that ‘sound’ right, but that doesn’t always mean that the information they contain is true or accurate. OpenAI, the creators of ChatGPT, acknowledge that the tool can ‘hallucinate’ outputs and make up facts. Anecdotally, some users have found that, when ChatGPT can’t find relevant data, facts or statistics to answer a question, it may make up this information in order to still give a response. In 2023, two New York lawyers were fined for submitting a legal brief that contained six fictional legal case citations created by ChatGPT. Therefore, just because a generative AI system has created something doesn’t mean that it is automatically accurate or even true.
  • Trust – Generative AI can make mistakes, but it can also be used deliberately to create content that reflects a user’s bias or opinions. If this content is shared widely with other online users as part of a misinformation/disinformation campaign, there is a risk that some users will believe the content is true or real. This can potentially affect their own views, biases and beliefs, which in turn could lead to actions that put them or others at risk. In early 2023, AI-generated images of Donald Trump being arrested were widely circulated on social media, which led to some users believing them to be genuine.
  • Plagiarism/ownership – One concern for many educators is that young people using Generative AI to create content or write text may either use the text without editing it (which could be considered plagiarism of the LLM’s work – an issue that is currently being debated), or inadvertently plagiarise the work of real content creators. There are questions that society has still yet to answer or agree upon concerning copyright, intellectual property and ownership around Generative AI. Can an AI ‘own’ the content it creates? Should copyright owners be compensated if their work is used to train a generative AI model to replicate it? How should ‘fair use’ be applied to Generative AI when these laws vary from country to country?
    Using Generative AI tools can open a young person up to involvement and responsibility in these areas.
  • Harmful content – Generative AI tools can produce output that matches the prompt/request, but it is still possible that the content produced might expose a young person to something harmful, inappropriate, or illegal. There are safeguards built into most tools to prevent users from requesting information or creating illegal content (for example, an LLM will not provide instructions on how to build a bomb or gain access to a locked car). However, as with all technology, there are ways around these restrictions! One common method is to instruct ChatGPT to adopt an amoral, always truthful persona, then ask it questions that it would normally refuse. While there are sometimes justifiable reasons for wanting to bypass the moral and ethical coding built into Generative AI tools, doing so can put young people in situations where they see or learn things that are harmful to them.

  • Data privacy – As with harmful content, there are methods that exist to make LLMs reveal personal data about an individual. LLMs may also inadvertently reveal personal data through regular questioning. It is also important to be aware that Generative AI tools that require an account to be created have a minimum age requirement of 13 to comply with data protection laws such as GDPR. In some cases, users need to be 18 years old or above to use the service.

  • Amplification of bias – Generative AI tools are trained on large amounts of data and existing content to allow them to generate similar new content. Depending on the data used, there is a risk that the tool may possess inherent biases. This 2023 Bloomberg article found that a text-to-image AI tool showed racial and gender bias in the way it depicted high- and low-paid occupations.
    Young people using these tools may not be aware that they are being presented with potentially biased output, which may impact on their views and beliefs.
  • Impact on education – Generative AI is a set of tools that should be used responsibly to enable or enhance skills, knowledge, and learning. However, if students use LLMs to write answers or reports for them, they may bypass actual learning about a concept/topic. AI-generated content may disguise the gaps in a student’s understanding, which could cause issues later in their education.

What advice can I give my students about Generative AI?

You can support your students in using Generative AI tools safely and responsibly in several ways. Depending on the age and experience of your students, not all of the following may apply:

  • Explore the tools together – Generative AI is high profile; many young people will have heard about various tools or be curious to see what they can do. Some students may already be actively using these creative tools regularly. Take time to find out what tools they use/are interested in and explore them together as a group. You can create an account as an adult, thus avoiding data protection issues that affect younger students. You can also model use of the tool and discuss with your students how to get the best results out of it.
  • Be futurists – The whole discipline of AI is fast moving, and discussing this with your students is an important way to understand their views on it. Are they excited by AI tools, or worried that AI might take over jobs they are interested in that are currently done by humans? Talking about what their future might be like may help to identify issues or training needs before those young people encounter them.
  • Recognise the digital divide – While access to Generative AI is relatively easy (most tools are run in an internet browser on a device), it is important to remember that not all young people have access to this technology. Be aware of this and look for opportunities to help them learn about Generative AI within their school-based learning.
  • Focus on input skills – The best way to avoid many of the issues detailed in this module is to develop good skills and experience in working with Generative AI tools. Understanding how to give a text prompt to get the results you want, and how to use further prompts and instructions to refine the results, are important skills to nurture. This will empower young people to use these tools independently in safe and responsible ways.
  • Build data privacy skills – LLMs can only reveal personal data that can already be found publicly online. If your students regularly use social media or other online platforms that can publicly display their personal data, take the time to discuss and explore the privacy settings that exist on many platforms to limit public access to that data. Regular discussions about what personal data should/shouldn’t be shared publicly online (including in photos and videos) are also helpful.

Further information and resources

Want to learn more about supporting young people in using Generative AI to its full potential? These resources may be useful:

  • Better Internet for Kids Resources – Educational resources from across the Insafe network of Safer Internet Centres. You can search for ‘artificial intelligence’ or ‘AI’, for resources in your language and for resources for different age groups.
  • CO:RE Evidence Base – A database of publications and research on youth online experiences. Searching the database with ‘artificial intelligence’ allows you to browse and read relevant research related to this issue.
  • School of Social Networks – This resource for primary-aged children, teachers and parents/carers provides information and advice on a range of online issues, including around managing personal data online. There are accompanying activities that teachers can use in the classroom and parents can use at home.
  • Common Sense Media – This guide on how to handle AI in schools provides some useful information and resources for educators.
  • Google Cloud Skills Boost – Want to deepen your technical understanding of Generative AI? This free course by Google can help you understand the area in greater detail. Note: This course is complex!
  • Generative AI Learning Resources – A list of resources compiled by the Tuck School of Business provides interesting further reading on Generative AI and the role it may play in education.