Text-based AI tools
Text-based Generative AI tools include ChatGPT, Gemini and Copilot. These tools generate responses based on what a user types, whether that’s a question, a task, or an idea.
Why it matters to learn about this together
For many young people, these tools are becoming part of everyday learning and problem-solving. The tools can explain concepts, help with writing, generate ideas, create summaries, or provide answers to questions.
However, these tools don’t “know” things the way people do; instead, they predict what sounds right based on the patterns in the data they’ve been trained on. This means they can sound confident even when they’re wrong.
Young people might use generative AI text-based tools to:
- Help with homework
- Understand something they’re stuck on
- Generate ideas or complete tasks faster
It can feel quick, easy and helpful, like having instant support available. But it can also change how young people approach learning, think about their own ability, or decide what counts as their own work.
Using AI can be both helpful and confusing for young people. It can build confidence when they’re stuck on a problem or reduce stress around schoolwork, but can also blur the line between getting help and doing the work for them. AI may create feelings of pressure to produce “perfect” answers, or lead to comparison and a feeling that “AI writes better than me”.
Text-based generative AI tools may use the information shared in prompts to continue building their datasets, and this information may surface in responses to other people’s prompts. To protect your privacy, never share personal information with an AI tool.
Try this together
You don’t need to understand the tool fully; just explore it together to start understanding how it behaves.
- If your child already uses AI, ask them to show you how they use it for something like homework, writing, or generating ideas
- Try asking AI the same question together, then compare the answer with a textbook or trusted website. Look for anything that seems unclear or incorrect
- Ask the AI something you already know the answer to and see how it responds
- Try improving a prompt together, for example adding more details like “make this clearer” or “explain it like I’m 10”
The goal is to work together to see how the tool works and where it can go wrong.
Talk about it
Try these conversation starters:
- “What do you usually use AI for? Helping, checking, or doing the work?”
- “Has it ever given you information that didn’t seem right?”
- “How do you decide if an answer is actually correct?”
- “When is it helpful to use AI and when is it better to do it yourself?”
- “What would you never type into an AI tool?”
- “When you use AI, do you feel like you understand the answer or just get something that works?”
- “Do you ever feel tempted to copy what AI gives you instead of working it out yourself?”
- “Do your friends use AI? Does it ever feel like you have to use it to keep up?”
- “What do you think is okay to use AI for and what should be your own work?”
- “Would you ever put personal information into AI? Why or why not?”
- “Have you ever seen AI give an answer that didn’t seem right?”
- “What would you do if AI gave you something incorrect or confusing?”
- “Do you think AI can ever be biased or unfair? What might that look like?”
These conversation starters focus on reflection (not rules) and real behaviour (not hypotheticals), and may lead to bigger chats that help your young person build judgement, confidence and critical thinking.
Key risks and how to prevent them
AI can make mistakes, share inaccurate information or invent facts (“hallucinations”)
- Remind your child that AI is a starting point and not the final answer
Young people may become reliant on AI to do the work for them, copying answers without understanding the information
- Practise fact-checking together using trusted sources
There are genuine privacy concerns about sharing personal information, names, photos, or school details with AI tools, which may use that information in answers given to other users
- Encourage them to protect their privacy: no full names, photos, addresses, school names, or personal stories
AI tools base answers on the information and data they were trained on, and this data may include unfair, one-sided or biased views. This can occasionally lead to inappropriate or unfair content appearing in a response to a prompt
- Talk about what ethical use looks like at their age and in their school.
There can be confusion about originality, and young people may feel unsure about what counts as “their” work
- Reinforce that their voice, ideas, and effort matter more than perfection
Final word
AI can be a powerful support for learning and creativity, but it works best when young people understand its limitations and know where to go for help and support if they’re unsure.
Encourage your young person to stay curious, keep questioning and to trust their own thinking. AI should be a starting point, not the final answer, and information should always be checked for accuracy and fairness against other sources. Importantly, ensure your young person knows to keep their personal information private.