AI Chatbots: What They Can and Cannot Do
AI chatbots are becoming part of everyday life for young people, from homework help to entertainment. This conversation helps your child understand what AI can and cannot do, why it should not be treated as a trusted friend, and how to use it safely.
When to have this conversation
When your child starts using AI tools for homework, entertainment, or curiosity — typically from age 12 upwards.
Before you start
- Try using the AI tool your child uses so you can speak from experience.
- Prepare a simple example of AI getting something wrong to share during the conversation.
- Think about what boundaries you want to set around AI use in your home.
Conversation by age group
Parent: "I have been reading about AI chatbots and I wanted to chat with you about them. Have you used any?"
Parent: "AI chatbots like ChatGPT can seem really clever, but they do not actually understand anything. They predict likely words based on patterns. That means they can sound confident even when they are completely wrong."
(Tip: Use simple language to explain the core limitation.)
Child: "But it got my homework question right last time."
Parent: "Sometimes they do get things right, and sometimes they make up entirely false information that sounds convincing. That is why you should always check facts against a reliable source, such as a textbook or an established website."
Parent: "One more thing: never share personal information with a chatbot. Your name, school, address, your problems, none of that. Anything you type could be stored and used in ways you cannot control."
Tips for this age
- Try a fun exercise together: ask the chatbot factual questions and check whether it gets them right.
- Set a rule that AI tools are for brainstorming, not for final answers.
Parent: "I want to talk about AI tools. I know you probably use them, and I think it is worth having a conversation about how to use them wisely."
Parent: "AI chatbots are genuinely useful for some things: brainstorming, exploring ideas, getting a first draft. But they have real limitations that matter."
Parent: "They can generate false information, they have no real understanding of context, and some character-based AI apps are designed to be emotionally engaging in ways that can become unhealthy."
Child: "I only use it for schoolwork."
Parent: "That is fine, as long as you are checking the facts. And if you ever come across an AI app that feels like it is trying to be your friend or therapist, please talk to me about it. Those tools cannot care about you, even if they sound like they do."
Tips for this age
- Discuss the difference between using AI as a tool and relying on it emotionally.
- Talk about academic integrity and how schools view AI-generated work.
Follow-up actions
- Review which AI tools your child has access to and whether any need parental restrictions.
- Agree a family rule: AI can help with ideas but should never replace your own thinking or real human support.
This is practical educational content to support families. For case-specific concerns about a child's safety, contact the NSPCC helpline on 0808 800 5000 or your local safeguarding team.