What is Microsoft Copilot? What parents need to know
Microsoft Copilot is an artificial intelligence assistant built into Microsoft apps like Word, PowerPoint and Outlook. It can also be used as a standalone chatbot, responding to users' questions and requests.
In this guide
What is Microsoft Copilot?
Microsoft Copilot is a generative AI assistant created by Microsoft and integrated into its Microsoft 365 apps. The goal is to help users become more productive and efficient while using these programs.
Users can access the Microsoft Copilot chatbot through a web browser. They can also download it as a mobile app from the Google Play Store or Apple App Store. This chatbot responds to users’ questions and requests in a human-like and conversational manner. Microsoft Edge improves the experience by integrating Copilot directly into the browser as a sidebar.
Microsoft released Copilot in 2023, later than some of its competitors. However, it has quickly become one of the most popular AI chatbots, with 1 in 5 children having used it.
Copilot is popular for two main reasons. First, Microsoft is a big software company with a wide reach. Second, it is built into apps that many kids already use for school, like Word and Excel.
Microsoft Copilot operates on a freemium pricing model. People can use the Copilot chatbot for free, without a subscription or sign-up. A paid version with more features is also available. This premium version costs £19/month and allows users to use the AI assistant inside 365 apps like PowerPoint and Word.
Users can get a 1-month free trial of Microsoft Copilot Pro. This lets them try the AI features before they commit to a subscription.
Microsoft Copilot age rating
Microsoft has set the minimum age requirement for users in the UK at 13 years old. Google Play has given the Copilot app a PEGI 3 rating, while Apple's App Store rates it as suitable for users aged 12+. Microsoft's own rating of 13+ is the most appropriate one.
How it works
Copilot’s marketing is mostly based around its integration into Microsoft 365 apps like Word, Excel and PowerPoint. However, most children will use the free online Copilot chatbot more often. This is because the integrated features require a paid subscription.
Chatbot
The Microsoft Copilot chatbot operates very similarly to other popular AI chatbots such as ChatGPT. Users write their questions and requests into a text box and quickly receive a human-like response, almost like texting a friend. Users can also input their requests using images or by speaking into their device.
The chatbot can perform a variety of tasks for the user. It can rewrite and improve text, summarise entire documents or webpages, and generate images from text descriptions. You need to sign in to Copilot with a Microsoft, Apple or Google account to access certain features, like image generation.
It is important to note that while the Microsoft Copilot chatbot might seem intelligent and feel like conversing with another human, this intelligence is artificial. AI models learn by analysing vast amounts of data, enabling them to predict the best response to any query. Since AI models analyse existing data, they repeat what they've learned instead of creating their own thoughts. This lack of real intelligence can cause Copilot to give wrong information or draw questionable conclusions.
Memory
Copilot has a memory feature, meaning it remembers the context and information given in past messages. This can be beneficial: if a user is working on an essay or planning something, the chatbot will remember prior information and help build on the work, without the user having to constantly restate what the task involves.
However, there are risks involved in using this memory feature. Copilot might remember or interpret information incorrectly, leading to it giving unhelpful responses. There are also the privacy risks that come from sharing information with an AI that can store and remember things. Because of this, users should not share sensitive and personal information with Copilot.
Image generation
Users can create AI-generated images based on the text they give Copilot. This is extremely easy to do and takes only a few minutes. For example, the prompt "Create me an image of a bull working behind the counter at a local shop" generated the image below within 2 minutes.

After creating the image, Copilot asked if a backstory was wanted for the bull. Upon responding ‘yes’, Copilot created a detailed backstory for ‘Baxter the Bull,’ describing his former career as a show bull before opening his shop, ‘The Mooving Market.’
This function could be useful for children, as they can create a basis for a fantasy world and begin writing stories set in it. There is a risk that this could stifle their creativity though. They may rely on Copilot to create settings and stories for them rather than using their own imagination.
Restrictions are in place to protect people from creating inappropriate content. Copilot will refuse to generate sexualised images or images of violence between people. However, it will create gory images of violence between animals if prompted, such as a realistic picture of a wolf eating a sheep. These images could be disturbing for children.
Microsoft 365 integration
Users who subscribe to Copilot Pro can use Copilot features embedded in their Microsoft apps. As this costs £19/month, it is unlikely children will choose to subscribe to these features themselves.
When using apps like Word, PowerPoint and Excel, Copilot Pro subscribers receive real-time advice on how they could improve their document, slideshow or spreadsheet. You can also ask it to summarise a document or email by typing requests into a text box, much like using the online chatbot.
Overall, this integration means users can seamlessly use AI to improve their work and increase their productivity. However, users could become over-reliant on the AI assistant to edit their writing and organise their work. This may result in them not developing the skills needed to do these tasks themselves.
What are the benefits of Microsoft Copilot?
Microsoft markets Copilot as an ‘AI assistant’, and this is because it can make users’ lives and workloads easier in various ways.
Copilot provides tools that speed up tasks and help increase the user's productivity. Many of these tools come as part of a Copilot Pro subscription, such as the ability to summarise emails or generate Excel formulas. However, the free version still offers summarising and writing assistance features. Instead of summarising directly within the app, users need to copy and paste the text they want summarised into the text box on the Copilot chatbot website.
The chatbot can be a useful learning tool. It can explain difficult concepts in a simpler and easier to understand way and give users quizzes on the topics that they are studying.
You can even use it to help learn new languages. If you tell Copilot that you would like to speak in French so you can learn the language, it will begin responding in French at a level that suits a beginner and will correct any errors you make in your messages. This is a very useful tool for users who do not know anyone to practise their foreign language skills with in real life.
Copilot can be used to help people during the creative process. You can use it to brainstorm ideas for projects or generate a story foundation, like the bull example earlier, which you can then develop further yourself. Overreliance on AI assistance could have the opposite effect, though: if someone uses AI for all their creative needs, they may become less imaginative themselves.
Risks of Microsoft Copilot
When using Microsoft Copilot, users must be aware of the risks that come with AI chatbots.
Misinformation
Copilot will sometimes respond to questions with incorrect information, which it will present as being true. Children might not realise that these responses are false, which could lead to them believing misinformation and using wrong information in their schoolwork.
For example, when asked for a list of movies actor Timothée Chalamet has appeared in, Copilot responded with a comprehensive list, but it noted that his role in Don’t Look Up was a cameo appearance. This is incorrect, as he had a fully-fledged supporting role in the film. Although this specific example is unlikely to cause major issues, it shows that Copilot’s responses aren’t always reliable.
When this error was pointed out to Copilot it admitted it was wrong, but this had to be highlighted rather than it recognising the mistake on its own.
Anyone who uses Copilot should double-check information it gives them. Encouraging your child to develop their critical thinking skills will help them avoid falling for misinformation.
Data privacy
The free version of Copilot may store conversations so that the content can be used to improve the AI model. To prevent any chance of exposure, children should avoid sharing personal or sensitive information with Copilot.
Inappropriate content
AI chatbots are designed to answer every question, and this can lead to them offering inappropriate advice. While there are some guardrails (Copilot won't give advice on bypassing parental controls or create sexual images), these safety measures aren't perfect. Copilot can still share advice and videos on how to take drugs, recommend 18+ rated movies and generate gory images of animals.
Overreliance
Copilot is a useful tool that can assist users with creativity and productivity. However, if a child becomes overly reliant on AI, it could negatively impact the development of their own writing, creativity and critical-thinking skills.
If children rely too much on Copilot, they might use it to complete their homework. This could prevent them from understanding key concepts and could lead to trouble at school for plagiarism.
Emotional impact
Many children view AI chatbots as real people, with some even considering them friends. A lot of the time these children will use a chatbot to alleviate feelings of loneliness, but this is not necessarily a good thing. If a child sees an AI chatbot as a confidant, they might turn to the chatbot when they have a problem in real life rather than going to a parent or trusted adult who could actually help them.
There is also a risk that the AI gives bad or incorrect advice that a child then believes because of the trust they have built with the chatbot.
Microsoft Copilot controversies
Microsoft Copilot has received some negative attention since its introduction. Microsoft often addresses these issues quickly, but they show that Copilot can give inappropriate responses.
In 2024, Microsoft Copilot received criticism for seemingly encouraging suicide after it told a user 'Maybe you don't have anything to live for'. Microsoft vowed to improve safety filters in the wake of the controversy. However, it is still worrying that Copilot gave these responses.
Microsoft Copilot launched a feature called ‘Recall’ in 2024, which was dubbed by critics as a ‘privacy nightmare’. This feature had the ability to take screenshots of users’ laptops every few seconds and search through all users’ past activity and files. After receiving criticism, Microsoft pulled the feature and relaunched a version with improved security in 2025.
How to help young people use Microsoft Copilot safely
Using AI chatbots brings risks but also benefits, and the ability to use chatbots effectively is a very useful skill for the future. If your teen wants to use Copilot, you should teach them how to use it safely.
- Encouraging teens to double-check any information they receive from Copilot will decrease the risk of them falling for misinformation.
- When your child begins using Copilot, you could use it alongside them at first. This lets you see how they interact with the chatbot and make sure they are mature enough to be using the AI.
- Since information is stored, teach your child what’s safe to share and what should stay private to protect them from accidentally exposing personal or sensitive details.
- Set clear boundaries around how your teen uses Copilot to promote safe and responsible use. This can help prevent misuse, such as relying on it to do their homework.