📎 Copilot 2: Return of Clippy

This time it's personal!

Good afternoon human,

It is the Easter break here in the UK, which affords me a bit more time to put these together and with the most up-to-date information that I think is relevant and interesting in all things AI. Grab a drink and a biscuit, and dive right in.

📚 Knowledge builders

  • 72% → Teacher Tapp had some interesting data from a recent poll where it asked teacher to self-report how much time they believe AI had saved them the past week. 72% if teachers believed that it saved them upwards of 2 hours, with 2% of teachers believing that it saved them more than 10 hours. The full breakdown can be found below.

    Obviously this data set needs to be cautioned as all self-reporting does. I do think quantifying time savings with AI is a tricky business to do, but I do know one popular AI tool that tries to do this in a very simplistic way. I do wonder how many teachers used that number to support their self-reporting?

     

    Regardless, there is a clear trend that for those that use AI, they perceive it to save them time, and this will only drive up adoption in the future.

  • Protecting children’s privacy when using Artificial Intelligence → In the depths of the DfE website, there is a useful page around AI and pupil data. It precedes the AI product safety announcement, but I do believe it is a crucial watch for senior leaders who are responsible for safeguarding and internet safety. The video can be found here, but I have broken down some key areas with time stamps below.

    Checklist for Leaders: Responsible AI Use in Education

    • Implement the "Need, Read, and Proceed" framework: [00:36]

      • Need: Before using AI, assess if it's truly necessary to improve educational outcomes [02:41].

      • Read: Ensure you are familiar with your school's data protection policies [02:56].

      • Proceed: Verify the accuracy, relevance, and ethical compliance of AI-generated content [03:26].

    • Understand Open vs. Closed AI Tools: [00:44]

      • Be aware of the data protection implications of each type.

      • Avoid using identifiable information with open AI tools [01:00].

      • Recognize that closed AI tools generally offer better data security [01:27].

    • Define Personal Data: [01:56]

      • Clearly understand what constitutes personal data, including sensitive information.

    • Establish Data Breach Protocol: [03:44]

      • Create a clear process for reporting accidental input of personal data, including notifying the Data Protection Officer or head teacher.

    • Provide Additional Resources: [03:57]

🤖 Industry updates

  • LLMs Pass the Turing Test → Seventy-five years ago, Alan Turing (1950) proposed the Turing test to assess machine intelligence. A human interrogator communicates via text with both a machine and a human, attempting to identify the real human. If the interrogator cannot reliably do so, the machine is said to have passed, demonstrating its ability to mimic human-like intelligence.

    In a recent research article, it seems that when prompted to adopt a humanlike persona, ChatGPT 4.5 was judged to be human 73% of the time - significantly above random chance. The results constitute the first empirical evidence of an LLM passing the Turing Test. What does this mean for education? Right now, not so much. However, there are potential implication again for safeguarding and our young people if it is becoming increasingly difficult to determine between LLMs and humans.

  • Copilot updates â†’ A few days ago, Microsoft unveiled some planned enhancements to the Copilot (the version you would use with a personal account. Readers of a certain age… may well remember Clippy, arguably the first digital ‘assistant’ who would ask us if we needed help formatting letters etc. One of the new enhancements planned for Copilot is called ‘Custom Appearances’. The user can give Copilot an appearance, in the case of the Microsoft demo, they showed off Clippy. These are the updates coming to Copilot that could be relevant to teachers:

    • Vision > You will be able to use your phone’s camera and let Copilot interact with what it sees. You could potentially use this to identify common misconceptions in some classwork and generate some tasks to address them.

    • Podcast > Copilot can now generate AI-powered podcasts that deliver personalised audio content based on your interests. This provides an engaging and effortless way to consume information.

    • Deep Research > Deep Research is a new Copilot feature that streamlines complex, multi-step research tasks, significantly reducing the time needed to gather and analyse information. For educators, this could be invaluable first step when preparing to teach a new topic.

✨ Fresh prompts

  1. Humanlike persona → Carrying on from our Turing Test, it is great to see that research involving LLMs are increasingly including the prompts that are used. For me, this is good practice that ca help validate claims made be researchers. It also provides a glimpse into new prompt styles. Below is the prompt that was used be the researchers who conducted the Turing Test study, and it is a beast.

The full PERSONA prompt used to instruct the LLM-based AI agents, (Jones and Bergen, 2025).

Did you find something of interest in this edition? If so, share it with your colleagues and wider network by clicking the button above!

Until next time, keep on prompting.

Mr A 🦾

Reply

or to participate.