Can AI be safe, intuitive and ready for schools? That was the focus of our latest TeacherMatic webinar, ‘Safe, Ethical, Intuitive and School-ready AI with TeacherMatic,’ hosted in partnership with the Schools & Academies Show, featuring insights from leading educators and IT specialists:
The session explored how TeacherMatic helps schools adopt AI responsibly, with clear benefits for staff confidence, student outcomes and strategic rollout. It also explored how AI enhances (rather than replaces) the professional judgement of every educator who uses the TeacherMatic platform.
Building AI that’s designed for real schools
Rather than building ‘one-size-fits-all’ tools, TeacherMatic was developed around the realities of teaching in schools today. In the opening of the webinar, Peter Kilcoyne shared how the platform was shaped by educators’ day-to-day needs to put simplicity, clarity and educational value first:
“In my long career since 1986 in education, I’ve seen workloads grow and grow and grow over those years. This technology offered some real potential benefits for saving time and, therefore, improving teacher health and wellbeing, retention, recruitment, creativity, and generally improving teaching, learning and assessment. But it was also clear to me that there was a big ‘but.’
And that’s tools like ChatGPT, Gemini and Copilot. They’re not necessarily easy to use. That when a teacher goes onto these tools, they’re faced with a blank screen with just a box to type in. And it’s kind of, “What do I do with that?” And in order to get good outputs from these tools, you’ve got to be quite skilled in this thing called ‘prompt engineering.’”
This inspired Peter and his team to build TeacherMatic by partnering with 300+ teachers from around the UK to test prototypes and take an active role in the initial design process through a ‘co-design model.’ Collaborating with educators has been the focus from day one, and it remains so as new advances and discussions raise questions about how well AI is designed for schools:
Is it secure? Is it easy to use? Is it cost-effective?
We’ve conducted extensive research into the platform’s time-saving benefits, with the following findings:
- 2 hours per week (Jisc study)
- 3.8 hours per week (TeacherMatic user survey)
- 4–8 hours per week (cited by individual organisations)
Saving teachers’ valuable time is essential, as is providing responsible AI support.
Safety, compliance and trust as AI foundations
For schools considering AI, safeguarding and trust are non-negotiable. That’s why TeacherMatic places data security, compliance and transparency at the heart of all its generators and tools.
During the webinar, Peter stressed how these principles are embedded in every feature and TeacherMatic’s approach to data responsibility.
“We offer a very high-level of security, so we are Azure-hosted, we’ve got content filtering.”
Notably, this level of protection isn’t available with free versions of generative AI tools such as ChatGPT, Copilot and Gemini.
“These large language models are full of biases and we have filters to try to reduce the bias in outputs. We don’t feed any large language models, so anything you upload goes nowhere. It’s a walled garden, so your data, your intellectual property is safe.”
A school‑wide approach to adoption: Insights from Connect Education Trust
Rolling out AI across a multi-academy trust requires a scalable strategy. Gulev Karayel shared how Connect Education Trust built momentum through a shift in their pedagogical approach:
“We’ve got a one-to-one device for every pupil, and it was very much more, “We’ve got these tools, what shall I do with it?” And then, it was the approach of thinking about the pedagogical practice and then the belief. Very quickly, we knew that there had to be this shift. There had to be that shift of that mindset.
We created a strategy for AI, throughout our process as well, with the mind of that pedagogical approach, and why we are doing that, why are we introducing these tools and what impact have they got? So we started with the ‘why.’”
This journey to discover the ‘why’ led the Connect Education Trust team to create a questionnaire, which revealed insights about workload, accessibility and personalised learning.
From the questionnaire, Connect Education Trust was able to develop an action plan that set out the steps needed to implement their new approach: how to embed AI with purpose.
“We had different groups of stakeholders, from our operational, admin staff, site managers, to our classroom teachers. We wanted our approach to be the same for all of the roles, to make sure that everyone was really clear about what is the approach we are taking.
We didn’t want AI, in any kind of way, to replace what we were doing, some of our good practice. But what we wanted to do is to enhance that creativity: that critical thinking, that saving time, but at the same time, being very cautious about making sure of the safety as well.”
Gulev went into further detail about how her team introduced TeacherMatic through a pilot scheme, assigning TeacherMatic “champions” to help identify the gaps and needs for trusted, reliable AI support across workload, data and policy.
How school culture supports meaningful AI use: Anglo European School case study
We understand the importance of each school’s culture and how it can make or break the AI adoption process.
Cate Peeters offered a powerful example of how Anglo European School integrated TeacherMatic in a way that aligned with their values and teaching priorities as part of their National Priority Project through National Consortium for Languages Education (NCLE) funding.
“We’re doing an assessment project, but we wanted to find a way to integrate AI in a safe way, and in a way that would not only help the MFL teachers that are involved with the National Priority Project […] to kind of piggyback off the back of this funding, to sort of wider service our whole school community when it came to our staff, but also our professional support staff.”
When Cate’s leadership team discussed the key questions they had about using AI in their organisation, there was a consensus:
“We did not feel comfortable with our staff feeding that large language model, with anything to do with school. So, we knew we needed to find something that would help us with that, and so that’s kind of the third question about what ethical and safeguarding and data privacy considerations we need to address.”
Echoing Gulev’s points, Cate added that the benefits of AI depend heavily on how a school community learns to use it for their needs and integrates it into their daily routines:
“In what ways can AI change the professional role of teachers and what new skills or mindsets might be required? And then we had to think about, well, how can we incorporate this into our CPD plan for the year, for the years ahead.”
These reflections demonstrate how policy, pedagogy and professional learning must be considered when implementing AI responsibly.
What this means for educators
The webinar surfaced a number of clear messages for schools and trusts that are considering, or about to begin, their AI adoption journey:
- AI only delivers impact when it is ethical, transparent and accountable
- Implementation succeeds when staff feel supported, trained and trusted
- The greatest value comes when AI is used to enhance teacher expertise, not replace it
- Leadership, collaboration and shared ownership are essential for scaling meaningful use
Ultimately, AI confidence grows through culture, professional learning and a clear strategy that includes your entire team.
Watch the full webinar recording
Interested in how TeacherMatic can work for your school or trust?
Here’s how to get started: