Labour has pledged to put the UK at the forefront of a revolution in AI and edtech, but should we really be using ChatGPT in the classroom? We get two writers to hash it out in this week’s Debate
YES: When used critically, LLMs can help students understand things more deeply
Let’s be honest: tools like ChatGPT are already in classrooms, whether we like it or not. The real question is, do we as education leaders help students use AI responsibly, or leave them to figure it out on their own?
I think AI in schools should be about preparing young people for life, not just exams. When used critically, LLMs can help students understand things more deeply, explore ideas creatively and learn in ways that suit them. It’s not about letting AI do the work, it’s about teaching students how to think with it: how to question, check and make sense of what it gives back.
And let’s talk about fairness. Students from better-off households – with access to tutors, private devices and fast Wi-Fi – are already using AI to get ahead. Young people from lower-income families don’t always have that support at home. Banning AI in schools doesn’t level the playing field, it widens the gap, leaving those who could benefit most even further behind.
And this isn’t just a technical matter. As AI permeates every sector, from journalism to the creative arts and everything in between, it’s vital that every young person has the digital literacy to thrive and to understand its potentially profound impact on the world they’ll be entering.
We’re proud the UK’s AI sector is one of the best in the world. If we want the next generation to lead it, with confidence and integrity, we have to start now by making AI part of learning, not something to fear. That means proper guidance, teaching digital skills, and helping students use tech wisely. ChatGPT isn’t the problem – not knowing how to use it critically is. Let’s give our young people the tools to keep up, and lead.
Julia Adamson MBE is the managing director of education at BCS, The Chartered Institute for IT
NO: Children don’t just need answers – they need struggle
Introducing ChatGPT into classrooms might look modern, but it risks impoverishing the very thing education is meant to nourish: the human mind.
Children don’t just need answers – they need struggle. Discomfort. Slow, painful clarity. Thinking is forged in that friction. We risk replacing deep cognition with a veneer of fluency. If students outsource the process of forming ideas to a machine, what exactly are they learning?
Throughout most of Western history, learning was grounded in memorization – not as rote drudgery, but as the root of creativity, rhetoric and understanding. In the Roman world, under authors like Virgil, to memorise was to internalise – to shape the architecture of the mind itself. Students recited entire epics not just to remember, but to become more articulate, imaginative and cognitively agile.
Today, with the world’s propositional knowledge a prompt away, we face a paradox: unprecedented access, but vanishing depth. We may raise a generation fluent in navigating surface information, but devoid of the slow-burn wisdom that only comes from true mental possession of an idea.
That kind of internalised knowledge is the cornerstone of a high-functioning brain. Without it, reasoning becomes brittle, attention fragmented and originality scarce.
I’m not anti-AI. I build with it. More than that, as the founder of Quid, a leader in AI-driven trend analysis and insight, I’ve been working with LLMs since 2016. But in schools, we must put our children’s development first.
Let children read slowly, write awkwardly, speak clumsily – and learn to think for themselves.
Bob Goodson is the president of Quid and co-author of Like: The Button That Changed the World
THE VERDICT
Should children be allowed to use ChatGPT in the classroom? With Labour this week pledging to put Britain at the forefront of a “revolution” in education technology, it’s as good a time as ever to weigh up the debate.
On the yes side, Ms Adamson is right to lay out the obvious: ChatGPT is already in the classroom. Not to mention the big bad world children will eventually have to engage with. Better, then, to teach children how to use ChatGPT critically so they are equipped for an adulthood deep in the throes of the AI revolution. Certainly rational. But may we also suggest that allowing schools to shield children from the big bad world (where citing Virgil is all too rare), at least for a little while, is not such a bad thing?
It can be easy to see schools simply as funnels for later GDP-yielding members of the workforce, but their most critical purpose is just to teach us how to think. ChatGPT, as many current GDP-yielding members of the workforce know, is popular exactly because it allows us not to think. Let’s not get the next generation dependent on it.