
Should my kids use AI to do their homework?

Artificial Intelligence, like all technological advancements before it, is changing our world. What’s different, perhaps, is the speed of that change. As AI rapidly evolves, our way of life is affected in ways we hadn’t even considered. Will AI take our jobs? Teach our kids? Infringe upon our very humanity? For some it may even feel scary, like an acceleration towards some Terminator-style doomsday scenario.

In Radio 4's The Artificial Human, social psychologist Aleks Krotoski and Doctor Kevin Fong set out to "solve" AI by answering our pressing questions, starting with one from Anna in Aberdeen – should we allow our kids to use AI to do their homework?

The Artificial Human

Artificial Intelligence is in our homes, phones, schools and workplaces... What does this even mean? In The Artificial Human Aleks Krotoski and Kevin Fong answer the questions we're all asking about AI.

How is AI affecting our children’s ability to learn?

Having all gone through some form of schooling system, we can find it shocking to hear how our children’s education differs from our own. Anna has concerns regarding AI’s changes to the classroom. Do teachers and educators really know what they mean when they say “AI”? And will AI supplant the academic exploration that fosters learning in young minds?

Will AI make children lazy, brute-forcing generative AI to get the response they need?
Anna

Although AI may seem amorphous and unproven, it’s a tool. When used with sound judgement, AI can enhance children’s learning. Kevin compares his schooling with previous generations, pointing out that his cohort no longer had to “memorise logarithms or use slide rules”. Neither of these advancements hindered his learning, instead shifting the focus to problem solving. AI is already being used in classrooms as a helpful tool, including text-to-speech software and language translation software such as Google Translate.

Anna still worries, however, that the advent of AI will make children lazy, brute-forcing generative AI to get the response they need. Her concern is perhaps reminiscent of the aversion to GPS sat nav systems for navigating roads, reducing the map awareness gained by using the classic A-Z Road Atlas. How can we teach kids to use AI while still understanding the questions at hand and having ownership over their own knowledge?

Will AI help my kids cheat?

Plugging an essay prompt into generative AI has made plagiarism easier than ever, bypassing the need to truly understand the material the student submits. So, are fears that students might use AI to cheat on homework, tests and assignments justified?

Berry Billingsley, a Professor in Science Education at Canterbury Christ Church University, is optimistic about the inclusion of AI in education, suggesting we ask, “How is the AI going to help us to think better?” rather than assuming it will think for us. That said, Berry does admit that the temptation to cheat is real. Fifty percent of Berry’s students say they’ve put an assignment into generative AI and been impressed by the results.

Creative educators are combating these kinds of shortcuts by crafting tasks that play to the weaknesses of AI. For example, AI acts like a search engine by presenting the most popular results first. By including a personal element in the assignment, such as a geographically specific prompt, the AI’s answer becomes less convincing. Additionally, younger children might be kept away from advanced AI tools where possible, so that they develop critical thinking skills before driving the AI equivalent of “a souped-up fantastic sportscar”, as Berry puts it.

How does AI alter where we get our facts from?

One thing AI will impact is what Berry calls “epistemic agency” – put simply, the choices we make when we’re trying to answer a question, including how we interpret said question and how active we feel in solving it. Berry gives an example question, such as “Why did the Titanic sink?”, explaining that some people may take a historical approach to this question while others would take a purely physics-based approach. The point is that AI may provide the answers, but it still requires critical thinking to provide the correct prompt for our purposes.

[AI] doesn’t just lie, but it lies in an incredibly enthusiastic, convincing way.
Ollie Bray

Berry suggests that to develop these critical thinking skills, it may be necessary to develop a new curriculum or limited lesson plan devoted to AI usage skills. The idea wouldn’t be too dissimilar to old-school library lessons that taught students how to use the Dewey Decimal system. The end goal would be to empower students to feel comfortable using new technology and make their prompts targeted, efficient and ethical.

Can critical thinking fact check AI?

Of course, even the best AI input still requires students to think critically about the answers they’re given. Ollie Bray, Strategic Director at Education Scotland, is all too aware of how misleading AI can be: “[AI] doesn’t just lie, but it lies in an incredibly enthusiastic, convincing way”. One of the exercises he conducts is to produce text through ChatGPT, then challenge educators to find the inaccuracies within its results.

Given current AI’s tendency towards fiction, it’s increasingly important to actively teach kids critical thinking. This means knowing where to find information, citing their sources, and understanding the validity or bias of those sources. Although AI such as ChatGPT pulls from across the internet, the internet itself is biased, as are its most popular web pages. These critical thinking skills also help with media literacy online beyond AI, from news to social media to forums.

What makes good homework?

Prof. Berry Billingsley says teachers are fed up with correcting AI-generated essays.

Will AI affect all kids equally?

Even if AI can benefit learning in the ways that Berry and Ollie describe, there’s still the question of whether it will advantage kids evenly. Thankfully, the technology is readily accessible. Anyone with a computer and an internet connection can access services like ChatGPT and Google’s Gemini (formerly Bard). Additionally, AI technology can assist disabled children through text-to-speech functionality and other accessibility enhancing software.

But while the technology is readily available, education and instruction on how to best use the technology may not be. Educators who are uncomfortable adapting to new technology or politically inclined against the use of AI may avoid teaching it specifically. Yet others may simply lack the up-to-date knowledge of AI required to properly teach it. Private schools may end up better equipped to handle AI education than public schools.

Regardless, the growth needs to begin somewhere. Whether it’s with officials, educators or even parents, engaging with AI technology in education could prove advantageous, but that means keeping up with progress in the AI space whilst deciphering the truth when reading AI-generated answers. Ollie hopes that “we use [AI] in their most powerful way to transform education”, rather than simply building on the outdated foundations of traditional education.

Listen to The Artificial Human: Should I let my kids use AI for their homework? here.
