New guidelines to address ‘dangers’ of AI in the Island's schools

Pupils are beginning to use artificial intelligence such as ChatGPT to help with schoolwork.

THE Education Department is developing ‘guidelines and procedures’ for the use of artificial intelligence in schools amid fears the technology is aiding cheating and could lead to the ‘death of coursework’.

Fears have been raised that text-generating chatbots such as ChatGPT are being exploited by students in a bid to improve their grades.

Nationally, a group of UK school heads has launched a body to advise schools on the risks of AI and help protect them from it.

In a letter to The Times, the teachers warned of the ‘very real and present hazards and dangers’ to education and said that schools must collaborate to ensure that AI was working in the best interests of pupils.

And Children’s Minister Inna Gardiner said that although AI could bring a range of benefits, schools also needed to be aware of its dangers. She said: ‘AI presents significant opportunities for education, and I know that the research and discussion in this area is ongoing. In the UK and elsewhere, AI-powered tutoring systems are being used to provide personalised instruction and feedback to students. AI is also being used to automate admin tasks to free up teacher time for student interaction.

‘However, there are also risks and challenges associated with AI. AI-powered tools can enable cheating, including through automated essay generators. Schools and colleges need to make sure that they have robust measures in place to prevent cheating and ensure academic integrity.’

Huw Davies, who teaches GCSE and A-Level English at Hautlieu, said the number of students who asked ChatGPT to write coursework extracts, essays or speeches (which are part of the GCSE curriculum) was ‘absolutely insane’.

‘The most obvious example comes through in coursework,’ he said. ‘To them, it sounds like such an easy fix because they can ask ChatGPT to write them a polished paragraph. But teachers aren’t dumb, and as soon as it sounds different, it’s an immediate red flag. All students have developed a unique writing voice and style, so it can be really obvious when there is a sudden change in that.’

While exam boards have plagiarism checks in place, he said the procedures to catch AI use were still in development because ‘it is all still so new’.

At the moment, Mr Davies said he had to rely on ‘scare tactics’ when reprimanding students for using the technology.

‘I remind the students that they should be using their own work, and that when these are externally marked and the exam board recognises it as copyright or plagiarism, then that can affect not only the individual student but the whole cohort. As soon as the exam board believes there is malpractice, they will believe we have facilitated it and all the students could see consequences.’

He added: ‘It’s not ruining the education system, but it is a danger for things like coursework, which is about going away and producing a piece to the best state it can be. The kids will flock to a technology that could make that easier. It’s the death of coursework.’

One of the most worrying aspects of AI, Mr Davies said, was that ‘it has a tendency, if it doesn’t know, to make things up’.

He said: ‘Students need to think critically about what information they are being given and whether that AI has an agenda, and who is programming it. We could end up with kids using a monoculture of answers.’

The ‘vast majority’ of students were still not using it, said Mr Davies, because they recognised that ‘they are not going to get the rewards and learning’.

Deputy Gardiner said: ‘In Jersey, some schools are exploring AI-related topics as part of science, technology, engineering and mathematics education, introducing students to concepts like machine learning and data analysis. However, the extent of AI integration into the broader curriculum may vary, and it is an area that is being reviewed in line with the digital education strategy.

‘Like other governments, we recognise that we need to address ethical concerns and data privacy. CYPES is in the process of developing guidelines and procedures for the safe and responsible use of AI to support learning. We will provide updates on the progress of this in due course.’

Deputy Catherine Curtis, chair of the Children, Education and Home Affairs Scrutiny Panel, said: ‘AI may well be a disruptive innovation in education and not something that can be incorporated into current processes, but rather something that changes education paradigms.

‘The important thing to do is to engage with schools to get their views, which my panel will be aiming to do.’
