Students may be punished if caught using artificial intelligence

ChatGPT and other AI tools that can generate essays within seconds are seen as a significant threat to academic integrity

Universities are being asked to consider the repercussions for any graduates or students found to have engaged in serious academic misconduct during their studies. Photograph: iStock

Students or researchers could be punished for using artificial intelligence (AI) for exams or assessments under new measures to tackle academic cheating.

The State’s watchdog for academic standards, Quality and Qualifications Ireland (QQI), is understood to be planning to extend academic misconduct legislation to cover the use of artificial intelligence for the first time, according to Government sources.

In addition, QQI is to ask all education providers to review their internal policies on academic integrity and ensure they capture these newly evolved risks.

Each provider is being asked to consider the repercussions for any graduates or students found to have engaged in serious academic misconduct during their studies. Where such misconduct has occurred, awards or qualifications may be withdrawn.

Separately, if an individual who is on a professional register is found to have facilitated cheating by an enrolled learner, processes to notify their relevant professional body should be initiated.

A white paper on academic integrity has been circulated to all higher education providers and is scheduled to be published in 2024.

Until recently, contract cheating – such as essays and assignments written to order – was regarded as the biggest risk to academic integrity.

However, AI tools such as ChatGPT, which can create human-like essays within seconds, are regarded by academics as a much greater threat because they are typically free and easy to access.

Minister for Further and Higher Education Simon Harris said he believed AI presented a “huge opportunity” in education, but that it could not be “masked as academic learning and we have to safeguard against that.”

“These issues are not novel. Plagiarism, cheating and academic integrity have been at the helm of the regulator’s agenda for years,” he said.

“However, the use of AI has posed serious challenges for academics and education providers. It used to be the case that contract cheating was the main issue. This is where a third party is paid when academic assessment is outsourced. However, the use of AI has made it harder to establish whether misconduct has taken place.”

ChatGPT, the AI language model from OpenAI, has been making headlines since November 2022 when users saw its ability to answer complex questions within seconds. Photograph: John Walton/PA Wire

The Qualifications and Quality Assurance (Education and Training) (Amendment) Act, enacted in July 2019, specifically empowers QQI to prosecute those who facilitate academic cheating. It covers areas including impersonation and provision of cheating services.

Many third-level institutions have been updating their own academic integrity guidelines in recent months and revising how they assess students in light of the threat posed by AI.

These measures include incorporating more oral presentations, setting more detailed and nuanced assignment questions, and requiring students to submit evidence of draft work.

ChatGPT, the artificial intelligence language model from OpenAI, has been making headlines since November 2022 when users saw its ability to answer complex questions within seconds.

It can write nuanced essays and poetry, generate code and translate languages, among other functions, in response to prompts from human users. An updated version performed astonishingly well on demanding tests such as bar exams (beating 90 per cent of humans) and advanced college entry tests, as well as tax returns and accountancy challenges.

The new tool sparked alarm on college campuses as videos circulated online with millions of views showing students how to use it for assignments.

It has since been updated with the capacity to solve complex maths problems and generate computer code, among other functions, while tech giants such as Google, Meta and Microsoft have been pouring billions into rival tools available online. Apple is also reported to be developing its own tools.

While some saw ChatGPT as an existential threat to higher education a few months ago, the conversation has evolved.

Talk of banning or restricting its use on campus has been replaced by a recognition that the technology is here to stay. Now, some are figuring out how to include it in their assignments.

The technology, which artificial intelligence experts refer to as a neural network, is built by training on vast amounts of digital text to recognise patterns.

Carl O'Brien

Carl O'Brien is Education Editor of The Irish Times. He was previously chief reporter and social affairs correspondent