As semester one concludes, university and school students around the country have submitted their assignments and now anxiously await their results.
For many, the first to assess their work won’t be their professor or lecturer, but rather anti-cheating software like Turnitin. The technology mercilessly compares each assignment against other student work, academic articles and the world wide web, picking up on similarities requiring further investigation.
Universities and schools routinely use software like Turnitin, alongside academic integrity policies and staff training, to guard against plagiarism. Yet the emergence of artificial intelligence, or ‘AI’, writing and paraphrasing tools has given rise to a new form of cheating that is testing the capabilities of Turnitin and educational institutions, and raising fundamental questions about originality and the future of learning and assessment.
In March, Professor Peter Coaldrake, chief commissioner of higher education regulator the Tertiary Education Quality and Standards Agency, wrote to the Vice-Chancellors and CEOs of all Australian universities and higher education institutions advising them to ensure academic integrity policies were up-to-date and communicated to students.
He urged all institutions “to ensure that higher education students have a strong and current understanding of academic integrity that recognises the fast-moving pace of activities being leveraged by individuals and companies that seek to profit from academic misconduct.
“Increasingly, this includes file-sharing sites and written works that are produced by artificial intelligence paraphrasing and text-generating tools,” the commissioner wrote.
Such AI writing tools have gained prominence since the 2020 release of GPT-3 – short for Generative Pre-trained Transformer 3 – a technology developed by research lab OpenAI.
GPT-3, and other models like it, can write human-like text from a few prompts in seconds. The technology is based on machine learning, where systems are trained on text scraped from the internet. Other examples of AI-driven content creation tools include ‘spinners’, which can paraphrase text from an original source.
Dr Lucinda McKnight, an expert in writing and teaching for the digital age, said plagiarism concerns meant that students’ use of these tools had largely remained a secret practice.
She explained how a student might use the technology – “they find an essay on Romeo and Juliet, on a similar topic to the one that they’ve been given, on the internet. They pop it into an AI spinner, and you know it will produce a whole new version of that essay that they can submit.”
“The kids are onto it, they’re all spinning their essays,” she said.
But while AI tools capable of drafting high-quality, seemingly original assignments in seconds might seem like a clever way for wannabe cheaters to evade anti-cheating software, the plagiarism sleuths at Turnitin say even robot writers leave their trace.
“They leave very telltale signs”, said Eric Wang, the vice president for AI at Turnitin.
He said Turnitin has several techniques for detecting a robot writer’s signature.
One example is the AI’s tendency to use high-frequency words more often than a human writer would. High-frequency words are commonly used words like ‘you’, ‘the’ or ‘what’, and are the first ‘sight words’ prep students are taught to read and spell.
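The idea behind this kind of signal can be sketched in a few lines of code. The snippet below is a simplified illustration, not Turnitin's actual method: it measures what fraction of a text consists of common words, using a small hypothetical word list. A real detector would compare such ratios against statistical baselines for human and machine writing.

```python
from collections import Counter

# A small illustrative set of high-frequency 'sight words'
# (hypothetical -- not Turnitin's actual list)
HIGH_FREQ = {"the", "a", "of", "to", "and", "you", "what", "is", "in", "it"}

def high_freq_ratio(text: str) -> float:
    """Fraction of the words in `text` that are high-frequency words."""
    words = [w.strip(".,!?;:\"'()").lower() for w in text.split()]
    words = [w for w in words if w]
    if not words:
        return 0.0
    counts = Counter(words)
    hits = sum(n for word, n in counts.items() if word in HIGH_FREQ)
    return hits / len(words)

sample = "the feud between the two houses is what drives the tragedy"
print(f"{high_freq_ratio(sample):.2f}")
```

A detector built on this idea would flag a submission whose ratio sits unusually far from the human baseline, rather than judging any single sentence.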
There can also be clues in the metadata around a document, like the IP address it was written on, or submitted from, Wang said.
“Being able to emulate someone’s style, it’s doable. But changing the metadata of your machine such that your paper formatting matches something else, that takes a little bit more effort.”
Spinners were harder to detect, he said, but Turnitin has its own AI that can search through content efficiently and can often identify the potential original source document.
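One simple way to hunt for a likely original source is to compare overlapping word sequences between a submission and candidate documents. The sketch below is an illustrative toy, not Turnitin's system: it ranks candidates by Jaccard similarity over word trigrams. Because spinners swap individual words, production systems rely on far more robust semantic matching, but the ranking idea is the same.

```python
def ngrams(text: str, n: int = 3) -> set:
    """Return the set of lowercased word n-grams in the text."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def overlap_score(submission: str, candidate: str, n: int = 3) -> float:
    """Jaccard similarity of n-gram sets -- a crude source-matching signal."""
    a, b = ngrams(submission, n), ngrams(candidate, n)
    if not a or not b:
        return 0.0
    return len(a & b) / len(a | b)

# Rank hypothetical candidate sources against a submitted essay (toy data)
submission = "the feud between the two houses drives the tragedy forward"
candidates = {
    "source_a": "the feud between the two houses drives the tragedy forward relentlessly",
    "source_b": "photosynthesis converts light energy into chemical energy",
}
best = max(candidates, key=lambda k: overlap_score(submission, candidates[k]))
print(best)
```

Against paraphrased text the raw trigram overlap shrinks quickly, which is why the article notes spinners are harder to detect than copy-and-paste.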
Even as anti-cheating technology improves, a spokesperson for the Tertiary Education Quality and Standards Agency said, “anti-plagiarism software alone will not detect or address all breaches of academic integrity.
“Staff training, institutional academic integrity policies, assessment design, student education and awareness activities are all required to support the effective use of anti-cheating software.”
Turnitin calls the rise of AI writers and spinners a new wave of cheating made possible by technology. It follows the ‘first wave’ of the late 1990s, where the internet gave students access to vast amounts of information and the ability to copy and paste text directly from the web.
The second wave came from the ability, via social media and other digital platforms, to connect with a huge pool of potential authors. Coaldrake, in his letter to education providers, said the regulator had been fighting back against this, working with platforms like Meta (owner of Facebook, Instagram and WhatsApp), LinkedIn and Gumtree to take down more than 300 posts promoting academic cheating services.
Wang said a key difference with cheating made possible by AI writers and paraphrasers was the lack of “meaningful financial, perhaps moral boundaries” that would usually be present when a student asks another person to contract-cheat.
He would not say how extensive the use of AI tools was, but noted, “I think there will come a time where this becomes a very visible problem to educators”.
While Wang acknowledged the value and potential benefits of these technologies, he said the rise of AI writing tools is raising questions about the future of learning and assessment that – “every single journalism department, English department, I mean heck, even every physics department with their lab reports” – is now grappling with.
“What does originality mean, in a world where AI assistance is, you know, increasingly prevalent, and ubiquitous?”
McKnight argued that instead of trying to shut down and prevent the use of AI writers, educational institutions like universities and schools should engage with the technology critically, understand its limitations, biases and risks, as well as its benefits and capabilities.
She said the rise of robot writers raises questions for educators and society at large.
“How are we going to quantify the human content in writing? How are we going to acknowledge it and how are we going to recognise what other kinds of sources were used?”
McKnight said she believes the writer of the future will be ‘post-human’.
“When I think of the writer of the future… I think about a sort of merged human-machine entity that is producing writing.”