If you haven’t heard, ChatGPT is not your average chatbot.

Developed by OpenAI, it has surged in popularity as it’s able to understand human language, allowing it to respond to just about any query.

You can ask it to generate code for a program, come up with a plan for a birthday party, summarise a report, write a long piece on any topic, and even translate it to Bahasa Malaysia.

A mechatronics engineering student based in Kuala Lumpur, who only wanted to be known as Ian, says that for him, using ChatGPT was akin to engaging with a human rather than a bot.

“It’s a pretty fun experience as it feels like interacting with an actual person who is able to provide reliable answers.

“I believe it can help with writer’s block when you can’t figure out how to structure a sentence for a report,” he says.

However, since it was launched last November, questions have been raised about its impact on academia, as students can use the bot to do their work and pass it off as their own.

Ian admits that the thought of using ChatGPT for assignments has crossed his mind, but he has resisted the urge.

“The only thing that’s stopping me from doing so is my pride. The joy of submitting an original work will be dulled if it’s not my own.

“It’s also unfair to others who would have put in a lot of hard work for their assignments,” he says.

Writing wrongs

Dr Caryn Lim, Monash University Malaysia’s director of education excellence, says the university takes plagiarism and other breaches of academic integrity seriously as they can impact students’ personal and professional growth.

Lim says that Monash University Malaysia may consider designing assessments with AI in mind. — MONASH UNIVERSITY MALAYSIA

“Our policy on academic integrity defines plagiarism as ‘the act of using another person’s ideas or manner of expressing them and passing them off as one’s own’.

“So, yes, it could be considered plagiarism if the work that was submitted was written by an AI (artificial intelligence) tool like ChatGPT.

“This is not to say that using AI tools always constitutes plagiarism, but if a student was being assessed on their ability to write an essay, for example, and ChatGPT wrote the essay without the educator’s knowledge, then it is plagiarism,” she says.

Prema Ponnudurai, head of the School of Media and Communication at Taylor’s University and deputy director of its Education For All Impact Lab, explains that AI-powered tools such as Grammarly (for checking spelling and grammatical errors) and language-learning software have long been a part of academic practice.

According to her, the university relies on tech tools embedded in its teaching and learning portals to identify, evaluate and monitor plagiarism.

Prema says it’s important not to limit students’ access to technology that could be beneficial to their learning experience. — TAYLOR’S UNIVERSITY

“There are tools such as Turnitin Originality, which is part and parcel of the submission of assignments,” she says.

Turnitin has announced that it will incorporate detection tools capable of identifying the work of AIs, including ChatGPT, this year.

Prema also says that the university will explore new ways to safeguard academic integrity, including looking at apps like GPTZero, which was developed to determine if an essay was written by a human or ChatGPT.

OpenAI has also introduced the AI Text Classifier, a free tool that can be used to predict whether a piece of text was generated by AI.

However, it has limitations, as it was primarily trained on English content written by adults and requires a minimum of 1,000 characters of text.

OpenAI also admits that its classifier may not always be accurate, adding that it could mislabel both AI-generated and human-written text.

Drawing the line

When it comes to works of art as academic submissions, plagiarism can be harder to define.

The One Academy junior lecturer Pearson Ooi Chek Tung says that students can base their designs on existing work and still avoid being seen as plagiarising by applying their own style.

Ooi says art and design students are worried about AI stealing their jobs in the future. — PEARSON OOI

“They can argue that the work is a homage to their favourite movie or artist. So, we assess students based on execution and how well they have worked to bring their ideas or concepts to fruition.

“They have to submit a working file to show the ‘layers’ and progression of the work,” he says.

While educators ponder how AI may affect academic integrity, students, on the other hand, are worried about being made irrelevant by AI.

Ooi, who is based in Penang, says the students were alarmed when AI tools like Dall-E and Midjourney, which is accessible via the chat platform Discord, went mainstream.

Social media was abuzz with users sharing “their artwork”, created in minutes with a few simple text prompts.

“The Internet was a source of fear and anxiety for them because they started reading articles on how AI will make human artists like themselves irrelevant in the future.

“One of my students asked, ‘What should I do, sir? I’m paying so much money for my course, but AI will be taking over my work’,” he says.

Ooi responded by telling his students that clients will still have to rely on humans for specific needs.

“AI is good, but I don’t think it can fully replicate the way we work. For example, if the client wants an artwork done a certain way, AI can only do so much based on the information fed to it,” he says.


A media studies major who asked to be identified only as Lee says she, too, can’t escape the thought of how AI could change the way work is done in the future.

“Yes, I do have concerns about AI potentially coming to the workplace and replacing me.

“It is worrying to think that my fellow classmates and I, who may have bright ideas waiting to be shared and expressed, may not get the opportunity to show our capabilities at the workplace,” she says.

Boon and bane

There are concerns that as AI develops at a rapid pace, it will become harder to distinguish its work from that of humans.

Ooi says he and his fellow educators would be worried if AI tools like Dall-E started providing the working steps involved in creating artwork.

“If AI can show progression, then students who want to cheat can submit that as their working file.

“It would be hard for us to make the students prove that they have indeed done the work on their own,” he says.

According to Prema, getting caught for plagiarism could lead to a number of consequences.

A student may get a warning, be required to repeat the module or be suspended, which could ultimately lead to expulsion.

The offences would also go on the student’s record, she adds.

“This may tarnish the students’ image and reputation, possibly impacting their chances of enrolling in another university,” she says.

To date, neither Lim nor Prema has heard of cases where students have been caught submitting work produced by an AI.

“Because the issue is still quite new, the university is currently discussing a specific procedure on how to handle these cases,” Lim says.

In the United States, measures have been introduced to address the potential misuse of ChatGPT.

The New York City education department released a statement in January banning the use of ChatGPT on school devices and networks due to concerns that it does not “build critical-thinking and problem-solving skills”.

Meanwhile, Reuters reported that Sciences Po, a top French university, has informed students via email that it’s forbidden to use ChatGPT for the production of any written work or presentations.

In Malaysia, the discussion is ongoing, with Lim saying: “Many conversations are happening at various levels of the university and there is some anxiety around it, but we are still learning what it means for us.

“For the most part, we remain positive about generative AI tools and their educational potential.

“We are exploring ways to support students in using the tools ethically and responsibly,” Lim says.

Prema says it’s important not to limit students’ access to apps or any form of technology that could be beneficial to their learning experience.

“If we had banned calculators and the Internet in our classrooms, would we survive in today’s world?

“We must remember that innovation and creativity are human traits and AI apps perform based on the information we feed them,” she says.

Rethinking the rules

Dr Aznul Qalid Md Sabri, head of the AI department at Universiti Malaya’s Faculty of Computer Science and IT, says the university has formed a Natural Language Processing (NLP) task force to study the effects of ChatGPT-type technology, including its pros and cons for the teaching and learning process.

For Universiti Malaya, helping students learn how to use ChatGPT for research purposes is one of its top priorities, according to Aznul Qalid. — UNIVERSITI MALAYA

“The task force is looking at various possibilities. One option is to use apps like GPTZero to be able to detect AI-generated output,” he says.

He adds that the task force may look into the need to redesign assessment criteria.

“It may come up with certain specifications to help faculty members assess students in a way that allows ChatGPT to be used as a search engine, like Google,” he adds.

Monash University’s Lim also says that the university may consider designing assessments with AI in mind.

“So assuming that students will use ChatGPT, how can we incorporate it into our assessment task?

“This is an ‘if you can’t beat it, join it’ type of approach that is important to consider since AI tools like ChatGPT are only going to become more accessible and potentially relevant in future workplaces,” she adds.

Ooi says he would encourage his students to use AI tools like Dall-E as part of certain processes, such as moodboarding.

“This is an industry practice, where you create a reference board filled with images of the concept that you want to achieve in your final work.

“What most people do is spend a lot of time finding certain references or pictures to match their vision.

“With AI, you can save time by generating the images that you want,” he says.

Ooi also hopes to see more efforts to raise awareness about the ethical use of AI and other emerging technologies.

“If you’re using AI to generate artwork and then selling the artwork as your own, then that is not ethical.

“On the other hand, I also like how AI is helping more people explore their creative side.

“Like they may have ideas but couldn’t execute them due to a lack of drawing or technical skills,” he says.

Lim states that the university is in favour of supporting the responsible use of tools such as ChatGPT.

“As a first step, we’ve produced resources for students and educators around using generative AI tools in learning and teaching.

“We are also exploring procedures for declaring the use of AI when completing assessments,” she adds.

Prema adds that Taylor’s University is consistently reviewing its policies and practices to evolve with conversations around AI.

She believes that students should be taught how to properly reference AI-generated materials in order to avoid being accused of plagiarism.

“Most importantly, they must be instilled with ethical and moral principles, so they are able to self-regulate their actions internally,” she says.

For Universiti Malaya, helping students learn how to use ChatGPT for research purposes is one of its top priorities, according to Aznul Qalid.

“The university recently implemented guidelines for open book examinations to encourage lecturers to set up examination questions that would promote critical thinking among students.

“Lecturers may set up limitations to the reference materials used, and this may include the extent of ChatGPT use,” he says.

Though Lee says she has seen some pushback against AI in the workplace, she feels there must be a way for humans to coexist with technology.

“I’d like to think we might be able to come to a compromise and use AI as a ‘co-worker’ instead of seeing it as our replacement,” she says.