It is sometimes difficult to determine whether texts created by generative artificial intelligence (AI), or other works created using AI-based tools, are genuine originals created by the person using the AI tool, or whether they are in fact copied from the large body of training material used to build the AI tool.

In the case of a take-home exam, an assignment based on the student's own work, or other assignments that require the student to create, process, revise and finalise their work in an environment that is not fully under the control of the teacher, examiner or invigilator, it can be difficult to know whether the student has produced the work themselves. There are AI-based tools that can produce an academic text at a very high level, and the cost is no greater than buying a professionally written essay. But is a generated text considered plagiarism?

Summary

Usually, generated text cannot be seen as plagiarism, but it may still be considered cheating.

Intellectual property

In the absence of clear regulations in Swedish higher education, the question can be partially answered by examining the concept of intellectual property (IP). The case of OpenAI, the company behind the well-known generative AI ChatGPT, is addressed in a response from the European Commission's IP Helpdesk. The answer is that an AI cannot own intellectual property, and the company disclaims ownership of what is generated. Whoever instructed the AI tool therefore owns what was generated, which also means that the responsibility falls on that person. Thus, even in the unlikely event that the text or work turns out to be plagiarised in a traceable way, the person who instructed the AI is responsible. The question is complicated, but in general, text generated by a generative AI is not considered to belong to the AI. This may of course differ depending on the contract the user has with the provider, but as a rule an AI cannot hold intellectual property rights or intellectual property. You can read more about the specific provisions in paragraph 15 of the European Parliament Resolution of 20 October 2020 on Intellectual Property Rights for the Development of Artificial Intelligence Technologies.

The reason there is a very low probability that an AI-generated text would be flagged for plagiarism by text-matching tools such as Ouriginal is that this is not how the AI works. It rarely generates text by copying; rather, it creates new sentences and paragraphs based on the language patterns it has identified in the training data.

Generative AI, such as GPT (Generative Pre-trained Transformer), works by training on large amounts of data to learn patterns and generate new text based on this training. GPT has no creative thoughts or intentions of its own, but rather generates text based on patterns it has learnt from its training data. Therefore, it can be argued that the responsibility for any plagiarism lies with the creators or users of the AI model, rather than the AI itself.
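To illustrate why pattern-based generation rarely produces verbatim copies, here is a deliberately simplified sketch: a toy bigram model in Python, not how GPT is actually implemented. It learns which word tends to follow which in a small "training" text and then recombines those patterns into a new sentence. The training text and output below are invented for illustration only; real generative models are vastly larger, but the principle that output is recombined rather than copied is the same.

    # Toy illustration only: a tiny bigram model, not how GPT is built.
    # It learns which word tends to follow which, then recombines those
    # patterns into a new sentence that need not exist verbatim anywhere.
    import random

    training_text = (
        "the student writes the essay and the teacher reads the essay "
        "the student reads the book and the teacher writes the comment"
    )

    # Record, for every word, the words observed to follow it.
    words = training_text.split()
    followers = {}
    for current_word, next_word in zip(words, words[1:]):
        followers.setdefault(current_word, []).append(next_word)

    # Generate a new sentence by repeatedly picking a plausible next word.
    random.seed(1)
    word = "the"
    generated = [word]
    for _ in range(8):
        word = random.choice(followers.get(word, ["the"]))
        generated.append(word)

    print(" ".join(generated))
    # The result recombines learned patterns rather than copying a passage,
    # which is why text-matching tools rarely flag it.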

However, it can be considered unethical (according to the European Commission, among others) to use AI in a way that is not transparent. See the page Ethical guidelines. It is comparable to omitting parts of the method from the methods section of a paper or thesis.

Definition of plagiarism

At JU, what is considered an academic disciplinary offence is regulated in the policy document Regulations regarding Disciplinary Measures at Jönköping University (JU 2022/5173-113 § 90, Appendix 8). The document deals with disciplinary measures at JU and establishes rules to counteract cheating, promote good behaviour and actively combat harassment. It emphasises that the Jönköping University Foundation is responsible for establishing a disciplinary board and for laying down rules on disciplinary measures, including expulsion. The rules describe what constitutes a disciplinary offence and what disciplinary measures can be taken, such as warnings and suspensions.

The description of offences in the text (paragraph 11) does not explicitly mention plagiarism. The following offences are described:

(i) If a Student uses unauthorised aids or otherwise attempts to cheat in an examination or other assessment of the Student’s performance,

(ii) If a Student disrupts or impedes teaching, examinations or other activities within the framework of the courses and study programmes at JU,

(iii) If a Student disrupts activities at JU’s libraries or another JU facility,

(iv) If a Student subjects another Student, Participant, Studying person or employee at JU to harassment or sexual harassment, as defined in Chapter 1, Section 4 of the Discrimination Act (2008:567).

From the perspective described above, namely that it is unethical and contrary to good academic integrity to conceal or mislead by presenting an AI-generated text as if the student had produced it themselves without aids, using AI tools is thus a disciplinary offence if the assignment clearly states that they are not allowed. To learn more about how to create instructions that prevent cheating and make unauthorised use of AI difficult and detectable, see the suggestions and explanations on the pages Instructions that prevent and Preventing and detecting unauthorised use of AI.

Anti-plagiarism Guide

Furthermore, we can compare what AI tools can do (see the page Artificial Intelligence) with the description of plagiarism in the Anti-Plagiarism Guide, a resource for both students and teachers at JU. It then becomes clear that AI tools can plagiarise, but also that what they produce is most likely not to be considered plagiarism. The main reason is that AI tools cannot be regarded as authors of a text.

The Anti-Plagiarism Guide defines plagiarism as follows: "Plagiarism can include, for example, using another author's wording in a written assignment without acknowledging the source. It can also mean using someone else's computer code or experimental results as your own, or developing a product in a design project that is too similar to the original you were inspired by (Carroll & Zetterling, 2009). Submitting someone else's work as your own is a serious form of plagiarism."

The Anti-Plagiarism Guide is being developed to take greater account of unauthorised or unethical use of AI.