A wide range of tools use artificial intelligence (AI) to easily carry out tasks that students might otherwise do themselves as part of assessment and examination. Unless the use of such tools is authorised, there are measures we can take to make unauthorised use difficult and detectable.

In this context, it is important to note that there are currently few tools that can identify whether AI tools have been used to produce a text, find source material, debug code, create images or analyse data, for example. The tools that do exist (see page Types of AI in the education context) are rarely reliable without a great deal of knowledge and understanding of what tools have actually been used in the creation process. Some advice that recurs in similar guides from Swedish colleges and universities, such as being specific in assignment instructions or always requiring students to include correct source references, is not specific to navigating assessment and examination in an AI context. Rather, it describes prerequisites for any type of examination and assessment. Most such guides take a relatively narrow perspective that can be applied to take-home exams and writing assignments. This page reflects that, but also takes a broader perspective, applying ethical guidelines and the teaching of academic integrity and good academic practice to create a learning environment that prevents cheating and promotes learning.


It is up to the examiner to decide which rules apply. Clear instructions are an integral part of prevention and detection, but more extensive adaptations of pedagogical methods, assessment and examination may yield better results.

Different conditions and impact on examination tasks
In the text The potential impact of AI tools on assessment, Bommenel and Forsyth (2023) present a clear picture of the conditions that limit or enable the use of AI in assessment and examination. The text explains what AI writing tools are and how they work, emphasising their availability and integration into commonly used software. It also notes their limitations, including the risk of generating false information. Three main options for considering AI writing tools in assessment are presented: prohibiting their use, assuming their use and incorporating them into tasks, or fully integrating them into assessment. Advantages and disadvantages are given for each option. The text concludes by suggesting that educators should consider their course requirements and good assessment practices when deciding how to deal with AI tools. In many cases, it is the decision of the examiner and the individual teacher that determines how assessment and examination can be carried out with legal certainty.

Verbatim from Bommenel and Forsyth (2023)

1. To ban their use (like the calculator example)

The only way to do that reliably is to have the assessment in constrained conditions – a supervised examination room with no use of external technology, or using a computer which is locked down so only certain software can be used. This may be the right approach if you need students to reproduce factual knowledge under constrained conditions.

You might do this if these are the kinds of learning outcomes you have: The student will be able to: define, identify, reproduce, describe, sequence, combine, classify…



Advantages:
  • You can test factual knowledge
  • You know the answers are the students’ own work, or at least produced from their memory



Disadvantages:
  • You may have wanted the students to spend more time and thought on the task than is possible in an examination situation
  • In future work, students may need to use these kinds of technologies in more realistic conditions and they will not have experienced them.


2. To assume that they may be used, and set tasks which incorporate them

This means thinking about what skills students need to prompt the tools and how they will assess the results.

You might do this if these are the kinds of learning outcomes you have: The student will be able to: compare, explain, analyse, create…



Advantages:
  • You allow students to explore the power of these tools in relation to your subject area
  • You can work with students to explore academic honesty and the purpose of academic assessments.



Disadvantages:
  • You will have to rewrite the assignment tasks to focus on critique and analysis – this may require changing the course learning outcomes.
  • Not all students may currently have access, so you might have to set a choice of questions – one using these tools, one not using them – and ask students to say which they have picked.


3. To require them to be used – fully incorporate the tools in the assessment

This is not possible at the time of writing (March 2023): we have not yet studied the GDPR implications of the service, capacity on the free service is limited, and you could not guarantee that students would be able to create an account. But in the future these tools will be more generally available, so you could be preparing for how to do this.

You might do this if these are the kinds of learning outcomes you have: The student will be able to: evaluate, theorise, generalise, predict, hypothesise, imagine, reflect…



Advantages:
  • You ensure that students can explore the power of these tools in relation to your subject area
  • You can work with students to explore academic honesty and the purpose of academic assessments.



Disadvantages:
  • You will have to ensure everyone has access
  • You will have to rewrite the assignment tasks to focus on critique and analysis – this may require changing the course learning outcomes.

Suggestions from ChatGPT

Based on the other sources for this text, which were provided as input in the prompt, we instructed ChatGPT to generate a bulleted list of suggestions on "how teachers in higher education can make it difficult to cheat with AI tools, and easier to detect such cheating, in, for example, take-home exams, programming assignments, and similar assessments". The list generated is not entirely without pedagogical basis; for example, we can see traces of commonly used methods such as student-active learning, differentiation, and case methodology. It is interesting to note that several of the suggestions involve changes to the tasks themselves, and not just adaptations of the task instructions.

  1. Creating unique tasks: Instead of using generic or commonly available tasks, teachers can create unique and customised tasks for each student or group. This makes it harder for students to find predetermined answers or solutions online.
  2. Use multiple choice questions: Instead of using only open-ended answers, teachers can include multiple choice questions that require comprehension and analytical thinking. This reduces the possibility for students to find exact answers online and forces them to demonstrate their own knowledge and reasoning.
  3. Use auto-generated tasks: Teachers can use software to auto-generate unique tasks for each student or group. This makes it almost impossible for students to cheat by copying from each other or using predetermined solutions.
  4. Use practical applications: Design tasks that require practical applications of knowledge. By using scenarios or case studies, students cannot rely solely on finding answers online, but must demonstrate that they can apply their knowledge to real-life situations.
  5. Use anti-plagiarism software: Use plagiarism detection software and services that can identify the extent of copying and referencing in student work. Such software can help identify whether students have copied from the internet or from each other.
  6. Use retests and oral examinations: Supplement written assessments with retests or oral examinations. This allows teachers to ask follow-up questions and assess student understanding in a more direct way, reducing the risk of cheating with AI tools.
  7. Asking for the work process: Instead of just asking for the end result, teachers can ask students to explain their work process or reasoning behind a solution. This makes it harder for students to just copy or use AI tools without actually understanding the topic.
  8. Vary assessment methods: Use different types of assessments, such as projects, labs, group work, and written tests. Varying assessment methods makes it harder for students to cheat by using the same cheating method for different types of assignments.
  9. Create strong communication: Build strong communication with students and emphasise the importance of honesty and academic integrity.
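Suggestion 3 above, auto-generating unique tasks, is straightforward to implement in practice. The sketch below is a minimal illustration and not taken from any of the sources: it seeds a random number generator with a hypothetical student ID, so that each student receives an individual but reproducible variant of a numerical question. All names, parameters, and the example question are invented for illustration.

```python
import random

def generate_variant(student_id: str, base_seed: int = 2023) -> dict:
    """Generate a per-student task variant by seeding a RNG with the student ID.

    The same student always receives the same variant (reproducible for
    grading), while different students receive different parameters.
    """
    rng = random.Random(f"{base_seed}:{student_id}")
    principal = rng.randrange(10_000, 50_000, 1_000)   # loan amount in SEK
    rate = round(rng.uniform(2.0, 6.0), 1)             # annual interest, %
    years = rng.randint(3, 10)
    return {
        "student_id": student_id,
        "question": (
            f"A loan of {principal} SEK carries {rate}% annual interest, "
            f"compounded yearly. What is the amount owed after {years} years?"
        ),
        "expected_answer": round(principal * (1 + rate / 100) ** years, 2),
    }

# Example: each student gets their own numbers, reproducibly.
for sid in ["s-001", "s-002"]:
    print(generate_variant(sid)["question"])
```

Because the variant is derived deterministically from the student ID, the teacher can regenerate any student's exact question and expected answer at grading time without storing anything.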

Proposed measures to prevent and detect unauthorised use of AI

Summarised from Uppsala universitet & Cullhed, M. (2023), used with permission and gratitude

Instructions for take-home assignments

Changing the wording of instructions is a relatively small measure, which does not necessarily address underlying difficulties or problems with the examination or assessment.

Formulating the assignment

  • Require even shorter texts to include references. This helps to encourage students to use correct references and demonstrate their ability to substantiate their reasoning.
  • Formulate questions where answers are based on or applied to specific and local contexts that are not familiar to AI tools. This makes it more difficult for students to rely on AI-generated answers.

Review examination practices

  • Review which course objectives are examined through take-home assignments and consider using other forms of examination, such as classroom exams or oral examinations, when more appropriate.
  • Introduce more opportunities for oral examination. This can be time-consuming, but it can be an effective way to assess student knowledge and reduce the possibility of unauthorised use of AI.
  • Provide feedback on different versions of the assignment. Having discussions and peer feedback can make it more difficult for students to use AI tools to generate their answers.
  • Consider staged examination where oral feedback is given after the submission of written assignments. This provides an opportunity to discuss and explain answers with students.

Use AI tools to detect texts created by AI tools

  • Explore the possibility of using AI tools to detect if students have used AI to generate their answers. This may be a more advanced measure that requires resources and technical expertise, but it can be a way to combat unauthorised use of AI. Be aware that this can generate so-called 'false positives'.

It is important to remember that these measures are not comprehensive and that keeping up with the rapid development of AI is an ongoing challenge. It also requires collaboration between teachers, management and administration (e.g. the JU Examination Organisation, Tentamensorganisationen) to implement effective strategies and support colleagues in dealing with these issues.

Other suggested measures

Based on texts from Linnaeus University (2023), KTH & Naimi-Akbar, I. et al. (2023), University of Borås (2023) and Lund University (2023).

Clarify rules and guidelines

  • Teachers/examiners should clarify which uses of AI chatbots are allowed and not allowed during the examination.
  • Examples of permitted uses include using AI chatbots to improve texts, find errors or create an overview of subject areas. This requires the student to explain how the AI chatbot has been used.
  • Unauthorised uses include submitting an AI-generated text as their own, which is considered cheating.
  • In peer review/opposition, the student should make their own judgement and not use raw results from the AI chatbot.

Adapt the forms of examination

  • Avoid purely unsupervised take-home examinations, or supplement them with other forms of examination, e.g. oral examination or classroom writing.
  • Carry out supervised "open book" examinations.
  • Have more check-ins and step-by-step presentations for longer text production/project work.
  • Create context-based assignments that are specific to the course or have local conditions to complicate the use of AI chatbots.
  • Set a clear requirement that course literature and other sources are always cited with page references.

Suspicion of cheating

  • If there is a suspicion of unauthorised use of AI chatbots, the offence must be investigated and, if there is a well-founded suspicion of cheating, reported in accordance with the disciplinary process.
  • Tools to assess the likelihood of AI-generated texts can be used as part of the investigation, but need to be complemented by other material to provide sufficient evidence.

Promoting awareness and reflection

  • Teachers should include discussions about the benefits, problems, inaccuracies and biases of AI chatbots and the generated text in their teaching.
  • Students should be encouraged to critically examine the AI chatbot's responses and be made aware of the potential for inaccuracy and bias.
  • Comparisons can be made between the AI chatbot's answers and those of experts to reflect on different perspectives and values.
  • Emphasise that AI-generated material must be carefully reviewed to ensure that it captures the course content.


References

Bommenel, E., & Forsyth, R. (2023). The potential impact of AI tools on assessment. För lärande i en digital värld.

ChatGPT (OpenAI). (2023). Prompt with instructions to generate, based on the other sources presented on this webpage, a bulleted list presenting functional methods to reduce and detect unauthorised or unethical use of AI tools in examinations.

Högskolan i Borås. (2023, 18 April). AI och examinationer. Retrieved 18 June 2023.

KTH & Naimi-Akbar, I. et al. (2023). Att främja lärande och förebygga fusk. PriU-gruppen bedömnings- och examinationsmetoder. (PDF)

Linnéuniversitetet. (2023). Högskolepedagogiska funderingar om öppen AI – några förslag och rekommendationer. Nationell resurs om undervisning – kursutveckling.

Lunds universitet. (2023, 2 May). AI i undervisningen.

Stockholms universitet. (n.d.). Vägledning om användning av AI-drivna chattbotar vid utbildning och forskning. Medarbetarwebben.

Sunet. (2023). Forumet AI och högre utbildning. (account needed)

Sunet. (2023). Forumet SUNAI (Sveriges universitetsnätverk för akademisk integritet). (account needed)

Uppsala universitet & Cullhed, M. (2023, 21 February). Försvåra och upptäck otillåten användning av AI. Medarbetarportalen. Retrieved 18 June 2023.