Questions and answers about AI tools

These questions are being answered by members of Lund University's working group on AI tools, particularly Johanna Bergqvist Rydén, Elin Bommenel and Rachel Forsyth. Don't hesitate to send us your questions - we are all learning and there are no dumb questions. As you can see, we anonymise any questions before they are added to this page.

What is an AI tool?

AI tools are being talked about a lot. AI stands for Artificial Intelligence. This isn’t a very accurate term, but we will use it as it is in common use. The tools we are talking about are Large Language Models, or LLMs. They work by predicting which word is most likely to come next in a sentence. You will be used to this from your mobile phone, which suggests words for you. These models are huge and have been trained on massive banks of articles, books, internet pages, and so on, so they have many options to choose from. They are ‘helped’ by human trainers who have graded the coherence and accuracy of the predictions. The material that the system has ‘seen’ may be biased in different ways, and so may be the human trainers. In addition, the way the system works, by predicting the most likely words which should come next, means that entirely new concepts or books or websites may be invented. The outputs cannot be considered to be entirely factual or genuine. For subjects which are very specialist (like your research, probably), the system will probably not have seen much information, and you will get bland and possibly meaningless outputs.

Whilst you will see people talking about ‘what the computer said’ in colloquial language, always remember that these are outputs based on probability, just like what you get from using a search engine like Google or Bing. The computer is not thinking in the way that we do.
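The idea of ‘predicting the most likely next word’ can be illustrated with a toy sketch. This is not how a real LLM is built — real models use neural networks trained on billions of documents — but a minimal, purely illustrative bigram model that simply counts which word tends to follow which in a tiny made-up corpus:

```python
# Toy sketch of next-word prediction, the core idea behind LLMs.
# Real models use neural networks over huge corpora; this example
# just counts word pairs ("bigrams") in a tiny invented text.
from collections import Counter, defaultdict

corpus = ("the cat sat on the mat . "
          "the dog sat on the rug . "
          "the cat chased the dog .").split()

# For each word, count which words follow it.
following = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    following[current][nxt] += 1

def predict(word):
    """Return the most frequently observed word after `word`."""
    return following[word].most_common(1)[0][0]

print(predict("sat"))  # "on" — the only word seen after "sat" here
```

Note that the model has no notion of truth: it will happily chain together likely-looking words into sentences describing things that never happened, which is exactly why LLM outputs can include invented facts and citations.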

How much do teachers need to understand ChatGPT's functionality and capability?

It is a good idea to find out the basics (see our films on education.lu.se) so that you can decide how to work with these tools in your teaching and discuss them with your students. Work with your students to find out more, if they are curious.

Does ChatGPT provide sources to support the information in the output text if asked to do so?

GPT-3 does not, but other tools do – e.g. Perplexity and the new Bing AI will do this.

Screen shot from perplexity.ai, showing that it adds sources.

What is going to change in education as these AI tools become more common, and integrated into software we use everyday such as internet searches and Office products?

There is no doubt that AI tools are going to become the main way we locate and summarise information. In the internet era, we have become used to search engines such as Google or Bing to look things up, and sites such as Wikipedia in which humans bring together information into neat summaries. We’ve also developed ways to verify the provenance of such information and summaries. The AI tools will bring these things together to provide up-to-date information. Like their predecessors, this information may not be accurate, and it may be biased. So what will change is that we will want to use them more, and we will need to identify and apply ways to critique and verify the outputs.

If I have plenty of written material from my research, can I ask ChatGPT to write a first draft of a final report or a book if I provide a structure for it?

This is certainly what people will be doing as AI tools become more embedded in the software we use every day, such as search engines and Office software like Word or PowerPoint. Focus on what is important here: is it that you wrote all the words, or that you collected and analysed the data? There will need to be many discussions about this in disciplinary meetings, and with publishers and research funders.

Is it possible to cite an AI tool as a source?

Yes. As usual, librarians are already sorting this out for us – for instance, the University of Queensland already has several citation formats for AI sources.

Is it possible to cite an AI tool as a co-author?

Probably not right now, although definitions of ‘author’ may change over time. AI tools may be considered a source. If large parts of the work include AI-generated text, then this should be stated, and you can let readers decide whether you should be claiming authorship.

How can we access the creative thinking of the student if the student uses AI tools to prepare their assignments?

Creative thinking is a key aspect of a university education. Students have always been able to prepare assignments in ways we may not have intended or wanted. The development of AI tools means that alternative approaches are now more readily available and cheaper. Whilst there are no simple answers to this question, some techniques to develop are:

  1. Explaining to students the value of their own work and its importance to you. The teacher-student relationship is a key aspect of effective learning (Felten & Lambert, 2020) and this is a chance to talk about why you want to see what they can do, not just that they can pass an assignment.  
  2. Seeing students’ work in progress, building their final assignments through abstracts, outlines, and drafts. This may seem onerous but you can think about encouraging them to use AI tools to access basic content you might have taught them, and use the time freed up to facilitate their creative development.   

Both of these approaches need thought, time, and development. Start with small changes, and contact the pedagogical unit your faculty works with to see what workshops and courses are available to support you.  

Felten, P., & Lambert, L. M. (2020). Relationship-rich education: How human connections drive success in college. JHU Press.

What opportunities do AI tools such as ChatGPT offer teachers and students?

We all have a lot of thinking to do about how these kinds of tools will affect our work, and the responses will be different in different disciplines. Some uses of these tools are likely to include: 

  • Generation of ideas 
  • Improvement of text  
  • Automation of tasks – e.g. suggesting the text of routine emails, some kinds of grading and feedback, checking attendance and engagement. 
  • Analysing data   
  • Answering questions 24 hours a day (even if not always accurately) 

We have to discuss these types of tools in programme teams and with colleagues in other universities to come to some consensus about their values and limitations, and where we draw the line in their use.

How can we best integrate this new technology into students' learning?

We can incorporate these tools in lots of ways. Here are some bullet points, and the pages on education.lu.se develop some of these ideas further.  

  1. Digital literacy: Ask students to use the tools to answer a question, and then to critique the outputs in small groups. Ask them to identify key points they are looking for to verify the outputs.  
  2. Idea generation: Ask students to use AI tools to brainstorm some ideas for thinking about a difficult question or topic, then discuss the appropriateness and feasibility of these ideas.  
  3. Structuring: Ask students to use AI tools to generate the outline of a written assignment such as an essay, then ask them to identify what further information they need to start writing. In class, get them to write some paragraphs (see the ‘Shut up and write’ technique). 
  4. Starting points: Ask students to use AI tools to generate the outline of a literature review, then ask them to identify one reference for each paragraph in the introduction and justify its inclusion in their own words.  

For all of these examples, you can add peer review of drafts, justifications and commentaries. It doesn’t need to add work.

Can I make it compulsory for my students to use ChatGPT?

No. The University working group is looking at the legal issues around these tools, such as licences, use of copyright material, and GDPR. If students wish to use these tools and you are happy for them to do so, then you should consider whether you need to amend the learning outcomes or marking criteria, the skills students will need to prompt (ask questions of) the tools and evaluate the output, and whether you need a declaration that they have used the tools (e.g. “If you use AI tools to generate ideas, then cite them clearly, include the transcript as an appendix, and explain how you have used the information they provide, and its value”).

Is there any way for me to recognise the output of a tool such as ChatGPT? 

No. Some people have observed that the output can be repetitive and bland, but so can work written directly by humans. There is no reliable way to recognise the output, and the speed of development is such that even if we gave tips now, they would be out of date within weeks, if not days.

What is the quality of the output of tools such as ChatGPT?

The output from these programs is grammatically correct and often apparently coherent. However, it is important to remember that they are simply putting text together based on likely combinations of phrases and there is no fact checking. The combinations of likely phrases can extend to the creation of citations and webpages which do not exist.

Can we reliably detect whether a student has used an AI tool in an assignment?

There is software which claims to detect the use of various AI tools. They are being tested both here and by other universities. However, we cannot recommend their use right now, as they have not been verified for GDPR compliance. You should not upload student work to any systems which have not been approved by the Legal department.

Is it a disciplinary offence for a student to submit an assignment which includes sections generated by AI tools?

It is a disciplinary offence if: 

  1. The use of the tool was considered to be an unauthorised aid, and banned in that course – this would need to be in the course plan or clearly mentioned in the assignment instructions; or  
  2. The student pretends that they did this work themselves.  

Check the university regulations and the link to the Higher Education Ordinance for more details, but remember that it is hard, if not impossible, to detect the use of these tools, and that will get harder.


Do you have questions about ChatGPT and AI in teaching? Email to Rachel Forsyth: rachel [dot] forsyth [at] rektor [dot] lu [dot] se