Using Generative AI Tools in Academic Work: Home

This guide is aimed at students who wish to know how generative AI tools may be used to help with academic work. It currently covers using AI to help with literature searching and summarising. It also covers how to use AI ethically and responsibly.

What is generative AI?

Generative AI tools are programs that can generate new content like text, images, audio, and video. They do this in an automated way based on given inputs and parameters.

There are many different kinds of AI tool, with multiple uses. For example, they can:

  Generate human-like text and conduct conversational dialogues (e.g. ChatGPT, Copilot, Claude and Gemini)

  Assist researchers by finding and summarising research papers (e.g. Elicit, Semantic Scholar etc.)

  Create images from text descriptions (e.g. DALL-E and Stable Diffusion)

  Generate human-like voices and convert text into natural-sounding speech, or transcribe speech into text (e.g. Whisper for speech-to-text)

  Generate new music compositions and songs based on different genres, instruments, etc. (e.g. MuseNet and Amper Music)

  Create deepfakes by swapping faces in images and video or generating fabricated video/audio that resembles real people

Advantages of using AI in academic work

  • AI tools can generate novel, high-quality content quickly. This means they can help you develop new ideas and approaches, for example by producing textual mind maps, plans, definitions, and explanations.
  • They can create personalised recommendations and generate content tailored to your specific task.
  • They can assist with processing and interpreting information. They can both produce information based on your prompts and support your interpretation of information by generating summaries and critiques.
  • They can automate tasks that require creativity. For example, tools such as DALL-E can quickly generate images that would be likely to take a human much longer to create with a graphics editor.

Next steps

Your next step is to learn how AI tools may help to support your work.

  1. Using AI tools to support your work
  2. Literature searching with AI
  3. Critical analysis with AI
  4. Using AI responsibly
  5. Using AI ethically
  6. Suggested resources
  7. QMU rules and regulations around the use of generative AI

Want to learn more about generative AI for your studies?

Part seven of our self-enrol Study Skills Canvas module focuses on generative AI. There is guidance on different kinds of generative AI tools, information about things you should think about before using generative AI, and information about how you can learn more about using generative AI tools.

Enrol on the module here.

Disadvantages of using AI in academic work

  • Results can be nonsensical or inappropriate. AI models can confidently present incorrect or false information as factual (also known as hallucinations).
  • Potential bias in training data. The presence of unfair or unrepresentative information in the dataset used to train an AI model can result in unfair outcomes when using the AI tool.
  • May not be up to date. AI tools may not have access to real-time information, and their underlying models may not have been updated for some time.
  • Ethical issues. Many ethical issues have been raised about the potential for AI tools to reproduce the biases in their training data, to be exploited to spread misinformation or disinformation, to be used to create deceptive content that threatens privacy, to endanger intellectual property rights, and to displace jobs in industries that rely on content generation.

  • Environmental costs. Generative AI systems consume substantial amounts of energy, require large amounts of water to cool their processors and to generate the electricity they use, and lead to significant carbon emissions.

Risks of over-reliance on generative AI

While fully AI-generated outputs can seem impressive on the surface, they often contain factual errors and lack nuance, critical engagement, and depth of expression and understanding.

Importantly, overreliance on AI tools simply to generate written content, software code or analysis reduces your opportunity to develop and practise key skills (e.g. writing, critical thinking, evaluation, analysis, coding, reasoning). These are all important aspects of your learning at university and will continue to be required in your working life.

Written work is a key way of demonstrating critical thinking and deep engagement with your course material, much of which happens during the process of writing. Relying on AI-generated output will prevent you from developing the skills you would otherwise acquire by doing the work yourself. A vital aspect of your learning at university is developing these advanced skills: learning how to think and build an argument through writing. Generative AI is no substitute for this.

While generative AI can be useful for some tasks, it is essential that you are aware of its many limitations, which include the following:

  • Generative AI tools are language machines rather than databases of knowledge – they work by predicting the next plausible word, image, or snippet of programming code from patterns that have been ‘learnt’ from large data sets (a simplified sketch of this idea follows this list).
  • They have no understanding of what they generate.
  • The datasets that such tools learn from are flawed and contain inaccuracies, biases, and limitations.
  • They generate text that is not always factually correct. A knowledgeable human must check the output.
  • Generative AI can create software/code that has security flaws and bugs. Code or calculations produced by AI often look plausible but, on closer inspection, contain errors in the detailed working (a hypothetical example also follows this list). A human trained in the relevant programming language should fully check any code or calculation produced in this way.
  • The data generative AI models are trained on are not necessarily up-to-date – they may have limited or constrained data on the world after a certain point.
  • They can occasionally produce fake citations and references.
  • Such systems are amoral – they do not know that it is wrong to generate offensive, inaccurate or misleading content, and sometimes do so.
  • Their output can include hidden plagiarism – they make use of words and ideas from human authors without referencing them.
  • Generative AI may make use of unlawfully sourced libraries and material, and material generated by AI may infringe copyright or intellectual property rights.
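
To make the first of these points more concrete, here is a deliberately simplified sketch, in Python, of the ‘predict the next plausible word’ idea. It is a toy word-frequency model rather than how real generative AI tools are built (they use large neural networks trained on vast data sets), and the training text and function names are invented purely for illustration.

# A toy next-word predictor, for illustration only. Real generative AI tools
# use large neural networks rather than simple word counts, but the basic idea
# is similar: pick a plausible next word from patterns in the training data.
import random
from collections import defaultdict

# Invented training text, standing in for the huge data sets real tools use.
training_text = (
    "the library supports students the library supports research "
    "the library provides access to journals and the library provides training"
)

# Record which words follow which word in the training text.
followers = defaultdict(list)
words = training_text.split()
for current_word, next_word in zip(words, words[1:]):
    followers[current_word].append(next_word)

def generate(start_word, length=8):
    """Generate text by repeatedly choosing a plausible next word."""
    output = [start_word]
    for _ in range(length):
        candidates = followers.get(output[-1])
        if not candidates:
            break  # no pattern learnt for this word, so stop
        output.append(random.choice(candidates))
    return " ".join(output)

print(generate("the"))

The output reads as locally fluent English, but the program has no idea whether what it produces is true or sensible – which is one reason real tools can ‘hallucinate’ plausible-sounding but incorrect statements.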
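
To illustrate the point about plausible-looking but flawed code, here is a hypothetical Python example (the users table, the malicious input, and the function names are all invented). The first function is the kind of snippet an AI tool might plausibly produce: it runs, but it builds an SQL query by pasting user input directly into the query string, which allows SQL injection. The second function shows the parameterised query a trained human reviewer should insist on.

import sqlite3

def find_user_unsafe(conn, username):
    # Looks plausible, but inserting user input directly into the SQL string
    # allows SQL injection.
    query = f"SELECT id, username FROM users WHERE username = '{username}'"
    return conn.execute(query).fetchall()

def find_user_safe(conn, username):
    # Pass the value as a bound parameter so the database never treats it
    # as part of the SQL itself.
    query = "SELECT id, username FROM users WHERE username = ?"
    return conn.execute(query, (username,)).fetchall()

# Small demonstration with an in-memory database and a malicious input.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, username TEXT)")
conn.execute("INSERT INTO users (username) VALUES ('alice')")
malicious = "' OR '1'='1"
print(find_user_unsafe(conn, malicious))  # returns every row: [(1, 'alice')]
print(find_user_safe(conn, malicious))    # returns no rows: []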

Acknowledgement

This guide is a modified version of the University of Edinburgh LibGuide 'Using generative AI tools in academic work' created by Ishbel Leggat, Anna Richards, and Robert O’Brien.