
Middlesex Campus Library

ChatGPT and Generative AI: A Student's Guide

This guide provides students with important information about Generative AI.

Why Is Fact-Checking Always Needed?

An AI hallucination occurs when an AI model generates incorrect or misleading information but presents it as fact. Hallucinations cannot be reliably prevented, so AI output always needs to be verified.

  • A common AI hallucination in higher education occurs when users prompt ChatGPT or Gemini to cite references. These tools draw on existing data about a topic and generate plausible-looking titles, authors, and content that do not actually exist. For example, you could never find the article below from JAMA Pediatrics - ChatGPT made it up.

[Image: example of a fabricated citation generated by ChatGPT]

  • GenAI can also produce biased output. For example, the prompt "supervisor talking to staff members" was entered into Canva's AI image generator. Below are the first three images the AI generated; without exception, every supervisor is male.

[Image: AI bias example - three generated images, all depicting male supervisors]