
Middlesex ChatGPT and Generative AI: A Faculty Guide

Hallucination

An AI hallucination occurs when an AI model generates incorrect or misleading information but presents it as fact. Hallucinations cannot be entirely prevented. A common example in higher education occurs when users prompt ChatGPT or Gemini to cite references: the tools draw on existing data about the topic and invent titles, authors, and content that do not actually exist.

Bias

Large language models (LLMs) can reproduce gender bias and racial stereotypes. For example, women have been described as working in domestic roles far more often than men. Image-generation tools have ignored or distorted artists' text prompts in ways that stereotype or censor Black history and culture. AI detectors are also more likely to flag the work of authors whose first language is not English.

Privacy

When you enter content or prompts into AI tools, those tools may ingest, store, and use them to further train their large language models. Information you submit may be shared with others in some manner. Therefore, you should

  • NEVER enter protected student data into AI tools; doing so could result in FERPA violations.
  • NOT enter student work directly into a prompt.
  • avoid including personal, sensitive, or confidential information.
  • read through any user agreements before you sign up to use a particular tool.
  • check the tool's data privacy policy regularly to see how data is collected, used, and stored.

Copyright

Generative AI tools can be used to infringe on a copyright owner's exclusive rights, for example by producing derivative works. A number of copyright infringement lawsuits have been filed against AI platforms. Before entering any copyrighted material into a generative AI tool as part of a prompt, you need permission from the rights holder. Further, uploading material such as articles obtained from library-subscribed databases to AI tools may violate copyright. Currently, copyright protection is not granted to AI-generated works.