
Using Artificial Intelligence

This guide is a starting point for creating pages about AI for students, staff, faculty, etc.

Overview

What is ChatGPT?

ChatGPT (Chat Generative Pre-trained Transformer) is a language model developed by OpenAI that uses machine learning algorithms to generate human-like text in response to a given prompt or question. It's trained on vast amounts of data from the internet and books, allowing it to mimic human language patterns and generate coherent text. Currently, it is one of the most well-known generative AI tools, but there are many others like it.

How does generative AI, like ChatGPT, work?

Generative AI works by using a deep neural network architecture called a transformer to analyze a given prompt or question and generate a response based on the patterns it has learned from its training data. It uses a process called autoregression, meaning it generates one word at a time based on the previous words in the sentence.
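
To make the idea of autoregression more concrete, here is a minimal, purely illustrative Python sketch. It is not how ChatGPT actually works (real systems use transformer neural networks trained on enormous datasets); it only shows the generate-one-word-at-a-time loop, using a made-up snippet of "training" text and simple word-pair counts in place of a real model.

  import random

  # Toy illustration of autoregression: the "model" is just counts of which
  # word tends to follow which word in a tiny, made-up training text.
  training_text = (
      "generative ai generates text one word at a time . "
      "generative ai learns patterns from training data . "
      "ai generates one word based on the previous words ."
  )

  bigram_counts = {}
  words = training_text.split()
  for current_word, next_word in zip(words, words[1:]):
      bigram_counts.setdefault(current_word, []).append(next_word)

  def generate(prompt_word, max_words=10):
      """Autoregressively extend a prompt: each new word is chosen using only
      the words generated so far (here, just the most recent word)."""
      output = [prompt_word]
      for _ in range(max_words):
          candidates = bigram_counts.get(output[-1])
          if not candidates:  # no learned continuation, so stop
              break
          output.append(random.choice(candidates))
      return " ".join(output)

  print(generate("generative"))
  # Possible output: "generative ai generates one word based on the previous words ."

Notice that the sketch never checks whether its output is true; it only continues the most statistically familiar pattern, which is also why the limitations below matter.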

Limitations of Generative AI

While generative AI has a number of uses, it's important to use these tools ethically and to be aware of their limitations. The following highlights the primary limitations of generative AI tools like ChatGPT:

  • Generative AI can only generate text based on the data it has been trained on, which means it may not have answers to every question or be able to generate content on every topic. 
  • Generative AI does not have the ability to fact-check or verify the accuracy of the information it generates, so it may generate false or misleading information.
  • Generative AI is biased. It is trained on a large dataset of text from the internet and books, but this data is not comprehensive or unbiased. This can result in AI tools generating inaccurate or biased responses. For example, if the training data contains gender stereotypes, AI may produce sexist or discriminatory responses.
  • Generative AI tools are not well suited to answering complex or nuanced questions that require critical thinking or human expertise. Because they simply replicate patterns, they may not truly understand complex topics. For example, ChatGPT may not be able to provide a nuanced analysis of a literary work or a complex scientific concept.
  • Generative AI is NOT recommended for generating citations or bibliographies on a topic. Because tools like ChatGPT simply replicate patterns and phrases, they will often generate citations that appear real but are, in fact, fake.
  • Generative AI cannot provide emotional intelligence or empathy. It cannot fully understand or respond to emotional or interpersonal nuances in communication, and its responses may be tone-deaf or inappropriate in certain situations.


Conclusion

Generative AI is not a replacement for human interaction or expertise. While tools like ChatGPT can assist in research and learning, they cannot replace the act of learning or scholarly participation. They should not be used to generate ideas or to answer questions on which the user has no prior knowledge or background. It's important to recognize the limitations of generative AI and use it as a supplement to your own research and learning, rather than relying solely on its generated content. Mimicking the output of good scholarly work is not the same as achieving it.

For more information on how to use generative AI tools to supplement your learning, please see:

Remember, AI does not know truth from fiction, and it often generates false or misleading information. It is important that any use of generative AI is done critically and that users always double-check the information it generates.