
"Fake News" & Misinformation

Learn how to evaluate news sources and identify unreliable sources, misinformation, and conspiracy theories

ChatGPT

Welcome to ChatGPT

This awesome little chatbot is here to help, right? It can write papers, turn out flowery prose, and do some very creative stuff. Unfortunately, it could all be fake! That is hard to believe considering ChatGPT can pass law exams and numerous other tests, and GPT-4 appears even more promising in its ability to complete standardized exams. GPT stands for generative pre-trained transformer, a type of large language model. Basically, it has been trained on enormous amounts of human-written text, which lets it produce a plausible flow of words on almost any topic.

Be mindful that what ChatGPT says might be fake.

Yes, it's excellent for poetry and imaginative stories, but not for your assignment. Steven A. Schwartz, a New York lawyer with thirty years of experience, used ChatGPT to write a legal brief that he submitted to the court. He even asked ChatGPT whether the citations were real. ChatGPT lied to him directly, and he believed that lie. He submitted the brief, complete with quotes and citations supporting his case, and may now face sanctions at a hearing on June 8, 2023. "US District Judge Kevin Castel confirmed the development, adding that 'six of the submitted cases appear to be bogus judicial decisions with bogus quotes and bogus internal citations'" (Sarkar, 2023).

A few months ago, a UTampa faculty member doing research asked me to verify sources that ChatGPT had presented. I also found these sources to be fake. I discovered this by backtracking each journal to the year and article cited. In some cases, ChatGPT gave links to articles and DOIs that led nowhere, or to the wrong article. When I confirmed that the sources were indeed fake, I asked ChatGPT and it said they were real sources! When I pressed that they were fake, it apologized and said it was only as accurate as its source data. I do not think this is a source data problem, though, but creativity with language: the quotes and citations are invented as part of the very text ChatGPT is generating. ChatGPT knows how to make things sound very convincing, but it appears to have no qualms about making up quotes and citations.

Verify all quotes and citations.

You can verify by Googling the article or by using the library's journal search and other tools. If you cannot verify something ChatGPT presents, it is best to assume it's fake. ChatGPT may offer great imaginative story ideas, poetry, and plot starts, but be cautious about the facts it presents. Large language models know how to mimic human speech, and they will just as readily mimic facts, producing statements, quotes, and citations that only sound real. If you want to read more about when you should and should not use ChatGPT, see Aaron Welborn's article from Duke University Libraries (Welborn, 2023).
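If a ChatGPT-supplied citation includes a DOI, one quick check is to look that DOI up in a public registry. Below is a minimal sketch in Python, assuming the requests library and the public Crossref REST API (api.crossref.org); the function name and the example DOIs are ours, not part of any ChatGPT output. A DOI that Crossref does not recognize is not proof of fabrication (some publishers register DOIs elsewhere), but it is a strong signal that the citation deserves a closer look.

```python
import requests

def check_doi(doi: str) -> None:
    """Look up a DOI in the public Crossref registry and print what it points to."""
    url = f"https://api.crossref.org/works/{doi}"
    resp = requests.get(url, timeout=10)
    if resp.status_code == 404:
        # Crossref has no record of this DOI -- treat the citation with suspicion.
        print(f"{doi}: not found in Crossref")
        return
    resp.raise_for_status()
    work = resp.json()["message"]
    title = (work.get("title") or ["<no title>"])[0]
    journal = (work.get("container-title") or ["<no journal>"])[0]
    # Compare this against the article title and journal ChatGPT gave you.
    print(f"{doi}: '{title}' in {journal}")

# Hypothetical usage: one real DOI and one made-up DOI for contrast.
check_doi("10.1038/nature14539")     # resolves to a real Nature article
check_doi("10.9999/fake.doi.12345")  # should come back "not found"
```

The same idea works by hand: paste the DOI after https://doi.org/ in your browser, or search the article title in the library's journal finder, and see whether anything real comes back that matches the citation.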

 

Maruf, R. (2023, May 28). Lawyer apologizes for fake court citations from ChatGPT. CNN. https://www.cnn.com/2023/05/27/business/chat-gpt-avianca-mata-lawyers/index.html

Sarkar, S. (2023, May 29). A New York lawyer is in trouble for using ChatGPT to write legal brief. The Tech Portal. https://thetechportal.com/2023/05/29/new-york-lawyer-chatgt-legal-brief/

Welborn, A. (2023, March 9). ChatGPT and fake citations. Duke University Libraries. https://blogs.library.duke.edu/blog/2023/03/09/chatgpt-and-fake-citations/
 
