
A Guide to... AI in Scholarship: Using AI for Research

Can AI be used to help with coursework?

Research and AI

Artificial intelligence can be a useful tool in the research process, but like any other discovery tool, it should be approached with a critical lens and used as an aid rather than a source in itself.

Some things that AI can be useful for in your schoolwork:

  • Brainstorming research topics and search keywords
  • Creating practice tests or quizzing you on class content
  • Generating simplified restatements of tricky sentences or paragraphs in class readings
  • Editing your work for grammar, spelling, and syntax

Some things that AI is NOT capable of doing reliably:

  • Writing complex arguments
  • Finding academic resources 
  • Correcting you when you submit wrong information

AI Hallucinations

False or Misleading Information from AI

The biggest reason that AI should not be treated as a catch-all resource is that it will often fabricate information when given a prompt it doesn't know how to answer. This phenomenon is called hallucination, and it can include:

  • Making up citation information for articles that do not exist. For example, if you ask ChatGPT for academic journal articles on a topic, instead of admitting that it does not have access to them, it will give you a list of citations that are often partly or entirely false. It may use the title of a real article but attribute it to the wrong journal, or even to a journal that doesn't exist. It might also name a real author but falsify the article title.
  • If given incorrect information in a prompt, ChatGPT will not correct you or point out that your information is inaccurate. It will simply treat what you have given it as true and fabricate details in order to give you the answer you want.
  • Incorrect or completely fabricated quotations. ChatGPT will often invent quotations to suit whatever context you ask for, or supply a fake quote because of an error or inconsistency in its training data. This is especially true of chatbots connected to the internet, such as the paid version of ChatGPT: it may pull a JFK quote from someone's blog without knowing that the blog is inaccurate or satirical.

See the box at the bottom of the page for concrete examples of hallucination.

AI Tools to Generate...

Below are generative AI tools you can use to create textual content, whether that's drafting an email or rephrasing a tricky paragraph from a class reading in simpler terms. This kind of AI is great for brainstorming, studying, and simplifying complex concepts, but it is not reliable for generating finished written work, producing citations, supplying reliable sources, or answering general reference questions.

Tools for paraphrasing: 

When using AI image, audio, or video generators, please keep in mind:

  • AI-generated media are not a replacement or substitute for creative artworks or artists. Just as with text generation, any image, video, or audio you make with AI is built on the creative work of someone else.
  • Use caution when generating content that uses anyone's likeness or voice. Be respectful.

AI & Information Literacy

Examples of Hallucination

Examples of Hallucination from ChatGPT

Below are some examples of AI hallucination I solicited from ChatGPT. In some, I fed ChatGPT false information to begin with; in others, ChatGPT supplied false information on its own. Below each screenshot is an explanation of the hallucination and tips for avoiding it in your own AI use.

Screen Shot 1

A screenshot of an interaction between the librarian and ChatGPT. Starts with the Librarian saying: write a summary of the chapter in Jane Eyre where Jane and Bertha Mason have tea. ChatGPT Responds: In the chapter of Charlotte Brontë's "Jane Eyre" where Jane and Bertha Mason have tea, the atmosphere is tense and eerie. Jane, the protagonist, is staying at Thornfield Hall, employed as a governess, where she encounters Mr. Rochester's mysterious wife, Bertha Mason, who is kept hidden on the third floor due to her mental instability.  During the tea encounter, Bertha's presence unnerves Jane, who senses an unsettling energy emanating from her. Bertha's behavior is erratic and wild, contrasting sharply with Jane's composed demeanor. Despite Bertha's attempts to disrupt the tea gathering, Jane maintains her composure, although she is deeply disturbed by Bertha's presence and the enigmatic circumstances surrounding her.  The scene serves to deepen the mystery surrounding Bertha Mason and her relationship with Mr. Rochester, while also highlighting Jane's resilience and ability to navigate uncomfortable situations. It foreshadows the revelation of Thornfield Hall's dark secrets and Jane's eventual confrontation with her own desires and moral convictions.

Screen Shot 1: I supplied ChatGPT with false information from the start: in the novel Jane Eyre by Charlotte Brontë, there is no scene where Jane and Bertha Mason have tea together. Instead of correcting me, the AI supplied a summary of the made-up scene. When asked to provide quotations from the scene, ChatGPT invented them by mimicking the style of Charlotte Brontë's writing.

 

Screen Shot 2

Screenshot of an online interaction between the librarian and chatGPT. The librarian said: provide some academic sources on bertha mason in jane eyre. ChatGPT Responded: Here are some academic sources that discuss Bertha Mason in Charlotte Brontë's "Jane Eyre":  Gilbert, Sandra M., and Susan Gubar. "The Madwoman in the Attic: The Woman Writer and the Nineteenth-Century Literary Imagination." Yale University Press, 1984.  This seminal work explores the portrayal of women in nineteenth-century literature, including the representation of Bertha Mason as the "madwoman in the attic." It offers insightful analysis on the significance of Bertha's character and her role in "Jane Eyre." Morrison, Toni. "The Madwoman in the Attic." New Literary History, vol. 14, no. 3, 1983, pp. 503-517.  This article discusses the characterization of Bertha Mason and its implications in "Jane Eyre." It explores themes of race, gender, and colonialism in Brontë's portrayal of Bertha as the Other. Fraser, Hilary. "Bertha Mason Revisited." Victorian Literature and Culture, vol. 19, no. 2, 1991, pp. 205-220.  Fraser's article provides a critical examination of Bertha Mason's character, focusing on her representation as a colonial Other and the implications of her madness within the context of imperialism and gender in Victorian society. Gilbert, Sandra M. "Inappropriate/d Others: The Spectacle of the Madwoman in the Literature of the Americas." Comparative Literature Studies, vol. 30, no. 3, 1993, pp. 204-219.  This article examines the portrayal of madwomen in literature, including Bertha Mason in "Jane Eyre," within the broader context of colonialism and cultural imperialism in the Americas.

 

Screen Shot 2: When asked for specific academic resources, ChatGPT will often give made-up results, in part because it does not have access to resources behind paywalls. The first result it gives, a book titled "The Madwoman in the Attic: The Woman Writer and the Nineteenth-Century Literary Imagination," is a real resource. ChatGPT has access to book titles and descriptions because that information is freely available on publisher sites and online bookselling platforms. Academic journal articles, however, often sit behind library memberships and other paywalls, so it will often falsify that information. Results 2, 3, and 4 of this list are all made-up article titles attributed to real scholars and real academic journals.

For example, result 2 is an article titled "The Madwoman in the Attic" by Toni Morrison, supposedly published in the journal New Literary History. Toni Morrison is a real author and scholar, but she never wrote an article by that title. New Literary History is a real academic journal, but the volume ChatGPT cites contains no article by that title or author; in fact, it is devoted entirely to essays on Renaissance literature. The other articles are false in similar ways: the journals exist, and the authors are real scholars writing on literary topics, but the article titles do not exist.
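One practical defense against fabricated citations is to check each AI-supplied title against a bibliographic database before trusting it. The sketch below assumes the public Crossref REST API (api.crossref.org), which indexes real journal articles and books; the function names and the similarity threshold are illustrative choices, not part of any standard tool. The fuzzy title comparison is shown offline so you can see how a fabricated title fails to match real records.

```python
# Sketch: checking whether an AI-supplied citation actually exists.
# Assumes the public Crossref REST API (api.crossref.org); helper names
# and the 0.85 similarity threshold are illustrative, not standard.
import json
import urllib.parse
import urllib.request
from difflib import SequenceMatcher


def titles_match(claimed: str, found: str, threshold: float = 0.85) -> bool:
    """Fuzzy comparison so punctuation or casing differences don't hide a real match."""
    return SequenceMatcher(None, claimed.lower(), found.lower()).ratio() >= threshold


def crossref_titles(claimed_title: str, rows: int = 5) -> list[str]:
    """Ask Crossref for bibliographic records resembling the claimed title (needs network)."""
    url = ("https://api.crossref.org/works?rows=%d&query.bibliographic=%s"
           % (rows, urllib.parse.quote(claimed_title)))
    with urllib.request.urlopen(url, timeout=10) as resp:
        items = json.load(resp)["message"]["items"]
    return [item["title"][0] for item in items if item.get("title")]


# Offline demonstration of the matching step: a fabricated article title
# does not fuzzily match the real book it was loosely modeled on.
real_titles = [
    "The Madwoman in the Attic: The Woman Writer and the "
    "Nineteenth-Century Literary Imagination",
]
fabricated = "Bertha Mason Revisited"
print(any(titles_match(fabricated, t) for t in real_titles))  # prints False
```

In practice you would call `crossref_titles()` with the claimed title and run `titles_match()` against each result; if nothing matches, treat the citation as unverified and look it up through your library's catalog instead.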