Artificial intelligence can be a useful tool in the research process, but like any other discovery tool, it should be approached with a critical lens and used as an aid rather than a direct source.
Some things that AI can be useful for in your schoolwork:
Some things that AI is NOT capable of:
The biggest reason that AI should not be treated as a catch-all resource is that AI will often make up information when given a prompt it doesn't know how to answer. This phenomenon is called hallucination, and it can include:
Below are generative AI tools for working with text, whether that is drafting an email or rephrasing a tricky paragraph from a class reading in simpler terms. This kind of AI is great for brainstorming, studying, and simplifying complex concepts, but it is not reliable for producing finished written work, creating citations, providing reliable sources, or answering general reference questions.
Tools for paraphrasing:
When using AI image generators, please keep in mind:
When using AI audio generators, please keep in mind:
When using AI video generators, please keep in mind:
Below are some examples of AI hallucination I solicited from ChatGPT. In some examples, I fed ChatGPT false information to begin with; in others, ChatGPT supplied false information on its own. Below each screenshot is an explanation of the hallucination and tips for avoiding it in your own AI use.
Screen Shot 1
Screen Shot 1: I supplied ChatGPT with false information from the start: in the novel Jane Eyre by Charlotte Brontë, there is no scene where Jane and Bertha Mason have tea together. Instead of correcting me, the AI supplied a summary of the made-up scene. When asked to provide quotations from the scene, ChatGPT invented them by mimicking the style of Charlotte Brontë's writing.
Screen Shot 2
Screen Shot 2: When asked for specific academic resources, ChatGPT will often give made-up results, in part because it does not have access to resources behind paywalls. The first result it gives, the book "The Madwoman in the Attic: The Woman Writer and the Nineteenth-Century Literary Imagination," is a real resource; book titles and descriptions are freely available on publisher sites and online bookselling platforms. Academic journal articles, however, are often locked behind library memberships and other paywalls, so ChatGPT will frequently fabricate this information. Results 2, 3, and 4 of this list are all made-up article titles attributed to real scholars and real academic journals.
For example, result 2 is an article titled "Madwoman in the Attic" by Toni Morrison, supposedly published in the journal New Literary History. Toni Morrison is a real author and scholar, but she never wrote an article by that title. New Literary History is a real academic journal, but the volume cited by ChatGPT contains no article by that title or author; in fact, it is devoted entirely to essays on Renaissance literature. The other listed articles are false in similar ways: the journals exist, and the authors are real scholars writing on literary topics, but the articles themselves do not exist.