This post was written by NCTE member Val O’Bryan.
If you read my earlier post, you know that I used ChatGPT to try to generate essay content that would satisfy my assignment specification, with mixed results. I started with a personal narrative essay and a text analysis. The text analysis prompt required the blending of several elements, which ChatGPT wasn’t able to do. I wondered how the tool would perform with the issue analysis report that I have students write as a precursor to our researched argument essay.
The tool is capable of developing responses that make it sound “smart” and real. BBC’s Science Focus explains, “As a language model, it works on probability, able to guess what the next word should be in a sentence.” Even though I knew before using the tool that it was merely a text generator, a language model that makes predictions about text, I found myself conversing back and forth with it. As I continue to think of ways to incorporate this into my own instruction, this is one thing I will be reminding students of: ChatGPT can’t think for you, and it can make mistakes. Even though interacting with ChatGPT can feel like talking to an omniscient being, it has limitations and can’t do everything (yet!).
Despite its limitations, ChatGPT remains a marvelous tool for helping students with the first step of the writing process: choosing a topic. As any instructor knows, a common complaint from students is “I don’t know what to write about.” I asked ChatGPT, “What is a good issue to write about for an issue analysis report for college?” The tool responded with a paragraph about the impact of climate change, pointing out that there are many angles for approaching the topic and a “wealth of research and data, making it an ideal subject.” I responded, “What if I hate climate change?” The tool then generated a list of six possible topics and added the advice, “It’s important to choose a topic that you’re passionate about and interested in, as this will make the research and writing process more enjoyable and engaging.” As soon as I set my students loose to write, I wander around the tables, checking in with students, especially those who seem to be staring off into space. Although I am always happy to conference with students to help them find a topic they are interested in, I can see myself pointing them to ChatGPT to gather some ideas before we conference.
Once I had chosen a topic for my issue analysis report, I continued to ask ChatGPT about it. A student could waste an immense amount of time getting lost in asking questions. Posing a question to ChatGPT can feel like getting a real (and correct) answer, as opposed to Googling the question and having to dig through a list of sources (although Google is getting better at answering questions and citing where it found the answer). If I were to point my students to this tool for questions, I would definitely limit their time, reminding them that this is a short exercise to help them identify subtopics and angles. It is a helpful boost in the thinking process. Students in freshman composition are often teenagers who are still learning about many of the world’s issues. They don’t yet have much depth of knowledge on the topics they are curious about, and they can quickly become overwhelmed by trying to learn about a topic while simultaneously synthesizing what they are learning and meeting instructor writing demands (in addition to managing their other coursework and life circumstances). Having a tool that can make some suggestions to get them started can help with learner fatigue.
Since ChatGPT’s responses depend heavily on how students phrase their questions, students can get very different responses, and it is valuable to talk through everything the tool generated together. If students become interested in an angle or focus inspired by the ChatGPT responses, that is a good time to send them off to do some research to validate the information. When I asked the tool for citations based on the information it gave me, it responded with “I apologize, as I am a language model and do not have the ability to provide citations for research studies.” It also suggested that I search academic databases such as JSTOR, ProQuest, and Google Scholar and listed some key terms for me to search. As the tool stands today, it cannot provide citations, so students still need these critical research skills.
Yes, students may try to feed ChatGPT their assigned prompt and simply have it generate an essay for them. However, instructors can have them use the tool at different stages of the writing process, stopping them along the way to discuss and think critically about the generated content and to search databases for credible sources that support it. The tool still has many limitations: asking the same question twice might return the same response or a different one, it provides no citations, and it is sometimes simply incorrect. Even so, it can be used to help students over some of the initial hurdles in the writing process.
Dr. Val O’Bryan currently works for the Utah Education Network. She specializes in instructional design and is passionate about helping educators integrate technology into their instruction. She earned a doctorate in curriculum and instruction from Kansas State University and a master’s in English, with an emphasis in writing, rhetoric, and digital media studies, from Northern Arizona University. She was formerly a high school English and Spanish teacher and adjunct instructor.
It is the policy of NCTE in all publications, including the Literacy & NCTE blog, to provide a forum for the open discussion of ideas concerning the content and the teaching of English and the language arts. Publicity accorded to any particular point of view does not imply endorsement by the Executive Committee, the Board of Directors, the staff, or the membership at large, except in announcements of policy, where such endorsement is clearly specified.