This post was written by NCTE member Val O’Bryan.
When I started seeing OpenAI and ChatGPT all over my news feed and faculty emails, I knew I needed to learn more about these constantly evolving technologies. Colleagues in the English department were worried about students using ChatGPT to write their essays for them. I had seen AI-generated text in the news media, and it seemed plausible that the tool could compose entire coherent essays on a topic. I decided to feed ChatGPT my own essay prompts to see what it could do.
The first essay that my students write in our freshman composition course is a personal narrative. I prompted ChatGPT with “Write a 700-word personal narrative” and then watched it generate the text in real time. It took about a minute and a half to construct the essay. The essay was 663 words and ended with an incomplete sentence; the last phrase was “But I also know that I have the strength and resilience to,” without any punctuation. The tool delivered a first-person narrative about someone who is hit by a car as a child and overcomes their serious injuries. Despite the unfinished final sentence, the tool even created a conclusion with an overarching message, something my students often struggle with. They can tell a sequence of events in order but can’t always thread a theme through their writing and end with a culminating takeaway message.
I entered the same prompt into ChatGPT two more times, just to see what would happen. Would it create similar narratives? Completely different ones? As it turns out, even with the exact same prompt, the tool creates unique responses with different themes. The second narrative was 668 words about someone who nearly drowned as a child and became a teacher to help children (and did have a complete ending sentence). The third narrative was 670 words about someone who grew up wrestling competitively and had to overcome an injury. This essay also ended unfinished. Yes, ChatGPT can turn out a solid personal narrative, but one of these essays was unusable for a freshman composition assignment because it specifically mentioned graduating from college. As I continued playing with the tool, I tried to get more specific about topics and themes, and it was able to construct something age-appropriate, including an essay about a student who learns to love themself through taking AP Biology. The more specific the instructions, the more appropriate the content was for what I needed.
The next essay that we tackle is a text analysis, using the rhetorical situation, as outlined in The Norton Field Guide to Writing, as our framework. We start by looking at one part of the rhetorical situation at a time and applying our knowledge to a video commercial of each student’s choosing. After a few weeks, we put together an essay based on the writing the students have been doing all along about each part of the rhetorical situation. The result is a blend of direct quotes from the textbook that help us understand the rhetorical situation and the application of that framework to the chosen advertisement.
I had a hard time figuring out how to prompt ChatGPT in a way that would genuinely fulfill my expectations for the assignment as an instructor. If you ask the tool to generate a text analysis or a rhetorical analysis of a specific video ad, it can do so. The problem is that it doesn’t bring in any of the information from the class discussions or the textbook. Even if you prompt it to use the textbook, it still doesn’t incorporate direct quotes. It will also pull in information on rhetorical analysis that wasn’t part of the instruction at all, making it very clear that the generated text reflects none of the course instruction.
For a moment, I was happy that I had designed an essay prompt that “beat” ChatGPT, although the tool continues to evolve and develop daily. Trying it out for myself gave me a lot to think about as an instructor. How do I make my essays and assignments something students want to do, so they aren’t tempted to use this tool? How can I create assignments that don’t drive students to it in desperation? How do I manage our time together in class to encourage the development of ideas and drafting (when I can see it happening!)? It also occurred to me that even if I do everything I can to have students write in class, they could still use the tool outside of class and bring the generated text back. Realizing that there was nothing I could do to keep students from using ChatGPT made me think about ways I could use it as part of my instruction. Although I do encourage everyone to try out their prompts on ChatGPT just to see what can happen, I have decided not to focus my energy on trying to catch students who use AI. Instead, I am looking for more ways to use the tool with students.
Dr. Val O’Bryan currently works for the Utah Education Network. She specializes in instructional design and is passionate about helping educators integrate technology into their instruction. She earned a doctorate in curriculum and instruction from Kansas State University and a master’s in English, specializing in writing, rhetoric, and digital media studies, from Northern Arizona University. She was formerly a high school English and Spanish teacher and adjunct instructor.
It is the policy of NCTE in all publications, including the Literacy & NCTE blog, to provide a forum for the open discussion of ideas concerning the content and the teaching of English and the language arts. Publicity accorded to any particular point of view does not imply endorsement by the Executive Committee, the Board of Directors, the staff, or the membership at large, except in announcements of policy, where such endorsement is clearly specified.