Cardboard robot

In my first-year writing classes, I typically start with five minutes of freewriting. Since some folks don’t know where to start when they set pen to paper or fingers to keys, I use a random word generator to give students a nudge if they need it.

The fish listened intently to what the frogs had to say.

Frog fountain

Today, I realized the random word generator I use also has a random sentence generator. According to the FAQ on that page, the sentences are not computer-generated; instead, the site draws from a database of human-authored sentences. (It isn’t clear where these sentences come from, although the FAQ says it’s possible to “donate” your own sentences to their database.)
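
(If you're curious how that kind of generator might work under the hood, the idea is simple: pick one entry at random from a pool of human-written sentences. Here's a rough, purely illustrative sketch in Python; it assumes nothing about how the actual site is built, and the pool below just reuses sentences quoted in this post.)

```python
import random

# A tiny stand-in for a database of human-authored sentences
# (these examples are borrowed from the sentences quoted in this post).
sentence_pool = [
    "The fish listened intently to what the frogs had to say.",
    "Pat ordered a ghost pepper pie.",
    "Carol drank the blood as if she were a vampire.",
    "The fifty mannequin heads floating in the pool kind of freaked them out.",
]

def random_prompt(pool=sentence_pool):
    """Return one randomly chosen, human-written sentence to use as a freewriting prompt."""
    return random.choice(pool)

if __name__ == "__main__":
    print(random_prompt())
```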

Pat ordered a ghost pepper pie.

Now serving beer and wine...with pie?

Next week, a handful of my Framingham State colleagues and I will start planning this year’s retreat for first-year writing instructors. The topic of this year’s retreat will be the impact of ChatGPT and large language models (LLMs) on composition classrooms. Although much of the media coverage of LLMs focuses on plagiarism and cheating, I’m equally interested in how tools such as ChatGPT can be used ethically, as a way to kickstart (not replace) creative and critical thinking.

I used to live in my neighbor’s fishpond, but the aesthetic wasn’t to my taste.

Margaret C. Ferguson Greenhouses

Earlier this week, I heard an NPR story in which a college student described the ways he uses ChatGPT as a brainstorming tool in his academic work. In a textual analysis of The Iliad, for example, he used ChatGPT to generate possible thesis statements, chose a thesis he agreed with, and asked ChatGPT to write an outline. Given that outline, he went back to the text to find illustrative quotes and then wrote his own paragraphs to flesh out the argument, creating an essay that would be difficult to flag using existing plagiarism-detection tools.

Carol drank the blood as if she were a vampire.

No more interviews with vampires.

Using ChatGPT to write an entire essay is clearly wrong, but is it wrong to use LLMs to help with brainstorming, organization, or other composition tasks? This past semester, for example, an international student told me he uses ChatGPT to correct the grammar of his essays, and I (personally) don’t have a problem with that. Is relying upon spell- or grammar-check (or hiring an editor) unethical? What about tools such as Grammarly and autocorrect? Does every single idea in a given essay have to come from your own brain, or is it okay to use a random word generator or a quick Google search to jumpstart your thinking?

The fifty mannequin heads floating in the pool kind of freaked them out.

Mannequin heads

We encourage students to ask their professors and writing tutors for help, and we know students sometimes ask their friends, roommates, or even parents to read their essays. How many brilliant essays started as thought-provoking conversations where multiple people contributed ideas? Does asking for help or conferring with peers count as cheating? If asking a human for help is okay, why is collaborating with a computer different?

I can’t believe this is the eighth time I’m smashing open my piggy bank on the same day!

Trojan Piggybank

When it comes to the impact of LLMs in the first-year writing classroom, I have more questions than answers. I know tools such as ChatGPT are here to stay, and I know this generation of students will use generative AI in the workplace of the future. Given those realities, teaching students how to use this technology responsibly and transparently is more helpful than banning it outright. Sometimes allowing (and admitting) the randomness of real life leads to something creative and curious.

Be curious!

Although I myself wrote these paragraphs (with occasional grammar and usage corrections from Google Docs), I did not write the random sentences in between.