Discussing ChatGPT and Writing with Students

Author
Daniel Emery

The arrival of ChatGPT has sent shockwaves through popular media and higher education circles. Headlines have suggested that artificial intelligence could render some familiar genres and technologies obsolete (including the college essay and Google). Since ChatGPT-3’s release, the platform’s user base has grown almost exponentially, and reports of early adopters using the technology to ‘fool’ instructors soon followed. In this blog post, we’ll offer a quick overview of AI large language models, discuss how they have already changed the production of writing, and offer some advice on addressing the role of this technology in the future of writing.

What is ChatGPT, and how does it work?

ChatGPT is a form of artificial intelligence called a large language model (LLM). While ChatGPT-3 has garnered most media attention for its ability to rapidly generate text in response to a prompt, many large tech companies have their own LLMs (Google’s Pathways Language Model and Microsoft and NVIDIA’s Megatron-Turing NLG are reportedly the largest) and are developing specialized models for fields like law and medicine.

When a user asks ChatGPT to generate text, the technology draws on a massive number of statistical parameters (175 billion) learned from the gigantic quantity of text on which it was ‘trained.’ Based on that analysis of previously written and available discourse, it generates a text that mimics the plausible output of an actual writer.
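To make the statistical idea concrete, here is a deliberately tiny sketch (not OpenAI's actual method, and nothing like the scale of a real LLM) of text generation from word-pair statistics: the program counts which word follows which in a small "training" text, then produces new text by repeatedly picking a plausible next word at random.

```python
import random
from collections import defaultdict

# A toy "training set": real models train on billions of words.
training_text = (
    "the cat sat on the mat "
    "the dog sat on the rug "
    "the cat chased the dog"
).split()

# Count which words follow which (a simple bigram table).
next_words = defaultdict(list)
for current, following in zip(training_text, training_text[1:]):
    next_words[current].append(following)

def generate(start, length):
    """Build text by sampling each next word from observed continuations."""
    words = [start]
    for _ in range(length - 1):
        candidates = next_words.get(words[-1])
        if not candidates:
            break  # no known continuation; stop early
        words.append(random.choice(candidates))
    return " ".join(words)

print(generate("the", 6))
```

Every word pair in the output was seen in the training text, so the result looks grammatical, yet the program has no notion of what a cat or a mat is. That, in miniature, is the "stochastic parrot" critique discussed below.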

Emily Bender and colleagues described LLMs as “stochastic parrots” in their 2021 critique of the dangers of these models. Unlike writers, language models don’t treat words as carriers of meaning and value. Because their output is based on mathematical probabilities and limited, imperfect training sets, it becomes less accurate and less plausible as tasks become more complex.

[Image: a writing robot, generated with Stability AI’s DreamStudio]

Talking to students about ChatGPT

Much of the public discourse around ChatGPT has focused on questions of academic integrity. Instructors may add statements about using AI to their course policies on academic integrity. While OpenAI is exploring a ‘digital watermark’ that would make AI-generated text easier to detect, technologies for identifying machine-generated text lag far behind the capabilities of most text-generation tools.

If students feel that they understand the value of a writing assignment and perceive themselves capable of completing it, the likelihood of academic misconduct drops dramatically. The Office For Community Standards, the Center for Educational Innovation, and the Teaching with Writing program all offer faculty resources to help design courses and assignments that promote academic integrity.

A more productive conversation about ChatGPT might focus on the strengths and limitations of AI-generated texts and the value of writing as a mode of learning.  While AI tools are often successful at providing overviews, summaries, and examples, they are challenged when applying knowledge to specific contexts, identifying personal connections, or synthesizing information from research. While ChatGPT can create a semblance of research writing by adding citations, the tool can’t access most sources located behind paywalls and doesn’t ‘know’ that its auto-generated citations are simply made up.

Because AI and machine learning are rapidly changing the nature of research in many fields (from biology to psychology to archaeology to creative writing), ChatGPT and other technologies can help students in your classes understand more about novel and exciting research. By emphasizing AI as a tool for enhancing understanding rather than a shortcut to generating words, we can help students prepare for an unpredictable future.

Additional Resources

AI and ChatGPT in Teaching: Context and Strategies by Clare Forstie, Center for Educational Innovation, offers a brief, clear, and well-detailed overview of how AI is transforming learning on campus.

AI Text Generators and the Teaching of Writing from the WAC Clearinghouse provides meaningful resources, sample course policies, and an updating archive of news and academic features on AI.

The Future of Writing: Collaborative, Algorithmic, Autonomous, by Ann Hill Duin and Isabel Pedersen, explores how non-human agents are changing the landscape of discourse production worldwide.

Further Support

See the Teaching with Writing web pages or teaching resources. As many of you know, our WAC program also hosts the popular Teaching with Writing event series. Each semester, this series offers free workshops and discussions. Visit us online and follow us on Twitter @UMNWriting.

Are you looking to change up writing assignments or grading strategies? Talk to us! We like thinking with faculty members, instructors, and TA/GIs about all matters related to teaching with writing in courses across the University curriculum. Do you have questions about writing assignments and activities, grading writing, providing feedback, or using digital tools? Contact us to schedule a phone, email, or face-to-face teaching consultation.

Comment

I have not played around with ChatGPT much, but I have heard that it can do a good job of rewriting paragraphs and fixing poor language. I teach students for whom English is not their first language, and I'm wondering if ChatGPT could be used to 'smooth out' the language around a set of logical points that they want to make in their paragraphs (especially if they did it one paragraph at a time).

Hi Judy- "Good job" is somewhat relative. The more technical the topic and the more obscure the vocabulary, the more likely ChatGPT-3 will reveal its tendency to make up facts and spout nonsense. ChatGPT reliably creates sentences that surmount the bar of grammatical and syntactical correctness, but it tends to offer surface-level summaries and will not accurately cite relevant literature. A knowledgeable writer might play around with AI for revision, but the output is based on statistical probabilities, not meaning. I suggest a strong caveat emptor for anyone who wants to try it and note that such technologies are prohibited by some institutions, programs, and publication venues.

At the same time, many tools exist that will attempt to rewrite and revise prose (called 'text spinners'). In their earliest incarnations, they tended to work like overactive thesauri, generating synonyms and cognates ad nauseam. Still, they have grown a little more sophisticated and less embarrassing. Many multilingual writers already use Grammarly and the text-suggestion tools built into word processing and communication technologies.

The notion of "smoothing out" accented writing is also controversial. ChatGPT is not trained in the vast array of vernacular discourses that are a part of English Language usage worldwide, and it lacks consciousness that can make choices about code meshing. While we know that language discrimination is an obstacle to equitable participation in scholarly writing, I am not sure if an AI will be an agent of positive change or turn our unconscious biases into algorithmic rules. In my utopian vision, humans will collaborate with AIs to create previously impossible texts and enhance the spread of helpful knowledge across differences. In my moments of less optimism, I worry that AI will work like the Borg on Star Trek, assimilating all linguistic differences into a bland, pale, and technocratic monotone.

I appreciate this timely post. When Ann Hill Duin and Isabel Pedersen ("The Future of Writing," above) presented at IEEE ProComm last summer, chatbots felt to me like a game changer for any course in which instructors assign writing as a mode of learning. Prompts in ChatGPT return discourse instead of links, so that feature really does change the search game.

The tool can be instructed to provide citations as well--a research aid for students and instructors alike.

Still, from my instructor's point of view, the bot generates source material, not essays or reports or technical descriptions or anything else I assign. I don't yet have enough experience with bots or students' use of them to know what challenges might arise when students receive an entire 1500-word draft of something from ChatGPT and then have to figure out how to put it all into their own words.

What does it mean to instruct students to "paraphrase all source material" when their source material is a whole draft of a feasibility report?

Hey Dan,
Any thoughts on my question, “What does it mean to instruct students to "paraphrase all source material" when their source material is a whole draft of a feasibility report?” Forgot to add my name to my initial post.
Thank you,
Joe

Hi Joe-
I have thoughts but no easy answers. When we ask students to paraphrase, we're asking them to take the content of an idea and restate it in their own words. Of course, the writing researcher in me will always say that the line between "Ideas" and "Words" is never clear cut, and the notion of anyone using "their own words" is pure fiction. We're always committed to use language that precedes us in culturally sanctioned modes that allow us to be heard. Even then, clear communication of an idea doesn't always produce the desired effect in an audience, sometimes with tragic consequences.
