Recently, I’ve been thinking a lot about how we teach writing, problem-solving, and scientific experimentation. In most cases, there’s an attempt to present these skills as step-by-step processes to help demystify their complexity. For example, many teachers use a five-paragraph essay structure to guide students in their writing. In science, many teachers present the scientific method as a linear process: form a hypothesis, conduct experiments, make observations, and draw conclusions. These are algorithmic approaches to complex tasks. They attempt to formalize messy processes by presenting them in more procedural ways.
But writing and experimentation aren’t formulaic. Sure, writing and science have rules and conventions that guide their practices, but good writing doesn’t come from a formula, and very few experiments are conducted by marching through the step-by-step scientific method. Teaching these skills formulaically doesn’t fairly represent their complex nature.
A better approach would be to teach these processes as heuristics. If you’ve been reading this blog for a long time, you may remember that I wrote about algorithms and heuristics way back in 2012. At the time, I was reflecting on conversations that were happening on our campus about writing instruction. In that original post, I wrote: “Students need to know that writing doesn’t follow a simple formula or equation. Writing is an organic process that is informed by practice and guided by strategies. People become better writers not only by understanding conventions and grammar rules but by writing in different genres and gaining real experience with the art and craft of writing. Heuristics can help guide the developing writer and foster a better sense of what writing as a process is.”
Jump ahead a decade, and now I’m thinking about how we’re teaching prompt writing for genAI tools. I’ve been reading a lot of books on AI in education, and I’ve seen a ton of example prompts that people have shared as a way to instruct others on how to write their own. In almost every book I’ve read on genAI, the pages are filled with prompt examples. Prompt examples aren’t cookbook recipes, though. If I follow a recipe in a cookbook, I can be reasonably certain the finished product will closely resemble the goal. I can’t imagine a world where I’d follow a recipe to bake a chocolate chip cookie and accidentally make a coconut macaroon instead. An example prompt, however, produces a different outcome every time it is used. Using genAI is not like using Google or searching within a document. It is generative, which means it creates something new with each use. I can submit the exact same prompt multiple times in a row and get very different responses. And sometimes the response can resemble a metaphorical coconut macaroon: so far from my goal that it’s laughable.
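For readers who like to see the idea in miniature: the difference between searching and generating can be sketched as the difference between looking something up and sampling from a set of possibilities. This is only a toy illustration (the prompt and "continuations" below are made up, and real genAI tools sample over tokens with far richer probability distributions), but it shows why one prompt can yield many outcomes.

```python
import random

# Toy illustration only: a made-up prompt mapped to made-up continuations.
# Real generative models sample from learned distributions over tokens.
CONTINUATIONS = {
    "Write a cookie recipe": [
        "Classic chocolate chip cookies...",
        "Oatmeal raisin cookies...",
        "Coconut macaroons...",  # the metaphorical macaroon
    ]
}

def deterministic_lookup(prompt):
    # Like a search or a cookbook recipe: same input, same output.
    return CONTINUATIONS[prompt][0]

def generative_response(prompt):
    # Like a genAI tool: sampling means the same input can diverge.
    return random.choice(CONTINUATIONS[prompt])

prompt = "Write a cookie recipe"
lookups = {deterministic_lookup(prompt) for _ in range(50)}
generations = {generative_response(prompt) for _ in range(50)}
print(len(lookups))      # always 1
print(len(generations))  # almost always more than 1
```

The lookup returns the same "recipe" every time; the sampled version does not, which is exactly why example prompts can't guarantee example outputs.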
To combat this, we need to teach strategies that people can apply to their prompt writing. For example, check out this resource from the Center for Excellence in Teaching and Learning at the University of Connecticut. While it provides different example prompts, the resource also includes detailed strategies for developing good prompts and considerations to guide prompt writing. If you’re really interested in stepping up your prompt writing (or how you teach it), check out this getting started guide from Harvard University.
To hear a lively conversation about this topic (and last week’s post), give a listen to the Science In Between podcast episode that drops tomorrow.