Exploring the Myth of the AI First Draft

I listened to the Teaching in Higher Ed podcast recently and heard an episode with Leon Furze, an educational consultant from Australia and the author of the books Practical AI Strategies, Practical Reading Strategies, and Practical Writing Strategies (2024). Furze studies the implications of generative artificial intelligence (GenAI) for writing instruction and education. In the episode, Furze references a blog post he wrote a few months ago that challenged the use of GenAI tools like ChatGPT or Microsoft Copilot to help people write first drafts. If you’ve listened to any GenAI company market its technology, you’ve probably heard that the tools can help with brainstorming, outlining, and tackling the dreadful “blank page.” GenAI tools, the proponents argue, can offer some assistance with getting started on the difficult task of writing. Furze has some concerns with this approach.

In his post, Furze outlines a few reasons to “be cautious of the AI first draft.” His first reason is pretty esoteric but important: he worries about capitalism, oppression, and the larger impacts on literacy and expression. He introduces the term “computational unconscious,” the idea that technology and technology companies have created an invisible infrastructure that shapes human thought, communication, and interaction. It’s heady (and scary) stuff.

While I share Furze’s concerns about the “computational unconscious,” I’d prefer to dig into one of his other reasons here. Furze worries that GenAI can undermine the purpose of writing. He writes:

“The purpose of writing isn’t just to demonstrate knowledge in the most expedient way. Writing is to explore knowledge, to connect and synthesize ideas, to create new knowledge, and to share. When students use AI to generate a first draft, they skip 90% of that work, creating something that may well be worth sharing, but which has not in any way helped them form and make concrete their own understanding.”

This rationale resonates with me on a bunch of levels. As a writer, I know how difficult the writing process is, but I also know its benefits. In a 2012 blog post, I shared my reasons for writing this blog. I wrote:

“I need to write to learn. I usually have a bunch of ideas swirling around my head. Blogging forces me to connect my thoughts in a coherent way and make sense of the sometimes disparate concepts. It’s not true for everyone, I’m sure. But writing gives me the opportunity to solidify my thoughts and learn. When I start writing a blog post, I usually have a general idea of the subject matter and some of the points I want to make. After getting started, however, I may end up in an entirely different place because the writing process leads me to consider my thoughts in a new way.”

For me, writing helps me make sense of my ideas and construct my understanding of things. Using GenAI to help me write a first draft would rob me of that sensemaking. Sure, it would be easier and more efficient, but it would take the learning away from me. As Terry Doyle writes in his book Learner-Centered Teaching (2011), “the one who does the work does the learning.” If GenAI is doing all the heavy lifting, then I’m not doing the learning. And that’s one of the main reasons I write.