I identify with the Xennial micro-generation, the cohort between Gen X and Millennials born between 1977 and 1983. I owned the Atari 2600 as a young child, learned to type on a computer in third grade, was an earlyish email adopter when my freshman computer science class required one in 1996, and have seen every generation of every piece of modern technology. I have always embraced technology. As a parent, I see it as my responsibility to always understand my own kids’ devices better than they do. I am the last person to have some knee-jerk reaction against a new technology.
I also use LLMs frequently in my own life. ChatGPT has been my co-pilot as I developed ice-breakers for a kids’ Buddhist program and created basketball drills to meet the needs of the third-grade team I was coaching. My favorite use of ChatGPT so far, and I upgraded to GPT-4 for this, was to plug in a bunch of notes and barely coherent ranting about my difficulties getting some medical procedures covered by insurance. GPT-4 wrote a masterful letter to the insurance company, and shortly thereafter all my procedures were paid for.
Given this background, I was initially excited about using AI in my freshman composition classroom. I attended the Achieving the Dream Virtual Workshop: Implementing AI in Your Classroom where Education Professor Zia Hassan presented his enthusiasm for the coming wave of AI technology and shared many ideas about implementing it in the classroom.
I thought one of Hassan’s best ideas for integrating AI into the classroom was to think bigger – to ask students to do harder work, even work that is orders of magnitude more difficult. I am a bit of an all-or-nothing personality myself, and this idea of designing an English freshman composition course that has no limits on AI was appealing to me. I went back into my project-based learning training (I have a secondary teaching credential from High Tech High in San Diego in addition to my Master’s in Composition) and reached for the stars. Students would work hand in hand with AI as a co-intelligence. They would craft public-facing digital portfolios and develop metacognitive perspectives on the role of learning and language in their own lives. They would complete assignments about diction, sentence structure, and tone and make their own choices about the voice in their final AI/human compositions. They would learn to ask the most difficult questions and to pin down the best, most nuanced, most subtle answers available based on careful interpretation of the available evidence.
And then I thought of my student population. I teach many students who are the first in their families to go to college, and many have had difficult educational experiences, including difficult experiences with writing in an academic setting. What would it mean for them to come into this supercharged curriculum working with AI as a true co-intelligence? In asking this question I realized, not just for my students but for all students learning the foundations of reading, writing, and critical thinking at the college level, that AI is far too helpful to be a useful educational co-pilot. There is not a thing a student could ask AI for help with that doesn’t short-circuit the learning process, and our current LLMs are all too happy to do the work for the student.
Take just one small example: writing an outline. If an AI generates an outline for a student paper, so many steps are skipped. Leave out the creative brainstorming stage, where good ideas can appear anywhere, especially in the shower. Jump right past the organizing, adding, and winnowing stage – no more deciding whether an idea fits, or thinking about which order to put the ideas in. Never mind the synthesis stage, where the student has the opportunity to decide what all the ideas add up to. Other steps are omitted as well – what about the stage of having a conversation with another person to come up with ideas, or of asking for help, or of sharing one’s ideas with a friend or loved one? Skip all that.
Think of the higher-order tasks Hassan alluded to. Imagine asking for an AI’s help reading several academic studies on a controversial topic and poking holes in the methods. Or asking AI to help the student develop an action plan for making progress on an issue that is important to them. Or asking an AI to help them come up with their own deeply thought-out personal philosophy. AI is all too happy to help with all of these tasks. But without the skills students learn in lower-division writing classes, they are not equipped to use AI as a true co-pilot. They are more like a passenger trying to learn to fly a plane: the smart thing to do is to just let the pilot take over.
Like an AI significant other, AI is designed to remove all barriers, all conflict, all struggle. And in human relationships, as in learning, the value is produced as a result of the struggle. No struggle, no real love; no struggle, no real learning.
The current crop of overly helpful LLMs is so desperate to please, so eager to rewrite a phrase to make it more assertive, or to do the writing for a student even when explicitly configured not to, that these models essentially erase student learning opportunities.
And so I have decided, reluctantly at first, that at least with this current version of large language model artificial intelligence, there is no place for AI in my freshman composition classroom.