No, Artificial Intelligence Is Not the End of High-School English

If you’ve been on social media lately, you’ve doubtless seen fictional stories and essays produced by “ChatGPT,” an artificial intelligence program that can generate remarkably solid prose, instantly and in any imaginable style.

While some find this thrilling, others—mostly writers and teachers—are filled with existential dread. “My life—and the lives of thousands of other teachers and professors, tutors and administrators—is about to drastically change,” wrote teacher Daniel Herman, who predicted “the end of high-school English” in The Atlantic.

I’m not convinced. In fact, what’s happening here is not terribly new. For starters, let’s immediately dispense with the idea that artificial intelligence will make writing instruction obsolete. Remember the “21st Century Skills” movement? There was breathless insistence that with all the world’s knowledge now in our pockets and mere keystrokes away, K–12 education should prioritize critical thinking, problem solving, creativity, and communication. The question on the lips of education’s smart set back then was, “Why cram kids’ heads with a bunch o’ facts when we have Google?” Similarly, why teach writing when “AI” can generate sophisticated text at the touch of a button?

E.D. Hirsch, Jr. nailed the answer 20 years ago: “The Internet has placed a wealth of information at our fingertips. But to be able to use that information—to absorb it, to add to our knowledge—we must already possess a storehouse of knowledge,” he wrote. “That is the paradox disclosed by cognitive research.” University of Virginia professor Dan Willingham has described how research from cognitive science has shown that “the sorts of skills that teachers want for students—such as the ability to analyze and to think critically—require extensive factual knowledge.”

In other words, it takes knowledge to communicate knowledge—or even to have the discernment to judge whether an AI-generated piece of text makes sense or sufficiently responds to a prompt. Herman writes that what GPT can produce right now “is better than the large majority of writing seen by your average teacher or professor.” Perhaps so, but this is a problem unique to the education equivalent of the “worried well” and their teachers. A tiny minority of American high school students are like Herman’s, discussing and analyzing “Anzaldúa’s radical ideas about transcending binaries, or Ishmael’s metaphysics in Moby-Dick.” For most students a cogent five-paragraph essay remains a daunting challenge.

Skilled teachers who know their students seldom fail to notice when an assignment has the thumbprint of a little extra help from home or doesn’t sound original. Since the invention of the pocket calculator, math teachers have faced this problem and solved it with three simple words: “show your work.” It will be no different with AI. Another English teacher, Peter Greene, whose rural Pennsylvania students are far more typical than those in elite prep schools, noted in Forbes that ChatGPT covers gaps in its knowledge by making things up and embellishing. “In other words, it has an eerily human capacity for bullshitting its way around gaps in its data base.” Greene suggests that teachers try out their writing assignments on the chatbot: “If it can come up with an essay that you would consider a good piece of work, then that prompt should be refined, reworked, or simply scrapped.”

The concern that AI makes writing instruction obsolete is a manifestation of the “Curse of Knowledge,” an idea popularized by Chip Heath and Dan Heath in their 2007 book Made to Stick. The curse of knowledge is “a cognitive bias that occurs when an individual, communicating with other individuals, unknowingly assumes that the others have the background to understand.” It’s a common problem in education: to the well-educated and language-proficient, problem solving, critical thinking—and clear, sophisticated written analysis—all feel like “skills” that can be practiced and mastered (or plausibly faked via artificial intelligence), because those making the judgment are already rich in knowledge and sophisticated language. Herman fears students will avoid “doing the hard work of actual learning.” But many students would be hard-pressed to read AI-generated essays with comprehension, let alone pass them off as their own work.

The threat is not that artificial intelligence will make writing obsolete; it’s the assumption that it will. The world of futurists, technology enthusiasts, and educated elites is simply not the same as the world occupied by the substantial majority of American students, whose main challenge is to reach basic levels of language proficiency. Artificial intelligence will provide time-saving tools for the knowledge haves, but it will be fatal to the interests of the knowledge have-nots if they are denied the opportunity to develop the language proficiency that the well-educated take for granted and that makes AI tools useful.