
Tuesday, December 9, 2025

Samantha Fulnecky and the Assault on Learning

Samantha Fulnecky

I’ve attempted to avoid OSU undergrad Samantha Fulnecky’s newfound celebrity for refusing to do her homework. Not because I don’t have an opinion, but because I thought I had nothing to offer. Despite my years of experience teaching college-level composition, I saw this as a classic teachable moment distorted into notoriety by a partisan media machine. I wouldn’t have given Fulnecky’s paper a zero; I would’ve returned it ungraded and invited her to my office hours.

I would’ve offered Fulnecky two pieces of guidance. I would’ve stressed that a reaction paper needs to address the subject at hand, which hers doesn’t; her sprawling, disorganized essay addresses transgender issues overall, but scarcely mentions the source material. Then I would’ve stressed that religious dogma isn’t an academic argument, except in theology classes. Appeals to Christian belief will only persuade fellow Christians—and not just any Christians, but those who share her specific theological interpretation.

Then something struck me. Long before I encountered the story, more informed critics pointed out that Fulnecky doesn’t cite the Bible in her text; she refers to Christian beliefs, but without actual textual source. She doesn’t present a Christian dogmatic position, she offers Christian vibes. Although standard citations exceed the requirements of a reaction paper, if you bring in outside sources, you need to cite them, not just snuggle into them like a comfy blanket.

I’ve noticed something similar in other venues recently. Prager “University,” a YouTube account where prominent conservatives deliver lectures about hot-button issues like science, world affairs, and liberal arts, doesn’t cite sources either. This isn’t incidental. Critics who exhaustively analyzed PragerU’s content note that, when lecturers do make claims of fact, those claims are often ephemeral, difficult to source, or easy to rebut with a fifteen-second Google search. Context, nuance, and complexity go out the window.

Conservatives love the trappings of universities. Besides Prager “University,” prominent examples include the now-defunct Trump University, where The Donald licensed sock puppet instructors to teach get-rich-quick schemes, and Hustlers’ University, where Andrew Tate… um… also teaches get-rich-quick schemes. Right-wing philanthropy networks keep endowing chairs and unofficial newspapers on conventional campuses. Ted Cruz and Ben Shapiro love touting their Ivy League law degrees, right before floating proposals to demolish higher ed, and especially the Ivy League.

Dennis Prager

They’re somewhat less enthusiastic about actual higher education. In the 1990s, when I restarted my postponed post-secondary education, conservative outlets like National Review magazine were the vanguard demanding a rich, robust liberal arts curriculum. By the 2010s, conservatives had abandoned such principles. Besides partisan opposition to higher ed, pundits like the late Charlie Kirk, who dropped out during his first semester, made a living touring university campuses to encourage youth to avoid schooling.

Even where they don’t actively sabotage schools, they undermine education generally. In my state, the one-party government is actively torpedoing arts and humanities curricula, effectively turning state universities into trade schools. Samantha Fulnecky’s refusal to engage with her subject follows a pattern prescribed by online firebrands like Kirk and Shapiro: do everything within your power to avoid opening your mind or risking a change of opinion. Conservatives enroll in college to win, not to learn.

Dennis Prager and Samantha Fulnecky share a premise that truth is something we assert morally, not something we demonstrate empirically. Therefore, changing your mind when presented with contrary evidence is a form of moral failure. That’s why they eschew source citations, because it doesn’t matter what anybody else says, not even their own historically ambiguous holy text. Fulnecky demands experienced, credentialed mentors adjust themselves to her pre-existing beliefs, the God’s Not Dead model of education.

Let me emphasize, education shouldn’t aim to change students’ minds. It would be remarkable if a thorough education with a liberal arts core didn’t make youth change their minds on some topics, but changing one’s mind isn’t a prerequisite. But addressing complex topics more deeply, with a thorough topical understanding and a familiarity with prior knowledge and existing debates, is a prerequisite. And that requires knowing and citing sources, even those that disagree with you.

Conservatives once believed in difficult, diverse education; but conservatives like William F. Buckley and George Will have become rare. These pundits were intelligent, gentlemanly, and most important, willing to face evidence. Today’s conservative leadership doesn’t want to engage in debate, it wants to silence it, and dogmatic belief is one tool to achieve that goal. When your goal is not to question but to clobber, making yourself a media darling is an easy way to win.

Wednesday, September 17, 2025

“Debate” Forever, Accomplish Anything Never

Charlie Kirk

Of the numerous encomiums following Charlie Kirk’s assassination last week, I’ve seen many writers praise his love of “debate.” They’ve extolled his supposed fondness for college speaking tours whose centerpiece was his open-mike question sessions, in which he used his proven acumen to dismantle undergraduates’ post-adolescent leftist idealism. He died under a banner emblazoned with his speaking slogan: “Prove Me Wrong.”

Recent public-facing conservatives have enjoyed the appearance of “debate.” Like Kirk, Ben Shapiro and Steven Crowder have made bank playing college campuses, inviting ill-prepared undergrads to challenge their talking points. Crowder’s notorious “Change My Mind” table banner turned him into one of the most eminently memeable recent public figures. And almost without fail, they mopped the floor with anyone who dared challenge them.

Smarter critics than me have shown how these “debates” are, at best, fatuous. Kirk, Shapiro, and Crowder arrive better prepared, often carrying reams of supporting research that undergraduates just don’t have. Shapiro is an attorney, a graduate of Harvard Law School, while Crowder is a failed stand-up comedian, so they’re simply better trained at extemporaneous speaking. Kirk’s training was informal, but his mentor, advertising exec Bill Montgomery, coached him well.

Myths of Socratic dialog, high school and college debate clubs, and quadrennial Presidential debates have falsely elevated the ideal of “debate.” The notion that we’ll eventually overcome our differences, if only we talk long enough, underlies the principles of American representative democracy. College freshman comp classes survive on the notion that we can resolve our disputations through language. As a former comp teacher, I somewhat support that position.

However, we’ve seen how that unfolds in real life. As Rampton and Stauber write, defenders of the status quo prevent real change on important issues by sustaining debate indefinitely. As long as reasonable-sounding people keep discussing how to handle racism, war, police violence, global warming, and other issues on basic cable news, they create the illusion that controversies remain unresolved. Conservatives need not win, only keep the debate alive.

Public-facing conservatives on college campuses are the reductio ad absurdum of this reality. When men in their thirties, trained to speak quickly, and notice fiddling verbal inconsistencies, try to tackle wide-eyed undergrads, they look victorious. But that’s an illusion, created by the fact that they control the field. Just because an attorney can conduct cross-examination, or a comedian can do crowd work, doesn’t mean they’re correct.

Plato and Aristotle, depicted by Raphael

Just as importantly, the changing technological landscape means students have less information in reserve for extemporaneous discussion. Back in my teaching days, technological utopians claimed that students having information in reserve was less important than being able to access information on an as-needed basis. But these “debates” prove that, in the marketplace of image, knowing your subject matters when your opponent is stuck Googling on their smartphone.

Since I left teaching, ChatGPT and other “large language models” have diminished students’ need to formulate ideas in any depth. As I told my students, we don’t write only to communicate our ideas to others; writing also crystallizes our vague, ephemeral thoughts into a useful form, via language. But if students delegate that responsibility to artificial “intelligence,” they can’t know their own ideas, much less defend them on the fly.

Higher education, therefore, leaves students ill-prepared not only to participate in Charlie Kirk-style “debates,” but also to judge whether anybody’s ideas run deeper than street theatre. I don’t blame teachers; I’ve known too many teachers who’ve resisted exactly this outcome. Rather, it’s a combination of bloated administration, regulations handed down by ill-informed legislatures, and a PR campaign that made Kirk look more erudite than he actually was.

Socrates saw his dialectical method, not as an abstract philosophical good, but as an approach to civic governance. In Plato’s Republic and Phaedrus, he declares his belief that deep thinking and verbal acumen trained up worthy, empathetic rulers. But his method required participants whose engagement went beyond mere forms. It required participants sophisticated enough to admit when they were beaten, and turn words into substantive action.

Charlie Kirk was an avatar of a debate structure that prizes fast talking over deep thinking. His ability to steamroll students barely out of high school looks impressive to people who watch debates as spectator sport. But his approach favors form over substance, and winning the debate over testing which ideas are superior. He was exactly the kind of rhetorician that Socrates considered an enemy of the Athenian people.

This produces a society that’s talked out, but too tired to act.

Monday, August 18, 2025

Some Thoughts on Cursive

“You know why we need to teach cursive in public schools?” someone belted at me on social media. “So kids can read important historical documents!” This has apparently become something of a clobber argument online recently, in the debate over whether schools should maintain the curriculum of good penmanship. As usual, this fellow hit me with links to the most common “cursive” documents, the U.S. Constitution and the Declaration of Independence.

Back during my teaching days, students already railed against teaching cursive writing. They regarded elegant handwriting as retrograde in an age of cheap digital printing. Cursive is rarely mandatory outside schools, surviving mostly in occasional official government paperwork. Beautiful handwriting has become an unnecessary luxury from another time.

Except, looking at the Declaration of Independence, something struck me: that isn't cursive. The most common version we all know, Timothy Matlack's transcription of Thomas Jefferson's text, was written in Copperplate, a form of calligraphy. Printer John Dunlap reproduced the text in a typeface called Caslon, popular in the 18th century but mostly forgotten now.

Because of the way public schools teach cursive, I understand why people might lose the distinction between cursive and calligraphy. I studied in the waning days of Palmer penmanship; by the time my sister was in grade school, D'Nealian had largely supplanted Palmer. But in both cases, teachers heavily emphasized precision, elegance, and cleanliness. Cursive isn't meant to be any of these things.

I don't recall anybody telling me that the purpose of cursive isn't to write beautifully, it's to write quickly. Where the printing we all learned in kindergarten sought to create legible text that others could read, cursive was meant for writing at the speed of thought, letting students take notes or get thoughts onto the page quickly. Legibility to others should be secondary in cursive writing, at least in its origin.

Print writing is done mainly in the fingers and wrists. Any college student who's tried to take notes through an interminable lecture course knows how painful that becomes at speed. Cursive should move the effort of writing into the elbow and shoulder, letting the hand stay relaxed while the arm does the work. This should result in copious quick writing with a minimum of fatigue.

But nobody ever told me that. All the emphasis in my grade-school cursive instruction stressed beautiful, elegant text. That is, calligraphy. As anybody who has ever practiced any art knows, you can have grace and precision, or you can have speed, but not both. By teaching cursive as beautiful writing, my teachers made it burdensome and impractical. No wonder my classmates and I mostly abandoned it when it stopped being mandatory.

I don't want to disparage calligraphy. In an era where text has become utilitarian and bland, I don't want to see beautiful letters disappear. But willfully beautiful handwriting is slow and time-consuming, and like oil painting or guitar playing, not meant for everyone. Those who stand to benefit should definitely practice calligraphy. But again, calligraphy and cursive are not synonyms.

Considering all the time I spent taking notes, and later watching my students take notes, we could've all benefited from real cursive. All the time squandered trying to massage out painful hand cramps or fend off carpal tunnel syndrome, could've gone instead toward art or science or business. If we'd understood how to write quickly, without fatigue, we could've had so many more hours in our days.

Instead, we were misled into thinking we had to accept hand pain as necessary for quick, practical writing. I don't think my teachers misled me deliberately. As with other subjects, their own instruction on how to teach skimmed past the “why” and onto the mechanics. While I believe teachers, at least the ones who last, are good people who love students, the school system sadly inculcates incuriosity and anti-intellectualism, in spite of teachers' best intentions.

Excessively precise handwriting therefore falls onto the same spectrum as “skillz drillz” math exercises and translating Shakespeare into “plain English.” It makes educational administrators feel useful, and gives test writers something to measure, but alienates children from their natural curiosity. Kids need cursive, not because penmanship is elegant and historically significant, but because it's a mode of thinking. Kids need to learn that getting ideas out of their heads, and onto paper, is how we turn abstract thoughts into useful action.

Writing is the bridge between thinking and doing. And cursive is how we cross that bridge without getting tired.

Tuesday, July 8, 2025

The Meaning of Life in “The Life of Chuck”

Mike Flanagan (director, from a Stephen King novella), The Life of Chuck

Albie Krantz (Mark Hamill) explains the harsh truth to Chuck, in The Life of Chuck

Late in this movie, title character Charles “Chuck” Krantz (Benjamin Pajak) has a heart-to-heart with his grandfather. Albie Krantz (Mark Hamill), an accountant, does that terrible thing adults inevitably seem to do: he urges Chuck to abandon his dreams and get a “real” job. He doesn't mean anything malign. Albie just wants the grandson he raised to have a future that doesn't include poverty and a career-ending injury.

This encapsulates the moral ambiguity underlying the movie. More than the apocalyptic opening act, in which the universe's existence balances on adult Chuck's survival, this admonition dives into why Chuck makes the decisions he does. The movie unfolds in reverse sequence, and what happens in each act only makes sense from what we see next-- which is actually what Chuck experienced previously.

Grampa Albie, whom Chuck calls by the Yiddish term Zaydie, sees accountancy as more than a job. He describes the complex numerical relationships in his clients’ finances as the distilled, clarified maps of their lives. He has the same nigh-divine attitude to bookkeeping that Galileo had to astronomy: the numbers show us how God moves in our lives and illuminates our way.

Chuck, a middle-school dance prodigy, has the power to stir audiences’ souls with his body movements. For him, dance is communication. He tells his audience a story, and dance is a conversation with his dance partner, a tall eighth grader named Cat. He became the first kid in school to master the Moonwalk because, while dancing, his body was so thoroughly attuned to his mind. A survivor of childhood trauma, Chuck only feels completely integrated with himself while dancing.

In other words, Albie sees the world as a scientific relationship of mathematical forces. Chuck sees it as emotional truth. But the joy in Albie's eyes announces an emotional bond with his numbers, while Chuck has mastered the physical calculus of dance. On some level, each understands the other's sentiments. But Chuck has only one life, and can't do both.

Every dancer, actor, musician, and author has faced the question: is this all worth it? Most of us, sooner or later, say “no.” Rent and groceries cost too much, and we're getting old. Dancers are especially vulnerable to this, because they're susceptible to disabling injuries that rock stars and novelists never face. Even those rare few working artists, who get paid for a while, quit because they can't buy a house or raise kids.

Chuck Krantz (Tom Hiddleston) cuts a rug on the streets of Boston, in The Life of Chuck

In that light, urging kids to relinquish high-minded dreams early can feel like an act of mercy. Why let them linger in false hope when they could make a living, earn equity, and join a community? This goes double for dancers, who are about as likely to retire because of disabling injuries as NFL players. If you can spare kids from disappointment and disfigurement, perhaps you should.

Yet it's impossible to convey that message to children without telling them something else: “You're going to fail.” And because children are children, deaf to nuance and the exigencies of time, they hear that as “You are a failure.” Protecting kids from a heartless, hostile world causes them to internalize a message of self-abnegation and defeat. Parents don't mean it, but almost inevitably, they teach kids to dream small.

The movie hedges on when Chuck bifurcates into the artist and the accountant. Yet this is clearly a step on this route. At various points, Chuck re-learns the lesson that demonstrating autonomy is equal to disappointing his Zaydie. Like many Stephen King stories featuring child protagonists, this one carries the moral that becoming an adult means becoming small enough to fit this world's demands.

Except, in reverse order, it doesn't.

Adulthood, for Chuck, means accepting small, fiddling responsibility. By the time we see Zaydie warning Chuck to dream small, we've already seen that he becomes an accountant and gets married. But dance as an act of communication remains part of him. His climactic dance with Cat repeats itself on the streets of Boston when circumstances remind adult Chuck (Tom Hiddleston) that he's most truly himself while using his brain to control his body.

Because even when adults accept small dreams in exchange for security, that dreaming child survives. Kids yearn to be artists, or builders, or heroes, not only for themselves, but because these are social roles. Big dreams aren't selfish, they tie us to our people and communities. Chuck and Zaydie aren't really at odds, even when they disagree. They just have different routes to the same goal.

Saturday, November 9, 2024

Anatomy of the Un-Free Mind

Jason Stanley, Erasing History: How Fascists Rewrite the Past to Control the Future

Fascists, of both the small-f and large-F varieties, have a curiously adversarial relationship with history. Their entire political movement depends on myths of past national greatness, which is almost always presented as lost, but which they promise to restore. But they generally despise historians, and attempt to squelch nuanced or conflicting narratives. Briefly, they adore the idea of history, but despise the practice, especially if it requires any self-reflection.

Yale philosophy professor Jason Stanley has written multiple books about how fascists, propagandists, and spin doctors use language and knowledge to strangle public discourse. This book’s title appears to promise a look at how authoritarian regimes rewrite history generally, but in practice, it focuses primarily on academia. Stanley examines how regimes inculcate a spirit, not only of ignorance, but also incuriosity, among citizens at a formative age.

First, tempting though it might be, Stanley stays substantially clear of large-F Fascists. He talks somewhat about Hitler, less about Mussolini. But he mainly focuses on current strongman authoritarian regimes, especially Putin’s Russia and Netanyahu’s Israel. He spends some time on the British colonial empire in Africa and India, his father’s scholarly specialization. And he unambiguously aims his harshest criticisms at Donald Trump’s American brand of anti-intellectualism.

In Stanley’s telling, fascists begin by constructing the purpose of historical education. Their reasoning starts from an intended conclusion—instilling a love of country and an adherence to hierarchy—and retrospectively determines how to achieve that goal. This means having institutional control of textbooks, administration, and personnel. Conservatives have made firing educators and replacing trustees a cornerstone of their recent campaigns.

The process of controlling the historical narrative closely resembles the process of creating imperial colonies; this isn’t coincidental. Autocrats create a hierarchy that, they contend, has always existed. They instill a central imperial language, and make it illegal to speak indigenous languages; not for nothing did British colonialists force the Kikuyu of Kenya onto reservations, exactly as America did to its native population. Because indigeneity is necessarily anti-authoritarian.

Jason Stanley

Here, I wish Stanley went deeper into how administrations silence history among adults. He describes how regimes use schools to prevent passing local autonomy and traditional identities onto the next generation. But how, other than armed force, do autocrats control adults? Stanley is vaguer here, perhaps because academics only began studying the process in earnest after the atrocities of World War II. Traditional knowledge disappeared quickly, and I’m unsure how.

Mythical history looms large. That might mean presenting Germans as the genetic descendants of ancient Greece, as the Reich did (they’re not), or presenting George Washington as blameless, honest, and certainly not a slaveholder, as schoolbook history does. Either way, it presents an innocent past that enthrones the dominant population as necessarily deserving power. This mythic past presents history as a constant decline from prelapsarian goodness, which politics must promptly reclaim.

Many critics respond by insisting that “classical education” counters authoritarian overreach. But Stanley insists that there’s no single magic machine. Classical education can empower intellectual curiosity and resistance to tyranny, if teachers focus on the questions the ancients raised, and if teachers address ways that our morality has changed. But authoritarians love using “classical education” to teach mindless adoration for the dead, which only compounds state-centered mythological ignorance.

Although Stanley focuses on history, he acknowledges this applies to all disciplines. He quotes Toni Morrison, who wrote that choosing the canon of literature is very much about choosing the national culture. When science serves the purpose of politics and industry, rather than inquiry and discovery, scientists always arrive at state-sponsored conclusions. The conventional liberal arts can improve human experience, or they can tie us to autocrats. Fascists know this.

Stanley makes no bones about his motivation. Donald Trump used executive authority to propound a national history curriculum that elided slavery, native extermination, and crackdowns on organized labor. The most extreme forms of American conservatism use the same techniques of historical erasure used to justify Putin’s imperialism or Britain’s conquest of India. Informed, politically invested citizens have a responsibility to reclaim history, both its glories and its tragedies, for the commonwealth.

This breakdown is chilling, certainly for those of us who believe in learning and inquiry, but hopefully also for anyone who just has kids, or loves a free society. Knowing history isn’t just a moral good, it’s a commitment to liberty and democracy; when governments decide what citizens may know, they control electoral outcomes. But the darkness notwithstanding, Stanley’s breakdown assures us that ignorance can be resisted. If we try.

Thursday, October 24, 2024

Large-Group Dynamics and the Lonely Child

young woman with books leans against the school library shelves

Nobody actually likes the popular kids in high school. You wouldn’t know that from the deference they receive, from peers and teachers alike. Yet several years ago, I read Dr. Ellen Hendriksen, who delved into several studies of how people make friends—and the outcomes were surprising, and frequently ugly. Our social structure relies on principles which we frequently can’t see or understand.

Quoting a 1998 study by Dr. Jennifer Parkhurst et al., Hendriksen writes that Parkhurst studied high school social dynamics, a popular field in social psychology. They concluded that popular kids are well-liked, amiable, and natural leaders. But Parkhurst took the unusual step of reading her outcomes to the students she’d studied. To her astonishment, one of her subjects stood up and said (I’m paraphrasing): “Nuh-uh!”

One of Parkhurst’s student subjects, supported by others, reported that peers often widely dislike, even despise, the “popular” kids. They achieve popularity by dominating others, throwing their weight and social connections around, and behaving in an entitled manner. Parkhurst, astounded by the outcomes (and probably suffering her own flashbacks to adolescence), reevaluated the data. Turns out, people obey popular kids mostly out of fear and fatigue.

Growing up in a military household, my family moved frequently. Many military brats say likewise, but my father served in the Coast Guard, which mostly operates domestically, and therefore can afford to move personnel more frequently than other branches. Only once did we stay anywhere longer than two years. This proved particularly frustrating because, I now realize, most schools have an unofficial hazing process usually lasting a year.

Without the longitudinal experience that comes from staying in one place for long, I truly never learned to read group dynamics in large populations. If Hendriksen hadn’t reprinted Parkhurst’s findings, translated into vernacular English, I might’ve persisted in believing that I received that hazing alone, unaware that everyone else experienced it too. I certainly would’ve remained mired in the delusion that the popular kids spoke for everyone.

(I know others, like migrant farmworkers’ kids, undoubtedly have it worse. I’m not comparing scars here.)

young child sits alone amid a crowd of active children

Put another way, I legitimately believed, not only throughout childhood but well into adulthood, that the loudest, most attention-hungry person in the room spoke for everyone. Presumably we all experience that phase, including that person. You presumably watched Mean Girls too. The persons demanding others’ attention and obedience legitimately believe they’re shepherding the crowd where it wants to go, simply keeping stragglers in line.

Something which former gymnast turned lawyer Rachael Denhollander said recently stuck with me. Speaking in the documentary For Our Daughters, Denhollander said: “It costs you something to side with the weak and the vulnerable and the oppressed. It costs you nothing to side with the one who’s in power.” Denhollander meant this about women and girls sexually abused in church, but it applies, mutatis mutandis, to all relationships with power.

For most children, public schools are our first interaction with organized power. Teachers have nigh-absolute power over their students, and I believe most wield that power with benevolent intentions. But as with most powerful people, there’s a gulf between intention and act. Whether they bend to a malicious minority, or go along with administrative dictates to get along, the outcome is largely similar for students, inexperienced at resisting injustice.

Popular kids and “mean girls” basically reproduce the regimes they witness, filtered through children’s eyes. They misunderstand the larger purposes behind adult authority; they only witness the demand for obedience and conformity, and repeat it. Meanwhile, adults don’t think like children, and attribute adult reasoning to childlike behavior. Both the popular kids and the subject-blind adults side with the powerful, which costs them nothing.

Kids could, hypothetically, organize against the popular kids and the adults who enable them. Indeed, something Malcolm Gladwell wrote recently stands out: that subgroups like Goths resist by making themselves look unapproachable, thus exempting themselves from popularity dynamics. But the outcasts shepherded by the cool kids, almost by definition, lack the leadership and organizational skills to unionize and form healthier social dynamics. They’re doomed to struggle.

My father timed his retirement to coincide with my high school graduation, whereupon the family relocated one last time, to their hometown. This dumped me into adult responsibilities with no existing social network to streamline the transition. I hope other “nerds” and outcasts at least preserved their nominal support systems, because to this day, I struggle to read rooms. No wonder so many adults still have nightmares about high school.

Wednesday, February 14, 2024

In Dispraise of “Originality”

Jimmy Page (with guitar) and Robert Plant of Led Zeppelin

“You mean other people are allowed to use and repurpose music that already exists?” Sarah exclaimed, eyes wide and jaw dropped. “When I took the composition class in college, they insisted I had to invent my music out of whole cloth!”

I’ve forgotten how we reached the topic—casual conversation is frequently winding and byzantine. But I’d mentioned the multiple lawsuits surrounding Led Zeppelin, who have costly judgements against them for appropriating works by Black American songwriters, making fiddling changes, and slapping their own bylines on them. I’d offered the likely explanation: there’s a long blues tradition of songwriting by jamming around existing songs until a new song emerges.

This left Sarah flabbergasted. “My professor so thoroughly insisted on complete originality that she demanded we start composing with random notes, and building the piece around that.” I could completely believe that, too. Having attended our college’s new music showcase a few times, I remembered the preponderance of discordant, atonal music. I thought every undergraduate considered themselves another John Cage. Turns out the professor liked that effect.

Sarah felt faux outrage at the injustice of having been told that every composition had to be original. (Okay, “outrage” is overselling it. But definitely astonishment.) Classical and orchestral composers, trained in frequently grueling college and conservatory programs, have an ethos of complete originality drilled into them persistently. Meanwhile, working songwriters, crafting genres people actually pay money to hear, pinch and repurpose existing themes regularly.

Bob Dylan described his early songwriting style as “love and theft.” His earliest recordings show how frequently he lightly reworked existing Woody Guthrie or Dave Van Ronk tunes. Only with years of experience did he develop his own songwriting voice. Lennon and McCartney are among the bestselling songwriters ever, yet three of the Beatles’ first four albums are half cover songs, because the Beatles hadn’t found their voices yet.

I have no songwriting experience; I’m about as musical as a steel anchor. Yet in college creative writing and playwriting classes, my textbooks espoused a similar ethos of complete originality. I remember one textbook pooh-poohing genre fiction as a “guided tour” of existing repurposed themes, while “literary” fiction always strives to be completely original. Don’t be like those popular paperback writers, the textbook urged; always create something new.

Our professor smiled ruefully and reminded us that textbook authors have their blind spots, too.

The Beatles, photographed at the peak of their star power

Literary authors and playwrights mimic one another relentlessly, and their genres are intensely fad-driven. As a playwright, it took me years to shed David Mamet’s influence, like songwriters struggle to differentiate themselves from Dylan. Most college-educated American writers pass through their John Steinbeck, Elmore Leonard, and Toni Morrison phases before achieving distinct voices. The lucky few see those exercises published.

Originality emerges in art, where it does, only gradually. Both Salvador Dalí and Pablo Picasso, famous for nonconforming paintings, began their careers with Renaissance-style portraits and church scenes. Jackson Pollock tried several techniques before uncovering his dribbling, wholly non-objective Abstract Expressionist style. Importantly, all these artists were disparaged when their approaches first appeared; they achieved acclaim only later, sometimes posthumously.

Yet even incidental mimicry draws ire. Returning to music: former Beatle George Harrison’s signature hit, “My Sweet Lord,” made his solo career. Yet within months of release, lawyers fired off a lawsuit because it resembled the Chiffons’ forgettable 1963 hit “He’s So Fine.” That lawsuit commenced in 1971, and wasn’t wholly resolved until 1998, dominating his solo career, and rendering him timid as a songwriter forever after.

This trend achieved its culmination with the “Blurred Lines” lawsuit. Heirs of Marvin Gaye claimed the songwriters behind Robin Thicke’s icky 2013 hit stole Gaye’s “groove.” That is, they claimed the song resembled, not something Marvin Gaye wrote, but something Marvin Gaye could have written, and therefore was plagiarized. And they won. This sets a courtroom precedent that simply imitating venerable artists, even while creating wholly new art, is plagiarism.

No wonder Sarah’s college composition professor (now retired) favored originality over tonality. Instructors, textbook authors, and now courts demand that artists constantly reinvent the wheel. Blues icons jamming in some underlit cellar are now plagiarists, not artists. Don’t build your next track around a riff from “Crossroads” or “John the Revelator,” boys, the boundaries of ownership are set!

Artists aren’t unique individuals; they’re a community of give-and-take, constantly improving one another’s raw material. Yet the ownership ethos demands nobody pinch from anybody, even incidentally. The mere fact that working artists have never obeyed that demand doesn’t change the story.

Friday, January 26, 2024

How To Write in the Middle of the Road

As the website formerly known as Twitter continues its slide into irrelevance, posts like this one have become more common. Don’t, says the would-be teacher, whose name was removed from their post before I encountered it, ever begin a sentence with the word “so.” Similar posts rehash Stephen King’s notorious demand to excise every possible adjective or adverb. Others include some variation on Quiller-Couch’s orphan injunction: “Kill your darlings.”

I acquired my distrust of gnomic advice early. In grade-school creative writing, teachers warned students to avoid the word “said” in dialog tags, lest our writing become monotonous through repetition. Somewhere between there and college, the zeitgeist shifted. Now writing teachers admonish us to avoid any word other than “said” in dialog tags, lest we become overblown, adjectival, and show-offy. Thus I realized that writing advice is faddish and insubstantial.

Stephen King warns writers to avoid adjectives and adverbs, and to remove all unnecessary content, advice he notably doesn’t follow. His books are often so overwritten, they’re physically difficult to hold. Matching advice to trim supporting characters, side plots, and world-building digressions aims to reproduce the terse, telegraphic prose made famous by Ernest Hemingway or Elmore Leonard. In other words, it’s about following the crowd to bestseller status.

This week, as I contemplated how to rebut these specious proverbs, seemingly designed to produce work that authors and audiences inevitably hate, news came down the pike. According to The Chronicle of Higher Education, Arizona State University has launched a pilot program with OpenAI, owner of ChatGPT, to tutor freshman composition students. This becomes another opportunity to trim boring old teachers, with their salaries and demands, from the education process.

Despite the popular rhetoric, ChatGPT isn’t artificial intelligence; it’s a computer learning heuristic, a dynamic program designed to let computers learn from existing material. In particular, ChatGPT browses existing prose content, and teaches itself how to construct content passably similar. Technology philosophers argue whether it has any capacity to understand the content it reads or creates, but it doesn’t matter. ChatGPT is essentially a high-tech mynah bird.
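
For readers who want to see that statistical mimicry in miniature, here's a toy sketch in Python. It is emphatically not ChatGPT's actual architecture (real large language models are neural networks trained on vast corpora); it's just a word-level Markov chain, fed an invented sample sentence, showing what it means for a program to learn from existing prose and emit something that passably resembles it.

import random
from collections import defaultdict

def train(text, order=2):
    # Map each run of `order` consecutive words to the words seen to follow it.
    words = text.split()
    model = defaultdict(list)
    for i in range(len(words) - order):
        model[tuple(words[i:i + order])].append(words[i + order])
    return model

def generate(model, length=30):
    # Start from a random learned state, then follow observed transitions.
    state = random.choice(list(model.keys()))
    output = list(state)
    for _ in range(length):
        successors = model.get(state)
        if not successors:  # dead end: this state never continued in training
            break
        output.append(random.choice(successors))
        state = tuple(output[-len(state):])
    return " ".join(output)

# Hypothetical training text; any pile of existing prose would do.
sample = "the quick brown fox jumps over the lazy dog and the quick red fox naps"
print(generate(train(sample)))

The output is plausible-sounding nonsense stitched entirely from its source. Scale the same basic idea up by many orders of magnitude, and you have the mynah bird described above.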

Stephen King

Therefore, if we expect ChatGPT to teach college-level writing, we can at best anticipate that it will encourage students to write sentences and paragraphs which fit the program’s heuristic. It will regard individual voice, unique narrative, or personal interest as distractions. Just as ChatGPT itself only produces prose that satisfactorily resembles existing prose, it’ll demand likewise from students. This will produce bland, unoriginal writing that its own writers hate.

Much like the rules-based writing taught on Xitter.

Learning heuristic writing defies the purpose of college writing. It presents prose not as an attempt to explore the human condition, convey valuable information, or spin a constructive line of bullshit, but as a product to extract, like coal from a mine. If extracting text from human writers proves too costly, time-consuming, and laborious, fire them; outsource it to machines. Human writers are dangerous anyway, and use excessively big words.

Elaborate, oppressive writing rules share the same message. Excising adjectives and adverbs, the words that give nouns and verbs their flavor, or trimming side quests to create a sparse narrative that translates to film, all declare authors the enemies of prose. Anything that shows individual personality or a spark of character impedes slick commercial prose, which should roll out like cars from a Detroit assembly line.

Please understand, I’m not exaggerating. I’ve used this example repeatedly, but it remains relevant: according to Charles Duhigg, record label executives expected Outkast’s single “Hey Ya” to become a runaway hit. Not because they particularly liked it, but because in-house software declared it sufficiently like previous hits that passive listeners would devour it. When it initially struggled, industry insiders gamed the market to achieve the software’s projected outcomes.

Now we’re applying similar principles to writing. By making new prose sufficiently similar to existing prose, and excising any spark of character or enjoyment, we ensure readers can consume writing passively, like they consume petroleum. It matters not one whit whether writers craft novels, scripts, business reports, or journalism. We expect everyone to read as submissively as if they’re doomscrolling Xitter.

As a teacher, I cajoled students to use their own voices; several of them succeeded. Youths who grew up resenting reading and writing found an unanticipated joy when they discovered writing wasn’t just crinkum-crankum grammar exercises. But now some want to walk this progress back. Whether it’s human-made rules or computer learning heuristics, these forces would make writing as bland and joyless as possible.

Friday, October 13, 2023

The Simple Joy of Being Wrong

Tom Baker as the Fourth Doctor

In elementary school, when people asked me what I wanted to be when I grew up—that wheezy childhood standard—I consistently answered: “A scientist.” I didn’t know what that involved, but it definitely looked cool in classic Doctor Who episodes. The Doctor collected evidence, tested hypotheses, and by Act III, he inevitably found a solution that saved humanity from itself. What could be cooler than that?

By middle school, I discovered that my giddy childhood enthusiasm didn’t match the process. Science class consisted significantly of memorizing lists, performing “skillz drillz” exercises, and satisfying state-mandated competency checklists. My brief dive into seventh grade Science Club also showed me one of science’s less-appealing aspects: fundraising. We spent most of the academic year trying to pay down debts the club ran up the previous year.

This left me profoundly discouraged. There was no messianic world-saving going on! We didn’t even stick with any program long enough to understand it. One week, we’d demonstrate the states of matter by applying heat to an ice cube until it melted, then evaporated; the next, we’d dissect a pickled frog. Our teacher, with deadlines imposed by the state Department of Education, couldn’t linger on anything enough to spark understanding.

Because of this, I lost interest in “science.” I understand, as an adult, why teachers needed to imbue students with a satisfactory corpus of knowledge, because to operate common technology and participate in modern society, I had to have a basic understanding of thermodynamics, biology, and meteorology. But I never understood any subject better than necessary to parrot answers back on the test, and I promptly forgot everything afterward.

Science fiction usually depicts rococo science. Star Trek often implied that Spock and McCoy could pull an all-nighter to invent a vaccine and instantly stop a pandemic. Nevertheless, it conveyed that science wasn’t memorized lists and data tables, it was a systematized version of “let’s try something reckless.” But the “science” I learned in school had no reckless experimentation. Every “experiment” had a pre-ordained conclusion, and a scripted take-home lesson.

Instead, I found my long-sought experimentation and recklessness in writing and literature. Sure, every English class expected me to savvy part of the literary canon, so some prescriptive learning still happened. But in writing particularly, I could try something new, and succeed or fail on my own terms. This adolescent Shakespearean sonnet clunks badly? Heigh-ho, into the bin, and I’m already trying the next fracas!

Richard Feynman

Paul Lockhart complains that students studying math in public (state) schools never have an opportunity to be truly wrong. They never have an opportunity to face a problem, self-indulgently play with potential solutions, and ultimately find the answer themselves. Schoolbook math, in Lockhart’s view, has become desiccated and lifeless, a mere husk. “Math is not about following directions,” he writes, “it’s about making new directions.”

I often wonder how my life would’ve differed, had I discovered the unsolved, and possibly unsolvable, problems underlying scientific thought. I discovered physics at age twenty-five, in the person of Nobel Prize-winning physicist Richard Feynman. His writings, many of them surprisingly comprehensible to outsiders, emphasize how much physics relies on metaphor, analogy, and imprecision—the fun, dangerous qualities I found in poetry.

Parenthetically, I realize that Feynman, personally, was more fraught than his mythology implied. That’s a topic for another time.

Feynman’s approach to physics was characterized by irresolution, play, and risk. Sometimes literal risk: he tested his hypothesis that a vehicle’s windscreen was sufficient to block the harmful ultraviolet light of a nuclear explosion by watching the Trinity test from a pickup truck’s cab. Feynman exemplified the sensory immersion of just trying something that I wanted from science, but found in literature. What if I’d discovered physics sooner?

My science teachers, dedicated educators all, nevertheless taught me that science is known and precise, and nothing is worse in classroom science than being wrong. But being wrong—stepping beyond the limits of knowledge which state contractors can write into Scantron tests—is the heart of science. And that’s what school denied me: the opportunity to experience the unmitigated joy of writing my own hypothesis, testing it, and maybe being wrong.

How many others like me exist? How many would-be historians got discouraged by pop quizzes laden with names and dates, when they’d rather discover the contingencies that made history happen? How many poets never found their voices because somebody compared them unfairly to Robert Frost? How many people never got to just try something, and maybe be wrong?

Thursday, October 5, 2023

What Kind of University do Americans Want?

My alma mater, the University of Nebraska at Kearney, has placed several academic programs on the chopping block. Cuts to the fine arts have attracted the most attention: university administration has proposed axing the entire theatre program, and one-third of the music department. But while these cuts attract the most attention (and the most students inclined to protest), less sexy cuts include the entire Geography, Philosophy, and Journalism departments.

Other departments would survive, but in truncated forms. Modern Languages would lose its French and German majors, functionally turning the entire program into a Department of Spanish. The English Department would allow vacant positions to remain unfilled indefinitely, furthering the program’s decline into a Department of Freshman Composition. Although the administration has proposed some Education and Business cuts, two-thirds of proposed cuts come from Arts and Humanities.

Smarter commentators than me have addressed the costs which the campus and community will suffer. UNK, once a top-tier regional university, has slid in rankings since I left, which probably isn’t coincidental with the steep cuts previously imposed on academics, and the numerous tenure-track seats going unfilled. I’d rather focus on another question raised by these draconian cuts: what role will universities, and education generally, serve in coming years?

The university is slashing theatre, an art wherein people attempt to genuinely, realistically depict people dissimilar from themselves. Actors, and the myriad technicians supporting them, try to accurately channel people from other times, nations, or backgrounds, and tell their story respectfully. In other words, theatre cultivates empathy—a trait shared by, say, learning to speak and read French or German. Future students will have fewer opportunities to learn empathy.

Likewise, the university is cutting journalism at a time when Americans are historically ill-informed about world events, and lack of media savvy has produced painful consequences. The geography program is in jeopardy, as our earth faces climate shifts which have the potential to wipe out human civilization. These cuts reflect value judgments from state officials who want education to produce desired outcomes—which, apparently, had better not threaten state values.

A certain subset of American power has always wanted to limit state universities to teaching job skills. That subset, usually but not exclusively conservative, sees “liberal” arts, those knowledge domains which create free thinkers and liberated minds, as luxuries for the minority of students whose families can afford private universities. These self-appointed education pundits don’t want students asking difficult questions, they only want them learning marketable job skills.

We’ve witnessed this, in more dramatic terms, in Florida and Texas. Ron DeSantis’s Florida has banned entire fields of study, while Florida, Texas, and Oklahoma have allowed PragerU Kids, an edutainment company founded by an AM radio host and funded by fracking billionaires, to displace teachers in schools. The removal of entire academic disciplines from UNK, a school which primarily attracts regional students from poor backgrounds, is no less consequential.

Plato and Aristotle, depicted by Raphael

Throughout history, self-appointed polymaths have debated the purpose of education. Plato thought education fitted scholar-kings to rule the benighted masses, while Aristotle believed it made citizens into good people. Thomas Aquinas held that education brought people closer to God, though later scholars credited it with breaking the yoke of religious delusion. I suggest there’s no pat resolution to these differences, but education prepares wise people to differ more constructively.

“Liberal” arts, the arts which liberate people—disciplines like literature, history, math, and science—allow students to know themselves. But equally, they allow students to know themselves in context, in their society and economy and culture. Educated citizens have tools necessary to evaluate fair use of power, just distribution of resources, or the difference between truth and lies. It isn’t coincidental that American slaveholders didn’t want their chattel to read.

Higher education shouldn’t be merely cost-efficient. Indeed, for many students, post-secondary education will be their last opportunity in this lifetime to pursue truth, beauty, science, and knowledge for their own sake. This, of course, offends those who believe every item, thought, and hour should have an owner. Students able to ask penetrating questions will inevitably ask questions that powerful people don’t want answered.

I acknowledge limits exist. Three-fifths of UNK’s budget comes from taxes and endowments, which deserve accountability. While American generational cohorts continue getting smaller, tenured professors remain in harness for decades, narrowing the academic pipeline. These concerns aren’t hay. But the proposed cuts clearly aren’t value-neutral, and serve to limit the kinds of questions students can ask. University administrators should prepare themselves when inquisitive students stop showing up.

Tuesday, July 18, 2023

An English Curriculum that Freshmen Might Read

First edition jacket art

A fellow worker pointed at my t-shirt and smiled. “The Great Gatsby! That’s the only book I actually finished reading in high school English.” We were working the assembly line, and I’d shown up wearing a t-shirt featuring Francis Cugat’s iconic dust-jacket painting for The Great Gatsby. Our assembly line team had a whole range of education levels, from high-school dropouts to postgraduates who’d never found a job.

I’d heard people admit they didn’t read before. As a former college adjunct, I heard a panoply of excuses for long-term aversion to reading, which mostly boiled down to: I never learned to appreciate it as a child, and now that I'm grown, it’s too difficult to develop the habit. But this broke the pattern, because the person didn’t highlight his non-reading, he spotlighted the one book which penetrated his armor.

Fitzgerald’s The Great Gatsby has become one of those books, like Harper Lee’s To Kill a Mockingbird and Arthur Miller’s The Crucible, which we simply expect high schoolers to read. Someone possessing an American diploma should understand allusions to this handful of celebrity books. Yet as my co-worker pointed out, not everyone reads every “great” book. Whether from overwork, or unfamiliarity with dated language, or just plain disinterest, many students skim or skip books altogether.

My co-worker couldn’t finish most “great” books because English was his second language. Most important literature was written in language that, to his limited English, looked looping, ornate, and Yoda-like. The Great Gatsby, by contrast, was plainspoken, notwithstanding its luxurious milieu, and didn’t demand a dictionary to parse ordinary sentences. My co-worker could concisely describe his relationship with that novel, and reading in general, and eventually felt free to ask how to improve his reading going forward.

Yet he also made me reconsider how we choose our literary canon. In Freshman English, I remember being assigned Homer’s Odyssey, Shakespeare’s Julius Caesar, and Hemingway’s The Old Man and the Sea. Important works, definitely, but not works which most freshmen are prepared to savvy, not even highly literate ones who already enjoy reading. These works left students climbing the walls, desperate for validation that we weren’t stupid for failing to understand.

First edition jacket art

F. Scott Fitzgerald stands in an unusual position within the literary canon. Though famous now for his novels, he made his early living publishing for glossy magazines like The Atlantic and The New Yorker, which were read then by mass audiences. After the 1929 stock market crash made Fitzgerald’s hymns to nouveau riche excess seem tasteless, he relocated to Hollywood and became a script doctor. He wrote, that is, for mass audiences, in vernacular English, with an eye toward images.

You know who else wrote for mass audiences with image-friendly prose? Dashiell Hammett. His classic The Maltese Falcon, arguably his career peak, is in many ways the anti-Gatsby. Jay Gatsby is chummy with New York’s fiercest gangsters; Sam Spade has an adversarial relationship with the police. Gatsby romanticizes women, especially Daisy Buchanan, without really knowing them; Spade enjoys women, but doesn’t revere them, and surrenders his latest lover to the gumshoes.

Perhaps most importantly, Jay Gatsby has no moral code, except perhaps whatever makes him rich enough to court Daisy Buchanan. Spade, by contrast, is so hog-tied by his own unique, self-written moral code that it costs him lucrative paydays. He’s forced to live in squalor, sleep on a Murphy bed, and eat his beans from the can. He’s almost the diametrical opposite of Gatsby—while still being written in simple, imagistic language that high schoolers can understand.

Don’t misunderstand me. I’ve written before about the importance of students reading books beyond their immediate comprehension, and how that changes their brain circuitry for the better. But too many teachers—underfunded, short-staffed, and hurt for time—lack the resources necessary to guide students to higher comprehension. I remember my Ninth Grade English teacher telling the class explicitly that we could tell Ernest Hemingway was deep because we couldn’t understand him.

Pairing The Great Gatsby and The Maltese Falcon would provide Freshman-level English teachers the opportunity to discuss important themes in American literature, while speaking an English that most students understand. Other “literary” writers tend to be hermetic, like Hemingway; abstruse, like Eliot; or simply outdated, like Mark Twain. Yes, Hammett writes about unseemly themes, like infidelity, racism, and violence, but so does Faulkner. Students have seen worse on TV.

And if it means more working-class students glowing up for their favorite book, well, that’s a win for everybody.

Monday, March 20, 2023

The Poor Intellectual in America’s Bread Basket

Sarah Smarsh

Late in journalist Sarah Smarsh’s autobiography, Heartland, she undertakes that most time-honored of adulthood rites: leaving for college. In Smarsh’s case, this passage carries special significance. As the first member of her family to attend college, she arrives without the prior knowledge of how to “do school” that many of her peers already possess. And coming from dirt poverty, she carries the necessity to survive that most college students lack.

Smarsh doesn’t dwell on this; it’s only one longish scene in her final chapter. She received a full-ride scholarship to cover her tuition, but needed three jobs to afford room and board. The system we have, Smarsh writes, tacitly assumes students’ parents will shoulder the financial burden of education, because America’s higher education system bears the toolmarks of its makers, who were themselves well-off and expected the same of their students.

I remember, during my brief academic career, repeating a time-honored bromide to my students: “Nobody’s ever too tired to read.” I’d heard that from my teachers in public school, and internalized it, despite not having gone straight from high school to college. An inveterate reader from my childhood, I saw reading as innately pleasurable, a source of energy rather than a consumer of it. And I couldn’t comprehend why anyone, like my students, saw it otherwise.

Not until my adjunct position ended without fanfare did I realize how false that claim was. Turned loose onto the demands of a regional economy that had little need for my skills, and desperately in need of grocery money, I accepted a job beneath my capabilities, simply because it was there. (And simply because I loathe the job-hunting process.) And within a matter of weeks, I discovered, for the first time, that it was possible to be too tired to read.

Sometime later, I would learn the mechanics behind this. Though the brain remains deeply enigmatic to scientists, researchers have definitively established some facts. One is that the brain draws energy from the same well as the body. What’s more, it draws energy completely out of proportion to its mass: your brain is less than two percent of your body’s mass, yet consumes more than twenty percent of your body’s energy.
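As a back-of-envelope check on how lopsided that is (a quick calculation from the round figures above, nothing more):

$$ \frac{\text{energy share}}{\text{mass share}} \approx \frac{0.20}{0.02} = 10 $$

Kilogram for kilogram, the brain burns roughly ten times the energy of the body’s average tissue.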

And when the well is dry, the well is dry.

Not until leaving academia and entering the factory (and later construction) did I discover what weariness meant. Sure, I’d been tired in high school, as many people are, but not with the soul-sucking weariness of pulling an eight-hour shift, then coming home to housekeeping, cooking, and generally taking care of myself. Left with the same two or three free hours everyone else has, I found myself, for the first time, too tired to read.

Reading Smarsh’s description of working three jobs to subsidize taking classes, I felt that weariness again. It’s taken me ten years to regain sufficient energy to read after work, and even that is inconsistent; most days I can read some, but some days, I’m fortunate if I can stare mindlessly at my phone for a few hours. Some days, I’m lucky to wolf microwave food before lapsing into coma-like sleep.

Yet despite that, Smarsh not only had wherewithal enough to complete her degree, she had enough to complete graduate school and move on to a career. Reading her story, it’s easy to understand why: she had a personal vision, one she wanted to pursue without regard for economic limitations. She was fortunate to have that. Too many of my students from poor backgrounds had few aspirations beyond a vague desire for middle-class comfort.

Many of my students, who heard me state that bullshit about “nobody’s too tired to read,” had outside jobs. At least two told me they were working nearly full-time during the week, then driving back to their hometowns to pull shifts at their parents’ farms or machine shops. Conventional academic theorists would say these students were cruising for failure, that working so many hours outside class guaranteed defeat. Work or school: pick one, you can’t do both.

I suspect these students would reply: “Rent is due.”

Education remains, at least nominally, America’s guarantee for a middle-class lifestyle. My poor students chased a degree, not to improve themselves, but to improve their economic prospects. Couple that with crushing student debt, and a job market that doesn’t offer self-sustaining jobs anymore, and school can be as much a recipe for failure as success. I can only imagine how insulting it was to hear me say “nobody’s too tired to read.”

Friday, February 10, 2023

The Quest for the Elusive “Fact”

A bill floated in the Montana state legislature this week would forbid science teachers in public schools from teaching anything other than “facts.” The bill is doomed to fail for multiple reasons: like North Dakota’s recent attempt to define “gender,” it is unenforceable and has no visible support beyond its sponsor. Further, the bill’s language apparently provides no meaningful definition of “facts,” something philosophers of science have debated for generations.

The bill itself is beneath contempt and wouldn’t deserve commentary on its own. But situated in context, alongside the North Dakota gender bill and Florida’s ongoing efforts to purge entire disciplines of “ambiguity,” a pattern becomes visible. One political party wants to engineer an educational system completely devoid of debate, complexity, or doubt. All subjects become lists of facts, answers are either right or wrong, and entire disciplines become foregone conclusions.

Smarter commentators have spilled copious ink over the current right-wing push to ban books and censor classroom discussions. Yet as new fronts in the battle open, these attacks aren’t aimed at any individual fact or discipline, despite the attention paid to science or African American studies. Rather, taken together, they show a desire to produce a generation too uninformed to formulate questions or muster rudimentary curiosity.

This desire makes itself most visible in topics where clearly right answers exist. The mass removal of books from Florida classrooms has reportedly swept up anything dealing, even tangentially, with sexuality, religion, and race; baseball icons Roberto Clemente and Jackie Robinson are too controversial for Florida today. But as I’ve noted elsewhere, absolute facts are seldom interesting. Only the living dynamics of controversy make facts worthwhile.

Science, including both the physical and social sciences, reveals the importance of controversy. If I drop a baseball, it will fall. This claim isn’t controversial. But why does it fall? Not until Isaac Newton did anybody formulate an answer that withstood scrutiny without relying on God or a circular claim of “Because it does, duh.” Newton’s theory of gravitation created models which other scientists could replicate, and therefore could verify or disprove. That’s what a theory is.
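For the curious, the model itself fits on one line; this is the standard textbook statement of Newton’s law of universal gravitation:

$$ F = G\,\frac{m_1 m_2}{r^2} $$

Here $F$ is the attractive force between masses $m_1$ and $m_2$ separated by distance $r$, and $G$ is a measured constant. Anybody can plug in masses and distances, compute a predicted force, and check it against observation, which is exactly what makes the model replicable.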

When legislators reduce science to memorizable facts, they strip science of meaning. We don’t care about science as a description of obviously true events; we care about science because it permits informed and testable predictions. When astronomers used Newton’s theory to forecast the location of the then-undiscovered planet Neptune, the theory gained credibility. When it failed to describe the orbit of Mercury, Albert Einstein formulated a new theory.

Anybody who remembers the tedium of grade-school arithmetic “skillz drillz” can easily imagine how stultifying an education based around only facts would be. Especially when those facts come pre-filtered to protect White parents’ tender sensibilities. Without enough knowledge of controversy to test opposing sides, and without enough awareness at times to even realize a controversy exists, students are reduced to memorizing lists, a task kids famously hate.

In other words, an entirely fact-based education seems designed to make kids mentally check out early. Without questions to solve, without patterns to identify, without friction to resolve, students predictably become bored and stop caring. Anything that students can correctly identify on a Scantron sheet, still the hallmark of standardized tests, is probably also boring. As this pervasive boredom slowly overtakes academia, one wonders if this isn’t deliberate.

Although the Montana bill makes science central here, the same principles apply to all topics. Paul Lockhart writes that many freshmen enter college thinking they’re good at mathematics, when they’re actually good at following directions. (Replace “good at” with “bad at” where appropriate.) Ron DeSantis’ attempts to whitewash history only make explicit what James Loewen claims has always been implicit in schoolbook history.

Taken together, the pattern emerges: students don’t learn to ask questions. They learn topics only shallowly, reduce all subjects to memorized lists, and mostly forget everything after the standardized test. The system creates deeply incurious graduates who comply with authority simply to make the moment go away. Students emerge perfectly suited for essentially robotic industrial jobs that, ironically, don’t much exist in America anymore.

As a sometime teacher myself, I don’t like these conclusions, but I can’t avoid them either. I remember spending months trying to reignite students’ innate childhood curiosity, but by the time they reached me, the system had already squelched it. Bills and laws like these serve neither teachers nor students. But they do serve to create a permanent underclass of supposedly schooled graduates forever dependent on arbitrary authority.

Thursday, January 26, 2023

Advanced Placement and Unadvanced Schools

Florida Governor Ron DeSantis in his
favorite pose: angrily lecturing the crowd

The College Board’s Advanced Placement program has existed since 1952, and in that time, no state has refused to certify an AP course. Until last week. In the latest salvo in his one-sided war against “wokism” and “Critical Race Theory,” Florida Governor Ron DeSantis issued a blanket refusal to participate in a pilot program for African American studies. This despite the course not even being offered in Florida yet.

I never participated in Advanced Placement in high school. I was slated to take AP American History and AmLit in 11th grade, the first time that school offered any AP courses at that level. But circumstances changed and my family relocated that summer, to a school where AP only existed in 12th grade. I handled the adjustment with resentment, and mentally checked out of school, nearly flunking Senior Year.

Therefore I’ve watched subsequent AP developments through a lens of lingering resentment. But I’ve also watched through the recollection of how I reversed my academic skid; after a few years of desultory employment, I returned to school, graduated college with a double major, and earned a Master’s Degree. I taught Freshman Comp for four years, and had students report that my course was their favorite in four years of college.

I say all this so you’ll know who’s speaking when I say: American high schools aren’t equipped to teach Advanced Placement courses. AP began with noble intentions of shepherding advanced youth through college-level general studies courses without wading through tedious prerequisites or paying college-level tuition. But we have abundant evidence that American public schools are beholden to pressures that make AP teaching impractical, if not impossible.

Even assuming younger teens’ brains are sufficiently developed for deep dives into collegiate liberal studies—a premise I doubt—their schools aren’t equipped for such education. Teachers’ credentials are regulated at the state level, creating uneven standards across jurisdictions. Some states may permit intensive study of academic subjects, but my state requires more courses in classroom management than in any academic subject to receive a teaching degree.

This isn’t a knock against teachers individually. Rather, like police or landlords, good individuals often come a-cropper against institutions designed to preserve the status quo. As Dana Goldstein writes, American public (state) schools are organized to maximize cost efficiency, not pedagogical efficiency. Even the most dedicated teachers can’t provide intensive education when working short-staffed, underfunded, and with years-old textbooks.

Charlie Kirk

Worse, the DeSantis Administration’s attempt to kneecap African American Studies isn’t the first time states have undermined AP. In 2014, Colorado students staged a mass walkout when school boards tried to rewrite advanced history courses. Rather than teaching history, with all its messy contradictions, authorities wanted to teach patriotism and libertarian economics. The language presaged current anti-CRT rhetoric, which fears that kids might not unquestioningly love America anymore.

College-level education requires academic independence, something the America First crowd abhors. Charlie Kirk, a prominent young nationalist, kick-started his pundit career by complaining about supposed anti-American sentiment in higher education, despite having dropped out of online Bible college. Christopher Rufo almost single-handedly engineered conservative America’s pants-wetting paranoia over Critical Race Theory. They and others propose tighter state controls on education as the solution.

The DeSantis Administration’s rejection of AP African American studies explicitly cites fears about the subject’s “ambiguity.” High school teachers have long struggled with institutional fears of ambiguity. That’s why social studies teachers inevitably reduce history to lists of dates and vocabulary words, literature teachers make students memorize plot points, and biology teachers often have to include disclaimers leaving room for six-day creationism. State schools institutionally refuse to let ambiguity creep in.

But any serious academic knows that real scholarship happens in ambiguous spaces. History isn’t just what happened; it’s the great debates about why it happened, and how events continue to influence us today. Great literature always emerges from, but also reacts against, its social environment, and therefore exists in dangerous tension. As scholars like James Loewen and Gerald Graff write, only where ambiguity exists can real learning ever happen.

The College Board genuinely wants to help advanced students clear academic hurdles in high school. But it ultimately can’t, because high schools face administrative burdens that prevent college-level learning. Again, this isn’t a knock against teachers, who generally enter the field for love of students. But public schools are driven by political goals, not generosity. Schools simply aren’t free.

The solution is to get students out of high school faster, not move college into politically hampered high schools.

Wednesday, January 18, 2023

Some Pessimistic Thoughts On “Plagiarism”

Last summer, when historian Kevin M. Kruse was accused of plagiarism—a serious accusation in academia, and one that submarines careers—I wrote a lengthy examination of the concept. My views on plagiarism have shifted substantially since my “zero tolerance” teaching days, when we were institutionally encouraged to distrust any writing that was too well-written. I’ve come to accept that the workload schools impose on students is simply too exorbitant to permit serious original writing.

So I congratulated myself for my Gandhi-like enlightenment and walked away from the discussion. Dr. Kruse was eventually cleared by a council of like-minded nabobs, and the issue retreated from public consciousness. Then yesterday, I received a Facebook message from a stranger identified as Emmett Cullinan, admitting he’d repurposed one of my reviews for a classroom assignment. He wasn’t even circumspect about it: he acknowledged he hadn’t read the book, just repurposed my written review.

Suddenly I found myself back in “teacher” headspace, cueing up platitudes about the evils of slapping your name on somebody else’s words. Though I doubt an online scolding will discourage any undergraduate brazen enough to lift a writer’s words and then tell him about it, I at least went to Cullinan’s Facebook page to find his school. Turns out, Cullinan has set his profile to private, making him inaccessible to anybody he doesn’t already know. Crafty.

Cullinan’s message initially looked smug, boasting of an act of intellectual dishonesty. But the longer I live with it, the more I suspect something more is going on. Yes, this kid (I’m assuming youth and inexperience drive this cocky impudence) acknowledges slapping his name on my words. Considered in a vacuum, these words imply a galling level of self-importance, since he not only performed the plagiarism, but wanted me to know he’d performed it.

But he says this pilfering “really saved my life” and “legitimately saved my grade.” These aren’t the words of somebody rubbing my nose in his dishonesty; they’re the words of somebody scared of falling behind and losing out. Indeed, his use of the word “legitimately” suggests that although he knows his actions are illegitimate, he feels his necessity is more than legitimate, or at least legitimate enough to justify his actions, taken out of desperation.

In other words, the longer I live with Cullinan’s message, the more I realize it was written from a place of inner fear. He writes with an exterior mask of audacity and entitlement, perhaps because he believes his position as a student is precarious enough that he can’t admit terror. Students learn early that professors, whose own positions are often shaky, can smell fear. Therefore, with professors and with me, Cullinan masks fear behind arrogance.

College degrees, once a signifier of aristocratic erudition, have become job credentials for most Americans. The time spent getting a degree has become a mandatory buy-in for a middle-class life. Many jobs that once required only an apprenticeship or on-the-job training, like office manager, industrial technician, and law enforcement, now demand higher education. Sometimes the requirement is unofficial and merely customary; sometimes employers or jurisdictions have made it official.

This means that an upwardly mobile life increasingly requires not only college, but frequently a graduate degree, to stand out and move ahead: a commitment of years, effort, and debt. Undergraduates and grad students are nominally adults, but see the period of juvenile dependency dragged out for years, sometimes until they’re pushing thirty. It’s even worse for anybody hoping to change careers in adulthood, which can mean more years studying for more credentials.

Emmett Cullinan’s message sounds ballsy, at first. But it reflects childhood fears dragged into adulthood: he’s a grown-ass man who is considered culpable if he mishandles a car or a beer, but who isn’t allowed to take responsibility for his working life yet. Like kids throughout history, he acts brazen because, perhaps, it’s the only power he has. Far from the unashamed plagiarist I first assumed, I think he’s just scared.

Of course that doesn’t stop me from using his real name, or screenshotting his message. I’m no longer angry enough to pursue consequences, but I’m passive-aggressive enough that, if consequences find him, I won’t cry. Because that, too, is part of higher education. Students learn to have opinions about books they haven’t read, or experiences they haven’t experienced, but they also learn to get called out when they fib. Learn to live with it.

Monday, November 7, 2022

On Losing and Regaining My Love For Science

Tom Baker as the Fourth Doctor

I first wanted to become a “scientist” in second grade, not long after discovering Doctor Who reruns on PBS. I’m sure it wasn’t a coincidence. The Doctor, then played by Tom Baker, presented himself as a scientist, and frequently expounded on difficult scientific topics in layman’s language to advance the story. But for him, science was a journey, an opportunity to meet new people and have new experiences and, frequently, confront injustice at the root.

Whenever anybody asked grade-school Kevin what he wanted to be when he “grew up,” the answer, for years, was “scientist.” I read books on science history for kids, which often presented science in metaphor: Louis Pasteur’s early vaccination experiments, for instance, appeared as armed soldiers posting pickets around a weakened body and defending it against an invading army. Science became a source of adventure.

Not until middle grades did I actually study science as a distinct discipline. Then, we began performing “experiments” demonstrating important concepts like, say, the states of matter, the function of liquid capillarity, or the complexity of vertebrate vascular systems. Fun stuff, in isolation. Except we performed each “experiment” one time, and if we didn’t achieve the preordained outcome, we flunked. This “science” was remarkably rote and cheerless.

Where, I wondered, was the adventure which The Doctor encountered, and equally importantly, the moral purpose? We weren’t venturing into unknown countries to gather new evidence and fight the scourge of ignorance that kept entire populations enslaved. We were repeating experiments so thoroughly rehearsed that the results were foreordained. We certainly learned new facts, but the facts we learned were vetted and ratified in advance by authority figures.

Before going further, let me emphasize: I don’t blame individual teachers for this. Teachers must face bureaucratic intransigence, work with textbooks pre-approved by those same authority figures, and teach to the test. As Dana Goldstein writes, America’s school systems are organized around cost efficiency, not learning outcomes. Many top-tier teachers resist monolithic book learning, but can only accomplish so much when fighting the system.

Louis Pasteur, discoverer of multiple
medical procedures

But the effect was the same: the sense of moral adventure which Doctor Who promised ran up against an educational system that only permitted experimental results which were absolutely true. There was no venturing off the map in school science. I now know, as I couldn’t have known in middle school, that this wasn’t accidental. Powerful people, and the legislators they purchase, want all “learning” to produce predictable outcomes that discourage questions.

In my childhood, science was the battlefield on which the fight to control public discussion played out. Prominent religious leaders actively torpedoed any inquiry that would verify the theory of evolution (and, in some places, still do). Today, that battle has shifted to history, where teachers are required to teach bland myths and scrub the record of any ambiguity or fault. In both cases, the underlying philosophy remains unchanged: prevent questions by excluding doubt.

During college, I discovered physics, and felt jolted. Before college, my limited understanding of “science” basically bifurcated into either chemistry or biology, both of which deeply disappointed me. Physics, by contrast, held the same qualities I found in science fiction adventure stories: degrees of uncertainty, reasoning through analogy, and an element of faith. In physics, all explanations are provisional, and failure is embraced in ways high school chemistry rejects.

Had I discovered physics earlier, my life might look different today. Surely some teacher somewhere introduced the discipline, but amid the crush of mandatory points which state boards required them to hit, the information got lost. By college, I’d shifted to literature, the discipline which promised the moral purpose which “science” no longer offered. Also, without a scientific goal, my math scores had languished beyond repair.

Mathematician Paul Lockhart writes about teaching middle-school math by ripping away students’ reliance on absolutely correct answers. When uncertainty becomes common again, students reinvest themselves in the process, and fall in love with learning as an adventure. A history teacher I know does something similar, capping his course with a role-play about rebuilding civilization after an EMP. Doubt becomes central to students’ intellectual investment.

I embraced the idea of “science” in childhood because it seemed bold and adventurous. But by eighth grade, I’d abandoned that ambition because it became tedious and repetitive. Only in adulthood did I discover how that tedium was engineered by powerful people to support their own power. We citizens need to reject the narrative, in any discipline, that questions are bad. Because bad people profit from our lack of answers.