Monday, January 12, 2026

Death of an American Propaganda Point

Renee Nicole Good

The internet reacted emphatically to Minneapolis mother and poet Renee Nicole Good’s gratuitous, on-camera murder. As it should, because it exposes the moral and legal rot underlying American law enforcement during the Taco Administration. But more-informed critics than me wondered why the internet didn’t respond with equal passion to ICE’s assassination of Keith Porter in Los Angeles. Do netizens care less because Porter is male and Black?

Porter’s death was, if anything, more egregious than Good’s murder. Jonathan Ross, the ICE agent who killed Good, acted under color of authority, and therefore probably has qualified immunity. Porter’s murderer, who remains unidentified, was off-duty, and therefore has no such administrative protections. But of course, Americans aren’t as shocked and horrified by Black men’s deaths as we are by White women’s. I suggest this reveals a deeper division.

Northwestern University historian Kathleen Belew, in her 2018 book Bring the War Home, describes how Vietnam-era propaganda became instrumental to the White Power movement when defeated veterans returned to America. That war’s propaganda emphasized the importance of fending off Soviet Communism to protect, as Belew writes, “wives and daughters back home.” This language wasn’t necessarily racial, as America’s military in the 1970s was substantially integrated.

However, that doesn’t mean the racial implication didn’t exist. Yale historian Greg Grandin writes that race has often loomed large in America’s overseas engagement propaganda. America’s first overseas engagement, the Spanish-American War, involved driving the Spanish out of Cuba and the Philippines (in the 19th Century, Americans didn’t perceive Spaniards as White). This racist rhetoric persisted through generations of American overseas wars.

In short, over 130 years of American propaganda has conflated American moral virtues with White femininity. This has sometimes been depicted literally, with the goddess Lady Columbia, an Anglo-Saxon beauty in sexy body armor, sallying forth before the male army. More recently, the older, more stoic Lady Liberty has displaced Columbia, but the pattern remains the same: propagandists depict American goodness and White womanhood as synonymous.

(This pattern is common, but not universal. France has the similar goddess Marianne, but Britain has the more male, and more pugnacious, John Bull.)

Keith Porter

Nor is this unique to wartime. Legendary journalist Ida B. Wells wrote, clear back in 1892, that America justified lynchings by emphasizing the supposed threat sexually rapacious Black men posed to defenseless White women. Then as now, this was a lie, but that lie served many Americans’ pre-existing beliefs. Ibram X. Kendi describes how American domestic propaganda conflated White female weakness with virtue, and Black female strength with vice.

To the informed, the pattern becomes overwhelming: White women need protecting. This extends from early myths of lascivious enslaved Africans and rampaging Native Americans, to 19th Century fables of White Slavery and the evils of opium dens, to the fears of Communists and terrorists who will kidnap American women into sexual servitude. Today’s willingness to suspend civil liberties to defend against “human trafficking” merely continues the pattern.

Early attempts to DARVO the blame onto Renee Good have proven laughable. President Taco claimed her killer, Agent Jonathan Ross, was critically injured; he not only walked away, but never dropped his phone. Homeland Security Secretary Kristi Noem called Good a domestic terrorist; Good, a mom, had just dropped her kids at school. It all devolves to the same conclusion: Ross targeted and murdered a White woman without cause.

Therefore, Ross didn’t merely kill one woman. He violated over 130 years of American propaganda, military and civilian. Overseas wars, draconian law enforcement crackdowns, and a criminal justice system that’s never completely expunged its racist past, have all been justified by the urgent need to protect White women. Defenders of the status quo love naming White women like Laken Riley or Iryna Zarutska to justify nakedly racist crackdowns.

Indeed, the first words from Ross’s mouth after the shooting reveal the underlying thought process. “Fucking bitch,” he snarled, still on camera. Throughout American propaganda history, the machine has expected White women to shut up, do nothing in their own defense, and accept male protection, whether they want it or not. Good showed self-determination enough to drive away, which pushed her outside the bounds of propagandistic protection.

Without the rhetorical condom of American propaganda, the Minneapolis invasion stands exposed as what it is, bigoted cruelty wrapped in the flag. If White women aren’t protected, the entire operation merely exists to hurt those the Administration deems unworthy of defense. And because the operation turned against the one group most needful of protection, the implication becomes: You’re Next.

Tuesday, December 9, 2025

Samantha Fulnecky and the Assault on Learning

Samantha Fulnecky (source)

I’ve attempted to avoid OSU undergrad Samantha Fulnecky’s newfound celebrity for refusing to do her homework. Not because I don’t have an opinion, but because I thought I had nothing to offer. Despite my years of experience teaching college-level composition, I saw nothing here but a classic teachable moment distorted into notoriety by a partisan media machine. I wouldn’t have given Fulnecky’s paper a zero; I would’ve returned it ungraded and invited her to my office hours.

I would’ve offered Fulnecky two pieces of guidance. I would’ve stressed that a reaction paper needs to address the subject at hand, which hers doesn’t; her sprawling, disorganized essay addresses the standing of transgender people generally, but scarcely mentions the source. Then I would’ve stressed that religious dogma isn’t an academic argument, except in theology classes. Appeals to Christian belief will only persuade fellow Christians—and not just any Christians, but those who share her specific theological interpretation.

Then something struck me. Long before I encountered the story, more informed critics pointed out that Fulnecky doesn’t cite the Bible in her text; she refers to Christian beliefs, but without actual textual source. She doesn’t present a Christian dogmatic position, she offers Christian vibes. Although standard citations exceed the requirements of a reaction paper, if you bring in outside sources, you need to cite them, not just snuggle into them like a comfy blanket.

I’ve noticed something similar in other venues recently. Prager “University,” a YouTube account where prominent conservatives deliver lectures about hot-button issues like science, world affairs, and liberal arts, doesn’t cite sources either. This isn’t incidental. Critics who exhaustively analyzed PragerU’s content note that, when lecturers do make claims of fact, those claims are often ephemeral, difficult to source, or easy to rebut with a fifteen-second Google search. Context, nuance, and complexity go out the window.

Conservatives love the trappings of universities. Besides Prager “University,” prominent examples include the now-defunct Trump University, where The Donald licensed sock puppet instructors to teach get-rich-quick schemes, and Hustlers’ University, where Andrew Tate… um… also teaches get-rich-quick schemes. Right-wing philanthropy networks keep endowing chairs and unofficial newspapers on conventional campuses. Ted Cruz and Ben Shapiro love touting their Ivy League law degrees, right before floating proposals to demolish higher ed, and especially the Ivy League.

Dennis Prager

They’re somewhat less enthusiastic about actual higher education. In the 1990s, when I restarted my postponed post-secondary education, conservative outlets like National Review magazine were the vanguard of demands for a rich, robust liberal arts curriculum. By the 2010s, conservatives had abandoned such principles. Besides partisan opposition to higher ed, pundits like the late Charlie Kirk, who dropped out during his first semester, made a living encouraging youth to avoid schooling, often while speaking on university campuses.

Even where they don’t actively sabotage schools, they undermine education generally. In my state, the one-party government is actively torpedoing arts and humanities curricula, effectively turning state universities into trade schools. Samantha Fulnecky’s refusal to engage with her subject follows a pattern proposed by online firebrands like Kirk and Shapiro: do everything in your power to avoid opening your mind or risking a change of opinion. Conservatives enroll in college to win, not to learn.

Dennis Prager and Samantha Fulnecky share a premise that truth is something we assert morally, not something we demonstrate empirically. Therefore, changing your mind when presented with contrary evidence is a form of moral failure. That’s why they eschew source citations, because it doesn’t matter what anybody else says, not even their own historically ambiguous holy text. Fulnecky demands experienced, credentialed mentors adjust themselves to her pre-existing beliefs, the God’s Not Dead model of education.

Let me emphasize: education shouldn’t aim to change students’ minds. It would be remarkable if a thorough education with a liberal arts core didn’t make youth change their minds on some topics, but changing one’s mind isn’t a prerequisite. But addressing complex topics more deeply, with a thorough topical understanding and a familiarity with prior knowledge and existing debates, is a prerequisite. And that requires knowing and citing sources, even those that disagree with you.

Conservatives once believed in difficult, diverse education, but conservatives like William F. Buckley and George Will have become rare. These pundits were intelligent, gentlemanly, and most important, willing to face evidence. Today’s conservative leadership doesn’t want to engage debate; it wants to silence it, and dogmatic belief is one tool to achieve that goal. When your goal is not to question but to clobber, making yourself a media darling is an easy way to win.

Sunday, November 30, 2025

Internet Censors and Real Speech

The cover art from Sharon Van Etten’s
Remind Me Tomorrow

I had no idea, until this week, that Sharon Van Etten’s folk-pop electronic album Remind Me Tomorrow might be off-color. Specifically, the cover art. I’ve linked to my review of the album several times on several platforms without incident. But this week, I had a link yanked from Instagram by the parent company, Meta, on the grounds of “child nudity.”

As you can see, the cover image is a childhood snapshot of Van Etten and her brother. That’s Van Etten half-folded into a laundry basket, partially unclothed. Small children often hate clothes, and have to be conditioned to wear them in time to start school. Because of this, most people recognize a categorical difference between innocent small-kid nakedness, and smut. I suspect any impartial judge would consider this the former.

This isn’t my first collision between Meta and nudity. I’ve repeatedly had to appeal blocked links because my essays included Michelangelo’s Creation of Adam, a panel from the Sistine Chapel ceiling. It depicts Adam, not yet alive, lolling naked in Eden, genitals visible. For nearly every blog essay I’ve written that included this image, I’ve had to appeal a lewdness flag.

Any reasonable person would agree that social media needs basic standards of appropriate behavior. Without a clear, defined threshold, one or a few bad-faith actors could deluge the algorithm with garbage and destroy the common space. Consider the decline of public spaces like Times Square in the 1970s: if nobody defends common spaces, they become dumping grounds for the collective id.

But those standards are necessarily arbitrary. What constitutes offensive behavior? We get different answers if we ask people of different ages, regions, and backgrounds. My grandmother and I have different expectations; likewise, Inglewood, California, and Doddsville, Mississippi, have wildly divergent community standards. But because Facetube and InstaTwit don’t have geographic boundaries, they flatten distinctions of place, race, age, and economic standing.

TikTok perhaps embodies this best. Cutesy-poo euphemisms like “unalived,” “pew-pew,” and “grape” gained currency on TikTok, and have made it needlessly difficult to discuss tender topics. YouTube restricts and demonetizes videos for even mentioning crime, death, and the Holocaust. Words like “fascism” and “murder” are the kiss of death. In an American society filthy with violence, the requirement to speak with baby-talk circumspection means that we can’t communicate.

Michelangelo's The Creation of Adam, from the Sistine Chapel ceiling

Watching the contortions content creators have to perform whenever called upon to address the latest school shooting or overseas drone strike would be hilarious, if it weren’t heartbreaking. Americans have to contend with legislative inertia, lobbyist cash, and morally absolute thinking when these catastrophes occur. But then the media behemoths that carry the message have the ability, reminiscent of William Randolph Hearst, to kill stories by burying them.

I’m not the first to complain about this. I’ve read other critics who recommend just ignoring the restrictions, and writing forthrightly. Which sounds great, in theory. If censorious corporations punish writers for mentioning death and crime too directly, the response is to refuse to comply. Like any mass labor action, large numbers and persistence should amend the injustice.

In theory.

Practically speaking, media can throttle the message. In the heyday of labor struggles, the Ludlow Massacre and the Battle of Blair Mountain, unions could circumvent media bottlenecks by printing their own newspapers and writing their own folk songs. But most internet content creators lack the necessary skills to program their own social media platforms. Even if they could, they certainly can't afford valuable server space.

Thus, a few companies have immediate power to choke even slightly controversial messages, power that creators cannot resist. Which elicits the next question: if journalists, commentators, and bloggers cover a story, but Mark Zuckerberg and Elon Musk stifle the distribution, has the coverage actually happened? Who knows what crises currently fester unresolved because we can’t talk about them?

This isn’t a call to permit everything. Zuckerberg and Musk can’t permit smut on their platforms, or even link to it, because it coarsens and undercuts their business model. But current standards are so censoriously narrow that they kill important stories on the vine. If we can’t describe controversial issues using dictionary terms, our media renders us virtually mute.

Given how platforms screen even slightly dangerous topics and strangle stories in their beds, I’m curious whether anyone will even see this essay. I know I lack enough reach to start a movement. But if we can start speaking straightforwardly, without relying on juvenile euphemisms, that represents a step forward from where we stand right now.

Thursday, November 27, 2025

Andrew Tate, Master Poet

Back in the eldritch aeons of 1989, art photographer Andres Serrano gained notoriety for his picture “Piss Christ.” The image involved a crucifix with Jesus, shown through the glimmering distortion of an amber liquid, putatively Serrano’s own urine. The controversy came primarily through Senator Jesse Helms (R-NC), who aspired to become America’s national guilty conscience. This outrage was especially specious because Helms only noticed the photo after it had been on display for two years.

I remembered Serrano’s most infamous work this week when “masculinity influencer” Andrew Tate posted the above comment on X, the everything app. Tate is a lightning rod for controversy, and seems to revel in making critics lose their composure. Sienkiewicz and Marx would define Tate as a “troll,” a performance artist whose schtick involves provoking rational people to lose their cool and become angry. To the troll, the resulting meltdown counts as art.

Andres Serrano remains tight-lipped about his politics, and repeatedly insists that he has no manifesto. Following the “Piss Christ” controversy, he called himself a Christian, but this sounds about as plausible as Salman Rushdie calling himself Muslim after the Satanic Verses fatwa: that is, a flimsy rhetorical shield that convinces nobody and makes the artist look uncommitted. I think something else happened here, something Serrano didn’t want to explain; the image itself doesn’t matter.

Specifically, I think Serrano created a cypher of art. Unlike, say, Leonardo’s “Last Supper,” Serrano’s picture doesn’t actually say anything. Instead, it stacks our loaded assumptions of religious imagery and bodily waste, and asks us what we see here. The image itself is purely ceremonial. Serrano cares more about why seeing the Christian image through urine is worse than seeing it through more spiritually anodyne fluids, like water or wine. Our answer is the art.

Critics like Helms, or let’s say “critics,” see art in representational terms. Art, to them, depicts something in the “real world.” This might mean a literal object, such as a fruit bowl in a still life, or an event or narrative, like the gospel story in “The Last Supper.” The representational mind seeks an artwork’s external, literal reference. This makes “Piss Christ” dangerous, because dousing the sacred image in something ritually unclean is necessarily blasphemous.

Progressive critics abandon such one-to-one representations. In viewing more contemporary art, from Serrano’s photos to Jackson Pollock’s frenetic, shapeless splatters, they don't ask themselves what object they ought to see. They ask themselves how the art changes the viewer. In the Renaissance, audiences assumed that art created a durable image of the transient, inconstant world. But artists today seek to amplify and hasten change. We viewers, not the world, are the purpose of contemporary art.

Ironically, as progressive critics tolerate more receptive non-representational standards in visual art, their expectations of language have become more exacting and literal. From religion to poetry to President Taco's id-driven rambles, they take words to mean only what they mean at surface level. Every online critic who considers it their job to identify “plot holes” in Disney’s Cars, or insist the Bible is disproven because we can’t find the Tower of Babel, makes this mistake.

At the surface level, Andrew Tate’s macho posturing seems like the opposite of art. His insistence on appearing constantly strong leaves no room for contemplative ruminating over language’s beauty or nuance. He doesn’t signpost his metaphors like Emily Dickinson, so it’s easy to assume he has no metaphors. Yet the weird prose poem above, with its apparent insistence that it’s now “gay” to be straight, defies literal scientific reading. By that standard, it’s pure poetry.

Tate seemingly contends that, in a world that has discarded obsolete gender and sexual designations without putting anything better in their place, words become meaningless. If men feel sexually homeless nowadays, Tate lets us relax our burdens and shed our doubts. If words mean nothing, then words can’t control us. If it’s gay to be straight, then we can expunge archaic goals like love and stability. Yield to language’s poetic flow, let it transform you and be transformed by you.

This doesn't forgive Tate’s crass misogyny and weirdly self-destructive homoeroticism. He still treats women as ornaments and men as something to both desire and despise. As with any poet, it’s valid to say when something doesn’t land. (This one landed so badly that Tate eventually deleted it; only screenshots remain.) But we must critique it in its genre. Andrew Tate is a poet, not a journalist, and his words change us like art.

Wednesday, September 17, 2025

“Debate” Forever, Accomplish Anything Never

Charlie Kirk

Of the numerous encomiums following Charlie Kirk’s assassination last week, I’ve seen many writers praise his love of “debate.” They’ve extolled his supposed fondness for college speaking tours whose centerpiece was the open-mike question session, where he used his proven acumen to dismantle undergraduates’ post-adolescent leftist idealism. He died under a banner emblazoned with his speaking slogan: “Prove Me Wrong.”

Recent public-facing conservatives have enjoyed the appearance of “debate.” Like Kirk, Ben Shapiro and Steven Crowder have made bank playing college campuses, inviting ill-prepared undergrads to challenge their talking points. Crowder’s notorious “Change My Mind” table banner turned him into one of the most eminently memeable recent public figures. And almost without fail, they mopped the floor with anyone who dared challenge them.

Smarter critics than me have shown how these “debates” are, at best, fatuous. Kirk, Shapiro, and Crowder arrive better prepared, often carrying reams of supporting research that undergraduates just don’t have. Shapiro is an attorney, a graduate of Harvard Law School, while Crowder is a failed stand-up comedian, so they’re simply better trained at extemporaneous speaking. Kirk’s training was informal, but his mentor, advertising exec Bill Montgomery, coached him well.

Myths of Socratic dialog, high school and college debate clubs, and quadrennial Presidential debates, have falsely elevated the ideal of “debate.” The notion that, if we talk long enough, we’ll eventually overcome our differences, underlies the principles of American representative democracy. College freshman comp classes survive on the notion that we can resolve our disputations through language. As a former comp teacher, I somewhat support that position.

However, we’ve seen how that unfolds in real life. As Rampton and Stauber write, defenders of the status quo prevent real change on important issues by sustaining debate indefinitely. As long as reasonable-sounding people keep discussing how to handle racism, war, police violence, global warming, and other issues on basic cable news, they create the illusion that controversies remain unresolved. Conservatives need not win, only keep the debate alive.

Public-facing conservatives on college campuses are the reductio ad absurdum of this reality. When men in their thirties, trained to speak quickly and to notice fiddling verbal inconsistencies, take on wide-eyed undergrads, they look victorious. But that’s an illusion, created by the fact that they control the field. Just because an attorney can conduct cross-examination, or a comedian can do crowd work, doesn’t mean they’re correct.

Plato and Aristotle, depicted by Raphael

Just as importantly, the changing technological landscape means students have less information in reserve for extemporaneous discussion. Back in my teaching days, technological utopians claimed that students having information in reserve was less important than being able to access information on an as-needed basis. But these “debates” prove that, in the marketplace of image, knowing your subject matters when your opponent is stuck Googling on their smartphone.

Since I left teaching, ChatGPT and other “large language models” have diminished students’ need to formulate ideas in any depth. As I told my students, we don’t write only to communicate our ideas to others; writing also crystallizes our vague, ephemeral thoughts into a useful form, via language. But if students delegate that responsibility to artificial “intelligence,” they can’t know their own ideas, much less defend them on the fly.

Higher education, therefore, leaves students ill-prepared not only to participate in Charlie Kirk-style “debates,” but also to judge whether anybody’s ideas run deeper than street theatre. I don’t blame teachers; I’ve known too many teachers who’ve resisted exactly this outcome. Rather, it’s a combination of bloated administration, regulations handed down by ill-informed legislatures, and a PR campaign that made Kirk look more erudite than he actually was.

Socrates saw his dialectical method, not as an abstract philosophical good, but as an approach to civic governance. In Plato’s Republic and Phaedrus, he declares his belief that deep thinking and verbal acumen trained up worthy, empathetic rulers. But his approach required participants whose engagement went beyond mere forms. It required participants sophisticated enough to admit when they were beaten, and to turn words into substantive action.

Charlie Kirk was an avatar of a debate structure that prizes fast talking over deep thinking. His ability to steamroll students barely out of high school looks impressive to people who watch debates as spectator sport. But his approach favors form over substance, and winning the debate over testing the superior ideas. He was exactly the kind of rhetorician that Socrates considered an enemy of the Athenian people.

This produces a society that’s talked out, but too tired to act.

Thursday, June 5, 2025

Don’t Pretend To Be Stupid, Senator Ernst

A still from Senator Ernst’s notorious
graveyard “apology” video

Reputable news outlets called Senator Joni Ernst’s (R-IA) graveyard rebuttal last week “sarcastic” because, I think, they deemed it ideologically neutral. Accurate descriptors like “condescending,” “mean-spirited,” or “unbecoming of an elected official” might sound partisan. And mainstream media outlets today will perform elaborate contortions to avoid appearing even accidentally liberal. Better to call her “sarcastic,” from the corporate overlords’ perspective, than analyze Ernst’s motivations.

I have no such compunctions. I’ll eagerly call Ernst’s argument what I consider it: deeply dishonest, predicated on bad faith. For those who need a refresher, Ernst’s constituents expressed outrage at her support for a budget bill which included severe Medicaid cuts. At a Parkersburg town hall, a constituent shouted “People are going to die!” After stammering a bit, Ernst replied: “We are all going to die.” When that comment drew national attention, Ernst responded by doubling down.

Let’s set aside the substance of the debate for now. We all already have our opinions on the moral and legal motivations for steep Medicaid cuts; my regular readers probably share my disdain for these cuts. Rather, let’s focus on Ernst’s rhetorical approach. Specifically, I’d like to emphasize Ernst’s decision to pretend she doesn’t understand the accusation. The audience member, in saying people will die, meant people will die needlessly and preventably. Ernst chose to answer as though the claim were merely that people will die at all.

In classical rhetoric, we speak of the “stasis of argument,” the point of real contention when people disagree. In general, we speak of four stases of argument:

  • Fact (will people die?)
  • Definition (what does it mean for people to die?)
  • Quality (is this death necessary, acceptable, or moral?)
  • Jurisdiction (who bears responsibility for this death?)

In saying people are going to die, Ernst’s constituent argues from a stasis of quality, that cutting Medicaid and other programs will result in needless and morally unacceptable deaths. Ernst attempts to shift focus and claim that death, being inevitable, shouldn’t be resisted. Death is just a fact.

The stases listed in sequence above move from lowest to highest. Rhetoricians consider facts simple and, usually, easy to demonstrate. When facts become ambiguous, we move upward into definitions, then further up into moral considerations, and finally into the realm of responsibility. Moving upward usually means conceding the prior stasis. We cannot argue the morality or responsibility of facts without first acknowledging their reality.

Sometimes, shifting the stasis of argument makes sense. When the state of Tennessee prosecuted John Scopes for teaching evolution in public schools, the prosecution proceeded from a stasis of fact: did Scopes break the law? Defense attorney Clarence Darrow redirected the argument to a stasis of quality: did Scopes do anything morally unacceptable? Darrow essentially admitted the fact, but claimed a higher point of contention existed.

Plato and Aristotle, as painted by Raphael

However, the reverse seldom applies. Moving up the ladder means adding nuance and complexity to arguments, and moving down means simplifying. By shifting the stasis onto the physical reality of death, which all humans face inevitably, Ernst removes the complexity of whether it’s good or acceptable for someone to die now. If an accused murderer used “We’re all going to die” as a courtroom defense, that would be laughable.

Ernst knows this. As a law-n-order Republican, Ernst has a strict voting record on criminal justice, border enforcement, and national defense. She knows not all deaths are equal. By shifting her stasis of argument from whether deaths are acceptable to whether deaths are real, she’s pretending an ignorance of nuance she displays nowhere else. She knows she’s moving the goalposts, and assumes we’re too stupid, or perhaps too dazzled by rapid wordplay, to notice she’s done it.

I’ve complained about this before. For instance, when people try to dismiss arguments against synthetic chemicals by pretending to misunderstand the word “chemical,” they perform a similar movement. Moving the stasis down the ladder is a bad-faith argument tactic that bogs debate down in searches through the dictionary or Wikipedia to prove that blueberries aren’t what anyone means by “chemicals,” and that human mortality doesn’t make murder okay.

Moreover, this tactic means the person isn’t worth talking to. If Senator Ernst believes that human mortality negates our responsibility to prevent needless premature death, then we have two choices. She’s either too stupid to understand the stakes, which I doubt, or she’s too dishonest to debate. We must humor her while she’s in office. But her term is up next year, and honest, moral voters must remove her, because this rhetorical maneuver proves her untrustworthy for office.

Monday, April 28, 2025

Further Thoughts on the Futility of Language

Patrick Stewart (left) and Paul Winfield in the Star Trek episode “Darmok”
This essay is a follow-up to my prior essay Some Stray Thoughts on the Futility of Language

The popularity of Star Trek means that, more than most science fiction properties, its references and in-jokes exceed the bounds of genre fandom. Even non-junkies recognize inside references like “Dammit, Jim,” and “Beam me up.” But the unusual specificity of the 1991 episode “Darmok” exceeds those more general references. In that episode, the Enterprise crew encounters a civilization that speaks entirely in metaphors from classical mythology.

Berkeley linguist George Lakoff, in Metaphors We Live By (co-written with Mark Johnson), contends that much language consists of metaphors. For Lakoff, this begins with certain small-scale metaphors describing concepts we can’t describe directly: in an argument, we might “defend our position” and “attack our opponents.” We “build an argument from the ground up,” make sure we have “a firm foundation.” The debate ends, eventually, when we “see the other person’s point.”

Such first-level metaphors persist across time because, fundamentally, we need them. Formal debate structures shift little, and the figures of speech remain useful, even as the metaphors of siege warfare become obsolete. While speakers and authors repeat the metaphors, they retain their currency. Perhaps, if people stopped passing such metaphors onto the next generation, they might fade away, but so far, that hasn’t happened in any way I’ve spotted.

More pliable metaphors arise from cultural currents that might not persevere in the same way. Readers around my age will immediately recognize the metaphor when I say: “Read my lips, no new taxes.” They may even insert President George H.W. Bush’s hybrid Connecticut/Texas accent. For several years in the late 1980s and early 1990s, the “Read my lips” metaphor bespoke a tough, belligerent political stance that stood inviolate… until it didn’t.

In the “Darmok” episode, to communicate human mythic metaphors, Captain Picard describes the rudiments of the Epic of Gilgamesh, humanity’s oldest known surviving work of fiction. Picard emphasizes his familiarity with ancient myth in the denouement by reading the Homeric Hymns, one of the principal sources of Iron Age Greek religious ritual. For Picard, previously established in canon as an archeology fan, the earliest myths represent humanity’s narrative foundation.

But do they? While a nodding familiarity with Homer’s Odyssey and Iliad remains a staple of liberal education, how many people, outside the disciplines of Sumerology and classical studies, read Gilgamesh and the Homeric Hymns? I daresay that most Americans, if they read mythology at all, mostly read Bulfinch’s Mythology and Edith Hamilton’s Mythology, both of which sanitized Greek tradition for the Christian one-room schoolhouse.

The attached graphic uses two cultural metaphors to describe the writer’s political aspirations. The reference to Elvis on the toilet repeats the widespread cultural myth that Elvis Presley, remembered by fans as the King of Rock and Roll, passed away mid-bowel movement. There’s only one problem: he didn’t. Elvis’ loved ones found him unconscious on the bathroom floor, apparently following a heart attack; he was pronounced dead at the hospital later that day.

The drift between Elvis as cultural narrative, and Elvis as historic fact, represents the concept of “mythology” in the literary critical sense. We speak of Christian mythology, the mythology of the Founding Fathers, and the myths of the Jersey Devil and prairie jackalope. These different “mythologies” represent, neither facts nor lies, but stories we tell to understand concepts too sweeping to address directly. Storytelling becomes a synecdoche for comprehension.

Similarly, the broad strokes of Weekend at Bernie’s have transcended the movie itself. It’s questionable how many people watched the movie, beyond the trailer. But the underlying premise has become a cultural touchstone. Likewise, one can mention The Crying Game or The Sixth Sense, and most Americans will understand the references, whether they’ve seen the movies or not. The vague outlines have become part of our shared mythology.

But the movies themselves haven’t become so. Especially as streaming services have turned movie-watching into a siloed enterprise, how many people watch older movies of an evening? We recognize Weekend at Bernie’s, released in 1989, as the movie where two doofuses use their boss’s corpse as a backstage pass to moneyed debauchery. But I doubt many people could say what actually happens in it, beyond the most sweeping generalities.

Both Elvis and Bernie have come unmoored from fact. Their stories, like those of Gilgamesh and Darmok, no longer matter; only the cultural vibe surrounding them survives. Language becomes a shorthand for understanding, but it stops being a vessel of actual meaning. We repeat the cultural references we think we share, irrespective of whether we know what really happened, because the metaphor, not the fact, matters.

Tuesday, April 22, 2025

Some Stray Thoughts on the Futility of Language

I think I was in seventh grade when I realized that I would probably never understand my peers. In church youth group, a young man approximately my age, but who attended another middle school, talked about meeting his school’s new Egyptian exchange student. “I could tell right away,” this boy—a specimen of handsome, square-jawed Caucasity who looked suspiciously adult, so I already distrusted him—said, “that he was gonna be cool.”

“How could you tell?” the adult facilitator asked.

“Because he knew the right answer when I asked, ‘What’s up?’”

Okay, tripping my alarm bells already. There’s a correct answer to an open-ended question?

Apparently I wasn’t the only one who found that fishy, because the adult facilitator and another youth simultaneously asked, “What’s the correct answer then?”

“He said, ‘What’s up?’” my peer said, accompanied by a theatrically macho chin thrust.

(The student being Egyptian also mattered, in 1987, because this kid evidently knew how to “Walk Like an Egyptian.”)

This peer, and apparently most other preteens in the room, understood something that I, the group facilitator, and maybe two other classmates didn’t understand: people don’t ask “What’s up?” because they want to know what’s up. They ask because it’s a prescribed social ritual with existing correct responses. This interaction, which I perceived as a request for information, is actually a ritual, about as methodical and prescriptive as a Masonic handshake.

My adult self, someone who reads religious theory and social science for fun, recognizes something twelve-year-old Kevin didn’t know. This prefixed social interaction resembles what Émile Durkheim called “liturgy,” the prescriptive language religious people use in ceremonial circumstances. Religious liturgy permits fellow believers to state the same moral principles in unison, thus reinforcing their shared values. It also inculcates their common identity as a people.

The shared linguistic enterprise, which looks stiff, meaningless, and inflexible to outsiders, is purposive to those familiar with the liturgy. Speaking the same words together, whether the Apostles’ Creed or the Kaddish or the Five Pillars of Islam, serves to transform the speakers. Same with secular liturgy: America’s Pledge of Allegiance comes to mind. Durkheim cited his native France’s covenants of Liberté, Égalité, Fraternité.

This confused me, a nerdy and socially inept kid who understood life mainly through books, because I thought language existed to convey information. Because “What’s up?” is structured as a question, I perceived it as a question, meaning I perceived it as a request for clarifying information. I thought the “correct” answer was either a sarcastic rejoinder (“Oh, the sky, a few clouds…”) or an actual narrative of significant recent events.

No, I wasn’t that inept; I understood that when most people ask “How are you today?” it’s a linguistic contrivance, and the correct answer is “fine.” I understood that people didn’t really want to know how you’re doing, especially if you’re doing poorly. But even then, the language was primarily informative: I’m here, the answer says, and I’m actively listening to you speak.

However, the “What’s up?” conundrum continues to nag me, nearly forty years later, because it reveals that most people don’t want information, at least not in spoken form. Oral language exists mainly to build group bonds, and therefore consists of ritual calls and responses. We love paying homage to language as communication, through formats like broadcast news, political speeches, and deep conversations. But these mostly consist of rituals.

Consider: when was the last time you changed your mind because of a spoken debate? This may mean the occasional staged encounters between, say, liberals and conservatives, or between atheists and Christians. Every four years, we endure the tedium of televised Presidential debates, but apart from standout moments like “They’re eating the pets,” we remember little of them, and we’re changed by less.

For someone like me, who enjoys unearthing deeper questions, that’s profoundly frustrating. When I talk to friends, I want to talk about things, not just talk at one another. Perhaps that’s why I continue writing this blog, instead of moving to YouTube or TikTok, where I’d receive a larger audience and more feedback. Spoken language, in short, is for building bonds; written language is for information.

Put another way, the question “What’s up?” isn’t about the individuals speaking, it’s about the unit they become together. Bar chats, water cooler conversations, and Passing the Peace at church contain no information; they define the group. Only when we sit down, alone, to read silently, do we truly seek to discover what’s up.

Tuesday, October 29, 2024

“Chemicals,” Food, and You

“3-methyl butyraldehyde is a compound in a blueberry. Think about that.”

Somebody threw this into the ether recently in an argument about whole foods. You know how wise and restrained online debaters are. This person seriously believed they’d made a meaningful point about why people who insist on whole foods and minimal processing were wrong. Because whole foods have chemical compositions which are difficult to pronounce, this person apparently believed all arguments for plant-based whole foods are, a priori, wrong.

In full fairness, the other person in this debate (not me) said something equally unfounded. “If you can’t pronounce an ingredient,” the other person wrote, “DON’T EAT IT!” This person apparently believed in the honest wholesomeness of “natural” ingredients, presuming that naturally occurring, plant-based substances must necessarily be healthful. The first commenter, by contrast, argued from complete trust in science and technology.

I’ve written about this before; this double-sided fallacy doesn’t bear another unpacking.

However, the 3-methyl butyraldehyde argument deserves some exploration. This person, hidden behind an anonymous sign-on handle and a cartoon avatar, claims that abstruse chemical constituents within whole foods are essentially equal to additives used in manufacturing processed foods. 3-methyl butyraldehyde, which has both naturally occurring and synthetic forms, is found in many commercial foods, both whole and processed.

Blueberries have several naturally occurring chemical constituents. Some are easy to pronounce, including protein, fat, and especially water. Others are more abstruse, such as hydroxylinalool, linoleic acid, and terpinyl acetate. Though most of these chemical compounds are harmless in naturally occurring proportions, some can be harmful if isolated and hyperdosed. Like most organisms, blueberries comprise a subtle, nuanced combination of substances.

However, no combination of these substances, in any quantity, will come together and form a blueberry, not with current science or technology. One can only grow a blueberry by carefully cultivating a blueberry bush, a commitment of time and effort, as blueberry bushes only produce fruit after two or three years. Chemical fertilizers can sometimes hasten fruiting, but at the cost of starchier fruit, which displaces both nutrients and flavor.

One recalls the common counterargument whenever hippies complain about “chemicals.” Some wag, occasionally but not often a scientist, responds: “Everything is chemicals!” To take this argument seriously, the respondent must not know (or pretend not to know) that people say “chemicals” as a synecdoche for synthetic chemicals of an unknown provenance, which, under America’s light-touch regulatory regime, are assumed safe until proven otherwise—cf. Rampton & Stauber.

Though the FDA tests and regulates pharmaceuticals (for now), many food additives, cosmetics, chemicals used in haircare products and clothes, and other things we put on our bodies, are presumed safe. This despite years of evidence that this isn’t good practice. Ethylene glycol, cyclamate, and several food dyes were regularly used in American foods before being demonstrated as unsafe.

Even beyond safety concerns, the reduction of whole foods to their chemical constituents perpetuates a dangerous idea. Futurists once posited that food scientists would eventually isolate the basic nutrients in food, and effectively replace the tedium of cooking and eating with the simplicity of gelatin capsules. One finds this reasoning behind the mythology of vitamin supplements, now known to be useless for most people most of the time.

Human digestion doesn’t simply extract chemical nutrients from food like a Peterbilt burning diesel. We require the complexity of food, including fats, fiber, roughage, and limited amounts of sugar. I generally side with Michael Pollan’s ubiquitous advice: “Eat food, not too much, mostly plants.” Food doesn’t mean chemical constituents. You don’t make a blueberry smoothie by adding 3-methyl butyraldehyde, you make it by adding blueberries.

Please don’t misunderstand. I want to avoid the trap of assuming that “natural” equals good. Reasonable adults know you shouldn’t pick wild mushrooms or handle poison ivy. That’s an exaggeration, but the point remains: nature requires respect, like any other tool. But human agronomists have selectively bred food crops for 5,000 years to maximize healthful content, and apart from occasional allergies, agriculture is broadly trustworthy.

And pretending that food only consists of its chemical compounds is a bad-faith argument. You wouldn’t describe your friend by listing his tissues and internal organs, because humans are more than the sum of our parts. The same applies to food, including fresh ingredients. Cooking natural ingredients, then processing them with synthetic additives to make them tasty and shelf-stable, does change the food.

Pretending not to understand the other person is smarmy and disrespectful, and if your argument requires it, your argument is probably bad.

Thursday, January 18, 2024

A Short Course In Speaking English(es)

1001 Books To Read Before Your Kindle Battery Dies, Part 116
John McWhorter, Our Magnificent Bastard Tongue: The Untold History of English, and Talking Back, Talking Black: Truths About America's Lingua Franca

Sixteen centuries ago, the Anglo-Saxons invaded Britain, bringing their Germanic language with them. Sometime after that, Vikings invaded, then Normans, each changing English in different ways. Then the Empire happened, and voilà! English as we know it was born!

If your grade-school English linguistics history resembled mine, you probably received a version of this just-so story. English underwent massive changes in the distant past, until it eventually resembled today’s vernacular, and the peasants rejoiced. Even then, I found this narrative unsatisfying. Apparently John McWhorter, Columbia linguist and sometime pundit, felt equally dissatisfied. He’s spent his career documenting how English has evolved, and continues to evolve.

English language evolution is substantially hidden because nobody left written records of change. McWhorter finds clues hidden in what historians and scholars wrote, but also in what they omitted. Languages which rubbed elbows with early English, including Welsh, Cornish, and the now-lost Danish Viking dialect, provide valuable clues. English, McWhorter believes, evolved as a hybrid, among bilingual populations.

For instance, English lacks case endings, the word mutations that make Latin and German difficult to learn. But it has the present progressive tense, missing from most Indo-European languages. McWhorter finds other languages that possess, or lack, these functions, and wouldn’t you know? They’re all languages that interacted heavily with English, often at swordpoint. From this, McWhorter surmises a history of lively linguistic give-and-take.

McWhorter works from the documentary record, but also from hypothesis. He considers it logical that Celtic languages seeded spoken English with new verb constructions, even as written English (Anglo-Saxon) resisted change. Indeed, in some cases, McWhorter considers the lack of written evidence as proof of deep-seated cultural prejudices and systems of power, which manifest themselves in what literate Anglo-Saxons considered too commonplace to record.

In Our Magnificent Bastard Tongue, McWhorter reconstructs linguistic change in distant history and the written record. But English never achieved some perfect form and then stopped changing. In Talking Back, Talking Black, McWhorter applies the same narrative reasoning to the most vibrant form of linguistic evolution happening today: Black American English. Far from merely “broken” English, as critics accuse, McWhorter finds lively, vibrant growth taking place.

The arguments surrounding Black English have mostly fallen along two lines. Linguists and sociologists, often writing in specialist journals, insist that Black English has its own sophisticated rules, complex textual history, and social status. Meanwhile mass-media critics, Black and White alike, decry how Black English differs from Standard or “Correct” English, and bemoan the dialect’s backward social status. Both camps consider their positions self-evident.

John McWhorter, Ph.D

Obviously, McWhorter rejects the mass-media narrative. However, he doesn’t do so offhandedly; he makes a persuasive case for Black English and its rich linguistic heritage. For McWhorter, language doesn’t merely convey information; it also builds social bonds and creates communities. In diverse ways, he emphasizes how Black English doesn’t merely let Black Americans communicate knowledge, it also reinforces their communities and binds them together. It also reflects power dynamics in a historically divided America.

Unlike his morphology of Anglo-Saxon, McWhorter has ample material evidence to demonstrate how Black English evolved within living memory. Black Americans left ample books, audio recordings, video performances, and other serious documentary evidence. Therefore he’s able to track, with remarkable precision, exactly when and where Black English underwent significant changes. He makes a persuasive case that Black English remains lively and evolving, adapting to meet society’s changing needs.

He also makes the case that Black Americans, speaking their historic dialect, are “diglossic,” equally fluent in two linguistic forms simultaneously. Unlike me speaking French, having to desperately translate every phrase and sentence internally, Black Americans simply know both dialects, and apply them correctly. Both forms come equally readily, and Black Americans can deploy Standard English when the context demands it.

McWhorter wrote these books separately, but they serve as a pair. One establishes and demonstrates his philological principles in an historical setting, while the other applies the same tools to contemporary settings. Both books run short, under 200 pages plus back matter, and both consist mainly of freestanding short thematic essays. This allows sampler-style reading without needing to commit to treatises founded on dense technical terminology.

Throughout, McWhorter maintains his casual tone, conversational digressions, and friendly vibe. Even when he goes beyond ironclad proof, you always believe he’s led you to think about things more deeply. Especially for general audiences, whose familiarity with applied linguistics is probably scanty, McWhorter’s approach eases them through his more difficult reasoning. That makes these books a good introduction to linguistics, an often overlooked field.

Friday, October 27, 2023

Barbie, Disability, and the Death of Formal Rhetoric

Ben Shapiro expresses his well-thought-out opinion in a totally reasonable manner.

Aristotle defined rhetoric as “the capacity to discover the possible means of persuasion concerning any subject.” Pointy-headed and abstruse, yes, but a reasonably concise description of what I tried to teach in Freshman Composition. When structuring our language around contentious issues or painful controversies, we must think in terms of what will persuade our intended audience. That standard is often subjective, and moves almost whimsically.

Two recent events have re-centered this difficult process for me. I recently witnessed an unpleasant online dispute quickly spiral out of control. A disabled person noticed that a friend’s anecdote about helping a disabled stranger contained certain ableist prejudices. The story was well-intentioned, but fit a genre of short narrative sometimes disparaged as “inspiration porn.” All such stories mean well, but misfire by showcasing able-bodied generosity over disabled autonomy.

As sometimes happens when disadvantaged persons ask for consideration, some observers saw this criticism as personal attack. Like The Former President, who saw kneeling football players as disloyal “sons of bitches,” the OP’s friends closed ranks defensively, lambasting the critic for “attacking” their friend and “ripping him to shreds.” The defensive posture became so energized that they persisted even after the OP cautioned them to back off and cool down.

When Ben Shapiro, the massively online full-time professional offense-taker, protested Greta Gerwig’s Barbie movie this summer by lighting two Barbie dolls on fire, he apparently thought he was making a serious point. The entire internet, however, responded with aggressive disdain. Shapiro evidently thought this fire was an appropriate synecdoche for his internet-friendly outrage. But even his staunchest allies had little support for a petulant boy destroying his sister’s toys.

Shapiro makes most of his living doing personal appearances on college campuses, engaging undergraduates in “debate.” He organizes his public persona around the motto “Facts Don’t Care About Your Feelings.” Superficially, Shapiro seems to advocate Aristotelean rhetoric as persuasion through evidence. Yet the Barbie incident demonstrates something Shapiro’s critics have long noted: he cares only about winning, usually by personally demolishing anyone who disagrees with him.

Proof of the standing stereotype
of what constitutes a disability

The disability debate pushed me into an awkward position. Both the OP and his critic are friends, whom I respect dearly. I struggled to triangulate a position where I supported both while clarifying that I considered the criticisms justified. This meant finding ways to say “you’re wrong” without making the statement personal, and managing the feelings of defenders whose emotions already ran high. Therefore my participation mainly consisted of overthinking and extended paralysis.

Rereading the debate afterward, I noticed something I missed in the moment: the critic and the defenders kept talking past one another. The critic offered copious evidence, including cited sources and hyperlinks. The defenders hand-waved all the evidence, focusing on the perceived personal slight in the original callout. Because the critic intended no personal slight, she never addressed it. Therefore, both sides’ core concerns never got addressed.

When Ben Shapiro mistakes destroying toys for pitching an argument, the core problem probably resides in who he thinks his audience is. Shapiro has garnered acclaim by performing stunts designed to embarrass progressives and dissidents. Such displays help unify his hard-right audience and create a base primed to listen (and, importantly, to buy his advertisers’ sketchy products). But it’s more likely to alienate anyone who doesn’t already agree.

In other words, Ben Shapiro, his Daily Wire media company, and similar massively online conservative outlets like Daily Caller and The Blaze, create loyalty to an ideological brand. As I’ve noted before, these outlets generate an almost religious sense of unity. Sure, the ideological sense of aggrieved White masculinity coaxes new converts through the door. But once inside, the politics generally matter less than the sense that we’re traveling together.

That, I realized (with some pain), happened with the disability debate. While the critic attempted to structure a formal argument supported with evidence, the OP’s defenders formed a perimeter around group loyalty. Rereading the previous sentence, I realize it sounds pejorative. Not so; when disadvantaged groups face systematic challenges, group membership enables them to organize and support one another.

Don’t misunderstand me: I’m not here to call out or condemn anyone individually. Rather, to return to classical rhetoric, I believe the two groups had “mixed stases,” that is, they were having two different arguments. But that’s become the problem with online ideology. Too often, we care more about defending the group than seeking the truth; that goes double for us White able-bodied cishet males. The group becomes paramount; the truth gets lost.

Saturday, September 9, 2023

The Great Stratford-on-Avon Noise Machine

I don’t like deleting anyone from my online friends lists. I sometimes hold onto friendships that have shuffled along, zombie-like, for years. To get ejected from my social media lists, you generally must pick fights, engage in personal insults, or use my page to spread lies. Recently, I had someone attempt all three, and it involved one of literature’s most tedious questions:

Did Shakespeare write the works of Shakespeare?

Anti-Stratfordianism is a pseudoscience, akin to antivax or Flat Earth conspiracies. Adherents “prove” their propositions, not through evidence, but by “poking holes,” finding supposed inconsistencies in the documented narrative, and “just asking questions.” As my overuse of scare quotes indicates, anti-Stratfordian arguments rely on obfuscations and innuendo, not evidence. Yet my now-ex-friend insisted I must “engage” every specious argument, or surrender the debate.

That, immediately, should’ve been a clue. Flooding the market with unsourced innuendo or anecdotes, then claiming victory on every point somebody can’t immediately rebut, is the tactic of ufologists and Bigfoot hunters, not serious social scientists. Bullshit artists, like Steven Crowder with his “change my mind” schtick, love barraging the unprepared with binders full of photocopied talking points, demanding on-the-spot answers.

My ex-friend began by demanding to know why we should consider Shakespeare the author of Shakespeare, when we have little documentary evidence of his life. As though the absence of documents, in a time when creating and storing documents was expensive, proves anything. In fairness, professional doubt manufacturers use this same technique on Homer, Socrates, Pythagoras, and Jesus Christ. Rhetoricians call this the “argument from ignorance.”

Jon Finch (left) and Francesca Annis in Roman Polanski's Macbeth

Sometimes we must focus on gaps in our knowledge. Law enforcement and counterterrorist experts do this frequently. But formal argument considers this fallacious in most contexts, because absence of knowledge usually proves little, except that nobody can document everything. Shakespeare, a poor boy from the provinces, didn’t merit physical documentation until relatively late in life.

Well, the anti-Stratfordian argues, what about Shakespeare’s lack of education? Most playwrights of the English Renaissance attended Oxford or Cambridge; how could Shakespeare write great literature without academic credentials?

Yes, most late-Elizabethan playwrights attended universities. We remember Christopher Marlowe, Robert Greene, George Peele, and Thomas Nashe as the “University Wits.” Greene wrote a notorious pamphlet condemning Shakespeare as an uneducated bumpkin. However, Greene’s own plays are scarcely read today. Of the University Wits, only Marlowe bears reading now—and he nearly flunked his degree, until the Privy Council intervened on the Queen’s behalf.

Insisting that Shakespeare couldn’t write well without a university education is classist. As a former university composition teacher, I can attest that some people write well without higher education, others write poorly with higher education, and some write well despite higher ed. (When I mentioned social class, my ex-friend said I’d engaged in “ad hominem attack.” Not so, sir. If there’s a fallacy there, it’s hasty generalization.)

The anti-Stratfordian shifts tactics: how could Shakespeare write about foreign lands so authoritatively? We have no evidence he ever left England. (Again with the “argument from ignorance.”)

Except Shakespeare didn’t write authoritatively about foreign lands. In Hamlet, he misnames Denmark’s royal palace, and gives nearly every Danish character a Greek or Latin name. He sets several plays in Italy, including Verona, Venice, and Padua. Each features explicitly English scenes, including women speaking in public (verboten in Renaissance Italy), court cases argued on English common law, and common English courtship rituals. Shakespeare’s “foreign lands” are exotic names draped over English scenes.

Kenneth Branagh as Hamlet

Next, the anti-Stratfordian demands I explain how Shakespeare, a poor country boy, could write about aristocracy. How could a provincial merchant’s son understand noble households and aristocratic families?

First, Shakespeare’s playing company had aristocratic sponsorship. As first the Lord Chamberlain’s Men, and later the King’s Men, Shakespeare’s company was attached to the royal household, making its players aristocratic insiders. Even without that, though, the poor have always had to understand how rich people think, in a way the rich have never needed to understand the poor. Then as now, social class matters.

Finally, my ex-friend deployed the low blow: why was I getting emotional? “You don’t sound,” he wrote, “like a dispassionate academic here.” I realized that he’d gone for the troll argument, saying provocative things until I lost my composure, then crowing over my anger. His statement was a prettied-up version of “U mad bro?” So I blocked him.

I don’t brook bad-faith argument or underhanded tactics. I won’t engage future anti-Stratfordian arguments, and if anyone tries them on me ever again, I’ll show them this narrative, then mute them forever. It’s better than they deserve.

Friday, June 23, 2023

Some Thoughts on the Nature of “Tragedy”

Stock photo of the Titanic wreckage

I avoided writing about the OceanGate Titan, the submersible that vanished while gawking at the wreck of the Titanic, so long as there was hope of rescue. Since it disappeared on Sunday, social media has been flooded with dollar-store schadenfreude mocking the passengers’ entitled hubris in treating the remote, and still poorly explored, wreck as a tourist attraction. But at this writing, oxygen reserves have run out, the vessel is probably lost, and the tone has shifted.

By Thursday morning (really Wednesday evening), reactions bifurcated. Some observers, perhaps moved by the parallels between this event and the original Titanic sinking, began describing it as a tragedy. Behind them, a rising tide of dissenters reminded audiences that an overloaded migrant vessel sank last week, potentially dragging over 600 impoverished refugees sailing from Libya to the deepest part of the Mediterranean. Why, these dissenters asked, isn’t that the real tragedy, rather than the submarine full of CEOs?

This debate reflects not only the priorities driving the 24-hour news cycle, but the way words drift over time. Mass media slings the word “tragedy” around so heedlessly that it’s come unmoored from its Greek roots. Aristotle defined tragedy as a theatrical form in which the protagonist is destroyed, not by bad luck or circumstance, but by the consequences of his (and indeed usually “his”) own actions. Then the horrified audience feels pity for him.

For Aristotle, the defining tragedy was Sophocles’ Oedipus Rex. You probably know the broad outlines, even if you haven’t read it or seen it performed. Oedipus, king of Thebes, promises to root out the cause of the curse plaguing his city. The prophet Tiresias warns Oedipus he won’t like the answer, but Oedipus persists. Following a detective-like investigation, he discovers that he caused the curse himself, and he’s already living out the morally horrific consequences.

Many modern writers forget tragedy’s most important point: Oedipus brought these consequences upon himself. On at least four occasions, the play’s suffering and bloodshed could’ve been avoided if Oedipus or his forebears had listened to advice. In this regard, the OceanGate Titan catastrophe is indeed tragic. Deliberate disregard for safety protocols, and the belief that money insulates rich people from calamity, led three CEOs, a teenager, and a technical crew to their almost-certain watery graves.

PR photo of the OceanGate Titan submersible

And the narrative induces horror and pity. If the passengers and crew weren’t killed instantly, the conditions of their now-likely deaths sound horrific: trapped in a claustrophobic submersible, undoubtedly wearing urine-soaked clothes, slowly suffocating. This disaster robbed them not only of their lives, but of their dignity, and even a marked grave. Like Oedipus, they arguably deserve their fate for rubbernecking at a mass grave, but their deaths are still pretty piteous.

Edit: after I wrote this essay, the U.S. Coast Guard announced it had located debris from the OceanGate Titan. It appears the submersible suffered a rapid structural implosion, and the passengers were, indeed, killed instantly.

Okay, but if the OceanGate Titan is a legitimate tragedy, doesn’t that describe the Libyan refugee ship? The hundreds of deaths are both horrific and piteous. Yet I’d contend they aren’t tragic, for one reason: there wasn’t much anybody aboard that vessel could’ve done. Their choices were limited to remaining in Libya, which has been anarchic and violent since the Obama Administration’s reckless intervention in the overthrow of Moammar Gadhafi, or risking death at sea.

Oedipus, King Lear, and Jay Gatsby are tragic heroes, not only because they died horrifically, but because they’re responsible for their own deaths. Any of them could have, at any time, stopped events from happening. They perhaps didn’t realize the agency they possessed, but each one made choices which led directly to his own downfall. If the captain of the refugee vessel had misled refugees onto his boat, causing it to sink, that would be tragedy.

Aristotle believed that only kings, generals, and potentates had enough power to be responsible for their own deaths. I disagree. I’ve recently become a fan of horror fiction, and novels like Ania Ahlborn’s Brother and Catriona Ward’s The Last House on Needless Street probably count as contemporary Aristotelian tragedies. In both books, at least one character could’ve prevented catastrophe by asking questions, listening to advice, or not going with the flow like a dead fish.

So yes, on balance, I’d say the OceanGate Titan catastrophe is a tragedy. It meets Aristotle’s standards not only of narrative structure, but of audience reaction. We are, indeed, suitably horrified and pitying. Only one question remains: will we learn anything? Will we respect safety standards, shake the illusion that money deflects consequences, and remember that the dead aren’t for gawking at? Only time will tell. At least we can start reclaiming the definition of a “tragedy.”

Monday, May 8, 2023

The Debt to Our Democracy

Richard Haass, The Bill of Obligations: The Ten Habits of Good Citizens

One need only browse the headlines or crack the occasional book to know that American democracy is deteriorating within our lifetimes. Lawmakers have jettisoned standards of legislative debate, while judges ground their rulings in avowed politics, not legal precedent. Meanwhile, actual voters increasingly distrust the electoral process, especially when their elected representatives tell them to. Storming the Capitol has become a political maneuver, not an act of war.

Dr. Richard Haass, president of the Council on Foreign Relations, is hardly the first public policy specialist to suggest Americans need a deeper investment in civics. It’s not enough to demand that the government defend our individual or collective rights, if we don’t share a commitment to the body politic. He believes Americans need social obligations concomitant with their constitutional rights. Which sounds great—if you don’t probe too deeply.

Haass divides this slim volume, barely longer than a pamphlet, into two parts. The shorter first part breaks down how America’s longstanding democratic traditions have fallen into disrepair, and how we citizens bear collective responsibility. His “ripped from the headlines” exposition doesn’t really add anything that dedicated politics junkies (who are probably his core audience) don’t already know, but it lays the groundwork for his proposed solutions.

His second part involves ten “obligations,” which he defines as standards that good citizens ought to follow, without being compelled by law. Again, Haass isn’t the first to suggest that citizenship carries certain responsibilities, beyond showing up on Election Day. His proposed list of baseline standards sounds pretty good, in the abstract. Precepts like “Be Informed” and “Get Involved” match the bipartisan exhortations which emerge like clockwork every election cycle.

But the longer Haass talks, the more aware I become that his seemingly nonpartisan suggestions cast a dark shadow. His fourth “obligation,” for instance, is “Remain Civil.” That sounds great, hypothetically. We’ve watched America’s political discourse descend into name-calling and bad-faith rhetoric with remarkable haste. I’d like to see politicians, and the voters who support them, resume treating the other side as ideological counterparts, not enemies to crush.

Dr. Richard Haass

Sadly, exhortations to “civility” have historically been mustered against minority groups demanding even rudimentary reforms. Defenders of the status quo have always accused Civil Rights activists, feminists, and Pride marchers of being “uncivil,” a moving target used to silence any form of dissent. Demands for “civility” usually mean forcing powerless minorities to beg, hat in hand. Protesters like Dr. King quickly learned it’s impossible to be civil enough.

Similar problems taint Dr. Haass’ other suggestions. Suggestions to “Stay Open to Compromise” are great when talking about, say, taxes and infrastructure. But you can’t, indeed mustn’t, compromise with certain groups. I write in the immediate wake of an avowed White Supremacist shooting up an Allen, Texas, strip mall. You can’t compromise with people who believe other people simply need to die.

The laundry list continues, sadly. Haass urges us to “Value Norms,” without asking who wrote those norms or how they’ve been historically deployed. He tells us to “Support the Teaching of Civics,” even as Florida and other “red” states actively rewrite the entire discipline. Concepts like “norms” and “civics” appear morally neutral, until we make even the most rudimentary efforts to unpack their definitions in specific terms.

On civics specifically, Haass admits we need a meaningful public debate about the best civics curriculum, and I agree. But he advocates “teaching the controversy,” a technique historically used to keep moribund debates alive. Just as Intelligent Design doesn’t belong in science classrooms, because it can’t be tested, White Supremacy and Confederate “Lost Cause” apologia don’t belong in civics classrooms, because they’re inherently violent and uncivil.

These proposals sound great, provided their treatment remains superficial. I, too, would prefer a world where political actors “Reject Violence” and “Put Country First.” But we don’t, as Haass himself acknowledges. Violence, partisanship, and bigotry are all on the table. Many January 6th insurrectionists, a group Haass excoriates by name, justified invading the Capitol by claiming that, if they didn’t do it first, the other side would.

Don’t misunderstand me: I share Haass’ precept that Americans must recommit to the small-d democratic experiment. But this isn’t about atomized individuals being good people. Honestly, anybody reading Haass’ book already probably shares his commitment to America’s common good. We’re witnessing a fundamental institutional breakdown comparable to Belfast during the Troubles, and like Belfast, America needs a recommitment to law.

If most citizens being good people were sufficient, they would’ve won by now.

Monday, April 17, 2023

White Americans and the Death of Rhetoric

Rep. Justin J. Pearson
(campaign photo)

When the pastor approached me after my first time attending an African Methodist Episcopal church service, he didn’t address me in his “pulpit voice.” Though still wearing his liturgical robes, and a sheen of sweat still visible from his aerobic ministerial performance, he nevertheless spoke to me in measured, even tones, like the college-educated professional he is. He expressed genuine avuncular curiosity at my White presence in his Black church.

Before this encounter, I had book-learning awareness of Black religious tradition. I’d seen enough video recordings of Dr. King to know that he didn’t always speak with the heightened bluster he demonstrated at the March on Washington. But there’s something different when sitting in the pews, watching and listening as the pastor modulates his tone. The rise and fall aren’t incidental; the pastor, you soon realize, is leading you on a journey.

Late last week, an edited video emerged of Tennessee State Representative Justin J. Pearson. After the Shelby County Board of Commissioners unanimously reappointed Rep. Pearson to the House of Representatives, reversing his spectacularly partisan expulsion from that house, the Representative, little known outside his Memphis district, suddenly became a national figure. And his personal history became subject to tedious media scrutiny.

The video spliced together two moments in Pearson’s life. As a 21-year-old undergraduate at Bowdoin College, Pearson ran for student government. His campaign video shows Pearson with close-cropped hair and a charcoal suit, speaking softly and gesturing directly into a camera. Then a jump-cut leads to Pearson, now 28 and sporting a prominent Afro, shouting with a revival preacher’s gusto during the week between his expulsion and his reinstatement.

Prominent conservatives jumped on this juxtaposition. If Pearson speaks differently to different audiences, the claim goes, then he’s fake and a political liar. Trumpist spokesman Ali Alexander, who is himself of African-American descent, described Pearson’s two voices as signs of “a serious mental illness” in a since-deleted tweet. Right-wing activist Greg Price accused Pearson of “acting like a character from a Madea movie.”

We might excuse this pig-ignorant and unabashedly racist statement dribbling from White boy Greg Price’s mouth, since he’s probably unaware of America’s Black rhetorical tradition. But Alexander, a former ministerial candidate, should know better. He should know that Black pulpit speech uses rhetorical devices which are so time-tested, they’re literally described in the ancient writings of Aristotle and Cicero. Rhetorical devices that White people have forgotten, at great cost.

White Americans have long pined for the perceived authenticity demonstrated by our Black brethren. Perhaps that’s why we honkies have appropriated blues music, hip-hop culture, and FUBU couture. But somehow, in longing for Black authenticity, we’ve abandoned our commitment to Black people. We watch our favorite rappers perform onstage, and never hear them speak. We cherry-pick our favorite Martin and Malcolm sermons, and never glimpse the breadth of their delivery.

It’s easy to forget that, while White people increasingly get our worldviews mediated through television and streaming video, Black Americans’ rhetorical tradition remains far more oral and face-to-face. Therefore, White public figures modulate their speaking style according to the microphone’s needs, and according to the lossy compression of the broadcast signal. Black rhetoricians care more about how the sound carries in the available space.

Rep. Justin Jones, also of the Tennessee
Three (news photo)

If Rep. Pearson delivered his campaign video like he delivered his public exhortations, he’d blow out your computer’s speakers. Notice how his impassioned public speech gets distorted in the medium you’re probably using to watch it, your smartphone or laptop. Streaming simply can’t handle that degree of variation. I’ve seen similar technical degradation when my favorite Black preachers stream their sermons on YouTube and Facebook. Technology changes how we speak.

But as I say, these rhetorical flourishes aren’t uniquely Black. The great rhetorical pioneers like Aristotle and Cicero preached the importance of modulating your tone, emphasizing key words through repetition, and baring your emotions onstage. Black public speakers didn’t invent this performative style; White public speakers abandoned it. And then their mostly White audience forgot these techniques ever existed, and now act shocked to see them still being used.

White rhetoricians once loved these techniques. John Wesley, Abraham Lincoln, and William Jennings Bryan were all famous for getting so agitated during their speeches that they began weeping. Patrick Henry declaimed “Give me liberty, or give me death,” then reportedly fell down and played dead before the Second Virginia Convention. If we’re invested in authenticity, then running down Rep. Pearson isn’t the way. He’s using techniques our White ancestors perfected, and we forgot.