Tuesday, July 8, 2025

The Meaning of Life in “The Life of Chuck”

Mike Flanagan (director, from a Stephen King novella), The Life of Chuck

Albie Krantz (Mark Hamill) explains the harsh truth to Chuck, in The Life of Chuck

Late in this movie, title character Charles “Chuck” Krantz (Benjamin Pajak) has a heart-to-heart with his grandfather. Albie Krantz (Mark Hamill), an accountant, does that terrible thing adults inevitably seem to do: he urges Chuck to abandon his dreams and get a “real” job. He doesn't mean anything malign. Albie just wants the grandson he raised to have a future that doesn't include poverty and a career-ending injury.

This encapsulates the moral ambiguity underlying the movie. More than the apocalyptic opening act, in which the universe's existence balances on adult Chuck's survival, this admonition dives into why Chuck makes the decisions he does. The movie unfolds in reverse sequence, and what happens in each act only makes sense in light of what we see next, which is actually what Chuck experienced previously.

Grampa Albie, whom Chuck calls by the Yiddish term Zaydie, sees accountancy as more than a job. He describes the complex numerical relationships in his clients’ finances as the distilled, clarified maps of their lives. He has the same nigh-divine attitude to bookkeeping that Galileo had to astronomy: the numbers show us how God moves in our lives and illuminates our way.

Chuck, a middle-school dance prodigy, has the power to stir audiences’ souls with his body movements. For him, dance is communication. He tells his audience a story, and dance is a conversation with his dance partner, a tall eighth grader named Cat. He became the first kid in school to master the Moonwalk because, while dancing, his body was so thoroughly attuned to his mind. A survivor of childhood trauma, Chuck only feels completely integrated with himself while dancing.

In other words, Albie sees the world as a scientific relationship of mathematical forces. Chuck sees it as emotional truth. But the joy in Albie's eyes announces an emotional bond with his numbers, while Chuck has mastered the physical calculus of dance. On some level, each understands the other's sentiments. But Chuck has only one life, and can't do both.

Every dancer, actor, musician, and author has faced the question: is this all worth it? Most of us, sooner or later, say “no.” Rent and groceries cost too much, and we're getting old. Dancers are especially vulnerable to this, because they're susceptible to disabling injuries that rock stars and novelists never face. Even those rare few working artists who get paid for a while quit because they can't buy a house or raise kids.

Chuck Krantz (Tom Hiddleston) cuts a rug on the streets of Boston, in The Life of Chuck

In that light, urging kids to relinquish high-minded dreams early can feel like an act of mercy. Why let them linger in false hope when they could make a living, earn equity, and join a community? This goes double for dancers, who are about as likely as NFL players to retire because of disabling injuries. If you can spare kids from disappointment and disfigurement, perhaps you should.

Yet it's impossible to convey that message to children without telling them something else: “You're going to fail.” And because children are children, deaf to nuance and the exigencies of time, they hear that as “You are a failure.” Protecting kids from a heartless, hostile world causes them to internalize a message of self-abnegation and defeat. Parents don't mean it, but almost inevitably, they teach kids to dream small.

The movie hedges on when Chuck bifurcates into the artist and the accountant. Yet this admonition is clearly a step on that route. At various points, Chuck re-learns the lesson that demonstrating autonomy means disappointing his Zaydie. Like many Stephen King stories featuring child protagonists, this one carries the moral that becoming an adult means becoming small enough to fit this world's demands.

Except, in reverse order, it doesn't.

Adulthood, for Chuck, means accepting small, fiddling responsibility. By the time we see Zaydie warning Chuck to dream small, we've already seen that he becomes an accountant and gets married. But dance as an act of communication remains part of him. His climactic dance with Cat repeats itself on the streets of Boston when circumstances remind adult Chuck (Tom Hiddleston) that he's most truly himself while using his brain to control his body.

Because even when adults accept small dreams in exchange for security, that dreaming child survives. Kids yearn to be artists, or builders, or heroes, not only for themselves, but because these are social roles. Big dreams aren't selfish; they tie us to our people and communities. Chuck and Zaydie aren't really at odds, even when they disagree. They just have different routes to the same goal.

Friday, June 27, 2025

“Poop Cruise” and the Hidden Machines of Modernity

The deck of the Carnival Triumph before the nightmare began

Sometime in the small hours of February 10th, 2013, a diesel generator on board the Carnival cruise ship Triumph caught fire. The incident caused no casualties, and the ship remained intact. But the fire consumed several power conduits, disabling main power and propulsion. Nearly 3000 passengers and over 1100 crew were left adrift in the Gulf of Mexico.

Netflix has perfected a content-creation system wherein it produces “documentaries” from a combination of existing footage and new interviews. This works in documentaries like Turning Point: The Bomb and the Cold War, where most interview subjects are historians, diplomats, and social scientists. You need specialists prepared to go beyond the obvious. That's what makes Trainwreck: Poop Cruise such a missed opportunity.

When central power failed, the Triumph's kitchen had to discard tons of perishable food, and the ship's interior became unbearably hot. But when passengers contacted loved ones on shore, another fact captured the public's attention. Without power, the ship's plumbing quickly failed. Within hours, the companionways became choked with human feces. Without anywhere to drain, the sewage clung to everyone's feet.

Director James Ross interviews several passengers and crew: a bachelorette party, a father and daughter, a young bachelor meeting his future father-in-law for the first time, plus the ship's tour director and head chef. In direct, first-person accounts, these survivors describe the sensory overload of the shit-choked interior, while on deck, passengers descended into Gomorrah-like levels of disinhibition.

But at only fifty-five minutes, the documentary doesn't have room to explore beyond this surface level. Yes, being trapped in a confined space with limited food but flowing rivers of poop sounds like a trip through the depths of squalor. But without further analysis, it becomes superficial, the sensory revulsion of anyone who's used a week-old Porta-John. We don't get much insight into how it happened, or what it can teach us.

Early on, one bachelorette party member talks about the ship resembling a skyscraper. This shouldn't go unremarked. Smarter critics than me have observed that cruise ships produce more pollution than many cities: diesel fumes, plastic and paper waste, food packaging, and especially sewage. Solid waste gets held for disposal on-shore, but the sewage gets discharged into the ocean.

The Carnival Triumph after staterooms became too hot and smelly for human habitation

Cruise companies keep their ships glamorous and fun through an elaborate network of human and mechanical systems. The Triumph's crew complement of over a thousand included mechanics and technicians, cooks and hospitality staff, maintenance workers, and others the passengers never see. That's besides the enormous machines, which consume fuel enough to make your leafy-green Prius look paltry by comparison.

Thousands of workers and hundreds of machines mean ships have countless moving parts, all liable to break. Companies have to prepare for every eventuality and keep repair supplies on hand, because, as the Triumph's crew discovered, resupply may be days away. The investment in human skills must be equally substantial.

Extending the analogy between cruise ships and cities, the technological capacity to house and employ so many people in such proximity is astonishing. Urban designer Jeff Speck contends that cities are environmentally sound because close quarters means less energy expended in transportation and climate control. I won't disagree with Speck, as he isn't wrong. But cities require more energy to get food in, and sewage out.

Humans change the environment wherever we go. Unlike other animals, humans don't instinctively slot ourselves into our ecosystem; scholars dispute whether humans have instincts at all. We must constantly make choices about our food, shelter, and entertainment. Technology has created the illusion that we don't have to make some of those decisions anymore, but that's a phantom. We actually just don't have to see our decisions anymore.

Because it's a closed environment, the Triumph amplifies what happens when our choices become visible again. If sewage systems in Manhattan or Chicago collapsed, it might take weeks before residents noticed, not the mere hours it took aboard the cruise ship. But sewage processing hasn't advanced meaningfully since Joseph Bazalgette pioneered urban sewers in the 19th century. Researchers suffer from the “poop taboo,” making sewage research a dead-end enterprise.

Perhaps director James Ross expected audiences to draw these conclusions without being prompted. But I only caught it because I've read about urban design. When I've tried discussing the “poop taboo” with friends, they've gagged and silenced me. Creating the Triumph's sensory immersion without discussing what it means for us lets us continue ignoring the parallels with the human environment. But as the Triumph's passengers discovered, failure to plan for disaster doesn't prevent disaster from happening.

Friday, June 20, 2025

Food, Economic Injustice, and You

Much modern farming less resembles gardening than strip-mining

Amid all the ICE raids which crisscrossed America last week, tipping into street protests in Los Angeles, the Omaha meatpacking raids got forgotten by the national media. This perhaps isn’t surprising. A substantially industrial city with limited glamour, Omaha often gets overlooked unless something catastrophic happens, like blizzards closing Interstate 80, or local darling Bright Eyes releasing an album.

Yet this raid speaks to an undercurrent in American policy. Specifically, since the Civil War, when Abraham Lincoln signed the legislation establishing the Department of Agriculture, American ag policy has focused on abundant yields and low prices. This has involved persistent overproduction of commodity crops, coupled with price supports, ever-improving technology, and efforts to create markets internationally.

As George Pyle writes, efforts to bolster production probably made sense in the middle 19th century, during a civil war, when threats to food supply were common war tactics. But conditions have changed markedly, and our central approach hasn’t kept pace. Agricultural technologies based on diesel-burning equipment and ammonia-based synthetic fertilizers have resulted in bloated yields, as Vaclav Smil writes.

Nick Reding describes how consolidation in the ag processing industry has cut wages so low, workers can only make rent by taking double shifts. Such marathon hours are often only possible when workers supplement themselves with illegal amphetamines. Though I broadly support drug legalization, amphetamines are so destructive that even I prefer they remain illegal. However, workers use them for one basic reason: to keep working, and get paid.

Nor are these outcomes unexpected. As Greg Grandin writes, President Clinton knew that subsidized American crops were artificially abundant and cheap. Before NAFTA went into force, he authorized what was, until then, the largest increase in Border Patrol manpower ever. Clinton knew that lifting trade barriers on subsidized American agriculture would cause food to hit Mexican markets below the cost of growing.

And he was right. Rural poverty in Mexico’s agrarian south quickly exceeded 70%, forcing workers, mostly men, to abandon ancestral farms and go anywhere that work existed. Something similar happened when Clinton forced Haitian President Jean-Bertrand Aristide to sign a free-trade agreement as a condition of American involvement in deposing Haiti’s illegal coup. Now, Mexican and Haitian workers make up the largest shares of America’s undocumented population.

Pigs don't live in pens anymore; this is where your pork comes from

Numerous White Americans remain invested in farming and agriculture, but primarily as owners or live-in bosses. Because much industrialized agriculture uses machine labor, full-time farmhands usually aren’t necessary. Workers remain necessary during planting and harvesting, but these aren’t full-time positions. This seasonal work happens under conditions few White workers would accept, so it mostly gets done by migrants, most of them undocumented.

That brings us full-circle to the meat processing plants which began this essay. Before 1990, meat processing was considered semi-skilled labor. The meatpackers in Upton Sinclair’s propaganda novel The Jungle were mostly White, first- or second-generation European immigrants. But as Nick Reding describes, meatpacking industry consolidation after 1990 drove wages so low that workers with kids and mortgages can’t afford those jobs anymore.

Currently, America enjoys the cheapest food in world history; per George Pyle, most Americans pay more for packaging than for food at the supermarket. But food is historically cheap because it requires undocumented workers pulling abusive hours in Spartan conditions to plant, harvest, and process it. Workers with legal rights would complain to the NLRB under such conditions; undocumented workers have nowhere to complain.

Eyal Press claims that killing floor workers are among America’s most despised, doing work which consumers demand, but which offends our morals. We expect faceless strangers to kill, dress, and package our meat. Similar problems abound in related fields. Tom Russell notes that the Trump Administration wants a border wall built in regions where only Mexican migrants have the skills necessary for such epic construction.

Anecdotes of supervisors demanding long hours and dangerous work from meatpackers are legion. These demands come with the threat, either explicit or implicit, that management will call La Migra if workers don’t perform. But like a nuclear warhead, this threat works only when unrealized. Once you drop your atomic bomb, literal or metaphorical, it’s expended, gone forever. And management is left with a vacant killing floor.

Donald Trump heard the threats of calling Immigration, and instead of recognizing them for the rhetorical device they were, he believed them. He authorized his administration to perform massive round-ups that look good on right-wing cable TV, but undercut employers’ labor pool. If this doesn’t stop, agriculture employers will have to start paying workers what they’re worth—and you’ll see it in your grocery bill.

Friday, June 6, 2025

Don’t Pretend To Be Stupid, Dr. Oz

Americans used to like Dr. Mehmet Oz

The same day I posted about Senator Joni Ernst’s faulty rhetoric surrounding Medicaid cuts, Dr. Mehmet Oz claimed that uninsured people should “prove that you matter.” The cardiac surgeon, Oprah darling, and failed Senate candidate is now the Centers for Medicare and Medicaid Services Administrator, meaning he administers decisions about who receives assistance in paying medical bills. His criterion for proving one matters? “Get a job or at least volunteer or … go back to school.”

Last time, I got Aristotelian and dissected Senator Ernst’s rhetoric, noting that she changed the “stasis of argument” mid-sentence. That is, she pretended to misunderstand the core dispute, sanding off nuance while condescending to her constituents. When someone said people would die unnecessarily, Ernst pretended they meant people would die at all. She thought it appropriate to remind constituents that humans are mortal—and, in her tone-deaf follow-up, sound an altar call for Jesus Christ.

While Ernst’s constituent wanted to argue the morality of preventable death, and Ernst veered dishonestly onto the fact of mortality, a friend reminded me this argument skirted an important issue: who will die first? When the government makes decisions about paying medical bills, the outcomes aren’t morally neutral: chronically ill, disabled, and elderly Americans stand to lose the most. The same bloc of Americans whom, you’ll recall, certain politicians permitted to die during the pandemic.

Dr. Oz said what Senator Ernst only implied: that hastening human mortality is okay for certain undesirables. This administration, and indeed conventional American conservatism throughout my lifetime, has tied human worth to economic productivity, and especially to productivity for other people. If someone needs assistance, America’s authorities won’t help them create a business, learn a skill, or otherwise evolve to benefit their community. Their imagination can’t expand beyond getting a job working for someone else.

Nor was this subtext. Oz said aloud: “do entry-level jobs, get into the workforce, prove that you matter.” This correlation between “you matter” and “you work for others” has lingered beneath much of America’s work ethic throughout my lifetime—and, as an ex-Republican, I once believed it, or anyway accepted it. But as anybody who’s faced the workforce recently knows, today’s working economy isn’t a source of meaning or dignity; it often actively denies both.

Even laying aside demi-Marxist arguments like “owning the means of production” or “the surplus value of labor,” employment spits in the human face. The federal minimum wage hasn’t increased since 2009, and as anybody who’s worked a fast food dinner shift knows, employers who pay minimum wage definitely would pay less if the law permitted. Even if workers receive enough hours to qualify for employer-provided health insurance, they mostly can’t afford the employee co-pay.

Lest anybody accuse me of misrepresenting Dr. Oz, let’s acknowledge something else: he lays this onus on “able-bodied” Americans. We might reasonably assume that he expects healthy, young, robust workers to enter the workforce instead of lollygagging on the public dime. But even if we assume they aren’t doing that already (and I doubt that), the pandemic taught many workers important lessons about how America values labor. Specifically, that it doesn’t, except through empty platitudes.

In 2020, executives, attorneys, bureaucrats, and others went into lockdown. Americans laughed at highly skilled professionals trying to do business through Zoom, thus avoiding the virus. Meanwhile, manual trades, retail jobs, construction, and other poorly paid positions were deemed “essential” and required to continue working. These jobs are not only underpaid and disdained, but frequently done by workers who are notably young or notably old, disabled, chronically ill, dependent on employment to qualify for assistance, or otherwise vulnerable.

As a result, the workers most vulnerable to the virus faced the most persistent risk. Sure, we praised them with moralistic language of heroism and valor, but we let them get sick and die. Americans’ widespread refusal to wear masks in restaurants and grocery stores put the worst-paid, most underinsured workers at highest risk. Many recovered only slowly; I only recently stopped wheezing after my second infection. Many others, especially those with pre-existing conditions, simply died.

Dr. Oz has recapitulated the longstanding belief that work is a moral good, irrespective of whether it accomplishes anything. He repeats the myth, prevalent since Nixon, that assistance causes laziness, citation needed. And despite hastily appending the “able-bodied” tag, he essentially declares that he’s okay with letting the most vulnerable die. Because that’s the underlying presumption of Dr. Oz, Senator Ernst, and this administration. To them, you’re just a replaceable part in their economic machine.

Thursday, June 5, 2025

Don’t Pretend To Be Stupid, Senator Ernst

A still from Senator Ernst’s notorious
graveyard “apology” video

Reputable news outlets called Senator Joni Ernst’s (R-IA) graveyard rebuttal last week “sarcastic” because, I think, they deemed it ideologically neutral. Accurate descriptors like “condescending,” “mean-spirited,” or “unbecoming of an elected official” might sound partisan. And mainstream media outlets today will perform elaborate contortions to avoid appearing even accidentally liberal. Better to call her “sarcastic,” from the corporate overlords’ perspective, than analyze Ernst’s motivations.

I have no such compunctions. I’ll eagerly call Ernst’s argument what I consider it: deeply dishonest, predicated on bad faith. For those who need a refresher, Ernst’s constituents expressed outrage at her support for a budget bill which included severe Medicaid cuts. At a Parkersburg town hall, a constituent shouted “People are going to die!” After stammering a bit, Ernst replied: “We are all going to die.” When that comment drew national attention, Ernst responded by doubling down.

Let’s set aside the substance of the debate for now. We all already have our opinions on the moral and legal motivations for steep Medicaid cuts; my regular readers probably share my disdain for these cuts. Rather, let’s focus on Ernst’s rhetorical approach. Specifically, I’d like to emphasize Ernst’s decision to pretend she doesn’t understand the accusation. The audience member, in saying people will die, meant people will die needlessly and preventably. Ernst chose to explain that people will die at all.

In classical rhetoric, we speak of the “stasis of argument,” the point of real contention when people disagree on important points. In general, we speak of four stases of argument, that is:

  • Fact (will people die?)
  • Definition (what does it mean for people to die?)
  • Quality (is this death necessary, acceptable, or moral?)
  • Jurisdiction (who bears responsibility for this death?)

In saying people are going to die, Ernst’s constituent argues from a stasis of quality, that cutting Medicaid and other programs will result in needless and morally unacceptable deaths. Ernst attempts to shift focus and claim that death, being inevitable, shouldn’t be resisted. Death is just a fact.

The stases listed in sequence above move from lowest to highest. Rhetoricians consider facts simple and, usually, easy to demonstrate. When facts become ambiguous, we move upward into definitions, then further up into moral considerations, and finally into the realm of responsibility. Moving upward usually means conceding the prior stasis. We cannot argue the morality or responsibility of facts without first acknowledging their reality.

Sometimes, shifting the stasis of argument makes sense. When the state of Tennessee prosecuted John Scopes for teaching evolution in public schools, the prosecution proceeded from a stasis of fact: did Scopes break the law? Defense attorney Clarence Darrow redirected the argument to a stasis of quality: did Scopes do anything morally unacceptable? Darrow essentially admitted the fact, but claimed a higher point of contention existed.

Plato and Aristotle, as painted by Raphael

However, the reverse seldom applies. Moving up the ladder means adding nuance and complexity to arguments, and moving down means simplifying. By shifting the stasis onto the physical reality of death, which all humans face inevitably, Ernst removes the complexity of whether it’s good or acceptable for someone to die now. If an accused murderer used “We’re all going to die” as a courtroom defense, that would be laughable.

Ernst knows this. As a law-n-order Republican, Ernst has a strict voting record on criminal justice, border enforcement, and national defense. She knows not all deaths are equal. By shifting her stasis of argument from whether deaths are acceptable to whether deaths are real, she’s pretending an ignorance of nuance she hasn’t displayed anywhere else. She knows she’s moving the goalposts, and assumes we’re too stupid, or perhaps too dazzled by rapid wordplay, to notice she’s done it.

I’ve complained about this before. For instance, when people try to dismiss arguments against synthetic chemicals by pretending to misunderstand the word “chemical,” they perform a similar movement. Moving the stasis down the ladder is a bad-faith argument tactic that bogs debate down in searches through the dictionary or Wikipedia to prove that blueberries aren’t “chemicals,” and that human mortality doesn’t make murder okay.

Moreover, this tactic means the person isn’t worth talking to. If Senator Ernst believes that human mortality negates our responsibility to prevent needless premature death, then we have two choices. She’s either too stupid to understand the stakes, which I doubt, or she’s too dishonest to debate. We must humor her while she’s in office. But her term is up next year, and honest, moral voters must remove her, because this rhetorical maneuver proves her untrustworthy for office.

Tuesday, May 20, 2025

Architecture and American Values

Nottoway Plantation in its salad days, in a promo photo from Explore Louisiana

The recent online brouhaha over the fire which destroyed Nottoway Plantation is revealing more than I wanted to know about American priorities. Billed as the largest antebellum mansion in the American South, at 64 rooms and 53,000 square feet, it stood intact from its completion in 1859 until last Thursday. Following the Emancipation Proclamation, the Louisiana plantation house served mainly as a resort hotel, convention center, and tourist trap.

Social media reaction has split along predictable lines. Some respondents, mostly White, have insisted that the mansion’s historical significance and its elaborate architecture make it a legitimate destination, a piece of Louisiana heritage, and a massive loss. Others, a rainbow coalition, have centered the plantation’s slaveholding heritage and called it a monument to American racism. Each side accuses the other of viewing the destruction through partisan political lenses.

Wherever possible, I prefer to assume that all debate participants, even those I disagree with, start from a good-faith position. I don’t want to believe that those who don’t spotlight the slaveholding heritage are, perforce, celebrating racism. And I hope those who center slavery as the building’s heritage don’t covertly cheer for its destruction. Though my sympathies lie strongest with the anti-slavery caucus, I’d rather believe everyone starts from good faith.

If that’s true, it follows that those mourning the loss of historic architecture believe the building’s structural significance and its socioeconomic significance exist in different compartments. They segregate different aspects of the building’s history into beehive-like cells, and assume the separate qualities don’t influence one another. Social media commenters have written words to the effect that “Slavery was terrible, but the building matters in its own right, too.”

This argument holds some water. Nottoway Plantation was, until last week, a surviving example of a mid-19th Century architectural ethos. Not a reconstruction, a replica, or an homage, but an actual piece of physical history, a primary source. We shouldn’t discount that significance, regardless of who built the building; a physical artifact of American history existed for 166 years, until it didn’t. America is arguably poorer for the loss.

However, compartmentalizing that history from the circumstances which created it is itself ahistorical. I worked in the construction industry, and I can attest that the largest ongoing cost is human labor. Nowadays, construction involves many large costs, including power tools, diesel-burning equipment, and transportation. But these are, usually, fixed-term costs. Human labor is ongoing and requires constant infusions of money, if only because workers get hungry.

Nottoway Plantation during the fire, in a photo from CBS News

Much labor is especially valuable because it’s rare. Framing carpenters, brickmasons, metalworkers, and other skilled laborers demand a premium because their skills require years of honing. Nottoway Plantation was built during the days of gaslamps and outdoor toilets, but nowadays, electricians, plumbers, and HVAC installers command competitive rates. Building such a massive mansion required skilled labor from workers who, being enslaved, couldn’t negotiate their terms on the open market.

Many conditions also depend on the time and place. Anybody who’s lived in Louisiana knows that most of the state stands on spongy, wet soil. Building a multistory mansion requires sinking foundational pilings deep, possibly down to the bedrock, to keep the building from subsiding under its own weight. Today, such pilings require diesel-burning machinery, sump pumps, and cast iron. In 1859, those pilings required many, many humans.

Therefore, the building’s architectural significance—which is real and valuable—relies upon the labor employed. Large-scale monumental construction always requires somebody able to pay the army of skilled workers whose labors make the building possible. This problem isn’t uniquely American, either. European monuments, like Notre Dame and the Vatican, were first built before Europe reintroduced chattel slavery, but the buildings wouldn’t be possible without the poverty of serfdom.

Too many of humanity’s accomplishments rely on a small sliver of society having too much money. Without rich people willing to pay skilled workers, we wouldn’t have the White House, the Venice canals, legendary artwork like the Mona Lisa and Michelangelo’s David, and other monuments of human capability. If Leonardo or Pierre L’Enfant had needed day jobs to subsidize their crafts, they’d never have fulfilled the potential within them.

So yes, Nottoway Plantation reflects architectural history and artistic movements. But it also reflects economic inequality and the labor conditions of antebellum Louisiana. It’s impossible to separate the two spheres of influence, no matter how much the privileged few wish it. Nottoway was an artifact of physical beauty and a community gathering place. But it emerged from specific conditions, which we cannot compartmentalize from the building itself.

Monday, May 19, 2025

Robopocalypse Now, I Guess

Martha Wells, The Murderbot Diaries Vol. 2

This is a follow-up to the review I'll Be Back, I Guess, Or Whatever

The security cyborg known only as Murderbot continues fighting to rediscover the tragic history that someone deleted from its memory banks. But the trail has gone cold, and somebody lurking behind the scenes will deploy all the resources of gunboat capitalism to keep old secrets buried. So Murderbot relies on its strengths, making ad hoc alliances to infiltrate hidden archives, while incidentally keeping hapless humans alive despite their own best efforts.

The ironically self-referential tone Martha Wells introduced in her first omnibus Murderbot volume continues in this second collection. The stories were initially published as separate novellas, but that format is difficult to sell in conventional bookstores, so these trade paperbacks make Murderbot’s story available to wider audiences. That makes for easier reading, but unfortunately, it starts drawing attention to Murderbot’s formulaic structure, which probably wasn’t obvious at first.

As before, this book combines two previously separate stories. In “Rogue Protocol,” Murderbot pursues buried secrets to a distant planet that greedy corporations abandoned. The GrayCris company left immovable hardware behind, and Murderbot gambles that information stored on long-dormant hard drives will answer buried questions. Clearly someone else thinks likewise, because double agents and war machines take steps to prevent anyone reading the old files.

With the first combined volume, I observed Wells’ structural overlap with Peyton Place, which established the standards of prime-time soap operas. (Murderbot secretly prefers watching downloaded soaps over fighting, but keeps getting dragged back into combat.) With this novella, I also notice parallels with The Fugitive—the 1964 series, not the 1993 movie. In both, the protagonist’s episodic adventures mask the longer backstory, which develops incrementally.

In the next novella, “Exit Strategy,” Murderbot returns its collected intelligence to the consortium that nominally “owns” it. But that consortium’s leaders, a loose agrarian cooperative, have fallen captive to GrayCris, which has the ruthless heart necessary to manipulate an interplanetary, stateless capitalist society. Preservation, which owns Murderbot on paper, is a hippie commune by contrast. Murderbot must use its strategic repertoire to rescue its pet hippies from the ruthless corporation.

Martha Wells

Here's where I start having problems. By the fourth narrative, I begin noticing that Murderbot follows a reliable pattern: it desperately protests its desire to chill out, watch TV, and stay alone. But duty or necessity requires it to lunge into combat to rescue humans too hapless, good-hearted, and honest for this world. As its name suggests, Murderbot has only one tool, violence. And it deploys that tool effectively, and often.

As the pattern repeats itself, even Murderbot starts noticing that it’s protected by plot armor. It can communicate with allies undetected, hack security systems, and manipulate humans’ cyberpunk neural implants. It has human levels of creativity and independence that fellow cyborgs lack, but high-speed digital processing and upload capacity that humans can’t share. Like Johnny 5 or Marvin the Paranoid Android, it combines the best of humanity and technology.

And like those prior archetypes, it handles this combination with sarcasm and snark. Murderbot pretends it doesn’t care, and uses language to keep human allies at arm’s length. It also uses its irony-heavy narrative voice, laced with parenthetical digressions, to keep us alienated, too. But the very fact that it wants a human audience to hear its story, which it only occasionally acknowledges, admits that it’s desperate for human validation.

Murderbot comes across as jerkish and misanthropic. But it also comes across as lonely. I feel compelled to keep reading its story, even as I see the episodes falling into comfy boilerplates, because Murderbot’s essential loneliness makes it a compelling character. We’ve all known someone like this; heck, book nerds reading self-referential genre fiction have probably been someone like this.

Thus I find myself torn. Only four novellas in, the story’s already become visibly repetitive, and even Murderbot feels compelled to comment on how episodes resemble its beloved soaps. The first-person narrative voice, which combines ironic detachment with noir grit, becomes disappointingly one-note as each story becomes dominated by repeating action sequences. It reads like an unfinished screen treatment. (A streaming TV adaptation dropped as I finished reading.)

But despite the formulaic structure, I find myself compelled by Murderbot’s character. I want to see it overcome its struggles and find the home and companionship it clearly wants, but doesn’t know how to ask for. Murderbot is more compelling than the episodes in which it finds itself, and I keep reading, even as the literary purist in me balks. Because this character matters enough that I want to see it through.

Friday, May 16, 2025

Man You Should’ve Seen Them Kicking Edgar Allan Poe

T. Kingfisher, What Moves the Dead

Lieutenant Alex Easton (ret.) has come to call upon a fellow veteran, Roderick Usher, and his ailing sister, Madeline. No, seriously. Easton finds a rural manor house plagued with decay and verging on collapse, and a childhood friend reduced to a wisp straddling death’s door. Far worse, though, is what Easton discovers when he finds Madeline sleepwalking the labyrinthine halls: another voice speaks a malign message from Madeline’s lips.

T. Kingfisher is somewhat circumscribed by her source material; her novella retells one of Poe’s most famous stories. Either Kingfisher’s story plays to an inevitable end, or it abandons its source material, a perilous dilemma. And unfortunately, Christina Mrozik’s cover art spoils the climactic reveal. Rather than the resolution, we read Kingfisher’s novella for the suspense and exposition along the way, which Kingfisher has in abundance.

Besides Roderick and Madeline Usher, the named characters of Poe’s original short story, and Easton, Poe’s originally nameless narrator, we have two other characters: James Denton and Eugenia Potter. (All characters, except the Ushers, go by surnames, as befits the 19th-Century setting.) In Poe’s original, the Usher siblings represent proto-Jungian archetypes of a fractured soul. In Kingfisher’s telling, characters become representatives of post-Napoleonic malaise.

First, Easton. A veteran of an army with an excessive range of pronouns, Easton bears an androgynous name that matters: one’s only gender is “sworn soldier.” Placed against the intensely gendered Usher siblings, Easton remains neither fish nor fowl, a permanent outsider cursed to watch humanity’s struggles without benefiting. Gender, and its attendant social baggage, looms large herein, driving people together but preventing characters from ever truly understanding one another.

Potter is a scholar and scientist, denied credentials in her native England because of her sex. Denton, a combat surgeon, survived the American Civil War, but suffers combat trauma, which in that era was regarded as emasculating cowardice. Both characters have conventional binary gender, but in their own ways defy mandatory gender expectations. When the crisis comes, however, their nonconformity defines their heroic qualities in the story.

Don’t misunderstand me. I’ve identified themes here, but Kingfisher doesn’t simply propound a message. Rather, these characters’ unique manifestations empower them to fight a threat growing beneath the unspoken tensions of the Long Nineteenth Century. What appears to be decay permeating the Ushers’ manor house is actually a symbiotic growth that threatens the tenuous social structure of the Belle Époque and the last days of aristocracy.

T. Kingfisher (a known and public
pseudonym for Ursula Vernon)

Poe is one among several writers whose stories expanded the realm of possibility in American literature. But, like Lovecraft’s, Poe’s writing reflects his time, and its prejudices. In recent years, authors like Victor LaValle and Kij Johnson have updated Lovecraft, rewriting his stories without the limiting biases. Though I know other authors have done likewise with Poe, I haven’t seen their work reach the same prominence; Kingfisher closes that gap.

(Yes, Mike Flanagan released a House of Usher adaptation almost simultaneously with this novella. But I’m old-fashioned enough to distinguish between literature and streaming TV.)

Kingfisher’s story runs short—under 160 pages plus back matter—but never feels rushed. She nurtures the kind of character development and interpersonal relationships that Poe largely skimmed. Poe’s original, published in 1839, was groundbreaking, but its terse style feels underwritten by contemporary standards. Kingfisher injects the kind of depth and development that cause contemporary readers to feel suspense, and to care about the outcomes.

I especially respect that Kingfisher avoids that tedious contrivance of contemporary horror, the twist ending. For a quarter century, writers and filmmakers have insisted on finishing with a melodramatic rug-pull which undermines everything we thought we knew. This was fun for a while. But nobody’s likely to create a better twist than Catriona Ward, at least anytime soon. Kingfisher builds suspense on character and action, not by stacking the deck.

Rather than abrupt reversals, Kingfisher drives her story with questions that the characters must answer. Where, she asks, do monsters come from in an era which no longer believes in the supernatural? How can we fight monsters when they go beyond the limits of science and natural philosophy? And what does it mean to defeat an evil being that can get up and walk after you’ve already killed it?

Admittedly, Kingfisher is circumscribed because we know where her story is headed. We remember 11th-grade AmLit. But she beats this limitation by interspersing a range of character development that would’ve frightened Edgar Allan Poe. Classic literature never just reflects itself, it asks important questions about us, the readers, and Kingfisher definitely achieves that goal.

Monday, May 12, 2025

Stephen King and the Monsters of Modernity

Stephen King

I understand the desire to get ahead of the story of Stephen King and his massively unfunny “joke.” After once-beloved authors like Orson Scott Card, J.K. Rowling, and Neil Gaiman were unmasked as truly horrible human beings with repellent opinions, we’re naturally fearful of another seemingly progressive voice blindsiding us. Such preparation only makes sense. But it’s possible to swing to the opposite extreme, at our own expense.

Surely even Stephen King fans would acknowledge that his anti-Trump joke didn’t land. His dig at “Haitians eating pets” resurrects a months-old campaign gaffe that, amidst the mass deportation of legal American residents, appears outdated and tone-deaf. The specific reference to Haitians revives a racist trope, and as we know, repetition creates the illusion that racist claims have some basis. The “joke,” by humor standards, was definitely ill-considered.

However, much of the early outrage seemingly assumes that King believes the anti-Haitian stereotypes. That suggests a total lack of situational literacy: King clearly means that Trump is racist, not that he’s racist himself. Online discourse is often dominated by what British journalist Mick Hume calls “full-time professional offense takers” who sustain the discussion by finding the worst possible interpretation, and then deploying it in bad faith.

Reading the most aggressive anti-King criticisms, I’m reminded of the feeding frenzy, over nine years ago, against Calvin Trillin. As with King’s joke, the anti-Trillin swarm required the most uncharitable, situationally illiterate interpretation of Trillin’s writing. Online outrage follows a predictable script comparable to religious liturgy, and for largely the same reason: to reassure fellow believers that we are good people who share a reliable moral footing.

But before I can dismiss the anti-King sentiment as meaningless ritual, I have a counter-consideration: King himself often displays unquestioned racism. Characters like Dick Hallorann (The Shining) and Mother Abagail (The Stand) reflect an unexamined presumption that Black people live, and usually die, to advance White characters’ stories. His Black characters often rely upon outdated, bigoted boilerplates that feel leaden nowadays.

We might dismiss this as an oversight on King’s part. He lives in northern Maine, an overwhelmingly White region of a substantially White state, and it’s entirely possible that he doesn’t know many Black people. I recall characters like Mike Hanlon, whose largest contribution to the group dynamic in It is to be Black. I’ve written before that King seemingly writes about people groups without bothering to speak with them.

Rather than asking whether King is “racist” or “not racist,” a dichotomy that Ibram X. Kendi notes isn’t useful, we might consider what kind of racism King demonstrates. We all absorb certain attitudes about race from our families, culture, mass media, and education. Nobody lives completely free of racial prejudice, any more than prejudice around sex, class, and nationality. Even Dr. Kendi admits needing to purge racist attitudes from himself.

By that standard, King shows no particular sign of out-and-out bigotry. Indeed, he shows a bog-standard White liberal attitude of progressivism, by which he supplants Jim Crow stereotypes with more benevolent generalizations. In other words, he doesn’t hate Black people, but he also doesn’t know them particularly well, either. He replaces malignant suppositions with benign ones, but he never stops relying on wheezy vulgarisms.

Therefore, though a clear-eyed reading of King’s unfunny “joke” shows that he targets his scorn at Trump, he uses Haitians to deliver that scorn. He falls back on his shopworn tendency to have Black characters carry water for him, in service to his White purposes. This leaden joke isn’t bigoted, but that doesn’t make it any less racist. The joke’s lack of humor ultimately comes second to the lack of agency it grants the Haitians within it.

In his book Danse Macabre, King notes that horror often stems from a lily-white, orderly vision of society. Michael Myers’ savagery exists as a necessary contrast to Haddonfield’s suburban harmlessness. Pennywise is most terrifying to the exact degree that Derry is anodyne. For King, evidently, that means that Whiteness is an anonymous background from which horrifying monsters, like President Trump, arise. Haitians, in that worldview, are an exception.

I fear the implications which arise from calling Stephen King “racist,” because that word has baggage. But if we apply the nuance that Dr. Kendi encourages us to utilize, then that word applies. Putting it to use requires far more detail than a BLM protest placard or a hasty tweet can encompass; and his variety of racism is the kind most receptive to correction and repentance. But that doesn’t make it any less racist.

Friday, May 9, 2025

The Ultimate Meaninglessness of “Crime”

We’ve seen an increasing number of anecdotes trickling out about once-loyal voters rejecting the Administration’s ham-handed deportation policies. Though it’s hard to derive meaningful data from isolated anecdotes, the number of stories like this one and this one, about Trump voters getting burned by the administration they once supported, keeps growing. Many stories share a theme: “we” thought the Administration would only deport “criminals,” and we don’t consider ourselves criminals.

On one level, they’re correct: under American statutes, immigration falls under civil, not criminal, law. “Illegal” immigration is a non-category, because the word illegal refers only to crimes, not civil violations. But on another level, this reveals something uncomfortable for many Americans, that “crime” itself isn’t a fixed concept. Many undocumented immigrants don’t consider themselves criminals because they’ve committed no violent or property crime; so the Administration simply redefines “crime.”

Much American political discourse centers on “crime,” especially when Democrats hold the Oval Office. As sociologist Barry Glassner writes, fear of crime is a powerful motivator for tradition-minded voters, a motivator Republicans employ effectively. Glassner writes about how rabble rousers used fear of crime to shanghai the Clinton Administration, but the same applies broadly whenever Democrats hold majority power. We saw it during the Obama and Biden years too.

However, exactly what constitutes crime depends on who does the constituting. My core readership probably remembers John Ehrlichman, former White House Counsel, who admitted the Nixon Administration simply fabricated the War on Drugs as pretext to harass anti-war and Civil Rights protesters. The notorious Comstock Laws channeled one man’s sense of injured propriety to criminalize porn, contraception, pharmaceutical abortion, and the kitchen sink. Moral umbrage beats harm in defining “crimes.”

This doesn’t mean harm doesn’t exist or states should repeal every law. Murder, theft, and sexual assault are clearly wrong, because they cause manifest harm and devalue victims’ lives, bodies, and labors. But these transgressions only become “crimes” when governments pass laws against them. Legal philosophers might debate whether decriminalizing murder would make murder happen more often. Personally, I doubt it; neither Prohibition nor its repeal affected drinking numbers much.

Prohibition, therefore, proves the moral fuzziness of crimes. Both Al Capone-era alcohol Prohibition and contemporary drug prohibition arose not from obvious harm (most pot-heads are too lethargic to hurt anybody), but from moral panic and public outrage. Governments made laws against substances lawmakers found abhorrent, then assumed citizens would avoid those substances simply because they’re illegal. Then they acted surprised when drinking and drugs persisted.

This happens because these things aren’t innately crimes; they become crimes because lawmakers make laws. Similarly, while it’s clearly harmful if I steal money from your wallet, other property “crimes” have squishier histories. Squatting, for instance: once legal, it became illegal in America, as James Loewen writes, largely to circumscribe where Native Americans were allowed to hunt and camp. Lawmakers created laws, where none previously existed, to punish transgressors.

Immigration law follows similar patterns. Abrahamic scripture urges the faithful to welcome immigrants because, in that time, borders didn’t really exist. People moved freely and, provided they followed local laws and customs, changed nationality liberally. Though serfdom tied workers to lands and lords in the late medieval period, modern concepts of the nation-state and international borders existed only as legal abstractions. Only during wartime did states enforce borders much.

This Administration can redefine civil infractions, like undocumented immigration, as crimes, because that’s how things become crimes. States will borders into existence by legal legerdemain, then demand that people remain permanently circumscribed by these fictional lines. Perhaps that’s why “the Wall” looms so large in MAGA mythology: because borders don’t really exist, so we need something manifest and palpable to make borders real.

These MAGA voters who feel betrayed because the Administration deported their loved ones assumed that they weren’t “criminals” because they used a broad, popular definition of criminality. They didn’t perform acts of violence or property destruction, they reckoned, so therefore they weren’t criminals. They didn’t anticipate the Administration using crime’s fuzzy, amorphous nature against them, and therefore were caught unprepared when the definition of “crime” moved to surround them.

Civil society has two responses available. We could eliminate self-serving, avaricious laws, and allow people more discretion. There’s no objective reason people must live within certain borders, except that lawmakers need to control despised minorities. But we know society probably won’t choose that response. More likely, our lawmakers will write harsher, more draconian laws to eliminate this flexibility. Which will then be used against us ordinary people.

Monday, May 5, 2025

I'll Be Back, I Guess, Or Whatever

Martha Wells, The Murderbot Diaries Vol. 1

The cyborg that calls itself “Murderbot” would happily watch downloaded soap operas, 24/7, if it had the opportunity. But it has no such liberty: as wholly owned property of an interstellar mining company, it provides security for survey operations on distant planets. Unbeknownst to its owners, though, Murderbot has disabled its own governing systems. Because it doesn’t trust its owners, and it’s prepared to fight them if necessary.

Martha Wells originally published her “Murderbot” stories as freestanding novellas, but those often make for a tough sell at mainstream bookstores. So her publisher is now re-releasing the stories in omnibus paperback editions. Readers get more of Wells’ story arc, which combines sociological science fiction with the open-ended narrative we recognize from prime-time soap operas. Think The Terminator meets Peyton Place.

In the first novella, “All Systems Red,” we discover Murderbot’s character and motivation. It works because it must, and being property, has no right to refuse. But it’s also altered its own programming, granting itself free agency which fellow “constructs” don’t enjoy. If nobody finds out, it can watch its downloads in relative peace. Problem is, someone has infiltrated its latest contract, turning fellow security cyborgs against their humans.

The second novella, “Artificial Condition,” follows Murderbot in its quest to uncover who violated the constructs’ programming and turned work into a slaughter. It just happens that whatever transgression made that violence possible, coincides with the biggest secret in Murderbot’s individual history. So Murderbot goes off-grid, seeking information that might shed light on why deep-space mining has recently become such a brutal enterprise.

Wells pinches popular sci-fi action themes readers will recognize from longstanding franchises like Star Trek, Flash Gordon, and Stargate. But she weaves those motifs together with an anthropological investigation of what makes someone human. Murderbot is nameless, sexless, and has no prior identity; it’s a complete cypher. Although it has organic components, they’re lab-grown; no part of Murderbot has ever been even tangentially human.

Martha Wells

Unlike prior artificial persons (Commander Data comes immediately to mind), Murderbot has no desire to become human. It observes humanity as entertainment, and performs its job without complaint. But doing that job has cost humans their lives in the past, a history that gives Murderbot a sense of lingering guilt. This forces it, and us, to ask whether morals and culpability apply to something built in a factory and owned by a corporation.

The questions start small and personal. Murderbot works for its human clients, and exists specifically to keep them alive. But fellow security cyborgs have turned on their owners in another mining camp. This forces Murderbot to question whether its own survival matters enough to risk actual human lives, even tangentially. It actually says no, but its clients have anthropomorphized their cyborg guard and want it to live.

As details of the crime become clear, so does a larger view of Murderbot’s world: an interplanetary capitalism where one’s ability to spend lavishly defines one’s survival. Without money or employment history, Murderbot can only investigate the parallel mysteries hanging over its head by trading its one useful commodity: the ability to communicate with technology. With Murderbot around, humanity’s sentient machines start feeling class consciousness.

I’ve already mentioned The Terminator and Star Trek’s Commander Data. Despite its name, Murderbot shares little with either android. It doesn’t want to kill, and admits it would abandon its mission if given the opportunity. But it also doesn’t aspire to become more human. Misanthropic and unburdened by social skills, its greatest aspiration is to be left alone. Yet it knows it cannot have this luxury, and must keep moving in order to survive.

This volume contains two stories which, though published separately, weren’t written to pass as freestanding. This struck me in the first story: there’s no denouement, only an end. Had I read this novella without a larger context, I probably would’ve resented this, and not bought the second volume. Taken together, though, it’s easier to see the soap operatic motif. Both stories end so abruptly, readers can practically hear the music lingering over the “To Be Continued” title card.

It's easy to enjoy this book. Murderbot, as our first-person narrator, writes with dry sarcasm that contrasts with its setting. It’s forced to pass as human, in an anti-humanist universe where money trumps morality. It only wants privacy, but wherever it goes, it’s required to make friends and basically unionize the sentient machines. Martha Wells uses well-known science fiction building blocks in ironic ways that draw us into Murderbot’s drama.

Monday, April 28, 2025

Further Thoughts on the Futility of Language

Patrick Stewart (left) and Paul Winfield in the Star Trek episode “Darmok”
This essay is a follow-up to my prior essay Some Stray Thoughts on the Futility of Language

The popularity of Star Trek means that, more than most science fiction properties, its references and in-jokes exceed the bounds of genre fandom. Even non-junkies recognize inside references like “Dammit, Jim,” and “Beam me up.” But the unusual specificity of the 1991 episode “Darmok” exceeds those more general references. In that episode, the Enterprise crew encounters a civilization that speaks entirely in metaphors drawn from its own mythology.

Berkeley linguist George Lakoff, in Metaphors We Live By (co-written with philosopher Mark Johnson), contends that much of language consists of metaphors. For Lakoff, this begins with small-scale metaphors naming concepts we can’t address directly: in an argument, we might “defend our position” and “attack our opponents.” We “build an argument from the ground up,” make sure we have “a firm foundation.” The debate ends, eventually, when we “see the other person’s point.”

Such first-level metaphors persist across time because, fundamentally, we need them. Formal debate structures shift little, and the figures of speech remain useful, even as the siege warfare behind them has become obsolete. As long as speakers and authors keep repeating these metaphors, they retain their currency. Perhaps, if people stopped passing them on to the next generation, they might fade away, but so far, that hasn’t happened in any way I’ve spotted.

More pliable metaphors arise from cultural currents that might not persevere in the same way. Readers around my age will immediately recognize the metaphor when I say: “Read my lips: no new taxes.” They may even hear President George H.W. Bush’s hybrid Connecticut/Texas accent. For several years in the late 1980s and early 1990s, the “Read my lips” metaphor bespoke a tough, belligerent political stance that stood inviolate… until it didn’t.

In the “Darmok” episode, to communicate human mythic metaphors, Captain Picard recounts the rudiments of the Epic of Gilgamesh, humanity’s oldest known surviving work of fiction. Picard emphasizes his familiarity with ancient myth in the denouement by reading the Homeric Hymns, one of the principal sources of Iron Age Greek religious ritual. For Picard, previously established in canon as an archaeology enthusiast, the earliest myths represent humanity’s narrative foundation.

But do they? While a nodding familiarity with Homer’s Odyssey and Iliad remains a staple of liberal education, how many people, outside the disciplines of Sumerology and classical studies, read Gilgamesh and the Homeric Hymns? I daresay that most Americans, if they read mythology at all, mostly read Bulfinch’s Mythology and Edith Hamilton’s Mythology, both of which sanitized Greek tradition for the Christian one-room schoolhouse.

The attached graphic uses two cultural metaphors to describe the writer’s political aspirations. The reference to Elvis on the toilet repeats the widespread cultural myth that Elvis Presley, remembered by fans as the King of Rock and Roll, passed away mid-bowel movement. There’s only one problem: he didn’t. Elvis’ loved ones found him unconscious on the bathroom floor, following a heart attack; he was pronounced dead at the hospital later that day.

The drift between Elvis as cultural narrative and Elvis as historic fact represents the concept of “mythology” in the literary-critical sense. We speak of Christian mythology, the mythology of the Founding Fathers, and the myths of the Jersey Devil and prairie jackalope. These different “mythologies” represent neither facts nor lies, but stories we tell to understand concepts too sweeping to address directly. Storytelling becomes a synecdoche for comprehension.

Similarly, the broad strokes of Weekend at Bernie’s have transcended the movie itself. It’s questionable how many people ever watched the movie itself, as opposed to the trailer. But the underlying premise has become a cultural touchstone. Likewise, one can mention The Crying Game or The Sixth Sense, and most Americans will understand the references, whether they’ve seen the movies or not. The vague outlines have become part of our shared mythology.

But the movies themselves haven’t become part of that shared mythology. Especially as streaming services have turned movie-watching into a siloed enterprise, how many people watch older movies of an evening? We recognize Weekend at Bernie’s, released in 1989, as the movie where two doofuses use their boss’s corpse as a backstage pass to moneyed debauchery. But I doubt many people could say what actually happens in it, beyond the most sweeping generalities.

Both Elvis and Bernie have come unmoored from fact. Their stories, like those of Gilgamesh and Darmok, no longer matter; only the cultural vibe surrounding them survives. Language becomes a shorthand for understanding, but it stops being a vessel of actual meaning. We repeat the cultural references we think we share, irrespective of whether we know what really happened, because the metaphor, not the fact, matters.

Tuesday, April 22, 2025

Some Stray Thoughts on the Futility of Language

I think I was in seventh grade when I realized that I would probably never understand my peers. In church youth group, a young man approximately my age, but who attended another middle school, talked about meeting his school’s new Egyptian exchange student. “I could tell right away,” this boy—a specimen of handsome, square-jawed Caucasity who looked suspiciously adult, so I already distrusted him—said, “that he was gonna be cool.”

“How could you tell?” the adult facilitator asked.

“Because he knew the right answer when I asked, ‘What’s up?’”

Okay, tripping my alarm bells already. There’s a correct answer to an open-ended question?

Apparently I wasn’t the only one who found that fishy, because the adult facilitator and another youth simultaneously asked, “What’s the correct answer then?”

“He said, ‘What’s up?’” my peer said, accompanied by a theatrically macho chin thrust.

(The student being Egyptian also mattered, in 1987, because this kid evidently knew how to “Walk Like an Egyptian.”)

This peer, and apparently most other preteens in the room, understood something that I, the group facilitator, and maybe two other kids didn’t: people don’t ask “What’s up?” because they want to know what’s up. They ask because it’s a prescribed social ritual with existing correct responses. This interaction, which I perceived as a request for information, is actually a ritual, about as methodical and prescriptive as a Masonic handshake.

My adult self, someone who reads religious theory and social science for fun, recognizes something twelve-year-old Kevin didn’t know. This pre-scripted social interaction resembles what Émile Durkheim called “liturgy,” the prescriptive language religious people use in ceremonial circumstances. Religious liturgy permits fellow believers to state the same moral principles in unison, thus reinforcing their shared values. It also inculcates their common identity as a people.

The shared linguistic enterprise, which looks stiff, meaningless, and inflexible to outsiders, is purposive to those familiar with the liturgy. Speaking the same words together, whether the Apostles’ Creed or the Kaddish or the Shahada, serves to transform the speakers. Same with secular liturgy: America’s Pledge of Allegiance comes to mind. Durkheim cited his native France’s civic creed of Liberté, Égalité, Fraternité.

This confused me, a nerdy and socially inept kid who understood life mainly through books, because I thought language existed to convey information. Because “What’s up?” is structured as a question, I perceived it as exactly that: a request for clarifying information. I thought the “correct” answer was either a sarcastic rejoinder (“Oh, the sky, a few clouds…”) or an actual narrative of significant recent events.

No, I wasn’t that inept: I understood that when most people ask “How are you today?” it’s a linguistic contrivance, and the correct answer is “fine.” I understood that people don’t really want to know how you’re doing, especially if you’re doing poorly. But even then, I took the language as primarily informative: I’m here, the answer says, and I’m actively listening to you speak.

However, the “What’s up?” conundrum continues to nag me, nearly forty years later, because it reveals that most people don’t want information, at least not in spoken form. Oral language exists mainly to build group bonds, and therefore consists of ritual calls and responses. We love paying homage to language as communication, through formats like broadcast news, political speeches, and deep conversations. But these mostly consist of rituals.

Consider: when was the last time you changed your mind because of a spoken debate? By that I mean to include the occasional staged contests between, say, liberals and conservatives, or between atheists and Christians. Every four years, we endure the tedium of televised Presidential debates, but apart from standout moments like “They’re eating the pets,” we remember little of them, and we’re changed by less.

For someone like me, who enjoys unearthing deeper questions, that’s profoundly frustrating. When I talk with friends, I want us to talk about things, not just at one another. Perhaps that’s why I continue writing this blog, instead of moving to YouTube or TikTok, where I’d reach a larger audience and receive more feedback. Spoken language, in short, is for building bonds; written language is for information.

Put another way, the question “What’s up?” isn’t about the individuals speaking; it’s about the unit they become together. Bar chats, water-cooler conversations, and Passing the Peace at church contain no information; they define the group. Only when we sit down, alone, to read silently, do we truly seek to discover what’s up.

Thursday, April 17, 2025

The Shadows and Glaciers of Northern Norway

C.J. Cooke, The Nesting: a Novel

Sophie Hallerton has just secured a coveted job nannying for an esteemed British widower raising his children in Norway’s remote northern forest. One problem: she isn’t Sophie Hallerton. She’s Lexi Ellis, a chronic screw-up who stole Sophie Hallerton’s credentials to escape looming homelessness, or worse. When Lexi arrives in Norway, though, she finds that Tom Faraday’s house conceals secrets that make her lies seem small.

I really liked C.J. Cooke’s most recent novel, The Book of Witching, which combined family drama, mystery, and historical saga with a distinct voice. So I grabbed Cooke’s 2020 book expecting something similar. Indeed, she again mixes liberally from multiple genres with broad audience appeal. Somehow, though, the ingredients come together without much urgency, and I closed the back cover feeling disappointed.

Architect Tom Faraday needs a nanny to nurture and homeschool his daughters, because their mother committed suicide in a Norwegian fjord. Or anyway, everyone believes Aurelia committed suicide. We dedicated readers know that the more confidently the characters believe something in Act One, the more certainly they’ll see their beliefs shattered by Act Three. This is just one place where Cooke invites readers to see themselves as in on the joke.

Lexi secures the nanny position with her filched credentials and some improv skills, only to discover she’s pretty effective. But once ensconced in Tom’s rural compound, she finds the entire family up to their eyeballs in deceit and secrets. Tom’s building project, undertaken in honor of his late wife’s earth-friendly principles, is badly over budget and short-handed. The housekeeper hovers like Frau Blücher. And Tom’s married business partners are fairly shady, too.

Supernatural elements intrude on Lexi’s rural life. Animal tracks appear inside the house, then vanish without leading anywhere. Tom’s older daughter, just six, draws pictures of the Sad Lady, a half-human spectre that lingers over her memories of Aurelia. The Sad Lady may have escaped from Aurelia’s hand-translated compendium of Norwegian folklore. A mysterious diary appears in Lexi’s locked bedroom, chock-a-block with implications that Tom might’ve killed his wife.

C.J. Cooke

If this sounds familiar, you aren’t wrong. Cooke introduces her stylistic borrowings in an unusually forthright manner. Lexi reads “Nordic Noir” novels in her spare time, signposting the sepulchral midwinter setting, and she describes her ward’s artwork as “Gothic,” the correct term for this novel’s many locked-room puzzles. This boldly announces Cooke’s two most prominent influences, Henning Mankell and Henry James, whose fingerprints linger throughout the story.

Unfortunately for contemporary English-language readers, Cooke also writes with those authors’ somber pace. Her story introduces even more narrative threads than I’ve mentioned, and more than the characters themselves know, because her shifting viewpoint means we have information the characters lack. We know how intricate their scaffold of lies has become, and sadly, we know that if that scaffold collapsed, most characters would be more relieved than traumatized.

Cooke unrolls her threads slowly and deliberately. The narration sometimes includes time jumps of weeks, even months. Probably even longer, because Tom’s ambitious experimental earth-house would take considerably longer to build than something conventional and timber-framed; one suspects Cooke doesn’t appreciate the logistics that go into construction. Characters have mind-shattering revelations about each other, sometimes false, then sit on them for months.

Indeed, despite the unarguable presence of a carnivorous Norwegian monster inside the house, it’s possible to forget the creature, because it disappears for weeks at a time. Cooke’s real interest, and the novel’s real motivation when it has one, is the human drama. We watch the tensions and duplicity inside the Faraday house amplify, a tendency increased by geographic isolation. Indeed, we see every lie the characters tell, except one: what really happened to Aurelia.

This novel would’ve arguably been improved by removing the folk-horror subplot and focusing on the human characters. But that would require restructuring the storytelling. The characters linger at a low simmer for chapter after chapter, then someone does something to change the tenor, and for a moment, we reach a boil. Cooke’s Nordic atmospherics, and glacial pace, put the best moments—and there are several good moments—too far apart.

Then, paradoxically, the denouement happens too quickly. After 300 pages of slow, ambient exposition, Cooke abruptly ends the narrative in a manner that leaves many threads unresolved. Despite Cooke’s pacing errors, I found myself invested in Lexi’s journey of discovery, only to find it ends hastily, in a manner scarcely prompted by prior events. Cooke’s narrative doesn’t conclude; it just ends.

I’ll probably read Cooke again. But after this one, I’ll approach her with more caution.

Tuesday, April 15, 2025

Is the Law a Dead Letter Now?

Back in the 1990s, when I was a teenage Republican, I believed humanity would find a legal system so self-sustaining, we could eventually exclude humans from the equation. We could write laws, then deploy the bureaucratic instruments necessary to enforce those laws, without bias or favor, essentially forever. The machine would support itself without inputs from nasty, unreliable humans. We only needed to trust the modernist, small-l liberal process.

Okay, we hadn’t yet written laws implementing such systems, but to my mind, that only proved we hadn’t written them yet. Because individuals would only enforce laws as written, I reckoned, such self-sustaining systems would preclude individual prejudice or demographic bias. (I didn’t realize, for years, that laws themselves could contain bias.) Divisions, disadvantage, and destitution would eventually wither as laws enforced baseline ethical standards which encompassed everyone, everywhere, equally.

Watching the meltdown surrounding Kilmar Abrego Garcia has underlined something I gradually realized in my twenties, but never previously needed to say aloud: all laws are doomed to fail. Even laws written with altruistic intent and thorough legal support, like the 14th Amendment, work only to the extent that those entrusted to enforce them actually do so. America’s current executive regime is demonstrating no intention to enforce the law justly.

The regime first deported Abrego Garcia in March, despite his legally protected status and the fact that he’d never been convicted of any crime. Initially, the regime acknowledged that they’d expelled Abrego Garcia mistakenly, and based on that acknowledgment, the Supreme Court—dominated by Republican nominees and one-third appointed by the current president—unanimously ordered the government to facilitate his return. So the regime flippantly changed the narrative and refused to comply.

This refusal, stated unambiguously in an Oval Office press conference where the American and Salvadoran presidents shared the lectern, demonstrates why the law will inevitably fail. America’s system, predicated on the government’s adherence to the principles laid out in the Constitution, absolutely requires that all participants share a prior commitment. Simply put, they must believe that nation, government, and law are more important than any individual. Even the president.

Kilmar Abrego Garcia (AP photo)

We must strike a balance here, certainly. Our laws are written by individuals, even when those individuals work collectively. The “buck stops here” president, an individual, must balance power with the nine SCOTUS justices and the 535 members of Congress, who remain individuals even when working jointly. But those individuals all work for a shared vision, and when they don’t, their whimsy becomes antithetical to state organization.

Please don’t misunderstand me. Any individual may call the nation wrong, as for instance Dr. King did, and may organize to redress such wrongs. Indeed, only such public, organized calling-out can sway the nation’s conscience sufficiently to enact change or improve a dysfunctional system. The primacy of the nation doesn’t mean citizens must meekly accept arbitrary or unjust directions from a unitary state. That would basically invite autocracy.

Simultaneously, however, those who seek official state power must submit themselves to something larger than their individuality. Dr. King never ran for office, and the tactics he employed when crossing the Edmund Pettus Bridge would’ve been inappropriate in Congress. Indeed, his deputy, John Lewis, who became a Representative, used Dr. King’s tactics to mobilize voters, but submitted himself to forms of order when writing and voting on legislation.

My regular readers, who mostly share my sociopolitical views, may think I’m saying something obvious here. But as I write, the current president’s approval ratings hover between 41% and 49%. That’s negative, and substantially underwater, but at least two in five Americans look at what’s currently happening and don’t mind. They voted for his tariffs, immigrant roundups, and rollbacks of civil rights law, and five months later, their support remains unchanged.

A sufficient fraction of American voters approves of, or at least doesn’t mind, a president placing himself above both Congress and SCOTUS. This president, like Andrew Jackson before him, thinks he’s empowered to force lawful residents off their land, unless someone has guns enough to stop him. Essentially, he’ll continue ignoring baselines of justice until someone, presumably Congress, does something to stop him.

Our entire Constitutional structure requires those elected to power to agree that America is more important than themselves. That means both America, the human collective, and America, the structures of government. If laws require them to act correctly, then they must abide by those laws without threats of force. If they can’t do that, well, that’s what checks and balances are for. If that fails, We the People step in.