Friday, June 6, 2025

Don’t Pretend To Be Stupid, Dr. Oz

Americans used to like Dr. Mehmet Oz

The same day I posted about Senator Joni Ernst’s faulty rhetoric surrounding Medicaid cuts, Dr. Mehmet Oz claimed that uninsured people should “prove that you matter.” The cardiac surgeon, Oprah darling, and failed Senate candidate is now Centers for Medicare and Medicaid Services Administrator, meaning he oversees decisions about who receives assistance in paying medical bills. His criterion for proving one matters? “Get a job or at least volunteer or … go back to school.”

Last time, I got Aristotelian and dissected Senator Ernst’s rhetoric, noting that she changed the “stasis of argument” mid-sentence. That is, she pretended to misunderstand the core dispute, sanding off nuance while condescending to her constituents. When someone said people would die unnecessarily, Ernst pretended they meant people would die at all. She thought it appropriate to remind constituents that humans are mortal—and, in her tone-deaf follow-up, sound an altar call for Jesus Christ.

While Ernst’s constituent wanted to argue the morality of preventable death, and Ernst veered dishonestly onto the fact of mortality, a friend reminded me this argument skirted an important issue. Who will die first? When the government makes decisions about paying medical bills, the outcomes aren’t morally neutral: chronically ill, disabled, and elderly Americans stand to lose the most. The same bloc of Americans whom, you’ll recall, certain politicians permitted to die during the pandemic.

Dr. Oz said what Senator Ernst only implied, that hastening human mortality is okay for certain undesirables. This administration, and indeed conventional American conservatism throughout my lifetime, has tied human worth to economic productivity, and especially to productivity for other people. If you need assistance, America’s authorities won’t help you create a business, learn a skill, or otherwise evolve to benefit your community. Their imagination can’t expand beyond getting a job working for someone else.

Nor was this subtext. Oz said aloud: “do entry-level jobs, get into the workforce, prove that you matter.” This correlation between “you matter” and “you work for others” has lingered beneath much of America’s work ethic throughout my lifetime—and, as an ex-Republican, I once believed it, or anyway accepted it. But as anybody who’s faced the workforce recently knows, today’s working economy isn’t a source of meaning or dignity; it often actively denies both.

Even laying aside demi-Marxist arguments like “owning the means of production” or “the surplus value of labor,” employment spits in the human face. Minimum wage hasn’t increased in America since 2009, and as anybody who’s worked a fast food dinner shift knows, employers who pay minimum wage definitely would pay less if the law permitted. Even if the workers receive enough hours to qualify for employer-provided health insurance, they mostly can’t afford the employee co-pay.

Lest anybody accuse me of misrepresenting Dr. Oz, let’s acknowledge something else: he lays this onus on “able-bodied” Americans. We might reasonably assume that he expects healthy, young, robust workers to enter the workforce instead of lollygagging on the public dime. But even if we assume they aren’t doing that already (and I doubt that), the pandemic taught many workers important lessons about how America values labor. Specifically, that it doesn’t, except through empty platitudes.

In 2020, executives, attorneys, bureaucrats, and others went into lockdown. Americans laughed at highly skilled professionals trying to do business through Zoom, thus avoiding the virus. Meanwhile, manual trades, retail jobs, construction, and other poorly paid positions were deemed “essential” and required to continue working. These jobs are not only underpaid and disdained, but frequently done by workers who are notably young or notably old, disabled, chronically ill, required to work to qualify for assistance, or otherwise vulnerable.

As a result, the workers most vulnerable to the virus faced the most persistent risk. Sure, we praised them with moralistic language of heroism and valor, but we let them get sick and die. Americans’ widespread refusal to wear masks in restaurants and grocery stores put the worst-paid, most underinsured workers at highest risk. Many recovered only slowly; I only recently stopped wheezing after my second infection. Many others, especially those with pre-existing conditions, simply died.

Dr. Oz has recapitulated the longstanding belief that work is a moral good, irrespective of whether it accomplishes anything. He repeats the myth, prevalent since Nixon, that assistance causes laziness, citation needed. And despite hastily appending the “able-bodied” tag, he essentially declares that he’s okay with letting the most vulnerable die. Because that’s the underlying presumption of Dr. Oz, Senator Ernst, and this administration. To them, you’re just a replaceable part in their economic machine.

Thursday, June 5, 2025

Don’t Pretend To Be Stupid, Senator Ernst

A still from Senator Ernst’s notorious
graveyard “apology” video

Reputable news outlets called Senator Joni Ernst’s (R-IA) graveyard rebuttal last week “sarcastic” because, I think, they deemed it ideologically neutral. Accurate descriptors like “condescending,” “mean-spirited,” or “unbecoming of an elected official” might sound partisan. And mainstream media outlets today will perform elaborate contortions to avoid appearing even accidentally liberal. Better to call her “sarcastic,” from the corporate overlords’ perspective, than analyze Ernst’s motivations.

I have no such compunctions. I’ll eagerly call Ernst’s argument what I consider it: deeply dishonest, predicated on bad faith. For those who need a refresher, Ernst’s constituents expressed outrage at her support for a budget bill which included severe Medicaid cuts. At a Parkersburg town hall, a constituent shouted “People are going to die!” After stammering a bit, Ernst replied: “We are all going to die.” When that comment drew national attention, Ernst responded by doubling down.

Let’s postpone the substance of the debate for now. We all already have our opinions on the moral and legal motivations for steep Medicaid cuts; my regular readers probably share my disdain for these cuts. Rather, let’s focus on Ernst’s rhetorical approach. Specifically, I’d like to emphasize Ernst’s decision to pretend she doesn’t understand the accusation. The audience member, in saying people will die, meant people will die needlessly and preventably. Ernst chose to explain that people will die at all.

In classical rhetoric, we speak of the “stasis of argument,” the point of real contention when people disagree. In general, we speak of four stases of argument:

  • Fact (will people die?)
  • Definition (what does it mean for people to die?)
  • Quality (is this death necessary, acceptable, or moral?)
  • Jurisdiction (who bears responsibility for this death?)

In saying people are going to die, Ernst’s constituent argues from a stasis of quality, that cutting Medicaid and other programs will result in needless and morally unacceptable deaths. Ernst attempts to shift focus and claim that death, being inevitable, shouldn’t be resisted. Death is just a fact.

The stases listed in sequence above move from lowest to highest. Rhetoricians consider facts simple and, usually, easy to demonstrate. When facts become ambiguous, we move upward into definitions, then further up into moral considerations, and finally into the realm of responsibility. Moving upward usually means conceding the prior stasis. We cannot argue the morality or responsibility of facts without first acknowledging their reality.

Sometimes, shifting the stasis of argument makes sense. When the state of Tennessee prosecuted John Scopes for teaching evolution in public schools, the prosecution proceeded from a stasis of fact: did Scopes break the law? Defense attorney Clarence Darrow redirected the argument to a stasis of quality: did Scopes do anything morally unacceptable? Darrow essentially admitted the fact, but claimed a higher point of contention existed.

Plato and Aristotle, as painted by Raphael

However, the reverse seldom applies. Moving up the ladder means adding nuance and complexity to arguments, and moving down means simplifying. By shifting the stasis onto the physical reality of death, which all humans face inevitably, Ernst removes the complexity of whether it’s good or acceptable for someone to die now. If an accused murderer used “We’re all going to die” as a courtroom defense, that would be laughable.

Ernst knows this. As a law-n-order Republican, Ernst has a strict voting record on criminal justice, border enforcement, and national defense. She knows not all deaths are equal. By shifting her stasis of argument from whether deaths are acceptable to whether deaths are real, she’s pretending an ignorance of nuance she hasn’t displayed anywhere else. She knows she’s moving the goalposts, and assumes we’re too stupid, or perhaps too dazzled by rapid wordplay, to notice she’s done it.

I’ve complained about this before. For instance, when people try to dismiss arguments against synthetic chemicals by pretending to misunderstand the word “chemical,” they perform a similar movement. Moving the stasis down the ladder is a bad-faith argument tactic that bogs debate down in searches through the dictionary or Wikipedia, to prove that blueberries aren’t what anyone means by “chemicals,” and that human mortality doesn’t make murder okay.

Moreover, this tactic means the person isn’t worth talking to. If Senator Ernst believes that human mortality negates our responsibility to prevent needless premature death, then we have two choices. She’s either too stupid to understand the stakes, which I doubt, or she’s too dishonest to debate. We must humor her while she’s in office. But her term is up next year, and honest, moral voters must remove her, because this rhetorical maneuver proves her untrustworthy for office.

Tuesday, May 20, 2025

Architecture and American Values

Nottaway Plantation in its salad days, in a promo photo from Explore Louisiana

The recent online brouhaha over the fire which destroyed Nottaway Plantation is revealing more than I wanted to know about American priorities. Billed as the largest antebellum mansion in the American south, at 64 rooms and 53,000 square feet, it stood intact from its completion in 1859 until last Thursday. Following the Emancipation Proclamation, the Louisiana plantation house served mainly as a resort hotel, convention center, and tourist trap.

Social media reaction has split along predictable lines. Some respondents, mostly White, have insisted that the mansion’s historical significance and its elaborate architecture make it a legitimate destination, a piece of Louisiana heritage, and a massive loss. Others, a rainbow coalition, have pointed to the plantation’s slaveholding heritage and called it a monument to American racism. Each side accuses the other of viewing the destruction through partisan political lenses.

Wherever possible, I prefer to assume that all debate participants, even those I disagree with, start from a good-faith position. I don’t want to believe that those who don’t spotlight the slaveholding heritage are, perforce, celebrating racism. And I hope those who center slavery as the building’s heritage don’t covertly cheer for the destruction. Though my sympathies lie strongest with the anti-slavery caucus, I’d rather believe everyone starts from good faith.

If that’s true, it follows that those mourning the loss of historic architecture believe the building’s structural significance and its socioeconomic significance exist in different compartments. They segregate different aspects of the building’s history into beehive-like cells, and assume the separate qualities don’t influence one another. Social media commenters have written words to the effect that “Slavery was terrible, but the building matters in its own right, too.”

This argument holds some water. Nottaway Plantation was, until last week, a surviving example of a mid-19th Century architectural ethos. Not a reconstruction, a replica, or an homage, but an actual piece of physical history, a primary source. We shouldn’t discount that significance, regardless of who built the building; a physical artifact of American history existed for 166 years, until it didn’t. America is arguably poorer for the loss.

However, compartmentalizing that history from the circumstances which created it is itself ahistorical. I worked in the construction industry, and I can attest that the largest ongoing cost is human labor. Nowadays, construction involves many large costs, including power tools, diesel-burning equipment, and transportation. But these are, usually, fixed-term costs. Human labor is ongoing and requires continual infusions of money, if only because workers get hungry.

Nottaway Plantation during the fire, in a photo from CBS News

Much labor is especially valuable because it’s rare. Framing carpenters, brickmasons, metalworkers, and other skilled laborers demand a premium because their skills require years of honing. Nottaway Plantation was built during the days of gaslamps and outdoor toilets, but nowadays, electricians, plumbers, and HVAC installers command competitive rates. Building such a massive mansion required skilled labor from workers who, being enslaved, couldn’t negotiate their terms on the open market.

Many conditions also depend on the time and place. Anybody who’s lived in Louisiana knows that most of the state stands on spongy, wet soil. Building a multistory mansion requires driving foundational pilings deep, possibly down to the bedrock, to prevent the building from sinking under its own weight. Today, such pilings require diesel-burning machinery, sump pumps, and cast iron. In 1859, those pilings required many, many humans.

Therefore, the building’s architectural significance—which is real and valuable—relies upon the labor employed. Large-scale monumental construction always requires somebody able to pay the army of skilled workers whose labors make the building possible. This problem isn’t uniquely American, either. European monuments, like Notre Dame and the Vatican, were first built before Europe reintroduced chattel slavery, but the buildings wouldn’t be possible without the poverty of serfdom.

Too many accomplishments of human activity rely on a small sliver of society having too much money. Without rich people willing to pay skilled workers, we wouldn’t have the White House, the Venice canals, legendary artwork like the Mona Lisa and Michelangelo’s David, and other monuments of human capability. If Leonardo or Pierre L’Enfant had needed day jobs to subsidize their crafts, they’d never have accomplished the potential within them.

So yes, Nottaway Plantation reflects architectural history and artistic movements. But it also reflects economic inequality and the labor conditions of antebellum Louisiana. It’s impossible to separate the two spheres of influence, no matter how much the privileged few wish it. Nottaway was an artifact of physical beauty and a community gathering place. But it emerged from specific conditions, which we cannot compartmentalize from the building itself.

Monday, May 19, 2025

Robopocalypse Now, I Guess

Martha Wells, The Murderbot Diaries Vol. 2

This is a follow-up to the review I'll Be Back, I Guess, Or Whatever

The security cyborg known only as Murderbot continues fighting to rediscover the tragic history that someone deleted from its memory banks. But the trail has gone cold, and somebody lurking behind the scenes will deploy all the resources of gunboat capitalism to keep old secrets buried. So Murderbot relies on its strengths, making ad hoc alliances to infiltrate hidden archives, while coincidentally keeping hapless humans alive despite their own best efforts.

The ironically self-referential tone Martha Wells introduced in her first omnibus Murderbot volume continues in this second collection. The stories were initially published as separate novellas, but that format is difficult to sell in conventional bookstores, so these trade paperbacks make Murderbot’s story available to wider audiences. That makes for easier reading, but unfortunately, it starts drawing attention to Murderbot’s formulaic structure, which probably wasn’t obvious at first.

As before, this book combines two previously separate stories. In “Rogue Protocol,” Murderbot pursues buried secrets to a distant planet that greedy corporations abandoned. The GrayCris company left immovable hardware behind, and Murderbot gambles that information stored on long-dormant hard drives will answer buried questions. Clearly someone else thinks likewise, because double agents and war machines take steps to prevent anyone reading the old files.

With the first combined volume, I observed Wells’ structural overlap with Peyton Place, which established the standards of prime-time soap operas. (Murderbot secretly prefers watching downloaded soaps over fighting, but keeps getting dragged back into combat.) With this novella, I also notice parallels with The Fugitive—the 1964 series, not the 1993 movie. In both, the protagonist’s episodic adventures mask the longer backstory, which develops incrementally.

In the next novella, “Exit Strategy,” Murderbot returns its collected intelligence to the consortium that nominally “owns” it. But that consortium’s leaders, a loose agrarian cooperative, have fallen captive to GrayCris, which has the ruthless heart necessary to manipulate an interplanetary, stateless capitalist society. Preservation, which owns Murderbot on paper, is a hippie commune by contrast. Murderbot must use its strategic repertoire to rescue its pet hippies from the ruthless corporation.

Martha Wells

Here's where I start having problems. By the fourth narrative, I begin noticing Murderbot follows a reliable pattern: it desperately protests its desire to chill out, watch TV, and stay alone. But duty or necessity requires it to lunge into combat to rescue humans too hapless, good-hearted, and honest for this world. As its name suggests, Murderbot has only one tool, violence. And it deploys that tool effectively, and often.

As the pattern repeats itself, even Murderbot starts noticing that it’s protected by plot armor. It can communicate with allies undetected, hack security systems, and manipulate humans’ cyberpunk neural implants. It has human levels of creativity and independence that fellow cyborgs lack, but high-speed digital processing and upload capacity that humans can’t share. Like Johnny 5 or Marvin the Paranoid Android, it combines the best of humanity and technology.

And like those prior archetypes, it handles this combination with sarcasm and snark. Murderbot pretends it doesn’t care, and uses language to keep human allies at arm’s length. It also uses its irony-heavy narrative voice, laced with parenthetical digressions, to keep us alienated, too. But the very fact that it wants a human audience to hear its story, which it only occasionally acknowledges, admits that it’s desperate for human validation.

Murderbot comes across as jerkish and misanthropic. But it also comes across as lonely. I feel compelled to keep reading its story, even as I see the episodes falling into comfy boilerplates, because Murderbot’s essential loneliness makes it a compelling character. We’ve all known someone like this; heck, book nerds reading self-referential genre fiction have probably been someone like this.

Thus I find myself torn. Only four novellas in, the story’s already become visibly repetitive, and even Murderbot feels compelled to comment on how episodes resemble its beloved soaps. The first-person narrative voice, which combines ironic detachment with noir grit, becomes disappointingly one-note as each story becomes dominated by repeating action sequences. It reads like an unfinished screen treatment. (A streaming TV adaptation dropped as I finished reading.)

But despite the formulaic structure, I find myself compelled by Murderbot’s character. I want to see it overcome its struggles and find the home and companionship it clearly wants, but doesn’t know how to ask for. Murderbot is more compelling than the episodes in which it finds itself, and I keep reading, even as the literary purist in me balks. Because this character matters enough that I want to see it through.

Friday, May 16, 2025

Man You Should’ve Seen Them Kicking Edgar Allan Poe

T. Kingfisher, What Moves the Dead

Lieutenant Alex Easton (ret.) has come to call upon a fellow veteran, Roderick Usher, and his ailing sister, Madeline. No, seriously. Easton finds a rural manor house plagued with decay and verging on collapse, and a childhood friend reduced to a wisp straddling death’s door. Far worse, though, is what Easton discovers when he finds Madeline sleepwalking the labyrinthine halls: another voice speaks a malign message from Madeline’s lips.

T. Kingfisher is somewhat circumscribed by her source material, a retelling of one of Poe’s most famous stories. Either Kingfisher’s story plays to an inevitable end, or it abandons its source material: a perilous dilemma. And unfortunately, Christina Mrozik’s cover art spoils the climactic reveal. Rather than for the resolution, we read Kingfisher’s novella for the suspense and exposition along the way, which Kingfisher has in abundance.

Besides Roderick and Madeline Usher, the named characters of Poe’s original short story, and Easton, Poe’s originally nameless narrator, we have two other characters: James Denton and Eugenia Potter. (All characters, except the Ushers, go by surnames, as befits the 19th-Century setting.) In Poe’s original, the Usher siblings represent proto-Jungian archetypes of a fractured soul. In Kingfisher’s telling, characters become representatives of post-Napoleonic malaise.

First, Easton. A veteran of an army with an excessive range of pronouns, Easton’s androgynous name matters: one’s only gender is “sworn soldier.” Placed against the intensely gendered Usher siblings, Easton remains neither fish nor fowl, a permanent outsider cursed to watch humanity’s struggles without benefiting. Gender, and its attendant social baggage, looms large herein, driving people together but preventing characters from ever truly understanding one another.

Potter is a scholar and scientist, denied credentials in her native England because of her sex. Denton, a combat surgeon, survived the American Civil War, but suffers combat trauma, which in that time is regarded as emasculating cowardice. Both characters have conventional binary gender, but in their own ways defy mandatory gender expectations. When the crisis comes, however, their nonconformity defines their heroic qualities in the story.

Don’t misunderstand me. I’ve identified themes here, but Kingfisher doesn’t simply propound a message. Rather, these characters’ unique manifestations empower them to fight a threat growing beneath the unspoken tensions of the Long Nineteenth Century. What appears to be decay permeating the Ushers’ manor house is actually a symbiotic growth that threatens the tenuous social structure of the Belle Epoque and the last days of aristocracy.

T. Kingfisher (a known and public
pseudonym for Ursula Vernon)

Poe is one among several writers whose stories expanded the realm of possibility in American literature. But, like Lovecraft, Poe’s writing reflects his time, and its prejudices. In recent years, authors like Victor LaValle and Kij Johnson have updated Lovecraft, rewriting his stories without the limiting biases. Though I know other authors have done likewise with Poe, I haven’t seen their work embraced the same way; Kingfisher closes that gap.

(Yes, Mike Flanagan released a House of Usher adaptation almost simultaneously with this novella. But I’m old-fashioned enough to distinguish between literature and streaming TV.)

Kingfisher’s story runs short—under 160 pages plus back matter—but never feels rushed. She nurtures the kind of character development and interpersonal relationships that Poe largely skimmed. Poe’s original, published in 1839, was groundbreaking, but its terse style feels underwritten by contemporary standards. Kingfisher injects the kind of depth and development that cause contemporary readers to feel suspense, and to care about the outcomes.

I especially respect that Kingfisher avoids that tedious contrivance of contemporary horror, the twist ending. For a quarter century, writers and filmmakers have insisted on finishing with a melodramatic rug-pull which undermines everything we thought we knew. This was fun for a while. But nobody’s likely to create a better twist than Catriona Ward, at least anytime soon. Kingfisher builds suspense on character and action, not by stacking the deck.

Rather than abrupt reversals, Kingfisher drives her story with questions that the characters must answer. Where, she asks, do monsters come from in an era which no longer believes in the supernatural? How can we fight monsters when they go beyond the limits of science and natural philosophy? And what does it mean to defeat an evil being that can get up and walk after you’ve already killed it?

Admittedly, Kingfisher is circumscribed because we know where her story is headed. We remember 11th-grade AmLit. But she beats this limitation by interspersing a range of character development that would’ve frightened Edgar Allan Poe. Classic literature never just reflects itself, it asks important questions about us, the readers, and Kingfisher definitely achieves that goal.

Monday, May 12, 2025

Stephen King and the Monsters of Modernity

Stephen King

I understand the desire to get ahead of the story of Stephen King and his massively unfunny “joke.” After once-beloved authors like Orson Scott Card, J.K. Rowling, and Neil Gaiman have been uncovered as truly horrible human beings with repellent opinions, we’re naturally fearful of another seemingly progressive voice blindsiding us. Such preparation only makes sense. But it’s possible to swing to the opposite extreme, at our own expense.

Surely even Stephen King fans would acknowledge that his anti-Trump joke didn’t land. His dig at “Haitians eating pets” resurrects a months-old campaign gaffe that, amidst the mass deportation of legal American residents, appears outdated and tone-deaf. The specific reference to Haitians revives a racist trope, and as we know, repetition creates the illusion that the racist claims have some basis. The “joke,” by humor standards, was definitely ill-considered.

However, much of the early outrage seemingly assumes that King believes the anti-Haitian stereotypes. That suggests a total lack of situational literacy: King clearly means that Trump is racist, not that he’s racist himself. Online discourse is often dominated by what British journalist Mick Hume calls “full-time professional offense takers” who sustain the discussion by finding the worst possible interpretation, and then deploying it in bad faith.

Reading the most aggressive anti-King criticisms, I’m reminded of the feeding frenzy, over nine years ago, against Calvin Trillin. As with King’s joke, the anti-Trillin swarm required the most uncharitable, situationally illiterate interpretation of Trillin’s writing. Online outrage follows a predictable script comparable to religious liturgy, and for largely the same reason: to reassure fellow believers that we are good people who share a reliable moral footing.

But before I can dismiss the anti-King sentiment as meaningless ritual, I have a counter-consideration: King himself often displays unquestioned racism. Characters like Dick Hallorann (The Shining) and Mother Abagail (The Stand) reflect an unexamined presumption that Black people live, and usually die, to advance White characters’ stories. His Black characters often rely upon outdated, bigoted boilerplates that feel leaden nowadays.

We might dismiss this as an oversight on King’s part. He lives in northern Maine, an overwhelmingly White region of a substantially White state, and it’s entirely possible that he doesn’t know many Black people. I recall characters like Mike Hanlon, whose largest contribution to the group dynamic in It is to be Black. I’ve written before that King seemingly writes about people groups without bothering to speak with them.

Rather than asking whether King is “racist” or “not racist,” a dichotomy that Ibram X. Kendi notes isn’t useful, we might consider what kind of racism King demonstrates. We all absorb certain attitudes about race from our families, culture, mass media, and education. Nobody lives completely free of racial prejudice, any more than prejudice around sex, class, and nationality. Even Dr. Kendi admits needing to purge racist attitudes from himself.

By that standard, King shows no particular sign of out-and-out bigotry. Indeed, he shows a bog-standard White liberal attitude of progressivism, by which he supplants Jim Crow stereotypes with more benevolent generalizations. In other words, he doesn’t hate Black people, but he also doesn’t know them particularly well, either. He replaces malignant suppositions with benign ones, but he never stops relying on wheezy vulgarisms.

Therefore, though a clear-eyed reading of King’s unfunny “joke” shows that he targets his scorn upon Trump, he uses Haitians to deliver that scorn. He falls back on his shopworn tendency to have Black characters carry water for him, in service to his White purposes. This leaden joke isn’t bigoted, but that doesn’t make it any less racist. The joke’s lack of humor ultimately comes second to the Haitians’ lack of agency.

In his book Danse Macabre, King notes that horror often stems from a lily-white, orderly vision of society. Michael Myers’ savagery exists as a necessary contrast to Haddonfield’s suburban harmlessness. Pennywise is most terrifying to the exact degree that Derry is anodyne. For King, evidently, that means that Whiteness is an anonymous background from which horrifying monsters, like President Trump, arise. Haitians, in that worldview, are an exception.

I fear the implications which arise from calling Stephen King “racist,” because that word has baggage. But if we apply the nuance that Dr. Kendi encourages us to utilize, then that word applies. Putting it to use requires far more detail than a BLM protest placard or a hasty tweet can encompass; and his variety of racism is the kind most receptive to correction and repentance. But that doesn’t make it any less racist.

Friday, May 9, 2025

The Ultimate Meaninglessness of “Crime”

We’ve seen an increasing number of anecdotes trickling out about once-loyal voters rejecting the Administration’s ham-handed deportation policies. Though it’s hard to derive meaningful data from isolated anecdotes, stories like this one and this one, about Trump voters getting burned by the administration they once supported, keep accumulating. Many stories share a theme: “we” thought the Administration would only deport “criminals,” and we don’t consider ourselves criminals.

On one level, they’re correct: under American statutes, immigration falls under civil, not criminal, law. “Illegal” immigration is a non-category, because the word illegal refers only to crimes, not civil violations. But on another level, this reveals something uncomfortable for many Americans, that “crime” itself isn’t a fixed concept. Many undocumented immigrants don’t consider themselves criminals because they’ve committed no violent or property crime; so the Administration simply redefines “crime.”

Much American political discourse centers on “crime,” especially when Democrats hold the Oval Office. As sociologist Barry Glassner writes, fear of crime is a powerful motivator for tradition-minded voters, a motivator Republicans employ effectively. Glassner writes about how rabble rousers used fear of crime to shanghai the Clinton Administration, but the same applies broadly whenever Democrats hold majority power. We saw it during the Obama and Biden years too.

However, exactly what constitutes crime depends on who does the constituting. My core readership probably remembers John Ehrlichman, former White House Counsel, who admitted the Nixon Administration simply fabricated the War on Drugs as pretext to harass anti-war and Civil Rights protesters. The notorious Comstock Laws channeled one man’s sense of injured propriety to criminalize porn, contraception, pharmaceutical abortion, and the kitchen sink. Moral umbrage beats harm in defining “crimes.”

This doesn’t mean harm doesn’t exist or states should repeal every law. Murder, theft, and sexual assault are clearly wrong, because they cause manifest harm and devalue victims’ lives, bodies, and labors. But these transgressions only become “crimes” when governments pass laws against them. Legal philosophers might debate whether decriminalizing murder would make murder happen more often. Personally, I doubt it; neither Prohibition nor its repeal affected drinking numbers much.

Prohibition, therefore, proves the moral fuzziness of crimes. Both Al Capone-style Prohibition and contemporary drug prohibition arose not from obvious harm (most pot-heads are too lethargic to hurt anybody), but from moral panic and public outrage. Governments made laws against substances lawmakers found abhorrent, then assumed citizens would avoid those substances simply because they were illegal. Then they acted surprised when drinking and drugs persisted anyway.

This happens because these things aren’t innately crimes; they become crimes because lawmakers make laws. Similarly, while it’s clearly harmful if I steal money from your wallet, other property “crimes” have squishier histories. Squatting, for instance: once legal, it became illegal in America, as James Loewen writes, largely to circumscribe where Native Americans were allowed to hunt and camp. Lawmakers created laws, where none previously existed, to punish transgressors.

Immigration law follows similar patterns. Abrahamic scripture urges the faithful to welcome immigrants because, in that time, borders didn’t really exist. People moved freely and, provided they followed local laws and customs, changed nationality liberally. Though serfdom tied workers to lands and lords in the late medieval period, modern concepts of the nation-state and international borders existed only as legal abstractions. Only during wartime did states enforce borders much.

This Administration can redefine civil infractions, like undocumented immigration, as crimes, because that’s how things become crimes. States will borders into existence by legal legerdemain, then demand that people remain permanently circumscribed by these fictional lines. Perhaps that’s why “the Wall” looms so large in MAGA mythology: because borders don’t really exist, so we need something manifest and palpable to make borders real.

These MAGA voters who feel betrayed because the Administration deported their loved ones assumed that they weren’t “criminals” because they used a broad, popular definition of criminality. They didn’t perform acts of violence or property destruction, they reckoned, so therefore they weren’t criminals. They didn’t anticipate the Administration using crime’s fuzzy, amorphous nature against them, and therefore were caught unprepared when the definition of “crime” moved to surround them.

Civil society has two responses available. We could eliminate self-serving, avaricious laws, and allow people more discretion. There’s no objective reason people must live within certain borders, except that lawmakers need to control despised minorities. But we know society probably won’t choose that response. More likely, our lawmakers will write harsher, more draconian laws to eliminate this flexibility. Which will then be used against us ordinary people.

Monday, May 5, 2025

I'll Be Back, I Guess, Or Whatever

Martha Wells, The Murderbot Diaries Vol. 1

The cyborg that calls itself “Murderbot” would happily watch downloaded soap operas, 24/7, if it had the opportunity. But it has no such liberty: as wholly owned property of an interstellar mining company, it provides security for survey operations on distant planets. Unbeknownst to its owners, though, Murderbot has disabled its own governing systems. Because it doesn’t trust its owners, and it’s prepared to fight them if necessary.

Martha Wells originally published her “Murderbot” stories as freestanding novellas, but those often make for a tough sell at mainstream bookstores. So her publisher is now re-releasing the stories in omnibus paperback editions. Readers get more of Wells’ story arc, which combines sociological science fiction with the open-ended narrative we recognize from prime-time soap operas. Think The Terminator meets Peyton Place.

In the first novella, “All Systems Red,” we discover Murderbot’s character and motivation. It works because it must, and being property, has no right to refuse. But it’s also altered its own programming, granting itself free agency which fellow “constructs” don’t enjoy. If nobody finds out, it can watch its downloads in relative peace. Problem is, someone has infiltrated its latest contract, turning fellow security cyborgs against their humans.

The second novella, “Artificial Condition,” follows Murderbot in its quest to uncover who violated the constructs’ programming and turned work into a slaughter. It just happens that whatever transgression made that violence possible, coincides with the biggest secret in Murderbot’s individual history. So Murderbot goes off-grid, seeking information that might shed light on why deep-space mining has recently become such a brutal enterprise.

Wells pinches popular sci-fi action themes readers will recognize from longstanding franchises like Star Trek, Flash Gordon, and Stargate. But she weaves those motifs together with an anthropological investigation of what makes someone human. Murderbot is nameless, sexless, and has no prior identity; it’s a complete cypher. Although it has organic components, they’re lab-grown; no part of Murderbot has ever been even tangentially human.

Martha Wells

Unlike prior artificial persons (Commander Data comes immediately to mind), Murderbot has no desire to become human. It observes humanity as entertainment, and performs its job without complaint. But doing that job has cost humans their lives in the past, a history that gives Murderbot a sense of lingering guilt. This forces it, and us, to ask whether morals and culpability apply to something built in a factory and owned by a corporation.

The questions start small and personal. Murderbot works for its human clients, and exists specifically to keep them alive. But fellow security cyborgs have turned on their owners in another mining camp. This forces Murderbot to question whether its own survival matters enough to risk actual human lives, even tangentially. It actually says no, but its clients have anthropomorphized their cyborg guard and want it to live.

As details of the crime become clear, so does a larger view of Murderbot’s world. It occupies a universe of interplanetary capitalism, where one’s ability to spend lavishly defines one’s survival. Without money or employment history, Murderbot can only investigate the parallel mysteries hanging over its head by trading its one useful commodity: the ability to communicate with technology. With Murderbot around, humanity’s sentient machines start feeling class consciousness.

I’ve already mentioned The Terminator and Star Trek’s Commander Data. Despite its name, Murderbot shares little with either android. It doesn’t want to kill, and admits it would abandon its mission if given the opportunity. But it also doesn’t aspire to become more human. Misanthropic and unburdened by social skills, its greatest aspiration is to be left alone. Yet it knows it cannot have this luxury, and must keep moving in order to survive.

This volume contains two stories which, though first published separately, weren’t written to stand fully alone. This struck me in the first story: there’s no denouement, only an end. Had I read this novella without a larger context, I probably would’ve resented this, and not bought the second volume. Taken together, though, it’s easier to see the soap operatic motif. Both stories end so abruptly, readers can practically hear the music lingering over the “To Be Continued” title card.

It's easy to enjoy this book. Murderbot, as our first-person narrator, writes with dry sarcasm that contrasts with its setting. It’s forced to pass as human, in an anti-humanist universe where money trumps morality. It only wants privacy, but wherever it goes, it’s required to make friends and basically unionize the sentient machines. Martha Wells uses well-known science fiction building blocks in ironic ways that draw us into Murderbot’s drama.

Monday, April 28, 2025

Further Thoughts on the Futility of Language

Patrick Stewart (left) and Paul Winfield in the Star Trek episode “Darmok”
This essay is a follow-up to my prior essay Some Stray Thoughts on the Futility of Language

The popularity of Star Trek means that, more than most science fiction properties, its references and in-jokes exceed the bounds of genre fandom. Even non-junkies recognize inside references like “Dammit, Jim,” and “Beam me up.” But the unusual specificity of the 1991 episode “Darmok” goes beyond those general references. In that episode, the Enterprise crew encounters a civilization that speaks entirely in metaphors from classical mythology.

Berkeley linguist George Lakoff, in his book Metaphors We Live By, contends that much language consists of metaphors. For Lakoff, this begins with certain small-scale metaphors describing concepts we can’t describe directly: in an argument, we might “defend our position” and “attack our opponents.” We “build an argument from the ground up,” make sure we have “a firm foundation.” The debate ends, eventually, when we “see the other person’s point.”

Such first-level metaphors persist across time because, fundamentally, we need them. Formal debate structures shift little, and the figures of speech remain useful, even as the metaphors of siege warfare become obsolete. While speakers and authors repeat the metaphors, they retain their currency. Perhaps, if people stopped passing such metaphors onto the next generation, they might fade away, but so far, that hasn’t happened in any way I’ve spotted.

More pliable metaphors arise from cultural currents that might not persevere in the same way. Readers around my age will immediately recognize the metaphor when I say: “Read my lips, no new taxes.” They may even insert President George H.W. Bush’s hybrid Connecticut/Texas accent. For several years in the late 1980s and early 1990s, the “Read my lips” metaphor bespoke a tough, belligerent political stance that stood inviolate… until it didn’t.

In the “Darmok” episode, to communicate human mythic metaphors, Captain Picard describes the rudiments of the Epic of Gilgamesh, humanity’s oldest known surviving work of fiction. Picard emphasizes his familiarity with ancient myth in the denouement by reading the Homeric Hymns, one of the principal sources of Iron Age Greek religious ritual. For Picard, previously established in canon as an archeology fan, the earliest myths represent humanity’s narrative foundation.

But do they? While a nodding familiarity with Homer’s Odyssey and Iliad remains a staple of liberal education, how many people, outside the disciplines of Sumerology and classical studies, read Gilgamesh and the Homeric Hymns? I daresay that most Americans, if they read mythology at all, mostly read Bulfinch’s Mythology and Edith Hamilton’s Mythology, both of which sanitized Greek tradition for the Christian one-room schoolhouse.

The attached graphic uses two cultural metaphors to describe the writer’s political aspirations. The reference to Elvis on the toilet repeats the widespread cultural myth that Elvis Presley, remembered by fans as the King of Rock and Roll, passed away mid-bowel movement. There’s only one problem: he didn’t. Elvis’ loved ones found him unconscious on the bathroom floor, following a heart attack; he was pronounced dead at the hospital later that day.

The drift between Elvis as cultural narrative, and Elvis as historic fact, represents the concept of “mythology” in the literary critical sense. We speak of Christian mythology, the mythology of the Founding Fathers, and the myths of the Jersey Devil and prairie jackalope. These different “mythologies” represent neither facts nor lies, but stories we tell to understand concepts too sweeping to address directly. Storytelling becomes a synecdoche for comprehension.

Similarly, the broad strokes of Weekend at Bernie’s have transcended the movie itself. It’s questionable how many people watched the movie, beyond the trailer. But the underlying premise has become a cultural touchstone. Likewise, one can mention The Crying Game or The Sixth Sense, and most Americans will understand the references, whether they’ve seen the movies or not. The vague outlines have become part of our shared mythology.

But the movies themselves haven’t become so. Especially as streaming services have turned movie-watching into a siloed enterprise, how many people watch older movies of an evening? We recognize Weekend at Bernie’s, released in 1989, as the movie where two doofuses use their boss’s corpse as a backstage pass to moneyed debauchery. But I doubt many people could state what actually happened, beyond the most sweeping generalities.

Both Elvis and Bernie have come unmoored from fact. Their stories, like those of Gilgamesh and Darmok, no longer matter; only the cultural vibe surrounding them survives. Language becomes a shorthand for understanding, but it stops being a vessel of actual meaning. We repeat the cultural references we think we share, irrespective of whether we know what really happened, because the metaphor, not the fact, matters.

Tuesday, April 22, 2025

Some Stray Thoughts on the Futility of Language

I think I was in seventh grade when I realized that I would probably never understand my peers. In church youth group, a young man approximately my age, but who attended another middle school, talked about meeting his school’s new Egyptian exchange student. “I could tell right away,” this boy—a specimen of handsome, square-jawed Caucasity who looked suspiciously adult, so I already distrusted him—said, “that he was gonna be cool.”

“How could you tell?” the adult facilitator asked.

“Because he knew the right answer when I asked, ‘What’s up?’”

Okay, tripping my alarm bells already. There’s a correct answer to an open-ended question?

Apparently I wasn’t the only one who found that fishy, because the adult facilitator and another youth simultaneously asked, “What’s the correct answer then?”

“He said, ‘What’s up?’” my peer said, accompanied by a theatrically macho chin thrust.

(The student being Egyptian also mattered, in 1987, because this kid evidently knew how to “Walk Like an Egyptian.”)

This peer, and apparently most other preteens in the room, understood something that I, the group facilitator, and maybe two other classmates didn’t understand: people don’t ask “What’s up?” because they want to know what’s up. They ask because it’s a prescribed social ritual with existing correct responses. This interaction, which I perceived as a request for information, is actually a ritual, about as methodical and prescriptive as a Masonic handshake.

My adult self, someone who reads religious theory and social science for fun, recognizes something twelve-year-old Kevin didn’t know. This prefixed social interaction resembles what Émile Durkheim called “liturgy,” the prescriptive language religious people use in ceremonial circumstances. Religious liturgy permits fellow believers to state the same moral principles in unison, thus reinforcing their shared values. It also inculcates their common identity as a people.

The shared linguistic enterprise, which looks stiff, meaningless, and inflexible to outsiders, is purposive to those familiar with the liturgy. Speaking the same words together, whether the Apostles’ Creed or the Kaddish or the Five Pillars of Islam, serves to transform the speakers. Same with secular liturgy: America’s Pledge of Allegiance comes to mind. Durkheim cited his native France’s covenants of Liberté, Égalité, Fraternité.

This confused me, a nerdy and socially inept kid who understood life mainly through books, because I thought language existed to convey information. Because “What’s up?” is structured as a question, I perceived it as a question, meaning I perceived it as a request for clarifying information. I thought the “correct” answer was either a sarcastic rejoinder (“Oh, the sky, a few clouds…”) or an actual narrative of significant recent events.

No, I wasn’t that inept. I understood that when most people asked “How are you today,” it was a linguistic contrivance, and the correct answer was “fine.” I understood that people didn’t really want to know how you’re doing, especially if you’re doing poorly. But even then, the language was primarily informative: I’m here, the answer says, and I’m actively listening to you speak.

However, the “What’s up?” conundrum continues to nag me, nearly forty years later, because it reveals that most people don’t want information, at least not in spoken form. Oral language exists mainly to build group bonds, and therefore consists of ritual calls and responses. We love paying homage to language as communication, through formats like broadcast news, political speeches, and deep conversations. But these mostly consist of rituals.

Consider: when was the last time you changed your mind because of a spoken debate? Perhaps in one of the occasional staged contests between, say, liberals and conservatives, or between atheists and Christians. Every four years, we endure the tedium of televised Presidential debates, but apart from standout moments like “They’re eating the pets,” we remember little of them, and we’re changed by less.

For someone like me, who enjoys unearthing deeper questions, that’s profoundly frustrating. When I talk to friends, I want to talk about things, not just talk at one another. Perhaps that’s why I continue writing this blog, instead of moving to YouTube or TikTok, where I’d receive a larger audience and more feedback. Spoken language, in short, is for building bonds; written language is for information.

Put another way, the question “What’s up?” isn’t about the individuals speaking, it’s about the unit they become together. Bar chats, water cooler conversations, and Passing the Peace at church contain no information; they define the group. Only when we sit down, alone, to read silently, do we truly seek to discover what’s up.

Thursday, April 17, 2025

The Shadows and Glaciers of Northern Norway

C.J. Cooke, The Nesting: a Novel

Sophie Hallerton has just secured a coveted job nannying for an esteemed British widower raising his children in Norway’s remote northern forest. One problem: she isn’t Sophie Hallerton. She’s Lexi Ellis, a chronic screw-up who stole Sophie Hallerton’s credentials to escape looming homelessness, or worse. When Lexi arrives in Norway, though, she finds that Tom Faraday’s house conceals secrets that make her lies seem small.

I really liked C.J. Cooke’s most recent novel, The Book of Witching, which combined family drama, mystery, and historical saga with a distinct voice. So I grabbed Cooke’s 2020 book expecting something similar. Indeed, she mixes liberally again from multiple genres with broad audience appeal. Somehow, though, the ingredients come together without much urgency, and I’m left feeling disappointed as I close the final cover.

Architect Tom Faraday needs a nanny to nurture and homeschool his daughters, because their mother committed suicide in a Norwegian fjord. Anyway, everyone believes Aurelia committed suicide. We dedicated readers know that, the more confidently the characters believe something in Act One, the more certainly they’ll see their beliefs shattered by Act Three. This is just one place where Cooke invites readers to see themselves as in on the joke.

Lexi secures the nanny position with her filched credentials and some improv skills, only to discover she’s pretty effective. But once ensconced in Tom’s rural compound, she finds the entire family up to their eyeballs in deceit and secrets. Tom’s building project, undertaken in honor of his late wife’s earth-friendly principles, is badly overdrawn and short-handed. The housekeeper hovers like Frau Blücher. And Tom’s married business partners are fairly shady, too.

Supernatural elements intrude on Lexi’s rural life. Animal tracks appear inside the house, then vanish without leading anywhere. Tom’s older daughter, just six, draws pictures of the Sad Lady, a half-human spectre that lingers over her memories of Aurelia. The Sad Lady maybe escaped from Aurelia’s hand-translated compendium of Norwegian folklore. A mysterious diary appears in Lexi’s locked bedroom, chock-a-block with implications that Tom might’ve killed his wife.

C.J. Cooke

If this sounds familiar, you aren’t wrong. Cooke introduces her stylistic borrowings in an unusually forthright manner. Lexi reads “Nordic Noir” novels in her spare time, signposting the sepulchral midwinter setting, and Lexi describes her ward’s artwork as “Gothic,” the correct term for this novel’s many locked-room puzzles. This boldly announces Cooke’s two most prominent influences, Henning Mankell and Henry James, whose influence lingers throughout the story.

Unfortunately for contemporary English-language readers, Cooke also writes with those authors’ somber pace. Her story introduces even more narrative threads than I’ve mentioned, and more than the characters themselves know, because her shifting viewpoint means we have information the characters lack. We know how intricate their scaffold of lies has become, and sadly, we know that if that scaffold collapsed, most characters would be more relieved than traumatized.

Cooke unrolls her threads slowly and deliberately. The narration sometimes includes time jumps of weeks, even months. Probably even longer, because Tom’s ambitious experimental earth-house would take considerably longer to build than something conventional and timber-framed; one suspects Cooke doesn’t appreciate the logistics that go into construction. Characters have mind-shattering revelations about each other, sometimes false, then sit on them for months.

Indeed, despite the unarguable presence of a carnivorous Norwegian monster inside the house, it’s possible to forget it, because it disappears for weeks. Cooke’s real interest, and the novel’s real motivation when it has one, is the human drama. We watch the tensions and duplicity inside the Faraday house amplify, a tendency increased by geographic isolation. Indeed, we see every lie the characters tell, except one: what really happened to Aurelia.

This novel would’ve arguably been improved by removing the folk horror subplot and focusing on the human characters. But that would require restructuring the storytelling. The characters linger at a low simmer for chapter after chapter, then someone does something to change the tenor, and for a moment, we reach a boil. Cooke’s Nordic atmospherics, and glacial pace, put the best moments—and there are several good moments—too far apart.

Then, paradoxically, the denouement happens too quickly. After 300 pages of slow, ambient exposition, Cooke abruptly ends the narrative in a manner that leaves many threads unresolved. Despite Cooke’s pacing errors, I found myself invested in Lexi’s journey of discovery, only to find it ends hastily, in a manner scarcely prompted by prior events. Cooke’s narrative doesn’t conclude, it just ends.

I’ll probably read Cooke again. But after this one, I’ll approach her with more caution.

Tuesday, April 15, 2025

Is the Law a Dead Letter Now?

Back in the 1990s, when I was a teenage Republican, I believed humanity would find a legal system so self-sustaining, we could eventually exclude humans from the equation. We could write laws, then deploy the bureaucratic instruments necessary to enforce those laws, without bias or favor, essentially forever. The machine would support itself without inputs from nasty, unreliable humans. We only needed to trust the modernist small-L liberal process.

Okay, we hadn’t written such laws to implement such systems, but that only proved we hadn’t written such laws yet. Because individuals only enforced laws as written, I reckoned, such self-sustaining systems would preclude individual prejudice or demographic bias. (I didn’t realize, for years, that laws themselves could contain bias.) Divisions, disadvantage, and destitution would eventually wither as laws enforced baseline ethical standards which encompassed everyone, everywhere, equally.

Watching the meltdown surrounding Kilmar Abrego Garcia, I see underscored something I gradually realized in my twenties but never previously needed to say aloud: all laws are doomed to fail. Even laws written with altruistic intent and thorough legal support, like the 14th Amendment, work only to the extent that those entrusted to enforce them actually do so. America’s current executive regime is demonstrating no intention to enforce the law justly.

The regime first deported Abrego Garcia in March, despite his legally protected status and the fact that he’d never been convicted of any crime. Initially, the regime acknowledged that they’d expelled Abrego Garcia mistakenly, and based on that acknowledgment, the Supreme Court—dominated by Republican nominees and one-third appointed by the current president—unanimously demanded his return. So the regime flippantly changed the narrative and refused to comply.

This refusal, stated unambiguously in an Oval Office press conference where the American and Salvadoran presidents shared the lectern, demonstrates why the law will inevitably fail. America’s system, predicated on the government’s adherence to the principles laid out in the Constitution, absolutely requires that all participants share a prior commitment. Simply put, they must believe that nation, government, and law are more important than any individual. Even the president.

Kilmar Abrego Garcia (AP photo)

We must strike a balance here, certainly. Individuals write our laws, even when working collectively; our legislators are individuals. The “buck stops here” president, an individual, must balance power with the nine SCOTUS justices and the 535 members of Congress, all of them individuals even when working jointly. But those individuals must all work toward a shared vision, and when they don’t, their whimsy becomes antithetical to state organization.

Please don’t misunderstand me. Any individual may call the nation wrong, as for instance Dr. King did, and may organize to redress such wrong. Indeed, only such public, organized call-out may sway the nation’s conscience sufficiently to enact change or improve a dysfunctional system. The primacy of the nation doesn’t mean citizens must meekly accept arbitrary or unjust directions from a unitary state. That would basically invite autocracy.

Simultaneously, however, those who seek official state power must submit themselves to something larger than their individuality. Dr. King never ran for office, and the tactics he employed when crossing the Edmund Pettus Bridge would’ve been inappropriate in Congress. Indeed, his deputy, John Lewis, who became a Representative, used Dr. King’s tactics to mobilize voters, but submitted himself to forms of order when writing and voting on legislation.

My regular readers, who mostly share my sociopolitical views, may think I’m saying something obvious here. But as I write, the current president’s approval ratings hover between 41% and 49%. That’s negative, and substantially underwater, but at least two in five Americans look at what’s currently happening and don’t mind. They voted for his tariffs, immigrant roundups, and rollbacks of civil rights law, and five months later, their minds remain unchanged.

A substantial fraction of American voters approves of, or at least doesn’t mind, a president placing himself above both Congress and SCOTUS. This president, like Andrew Jackson before him, thinks he’s empowered to force lawful residents off their land unless someone has guns enough to stop him. Essentially, he’ll continue ignoring baselines of justice until someone, presumably Congress, intervenes.

Our entire Constitutional structure requires those elected to power to agree that America is more important than themselves. That means both America, the human collective, and America, the structures of government. If laws require them to act correctly, then they must abide by those laws without threats of force. If they can’t do that, well, that’s what checks and balances are for. If that fails, We the People step in.

Friday, April 11, 2025

A Very Proper and Decorous English Heist

1001 Movies To Watch Before Your Netflix Subscription Dies, Part 54
Charles Crichton (director), The Lavender Hill Mob

Henry Holland (Alec Guinness) is the epitome of the postwar British nothing man: firmly middle class and middle management, he has little to show for his life. He’s spent twenty years supervising gold bullion shipments for a London commercial bank, handling money he’ll never be allowed to touch. One day his bank announces plans to move him to another department, and Henry decides to act. He’ll never see such money himself unless he steals it.

For approximately ten years after World War II, Ealing Studios, Britain’s longest-surviving film studio, produced a string of comedies so consistent they became a brand. They mixed tones throughout, shifting from dry wordplay and dark sarcasm straight into loud, garish slapstick, often in the same scene. They shared certain general themes, though, especially the collision between an Old Britain wounded by the war and a chaotic, freebooting new culture that hadn’t quite found its identity.

When Henry discovers his neighbor, Alfred Pendlebury (Stanley Holloway), owns a small-scale metal foundry, the men decide to collaborate on Henry’s hastily considered heist. Through a caper too silly to recount, Henry and Alfred recruit two small-time hoodlums to perform the actual robbery. This union of jobs, classes, and accents makes a statement about Britain in 1951: the old divisions between castes are melting away. Something new is arising, and that something is probably criminal.

Besides their themes, the classic Ealing comedies shared other traits. Alec Guinness and Stanley Holloway were two among a rotating repertory company appearing in several movies. Films were shot in real-life London streets, and in studios built in repurposed wartime aircraft hangars. The movies’ design bespeaks a Britain that existed only briefly, during the decades between Churchill and Thatcher: hung up on propriety and dignity, but also suddenly young, its history bombed away in the Blitz.

The robbery is plucky, entrepreneurial, almost downright admirable. Henry’s crew execute a slapstick heist so silly, the Keystone Kops would’ve doffed their hats. But having done it, the crew find themselves actually holding a vanload of gold bullion, in a country still cash-strapped and suffering under wartime rationing. Gold is worthless, they discover, unless they can sell it. Which means smuggling it out of the country under the Metropolitan Police’s watchful, but easily distracted, eye.

As in all Ealing comedies, indeed most of 20th-century British comedy, much of the humor comes from watching pretensions disintegrate. In another Guinness starring vehicle, The Man in the White Suit, this disintegration is literal, as conflicting sides tear the title character’s newfangled fabric to shreds. Here, it’s more metaphorical. The more our protagonists’ suits become rumpled, the more their hats fly off in frantic pursuits, the more they escape their prewar class roles.

Alec Guinness (left) and Stanley Holloway in The Lavender Hill Mob

This movie culminates in the police pursuing our antiheroes through London streets. This was seventeen years before Steve McQueen’s Bullitt made car chases a cinema staple, so Henry and Alfred make their own rules: frantic but dignified, they never forget their place. They use police tactics to distract the police, turning British decorum against itself, but their insistence on such polite observance eventually dooms them. These sports can escape everything—except their own British nature.

Alec Guinness plays Henry Holland with a gravitas that exceeds any single character. In later years, he would become famous for playing implacable elder statesmen in classics like The Bridge on the River Kwai and the original Star Wars. Henry Holland contains the seeds of those more famous roles, but Guinness survives indignities we can’t imagine Obi-Wan Kenobi facing. Henry goes from clerk to mastermind to goofy fugitive, all with seamless integrity. Guinness’ decorum never cracks.

This movie is worth watching in itself, but it also introduces the whole Ealing subgenre. It showcases the personalities, themes, and storytelling that made Ealing a classic. Most Ealing comedies were American successes, and repertory actors, especially Guinness, became American stars. But the genre lasted only briefly; the BBC bought the studio in 1955, and later attempts to recapture the Ealing magic failed. Tom Hanks took Guinness’ role in the 2004 remake of The Ladykillers, and tanked.

Put briefly, the genre is a surviving emblem of a time, place, and culture. Like Kingsley Amis’ Lucky Jim, or Douglas Adams’ Arthur Dent, Guinness’ Henry Holland is a British man in a time when being British didn’t mean much anymore. This movie, with its postwar man struggling for dignity amid changing times and a mobilized proletariat, couldn’t have been made any earlier or later than it was. Watching it is like stepping into a time machine.

Friday, April 4, 2025

One Dark Night in an African Dreamland

Yvette Lisa Ndlovu, Drinking from Graveyard Wells: Stories

A recently deceased wife must choose whether to move on to the next life, or become an ancestral avenging spirit in this one. A civil engineer tasked with building a dam must first defeat the carnivorous spirits controlling the river. When houses begin vanishing from an impoverished slum, one gifted girl discovers the disappearances follow a logarithmic pattern. Refugees seeking asylum discover the immigration people aren’t bureaucrats; they’re a priesthood.

Zimbabwean author Yvette Lisa Ndlovu writes from a hybrid perspective: one foot in her homeland, one in the West. Ndlovu herself studied at Cornell and Amherst, and many of her mostly female protagonists are graduates of American (or Americanized) universities. Yet Zimbabwe’s history, both its ancient past and its recent struggles for independence, remains near the surface. For Ndlovu, Western modernism is usually a thin and transparent veneer.

Many of Ndlovu’s stories fall broadly into the categories of “fantasy” or “horror,” but that’s a marketing contrivance. Though many of her stories involve a monster—a primordial horror dwelling under conflict diamond fields, for instance, or carnivorous ants raised to make boner pills—almost never does the monster drive the story. Usually, Ndlovu’s monsters point her protagonists toward a deeper, more disquieting truth underneath the protagonists’ lives.

Instead of outright horror, these stories mostly turn on the friction between expectation and experience. Our protagonists usually start the story believing something rational, or expecting something reasonable. Recurrent themes include meaningful work and graduating from high school, two of the most common aspirations. But life in post-colonial Zimbabwe, with ancient traditions, modern tools of repression, and widespread poverty, always intrudes on those hopes.

In one story, a Zimbabwean student receives a fluke gift from the ancestral gods: she keeps stumbling accidentally into money. But the more money she fumbles into, the more her family expects from her. Soon the escape she sought becomes the burden she resents—until the gods demand an eternal choice.

When a student suffers blackouts, Western medicine cannot help. She consults an oracle, who finds the cure hidden in the past. To escape her condition, the student must time-travel to the early colonial era and recover a military queen whom British historians erased from living memory.

Yvette Lisa Ndlovu

Ndlovu structures some stories more like fables than Western fiction: an island king discovers immortality, but slowly stops being human. A healer erases the burdens of grief, but secretly serves a master whom her patients never see. A handful of newspaper clippings hide the secret pattern governing city women’s lives.

Not every story is “horror” or “fantasy.” In one story, an American college student discovers a common tool of Zimbabwean folk practice, and finds a way to monetize it, at the people’s expense. In another, poverty forces a talented student to leave school and find work; she pays her bills, but watches opportunities flit past.

Concerns of faith and religion recur. Though many of Ndlovu’s characters are Christian, and quote the Bible generously, they do so in a nation where ancient gods might occupy neighborhood houses. She reads the rituals and habits of government as religious rites, which isn’t a stretch. Issues of daily life contain spiritual depth in a nation where nature, death, and hunger always linger on modern life’s margins.

Ndlovu’s stories range from three to sixteen pages. This means they all make for complete reading in one session, with time left over to contemplate her themes. And those themes do require some deeper thought, because she asks important questions about what it means to be modern in traditional communities, or to be poor in a world with more than enough money. She doesn’t let readers off easily.

Perhaps I can give Ndlovu no greater praise than saying her short stories are genuinely short. Too many short story writers today apparently had an idea for a novel, jotted some notes, and thought they had a story. Not so here. Out of fourteen stories, one feels truncated; the other thirteen read as self-contained and thematically complete. That isn’t faint praise, either. I appreciate that Ndlovu crafts fully realized experiences we can savvy in one sitting.

The title story, which is also the last, asks us whether it’s always bad to go unnoticed. The question comes with piercing directness. Characters find themselves disappearing from a society that doesn’t want to see them. But maybe, for those taken away, it’s a Biblical experience. We can’t know, Ndlovu tells us in the rousing final sentences, but maybe that uncertainty is what makes her characters’ lives worth living.

Thursday, March 27, 2025

Sorry, Dad, I Can’t Do Politics Anymore

Defense Secretary Pete Hegseth

My father thinks I should run for elective office. Because I strive to stay informed on local, national, and world affairs, and base my opinions on solid facts and information, he thinks I’m potential leadership material. Me, I thought I only took seriously the 11th-grade American Civics warning to be an involved citizen and voter. But too few people share that value today, and Dad thinks that makes me electable.

This week’s unfolding events demonstrate why I could never hold elective office. We learned Monday that a squadron of Executive Branch bureaucrats, including the National Security Adviser, the Secretary of Defense, and the Vice President, were conducting classified government business by smartphone app. For those sleeping through the story (or reading it later), we know because National Security Adviser Mike Waltz dialed Atlantic editor Jeffrey Goldberg into the group chat.

Unfortunately, Dad is wrong; I’m no better informed than anyone else on unfolding events. I’ve watched the highlights of senators questioning Director of National Intelligence Tulsi Gabbard and CIA head John Ratcliffe, but even then, I’m incapable of watching without collapsing into spitting rage. Gabbard’s vague, evasive answers to simple questions like “were you included in the group chat” indicate an unwillingness to conduct business in an honest, forthright manner.

Not one person on this group chat—and, because Goldberg in his honesty removed himself after verifying the chat’s authenticity, we don’t know everyone on the chat—thought to double-check the roster of participants. This despite using an unsecured app with a history of being hacked. That’s the level of baseline security we’d expect from coworkers organizing a surprise party, not Cabinet secretaries conducting an overseas military strike.

The Administration compounded its unforced errors by lying. On Tuesday, Defense Secretary Pete Hegseth pretended that Goldberg’s chat contained no national security information; on Wednesday, Goldberg published the information. Millions of Americans who share my dedication to competent citizenship couldn’t get their jaws off the floor. Hegseth knew not only that Goldberg had that information, but that he could produce it. And he lied anyway.

National Security Adviser Mike Waltz

In a matter of weeks, we’ve witnessed the devaluation of competence in American society. Trump, who had no government experience before 2016, has peopled his second administration with telegenic muppets who similarly lack either book learning or hands-on proficiency. But then, no wonder, since studies indicate that willingness to vote for Trump correlates broadly with being ill-informed or wrong about facts. We’ve conceived a government by, and for, the ignorant.

Small-d democratic government relies upon two presumptions: that everyone involved is informed on the facts, to the extent that non-specialists can keep informed, and that everyone involved acts in good faith. Both have clearly vanished. The notorious claim that, according to Google Trends, searches for the word “tariffs” spiked the day after Trump’s election apparently isn’t true: they spiked the day before. But even that’s embarrassingly late.

Either way, though, it reveals the uncomfortable truth that Americans don’t value competence anymore, not in themselves, and not in elected decision-makers. This Administration’s systemic lack of qualifications among its senior staff demonstrates the belief that obliviousness equals honesty. Though the President has installed a handful of serious statesmen in his Cabinet, people like Hegseth, Gabbard, and Kash Patel are unburdened by practical experience or tedious ol’ book larnin’.

Now admittedly, I appreciate when voters express their disgust at business-as-usual Democrats. Democratic leadership’s recent willingness to fold like origami cranes when facing even insignificant pushback helps convince cocksure voters that competence and experience are overrated. The GOP Administration’s recent activities have maybe been cack-handed, incompetent, and borderline illegal, but they’re doing something. To the uninitiated, that looks bold and authoritative.

But Dad, that’s exactly why I can’t run for office. Because I’ve lived enough, and read enough, to know that rapid changes and quick reforms usually turn to saltpeter and ash. Changes made quickly get snatched back quickly, especially in a political environment conditioned by digital rage. Rooting out corruption, waste, and bureaucratic intransigence is a slow, painstaking process. Voters today apparently want street theatre. I’m unwilling to do that.

My father might counter by noting that the Administration’s popularity is historically low, that its own voting base is turning away, and that this controversy might be weighty enough to bring them to heel. I say: maybe. But unless voters are willing to recommit themselves to being informed, following events, and knowing more than they did yesterday, the underlying problem will remain. The next quick-fix demagogue will deceive them the same way.