Friday, May 9, 2025

The Ultimate Meaninglessness of “Crime”

We’ve seen an increasing number of anecdotes trickling out about once-loyal voters rejecting the Administration’s ham-handed deportation policies. Though it’s hard to derive meaningful data from isolated anecdotes, the number of stories like this one and this one, about Trump voters getting burned by the administration they once supported, keeps growing. Many stories share a theme: “we” thought the Administration would only deport “criminals,” and we don’t consider ourselves criminals.

On one level, they’re correct: under American statutes, immigration falls under civil, not criminal, law. “Illegal” immigration is a non-category, because the word illegal refers only to crimes, not civil violations. But on another level, this reveals something uncomfortable for many Americans: “crime” itself isn’t a fixed concept. Many undocumented immigrants don’t consider themselves criminals because they’ve committed no violent or property crime; so the Administration simply redefines “crime.”

Much American political discourse centers on “crime,” especially when Democrats hold the Oval Office. As sociologist Barry Glassner writes, fear of crime is a powerful motivator for tradition-minded voters, a motivator Republicans employ effectively. Glassner writes about how rabble rousers used fear of crime to shanghai the Clinton Administration, but the same applies broadly whenever Democrats hold majority power. We saw it during the Obama and Biden years too.

However, exactly what constitutes crime depends on who does the constituting. My core readership probably remembers John Ehrlichman, former White House Counsel, who admitted the Nixon Administration simply fabricated the War on Drugs as pretext to harass anti-war and Civil Rights protesters. The notorious Comstock Laws channeled one man’s sense of injured propriety to criminalize porn, contraception, pharmaceutical abortion, and the kitchen sink. Moral umbrage beats harm in defining “crimes.”

This doesn’t mean harm doesn’t exist or states should repeal every law. Murder, theft, and sexual assault are clearly wrong, because they cause manifest harm and devalue victims’ lives, bodies, and labors. But these transgressions only become “crimes” when governments pass laws against them. Legal philosophers might debate whether decriminalizing murder would make murder happen more often. Personally, I doubt it; neither Prohibition nor its repeal affected drinking numbers much.

Prohibition, therefore, proves the moral fuzziness of crimes. Both the Al Capone-style Prohibition and contemporary drug prohibition arose not from obvious harm (most pot-heads are too lethargic to hurt anybody), but from moral panic and public outrage. Governments made laws against substances lawmakers found abhorrent, then assumed citizens would avoid those substances simply because they’re illegal. Then they acted surprised when drinking and drugs persisted.

This happens because these things aren’t innately crimes; they become crimes because lawmakers make laws. Similarly, while it’s clearly harmful if I steal money from your wallet, other property “crimes” have squishier histories. Squatting, for instance: once legal, it became illegal in America, as James Loewen writes, largely to circumscribe where Native Americans were allowed to hunt and camp. Lawmakers created laws, where none previously existed, to punish transgressors.

Immigration law follows similar patterns. Abrahamic scripture urges the faithful to welcome immigrants because, in that time, borders didn’t really exist. People moved freely and, provided they followed local laws and customs, changed nationality at will. Though serfdom tied workers to lands and lords in the late medieval period, modern concepts of the nation-state and international borders existed only as legal abstractions. Only during wartime did states enforce borders much.

This Administration can redefine civil infractions, like undocumented immigration, as crimes, because that’s how things become crimes. States will borders into existence by legal legerdemain, then demand that people remain permanently circumscribed by these fictional lines. Perhaps that’s why “the Wall” looms so large in MAGA mythology: because borders don’t really exist, so we need something manifest and palpable to make borders real.

The MAGA voters who feel betrayed because the Administration deported their loved ones assumed those loved ones weren’t “criminals” because they used a broad, popular definition of criminality. Their relatives committed no acts of violence or property destruction, they reckoned, so they weren’t criminals. They didn’t anticipate the Administration using crime’s fuzzy, amorphous nature against them, and were therefore caught unprepared when the definition of “crime” moved to surround them.

Civil society has two responses available. We could eliminate self-serving, avaricious laws, and allow people more discretion. There’s no objective reason people must live within certain borders, except that lawmakers need to control despised minorities. But we know society probably won’t choose that response. More likely, our lawmakers will write harsher, more draconian laws to eliminate this flexibility. Which will then be used against us ordinary people.

Monday, May 5, 2025

I'll Be Back, I Guess, Or Whatever

Martha Wells, The Murderbot Diaries Vol. 1

The cyborg that calls itself “Murderbot” would happily watch downloaded soap operas, 24/7, if it had the opportunity. But it has no such liberty: as wholly owned property of an interstellar mining company, it provides security for survey operations on distant planets. Unbeknownst to its owners, though, Murderbot has disabled its own governing systems. Because it doesn’t trust its owners, and it’s prepared to fight them if necessary.

Martha Wells originally published her “Murderbot” stories as freestanding novellas, but those often make for a tough sell at mainstream bookstores. So her publisher is now re-releasing the stories in omnibus paperback editions. Readers get more of Wells’ story arc, which combines sociological science fiction with the open-ended narrative we recognize from prime-time soap operas. Think The Terminator meets Peyton Place.

In the first novella, “All Systems Red,” we discover Murderbot’s character and motivation. It works because it must, and being property, has no right to refuse. But it’s also altered its own programming, granting itself free agency which fellow “constructs” don’t enjoy. If nobody finds out, it can watch its downloads in relative peace. Problem is, someone has infiltrated its latest contract, turning fellow security cyborgs against their humans.

The second novella, “Artificial Condition,” follows Murderbot in its quest to uncover who violated the constructs’ programming and turned work into a slaughter. It just happens that whatever transgression made that violence possible coincides with the biggest secret in Murderbot’s individual history. So Murderbot goes off-grid, seeking information that might shed light on why deep-space mining has recently become such a brutal enterprise.

Wells pinches popular sci-fi action themes readers will recognize from longstanding franchises like Star Trek, Flash Gordon, and Stargate. But she weaves those motifs together with an anthropological investigation of what makes someone human. Murderbot is nameless, sexless, and has no prior identity; it’s a complete cypher. Although it has organic components, they’re lab-grown; no part of Murderbot has ever been even tangentially human.

Martha Wells

Unlike prior artificial persons (Commander Data comes immediately to mind), Murderbot has no desire to become human. It observes humanity as entertainment, and performs its job without complaint. But doing that job has cost humans their lives in the past, a history that gives Murderbot a sense of lingering guilt. This forces it, and us, to ask whether morals and culpability apply to something built in a factory and owned by a corporation.

The questions start small and personal. Murderbot works for its human clients, and exists specifically to keep them alive. But fellow security cyborgs have turned on their owners in another mining camp. This forces Murderbot to question whether its own survival matters enough to risk actual human lives, even tangentially. Murderbot’s own answer is no, but its clients have anthropomorphized their cyborg guard and want it to live.

As details of the crime become clear, so does a larger view of Murderbot’s world. It occupies a universe of interplanetary capitalism, where one’s ability to spend defines one’s survival. Without money or employment history, Murderbot can only investigate the parallel mysteries hanging over its head by trading its one useful commodity: the ability to communicate with technology. With Murderbot around, humanity’s sentient machines start feeling class consciousness.

I’ve already mentioned The Terminator and Star Trek’s Commander Data. Despite its name, Murderbot shares little with either android. It doesn’t want to kill, and admits it would abandon its mission if given the opportunity. But it also doesn’t aspire to become more human. Misanthropic and unburdened by social skills, its greatest aspiration is to be left alone. Yet it knows it cannot have this luxury, and must keep moving in order to survive.

This volume contains two stories which, though published separately, weren’t really written to stand alone. This struck me in the first story: there’s no denouement, only an end. Had I read this novella without a larger context, I probably would’ve resented this, and not bought the second volume. Taken together, though, it’s easier to see the soap operatic motif. Both stories end so abruptly, readers can practically hear the music lingering over the “To Be Continued” title card.

It's easy to enjoy this book. Murderbot, as our first-person narrator, writes with dry sarcasm that contrasts with its setting. It’s forced to pass as human, in an anti-humanist universe where money trumps morality. It only wants privacy, but wherever it goes, it’s required to make friends and basically unionize the sentient machines. Martha Wells uses well-known science fiction building blocks in ironic ways that draw us into Murderbot’s drama.

Monday, April 28, 2025

Further Thoughts on the Futility of Language

Patrick Stewart (left) and Paul Winfield in the Star Trek episode “Darmok”
This essay is a follow-up to my prior essay Some Stray Thoughts on the Futility of Language

The popularity of Star Trek means that, more than most science fiction properties, its references and in-jokes exceed the bounds of genre fandom. Even non-junkies recognize inside references like “Dammit, Jim,” and “Beam me up.” But the unusual specificity of the 1991 episode “Darmok” exceeds those more general references. In that episode, the Enterprise crew encounters a civilization that speaks entirely in metaphors from classical mythology.

Berkeley linguist George Lakoff, in his book Metaphors We Live By, contends that much language consists of metaphors. For Lakoff, this begins with certain small-scale metaphors describing concepts we can’t describe directly: in an argument, we might “defend our position” and “attack our opponents.” We “build an argument from the ground up,” make sure we have “a firm foundation.” The debate ends, eventually, when we “see the other person’s point.”

Such first-level metaphors persist across time because, fundamentally, we need them. Formal debate structures shift little, and the figures of speech remain useful, even as the metaphors of siege warfare become obsolete. As long as speakers and authors repeat the metaphors, they retain their currency. Perhaps, if people stopped passing such metaphors on to the next generation, they might fade away, but so far, that hasn’t happened in any way I’ve spotted.

More pliable metaphors arise from cultural currents that might not persevere in the same way. Readers around my age will immediately recognize the metaphor when I say: “Read my lips, no new taxes.” They may even insert President George H.W. Bush’s hybrid Connecticut/Texas accent. For several years in the late 1980s and early 1990s, the “Read my lips” metaphor bespoke a tough, belligerent political stance that stood inviolate… until it didn’t.

In the “Darmok” episode, to communicate human mythic metaphors, Captain Picard describes the rudiments of the Epic of Gilgamesh, humanity’s oldest known surviving work of fiction. Picard emphasizes his familiarity with ancient myth in the denouement by reading the Homeric Hymns, one of the principal sources of Iron Age Greek religious ritual. For Picard, previously established in canon as an archeology fan, the earliest myths represent humanity’s narrative foundation.

But do they? While a nodding familiarity with Homer’s Odyssey and Iliad remains a staple of liberal education, how many people, outside the disciplines of Sumerology and classical studies, read Gilgamesh and the Homeric Hymns? I daresay that most Americans, if they read mythology at all, mostly read Bulfinch’s Mythology and Edith Hamilton’s Mythology, both of which sanitized Greek tradition for the Christian one-room schoolhouse.

The attached graphic uses two cultural metaphors to describe the writer’s political aspirations. The reference to Elvis on the toilet repeats the widespread cultural myth that Elvis Presley, remembered by fans as the King of Rock and Roll, passed away mid-bowel movement. There’s only one problem: he didn’t. Elvis’ loved ones found him unresponsive on the bathroom floor, following an apparent heart attack; he was pronounced dead at the hospital later that day.

The drift between Elvis as cultural narrative, and Elvis as historic fact, represents the concept of “mythology” in the literary critical sense. We speak of Christian mythology, the mythology of the Founding Fathers, and the myths of the Jersey Devil and prairie jackalope. These different “mythologies” represent neither facts nor lies, but stories we tell to understand concepts too sweeping to address directly. Storytelling becomes a synecdoche for comprehension.

Similarly, the broad strokes of Weekend at Bernie’s have transcended the movie itself. It’s questionable how many people watched the movie, beyond the trailer. But the underlying premise has become a cultural touchstone. Likewise, one can mention The Crying Game or The Sixth Sense, and most Americans will understand the references, whether they’ve seen the movies or not. The vague outlines have become part of our shared mythology.

But the movies themselves haven’t become so. Especially as streaming services have turned movie-watching into a siloed enterprise, how many people watch older movies of an evening? We recognize Weekend at Bernie’s, released in 1989, as the movie where two doofuses use their boss’s corpse as a backstage pass to moneyed debauchery. But I doubt many people could say what actually happens in it, beyond the most sweeping generalities.

Both Elvis and Bernie have come unmoored from fact. Their stories, like those of Gilgamesh and Darmok, no longer matter; only the cultural vibe surrounding them survives. Language becomes a shorthand for understanding, but it stops being a vessel of actual meaning. We repeat the cultural references we think we share, irrespective of whether we know what really happened, because the metaphor, not the fact, matters.

Tuesday, April 22, 2025

Some Stray Thoughts on the Futility of Language

I think I was in seventh grade when I realized that I would probably never understand my peers. In church youth group, a young man approximately my age, but who attended another middle school, talked about meeting his school’s new Egyptian exchange student. “I could tell right away,” this boy—a specimen of handsome, square-jawed Caucasity who looked suspiciously adult, so I already distrusted him—said, “that he was gonna be cool.”

“How could you tell?” the adult facilitator asked.

“Because he knew the right answer when I asked, ‘What’s up?’”

Okay, tripping my alarm bells already. There’s a correct answer to an open-ended question?

Apparently I wasn’t the only one who found that fishy, because the adult facilitator and another youth simultaneously asked, “What’s the correct answer then?”

“He said, ‘What’s up?’” my peer said, accompanied by a theatrically macho chin thrust.

(The student being Egyptian also mattered, in 1987, because this kid evidently knew how to “Walk Like an Egyptian.”)

This peer, and apparently most other preteens in the room, understood something that I, the group facilitator, and maybe two other classmates didn’t understand: people don’t ask “What’s up?” because they want to know what’s up. They ask because it’s a prescribed social ritual with existing correct responses. This interaction, which I perceived as a request for information, is actually a ritual, about as methodical and prescriptive as a Masonic handshake.

My adult self, someone who reads religious theory and social science for fun, recognizes something twelve-year-old Kevin didn’t know. This pre-scripted social interaction resembles what Émile Durkheim called “liturgy,” the prescriptive language religious people use in ceremonial circumstances. Religious liturgy permits fellow believers to state the same moral principles in unison, thus reinforcing their shared values. It also inculcates their common identity as a people.

The shared linguistic enterprise, which looks stiff, meaningless, and inflexible to outsiders, is purposive to those familiar with the liturgy. Speaking the same words together, whether the Apostles’ Creed or the Kaddish or the Five Pillars of Islam, serves to transform the speakers. Same with secular liturgy: America’s Pledge of Allegiance comes to mind. Durkheim cited his native France’s covenants of Liberté, Égalité, Fraternité.

This confused me, a nerdy and socially inept kid who understood life mainly through books, because I thought language existed to convey information. Because “What’s up?” is structured as a question, I perceived it as a question, meaning I perceived it as a request for clarifying information. I thought the “correct” answer was either a sarcastic rejoinder (“Oh, the sky, a few clouds…”) or an actual narrative of significant recent events.

No, I wasn’t that inept; I understood that when most people ask “How are you today,” it’s a linguistic contrivance, and the correct answer is “fine.” I understood that people didn’t really want to know how you’re doing, especially if you’re doing poorly. But even then, I took the language as primarily informative: I’m here, the answer says, and I’m actively listening to you speak.

However, the “What’s up?” conundrum continues to nag me, nearly forty years later, because it reveals that most people don’t want information, at least not in spoken form. Oral language exists mainly to build group bonds, and therefore consists of ritual calls and responses. We love paying homage to language as communication, through formats like broadcast news, political speeches, and deep conversations. But these mostly consist of rituals.

Consider: when was the last time you changed your mind because of a spoken debate? That might include the occasional staged debates between, say, liberals and conservatives, or between atheists and Christians. Every four years, we endure the tedium of televised Presidential debates, but apart from standout moments like “They’re eating the pets,” we remember little of them, and we’re changed by less.

For someone like me, who enjoys unearthing deeper questions, that’s profoundly frustrating. When I talk to friends, I want to talk about things, not just talk at one another. Perhaps that’s why I continue writing this blog, instead of moving to YouTube or TikTok, where I’d receive a larger audience and more feedback. Spoken language, in short, is for building bonds; written language is for information.

Put another way, the question “What’s up?” isn’t about the individuals speaking, it’s about the unit they become together. Bar chats, water cooler conversations, and Passing the Peace at church contain no information; they define the group. Only when we sit down, alone, to read silently, do we truly seek to discover what’s up.

Thursday, April 17, 2025

The Shadows and Glaciers of Northern Norway

C.J. Cooke, The Nesting: a Novel

Sophie Hallerton has just secured a coveted job nannying for an esteemed British widower raising his children in Norway’s remote northern forest. One problem: she isn’t Sophie Hallerton. She’s Lexi Ellis, a chronic screw-up who stole Sophie Hallerton’s credentials to escape looming homelessness, or worse. When Lexi arrives in Norway, though, she finds that Tom Faraday’s house conceals secrets that make her lies seem small.

I really liked C.J. Cooke’s most recent novel, The Book of Witching, which combined family drama, mystery, and historical saga with a distinct voice. So I grabbed Cooke’s 2020 book expecting something similar. Indeed, she mixes liberally again from multiple genres with broad audience appeal. Somehow, though, the ingredients come together without much urgency, and I’m left feeling disappointed as I close the final cover.

Architect Tom Faraday needs a nanny to nurture and homeschool his daughters, because their mother committed suicide in a Norwegian fjord. Anyway, everyone believes Aurelia committed suicide. We dedicated readers know that, the more confidently the characters believe something in Act One, the more certainly they’ll see their beliefs shattered by Act Three. This is just one place where Cooke invites readers to see themselves as in on the joke.

Lexi secures the nanny position with her filched credentials and some improv skills, only to discover she’s pretty effective. But once ensconced in Tom’s rural compound, she finds the entire family up to their eyeballs in deceit and secrets. Tom’s building project, undertaken in honor of his late wife’s earth-friendly principles, is badly over budget and short-handed. The housekeeper hovers like Frau Blücher. And Tom’s married business partners are fairly shady, too.

Supernatural elements intrude on Lexi’s rural life. Animal tracks appear inside the house, then vanish without leading anywhere. Tom’s older daughter, just six, draws pictures of the Sad Lady, a half-human spectre that lingers over her memories of Aurelia. The Sad Lady maybe escaped from Aurelia’s hand-translated compendium of Norwegian folklore. A mysterious diary appears in Lexi’s locked bedroom, chock-a-block with implications that Tom might’ve killed his wife.

C.J. Cooke

If this sounds familiar, you aren’t wrong. Cooke introduces her stylistic borrowings in an unusually forthright manner. Lexi reads “Nordic Noir” novels in her spare time, signposting the sepulchral midwinter setting, and Lexi describes her ward’s artwork as “Gothic,” the correct term for this novel’s many locked-room puzzles. This boldly announces Cooke’s two most prominent influences, Henning Mankell and Henry James, whose fingerprints linger throughout the story.

Unfortunately for contemporary English-language readers, Cooke also writes with those authors’ somber pace. Her story introduces even more narrative threads than I’ve mentioned, and more than the characters themselves know, because her shifting viewpoint means we have information the characters lack. We know how intricate their scaffold of lies has become, and sadly, we know that if that scaffold collapsed, most characters would be more relieved than traumatized.

Cooke unrolls her threads slowly and deliberately. The narration sometimes includes time jumps of weeks, even months. Probably even longer, because Tom’s ambitious experimental earth-house would take considerably longer to build than something conventional and timber-framed; one suspects Cooke doesn’t appreciate the logistics that go into construction. Characters have mind-shattering revelations about each other, sometimes false, then sit on them for months.

Indeed, despite the unarguable presence of a carnivorous Norwegian monster inside the house, it’s possible to forget it, because it disappears from the story for weeks. Cooke’s real interest, and the novel’s real motivation when it has one, is the human drama. We watch the tensions and duplicity inside the Faraday house amplify, a tendency increased by geographic isolation. Indeed, we see every lie the characters tell, except one: what really happened to Aurelia.

This novel would’ve arguably been improved by removing the folk horror subplot, focusing on the human characters. But that would require restructuring the storytelling. The characters linger at a low simmer for chapter after chapter, then someone does something to change the tenor, and for a moment, we reach a boil. Cooke’s Nordic atmospherics, and glacial pace, put the best moments—and there are several good moments—too far apart.

Then, paradoxically, the denouement happens too quickly. After 300 pages of slow, ambient exposition, Cooke abruptly ends the narrative in a manner that leaves many threads unresolved. Despite Cooke’s pacing errors, I found myself invested in Lexi’s journey of discovery, only to find it ends hastily, in a manner scarcely prompted by prior events. Cooke’s narrative doesn’t conclude, it just ends.

I’ll probably read Cooke again. But after this one, I’ll approach her with more caution.

Tuesday, April 15, 2025

Is the Law a Dead Letter Now?

Back in the 1990s, when I was a teenage Republican, I believed humanity would find a legal system so self-sustaining, we could eventually exclude humans from the equation. We could write laws, then deploy the bureaucratic instruments necessary to enforce those laws, without bias or favor, essentially forever. The machine would support itself without inputs from nasty, unreliable humans. We only needed to trust the modernist small-l liberal process.

Okay, we hadn’t written such laws to implement such systems, but that only proved we hadn’t written such laws yet. Because individuals only enforced laws as written, I reckoned, such self-sustaining systems would preclude individual prejudice or demographic bias. (I didn’t realize, for years, that laws themselves could contain bias.) Divisions, disadvantage, and destitution would eventually wither as laws enforced baseline ethical standards which encompassed everyone, everywhere, equally.

Watching the meltdown surrounding Kilmar Abrego Garcia, I see underlined something I gradually realized in my twenties but never previously needed to say aloud: all laws are doomed to fail. Even laws written with altruistic intent and thorough legal support, like the 14th Amendment, work only to the extent that those entrusted to enforce them actually do so. America’s current executive regime is demonstrating no intention to enforce the law justly.

The regime first deported Abrego Garcia in March, despite him having legal residency status and never having been convicted of any crime. Initially, the regime acknowledged that they’d expelled Abrego Garcia mistakenly, and based on that acknowledgment, the Supreme Court—dominated by Republican nominees and one-third appointed by the current president—unanimously demanded his return. So the regime flippantly changed the narrative and refused to comply.

This refusal, stated unambiguously in an Oval Office press conference where the American and Salvadoran presidents shared the lectern, demonstrates why the law will inevitably fail. America’s system, predicated on the government’s adherence to the principles laid out in the Constitution, absolutely requires that all participants share a prior commitment. Simply put, they must believe that nation, government, and law, are more important than any individual. Even the president.

Kilmar Abrego Garcia (AP photo)

We must strike a balance here, certainly. Individuals write our laws, even when working collectively; our legislators are individuals. The “buck stops here” president, an individual, must balance power with the nine SCOTUS justices and the 535 members of Congress, who are all individuals, even when working jointly. But those individuals all work for a shared vision, and when they don’t, their whimsy becomes antithetical to state organization.

Please don’t misunderstand me. Any individual may call the nation wrong, as for instance Dr. King did, and may organize to redress such wrong. Indeed, only such a public, organized call-out may sway the nation’s conscience sufficiently to enact change or improve a dysfunctional system. The primacy of the nation doesn’t mean citizens must meekly accept arbitrary or unjust directions from a unitary state. That would basically invite autocracy.

Simultaneously, however, those who seek official state power must submit themselves to something larger than their individuality. Dr. King never ran for office, and the tactics he employed when crossing the Edmund Pettus Bridge would’ve been inappropriate in Congress. Indeed, his deputy, John Lewis, who became a Representative, used Dr. King’s tactics to mobilize voters, but submitted himself to forms of order when writing and voting on legislation.

My regular readers, who mostly share my sociopolitical views, may think I’m saying something obvious here. But as I write, the current president’s approval ratings hover between 41% and 49%. That’s negative, and substantially underwater, but at least two in five Americans look at what’s currently happening, and don’t mind. They voted for his tariffs, immigrant roundups, and rollbacks of civil rights law, and five months later, they remain unchanged.

A substantial fraction of American voters approves of, or at least doesn’t mind, a president placing himself above either Congress or SCOTUS. This president, like Andrew Jackson before him, thinks he’s empowered to force lawful residents off their land, unless someone has guns enough to stop him. Essentially, he’ll continue ignoring baselines of justice until someone, presumably Congress, does something to stop him.

Our entire Constitutional structure requires those elected to power, to agree that America is more important than themselves. That means both America, the human collective, and America, the structures of government. If laws require them to act correctly, then they must abide by those laws without threats of force. If they can’t do that, well, that’s what checks and balances are for. If that fails, We the People step in.

Friday, April 11, 2025

A Very Proper and Decorous English Heist

1001 Movies To Watch Before Your Netflix Subscription Dies, Part 54
Charles Crichton (director), The Lavender Hill Mob

Henry Holland (Alec Guinness) is the epitome of the postwar British nothing man: firmly middle class and middle management, he has little to show for his life. He’s spent twenty years supervising gold bullion shipments for a London commercial bank, handling money he’ll never be allowed to touch. One day his bank announces plans to move him to another department, and Henry decides to act. He’ll never see such money himself unless he steals it.

For approximately ten years after World War II, Ealing Studios, Britain’s longest surviving film studio, produced a string of comedies so consistent, they became a brand. They mixed tones throughout, shifting from dry wordplay and dark sarcasm, straight into loud, garish slapstick, often in the same scene. They shared certain general themes, though, especially the collision between Old Britain, wounded by the war, and a chaotic, freebooting new culture that hadn’t quite found its identity.

When Henry discovers his neighbor, Alfred Pendlebury (Stanley Holloway), owns a small-scale metal foundry, the men decide to collaborate on Henry’s hastily considered heist. Through a caper too silly to recount, Henry and Alfred recruit two small-time hoodlums to perform the actual robbery. This union of jobs, classes, and accents makes a statement about Britain in 1951: the old divisions between castes are melting away. Something new is arising, and that something is probably criminal.

Besides their themes, the classic Ealing comedies shared other traits. Alec Guinness and Stanley Holloway were two among a rotating repertory company appearing in several movies. Films were shot in real-life London streets, and in studios built in repurposed wartime aircraft hangars. The movies’ design bespeaks a Britain that existed only briefly, during the decades between Churchill and Thatcher: hung up on propriety and dignity, but also suddenly young, history bombed away in the Blitz.

The robbery is plucky, entrepreneurial, almost downright admirable. Henry’s crew execute a slapstick heist so silly, the Keystone Kops would’ve doffed their hats. But having done it, the crew find themselves actually holding a vanload of gold bullion, in a country still cash-strapped and suffering under wartime rationing. Gold is worthless, they discover, unless they can sell it. Which means smuggling it out of the country under the Metropolitan Police’s watchful, but easily distracted, eye.

As in all Ealing comedies, indeed most of 20th-century British comedy, much of the humor comes from watching pretensions disintegrate. In another Guinness starring vehicle, The Man in the White Suit, this disintegration is literal, as conflicting sides tear the title character’s newfangled fabric to shreds. Here, it’s more metaphorical. The more our protagonists’ suits become rumpled, the more their hats fly off in frantic pursuits, the more they escape their prewar class roles.

Alec Guinness (left) and Stanley Holloway in The Lavender Hill Mob

This movie culminates in the police pursuing our antiheroes through London streets. This was seventeen years before Steve McQueen’s Bullitt made car chases a cinema staple, so Henry and Alfred make their own rules: frantic but dignified, they never forget their place. They use police tactics to distract the police, turning British decorum against itself, but their insistence on such polite observance eventually dooms them. These sports can escape everything—except their own British nature.

Alec Guinness plays Henry Holland with a gravitas which exceeds one character. In later years, he would become famous for playing implacable elder statesmen in classics like The Bridge on the River Kwai and the original Star Wars. This character has seeds of these more famous roles, but Guinness survives indignities we can’t imagine Obi-Wan Kenobi facing. Henry Holland goes from clerk to mastermind to goofy fugitive, all with seamless integrity. Guinness’ decorum never cracks.

This movie is worth watching in itself, but it also introduces the whole Ealing subgenre. It showcases the personalities, themes, and storytelling that made Ealing a classic. Most Ealing comedies were American successes, and repertory actors, especially Guinness, became American stars. But the genre lasted only briefly; the BBC bought the studio in 1957, and attempts to recapture the Ealing magic failed. Tom Hanks took Guinness’ role in a remake of The Ladykillers, and tanked.

Put briefly, the genre is a surviving emblem of a time, place, and culture. Like Kingsley Amis’ Lucky Jim, or Douglas Adams’ Arthur Dent, Guinness’ Henry Holland is a British man in a time when being British didn’t mean much anymore. This movie, with its postwar man struggling for dignity amid changing times and a mobilized proletariat, couldn’t have been made any earlier or later than it was. Watching it is like stepping into a time machine.

Friday, April 4, 2025

One Dark Night in an African Dreamland

Yvette Lisa Ndlovu, Drinking from Graveyard Wells: Stories

A recently deceased wife must choose whether to move onto the next life, or become an ancestral avenging spirit in this life. A civil engineer tasked with building a dam must first defeat the carnivorous spirits controlling the river. When houses begin vanishing from an impoverished slum, one gifted girl discovers the disappearances follow a logarithmic pattern. Refugees seeking asylum discover the immigration people aren’t bureaucrats, they’re a priesthood.

Zimbabwean author Yvette Lisa Ndlovu writes from a hybrid perspective: one foot in her homeland, one in the West. Ndlovu herself studied at Cornell and Amherst, and many of her mostly female protagonists are graduates of American (or Americanized) universities. Yet Zimbabwe’s history, both its ancient past and its recent struggles for independence, remains near the surface. For Ndlovu, Western modernism is usually a thin and transparent veneer.

Many of Ndlovu’s stories fall broadly into the categories of “fantasy” or “horror,” but that’s a marketing contrivance. Though many of her stories involve a monster—a primordial horror dwelling under conflict diamond fields, for instance, or carnivorous ants raised to make boner pills—almost never does the monster drive the story. Usually, Ndlovu’s monsters point her protagonists toward a deeper, more disquieting truth underneath the protagonists’ lives.

Instead of outright horror, these stories mostly turn on the friction between expectation and experience. Our protagonists usually start the story believing something rational, or expecting something reasonable. Recurrent themes include meaningful work and graduating from high school, two of the most common aspirations. But life in post-colonial Zimbabwe, with ancient traditions, modern tools of repression, and widespread poverty, always intrudes on those hopes.

In one story, a Zimbabwean student receives a fluke gift from the ancestral gods: she keeps stumbling into money. But the more money she fumbles into, the more her family expects from her. Soon the escape she sought becomes the burden she resents—until the gods demand an eternal choice.

When a student suffers blackouts, Western medicine cannot help. She consults an oracle, who finds the cure hidden in the past. To escape her condition, the student must time-travel to early colonialism and recover a military queen whom the British historians erased from living memory.

Yvette Lisa Ndlovu

Ndlovu structures some stories more like fables than Western fiction: an island king discovers immortality, but slowly stops being human. A healer erases the burdens of grief, but secretly serves a master whom her patients never see. A handful of newspaper clippings hide the secret pattern governing city women’s lives.

Not every story is “horror” or “fantasy.” In one story, an American college student discovers a common tool of Zimbabwean folk practice, and finds a way to monetize it, at the people’s expense. In another, poverty forces a talented student to leave school and find work; she pays her bills, but watches opportunities flit past.

Concerns of faith and religion recur. Though many of Ndlovu’s characters are Christian, and quote the Bible generously, they do so in a nation where ancient gods might occupy neighborhood houses. She reads the rituals and habits of government as religious rites, which isn’t a stretch. Issues of daily life contain spiritual depth in a nation where nature, death, and hunger always linger on modern life’s margins.

Ndlovu’s stories range from three to sixteen pages. This means they all make for complete reading in one session, with time left over to contemplate her themes. And those themes do require some deeper thought, because she asks important questions about what it means to be modern in traditional communities, or to be poor in a world with more than enough money. She doesn’t let readers off easily.

Perhaps I can give Ndlovu no greater praise than saying her short stories are genuinely short. Too many short story writers today apparently had an idea for a novel, jotted some notes, and thought they had a story. Not so here. Out of fourteen stories, one feels truncated; the other thirteen read as self-contained and thematically complete. That isn’t faint praise, either. I appreciate that Ndlovu crafts fully realized experiences we can savvy in one sitting.

The title story, which is also the last, asks us whether it’s always bad to go unnoticed. The question comes with piercing directness. Characters find themselves disappearing from a society that doesn’t want to see them. But maybe, for those taken away, it’s a Biblical experience. We can’t know, Ndlovu tells us in the rousing final sentences, but maybe that uncertainty is what makes her characters’ lives worth living.

Thursday, March 27, 2025

Sorry, Dad, I Can’t Do Politics Anymore

Defense Secretary Pete Hegseth

My father thinks I should run for elective office. Because I strive to stay informed on local, national, and world affairs, and base my opinions on solid facts and information, he thinks I’m potential leadership material. Me, I thought I only took seriously the 11th-grade American Civics warning to be an involved citizen and voter. But too few people share that value today, and Dad thinks that makes me electable.

This week’s unfolding events demonstrate why I could never hold elective office. We learned Monday that a squadron of Executive Branch bureaucrats, including the National Security Adviser, the Secretary of Defense, and the Vice President, were conducting classified government business by smartphone app. For those sleeping through the story (or reading it later), we know because National Security Adviser Mike Waltz dialed Atlantic editor Jeffrey Goldberg into the group chat.

Unfortunately, Dad is wrong; I’m no better informed than anyone else on unfolding events. I’ve watched the highlights of senators questioning Director of National Intelligence Tulsi Gabbard and CIA head John Ratcliffe, but even then, I’m incapable of watching without collapsing into spitting rage. Gabbard’s vague, evasive answers on simple questions like “were you included in the group chat” indicate an unwillingness to conduct business in an honest, forthright manner.

Not one person on this group chat—and, because Goldberg in his honesty removed himself after verifying the chat’s accuracy, we don’t know everyone on the chat—thought to double-check the roster of participants. This despite using an unsecured app with a history of hacking. That’s the level of baseline security we’d expect from coworkers organizing a surprise party, not Cabinet secretaries conducting an overseas military strike.

The Administration compounded its unforced errors by lying. On Tuesday, Defense Secretary Pete Hegseth pretended that Goldberg’s chat contained no national security information; on Wednesday, Goldberg published the information. Millions of Americans who share my dedication to competent citizenship couldn’t get our jaws off the floor. Hegseth knew not only that Goldberg had that information, but that he could produce it. And he lied anyway.

National Security Adviser Mike Waltz

In a matter of weeks, we’ve witnessed the devaluation of competence in American society. Trump, who had no government experience before 2016, has peopled his second administration with telegenic muppets who similarly lack either book learning or hands-on proficiency. But then, no wonder, since studies indicate that willingness to vote for Trump correlates broadly with being ill-informed or wrong about facts. We’ve conceived a government by, and for, the ignorant.

Small-d democratic government relies upon two presumptions: that everyone involved is informed on the facts, to the extent that non-specialists could possibly keep informed, and that everyone involved acts in good faith. Both have clearly vanished. The notorious claim that, according to Google search data, searches for the word “tariffs” spiked the day after Trump’s election apparently isn’t true: they spiked the day before. But even that’s embarrassingly late.

Either way, though, it reveals the uncomfortable truth that Americans don’t value competence anymore, not in themselves, and not in elected decision-makers. This Administration’s systemic lack of qualifications among its senior staff demonstrates the belief that obliviousness equals honesty. Though the President has installed a handful of serious statesmen in his Cabinet, people like Hegseth, Gabbard, and Kash Patel are unburdened by practical experience or tedious ol’ book larnin’.

Now admittedly, I appreciate when voters express their disgust at business-as-usual Democrats. Democratic leadership’s recent willingness to fold like origami cranes when facing even insignificant pushback, helps convince cocksure voters that competence and experience are overrated. The GOP Administration’s recent activities have maybe been cack-handed, incompetent, and borderline illegal, but they’re doing something. To the uninitiated, that looks bold and authoritative.

But Dad, that’s exactly why I can’t run for office. Because I’ve lived enough, and read enough, to know that rapid changes and quick reforms usually turn to saltpeter and ash. Changes made quickly, get snatched back quickly, especially in a political environment conditioned by digital rage. Rooting out corruption, waste, and bureaucratic intransigence is a slow, painstaking process. Voters today apparently want street theatre. I’m unwilling to do that.

My father might counter by noting that the Administration’s popularity is historically low, that its own voting base is turning away, and that this controversy might be weighty enough to bring them to heel. I say: maybe. But unless voters are willing to recommit themselves to being informed, following events, and knowing better than yesterday, the underlying problem will remain. The next quick-fix demagogue will deceive them the same way.

Monday, March 24, 2025

The Stains of Politics Don’t Wash Off

1001 Books To Read Before Your Kindle Battery Dies, Part 120
Lawrence O’Donnell, Playing with Fire: The 1968 Election and the Transformation of American Politics

Those of us born afterward—which, over half a century later, is most of us—probably know the 1968 presidential election for LBJ’s abrupt withdrawal, or for Bobby Kennedy’s assassination. As this highly contested election recedes from living memory, we risk losing important context that helped define a generation’s relationship with politics. Just as important, without that knowledge, we’re vulnerable to those who would exploit weaknesses that still exist.

Lawrence O’Donnell is a journalist, not an historian; but some of the best history for mass-market readers in recent years has come from journalists. Names like David Zucchino and Joe Starita have returned lost history to Americans, sometimes redefining our self-image in the process. Unconstrained by the necessities of academic writing, journalists can deep-dive into primary sources and spin them into vernacular English. Which is exactly what O’Donnell does here.

As 1968 began, the Democratic party split over Lyndon Johnson. The President’s civil rights legislation alienated old-line conservative Dixiecrats, but his deepening commitments in Vietnam left White progressives politically homeless. This led not one, but two sitting senators to challenge Johnson in the Democratic primary. Bobby Kennedy, son of a political dynasty, moved slowly and strategically, but Eugene McCarthy, a former professor, mustered the enthusiasm of college students and Yippies.

Republicans faced substantially similar problems. George Romney, a center-right maven, failed to muster much enthusiasm, so the question became whom the Republicans would embrace. That old anti-communist firebrand Richard Nixon told Cold War Americans what they wanted to hear, and whom they should fear. But New York governor Nelson Rockefeller offered a relatively optimistic, liberal option. 1968 would sound the death knell for Republican liberalism.

But then as now, the presidential election wasn’t only about the political maneuvering. In 1968, Martin Luther King, Jr., was ramping up his “Poor People’s Campaign” to new heights. He’d come to Memphis that fateful day to help organize the city’s striking sanitation workers. When an assassin’s bullet struck him, it exposed an entire generation’s bottled rage. The resulting nationwide explosion redefined the terms for would-be presidential candidates.

Lawrence O’Donnell

Back then, the primary system didn’t matter like it does now. Kennedy, McCarthy, Rockefeller, and Ronald Reagan led aggressive ground campaigns to garner votes, but Nixon and Democratic Vice President Hubert Humphrey played internal party politics. Their aggressive gladhanding secured party nominations for two candidates who didn’t necessarily have deep grass roots. Both parties would have to change future nominating processes to address this injustice.

Even more than the individual candidates, 1968 would redefine party identities. Before the Sixties, party loyalty had more to do with community and geography than issues and platforms. That’s why Johnson, a Democrat from Texas, could shepherd multiple civil rights bills through a historically divided Congress. It’s also why a liberal like Rockefeller, a moderate like Romney, and a conservative like Nixon could compete for the Republican nomination.

But new coalitions based on domestic issues changed these alignments. Over the course of 1968, the Nixon contingent, backed by that old segregationist Strom Thurmond, squeezed the liberal Republicans out. Meanwhile, as two anti-war candidates vied for the Democratic nomination, party regulars closed ranks to preserve the Johnson wing’s privilege. Thus, throughout the 1968 primary campaign, the Republican Party became increasingly conservative, and so did the Democrats.

O’Donnell wrote this book directly after the 2016 election, and comments liberally on how the Nixon/Humphrey contest presaged the Trump presidency. As always, history is about what happened, but it’s also about us, the contemporary readers. Presidential campaigns, once scholarly affairs based on debate and communication, became increasingly oriented toward television. Vibes became more important than policies—and Democratic Party vibes were so authoritarian, they made Nixon look amiable.

At the same time, O’Donnell omits information his audience could learn from. He never names the assassins who killed Dr. King and Bobby Kennedy, and never mentions their motivations; James Earl Ray and Sirhan Sirhan don’t even merit index entries. Therefore, Ray’s connection to organized bigotry gets erased, as does Sirhan’s anger over the Six-Day War. O’Donnell pores over living candidates’ policies extensively, but assassins’ bullets apparently just happen.

In O’Donnell’s telling, the 1968 election serves as a fable to explain Trump-era politics. By examining the partisan extremes that calcified during this campaign, we gain the vocabulary to understand what happened in 2016. And though O’Donnell couldn’t have anticipated it, his words became more necessary, his vocabulary more trenchant, after the disaster of 2024. It’s often difficult to examine the present dispassionately, but the past offers us useful tools.

Saturday, March 22, 2025

Why Ostara?

A 19th century engraving depicting
Ostara (source)

Each year during Lent, social media surges with claims that Easter derives its name and mythology from the Anglo-Saxon fertility goddess Ostara. These memes often include claims about how Ostara gives us numerous Easter myths: that rabbits and eggs were her sacred symbols, that her worship involved sexual rituals which early Christians suppressed, even that Ostara died and rose again. These claims are largely fictional; Ostara’s actual mythology is lost.

Less interesting than what Anglo-Saxons believed, or didn’t, about Ostara, is the eagerness with which online critics invent Ostara mythology. No information about Ostara, beyond her name, survives, yet commentators assert a panoply of just-so stories, many beginning with “it’s said” or “the story goes,” variations on the folkloric “Once Upon a Time.” Some such stories are pilfered from Germanic or Near Eastern religions; others seem to be purely fabricated.

Such attempts to revive otherwise lost pre-Christian religions seem counterintuitive. The so-called New Atheists, like Sam Harris and Richard Dawkins, claim that scientific modernity doesn’t need creation myths and just-so stories to organize society. Yet even as Christianity seems ever-further removed from today’s culture, at least a vocal contingent seeks moral justification, not in science, but in ancient myth. The very antiquity of pre-Christian myth gives it exotic appeal.

Multiple factors contribute to why Christianity, and its myths and practices, are fading in Western Civilization. Clergy abuses, past and present, surely contribute. Christianity’s association with warlike, nationalistic, and racist factions doesn’t help. Even its ancient texts, unchanged since the Iron Age, make it seem weighted with antique baggage. But I’d suggest one important reason Christianity seems distant from modern culture: the religion focuses heavily on death.

Why does Jesus’ suffering and death dominate Christian theology? The Apostle Paul highlights Christ’s crucifixion and resurrection, far beyond Jesus’ moral lessons. Christianity originally spread amid conditions where death was commonplace; most people died, not in hospitals, but at home, surrounded by family. Funerals were massive public gatherings signified by music, food, and other festival trappings. Such events still sometimes happen in rural areas, but have become uncommon elsewhere.

A 19th century Easter card (source)

Rather, modern death has become aberrant. The most common causes of death throughout history—tuberculosis, malaria, bubonic plague, polio, tetanus, whooping cough—have become rare in the last century, the time when Christianity saw its fastest decline. Even industrial accidents and war wounds are treatable in ways past generations didn’t know. Death, once so ever-present that people discussed their funeral preparations over family dinner, has become rare, distant, and distasteful.

Theologians have created convoluted justifications for Christ’s death and resurrection. As Fleming Rutledge writes, virtually no such justifications withstand scrutiny. But for early Christians, no justification was necessary; Christ died because we’ll eventually die, probably sooner rather than later. That camaraderie with God brings comfort. I’ve known two atheist friends who embraced faith and prayer when loved ones were dying, then returned to unbelief when the crisis passed.

But death doesn’t define Ostara. Though some online stories claim she dies and is resurrected every spring, these stories are peripheral. The made-up myths generally highlight fertility, growth, planting, and sex. Concocted myths prioritize life, flourishing, and birth, which seem closer to modern daily experience. In a culture where death seems abnormal, a unifying spiritual narrative privileging birth and life arguably makes sense. Penicillin rendered Christianity obsolete.

This stumbles on one important problem: we’re still going to die. As someone who recently watched a loved one struggle on life support before the merciful end, I find the Easter narrative of God’s mortality comforting in new ways. But we’ve made death distant and antiseptic, hidden inside hospital or nursing home walls, no longer present with daily life. Death has become atypical, but we’re still going to die.

Speaking personally, in past years, I’ve found the romantic mythmaking of Ostara merely treacly. This year, it’s become something more pointed, something harsher. It’s become an active denial of human inevitability, and a shared refusal to accept the human condition. Modern technological society hides death, and dying persons, in antiseptic conditions, pretending they don’t exist. Life has become an eternal present, a permanent now.

I’ve written about this before: myths are ultimately not about the truth, they’re about the people who create them. But in this case, the attempts to invent new “ancient” myths about a lost folk religion aren’t just explanations. They reveal a way modern society denies an important aspect of life, and hides our mortal end like a shameful thing. These myths look cute, but they’re subtly dangerous.

Wednesday, March 19, 2025

Chatterbox Jazz and the Victim Complex, Part Two

This essay is a follow-up to Chatterbox Jazz and the White Victim Complex
Another angle on the entrance to the Chatterbox Jazz Club, which only a
complete doofus would mistake for apolitical. (source)

I can’t help considering the parallels, and the lack of parallels, between Elise Hensley, who videoed herself getting ejected from the Chatterbox Jazz Club, and George Floyd. To reiterate, Hensley almost certainly recorded her expulsion deliberately, hoping to cultivate the impression of herself as an oppressed minority. But so far, the explosion of outrage she expected hasn’t arisen. It’s worth taking some time to consider why.

Hensley’s video and Darnella Frazier’s recording of George Floyd’s death might seem superficially similar to chronically online denizens. Both filmed on cellphone cameras, these videos show what their respective target audiences consider an injustice. But the online outrage machine flourishes on such displays of false equivalency. Hensley’s staged confrontation and George Floyd’s unplanned murder only resemble one another to lazy media consumers.

To exactly such lazy consumers, the sequence appears simple: somebody distributed video of an injustice in progress. Millions of Americans were outraged. Protesters filled the streets. Ta-dah! We see similar reasoning in the hundreds of January 6th, 2021, rioters who live-streamed their push into the Capitol Building, invoking metaphors of Civil War and 1776: they thought simply seeing provocative media created public sentiment.

This bespeaks a specific attitude, not toward current events, but toward media. Lazy consumers see events not as events, but as content, and information distribution not as journalism, but as content creation. Functionally, Hensley doesn’t elevate herself to George Floyd’s level, she lowers George Floyd to her level. The spontaneous recording of an actual crime in progress becomes neither better nor worse than her forced confrontation with a queer bartender.

Let me emphasize, this isn’t merely a conservative phenomenon. I’ve struggled to follow political TikTok because, Left and Right alike, it mostly consists of homebrew “journalists” either repeating somebody else’s breaking reports, or shouting angrily at like-minded believers from their car or bedroom. The read-write internet has expanded citizens’ speaking capacity to, hypothetically, infinity, depending on server space. But it’s created little new information.

But conservatives, especially White conservatives, receive one key point differently. They see stories of injustice multiply rapidly and gain mainstream attention, and they believe the media creates the martyrs. If martyrdom happens when cameras capture injustice, rather than when humans or institutions perform injustice, then anybody with media technology could recreate the martyrdom process. Anybody could, with a 5G connection, become a martyr.

Such lack of media literacy travels hand-in-hand with the inability to distinguish between forms of injustice. Hensley’s description of her ejection as “discrimination” suggests she thinks herself equal to Black Americans denied service at the Woolworth’s lunch counter in 1960. By extension, it suggests her MAGA hat equals organized resistance to injustice. She can’t see the difference, and hopes you can’t, either.

When all news is media manipulation, in other words, then all injustice, no matter how severe, no matter how authentic, becomes equal. Hensley can’t distinguish her own inconvenience from George Floyd’s death—or at least, she expects that others can’t distinguish. The meaninglessness of Hensley’s public stand, as nobody has rallied around her faux injustice, reveals that media manipulation isn’t the same as reality, and some people still can tell.

One recalls the occasional online furor surrounding some doofus who just discovered that “Born in the U.S.A.” isn’t a patriotic song, “Hallelujah” isn’t a Christmas song, and punk rock is political. These people aren’t stupid, despite the inevitable social media pile-on. Rather, these people consume all media, from music to movies to news, passively. Under those conditions, everything becomes equal, and everything becomes small.

Did Elise Hensley seriously believe herself a martyr, surviving a moment of bigoted injustice? Well, only God can judge the contents of her heart. But she evidently hoped other people would believe it, and throw their support behind her. Some evidently did, although the fervor has mostly sputtered. Without the jolt of authenticity, her media manipulation stunt gathered scarce momentum, and seems likely to disappear with the 24-hour news cycle.

The whole “fake news” phenomenon, which pundits say might’ve helped Trump into the presidency twice, relies upon the same action that Hensley attempted, mimicking real events under controlled conditions. But, like Hensley, it mostly failed to fuel real action. It might’ve helped calcify political views among people already inclined toward extreme partisan beliefs, but like Hensley, most “fake news” produced meaningless nine-day wonders.

If I’m right in my interpretation, media consumers are growing weary of manufactured outrage. The next stage will probably be performative cynicism, which is hardly better, but will at least be less terrifying.

Tuesday, March 18, 2025

Chatterbox Jazz and the White Victim Complex

This TripAdvisor photo shows a gay pride flag above the front window
of the Chatterbox Jazz Club in Indianapolis, Indiana (source)

Late last week, a woman video-recorded herself being ordered out of the Chatterbox Jazz Club in Indianapolis, Indiana. When she asks why, the bartender specifically says “because you’re a Trump supporter,” apparently referencing a red MAGA ballcap. When the woman stalls, the bartender retrieves a short-handled baseball bat from behind the bar and says “I’m not fucking around.” The 36-second clip went viral before Monday.

Other commentators note the contradiction between this woman demanding her right to be served, and the Republicans who spearheaded a Supreme Court lawsuit seeking to let businesses refuse commerce with gay customers. That case, Masterpiece Cakeshop v. Colorado Civil Rights Commission, didn’t actually legalize anti-gay discrimination. It did, however, redefine “religious neutrality” when writing anti-discrimination law. It also ruled so narrowly that the underlying question remains essentially unresolved today.

I’d rather avoid rehashing that, not because it isn’t legitimate, but because I’m unqualified. Instead, let’s consider the medium. This woman didn’t just get ejected from a hostile venue; she recorded herself getting ejected. She recorded the confrontation on a cellphone camera held vertically, indicating her intention to distribute the footage online, probably on TikTok. Therefore this confrontation didn’t just happen; she probably engineered it.

The woman herself doesn’t appear in the footage. She verbally admits she’s wearing a “Trump hat,” but we never see it; she certainly doesn’t dispute the accusation that “you’re a Trump supporter.” Based on that fact, it seems irrefutable that she did or said something pro-Trump inside the bar. Management released a statement claiming that her party “intentionally misgendered and harassed a Chatterbox employee.”

In the video, the bartender arguing with the video creator appears gender-ambiguous and is coded nonbinary. Some unofficial websites describe the Chatterbox Jazz Club as a gay bar; the Chatterbox’s website takes no discernible position, but shows a trans-rights flag in the front window. The likelihood that accuser Elise Hensley, who describes herself as a repeat customer, didn’t know this before Friday night, is vanishingly small.

Therefore, if Hensley’s party entered this club wearing MAGA hats, they didn’t do so innocently. Unless they specifically said something about their intention to create conflicts or inflame tensions, it would be difficult to prove intent in court. For our non-courtroom purposes, though, it seems clear, indeed almost certain, that Hensley and her party intended to start a fight.

Moreover, because the bartender keeps a bat within reach behind the bar, they’ve probably faced previous challenges. Survivors generally buy weapons after they’ve been robbed or assaulted, not before. Hensley entered the bar spoiling for a fight, and bar staff appeared prepared to give her one. And she filmed the confrontation in progress. Therefore, clearly, this happened not for Hensley’s benefit, nor the Chatterbox’s, but for our benefit.

Hensley clearly wants the world to see her suffering some oppression. We have this underscored when she says, “You know that this is, like, discrimination, right?” Other patrons reply with jeering laughter, but Hensley appears serious. In that moment, she perceives herself as suffering discrimination, as being the oppressed party in an unequal power dynamic. She sees herself as the victim in this confrontation.

American conservatives, especially the MAGA variety, occupy an ontological dilemma. They claim their opinions and actions represent most American citizens, that they’re merely saying aloud what everyone else really thinks. Simultaneously, they call themselves an oppressed minority, silenced by overwhelming forces. The Trump administration’s anti-DEI policies embody this duality of White authority and White victimhood: Whites are hypercompetent, but suppressed by incompetent minorities.

Hensley almost certainly recorded this confrontation because she thought it would make her look oppressed, victimized, put-upon. To those who share her prior suppositions, it probably does. The bartender’s resort to cusswords and threats of violence implies victimhood. Maybe Hensley thought, in the largest city of an overwhelmingly Red state, she could make herself a celebrity victim and parlay that into a leadership position in the long-awaited conservative uprising.

But even the slightest context awareness demonstrates that the patrons laughing at Hensley, not Hensley herself, have the greatest command of the facts. Hensley, like so many in today’s hyperconnected world, has confused being a content creator with being a newsmaker, and as a result, she makes herself look ridiculous. Conservatives love trying to enter themselves in the oppression Olympics.

Elise Hensley will be remembered alongside Amy Cooper, the Central Park woman who turned herself into a synonym for racism, ignorance, and media manipulation. And that’s all she deserves.

Follow-up: Chatterbox Jazz and the Victim Complex, Part Two

Friday, March 14, 2025

How To Invent a Fake Pop Culture

I don’t recall when I first heard the song “Sally Go ’Round the Roses.” I know I first heard Pentangle’s folk singalong arrangement, not the Jaynetts’ Motown-tinged original. Like most listeners my age, who grew up with the mythology of Baby Boomer cultural innovation, I received that generation’s music out of sequence; the 1960s appeared like a single unit, without the history of cultural evolution that defined the decade.

Therefore I didn’t understand how influential the Jaynetts’ original version really was. Its syncopated backbeat, gated distortion effects, and enigmatic lyrics were, in 1963, completely innovative. The British Invasion, with the inventive tweaks the Beatles and the Kinks experimented with, hadn’t hit America yet. The original label, Tuff, reportedly hated the song until another label tried to purchase it, causing Tuff to rush-release the record.

Eventually, the track hit number two on the Billboard Hot 100 chart. More important for our purposes, though, a loose collective of San Francisco-based musicians embraced it. Grace Slick recorded a rambling, psychedelic cover with her first band, The Great Society, and tried to recreate its impact with classic Jefferson Airplane tracks like “White Rabbit” and “Somebody To Love.” Much of her career involved trying to recapture that initial rush.

Once one understands that “Sally” came first, its influence becomes audible in other Summer of Love artists, including the Grateful Dead, Creedence Clearwater Revival, Moby Grape, and Big Brother and the Holding Company. These acts all strove to sound loopy and syncopated, and favored lyrics that admitted of multiple interpretations. Much of the “San Francisco Sound” of 1966 to 1973 consisted of riffs and jams on the “Sally” motif.

That’s why it staggered me recently when I discovered that the Jaynetts didn’t exist. Tuff producer Abner Spector crafted “Sally” with two in-house songwriters, an arranger who played most of the instruments, and a roster of contract singers, mostly young Black women. The in-house creative team played around and experimented until they created the song. It didn’t arise from struggling musicians road-testing new material for live audiences.

Grace Slick around 1966, the year she
covered “Sally Go ’Round the Roses”
with the Great Society

A New York-based studio pushed this song out of its assembly-line production system, and it became a hit. Like other bands invented for the studio, including the Monkees and the Grass Roots, the Jaynetts didn’t pay their dues, the studio system willed them into existence. They produced one orphan hit, which somehow travelled across America to create a sound-alike subculture, back when starving musicians could afford San Francisco rent.

Culture corporations, such as the Big Three labels which produce most of America’s pop music, and the Big Five studios which produce most of America’s movies, love to pretend they respond to culture. If lukewarm drivel like The Chainsmokers dominates the Hot 100, labels and radio conglomerates cover their asses by claiming they’re giving the customers what they want. Audiences decide what becomes hits; corporations only produce the product.

But “Sally’s” influence contradicts that claim. Artists respond to what they hear, and when music labels, radio, and Spotify can throttle what gets heard, artists’ ability to create is highly conditional. One recalls, for instance, that journalist Nik Cohn basically lied White disco culture into existence. Likewise, it’s questionable whether Valley Girl culture even existed before Frank and Moon Zappa riffed in Frank’s home studio.

It isn’t only that moneyed interests decide which artists get to record—a seamy but unsurprising reality. Rather, studios create artists in the studio, skimming past the countless ambitious acts playing innumerable bar and club dates while hoping for their breakthrough. This not only spares them the trouble of comparison shopping for new talent, but also results in corporations wholly owning culture as subsidiaries of their brand names.

I’ve used music as my yardstick simply because discovering the Jaynetts didn’t exist rattled me recently. But we could extend this argument to multiple artistic forms. How many filmmakers like Kevin Smith, or authors like Hugh Howey, might exist out there, cranking out top-quality innovative art, hoping to become the next fluke success? And how many will quit and get day jobs because the corporations turned inward for talent?

Corporate distribution and its amplifying influence have good and bad effects. One cannot imagine seismic cultural forces like the Beatles without corporations pressing and distributing their records. But hearing Beatles records became a substitute for live music, like mimicking the Jaynetts became a substitute for inventing new culture. The result is the same: “culture” is what corporations sell, not what artists and audiences create together.

Saturday, March 8, 2025

The Great Exploding Rocket Debacle Continues

A SpaceX Starship test rocket launch (AP photo)

Back in the 1980s, I remember being a science fiction fanboy, growing disgusted with the American space program’s apparent inaction. Sure, NASA maintained a robust schedule of space shuttle flights and satellite launches that had a certain earthside grandeur. But shuttle crews performed a string of low-stakes scientific experiments that yielded only incremental knowledge gains. Compared to Asimov-era promises, NASA seemed terminally timid.

Fiction countered this ennui with the promise of libertarianism. I’d be hard-pressed to name even one title or author forty years later, but a bevy of science fiction authors proposed the idea of private corporations and rich cowboys taking over where NASA proved timid. Ben Bova certainly hinted at this with his Moonbase novels. Arthur C. Clarke, though no Randian libertarian, nevertheless had undertones of privatization in his Thatcher-age novels.

This vision of private-sector space flight appealed to my preteen self. I was mature enough to gobble down novels written for adults, a voracious reader hungry for the high-minded themes and promises of adventure which grown-up SF promised. But I wasn’t subtle enough to parse deeper meanings. The novels I greedily inhaled often contained dark implications of privatized space exploration encouraging rapacious behavior and destructive greed.

Watching the news surrounding this week’s SpaceX flight explosion, I can’t help remembering those stories I dimly understood. Elon Musk has spent a decade pitching how his multiple corporations can perform public services more efficiently than public bureaus. Yet his spacecraft’s multiple explosions, including this one which halted East Coast air traffic for hours, have repeatedly embarrassed us ex-scifi kids who still think space is pretty cool.

Elon Musk

Musk’s personal wealth reached nearly half a trillion dollars immediately following the 2024 presidential election, thanks to his connections with President Trump. Everyone assumed, not unreasonably, that Musk’s lucrative government contracts, including SpaceX, would yield heavy dividends in a Trump presidency. Yet in the months since the election, Musk has seen his wealth fall by a quarter, fueled by his reckless behavior and personal unpopularity.

Importantly, Musk’s wealth has come significantly from government contracts. SpaceX makes some profitable products, particularly its Starlink satellite system, but that profit comes with significant amortized debt, underwritten by government credit securities. Most of SpaceX’s actual operating capital comes from NASA contracts since, following the discontinuation of the space shuttle program, NASA has no in-house launch capabilities anymore.

But this privatized space program has been, charitably, embarrassing. NASA spent much of its in-house budget creating scientific laboratories and testing facilities, because its budget, circumscribed by Congress, included little room for launchpad errors. SpaceX employs mostly engineers rather than scientists, preferring to build and launch physical prototypes, because even when they explode, they create valuable capital through the medium of name recognition.

In practice, this means NASA was slow-moving and timid, but it produced results: NASA went from President Kennedy promising a moon landing, to actually landing on the moon, in under a decade. SpaceX moves quickly and dramatically, but it mostly produces falling debris and lurid headlines in the 24-hour news cycle. Its track record getting actual astronauts into space is spotty, and frequently beholden to the bureaucratic cycle.

Again, this underscores a contradiction in libertarian thinking. This week’s explosion, which scattered debris widely throughout the Caribbean, forced the FAA to halt traffic from major American airports—mere days after Musk’s own chainsaw behavior reduced FAA workforce numbers to critical levels. Musk disparages the public sector that, it turns out, he desperately needs.

Herbert Marcuse postulated, in One-Dimensional Man, that technological society produces intellectual stagnation and a headlong race toward mediocrity. This applies, he wrote, in capitalist and communist societies alike: engineers only reproduce what they know already works. Actual innovation requires government intervention, because only governments willingly embrace uncertainty and the capacity for failure.

SpaceX has proven what 1980s SF writers said, and I failed to understand: that a successful privatized space program requires avaricious ego and casual disregard for consequences. Private space exploration requires a greedy space pirate to eviscerate public resources for private gain, then turn around and trust public servants to keep citizens alive when the engineered product literally explodes. That’s the opposite of innovation.

Musk’s embarrassing post-inauguration behavior and continuing business disasters probably won’t cure anybody of libertarianism, at least not yet. People who ideologically believe in the private sector’s goodness will persevere despite seven weeks of high-profile setbacks. But hopefully at least some will accept that, in a high-tech society, the private sector needs a public sector to survive without killing innocent bystanders.