Friday, April 4, 2025

One Dark Night in an African Dreamland

Yvette Lisa Ndlovu, Drinking from Graveyard Wells: Stories

A recently deceased wife must choose whether to move on to the next life, or become an ancestral avenging spirit in this life. A civil engineer tasked with building a dam must first defeat the carnivorous spirits controlling the river. When houses begin vanishing from an impoverished slum, one gifted girl discovers the disappearances follow a logarithmic pattern. Refugees seeking asylum discover the immigration people aren’t bureaucrats; they’re a priesthood.

Zimbabwean author Yvette Lisa Ndlovu writes from a hybrid perspective: one foot in her homeland, one in the West. Ndlovu herself studied at Cornell and Amherst, and many of her mostly female protagonists are graduates of American (or Americanized) universities. Yet Zimbabwe’s history, both its ancient past and its recent struggles for independence, remains near the surface. For Ndlovu, Western modernism is usually a thin and transparent veneer.

Many of Ndlovu’s stories fall broadly into the categories of “fantasy” or “horror,” but that’s a marketing contrivance. Though many of her stories involve a monster—a primordial horror dwelling under conflict diamond fields, for instance, or carnivorous ants raised to make boner pills—almost never does the monster drive the story. Usually, Ndlovu’s monsters point her protagonists toward a deeper, more disquieting truth underneath the protagonists’ lives.

Instead of outright horror, these stories mostly turn on the friction between expectation and experience. Our protagonists usually start the story believing something rational, or expecting something reasonable. Recurrent themes include meaningful work and graduating from high school, two of the most common aspirations. But life in post-colonial Zimbabwe, with ancient traditions, modern tools of repression, and widespread poverty, always intrudes on those hopes.

In one story, a Zimbabwean student receives a fluke gift from the ancestral gods: she keeps stumbling accidentally into money. But the more money she fumbles into, the more her family expects from her. Soon the escape she sought becomes the burden she resents—until the gods demand an eternal choice.

When a student suffers blackouts, Western medicine cannot help. She consults an oracle, who finds the cure hidden in the past. To escape her condition, the student must time-travel to early colonialism and recover a military queen whom the British historians erased from living memory.

Yvette Lisa Ndlovu

Ndlovu structures some stories more like fables than Western fiction: an island king discovers immortality, but slowly stops being human. A healer erases the burdens of grief, but secretly serves a master whom her patients never see. A handful of newspaper clippings hide the secret pattern governing city women’s lives.

Not every story is “horror” or “fantasy.” In one story, an American college student discovers a common tool of Zimbabwean folk practice, and finds a way to monetize it, at the people’s expense. In another, poverty forces a talented student to leave school and find work; she pays her bills, but watches opportunities flit past.

Concerns of faith and religion recur. Though many of Ndlovu’s characters are Christian, and quote the Bible generously, they do so in a nation where ancient gods might occupy neighborhood houses. She reads the rituals and habits of government as religious rites, which isn’t a stretch. Issues of daily life contain spiritual depth in a nation where nature, death, and hunger always linger on modern life’s margins.

Ndlovu’s stories range from three to sixteen pages. This means they all make for complete reading in one session, with time left over to contemplate her themes. And those themes do require some deeper thought, because she asks important questions about what it means to be modern in traditional communities, or to be poor in a world with more than enough money. She doesn’t let readers off easily.

Perhaps I can give Ndlovu no greater praise than saying her short stories are genuinely short. Too many short story writers today apparently had an idea for a novel, jotted some notes, and thought they had a story. Not so here. Out of fourteen stories, one feels truncated; the other thirteen read as self-contained and thematically complete. That isn’t faint praise, either. I appreciate that Ndlovu crafts fully realized experiences we can savor in one sitting.

The title story, which is also the last, asks us whether it’s always bad to go unnoticed. The question comes with piercing directness. Characters find themselves disappearing from a society that doesn’t want to see them. But maybe, for those taken away, it’s a Biblical experience. We can’t know, Ndlovu tells us in the rousing final sentences, but maybe that uncertainty is what makes her characters’ lives worth living.

Thursday, March 27, 2025

Sorry, Dad, I Can’t Do Politics Anymore

Defense Secretary Pete Hegseth

My father thinks I should run for elective office. Because I strive to stay informed on local, national, and world affairs, and base my opinions on solid facts and information, he thinks I’m potential leadership material. Me, I thought I only took seriously the 11th-grade American Civics warning to be an involved citizen and voter. But too few people share that value today, and Dad thinks that makes me electable.

This week’s unfolding events demonstrate why I could never hold elective office. We learned Monday that a squadron of Executive Branch bureaucrats, including the National Security Adviser, the Secretary of Defense, and the Vice President, were conducting classified government business by smartphone app. For those sleeping through the story (or reading it later), we know because National Security Adviser Mike Waltz dialed Atlantic editor Jeffrey Goldberg into the group chat.

Unfortunately, Dad is wrong; I’m no better informed than anyone else on unfolding events. I’ve watched the highlights of senators questioning Director of National Intelligence Tulsi Gabbard and CIA head John Ratcliffe, but even then, I’m incapable of watching without collapsing into spitting rage. Gabbard’s vague, evasive answers on simple questions like “were you included in the group chat” indicate an unwillingness to conduct business in an honest, forthright manner.

Not one person on this group chat—and, because Goldberg in his honesty removed himself after verifying the chat’s accuracy, we don’t know everyone on the chat—thought to double-check the roster of participants. This despite using an unsecured app with a history of hacking. That’s the level of baseline security we’d expect from coworkers organizing a surprise party, not Cabinet secretaries conducting an overseas military strike.

The Administration compounded its unforced errors by lying. On Tuesday, Defense Secretary Pete Hegseth pretended that Goldberg’s chat contained no national security information; on Wednesday, Goldberg published the information. Millions of Americans who share my dedication to competent citizenship couldn’t get our jaws off the floor. Hegseth knew not only that Goldberg had that information, but that he could produce it. And he lied anyway.

National Security Adviser Mike Waltz

In a matter of weeks, we’ve witnessed the devaluation of competence in American society. Trump, who had no government experience before 2016, has peopled his second administration with telegenic muppets who similarly lack either book learning or hands-on proficiency. But then, no wonder, since studies indicate that willingness to vote for Trump correlates broadly with being ill-informed or wrong about facts. We’ve conceived a government by, and for, the ignorant.

Small-d democratic government relies upon two presumptions: that everyone involved is informed on the facts, to the extent that non-specialists could possibly keep informed, and that everyone involved acts in good faith. Both have clearly vanished. The notorious claim that, according to Google Analytics, searches for the word “tariffs” spiked the day after Trump’s election apparently isn’t true: they spiked the day before. But even that’s embarrassingly late.

Either way, though, it reveals the uncomfortable truth that Americans don’t value competence anymore, not in themselves, and not in elected decision-makers. This Administration’s systemic lack of qualifications among its senior staff demonstrates the belief that obliviousness equals honesty. Though the President has installed a handful of serious statesmen in his Cabinet, people like Hegseth, Gabbard, and Kash Patel are unburdened by practical experience or tedious ol’ book larnin’.

Now admittedly, I appreciate when voters express their disgust at business-as-usual Democrats. Democratic leadership’s recent willingness to fold like origami cranes when facing even insignificant pushback helps convince cocksure voters that competence and experience are overrated. The GOP Administration’s recent activities have maybe been cack-handed, incompetent, and borderline illegal, but they’re doing something. To the uninitiated, that looks bold and authoritative.

But Dad, that’s exactly why I can’t run for office. Because I’ve lived enough, and read enough, to know that rapid changes and quick reforms usually turn to saltpeter and ash. Changes made quickly get snatched back quickly, especially in a political environment conditioned by digital rage. Rooting out corruption, waste, and bureaucratic intransigence is a slow, painstaking process. Voters today apparently want street theatre. I’m unwilling to do that.

My father might counter by noting that the Administration’s popularity is historically low, that its own voting base is turning away, and that this controversy might be weighty enough to bring them to heel. I say: maybe. But unless voters are willing to recommit themselves to being informed, following events, and knowing better than yesterday, the underlying problem will remain. The next quick-fix demagogue will deceive them the same way.

Monday, March 24, 2025

The Stains of Politics Don’t Wash Off

1001 Books To Read Before Your Kindle Battery Dies, Part 120
Lawrence O’Donnell, Playing with Fire: The 1968 Election and the Transformation of American Politics

Those of us born afterward—which, over half a century later, is most of us—probably know the 1968 presidential election for LBJ’s abrupt withdrawal, or for Bobby Kennedy’s assassination. As this highly contested election recedes from living memory, we risk losing important context that helped define a generation’s relationship with politics. Just as important, without that knowledge, we’re vulnerable to those who would exploit weaknesses that still exist.

Lawrence O’Donnell is a journalist, not an historian; but some of the best history for mass-market readers in recent years has come from journalists. Names like David Zucchino and Joe Starita have returned lost history to Americans, sometimes redefining our self-image in the process. Unconstrained by the necessities of academic writing, journalists can deep-dive into primary sources and spin them into vernacular English. Which is exactly what O’Donnell does here.

As 1968 began, the Democratic party split over Lyndon Johnson. The President’s civil rights legislation alienated old-line conservative Dixiecrats, but his deepening commitments in Vietnam left White progressives politically homeless. This led not one, but two sitting senators to challenge Johnson in the Democratic primary. Bobby Kennedy, son of a political dynasty, moved slowly and strategically, but Eugene McCarthy, a former professor, mustered the enthusiasm of college students and Yippies.

Republicans faced substantially similar problems. George Romney, a center-right maven, failed to muster much enthusiasm, so the question became whom the Republicans would embrace. That old anti-communist firebrand Richard Nixon told Cold War Americans what they wanted to hear, and whom they should fear. But New York governor Nelson Rockefeller offered a relatively optimistic, liberal option. 1968 would be the final knell for Republican liberalism.

But then as now, the presidential election wasn’t only about the political maneuvering. In 1968, Martin Luther King, Jr., was ramping up his “Poor People’s Campaign” to new heights. He had traveled to Memphis that fateful day to help organize the city’s striking sanitation workers. When an assassin’s bullet struck him, it exposed an entire generation’s bottled rage. The resulting nationwide explosion redefined the terms for would-be presidential candidates.

Lawrence O’Donnell

Back then, the primary system didn’t matter like it does now. Kennedy, McCarthy, Rockefeller, and Ronald Reagan led aggressive ground campaigns to garner votes, but Nixon and Democratic Vice President Hubert Humphrey played internal party politics. Their aggressive gladhanding secured party nominations for two candidates who didn’t necessarily have deep grass roots. Both parties would have to change future nominating processes to address this injustice.

Even more than the individual candidates, 1968 would redefine party identities. Before the Sixties, party loyalty had more to do with community and geography than issues and platforms. That’s why Johnson, a Democrat from Texas, could shepherd multiple civil rights bills through a historically divided Congress. It’s also why a liberal like Rockefeller, a moderate like Romney, and a conservative like Nixon could compete for the Republican nomination.

But new divisions based on domestic issues changed these alignments. Over the course of 1968, the Nixon contingent, backed by that old segregationist Strom Thurmond, squeezed the liberal Republicans out. Meanwhile, as two anti-war candidates vied for the Democratic nomination, party regulars closed ranks to preserve the Johnson wing’s privilege. Thus, throughout the 1968 primary campaign, the Republican Party became increasingly conservative, and so did the Democrats.

O’Donnell wrote this book directly after the 2016 election, and comments liberally on how the Nixon/Humphrey contest presaged the Trump presidency. As always, history is about what happened, but it’s also about us, the contemporary readers. Presidential campaigns, once scholarly affairs based on debate and communication, became increasingly oriented toward television. Vibes became more important than policies—and Democratic Party vibes were so authoritarian, they made Nixon look amiable.

At the same time, O’Donnell omits information his audience could learn from. He never names the assassins who killed Dr. King and Bobby Kennedy, and never mentions their motivations; James Earl Ray and Sirhan Sirhan don’t even merit index entries. Therefore, Ray’s connection to organized bigotry gets erased, as does Sirhan’s anger over the Six-Day War. O’Donnell pores over living candidates’ policies extensively, but assassins’ bullets apparently just happen.

In O’Donnell’s telling, the 1968 election serves as a fable to explain Trump-era politics. By examining the partisan extremes that calcified during this campaign, we gain the vocabulary to understand what happened in 2016. And though O’Donnell couldn’t have anticipated it, his words became more necessary, his vocabulary more trenchant, after the disaster of 2024. It’s often difficult to examine the present dispassionately, but the past offers us useful tools.

Saturday, March 22, 2025

Why Ostara?

A 19th century engraving depicting
Ostara (source)

Each year during Lent, social media surges with claims that Easter derives its name and mythology from the Anglo-Saxon fertility goddess Ostara. These memes often include claims about how Ostara gives us numerous Easter myths: that rabbits and eggs were her sacred symbols, that her worship involved sexual rituals which early Christians suppressed, even that Ostara died and rose again. These claims are largely fictional; Ostara’s actual mythology is lost.

More interesting than what Anglo-Saxons believed, or didn’t, about Ostara is the eagerness with which online critics invent Ostara mythology. No information about Ostara, beyond her name, survives, yet commentators assert a panoply of just-so stories, many beginning with “it’s said” or “the story goes,” variations on the folkloric “Once Upon a Time.” Some such stories are pilfered from Germanic or Near Eastern religions; others seem to be purely fabricated.

Such attempts to revive otherwise lost pre-Christian religions seem counterintuitive. The so-called New Atheists, like Sam Harris and Richard Dawkins, claim that scientific modernity doesn’t need creation myths and just-so stories to organize society. Yet even as Christianity seems ever-further removed from today’s culture, at least a vocal contingent seeks moral justification, not in science, but in ancient myth. The very antiquity of pre-Christian myth gives it exotic appeal.

Multiple factors contribute to why Christianity, and its myths and practices, are fading in Western Civilization. Clergy abuses, past and present, surely contribute. Christianity’s association with warlike, nationalistic, and racist factions doesn’t help. Even its ancient texts, unchanged since the Iron Age, make it seem weighted with antique baggage. But I’d suggest one important reason Christianity seems distant from modern culture: the religion focuses heavily on death.

Why does Jesus’ suffering and death dominate Christian theology? The Apostle Paul highlights Christ’s crucifixion and resurrection, far beyond Jesus’ moral lessons. Christianity originally spread amid conditions where death was commonplace; most people died, not in hospitals, but at home, surrounded by family. Funerals were massive public gatherings signified by music, food, and other festival trappings. Such events still sometimes happen in rural areas, but have become uncommon elsewhere.

A 19th century Easter card (source)

Rather, modern death has become aberrant. The most common causes of death throughout history—tuberculosis, malaria, bubonic plague, polio, tetanus, whooping cough—have become rare in the last century, the time when Christianity saw its fastest decline. Even industrial accidents and war wounds are treatable in ways past generations didn’t know. Death, once so ever-present that people discussed their funeral preparations over family dinner, has become rare, distant, and distasteful.

Theologians have created convoluted justifications for Christ’s death and resurrection. As Fleming Rutledge writes, virtually no such justifications withstand scrutiny. But for early Christians, no justification was necessary; Christ died because we’ll eventually die, probably sooner rather than later. That camaraderie with God brings comfort. I’ve known two atheist friends who embraced faith and prayer when loved ones were dying, then returned to unbelief when the crisis passed.

But death doesn’t define Ostara. Though some online stories claim she dies and is resurrected every spring, these stories are peripheral. The made-up myths generally highlight fertility, growth, planting, and sex. Concocted myths prioritize life, flourishing, and birth, which seem closer to modern daily experience. In a culture where death seems abnormal, a unifying spiritual narrative privileging birth and life arguably makes sense. Penicillin rendered Christianity obsolete.

This stumbles on one important problem: we’re still going to die. As someone who recently watched a loved one struggle on life support before the merciful end, I find the Easter narrative of God’s mortality comforting in new ways. But we’ve made death distant and antiseptic, hidden inside hospital or nursing home walls, no longer present with daily life. Death has become atypical, but we’re still going to die.

Speaking personally, in past years, I’ve found the romantic mythmaking of Ostara merely treacly. This year, it’s become something more pointed, something harsher. It’s become an active denial of human inevitability, and a shared refusal to accept the human condition. Modern technological society hides death, and dying persons, in antiseptic conditions, pretending they don’t exist. Life has become an eternal present, a permanent now.

I’ve written about this before: myths are ultimately not about the truth, they’re about the people who create them. But in this case, the attempts to invent new “ancient” myths about a lost folk religion aren’t just explanations. They reveal a way modern society denies an important aspect of life, and hides our mortal end like a shameful thing. These myths look cute, but they’re subtly dangerous.

Wednesday, March 19, 2025

Chatterbox Jazz and the Victim Complex, Part Two

This essay is a follow-up to Chatterbox Jazz and the White Victim Complex
Another angle on the entrance to the Chatterbox Jazz Club, which only a
complete doofus would mistake for apolitical. (source)

I can’t help considering the parallels, and the lack of parallels, between Elise Hensley, who videoed herself getting ejected from the Chatterbox Jazz Club, and George Floyd. To reiterate, Hensley almost certainly recorded her expulsion deliberately, hoping to cultivate the impression of herself as an oppressed minority. But so far, the explosion of outrage she expected hasn’t arisen. It’s worth taking some time to consider why.

Hensley’s video and Darnella Frazier’s recording of George Floyd’s death might seem superficially similar to chronically online denizens. Both filmed on cellphone cameras, these videos show what their respective target audiences consider an injustice. But the online outrage machine flourishes with such displays of false equivalency. Hensley’s staged confrontation, and George Floyd’s unplanned murder, only resemble one another to lazy media consumers.

To exactly such lazy consumers, the sequence appears thusly: somebody distributed video of an injustice in progress. Millions of Americans were outraged. Protesters filled the streets. Ta-dah! We see similar reasoning in the hundreds of January 6th, 2021, rioters who live-streamed their push into the Capitol Building, speaking metaphors of Civil War and 1776: they thought simply seeing provocative media created public sentiment.

This bespeaks a specific attitude, not toward current events, but toward media. Lazy consumers see events not as events, but as content, and information distribution not as journalism, but as content creation. Functionally, Hensley doesn’t elevate herself to George Floyd’s level, she lowers George Floyd to her level. The spontaneous recording of an actual crime in progress becomes neither better nor worse than her forced confrontation with a queer bartender.

Let me emphasize, this isn’t merely a conservative phenomenon. I’ve struggled to follow political TikTok because, Left and Right alike, it mostly consists of homebrew “journalists” either repeating somebody else’s breaking reports, or shouting angrily at like-minded believers from their car or bedroom. The read-write internet has expanded citizens’ speaking capacity to, hypothetically, infinity, depending on server space. But it’s created little new information.

But conservatives, especially White conservatives, receive one key point differently. They see stories of injustice multiply rapidly and gain mainstream attention, and they believe the media creates the martyrs. If martyrdom happens when cameras capture injustice, rather than when humans or institutions perform injustice, then anybody with media technology could recreate the martyrdom process. Anybody could, with a 5G connection, become a martyr.

Such lack of media literacy travels hand-in-hand with the inability to distinguish between forms of injustice. Hensley’s description of her ejection as “discrimination” suggests she thinks herself equal to Black Americans denied service at the Woolworth’s lunch counter in the 1950s. By extension, it suggests her MAGA hat equals organized resistance to injustice. She can’t see the difference, and hopes you can’t, either.

When all news is media manipulation, in other words, then all injustice, no matter how severe, no matter how authentic, becomes equal. Hensley can’t distinguish her own inconvenience from George Floyd’s death—or at least, she expects that others can’t distinguish. The meaninglessness of Hensley’s public stand, as nobody has rallied around her faux injustice, reveals that media manipulation isn’t the same as reality, and some people still can tell.

One recalls the occasional online furor surrounding some doofus who just discovered that “Born in the U.S.A.” isn’t a patriotic song, “Hallelujah” isn’t a Christmas song, and punk rock is political. These people aren’t stupid, despite the inevitable social media pile-on. Rather, these people consume all media, from music to movies to news, passively. Under those conditions, everything becomes equal, and everything becomes small.

Did Elise Hensley seriously believe herself a martyr, surviving a moment of bigoted injustice? Well, only God can judge the contents of her heart. But she evidently hoped other people would believe it, and throw their support behind her. Some evidently did, although the fervor has mostly sputtered. Without the jolt of authenticity, her media manipulation stunt gathered scarce momentum, and seems likely to disappear with the 24-hour news cycle.

The whole “fake news” phenomenon, which pundits say might’ve helped Trump into the presidency twice, relies upon the same action that Hensley attempted, mimicking real events under controlled conditions. But, like Hensley, it mostly failed to fuel real action. It might’ve helped calcify political views among people already inclined toward extreme partisan beliefs, but like Hensley, most “fake news” produced meaningless nine-day wonders.

If I’m right in my interpretation, media consumers are growing weary of manufactured outrage. The next stage will probably be performative cynicism, which is hardly better, but will be at least less terrifying.

Tuesday, March 18, 2025

Chatterbox Jazz and the White Victim Complex

This TripAdvisor photo shows a gay pride flag above the front window
of the Chatterbox Jazz Club in Indianapolis, Indiana (source)

Late last week, a woman video-recorded herself being ordered out of the Chatterbox Jazz Club in Indianapolis, Indiana. When she asks why, the bartender specifically says “because you’re a Trump supporter,” apparently referencing a red MAGA ballcap. When the woman stalls, the bartender retrieves a short-handled baseball bat from behind the bar and says “I’m not fucking around.” The 36-second clip went viral before Monday.

Other commentators note the contradiction between this woman demanding her right to be served, and the Republicans who spearheaded a lawsuit to the Supreme Court letting businesses refuse commerce with gay customers. The case, Masterpiece Cakeshop v. Colorado Civil Rights Commission, didn’t actually legalize anti-gay discrimination. It did, however, redefine “religious neutrality” in applying anti-discrimination law. It also basically kicked the lawsuit back down the ladder, leaving it essentially unresolved today.

I’d rather avoid rehashing that, not because it isn’t legitimate, but because I’m unqualified. Instead, let’s consider the medium. This woman didn’t just get ejected from a hostile venue; she recorded herself getting ejected. She recorded the confrontation on a cellphone camera held vertically, indicating her intention to distribute the footage online, probably on TikTok. Therefore this confrontation didn’t just happen; she probably engineered it.

The woman herself doesn’t appear in the footage. She verbally admits she’s wearing a “Trump hat,” but we never see it; she certainly doesn’t dispute the accusation that “you’re a Trump supporter.” Based on that fact, it seems irrefutable that she did or said something pro-Trump inside the bar. Management released a statement claiming that her party “intentionally misgendered and harassed a Chatterbox employee.”

In the video, the bartender arguing with the video creator appears gender-ambiguous and is coded nonbinary. Some unofficial websites describe the Chatterbox Jazz Club as a gay bar; the Chatterbox’s website takes no discernible position, but shows a trans-rights flag in the front window. The likelihood that accuser Elise Hensley, who describes herself as a repeat customer, didn’t know this before Friday night, is vanishingly small.

Therefore, if Hensley’s party entered this club wearing MAGA hats, they didn’t do so innocently. Intent is difficult to prove in court, unless they specifically said something about their plan to create conflicts or inflame tensions. For our non-courtroom purposes, though, it seems clear that Hensley and her party intended to start a fight.

Moreover, because the bartender keeps a bat within reach behind the bar, they’ve probably faced previous challenges. Survivors generally buy weapons after they’ve been robbed or assaulted, not before. Hensley entered the bar spoiling for a fight, and bar staff appeared prepared to give her one. And she filmed the confrontation in progress. Therefore, clearly, this happened not for Hensley’s benefit, nor the Chatterbox’s, but for our benefit.

Hensley clearly wants the world to see her suffering some oppression. We have this underscored when she says, “You know that this is, like, discrimination, right?” Other patrons reply with jeering laughter, but Hensley appears serious. In that moment, she perceives herself as suffering discrimination, as being the oppressed party in an unequal power dynamic. She sees herself as the victim in this confrontation.

American conservatives, especially the MAGA variety, occupy an ontological dilemma. They claim their opinions and actions represent most American citizens, that they’re merely saying aloud what everyone else really thinks. Simultaneously, they call themselves an oppressed minority, silenced by overwhelming forces. The Trump administration’s anti-DEI policies embody this duality of White authority and White victimhood: Whites are hypercompetent, but suppressed by incompetent minorities.

Hensley almost certainly recorded this confrontation because she thought it would make her look oppressed, victimized, put-upon. To those who share her prior suppositions, it probably does. The resort to cusswords and threats of violence implies victimhood. Maybe Hensley thought, in the largest city of an overwhelmingly Red state, she could make herself a celebrity victim and parlay that into a leadership position in the long-awaited conservative uprising.

But even the slightest context awareness demonstrates that the patrons laughing at Hensley, not Hensley herself, have the greatest command of the facts. Hensley, like so many in today’s hyperconnected world, has confused being a content creator with being a newsmaker, and as a result, she makes herself look ridiculous. Conservatives love trying to enter themselves in the oppression Olympics.

Elise Hensley will be remembered alongside Amy Cooper, the Central Park woman who turned herself into a synonym for racism, ignorance, and media manipulation. And that’s all she deserves.

Follow-up: Chatterbox Jazz and the Victim Complex, Part Two

Friday, March 14, 2025

How To Invent a Fake Pop Culture

I don’t recall when I first heard the song “Sally Go ’Round the Roses.” I know I first heard Pentangle’s folk singalong arrangement, not the Jaynetts’ Motown-tinged original. Like most listeners my age, who grew up with the mythology of Baby Boomer cultural innovation, I received that generation’s music out of sequence; the 1960s appeared like a single unit, without the history of cultural evolution that defined the decade.

Therefore I didn’t understand how influential the Jaynetts’ original version really was. Its syncopated backbeat, gated distortion effects, and enigmatic lyrics were, in 1963, completely innovative. The British Invasion hadn’t hit America yet, with the inventive tweaks that the Beatles and the Kinks experimented with. The original label, Tuff, reportedly hated the song until another label tried to purchase it, causing Tuff to rush-release the record.

Eventually, the track hit number two on the Billboard Hot 100 chart. More important for our purposes, though, a loose collective of San Francisco-based musicians embraced it. Grace Slick recorded a rambling, psychedelic cover with her first band, The Great Society, and tried to recreate its impact with classic Jefferson Airplane tracks like “White Rabbit” and “Somebody To Love.” Much of her career involved trying to recapture that initial rush.

Once one understands that “Sally” came first, its influence becomes audible in other Summer of Love artists, including the Grateful Dead, Creedence Clearwater Revival, Moby Grape, and Big Brother and the Holding Company. These acts all strove to sound loopy and syncopated, and favored lyrics that admitted of multiple interpretations. Much of the “San Francisco Sound” of 1966 to 1973 consisted of riffs and jams on the “Sally” motif.

That’s why it staggered me recently when I discovered that the Jaynetts didn’t exist. Tuff producer Abner Spector crafted “Sally” with two in-house songwriters, an arranger who played most of the instruments, and a roster of contract singers, mostly young Black women. The in-house creative team played around and experimented until they created the song. It didn’t arise from struggling musicians road-testing new material for live audiences.

Grace Slick around 1966, the year she
covered “Sally Go ’Round the Roses”
with the Great Society

A New York-based studio pushed this song out of its assembly-line production system, and it became a hit. Like other bands invented for the studio, including the Monkees and the Grass Roots, the Jaynetts didn’t pay their dues; the studio system willed them into existence. They produced one orphan hit, which somehow travelled across America to create a sound-alike subculture, back when starving musicians could afford San Francisco rent.

Culture corporations, such as the Big Three labels which produce most of America’s pop music, and the Big Five studios which produce most of America’s movies, love to pretend they respond to culture. If lukewarm drivel like The Chainsmokers dominates the Hot 100, labels and radio conglomerates cover their asses by claiming they’re giving the customers what they want. Audiences decide what becomes hits; corporations only produce the product.

But “Sally’s” influence contradicts that claim. Artists respond to what they hear, and when music labels, radio, and Spotify can throttle what gets heard, artists’ ability to create is highly conditional. One recalls, for instance, that journalist Nik Cohn basically lied White disco culture into existence. Likewise, it’s questionable whether Valley Girl culture even existed before Frank and Moon Zappa riffed in Frank’s home studio.

It isn’t only that moneyed interests decide which artists get to record—a seamy but unsurprising reality. Rather, studios create artists in the studio, skimming past the countless ambitious acts playing innumerable bar and club dates while hoping for their breakthrough. This not only spares the corporations the difficulty of comparison shopping for new talent, but also results in corporations wholly owning culture as subsidiaries of their brand names.

I’ve used music as my yardstick simply because discovering the Jaynetts didn’t exist rattled me recently. But we could extend this argument to multiple artistic forms. How many filmmakers like Kevin Smith, or authors like Hugh Howey, might exist out there, cranking out top-quality innovative art, hoping to become the next fluke success? And how many will quit and get day jobs because the corporations turned inward for talent?

Corporate distribution and its amplifying influence have good and bad effects. One cannot imagine seismic cultural forces like the Beatles without corporations pressing and distributing their records. But hearing Beatles records became a substitute for live music, like mimicking the Jaynetts became a substitute for inventing new culture. The result is the same: “culture” is what corporations sell, not what artists and audiences create together.

Saturday, March 8, 2025

The Great Exploding Rocket Debacle Continues

A SpaceX Starship test rocket launch (AP photo)

Back in the 1980s, I remember being a science fiction fanboy, growing disgusted with the American space program’s apparent inaction. Sure, NASA maintained a robust schedule of space shuttle flights and satellite launches that had a certain earthside grandeur. But shuttle crews performed a string of low-stakes scientific experiments that yielded only incremental knowledge gains. Compared to Asimov-era promises, NASA seemed terminally timid.

Fiction countered this ennui with the promise of libertarianism. I’d be hard-pressed to name even one title or author forty years later, but a bevy of science fiction authors proposed the idea of private corporations and rich cowboys taking over where NASA proved timid. Ben Bova certainly hinted at this with his Moonbase novels. Arthur C. Clarke, though no Randian libertarian, nevertheless had undertones of privatization in his Thatcher-age novels.

This promise of private-sector space flight appealed to my preteen self. I was mature enough to gobble down novels written for adults, a voracious reader hungry for the high-minded themes and promises of adventure which grown-up SF promised. But I wasn’t subtle enough to parse deeper meanings. These novels I greedily inhaled often contained dark implications of privatized space exploration encouraging rapacious behavior and destructive greed.

Watching the news surrounding this week’s SpaceX flight explosion, I can’t help remembering those stories I dimly understood. Elon Musk has spent a decade pitching how his multiple corporations can perform public services more efficiently than public bureaus. Yet his spacecraft’s multiple explosions, including this one which halted East Coast air traffic for hours, have repeatedly embarrassed us ex-scifi kids who still think space is pretty cool.

Elon Musk

Musk’s personal wealth reached nearly half a trillion dollars immediately following the 2024 presidential election, thanks to his connections with President Trump. Everyone assumed, not unreasonably, that Musk’s lucrative government contracts, including SpaceX, would yield heavy dividends in a Trump presidency. Yet in under two months since the inauguration, Musk has seen his wealth fall by a quarter, fueled by his reckless behavior and personal unpopularity.

Importantly, Musk’s wealth has come significantly from government contracts. SpaceX makes some profitable products, particularly its StarLink satellite system, but that profit comes with significant amortized debt, underwritten by government credit securities. Most of SpaceX’s actual operating capital comes from NASA contracts since, following the discontinuation of the space shuttle program, NASA has no in-house launch capabilities anymore.

But this privatized space program has been, charitably, embarrassing. NASA spent much of its in-house budget creating scientific laboratories and testing facilities, because its budget, circumscribed by Congress, included little room for launchpad errors. SpaceX employs few scientists, mostly engineers, preferring to build and launch physical prototypes, because even when they explode, they generate valuable name recognition.

In practice, this means NASA was slow-moving and timid, but it produced results: NASA went from President Kennedy promising a moon landing, to actually landing on the moon, in eight years. SpaceX moves quickly and dramatically, but it mostly produces falling debris and lurid headlines in the 24-hour news cycle. Its track record getting actual astronauts into space is spotty, and frequently beholden to the bureaucratic cycle.

Again, this underscores a contradiction in libertarian thinking. This week’s explosion, which scattered debris widely throughout the Caribbean, forced the FAA to halt traffic from major American airports—mere days after Musk’s own chainsaw behavior reduced FAA workforce numbers to critical levels. Musk disparages the public sector that, it turns out, he desperately needs.

Herbert Marcuse postulated, in One-Dimensional Man, that technological society produces intellectual stagnation and a headlong race toward mediocrity. This applies, he wrote, in capitalist and communist societies alike: engineers only reproduce what they know already works. Actual innovation requires government intervention, because only governments willingly embrace uncertainty and the capacity for failure.

SpaceX has proven what 1980s SF writers said, and I failed to understand: a successful privatized space program requires avaricious ego and casual disregard for consequences. Private space exploration requires a greedy space pirate to eviscerate public resources for private gain, then turn around and trust public servants to keep citizens alive when the engineered product literally explodes. That’s the opposite of innovation.

Musk’s embarrassing post-inauguration behavior and continuing business disasters probably won’t cure anybody of libertarianism, at least yet. People who ideologically believe in the private sector’s goodness will persevere despite seven weeks of high-profile setbacks. But hopefully at least some will accept that, in a high-tech society, the private sector needs a public sector to survive without killing innocent bystanders.

Thursday, March 6, 2025

Time For the 28th Amendment

How old were you when you discovered that the right to vote isn’t protected in the United States Constitution?

Like most Americans, I studied the Constitution, in different ways and at different depths, through high school, into college, and later in various books, seminars, and media deep-dives throughout my life. Teachers and commentators gushed lovingly over how the 15th Amendment extended voting to former slaves, the 19th Amendment gave women the vote, and the 26th Amendment gave eighteen-year-olds the right to vote.

All of these are good. But they establish that the government cannot withhold the right to vote based on certain protected categories. Not once does the Constitution state who does have the voting franchise; the issue remains airy-fairy and undefined. And I didn’t know that until I read Levitsky and Ziblatt’s Tyranny of the Minority at age 49. Only when they pointed it out did I realize this information was missing.

Throughout much of American history, the question of what makes someone a “real” American has loomed large. The Philadelphia Convention of 1787, which drafted the kernel of our current Constitution, was dominated by slaveholders, who wanted their human property counted on the Census, but didn’t want slaves having any vote. These White male aristocrats, whom we dub the “Founders,” handled the problem by punting it onto the states.

As you’d imagine, this created a patchwork of standards. States have, at times, made land ownership a criterion—which created problems when rising industrialization pushed more Americans into cities. Old-fashioned bigotry encouraged many states, overtly or covertly, to disenfranchise Black Americans, until the Voting Rights Act of 1965 banned it. Since the Shelby County ruling, states have competed to find innovative new ways to make voting harder.

Many attempts to increase the voting franchise are doomed to fail. Because less populous states, which skew conservative, gain a tactical advantage from the status quo, many common suggestions, like ending the Electoral College or disestablishing the Senate, are non-starters. The Constitution sets the threshold for amendments so high that, in times of bitter polarization like we have now, changing the system is unlikely at best.

But I propose that it’s politically possible to start with something simple: just establish that American citizens have a right to vote, irrespective of state laws. This has multiple advantages. It will set the default for American voting as “opt-out,” rather than the current “opt-in.” It will capitalize on the American fervor for treating everyone equally, since setting a standard baseline of simply letting people vote is, facially, completely equal.

With that in mind, I propose a movement to pressure our lawmakers to create a 28th Amendment. Since I’m not an attorney or Constitutional scholar, I don’t want to create a binding text for such an amendment; that exceeds my skills. But I propose the following as a starting point:

1. All persons who have been born citizens of the United States, or who have been naturalized as citizens under the standards of this Constitution, and who have attained no less than eighteen years of age, shall have the right to vote and to participate in electoral processes in the United States, and in the states in which they reside.
2. All persons who have the voting franchise under the standards of this Constitution, but who reside outside the United States for military deployment, lawful study abroad, lawful work abroad under a visa, or any other reason which Congress shall protect by legislation, shall be permitted to participate in electoral processes in the United States, in the jurisdiction in which they were most recently resident.
3. The Executive Branch, under terms which Congress shall set by legislation, shall maintain a permanent roster of lawful registered voters in the United States, and shall take responsibility for maintaining the currency of that roster, and shall protect the voting rights of all persons who have the right to participate in the electoral process in the United States.

We voters can pressure American lawmakers to rally behind this straightforward, facially neutral action statement. Sure, I know anti-democracy activists like Peter Thiel exist in America, but I believe they’re controllable, while our system remains tractable to public pressure. We can organize to pressure our lawmakers to support this change by threatening them with the shame of being seen as anti-voting.

This won’t solve all of America’s problems. But it will at least get all Americans involved in the problem-solving process.

Wednesday, February 26, 2025

The First and Last Days of Scottish Witchcraft

C.J. Cooke, The Book of Witching

A calamity has occurred on an uninhabited island in the Orkneys, in Scotland’s sparsely populated far north. Three teenagers reenacted a pre-Christian ceremony, with all the cocksure enthusiasm of teenagers; but it’s ended with one teen dead, another maimed, and the third missing. Now the adults around them must reconstruct what happened, because a malevolent force nobody’s yet seen may stand to profit from the catastrophe.

C.J. Cooke, a sometime university professor, has gained renown for her intensively researched, historically themed dark fantasy novels. This is no exception; not many horror novels include a works-cited page. For this volume, she delves into one of Scotland’s darker episodes. Even by witch trial standards, Scottish trials were notoriously brutal, a revolting mix of Christian piety and state-sanctioned torture which extracted confessions through truly appalling means.

In 2024, Clementine Woodbury struggles to understand the events that stranded her daughter in a Glasgow burns unit. Once lively and free-spirited, Clem’s daughter Erin has grown moody and secretive since becoming a teen mother. With Erin under sedation in a sterile room, Clem can’t ask direct questions about her mysterious injuries, so she takes her granddaughter and commences a freelance investigation. She isn’t prepared for the secrets she uncovers.

Parallel to Clem’s investigation, Alison Balfour stands accused of witchcraft in 1594 Kirkwall. Though the accusation carries whiffs of religious paranoia, Alison quickly realizes the truth: she’s a pawn in a powerful dynastic struggle for control of the Orkneys. Her confession, or lack thereof, will determine which rapacious aristocrat will control Orcadian government—though either outcome will be disastrous for ordinary smallholders like her family.

Cooke’s balance between these two narratives asks important questions. What debts do we moderns owe for injustices performed centuries ago? And what obligations do we bear to future generations? Alison Balfour realizes quickly that she can’t prevent her own unjust death; she can only determine what consequences her death brings upon others. Clem can’t pinpoint what caused her family’s sufferings, but clearly something dark lingers in her heritage.

C.J. Cooke

Though marketed as a “thriller,” this novel’s contemporary portion more resembles an amateur sleuth mystery. While the police struggle to fit Erin’s grievous injuries into their pre-written crime narrative, Clem assumes responsibility for uncovering what happened to her daughter. If this means scrambling into Scotland’s enigmatic, impoverished north to confront a secretive cabal, she clearly considers this an acceptable price for a truth she might not like.

The historical portion, meanwhile, is explicitly political. Orkney suffers under a government that rules by stoking fear among the population, retaining power by convincing the population of an even worse enemy. Alison knows she can’t win this battle. Therefore she’s forced to redefine victory according to what keeps her family and her people alive. Cooke reconstructs a poorly documented time of paranoia, recorded only through state and religious propaganda.

Therein, Cooke tacitly acknowledges something often forgotten in histories of witch hunts: they weren’t the flexings of invincible empires, eager to demonstrate their power. Witch hunts happened after the church-state hybrid began losing unquestioned authority. Alison Balfour’s execution happened a generation after the Scottish Reformation, as the Stuart monarchy clung to dwindling authority. Witch hunts are the superannuated flailings of a broken empire already in retreat.

In this, Cooke shows an aristocracy terrified of its people. Patrick Stewart, Second (and last) Earl of Orkney, sought the church’s benediction because he knew the people already organized against him, that the trade guilds that built his palaces were also hotbeds of insurrectionist intrigue. The Earl and his retinue yearn for unquestioned power, but the very fact they must resort to such extremes proves they’ve already lost the people’s devotion.

Alison Balfour works as a peasant healer among people who survive in nature’s bounty; but palace intrigues and state paranoia drag her into early modernism. Clem Woodbury trusts medicine, modernism, and police technocracy; but she’s forced to delve into her lost heritage and forgotten bloodline when modernity can’t answer her questions. Both women discover truth hiding in secretive corners, that nothing’s ever as simple as the official narrative would claim.

Cooke creates a story of nuance and complexity that rewards multiple levels of reading. She uses the markers of paperback thrillers, and on that level, one could read this book casually, like any other beach novel. But Cooke also asks questions about heritage, responsibility, and power, which don’t yield themselves to easy answers. Especially in Europe, where aristocratic paranoia still casts a long shadow, is the past ever really gone?

Sunday, February 9, 2025

One Foot in Bakersfield, One Foot in the Future

Dwight Yoakam, Brighter Days

Dwight Yoakam’s best music, especially his hits from nearly forty years ago, has always striven to make him sound older than he really is. His 1986 breakthrough single, “Honky Tonk Man,” was a cover of a 1956 Johnny Horton barn-burner, and his best work always aimed to sound like Bakersfield, 1960-ish. But sounding older means one thing when you’re thirty; how does he maintain that strategy now at 68?

This, Yoakam’s sixteenth studio album, is emphatically not timeless. Yoakam stages a deliberate callback to his own neo-traditionalist roots, appealing to those who think history has gone badly and want to fix its errors. He channels the twangy, backbeat-heavy Central Valley country music that made him famous. In doing so, he attracts an audience who probably shares my assessment that country music went cockeyed somewhere around 1996.

Despite this, Yoakam avoids the modish cynicism that often accompanies older artists recording nostalgia bait. He’s remarkably optimistic, even on the more melancholy tracks, his expressive sadness often transitory. Perhaps this reflects the two pulls on Yoakam’s artistry: he’s always been artistically (and often ideologically) conservative. But since last recording, he became a father for the first time, at a sprightly 63.

So his sound remains retro, but he has a pointed hope for the future. The album opener, “Wide Open Heart,” has the aggressive chomping chords that made both country and rock sound so distinctive in Southern California in 1960. But it’s a love song, full of “She’s all mine to love” and “Come on let’s get it done.” Except it’s love for his carefully restored, chrome-plated street racer car.

Because yeah, Dwight’s old, but he’s young enough to care. This album brims with red-hot emotions for whatever gives Dwight hope enough to keep moving forward. Many seem dedicated to his wife, Emily. (Despite several high-profile relationships in the 1990s, he never married until 2020.) Others are dedicated to music, touring, and a cover of the Carter Family classic “Keep On the Sunny Side,” an ambiguous nod to spirituality.

Dwight Yoakam

The sound remains retro, certainly. He loves crunchy acoustic-electric guitars supplemented with a heavy marmalade of Hammond organ, sounding like something the Wrecking Crew would’ve mass-produced sixty years ago. Most songs maintain a steady 4/4 or 6/8 time; you could line-dance to even the album’s slowest, most navel-gazing tracks. “Can’t Be Wrong” opens with almost the same chords Yoakam used on “Please Please Baby” in 1987.

Yet notwithstanding that conservatism, Dwight sees a brighter future. “A Dream That Never Ends,” with its Laurel Canyon vocal harmonies, implies, despite its title, that love isn’t infinite, and somebody might leave—but he insists he’ll keep believing anyway. “If Only” dreams of what could happen if we shed our carefully constructed cynicism, and includes the eminently quotable line: “If only you’d choose love, love would choose you.”

“California Sky” is perhaps Yoakam’s most thoroughly engineered track here, with its Tex-Mex guitars and slight nod to fatalism. “Hand Me Down Heart” is exactly what you’d expect from the title, a lament for the suffering he’s previously endured. But instead of surrendering to despair, he presents his heart as something capable of healing, worthy of redemption. Second chances are real in Yoakam’s world.

You might mistake Yoakam’s title track for a love song, with its unifying Tom T. Hall riff, but he wrote it for his son, who closes it with half-gibberish lyrics. “Brighter days are what you promised me,” he sings: optimistic, but with implied consequences if the promise falls through. “Bound Away” laments the touring musician’s life, like CCR’s “Traveling Band,” but with the recognition that “I’m trying to come home, I’m trying to land.”

At nearly an hour, this album runs almost twice conventional LP length. Despite the vinyl revival, Yoakam knows he’ll mostly be streamed or downloaded, and isn’t circumscribed by physical limits. This gives him freedom to play generously with composition and arrangement. Though he doesn’t do anything revolutionary—no Mike Oldfield half-hour experimentation here—he does plumb the full depth of his conservative ethos.

Listening to this album, we’re aware we’re hearing an artist who hasn’t had a Top-40 hit since 2000, and no breakout smash since 1993. His entire career is a nostalgia circuit for aging fans who need reminding why we embraced him so aggressively nearly forty years ago. Yet within that limit, Yoakam expresses an optimism his 1990s recordings sometimes forgot. Dwight’s old, yes, and so am I, but he reminds me that we both still have a future.

Friday, February 7, 2025

Hanging Onto Hope While Everything Around Me Is On Fire

Back in the 1980s, my father used to collect aluminum cans as a form of exercise. In those days, people regularly just chucked cans, food wrappers, and other litter out of moving car windows. Anyone old enough to personally remember the Reagan era will recall that American roadsides, especially urban roadsides, were consistently choked with post-consumer waste.

So my father would take a lawn-and-leaf bag and go walking aimlessly. The walk gave him necessary low-impact exercise and time to clear his head. And he knew it was time to start home when the bag approached full of the aluminum cans he collected. He would take the full bags to the local recyclery for cash, and use the proceeds to take us kids out for burgers.

After eating, he insisted we dispose of our wrappers correctly.

Once upon a time, American attitudes toward waste were, by today's standards, appalling. A New York PR professional coined the term “litterbug” in the 1940s, but the notion that post-consumer waste was “disposable” created the persistent idea that we could just pitch waste anywhere and trust the Lord to handle it. And way too many of us just did. Part of America's anti-urban sentiment in the 1970s and 1980s stemmed from the trash on every street and sidewalk.

I was too young to understand when things changed, but they did. In the early 1990s, my dad's walks took much longer, and our burger runs became less frequent. At some point, he started coming home with his bag only half-full. Around the time I finished high school, these walks stopped being worth the effort for him. He stopped carrying the bag with him, and he walked much more predictable, programmatic routes.

That was a loss for Dad, of course. His litter-collecting ambles had been an important part of his exercise regime since before I was born. But even he acknowledged that it was a net good. He couldn't find recyclable litter because fewer people were creating litter; more people accepted that they had individual responsibility for the common good. And streets were far cleaner.

Such changes in public morality don't happen in a vacuum. A combination of public education, media campaigns, and changing local laws overpowered the notion that litter was a “victimless” offense. The more people who accepted their responsibility for clean streets, the more pressure on those who dragged their heels. Eventually the momentum became irresistible.

Not that nobody tried to resist. Some people absolutely insisted on their right to litter; some still do. When I was in college, the campus conservative student group sold t-shirts with a disfigured recycling symbol and the logo “Environmentally Unsafe And Proud Of It!” They turned their sloppiness into a political status and a social identity.

Yet the very fact that they did so proved that they were just fighting the tide, and they knew it. I watched several of them, still wearing that t-shirt, throw their food wrappers in the trash, and their soda-pop bottles in the recycling. The shirts had a brief, voguish popularity, then vanished as the wearers realized they didn't look brave, they looked like dickheads.

We saw similar fates for other once-popular actions: smoking, for instance, or driving with leaded gasoline. Or racism, or hating on LGBTQ+ populations. These were once commonplace to the point of being bland, then they became agitated political positions, then finally identities. Because the more obvious it became that these were unsustainable behaviors, the more momentum built against them.

As I write, we're witnessing rapid reversal on some of these positions. The incoming administration has issued sweeping policy changes that empower racists, homophobes, and environmental irresponsibility. It's easy to think that, because these actions have government approval, it's impossible to stop them.

But I take comfort in their militant aggression. The administration has to fight so viciously because they know they don't have the momentum on their side. I will admit that losing government support for a more just, more responsible society is a massive setback. But they're fighting so hard because, fundamentally, they know they're losing.

Please don't get me wrong. Victory is far from a foregone conclusion. If we get discouraged and squander our energy, we will lose momentum. To win, we need to keep standing up for a just society and a broad, inclusive definition of citizenship. But I still believe the weight of history is on our side. Victory is ours for the demanding, as long as we remain mindful of the moment.

Tuesday, February 4, 2025

Jump, Jive, and Wail Against the Machine

1001 Movies To Watch Before Your Netflix Subscription Dies, Part 53
Thomas Carter (director), Swing Kids

Imagine a world where a group of relatively well-off White teenagers adopted the culture, dance, and trappings of Black musicians. The teenagers pretend this adoption is apolitical, and their subculture is merely fun. But the racially segregated, authoritarian state sees this White embrace of Black culture as tantamount to treason. So they use vaguely written laws to force kids into mandatory social retraining. Some kids resist this conversion; others can’t.

Screenwriter Jonathan Marc Feldman and director Thomas Carter presented this movie in the Reagan/Bush I era’s immediate hangover. Their intended commentary on recent events was particularly unsubtle. This perhaps explains why critics greeted this movie with ambivalence; Roger Ebert, a dedicated acolyte of ars gratia artis, particularly hated it. Yet in subsequent decades, its commentary has become only more relevant, its message more prescient.

Peter Muller (Robert Sean Leonard) and his friends admire the freedom and authenticity of American and British pop culture over that of Germany under the ascendant Reich. They cut a rug in unlicensed dance clubs with music first recorded by Black and Jewish artists like Duke Ellington and Benny Goodman. As often happens with new youth subcultures, their rebellion includes petty crime. Peter gets arrested, and sentenced to join the Hitler Youth.

The opening act really emphasizes the Swing Kids’ desire to avoid politics. The overwhelmingly White subculture simply yearns for the liberty they perceive in minority cultures, blind to the ways oppression shapes that culture. The Swing Kids refuse to take sides even as Germany begins the march to war, though many members are of conscription age: they’ll almost certainly be expected to carry arms for the authoritarian state.

After Peter is forced to join the Hitler Youth (Hitlerjugend, shortened to HJ), his fellow Swing Kid Thomas (Christian Bale) also joins, in a show of solidarity. They pursue a double life, keeping up with HJ ethics of athleticism, nationalism, and militarism by day. At night they don their flamboyant British suits and dance feverishly. They insist they can maintain that dualism, until the moment they can’t.

Their friend Arvid (Frank Whaley), who is Jewish-coded, plays a mean jazz guitar and admires Django Reinhardt. Arvid makes bank playing underground clubs and basement dances. But in an autocratic surveillance state, it doesn’t take long before HJ thugs come calling. A back-street beating breaks several bones in Arvid’s hand, rendering him unable to play. With Arvid stuck alone in a shabby loft, Peter and Thomas must decide which side they’re on.

l-r: Frank Whaley, Christian Bale, and Robert Sean Leonard in Swing Kids

Feldman and Carter exaggerate the Swing Kids’ moral trajectory. Their early insistence on political innocence is so overwhelming that you initially wonder whether they’re deliberately deceiving themselves. But that willful ignorance gives way quickly. Thomas, surrounded by constant HJ propaganda, eventually starts to believe it. Peter, dragooned into government atrocities, goes the other direction and prepares for a confrontation.

This deliberately didactic theme didn’t help with critics. The movie’s gut-punch moral arc led some to disparage it as a meaningless weeper designed for children; Ebert, near the end of his life, included this movie on his list of the worst movies ever. Undoubtedly, the film guides viewers with a heavy hand, fearing that its mostly young intended audience won’t get the message unless it’s heavily signposted.

Yet as educators and activists feud over how exactly to teach that audience about the war, this movie has gained a second life. Its aggressively sentimental approach to the lessons the characters learn—especially Peter—reflects the betrayal students feel when they realize the history they’ve learned has been thoroughly whitewashed. Yes, this movie is unsubtle. But so is the discovery of the depths of cruelty humans repeatedly achieve.

It also forces the intended audience to examine itself. Hamburg teenagers pinched Black swing culture just as Memphis youths stole Black rock’n’roll and Oakland kids filched hip-hop. In every case, White kids pretended their cooptation of Black culture was apolitical, that their use of the signs and signifiers of rebellion was party-time fun. White kids love Black culture, but generally need to be jolted into recognizing the forces that shaped that culture.

One can question whether the Swing Kids subculture actually accomplished anything. Doomed resistance movements, from Wat Tyler’s rebellion to the White Rose to the Woodstock generation, are generally more celebrated after the battle is over. But in a conformist, autocratic state, the Swing Kids movement reminded its participants that they needed, ultimately, to answer to their own consciences. That’s one thing the state can’t take away.

Today’s world can stand to learn that lesson.

Thursday, January 30, 2025

Living in the Wallace & Gromit Economy

Wallace unleashes his newest invention, NORBOT, on his hapless pooch Gromit
in the new film Vengeance Most Fowl, now on Netflix

I’ve been a fan of Nick Park’s Wallace & Gromit films since I first discovered the original shorts on grainy, probably bootleg VHS in the 1990s. The humor operates on the same principle as Mr. Bean or Red Dwarf: a well-meaning but incompetent protagonist bumbles into situations far over his head. Wallace, Bean, and Rimmer are momentarily embarrassed, but consistently come out ahead, without really learning anything.

The films present Wallace as a garage inventor and shade-tree mechanic. Though the first short film has him successfully build a moon rocket in a weekend, subsequent films consistently harp on the same theme, that Wallace’s inventions create more problems than they solve. They require added steps, break down frequently, get sabotaged by rascally varmints, and otherwise create needless kerfuffles. All just to less efficiently butter his breakfast toast.

Though that theme runs through nearly every film, short or long, I don’t recall it looming as large as in the latest entry, Vengeance Most Fowl. Throughout Act One, Gromit, the wordless dog character who’s secretly the brains behind the operation, keeps indulging Wallace’s invention mania. However, Gromit longs to complete his necessary tasks and switch over to the activities which give his life meaning: gardening and knitting.

Wallace, however, persistently misunderstands Gromit’s need for meaningful work. He sees both gardening and knitting as repetitive work, which automation can eliminate. Therefore he introduces his newest invention, NORBOT, a self-actuating garden gnome that literally takes jobs right out of Gromit’s hands. Though wordless, Gromit’s Claymation facial expressions make clear the disgust he feels without tasks to occupy his hands and brain.

Thing is, I understand, somewhat, Wallace’s motivation. For years, advocates of Fully Automated Luxury Space Communism have claimed that technology will render work obsolete a week from next Tuesday, and we’ll have limitless free time to… well, to do whatever. More recently, TechBro types have extolled what they falsely call “Artificial Intelligence” as a means to take writing, music, and art away from the nerds by strictly automating them.

Such advocates see work as burdensome, something to outsource. Socialists have historically considered work something imposed by the economic order, something we can abandon because our high-tech do-funnies will absorb the tedium. TechBros, by contrast, see workers and their jobs as an undesirable sunk cost they’d rather shed. Either way, work becomes something to abolish, replacing ordinary humans with machines, computers, and heuristics.

NORBOT represents only the comical reductio ad absurdum of this mentality. It snatches the pruning shears from Gromit’s paws and, in mere seconds, transforms his lush English garden into a topiary extravaganza completely devoid of character. It subsequently steals Gromit’s yarn and knits Wallace another outfit exactly like the one he always wears. NORBOT works fast, cheap, and efficiently, but without personality or meaning.

Socialist writer Barbara Garson admits she once thought the capitalist class forced workers to work. Only after visiting workplaces and watching the ways employees extract meaning from standardized work did she realize that work says something about workers’ souls. People don’t work because overseers and debt collectors force them to. They work because what they do with their hands, what they create with their brains, defines who they are.

Economist John C. Médaille similarly observes that, if you watch how people spend their free time, it frequently resembles work. Left to their own devices, people might grow vegetables, build Shaker furniture, write novels, perform home improvement, rebuild classic cars, or paint. Although some people certainly drink beer and watch television, complete forfeitures of experience, most people, given the opportunity, seek work to define themselves.

To a limited extent, advancing technology has made such meaning easier to create. Inventions like the steel plow and combine harvester meant that growing crops required fewer workers. In former days, most peasants farmed from sheer necessity. Now, most people can choose whether they want to cultivate the earth, or whether they’d rather make meaning elsewhere. Therefore I’m no absolute Luddite, and embrace technology to a point.

However, I’d contend we’ve surpassed that point. Early twentieth-century inventions made work more productive, and homemaking more efficient. But as Research and Development has superseded invention, most “new” technologies simply complexify existing machines. I struggle to name any technology of the last thirty years that’s genuinely improved our lives. Made us more productive? Sure. But happier, healthier, better developed? I got nothing.

Watching Gromit get his hobbies stolen, I felt the pang of familiarity. We’re all watching capitalists extract meaning from our lives, sometimes without malice. We’re all Gromit now.

Wednesday, January 29, 2025

Bishop Budde and the Prophetic Tradition

The Rt. Rev. Mariann Edgar Budde
(Washington National Cathedral photo)

Over a week ago, Episcopal Bishop Mariann Edgar Budde gave President Trump the gentlest, most benevolent scolding in recent political history. She simply urged Trump (a notoriously inattentive churchgoer) to remember all Americans when governing, not only those who resemble him. This was too much, not only for Trump’s political supporters, but for conservative religious leaders. Trump’s supporters described Budde’s benign concerns as “the radical left just spew[ing] hate.”

Smarter theologians than I have written extensively about the foolish anti-Budde diatribes. Budde’s exhortation to look after marginalized and disadvantaged peoples comes directly from the Gospels, especially the Sermon on the Mount and the Parable of the Sheep and the Goats. Even passing familiarity with Christ’s message makes clear that Christians have a God-given responsibility to care for poor, marginalized, and immigrant populations. Not because they’re especially holy, but because they’re poor.

I’d rather contemplate where Budde’s message situates her. In Budde’s willingness to address Trump directly, and speak explicitly to Trump’s attitudes toward American citizens, I’m reminded of three other prophets: Nathan, Elijah, and John the Baptist. All three had specific, conflicted relationships with Hebrew political leaders, and named the particular sins each leader committed. Heavily churched readers might know that, for the last two, this challenge didn’t end well.

The prophet Nathan lived in King David’s palace and served some undefined advisory role. His only recorded act of prophecy comes after David steals Bathsheba, a married woman, and sends her husband to die in battle. Nathan spins a parable of a wealthy man abusing his poor neighbor; only after David answers the parable with a demand for retribution does Nathan reveal the parable refers to David himself.

By contrast, Elijah condemns King Ahab from outside the palace walls. When Ahab marries a foreigner, Jezebel, and adopts her religious practices, God sends punishment upon Israel. (In Hebrew scripture, all Israel is judged together; it isn’t a religion of personal righteousness, but a moral backbone for the entire nation.) Elijah and Ahab battle for Israel’s soul for years before the unrepentant Ahab dies and Elijah ascends bodily into heaven.

John the Baptist, though a Christian figure, is similarly Jewish. He condemns the priesthood—which, never forget, served as proxy government for Roman dominion in Judea, and therefore was more political than religious. Like Elijah, he condemned a king from outside the palace; unlike Ahab, though, Herod outlived his prophet’s criticism, and ordered John executed. Where Jesus preached an alternate Judaism, John continued the state-based tradition of Elijah, Amos, and Samuel.

The Hebrew prophetic tradition opposes the inclinations of power. In recent years, we’ve questioned who has the authority to “speak truth to power.” Remember a few years ago, when Republicans went berserk because Michelle Wolf took pokes at the administration at the White House Correspondents’ Dinner? Her defenders insisted that court jesters had a history, even a responsibility, to mock powerful people in high places with uncomfortable truths. But, comedians? Really?

No, historically, comedians entertained; if they made political points, that came only incidentally. Indeed, in Shakespeare’s day, public performances were heavily censored, and comedy, like all performance, trended significantly conservative. Comedy only criticizes power in societies where criticism is deemed, a priori, acceptable. In coming years, as the administration promises to become increasingly authoritarian, pointed comedy, like protest songs before it, will become risky and rare.

Instead, religion has the unique capacity to challenge power in its seat. Especially in an administration that uses the forms of religion, but largely ignores its substance, as Trump does, religious leaders can tell politicians and oligarchs the truths that nobody else dares speak. To the extent that Americans generally, and the administration particularly, believe God and capital-T Truth exist, religion has the privilege to speak it.

This doesn’t mean prophets lived safely. Nathan dwelt inside the palace, but Elijah spent his career living as a fugitive. The Northern Kingdom chased Amos out altogether. Jeremiah lived in perpetual fear of crowds, even as he tried desperately to convey the message they needed to hear. John the Baptist died violently, and if Christianity shares the Hebrew prophetic tradition, it’s telling that Jesus and his disciples (except John) all died violently.

If we believe authoritarians need somebody to “speak truth to power,” let’s start with the people who believe Truth exists. Not the people Jeremiah disparaged as “prophets of peace,” either, a category that definitely includes Trump’s so-called spiritual advisers. Rather, let’s find the holy lunatics and angry prophets camping outside the temple walls, shouting. Get ready to eat locusts and wild honey.