Saturday, April 18, 2026

Last Star, Straight Onto Mourning

Catriona Ward, Nowhere Burning

Poor adolescent Riley needs to escape her abusive foster home, and the impossible girl outside her second-floor window offers the sanctuary she needs. All Riley has to do is run away across the Colorado Rockies and learn to fly. As unlikely as that sounds, it’s better than staying put. But when she reaches Nowhere, the strange ruin of mid-century grandeur overlooking Boulder, she finds a compound of frightened fugitives like her, all somehow permanently children.

It wouldn’t be accurate to describe Catriona Ward’s latest book as a retelling of Peter Pan. More like a self-conscious homage. Ward, whose previous works have relied upon sudden reveals and last-minute surprises, offers three converging narratives building toward a long-withheld secret. But this time, the reveals don’t feel earned; they don’t arrive as natural extensions of the ongoing story. It feels like she’s deliberately lied in order to blindside us at the last minute.

In the first narrative, Riley escapes her abusive home situation, dragging along her brother Oliver, too young to understand what’s going on. They hike to Nowhere, the remains of a palatial mansion that burned years ago. There, a commune of adolescents has established a stable society without adults. Riley feels both drawn to and repulsed by their self-reliance, backed by simple, useful roles and their leader’s home-brew religion. They worship something that needs to be constantly appeased.

The second narrative follows Adam. An architect and builder, Adam contracts with acclaimed actor Leaf Winham to build improvements on his Colorado mansion, called Nowhere. Leaf is charming and celebrated, but distrusts fame, and prefers to keep his secrets. Adam feels drawn to Leaf, to the point where he abandons his life, including his pregnant girlfriend. Only when Leaf controls Adam completely, and Adam has nothing to return to, does he begin uncovering Leaf’s dark secrets.

Finally, documentarian Marc and his camera operator, Kimble, have decided to investigate the urban legends surrounding Nowhere and its cult of children. They want to become the first adults to approach the ruined mansion in several decades, and capture its secrets on camera. But the closer they approach the building, the more friction starts emerging from Marc’s deeply buried past, and it becomes increasingly clear that he’ll hurt his closest friends to keep his secrets.

Catriona Ward

It’s obvious, early on, that these narratives unfold out of sequence. Since Adam’s story unfolds in a Nowhere untouched by fire, it clearly precedes the other two. We read in expectation of how the building’s secrets, clearly known in the other threads, will come out in Adam’s past. Also, how exactly do the other two narratives relate? Marc clearly knows more than he tells Kimble, despite calling her the sister he’d always wished he had.

The problem arises here. I’ve read two previous Catriona Ward novels, where plot points revolve around information the characters have, but don’t share. In Little Eve, the narrator is self-consciously telling the story and playing with narrative conventions, in the Agatha Christie style. In The Last House on Needless Street, the secrets are buried under facts the characters take for granted, and therefore haven’t thought about in years. We feel surprised without ever feeling deceived.

Here, the characters clearly know, and often think about, the secrets motivating their choices. They just don’t tell us. The revelations come as raw info dumps, sometimes several paragraphs long. Once the characters reveal the secrets they’ve nurtured, we don’t feel surprised or illuminated, as we have in previous Ward novels; we feel lied to. We can forgive that once, because people lie. But as lie after lie gets revealed, we feel manipulated, not enlightened.

This hurts because Ward’s set-up is so good. Besides Peter Pan, previous critics have compared Ward’s premise to The Shining and Lord of the Flies. Ward isn’t merely imitative, though; she uses these time-honored influences to question how good people with honorable intentions make, and constantly re-make, civilization. Leaf Winham, the charming narcissist, and the children’s religious rituals, are just two forms of community building that work well for adherents, until the moment they don’t.

Reading, I felt like Ward had devised characters, situations, and a nonlinear form that served her psychological writing style well. But she hadn’t figured out how to tie the multiple threads together, so she pulled a Hail Mary and hoped we wouldn’t notice. Maybe I wouldn’t have noticed, if her previous books hadn’t been so good that they set my expectations so high. Sadly, the product feels like a good premise, finished with a cheap rug-pull.

Friday, April 10, 2026

In the Hidden Corners of My Hometown

This West Coast modernist design just sprouts in the middle of a post-WWII development.

Flailing my way through protracted unemployment, I recently started driving DoorDash to get cash coming in. My community is too small to produce enough business for me to live off my gig, but it brings in enough to keep groceries on the table. The gig has provided another important education I didn’t realize I needed: despite living in one small city for over twenty years, I’ve discovered how much of town I just don’t know.

My central Nebraska city has a population slightly above 30,000 people. By current American standards, that’s dinky, but on a historical basis, actually quite large. Legendary ancient cities like Chichen Itza or Babylon topped out around 20,000 people, the practical maximum for societies where the majority needed to farm, and urban infrastructure had to primarily support pedestrians and mule carts. Modernity can support much larger populations, though, mostly because of cars, electricity, and Portland cement.

Modernity has also produced something that ancient cities could’ve never supported: single-use zoning. When cars put much larger distances within easy reach, it makes less sense for citizens to build a business in front of their house, a stable in back, and extra rooms for an inn on the side. American communities are now built in sprawling, monolithic ways that discourage visitors. There’s little reason to visit huge swaths of one’s own city without a prior invitation.

This results in acres upon acres, streets upon streets, that I’ve never visited—until now. DoorDash invites me into single-use residential neighborhoods I’ve never previously had purpose or permission to enter. Visiting these quarters for the first time, I witness eclectic architecture, some of it deliberately either minimalist or rococo, and differing ideas about how large the surrounding yard should be. I’ve also noticed that, the newer the community, the less likely it is to contain sidewalks.

Very large lawns, without sidewalks or parks, encourage children to play close to home. Current urban design (which, often, means no design, just vibes) discourages children from one of childhood’s primal impulses, the desire to explore. Wandering away from home may be impractical in new developments and, depending on traffic patterns, unsafe. This means children only have opportunities to meet friends and make connections in officially approved spaces, mainly school and, for some, religious congregations.

Just one of a development of identical crackerbox duplexes with postage stamp lawns,
no sidewalks, and no curbside parking—completely hostile to visitors or teenagers.

The extreme opposite, I’ve observed, is small houses, mainly duplexes, on small lots. These are single-story houses with attached garages, a design that spreads each dwelling across a comparatively large physical footprint. However, these developments also lack sidewalks, which means not only no pedestrians, but no curbside parking for guests. These houses seemingly go mainly to young families as starter homes, so maybe they don’t entertain much. But the design dampens their ability to perform time-honored neighborhood rituals of group bonding through hospitality.

Small starter homes have no parking and no place to set up picnic tables. Larger homes for established families have space and parking, but are so far away that neighbors can scarcely see one another. Either way, these designs discourage traditional neighborhood activities, like block parties or tenants’ unions, and functionally prevent neighbors from getting to know one another. The McMansions, in particular, look awkward, flexing their design flourishes to impress neighbors they’ll never meet.

Traveling to shared spaces, like work or school, requires either an overland hike without sidewalks, or car rides that create traffic jams. My city is small enough that “jams” are fleeting annoyances. But larger unplanned cities like Houston, which is over forty percent paved, can be dangerous during the morning commute. Ambulances trapped in rush-hour traffic have become a notable part of the Houston experience; so has the city’s demonstrated inability to drain after Hurricane Harvey.

Current urban design standards divide routine activities. This isn’t entirely awful, as most people wouldn’t want to live beside a lead smelter, kimchi cannery, or hog abattoir. But most people also can’t walk to restaurants, shops, or even their neighbors’ houses. All daily business happens enclosed in hermetically sealed, climate-controlled metal capsules. Ordinary people have diminished opportunities to make friends, discover quirky experimental businesses, or, as I’ve learned recently, see most of their own town.

Old cities like central London, Paris, or New York south of Houston Street are designed around human needs: useful sidewalks, homes designed to double as business sites, and multi-story structures that utilize vertical space as assertively as horizontal. We can’t just regress, because history proceeds, even when we wish it wouldn’t. But we can look to older spaces for inspiration, for innovative ways to build newer spaces that aren’t hostile to visitors.

Thursday, April 9, 2026

The Deep, Dark Mines of the Uncanny Valley

T. Kingfisher, What Stalks the Deep

Shellshocked veteran Lt. Alex Easton’s sole qualification to investigate unexplained phenomena is that they’ve seen such things before without flinching. But where they previously fought ineffable monsters in their native Gallacia, a mysterious Eastern European land of dismal swamps and forests primeval, this time, they’ve been invited to America. But then, if there’s a place as old and as hostile to humankind as Gallacia, it must surely be Southern Appalachia.

T. Kingfisher’s “Sworn Soldier” novellas, starring Alex Easton, whose unique gender identity doesn’t translate into English, each delve into different horror subgenres. The first retold a Poe classic, highlighting a theme Poe introduced, but didn’t explore. The second followed the conventions of folk horror. This third unpacks a theme popular in recent movies: the legend of mysterious humanoids dwelling in the caverns and mines permeating America’s eastern mountains.

Dr. James Denton, a supporting character from Easton’s first story, has telegrammed Easton for their help. He admits Easton isn’t particularly qualified, except that they’ve faced similar conflicts before, and he needs a partner who won’t ask stupid questions. So Easton crosses the ocean, rides America’s rails, and walks into West Virginia’s dark, forested mountains, a terrain from which more intrepid explorers have frequently failed to return.

Many American folk myths speculate that something dark and mysterious dwells underground, a horrible monster which we’ll uncover by mining for hydrocarbons or even just spelunking. This monster is usually whispered to be older than humankind, and eager for small provocations to resurge and take America from us. Of course, this is coded language. We “Americans” know who we stole this land from, and why they deserve to reclaim it.

Kingfisher salts these themes with a Lovecraftian influence which she acknowledges in her afterword, but which she doesn’t hammer needlessly. Rather, she describes two war-torn old souls, walking wounded, who investigate a land older than human conception. There, they discover a cavern that cannot possibly exist, guarded by a force so close to human that its very existence personifies the uncanny valley. But that force is holding something worse back.

T. Kingfisher (a known and public
pseudonym for Ursula Vernon)

Reading this novella, I’m reminded of Stanisław Lem’s signal classic, Solaris. Both stories feature humans encountering an intelligence so different from themselves that they cannot truly communicate. Though Easton and Denton have more success than Kelvin in making peace, they struggle with some of the same problems. What does it mean to “communicate” with an intelligence that isn’t human? Or to speak individually with a collective intelligence?

But our protagonists bring something to the story that neither Lovecraft nor Lem considered: capitalists’ willingness to burn everything that doesn’t turn a profit. Lovecraft’s shoggoths and Lem’s ocean planet encounter humans primarily through scientists and explorers. Kingfisher’s primordial intelligence comes to light because humans dynamited the mountains and uncorked Earth’s mantle in search of power and money. Therefore, “first contact” means not curiosity, but pain.

I’ve become a particular Kingfisher fan because she reverses widespread cultural expectations. In this case, besides Easton’s blunt defiance of the Anglophonic gender binary, Easton also sees America as exotic and foreign, reading America back to Kingfisher’s audience. Burned out on conflict, Easton sees American glorification of the Spanish-American War as bizarre and uncivilized. America’s much-bandied national youth seems ridiculous amid Appalachia’s uncountable antiquity.

One could continue unpacking Kingfisher’s themes. Cartesian dualism versus the Freudian psyche, perhaps, or the failures of technological triumphalism in the face of Earth’s unimaginable age. Kingfisher plays with these thematic contrasts and reversals like Lego bricks, creating a whole that readers recognize from previous books, but which is entirely her own. Her ability to use common strategies to tell an uncommon story is why I’ve become a Kingfisher fan.

Although this story remains short, it’s the longest yet of Kingfisher’s Sworn Soldier novellas, over 170 pages plus back matter. This gives Easton space not only to investigate their themes, but also to confront the monster. But this story also has perhaps the largest company of characters yet, and Kingfisher doesn’t give everyone full development. Easton’s loyal batman Angus, in particular, gradually disappears from the story, which is disappointing.

That said, this story largely maintains the momentum of the previous “Sworn Soldier” novellas. Though I might wish the story were about fifty pages longer, to give every character the space they deserve, that would’ve changed the novella-reading experience. Kingfisher’s distinct voice and nonconformist attitude remain visible and keep the narrative popping. It reads like a slice of popular literature, just refracted through a lens unlike anything you’ve read before.

Thursday, April 2, 2026

The White Privilege Party, Part 3

This essay is a follow-up to Dinner and Drinks at the White Privilege Party and The White Privilege Party, Part 2.
Woody Guthrie

If the fash hate one thing, it’s being called fash. Or even told they’ve done something fashy. Even when faced with their overwhelming fascism, or with subject experts like Timothy Snyder or Jason Stanley demonstrating their fashy tendencies, they become angry and defensive. President Taco’s claim to be “the least racist person there is” has become the tragicomic emblem of fascists’ need to be seen as nevertheless good.

Returning this series to where it began, the question remains of whether protestors should use confrontational chants while challenging the current administration. Specifically, whether they should use Woody Guthrie-type songs to call fascists “fascist” to their faces. In conservative, semi-rural, and racially homogenous places, such boldness will precipitate conflicts, which discourages White protestors from getting involved.

“Fascism” is a notoriously slippery concept, since it adapts itself to local conditions. Snyder and Stanley have useful, but often inconsistent, definitions. For our purposes, let’s define fascism as the hardened and intolerant extreme of the hierarchy I described last time. Fascism not only requires some people to remain powerless for others to be powerful, and divides power racially, but enforces this mandatory division through arbitrary violence.

The history of hierarchical violence reveals something remarkable. As theologian James Cone writes, Jim Crow racial violence didn’t happen merely to kill its targets. It happened to remind survivors that the perpetrators would face no consequences, because they owned the system. Likewise, the Roman church didn’t burn witches and heretics to force conversions in early Christendom. It only burned nonconformists in the Renaissance, once its political power was unquestionable.

Put briefly, hierarchical violence happens when perpetrators know they’ll face no meaningful punishment. In my lifetime, Kyle Rittenhouse, George Zimmerman, and Bernard Goetz knew or suspected that the racially slanted justice system wouldn’t hold them accountable for shooting Black people or their White sympathizers. So they strapped on guns and went hunting on American streets.

We’ve watched “red states” legalize driving cars into protestors. We’ve watched them refuse to prosecute bullies attacking children. We’ve watched the current administration target harmless dissidents on camera, knowing they won’t be prosecuted, or even meaningfully reprimanded. The deferral of each consequence basically ensures that the next street-level fash will feel authorized to attack, maybe even to kill.

Equally importantly, perpetrators don’t see themselves as villains in this arrangement. Fashy narratives reinforce the belief that hierarchies are necessary, and therefore equality is oppressive. Any attempt to fix unfairness is innately unfair to those who benefit, or think they do. Therefore those protected by the status quo, even the poor and forgotten, are all too likely to violently defend what dwindling privilege they have.

The term “extinction burst” has become modish recently. Once you remove reinforcement from previously rewarded behavior, the behavior becomes more extreme and calcified before it disappears. Recent discussions spotlight violence specifically, as America’s overall culture no longer rewards racism, homophobia, and other bigotry as openly as before. But that exact change puts protestors in conservative areas at greater risk.

Please don’t misunderstand: I know these forces are contradictory. People are violent because they know nobody will hold them accountable, but they know nobody will hold them accountable in the exact places where their dying ideology still matters. Florida, which legalized driving cars into protests, has one of America’s oldest median resident ages. Nebraska, where prosecutors won’t charge men who attack kids, remains substantially isolated from the larger economy.

This paradox underlies Critical Race Theory. CRT founder Derrick Bell claimed, with evidence, that racism has proven infinitely elastic as its successive justifications become obsolete. Violent economic necessity justified slavery, but morphed into organized bigotry under Jim Crow. Once the state withdrew support, bigotry became disorganized, like background noise. With each morph, the system excommunicates its former defenders.

The three vigilantes I named—Rittenhouse, Zimmerman, and Goetz—all retreated into anonymity after their acquittals, and became parodies of their prior selves, because their persons didn’t matter. They claimed “self-defense,” but their selves were an afterthought. Their supporters abandoned them because once they bolstered the narrative that White (or White-adjacent) people owned the system, that system no longer needed them.

White progressives fear angering the fash by calling them fashy to their faces, not only because fashies are violent, but because they’re as much displaced by the cultural shifts happening around them as the conservatives are. They’ll hang onto their illusions that they can persuade the fash, because the alternative is plunging headlong into uncertainty. The old system is dying, and to those accustomed to winning, that’s terrifying.

Wednesday, April 1, 2026

The White Privilege Party, Part 2

This essay is a follow-up to Dinner and Drinks at the White Privilege Party.
Striking teachers in the West Virginia statehouse, 2018 (CNN photo)

Political commentators conventionally date the decline of American labor unions to President Reagan mass-firing the PATCO strikers in 1981. But I think the process started much earlier. After peaking in the 1950s, union membership has declined steadily. Though reliable statistics go back only to 1983 (everything prior is estimates and probabilities), union membership rates have halved in that time. This decline has correlated with another powerful social force.

Desegregation.

Ian Haney López dates union desegregation to 1973, and claims that the battles centered on seniority. White laborers, López claims, would rather relinquish all union protections than surrender the senior standing they achieved under racially biased rules. Tacit within this refusal, though, is the corollary that White workers refused to negotiate alongside Black workers. Too many White workers would rather suffer than see Black people share their protections.

I cannot verify this 1973 date; FDR desegregated defense contractors by executive order during World War II, while Truman desegregated the military in 1948. The American Federation of Labor recognized its first majority-Black union, the Brotherhood of Sleeping Car Porters, in 1925. Union desegregation seems more gradual than abrupt. The point remains, however, that the more inclusive unions became, the more White workers abandoned them.

I’ve begun this essay with labor unions because they’re quantifiable. And of course, correlation doesn’t equal causation; White workers might’ve coincidentally decided both that they didn’t need union protection and that they didn’t want to work alongside Black co-workers. But the third prong of the trident, the election of softball racist Ronald Reagan, of “strapping young buck” fame, suggests that racism directed White workers’ economic choices, not vice versa.

This pattern recurs throughout American history. Critics have condemned Nikole Hannah-Jones and her 1619 Project for suggesting the Founding Fathers created America specifically to protect their racial hierarchy. But the fact remains that, after the American Revolution, nine of the thirteen original states, including New York and New Jersey, still practiced slavery. White Americans who talked up liberty and autonomy needed ninety years to fully stop enslaving Black Americans.

And then Jim Crow began.

Bull Connor looses the dogs on protesters in Birmingham, Alabama, on May 3rd, 1963

Nobel Prize-winner Toni Morrison wrote that American values have long valorized individualism and autonomy; but such values have weight only to the extent that they’re denied to some Americans. Me being unfettered only means something while someone else remains restrained. Morrison, a novelist, meant this specifically in literary terms, because in fiction, we can abstract such values to broad moral precepts. But the same principle applies to society writ large.

In today’s America, “peace” doesn’t mean the stability necessary to pursue our physical and spiritual well-being, it means the absence of war. “Wealth” doesn’t mean physical comfort and a full belly, it means the power necessary to employ other people to look after your stuff. “Law” doesn’t mean reliable systems of social order, it means violent crackdowns on nonconformists and the poor. We define our shared values oppositionally.

And, as Morrison writes, we often use race as mental shorthand for this opposition. Sure, sometimes we signify “the other” with other external signs, like hair or piercings. But if White punk rockers want acceptance from the squares, they can shave their Mohawks and remove their tongue studs. Black and Hispanic people can’t stop being Black and Hispanic, and therefore can’t stop being shorthanded as “less than.”

It’s easy, considering American public mythology, to forget that when Dr. King was assassinated in Memphis in 1968, he wasn’t there to mobilize for racial justice. He was there to help unionize the city’s sanitation workers. Sure, those sanitation workers were overwhelmingly Black, but Dr. King had recognized the inextricable bond between American racism and economic injustice. Poverty and Blackness occupy the same headspace in the American imagination.

Concisely put, America organizes itself into in-groups and out-groups, then racializes the groups to simplify remembering who belongs where. The same redlining practices that preserve segregated neighborhoods have also segregated labor forces. The minute Black people wanted union protections, White workers began embracing myths of radical individualism, even as such individualism left them broke and powerless against billionaire business owners.

Better broke than Black, amirite?

We’re now seeing some of this rolled back. Black deaths caught on camera have ignited a sense of justice in some White Americans, though not yet enough. But it’s carried its own pushback. Capitalists like Elon Musk and Larry Ellison have sought political power that would’ve made Cornelius Vanderbilt or Andrew Carnegie blush. But it’s all for the same goal: maintaining the hierarchy of haves and have-nots. Which is, usually, racial.

Concluded in The White Privilege Party, Part 3

Tuesday, March 31, 2026

Dinner and Drinks at the White Privilege Party

Citizens protest the continued ICE presence in Minnesota, January 2026

What moral obligations do White people have, especially White men, to risk death when protecting the powerless in American society? This question confronted me this week when I answered a query on social media. Somebody described her White friend in semi-rural Pennsylvania trying to organize a No Kings protest. She wanted to incorporate conventional anti-fascist songs and chants, but her White co-organizers feared becoming “too confrontational” and alienating their neighbors.

I admit I fumbled my answer. I said something mealy-mouthed about regions where somebody arriving with a Gadsden flag and a gun was entirely likely. Progressives living in broadly conservative areas know that those threatening our organized activities face few consequences. Just last month, in my state, an adult man attacked a line of protesting high school students—literal children—and local prosecutors declined to file charges.

Somebody answered my response by saying I failed to understand the Black American experience. Which, as a White cishet man, I probably do. This respondent pointed out that Black Americans face violent pushback for even the most anodyne protest. In fairness, I’ve shared enough protest space with Black and Hispanic people to have witnessed this firsthand, but like anything merely witnessed, that’s not the same as experience.

In all things, I strive to remain fair and broad-minded, and if I’m wrong, I want to amend my ways. So I’ve thought about this response. I could, if I wanted, mumble something about the limits of social media. Especially on platforms which cap character counts, like Xitter, Threads, and Bluesky, it’s impossible to make nuanced arguments. Conditional experiences, including the Black, queer, disabled, or womanist experience, will get elided.

But that’s merely an excuse. Several deeper issues conspired to reach this moment. First, humans naturally desire to stay alive. My sense of moral outrage at persistent American racism arose because racialized violence contravenes the human desire to survive, and places a sliding scale on human life. If some people deserve to remain alive more than others, autocrats could eventually use that relative deserving against me. Or you.

Accused vigilante Kyle Rittenhouse, captured on a cellphone video

This isn’t new or unique. The “Women and children first” ethic made famous on the RMS Titanic was written after literally no women or children survived the SS Arctic disaster. Men on board wanted to survive so badly that they elbowed smaller, weaker women aside. Human-made moral codes like chivalry, the law of the sea, and bushido generally arise to restrain powerful people’s tendency to value their lives over others’.

Honestly, I don’t have to die for justice. As a White man, I could passively acquiesce, and pay little price. This, then, is the obverse of my respondent’s claim that Black people live with constant threats of violence. If White people can walk away from threats and survive, they will walk away. They won’t persevere if their lives are jeopardized, unless the threat of leaving is worse.

Forgive me bringing up old stuff, but here’s where Kyle Rittenhouse enters the discussion. Even I misunderstood the meaning when it happened, focusing on the red-herring language of “self-defense.” But Kyle Rittenhouse’s real lesson was that, if White people stand up for Black lives, human law won’t protect them. White lives don’t matter if they don’t prostrate themselves before a White political apparatus.

In John’s Gospel, Jesus Christ says: “Greater love hath no man than this, that a man lay down his life for his friends.” (15:13 KJV) But Jesus never says anything unless its opposite flits through the common ethos. If we think morality should result in material reward, then dying for others is a failure. Only when justice is communal, not individual, does dying for the cause become an accomplishment.

And right now, in America’s White spaces, communitarian justice just doesn’t exist. That’s why private-sector labor unions are dying, protests are special occasions, and, as Charles M. Blow writes, White people can abandon racial justice actions when the weather becomes harsh. Community justice, once a shared value in America’s factories and farms, has dwindled as those spaces have become racialized, and White people live in atomized suburbs and single-family homes.

Under such conditions, asking White people to risk death simply because it’s right is a non-starter. Not until our political, religious, and social leaders reclaim a vision of shared consequence will that change. Unfortunately, that won’t happen while voters, congregants, and citizens don’t reward it—the ultimate circular reasoning. The feedback loop won’t break until the fash, as they inevitably do, turn against the people who first gave them power.

Continued in The White Privilege Party, Part 2 and The White Privilege Party, Part 3

Wednesday, March 25, 2026

How To Read the Constitution Like a Scholar

Jill Lepore, We the People: A History of the U.S. Constitution

When the militants behind the American Revolution wanted to build a government, the idea of a “constitution” already existed, but was mainly abstract. European countries like France and Spain derived constitutions from scattered law, tradition, and judicial practice; to this day, the British constitution remains unwritten, and high court proceedings often include debates about what, exactly, the constitution is. America’s Founders pioneered another idea: writing the constitution down.

Harvard historian Jill Lepore has written about the social and political forces which shape American politics. With this volume, she focuses specifically on the forces which shape our Constitution: not only the text itself, but legal interpretations, public debates, and the amendment process. Though Lepore doesn’t say it, she tacitly acknowledges that America’s constitution far exceeds the document, comprising also the institutions and handshake conventions created to make the document enforceable.

Americans once loved the constitutional process, Lepore writes. Not only the national, but the local. Ratification of the current Constitution was the subject of lengthy, sometimes combative public discussions, and the original text as written satisfied nobody, though it became the text Americans could live with. Meanwhile, for over two centuries, state constitutional conventions happened, on average, once every eighteen months, and state governments almost aggressively amended themselves.

Then we stopped. America hasn’t seen a state constitutional convention since 1986 and, although the states ratified the 27th Amendment in 1992, it was a procedural asterisk; the federal Constitution hasn’t been meaningfully amended since 1971. Certainly we can’t say that the need for developing institutions has dwindled; if anything, events of the 2010s and 2020s revealed how fatally outdated and unresponsive our Constitution has become. What caused the change?

Lepore answers that question through the debates which surrounded the original Constitution and its amendments, successful and unsuccessful. The Founders, mostly Enlightenment rationalists, believed government could operate smoothly as a machine if removed from frail human hands, and when the original Articles of Confederation proved unsuccessful, the 1787 Convention proceeded with the attitude of social engineers. Lepore compares the 1787 Convention with concurrent developments in clockwork technology.

Dr. Jill Lepore

Almost immediately, though, Americans began demanding amendments. The original Constitution was almost entirely procedural, and omitted the moral imperatives which drove the Revolution and Shays’ Rebellion. The first Congress wanted to shepherd through a Bill of Rights, but Article V didn’t even include instructions for “correct” amendments: should changes be incorporated into the original text, or tacked on as appendices? Congress chose the latter, after some contention.

As written, the amendment process proved cumbersome. Savvy news and history readers already know this. But Lepore delves into procedural hurdles that well-meaning lawmakers, Left and Right, have faced, and how they overcame them. Sadly, one tool for overcoming intransigence is, apparently, war. After the first twelve amendments ironed out procedural and rights quirks, subsequent amendments have mostly happened in clusters surrounding the Civil War, World War I, and Vietnam.

Despite the Founders’ vision, the state machine didn’t prove immune to human influence. Lepore describes how intervening events, like the Civil War or the annexation of Hawaii, changed the Constitution’s meaning. The text didn’t vary, except where amended, but as circumstances made Americans reevaluate themselves, we also reevaluated our unifying text. America’s political leaders changed their constitutional reading to allow, say, annexing Hawaii whole, which changed our shared identity.

Likewise, powerful people—mostly unelected—changed the Constitution by changing relevant practice. Supreme Court cases like Plessy and Roe read certain interpretations into procedure; Brown and Dobbs read them back out. Philosophies like “Originalism,” which arose in tandem with changing opinions about abortion, created interpretive lenses which courts used to create or abolish rights, until they didn’t. The text hasn’t changed in 55 years, but the Constitution has changed wildly.

Reading this book, I recall constitutional scholar Mary Anne Franks, who compared constitutional adherence to religious fundamentalism. If the Constitution has become holy writ, then Lepore’s telling reads like a history of hermeneutics, the processes of scriptural interpretation. Just as Christians have read and reinterpreted the Bible in light of surrounding cultural influences, Americans have reinterpreted the Constitution to reflect the conditions in which our country lives.

This religious comparison isn’t flippant. Late in the book, Lepore writes that nations treat new constitutions as tools, but old constitutions, not just America’s, become venerated. The American Constitution was once esteemed so lightly that the original sheepskin parchment got misplaced; now it’s a sacred relic of state sacrament, hardened against nuclear attack. If Americanism is a religion, then changing hermeneutics deserves serious, almost monastic study.

Wednesday, March 18, 2026

Reading and Thinking in a Paranoid Age

Johann Hari

Yesterday, as I write, I finished reading a book. Once upon a time, this wouldn’t have merited an announcement; I did it as regularly as breathing. But this has become more rare and remarkable, and as a book blogger, I have concrete evidence that this is the first time I closed a book and proclaimed “Finished!” in nearly three months. Not that I haven’t read, but I haven’t followed one book through to the end.

Nor am I alone. Anecdotally, my friends report a massive increase in doomscrolling, perhaps the most passive activity which modernity permits. One sits with a small, pocket-sized computer, flipping listlessly through two or three orphan apps, hoping something jumps out urgently enough to fill the spiritual void we all apparently share. Nothing arrives, of course. But the hope of finding something provides a greater sense of reward than getting up and doing something constructive can.

Johann Hari synopsizes the multi-pronged science behind this decline. Some of it comes innately from just getting older, as it becomes harder to create new synaptic connections. Activities which come easily to youth and young adults, like reading, studying, or handicrafts, just grow harder for adults, and we need to develop discipline enough to overcome this. So yes, if reading and art seem more difficult than when you were small, that isn’t just rosy-eyed nostalgia.

But the problem isn’t wholly internal. Technology critics note that our smartphones, tablets, and other screen technology have addictive qualities. App developers maximize the hypnotic quality of their interfaces, utilizing design principles that make us want to stare. Streaming content on platforms like Netflix and Disney+ has more camera cuts and other jolt moments than the broadcast television I grew up watching, which triggers the reptile brain to keep watching, scanning for further life-saving inputs.

I cringe, though, at the word “addictive.” The concept of addiction gets misused in government PR and middle-school “Just Say No” curricula. Often, to describe something as “addictive” implies almost magical properties, like a cursed object that weakens and destroys its owner. This isn’t so. Not everyone who tries cannabis or cocaine becomes addicted, just as not everyone who fiddles on social media on their phones becomes addicted. Something deeper and more primal happens first.

Dr. Gabor Maté

As addiction specialist Gabor Maté writes, addictions develop under specific circumstances. Some people become addicted after life-shaping traumas: childhood abuse and neglect mold children’s brains in ways that protect them as kids, but are maladaptive in adults. Addicts consume their product to numb their trauma scars. Other addicts have more fleeting issues. The second-leading cause of addiction is loneliness, which addicts can overcome through sociability. For AA participants, spirituality arguably matters less than the meetings.

What, then, turns screens addictive? Returning to Hari, he writes that certain life experiences create trauma-like effects on the brain. This includes certain forms of uncertainty, including poverty, homelessness, and war. Many American soldiers notoriously became heroin addicts in Vietnam, then cleaned up when they returned to civilian life. I grew up believing that people became homeless because they were addicts, but that’s backward; they become addicts because they’re homeless. Substances take the fear away.

America’s economy has created unprecedented prosperity, but hasn’t distributed it equitably. Elon Musk is currently angling to become our first trillionaire, while uncountable underemployed Americans rely on multiple part-time jobs and gig work to stay afloat. I’ve bounced among short-term jobs for three years, often panicking to keep rent paid and lights on. When I pause between hustles, that allows time for thoughts to emerge, reminding me of every bill about to go into arrears.

Hari and Maté agree that such uncertainty warps the brain. In conditions of constant fear, the limbic system, and especially the amygdala, gets bigger, while the prefrontal cortex withers. A larger amygdala makes us highly reactive, even downright paranoid. An atrophied cortex means less self-discipline, and just as importantly, less ability to empathize with strangers. Both of these make us too impatient for the detail work and contemplative pace of reading or of creating art.

Uncertainty and paranoia have become our standard of life. Not just economic uncertainty, but street violence and wars of choice permeate our daily lives. This results in a population more primed for fear, snap reactions, and restlessness. Into that circumstance, streaming TV media increasingly gives us very loud, aggressive, juddery content that sates our need for stimulation. Something as sedate as reading or listening to classical music seems quaint. So no, it’s not just me.

Tuesday, March 17, 2026

T. Kingfisher and the Kingdom of Free Women

T. Kingfisher, Nettle & Bone

Princess Marra of the Harbor Kingdom is a spare daughter, never to inherit, whose only hope for advancement is to wed a prince, someday. Until then, she’s foisted onto a provincial convent while her older sister gets the prestigious marriage. But she discovers the truth: her sister is a political pawn, abused and terrified, reduced to a walking shadow. Naturally, Marra decides to organize a campaign to assassinate the patriarchy.

In the last year, I’ve become a massive fan of T. Kingfisher’s novellas. She channels classic literature and folklore, refashioning the background noise of our dreams into insightful dark fantasy. This is the first of Kingfisher’s full-length novels I’ve read, and instead of remaking a specific story, she uses images cherry-picked extensively from Grimm’s Fairy Tales. The product turns childhood mythology into a grown-up fable of power, resistance, and self-reliance.

Marra’s story begins when she’s already past thirty. She chastises herself for being an adult and still believing the legendry of “happy ever after.” Her sister’s marriage to a handsome prince, solemnized by a literal fairy godmother, has proven disastrous. Perhaps Marra’s awakening comes late, but it nevertheless comes. So she leaves the religious cloister and begins walking, seeking the magical assistance that will help her liberate her family.

Psychologist Bruno Bettelheim wrote that, in the realm of fairytale, the bond between siblings matters more than that between spouses. That certainly applies here. Marra and Kania had a contentious relationship as children, but as adults, their mutual trust and self-reliance give them strength when faced with duplicitous adulthood. Kingfisher’s narrative maps so perfectly onto Bettelheim’s Jungian prototype that it’s tempting to psychoanalyze her story.

However, this is a false temptation. Kingfisher creates a dreamlike atmosphere, appropriately devoid of proper nouns. Many characters are identified only by their roles: the king, Sister Apothecary, the dust-wife. When characters merit names, it’s only first names, usually Anglo-Saxon: Marra, Agnes, Fenris, Miss Margaret. Even countries have names like the Harbor Kingdom and the Northern Kingdom. (One country has a name, but it’s distant and half-mythic, like Avalon.)

T. Kingfisher (a known and public
pseudonym for Ursula Vernon)

Characters and places lack names, here, because they belong only to a stage in Marra’s life. Bettelheim’s map of fairytale describes children transitioning into adulthood, with accompanying adult roles, like marriage and family. But Kingfisher describes a subsequent transition, where adults finally shed the conditioning of childhood storytelling. Princess Marra was first conditioned by the royal court, then by the convent. Now she must at last become herself.

Prince Vorling of the Northern Kingdom, Marra’s brother-in-law, is indeed handsome and charming. He’s also violent, domineering, and jealous. He maintains power, over both his kingdom and his family, through exaggerated displays of male swagger, and he sacrifices all relationships to maintaining the illusion of control. He truly desires to be a fairy-tale prince, and he’ll brook no intrusion on that story from annoying human foibles.

Therefore, Marra literally walks away from her society’s twin institutions of power: the royal court and religion. She spent over thirty years appeasing the dual threats of state violence and eternal judgement. Now she must obey the only instrument more true than either the kingdom or the gods, her own conscience. If that means taking a dagger to the power structures of two kingdoms, well, so be it.

Along the way, she assembles her company: the dust-wife (a vaguely defined sorceress), her mousey fairy godmother, and a massive, gentle-hearted warrior. Oh, and Bonedog, the company mascot, whose name says it all. He’s a dog resurrected from reassembled bones. If this sounds like somebody’s Dungeons & Dragons campaign, I won’t disagree, and the story has the semi-improvisational feel of a dungeon master trying to wrangle the players back on track.

Kingfisher’s product invites comparison to Tolkien, Michael Moorcock, and Andrzej Sapkowski, writers who mix dreamlike whimsy with painful grown-up realizations. Kingfisher’s characters march against the arrayed ceremony of kingdom and state religion, knowing death is likely, simply because it’s right. Princess Marra doubts herself and, without her companions’ support, would probably back out. But together, they form their own morally succinct counterculture, linked by conscience and trust.

Please don’t misunderstand. I’ve deployed terminology from psychology and lit-crit, but one could read Kingfisher’s narrative as a rollicking adventure. Like the best literature, though, the story exists on multiple levels. Kingfisher uses playful genre boilerplates to make her message acceptable. But she also reminds us, in this post-MeToo culture, that “happily ever after” relies on the honor system. If Prince Charming lacks honor, then sisters must stand together.

Other reviews of T. Kingfisher books:
Man You Should’ve Seen Them Kicking Edgar Allan Poe
Secrets Buried in the World’s Darkest Corners
The Sleeper and the Beauty of Dreams

Monday, March 2, 2026

Will We Ever Get Tired of Re-Fighting Old Battles?

Promo still from the last time someone dragged The X-Files out of the deep freeze

This weekend’s illegal American bombing of Iran arrives hand-in-glove with another cultural announcement: Hulu is relaunching The X-Files. Preliminary announcements call it a “reboot,” but deeper reportage suggests it’s more a soft reboot, a continuation with new leads. Simultaneously, reports suggest there might be a long-awaited season five for Veronica Mars. (This is more ambiguous, maybe a misreading of the series being acquired by Netflix; the wording is fuzzy.)

I’ve complained before about the cultural currents behind constant reboots. Pop culture is always behind the times anyway, and the flood of streaming media has made the biggest entertainment conglomerates more timid, not less. But this feels different. The resurrection of two popular franchises, thirty-three and twenty-four years old respectively, amid a “Make America Great Again” culture feels more than timid. It feels like a hasty retreat from reality.

Throughout his 2016 campaign, the Current President decried urban violence and burning cities, despite such violence being at near-historic lows. But his rhetoric makes sense in his life context, as the Bronx famously caught fire in the late 1970s, the same time he moved into Manhattan real estate with his purchase of the former Commodore Hotel. The poor future President was simply trapped in the sociopolitical milieu of his thirties, unable to grow.

Similarly, this weekend’s bombing of Iranian civilian targets mirrors the President’s unhealed past. Consider his inability to stop heaving accusations against the Central Park Five, nearly a quarter century after they were exonerated. This President retains grudges and political interpretations molded by a privileged youth and segregated social set. In context, he likely bombed Iran, not really for its nuclear program, but as payback for the 1979 Hostage Crisis.

This has become the default for much American politics. We aren’t facing the past, we’re relitigating the past. In the 1980s, both political discourse and mass media desperately wanted to re-fight the Vietnam War, but correctly this time. Franchises like Iron Eagle, Rambo, and Top Gun promised to purge America’s Vietnam disgrace. More recently, Call of Duty and James Bond try to tweak our memory of the Cold War.

Caught in the interregnum between the Cold War and the Global War on Terror, the Clinton decade offered enforced cheerfulness, a frothy meringue of Empire Records and Ben Stiller’s early career. The X-Files directly countered that, maintaining post-Reagan cynicism toward America’s surface culture. Scully and (especially) Mulder walked through neon-soaked midnight landscapes, uniquely able to see the venality that made that era’s party ethos possible.

Kristen Bell in the original network run of Veronica Mars

Veronica Mars pushed this contrast to the extreme. Read superficially, the series presented a stereotyped Southern California panorama, all hypersaturated colors and loud, jangly indie pop soundscapes. Only Veronica and her father—and, eventually, those trapped in their decaying orbit—understood the vulgar horse-trading and human commodities that subsidized Neptune, California’s skin-deep glamour. Veronica, like Mulder, was ready to expose the lie, damn the consequences.

Both franchises took dim opinions of power structures. Veronica Mars fought plush-bottomed police as often as criminals, while Mulder and sometimes Scully brought official corruption to light despite, not because of, the law. But both presented a morally distinct, binary universe. Neptune’s Sheriff Lamb and the Smoking Man were clearly evil, and needed to be exposed to a public which their shows depicted as passive and sheep-like, desperate for an underdog hero.

Unfortunately, the political tenor has changed. From the impotent government depicted in the 1970s, to the malignant one of the 1990s, the problem has been presented as siloed at the top. The disclosure of the Epstein documents, like the Panama Papers before them, has revealed a network of politicians, capitalists, entertainers, academics, and scientists colluding to support an otherwise decrepit system. The “secret” isn’t secret anymore.

While politicians and media captains want to refight the battles of their, or our, childhood, rapidly unfolding news reveals their vision of the problem as charmingly naïve. Scarcely a top-tier capitalist or government insider failed to share information with Epstein. Public intellectuals like Noam Chomsky and Richard Dawkins had their hands in his pockets. The rot isn’t an isolated, partisan tumor. Everyone, everywhere in the system, has been proved complicit.

Veronica Mars and The X-Files helped define a generation’s idea of acceptable villains. They showed our lawkeepers were complicit with lawbreakers in the anarchy most people felt in their ordinary lives. But reality has overtaken the scope these shows made possible. Bringing back the monsters of my twenties is worse than quaint. It offers audiences my age an excuse to avoid the monsters that have revealed themselves in reality.

Saturday, February 14, 2026

Lee Brice in Country Music’s Nostalgia Pits

Lee Brice (promo photo)

Lee Brice debuted his new song, “Country Nowadays,” at the TPUSA Super Bowl halftime show on February 8, and it was… disappointing. Brice visibly struggled to fingerpick and sing at the same time, and gargled into the microphone with a diminished rage presumably meant to evoke J.J. Cale. The product sounded like an apprentice singer-songwriter struggling through an open-mike night in a less reputable Nashville nightclub.

More attention, though, has stuck to Brice’s lyrics. The entire show ran over half an hour, but pundits have replayed the same fifteen seconds of Brice moaning the opening lines:

I just want to cut my grass, feed my dogs, wear my boots
Not turn the TV on, sit and watch the evening news
Be told if I tell my own daughter that little boys ain’t little girls
I’d be up the creek in hot water in this cancel-your-ass world.

Jon Stewart, that paragon of nonpartisan fairness, crowed that nobody’s stopping Brice from cutting his grass, feeding his dogs, or wearing his boots. Like that’s a winning stroke. Focusing on Brice’s banal laundry list misses the point: Brice actively aspires to be middle-class and nondescript. But he believes that knowing and caring about other people’s issues makes him oppressed in a diverse, complex world.

One recalls the ubiquitous 2012 cartoon which circulated on social media with its attribution and copyright information cropped off. A man with a military haircut and Marine Corps sleeve stripes repeatedly orders “Just coffee, black.” A spike-haired barista with a nose ring tries to upsell him several specialty coffees he doesn’t want. Of course, nobody’s ever really had this interaction, but many people think they have.

Both Lee Brice and the coffee cartoonist aspire to live in a consistent, low-friction world. If your understanding of the recent-ish past comes from mass media, you might imagine the world lacked conflict, besides the acceptable conflict of the Cold War. John Wayne movies, Leave It to Beaver, and mid-century paperback novels presented a morally concise and economically stable world, in which White protagonists could restore balance by swinging a fist.

The coffee cartoon, with its unreadable
signature (click to enlarge)

By contrast, Brice and the coffee cartoonist face the same existential terror: the world doesn’t center me anymore. Yes, I said “existential terror.” What Brice sings with maudlin angst, and the cartoon plays for yuks, is a fear-based response, struggling to understand one’s place in the world. We all face that terror when becoming adults, of course. But once upon a time, we Honkies had social roles written for us.

I’ve said this before, but it bears repeating: “bein’ country,” as Brice sang, today means being assiduously anti-modern. Country music’s founders, particularly the Carter Family and Jimmie Rodgers, were assiduously engaged with the current events of the Great Depression. This especially includes A.P. Carter, who couldn’t have written his greatest music without Esley Riddle, a disabled Black guitarist. Country’s origins were manifestly progressive.

But around 1964, when the Beatles overtook the pop charts, several former rockers with Southern roots found themselves artistically homeless. Johnny Cash, Conway Twitty, Jerry Lee Lewis, and others managed to reinvent themselves as country musicians by simply emphasizing their native twang. But their music shifted themes distinctly. Their lyrics looked backward to beatified sharecropper pasts, peacefully sanded of economic inequality and political friction.

In 2004, Tim McGraw released “Back When,” a similar (though less partisan) love song to the beatified past. McGraw longs for a time “back when a Coke was a Coke, and a screw was a screw.” I don’t know whether McGraw deliberately channeled Merle Haggard’s 1982 song “Are the Good Times Really Over,” in which he sang “I wish Coke was still cola, and a joint was a bad place to be.”

Haggard notably did something Brice and McGraw don’t: he slapped a date on the “good times.” He sang: “Back before Elvis, or the Beatles.” That is, before 1954, when Haggard turned 17 and saw Lefty Frizzell in concert. But Haggard, like McGraw or Brice, doesn’t really yearn for any specific time. He misses the stage of personal development when he didn’t have to make active choices or take responsibility for his actions.

Country musicians, especially men, love to cosplay adulthood, wearing tattered work shirts and pitching their singing voices down. Yet we see this theme developed across decades: virtue exists in the past, when life lacked diversity or conflict, and half-formed youths could nestle in social roles like a hammock. Lee Brice’s political statement, like generations before him, is to refuse to face grown-up reality.

Tuesday, February 3, 2026

The Doomed Promise of Change from Within

Bull Connor looses the dogs on protesters in Birmingham, Alabama, on May 3rd, 1963

“Why don’t you try applying for ICE and see if you can maybe change things from within?”

I’ve been out of work for several months now with precious few leads and no real opportunities pending. I can’t be the only one in this situation, as our national policy-makers keep inventing new ways to submarine domestic development and make every consumer good more expensive. But every unemployed person ultimately faces the problem alone, as bills accumulate and the daily reality becomes more bleak. I find myself becoming despondent.

Meanwhile, ICE—Immigration and Customs Enforcement—has recently received a massive cash transfusion from the Taco Administration, making it America’s largest law enforcement agency; it now tops the previous title-holder, Customs and Border Protection. To meet the Administration’s demand for more deportations, ICE is offering lavish sign-on bonuses and expedited training. It’s also notoriously handing firearms to loose cannons and dangerous people.

In this tumultuous context, my friend good-heartedly suggested I join ICE. Why not get that lush government bag, she said, while also standing against the rampant violence we’ve seen unfolding in Minnesota? I’ve read and heard similar stories for years. Applicants join the police, military, federal agencies, and other secure government jobs, full of idealism, eager to push reform peacefully, from within. These stories seldom end well.

We’ve probably all heard anecdotes. For serious sources, let’s consider Shane Bauer, author of American Prison. As a journalistic project, Bauer took a job as a corrections officer at a private prison in Louisiana. In his telling, Bauer started off idealistic, eager to discover how prisons change prisoners while making a profit. He left the project, though, when his girlfriend reported his private communications becoming increasingly bitter, vindictive, and violent.

Matt Taibbi describes something similar while writing about Eric Garner. He quotes a patrolman who joined the NYPD hoping to challenge the department’s bureaucratic cruelty. But subject to constant micromanagement and quotas, he found that he hadn’t changed the department; it had changed him. Besides this, agencies notoriously find inventive ways to enforce conformity, resist scrutiny, and punish reformers and whistleblowers.

Citizens protest the continued ICE presence in Minnesota, January 2026

We could discuss why this happens. Maybe power corrupts, but as Brian Klaas demonstrates, power also attracts those most willing to be corrupted. People who become cops and corrections officers often already have a vindictive streak; power simply gives them official vestments. Beyond law enforcement, we’ve probably all seen laborers become managers, students become teachers, and renters become landlords, adopting the worst aspects of their new positions.

Colleagues of good standing could, hypothetically, stop this. But we know they don’t. Six ICE officers dog-piled on Alex Pretti before one finally shot him; three officers surrounded Derek Chauvin as he knelt on George Floyd, not stopping Chauvin, but forming a human barricade to keep civilians back. Maybe the officers high-minded enough to stop the violence had already quit these agencies; more likely, the participants conformed themselves to the existing structure.

These patterns aren’t unique to law enforcement, though police ubiquity makes the problem more visible. When institutional rot infiltrates a subculture, purging that rot is rare. We’ve seen private corporations, college fraternities, and other civilian organizations succumb, even though they aren’t protected by qualified immunity. Put simply, those who have power, even limited power within a specific institution, become enamored of it, and perform heinous acts to protect it.

Nor do these effects stop at the workplace door. Shane Bauer recalls needing months to recover from the aggression he learned as a corrections officer, rewriting his book several times to purge the anger. Eyal Press describes how corrections officers, drone operators, and even meat-packing workers experience wartime levels of flashbacks and nightmares. Social psychologist Rachel MacNair calls this phenomenon perpetration-induced traumatic stress.

I’d argue that the violence we’re now seeing enacted in Minnesota is a more extreme version of violence we’ve all seen before. From schoolyard fistfights and fraternity hazing, to union busting and workplace interrogation, to police violence at anti-police-violence protests, it’s all the same. In a structurally unequal society, those who benefit can maintain their standing only through force, or threats of force. Wealth, power, and status are therefore innately violent.

Therefore, changing things from within isn’t possible. Though some individuals make some progress, and the occasional abuser may get purged and prosecuted, lone idealists generally can’t fix broken systems. Once institutional rot becomes widespread enough that the institution must close ranks to protect itself, there’s little chance of “reform.” The system will warp and destroy those who learn its ways, no matter how idealistically they begin.

Friday, January 30, 2026

Toxic Work Ethic in America, Part Three

This essay follows from Toxic Work Ethic in America, Part One and Toxic Work Ethic in America, Part Two.

We humans intensely comprehend our own limitations, fears, and psychological twinges, because we’re all passengers inside our own heads. We can never truly understand other people’s mental states from outside. In my last two essays, I made sweeping generalizations about working-class and upper-class mindsets, but I’m no scientist. I simply constructed a tentative hypothesis from personal experience, conversations, and observations of public figures.

To recap, I suggested that most people have conditioned inner narratives driving their workplace habits, and “work ethic” is the benign manifestation of malignant inner trauma. I attribute this trauma to fathers, perhaps because my sources, both personal and public, are men. Maybe women learn more workplace habits from mothers; leave an informed comment. Either way, our “work ethic” is an external tool to paper over inner damage.

But this carries deeper implications: if I’m right, then work ethic, and workplace habits generally, orient toward the past. We appease the voices which exposed our inadequacies as children, constantly trying to silence condemnations that, as adults, exist only in our own heads. Addiction treatment specialist Gabor Maté says something similar about substance users: they mostly want to assuage pain their own brains keep inflicting on them.

(As an aside, many friends have warm, supportive relationships with their fathers. I largely did, too, before his memory started failing. I don’t disparage fathers, but observe how they sustain patterns which they themselves don’t realize have caused harm.)

Contra this past orientation, most spiritual traditions favor a mindful orientation toward the present. Buddhist meditation, Christian centering prayer, and Taoist wandering all encourage supplicants to exist in the present, attuned to each moment, listening for the universe’s subtle call. The workplace of capitalist accrual reminds us of past voices and future rewards. But spiritual practice calls us to exist here, now, as we are.

Bringing spirituality into a workplace ethics discussion is, I realize, risky. Many True Believers insist their spiritual tradition is uniquely true, which could split my audience. Yet bear with me. For all their manifold differences, the religions I’ve studied share a core proposition that the person before us, the community around us, or the conflict buffeting us, holds primacy in our spirituality. Here. Now.

Overcoming the trauma inherent in our “work ethic” means attuning ourselves to the present. It means listening to instructions, not in fear of punishment or longing for reward, but as they are. It means recognizing our bosses as humans, with the foibles and needs that entails, and not as manifestations of ingrained father images. It requires being attuned enough to our own bodies and limitations to say, without malice or fear, “No.”

We humans find ourselves torn between our carnal condition, driven substantially by past traumas and future needs, and our spiritual nature, which faces the present. What’s worse, our spiritual leaders, themselves facing the same tension, encourage this divide. When a millennia-old textbook becomes more important than the immediate person, conflict, or community, then spiritual leaders sink to the level of employers and politicians.

Moreover, the worldly forces which profit from our “work ethic” trauma already know this. That’s why they barge into our spiritual domains. Billionaires and politicians have transformed Christianity into a nationalist front, reduced “self-care” to retail therapy, and taught us to see mindfulness as a professional strategy. Developing a spiritual discipline will entail purging these anti-spiritual influences from your tradition.

The spiritual equanimity I describe has no single path. Though I mentioned prayer and meditation above, I’ve found those disciplines of limited personal value. But I’ve achieved something comparable by writing poetry: listening to each moment, and selecting the single most appropriate word, has helped me attune myself to the present. Whatever removes you from past traumas and future mirages may be your path toward spiritual balance.

This conclusion probably feels abstruse, distant from my starting premise. Yet I believe it holds together. Whether it’s my father chastising me for slowing down, or Errol Musk chastising Elon for not collecting enough accomplishment tokens, that condemning voice comes from the past. The past thus can’t save us, nor the future, which doesn’t yet exist. Only in the present, the spiritual center, can we escape that conditioning.

Elon Musk and I learned incompatible messages from our fathers, messages which produce wholly divergent outcomes. Yet the harm those messages continue to produce has made us both smaller, spiritually less developed beings. And we could both escape by reorienting ourselves away from those messages. But that means we must stop seeing ourselves as economic actors, and redefine ourselves as human.

Thursday, January 29, 2026

Toxic Work Ethic in America, Part Two

This essay follows from Toxic Work Ethic in America, Part One.
Elon Musk

Elon Musk, currently likely to become America’s first trillionaire, has a conflicted history with his father, South African entrepreneur Errol Musk. Elon denies, for instance, that Errol ever part-owned an emerald mine, but Errol calls that denial pure mythology. Even if Errol didn’t bankroll Elon’s earliest ventures, his wealth allowed Elon the freedom to pursue an education, experiment with technology, and start several businesses in his early twenties.

If, as I said previously, people arrogant enough to become billionaires and presidents aren’t conditioned in childhood to be self-effacing, that doesn’t mean they’re unconditioned. And like me, their conditioning comes heavily from fathers. My father conditioned me, mainly by yelling, to maintain a self-destructive work ethic, pushing myself to the brink of collapse, then returning home too depleted to do housework. Elon’s father conditioned him to… well.

Like Elon, Errol was a serial entrepreneur, who also used his wealth to buy out enterprises that piqued his interest. Like Elon, Errol married a glamorous, accomplished wife, but seemingly paid her little attention, letting Maye Musk pursue her interests without support or awareness. Like Elon, Errol is sexually voracious: Elon has fourteen children by four women that we know about, while Errol had a child with his own stepdaughter.

Where my father taught me to deny myself and disappear entirely into my role as an employee, student, battalion member, or whatever, Errol Musk taught Elon to elevate himself, and his desires, over other people. Errol conditioned his son to be constantly self-seeking, always aware of ways he falls short or looks small. My father conditioned me to be self-abnegating, while Errol conditioned Elon to be self-centered.

I don’t know Elon’s full story, partly because Elon often contradicts himself regarding his biography. So I’ll draw an analogy. Joe Plumeri, former CEO of the Willis Group, opens his memoir by describing his father showing him the luxurious houses around his New Jersey hometown. When Plumeri later describes himself as a “workaholic” who loves showing his father his accomplishments, it becomes clear: he has spent his life appeasing his father.

Sigmund Freud

One could extend the comparison. Consider the American Presidents and presidential candidates who considered the Presidency their birthright: John Quincy Adams, Benjamin Harrison, George W. Bush. George H.W. Bush and Al Gore were both sons of senators. The Kennedy family exists. John McCain, whose father and grandfather were both four-star admirals, had his military career stall because of his POW status; he ran for President partly to outrank his ancestors.

Developmental psychologists describe human behavior as “highly conditioned.” In plain English, this means our past circumstances shape our present options. We cannot make a completely original decision; rather, we see our opportunities defined by our life experiences. The conditioning agents that shape this vision fall largely into two broad categories: standards we want to live up to, and mistakes we want to live down.

Again, for many of us, fathers (or father figures) shape our perceptions. My father taught me to see myself as part of a unit: whether a workplace, a classroom, or a military battalion, I needed to diminish myself for the whole. If I took an unscheduled break, yawned loudly, or even slowed down notably, my father volubly reminded me that I wasn’t just shirking my individual duties. I was letting the entire group down.

Meanwhile, billionaire fathers teach their sons to seek themselves. Sometimes this self-seeking is a doom spiral, as when Cornelius Vanderbilt failed to teach his sons business acumen and the Vanderbilt fortune eventually evaporated. Other times, this self-seeking accrues wealth and power. We can see this in how billionaires treat others: Elon’s multiple divorces and President Taco’s Epstein Island adventures show they see women as consumable resources, not people.

My military analogy recurs. Rank-and-file soldiers internalize an ethos of self-sacrifice, and learn to see heroic death as the ultimate virtue. And I do mean “learn”: cult expert Daniella Mestyanek Young writes that basic training doesn’t teach military skills; it teaches self-abnegation and the primacy of the unit. As a collective, the military survives by teaching its members that their individual lives aren’t worth saving.

Elon Musk claims to work 100 hours per week. This feels specious, since he also claims to be a world-class competitive videogame player, while recently tweeting nearly 100 times per day. But even if it’s true, Musk doesn’t work those soul-breaking hours because he’s disappeared into his job. Instead, he’s made his companies instruments of his ego, tools to inflate himself that never leave him full.

Thursday, January 22, 2026

Toxic Work Ethic in America, Part One

I sometimes literally hear my father’s voice when he isn’t there. Not in a jolly metaphorical way, either, but in a terrifying, often humiliating way. I first noticed this when I worked at a medical components factory during my hiatus between college and graduate school. If I slowed down, slacked off, or simply paused to chat with my co-workers, I heard my father shouting angrily, demanding to know why I wasn’t working until I bled.

Because of this terrifying voice which chased me throughout the workday, I moved faster, took shorter breaks, and got more done than people who had worked there far longer than me. Supervisors took notice, too. They often praised my work ethic, telling me that they wished they had an entire shift full of laborers as “dedicated” as me. Because they weren’t passengers in my head, as I was, they took my terror as committed professionalism.

Management often mistakes being “busy” for productivity. I noticed this often while working in construction: management would schedule marathon hours, especially in the final crunch. But management only deluded themselves. Fatigue, boredom, and resentment created new problems, while workers spent most of every morning ripping out the mistakes they’d made the previous evening, when they were tired. Team supervisors micromanaged workers’ every decision, because site superintendents micromanaged the supervisors. Everyone was tired all the time.

Literally every blue-collar job I’ve worked has had some version of this. If food service workers find themselves caught up with tables, they’re given cleaning tasks or set to refilling table caddies. I’ve worked in two car parts factories, where we were ordered to sweep and clean if the machines went down even briefly. Every moment is policed, every action judged, and companies demand constant maximum productivity; unscheduled pauses are justification for reprimands, often stern.

Meanwhile, I’ve worked only two white-collar jobs, as a freshman composition teacher and a marketing copywriter. In both positions, I’ve been astounded by how much scheduled work time gets consumed by non-work activities. Chatting, dithering, side projects, day drinking, and even napping are anecdotally common. While hourly wage earners have their hours aggressively monitored for unsanctioned yawns, resulting in paranoid, often manic work, managers have so much discretion that they want for things to do.

My father spent most of his military career as a rank-and-file enlisted man. If you’ve ever spent time on a military installation, you know how aggressively the enlisted men’s time is regulated. Every barracks, parade ground, warship, and hangar is the epitome of cleanliness, with every plank sanded, hinge oiled, bolt painted, and floor scrubbed. Especially for unmarried recruits living on-post, twelve-hour workdays of constant, regulated motion are common, and labor outputs are closely quantified.

Simultaneously, a peer whose father was a career officer told me that officers cultivate the attitude of men of leisure. (We met in school off-post, because even officers’ and enlisted men’s children are discouraged from mingling.) Not that officers don’t work, because they too have pervasive regulations and readiness standards. Rather, they achieve their dictated goals at measured, deliberate speeds. Humans with autonomy, not checklists and rubrics, measure officers’ outputs. Rules are discretionary, not absolute.

This pattern applies broadly. Matt Taibbi wrote (before becoming a culture-war spokesmodel) that every SNAP benefit applicant gets treated like incipient fraud, while almost nobody was held responsible for the 2008 financial collapse. Since then, we’ve seen how those indicted for the January 6th, 2021, Capitol insurrection were mostly foot soldiers at the door, and they got pardoned. Those who incited the crime not only went unpunished, they got reelected.

America cultivates a socioeconomic narrative in which the poor, the laborer, the voter—the enlisted men of civilian society—hear their inadequacies repeated endlessly. Nor is this accidental. The wealthy and powerful—our officers—want us to suffer that constant loop of condemnation for even momentary weakness, as I heard, and sometimes still hear, my father. The psychological harm this repetition causes individuals doesn’t matter, because to our “officers,” the outcome is “work ethic.”

But this conditioning also entrenches unbridgeable gaps in the American social structure. This is why laborers seldom become management, classroom teachers rarely become administrators, and most citizens have little chance of getting elected to higher office. We numpties cannot lead because we’ve been conditioned to rehearse our inadequacies, real or imagined, constantly. Only those without that conditioning have the arrogance necessary to become presidents, billionaires, and other captains of society. “Work ethic” is the opposite of advancement.

This essay continues in Toxic Work Ethic in America, Part Two