Wednesday, May 13, 2026

Obsolete Men and Vanishing Adulthood

This essay is a follow-up to Obsolete Men and the Gendered Violence Epidemic and Obsolete Men vs. Shrinking Women
Braden “Clavicular” Peters

I don’t like giving Braden “Clavicular” Peters free oxygen, largely because his philosophy is so dangerous that I fear it becoming airborne. His belief that life belongs to those who are good-looking enough is maybe not controversial, as we merely average-looking men can attest. But his desire to manipulate his physiognomy to become as absurdly handsome as possible involves a regimen of intensive self-harm.

It was bad enough with men like Andrew Tate, whose abusive workout regimen has distorted his body as badly as his soul. Any psychologist can tell you that obsessive bodybuilding, to the point where your body becomes cartoonish, emerges from the same well of self-hatred that manifests in women as anorexia nervosa. Tate is loud and charismatic enough to make his insecurities everyone else’s problem, at great mutual cost.

Peters, though, doesn’t just distort himself. Bodybuilding is only part of a larger regimen, which includes injecting dangerous drugs, wolfing down questionable supplements, and self-flagellation. He became the public face of “looksmaxxing” in recent months as the most grotesque part of his regimen—self-administered facial beatings with a hammer—went viral. He believes that treating himself with violence will make him more conventionally handsome.

And he isn’t entirely wrong. Recent photos show Peters looking like an exaggerated form of a mid-20th Century matinee idol, with big shoulders, great hair, and a well-defined chin. Of course, as I write, Peters has recently turned twenty, so whether his good looks reflect his abusive regimen, or simply his graduation from awkward adolescence, is subject to debate. What we can’t debate, though, is that this man looks thirty.

Telling a twenty-year-old woman that she looks thirty would probably get you smacked. In American culture, female physical beauty correlates with outward markers of fertility, which means youth. Women use ointments, tinctures, injections, and surgery to stave off the appearance of age, though the results are questionable. Lauren Sánchez Bezos’ recent appearance at the Met Gala resulted in laughter at her augmented appearance and questionable wardrobe.

Lauren Sánchez Bezos

But for men, looking older is desirable. In interviews, Peters describes injecting himself with steroids at age fourteen to achieve the shredded look that normally requires years of dedication and effort. His famous, highly defined jaw does indeed come to most men through years of small-scale trauma, sports injuries, and dangerous work. Like millions of adolescents, Peters wants to skip the dues-paying stage and be recognized as an adult.

Who can blame him? As entry-level professional jobs dwindle, men keep jobs into their twenties and thirties that formerly belonged to teenagers. Countless adults, of both biological sexes, cannot afford to move out of their parents’ houses. Student debt, once a ten-year commitment, has become a lifelong burden. The average age of first-home purchase is now forty. In such an environment, paying one’s dues in linear time is downright foolish.

In such an environment, Peters doesn’t want to merely look good. Placed in his social context, Peters wants to speed-run adulthood, or anyway the one aspect of adulthood which he can control. Savvy media manipulators can fake the personal characteristics that make older men attractive to women, including emotional regulation and economic stability. But only those willing to treat themselves violently can look old enough to enter the market.

However, let’s continue looking at that same broader context. The ways that men used to hasten rugged good looks, like playing sports or doing difficult physical labor, are all communitarian. There’s no such thing as solitaire football, and building a house requires a team. The ways men formerly organized themselves into communities, including labor unions, religious congregations, and even bowling leagues, look increasingly quaint, if they even still exist.

Peters has to speed-run adulthood alone because, otherwise, he has nowhere to go. Modern life has become mostly solitary and, unless you’re born to money, the chance of getting ahead through hard work and ingenuity alone is virtually nil. Peters has made himself a mass-media grotesque, but in doing so, he’s captured our attention, the one meaningful resource for cash-poor boys hoping to make themselves a life in American modernity.

Our solution must involve getting outside our own homes, and outside our own heads. Easier said than done. But even as churches and unions seem irrelevant, many communities still have adult sports leagues, maker spaces, and public libraries. Individuals and small groups can organize new networks, like community choirs, improv companies, and charitable volunteer organizations.

We must seek the trappings of adulthood, once hoarded in the workplace, out in the larger community.

Monday, May 11, 2026

Creating a Marketplace for Reliable Guessing

Philip E. Tetlock, Expert Political Judgment: How Good Is It? How Can We Know?

Why are professional prognosticators so consistently bad at predicting the future? We know this phenomenon most clearly from basic cable news, where credentialed experts prognosticate about how good, bad, or volatile the near future will be, usually in ways that support their ideology. But it manifests in other environments: business professionals who fail to forecast economic trends. Legislators who let lush opportunities slip away. Inventors pushing questionable technology.

Canadian-American psychologist Philip Tetlock, currently at the University of Pennsylvania, asked himself this question immediately after Operation Iraqi Freedom. Mass-media oracles predicted either swift, easy victory, or else nigh-apocalypse. The reality reflected neither partisan extreme, but instead descended into the same quotidian brutality that has characterized American intervention since WWII. Why, Tetlock asks, were both sides so wrong, and why has nobody paid for their overconfidence?

He favors the metaphor of foxes and hedgehogs, which he pinches from Isaiah Berlin, though it’s far older. Foxes know many things, Tetlock writes; hedgehogs know one big thing. Mass-media operators love highly credentialed experts, especially on economics and world affairs. But those experts’ predictions are often only marginally better than those of committed dilettantes who read newspapers daily and remain informed. Further, the more advanced one’s credentials, the more marginal the gains.

So far, so good. Tetlock’s description essentially accords with our recent experiences of camera-friendly experts reliably whiffing their predictions. My problem arises when Tetlock transitions from describing to explaining. A consummate scholar, Tetlock is reluctant to say anything which he cannot support strictly from quantifiable evidence. And holy moly, does Tetlock have extensive and thoroughly documented evidence to deploy.

Let’s make something clear: despite his praise (often muted) for well-informed dilettantes, he writes for scholarly audiences motivated by deep research. He fortifies his prose with histograms, p-values, and confidence intervals. He spends several column inches breaking down the mathematical modeling which supports his conclusions, and he seldom goes beyond the evidence. He dedicates an entire chapter to anticipating and transcribing his critics’ likely counterarguments.

Philip E. Tetlock, Ph.D.

Tetlock briefly acknowledges, but doesn’t expand much upon, the reality of who receives attention. TV pundits, hero CEOs, civil rights activists, and tech bros all broadly favor certainty, volume, and swagger. Reliable predictors, working from diversified backgrounds and intellectual caution, can look timid on Sunday talk shows or corporate board meetings. Put another way, saying wrong things confidently looks more telegenic than trading in likelihoods, conditionals, and caution.

Unfortunately, Tetlock himself demonstrates this. He refuses to offer opinions without sourced evidence, and he refuses to offer evidence without lengthy discursions on mathematical variance. Because his status relies on measurable outcomes—what he terms “reputational bets”—he refuses to place everything on one spin of the roulette wheel. The product he thus creates is more likely to be accurate, but less compelling in a media-saturated “attention economy.”

He also omits something I consider vitally important. The principle of homophily means we’re more likely to spend time around people like ourselves. Scholars congregate with other scholars; journalists chill with media professionals; lawmakers drink with lawyers. We see this particularly in economics, the scholarly field least likely to cite sources from other disciplines: our environment discourages seeking differing influences, disconfirming evidence, or even a diverse friend network.

Invested dilettantes make reliable predictions, perhaps, because they see how hypothetical outcomes postulated in scholarly journals actually unfold in daily life. Unfortunately, to calculate his confidence intervals on reliable predictions, Tetlock generates a core sample of prognosticators who are, like himself, flush with academic credentials. If military historians predict one outcome for war, and generals predict another, maybe consult the enlisted men carrying weapons, not more historians and generals.

Rereading what I’ve written, I feel I’ve misrepresented Tetlock’s product. I like his thesis, that intellectual diversity trumps depth in creating reliable forecasts. Later chapters on public accountability are particularly promising, if underwritten. Especially in subsequent years (Tetlock’s first edition appeared in 2005), we’ve seen public experts become increasingly hostile to criticism or disconfirming sources. Doubt has become, not the precursor to better thinking, but a sign of disloyalty. Unsurprisingly, experts have become more likely to be wrong.

Considering my doubts, and new evidence since 2005, we could perhaps read this volume as a prolegomenon to further research. Tetlock himself co-wrote a subsequent volume, which I’ve already purposed to read. But I feel it actually serves Tetlock’s thesis to suggest that future research should come from an interdisciplinary source, perhaps a public-private partnership. The future of the forecasting business is too valuable to entrust only to other forecasters.

Thursday, April 30, 2026

Obsolete Men vs. Shrinking Women

This essay is a follow-up to my previous essay Obsolete Men and the Gendered Violence Epidemic
Sabrina Carpenter

If, as I stated previously, modern masculinity means rejecting anything feminine in the self, what then is modern femininity? This matters as women’s appearances have come in for renewed criticism. Louis Theroux, in Inside the Manosphere, followed several men whose male reinforcement routines involve obsessive exercise and bodybuilding, making their bodies huge and brawny. Simultaneously, we’ve witnessed the recent rise in demand for feminine smallness.

Smarter critics than me have commented upon the rise in Ozempic bodies. Celebrity women who once tied their public personas to their larger frames, like Amy Schumer and Adele, recently lost weight at a pace essentially impossible without synthetic pharmaceuticals. Large women gaining small bodies requires expensive drugs and full-time exercise routines, so it’s obviously impractical for most women. Yet it’s widespread enough recently to appear normative.

Some time ago, I read an essay—now lost—critiquing gender roles in “romantasy” fiction. The author noted a recurrent theme she called “size gaps,” presented as almost equally reprehensible as age gaps. Many romantasy novels feature hulking, muscular leading men, basically walking slabs of uncooked steak. Their leading women are dainty flowers, maybe skilled swordswomen, but usually small enough to ride piggyback on their lovers’ shoulders.

This underlines part of my problem: we define gender in our society oppositionally. Men are large, tall, and muscular; women are small, slight, and shouldn’t have muscles. Standards of feminine beauty have changed little since 1925, when Lewis & Young praised a woman for being “five foot two, eyes of blue.” A century later, feminine beauty icon Sabrina Carpenter (five foot zero) omitted trousers from a custom Louis Vuitton suit, specifically because of her height.

Not to criticize Carpenter personally; she had no more control over her height than I had over becoming exceptionally tall. But her social icon role, beautiful and sexy but not necessarily for men, includes her body type: short, buxom, with large facial features. We see similar behavior from other women who sway cultural standards. Lizzo, Melissa McCarthy, and Kathy Bates, all celebrated large women, recently lost weight with GLP-1 drugs.

Criticizing Hollywood bodies is nothing new. Women on camera have long been expected to maintain teenage proportions well into adulthood, a standard only possible for those who can afford personal assistants and pricy gym regimens. But the recent rise in waif-like women, coupled with the concomitant visibility of ox-like men, reflects a brutal reality: men must not look like women. And women must not look like men.

Braden Peters

Such attitudes have very real consequences. One of the best actors I ever shared the stage with, a beautiful woman with dynamic range and powerful singing chops, also stood over six feet tall. This made her too tall to gaze up soulfully into most men’s eyes, which precluded her from substantial roles. She bounced through some insulting comic relief roles that reduced her to a function of her unusual height, before leaving the industry altogether.

Switching genders, I see the culmination of this trend in the male “looksmaxxing” influencer Braden Peters, stage name Clavicular. Peters put himself through grotesque paces to achieve his appearance goals: drug injections, day-long bodybuilding runs, even beating himself with a hammer to maximize his jawline. The regimen has arguably worked, because he looks like an exaggerated caricature of a midcentury Hollywood leading man.

But he’s achieved his goals at a catastrophic price. By his own admission, Peters began injecting himself with synthetic testosterone supplements at age 14 to hasten adult characteristics. Within five years, he’d consumed so much fake testosterone that his body stopped producing the natural stuff. In essence, Clavicular chemically castrated himself. He maintains the external appearance of a sexy man, but his (ahem) primary sexual characteristic no longer works.

Sabrina Carpenter and Braden Peters are opposite sides of the same coin, and I do mean opposite. If women are shorter than men, then Carpenter’s exceptionally slight height becomes aspirational. If men are harder-featured than women, then Peters literally beating his face with a hammer to maximize his jawline becomes an acceptable price. Influencers define both binary genders by looking as little as possible like the other.

I’m using external appearance, like height and build, as metaphors here. The problem is much more pervasive. As American society generally becomes more accepting of alternate gender presentations, our cultural gatekeepers have become even more rigid and restrictive. This is still currently a fringe issue, mercifully. But then, so was the alt-right, until it conquered the government. Left untended, these positions risk becoming mainstream.

Saturday, April 25, 2026

How To Build and Destroy an Empire

1001 Books To Read Before Your Kindle Battery Dies, Part 122
Adam Hochschild, King Leopold's Ghost: A Story of Greed, Terror, and Heroism in Colonial Africa

Belgium’s King Leopold II aspired to make his country one of Europe’s great powers, although Belgium, a loose regional federation, hadn’t existed before 1830. Becoming a colonial empire, on par with Britain and France, offered Leopold a quick, cost-effective path to greatness. It certainly helped that he didn’t care whom he hurt, and saw the native peoples of colonized lands as a nuisance to be managed. So he set his eyes on the Congo.

Journalist Adam Hochschild previously covered South Africa’s waning apartheid government, a beat that put him in contact with CIA and MI6 officials and their off-the-record stories. One such story involved the CIA’s multiple slapstick efforts to overthrow Patrice Lumumba, the only democratically elected leader of the post-colonial Congo. Further investigation led Hochschild to a colonial history that, in the 1990s, was largely forgotten in Europe and America.

Using primary source documents, including eyewitness testimony and elaborate government and business records, Hochschild reconstructs Leopold’s process. Europeans in the 19th Century desperately wanted to see themselves as heroes. They bankrolled adventurers like Henry Morton Stanley, whose wanderings in Africa’s interior were polished to conceal his actual violent tendencies. Europeans also moralistically raged against Arab slave trading, despite having barely ended the Triangle Trade themselves.

Leopold managed his age’s three great influences—moralism, adventurism, and industrialism—to build support for Belgian intervention in Africa. Except not in Belgium, which cared more about building a reliable domestic state. So Leopold sold bonds overseas, got lucrative British and Prussian loans, and mortgaged royal properties to subsidize his plans. They paid off, too, as Stanley inked treaties with Congolese nations that gave Leopold massive territorial control.

Territory that, incidentally, he never visited.

But, burdened with debt obligations and international prestige, Leopold quickly needed to show profits. He hired agents who cared little for rules, armed them with newfangled carbine rifles, and set quotas. This turned out to be an excellent formula for lucrative export markets, provided nobody cared about the human cost to native peoples. Several state agents made a mint, while Leopold became fabulously rich. Natives fled the bloodshed.

Adam Hochschild

Then as now, money and property became their own justifications. Agents of the state corporation didn’t care whom they hurt, provided they got paid. Those forced to do the actual work never saw the rewards, and indeed were punished severely for even minor noncompliance; casual maiming was common, and company soldiers destroyed entire villages when quotas weren’t met. Africans lived as slaves in their ancestral homeland.

Not everything in Hochschild’s telling is bleak, though. As Leopold’s hybrid of military, government, and capitalism grew to unprecedented power and violence, others began resisting. While many state agents reveled in violence, others were sickened, and carried their stories back to Europe. One such disillusioned state agent was Joseph Conrad, whose novella Heart of Darkness continues telling the resistant story long after Leopold’s colony ended.

Two other resisters were E.D. Morel and Roger Casement. A journalist and a state bureaucrat respectively, they carried news of Leopold’s brutal government to the very countries that owned Congolese bonds and debt instruments. Leopold attempted a PR campaign in Europe and America to assure Whites everywhere of Belgium’s moral valor. But Morel and Casement shone lights on how Leopold’s administration governed, and who got rich off African labor. World sentiment finally turned.

Hochschild writes history without moral sentiment. Those who resisted Leopold’s imperial experiment often had their own racist lenses, and sometimes preserved power as much as they resisted it. While Leopold’s Congo may have been exceptionally violent, Morel and Casement overlooked British and French abuses in adjacent colonies. And Conrad, though conscious of the damage empire caused, never had courage enough to abandon his privileges.

Of all problems in writing this history, though, Hochschild acknowledges the greatest himself: Africans left few primary sources. Even oral history wasn’t coordinated until the survivors of Leopold’s terror were aged and vanishing. Congo historiography winds up being a heavily European narrative; Africans become somebody Europeans speak with, or speak for, not autonomous individuals who speak for themselves. History is as much a matter of what’s missing as what’s known.

Despite these yawning gaps, Hochschild’s history is thorough and enlightening. It’s also timely. When it appeared in 1998, it was revolutionary and even dangerous, but Hochschild’s broad themes have become intrinsic to the modern narrative of resistance. Because, although company agents aren’t massacring villages or cutting off hands, the underlying parallels are way too visible. History is never about the dead; it’s about we who live in the aftermath.

Wednesday, April 22, 2026

Obsolete Men and the Gendered Violence Epidemic

CONTENT WARNING: this essay contains direct discussions of sexual and gendered violence. I've tried to remain dispassionate and considerate of readers’ sensibilities, but the subject remains what it is.

Nobody asked my opinion about the “global rape academy” story that exploded on social media last week, nearly a month after CNN first reported it. As a book blogger with a negligible audience and few respondents, my interpretation doesn’t matter. Certainly nothing I say will ameliorate the repellent content and persistent harm this “academy” has perpetuated. I’ve debated whether my contribution would do more harm than good.

But in one of those flukes of synchronicity, this story overlapped with several others. The story gained traction as I finished reading Gareth Russell’s The Six Loves of James I. Russell writes that, nearing the end of his life, King James strenuously avoided entangling England and Scotland in Europe’s wars of religion. For this, British political and religious leaders disparaged James as “feminine,” and therefore unworthy of power.

Millions of viewers watched Louis Theroux’s Netflix documentary Inside the Manosphere, which dropped almost simultaneously with CNN’s report. Theroux interviews representatives of a highly public form of masculinity, which rewards displays of strength and valor, while actively disparaging women. Theroux’s interviewees call their girlfriends “the dishwasher” and discuss monogamy for thee but not me, demonstrating the inferior position they reserve for women.

Anti-estrogen pills for men have begun invading my all-night doomscrolling sessions. Minimally regulated supplements, sold by mostly anonymous vendors, promise to help aging men eliminate man-boobs and soft guts, while turning them into sexual powerhouses guaranteed to please their partners. These ads’ innate subtext includes that any implication of femininity, including softness or having boobs, undercuts one’s status as a man and a husband.

All three influences share the supposition that femininity is necessarily inferior. Any man showing feminine signs is perforce disqualified from being a king, a husband, or even a man; men must purge femininity through war, domination, or chemical self-mutilation. Men must hurt or kill anything feminine within themselves, not only internally, but in highly public displays of masculine reinforcement. Anything less diminishes a man.

Should we wonder, in such conditions, that some men—and indeed, some women—consider the feminine necessarily deficient, no matter who displays it? Womanhood becomes, not another manifestation of human potential, but an enemy to control and restrict. Men raping their wives, or men intruding themselves into women’s personal space in public to demand sexual favors, aren’t merely criminals or assholes. They’re defending their dwindling male prerogative.

This form of masculinity often, but not necessarily, correlates with political conservatism. Right wingers like Paul Joseph Watson, who popularized the epithet soyboy, and Alex Jones, whose rage at progressives often becomes so pitched that he screams wordlessly into the microphone, perform notorious displays of machismo. Jones’ shirtless horse rides, a naked mimicry of Vladimir Putin, are pointedly anti-American in nature.

Nor are these displays unique to men; because conservatives consider anger a prerequisite to seriousness, conservative women adopt public displays of macho anger. Tomi Lahren notoriously starts her broadcasts already spitting rage. Candace Owens loves getting belligerent, often cussing into the microphone to prove her legitimacy. If violence and war are necessarily masculine, and therefore strong, these women will remake themselves as masculine as possible.

Most men lack the social reach of Alex Jones, manosphere influencers, or President Taco. They can’t hurt women as a class. They’re reduced to hurting women as individuals—which means the women to whom they have the readiest access. Reducing gendered violence to a sexual fetish also allows them to commodify their violence; CNN reports that several “academy” participants sold one another unregulated sedatives online.

Woman hatred as entrepreneurship. Yuck.

These men drugging and raping their wives are functionally equal to ICE agent Jonathan Ross, who videoed himself shooting a mother in the face, then calling her a “fucking bitch.” Whether it’s murdering mommies in the street, belittling them on podcasts, or turning them into lifeless sex dolls in their own bedrooms, these men all treat women equally. Femininity deserves to be hurt, both in myself and in the world.

I take comfort that these displays are rare. As Snopes reports, the 62 million users number, popularized after CNN’s report, describes the entire hosting website, a porn outlet owned by a New York smut entrepreneur. The “academy” itself had barely a thousand active users. Even this, though, isn’t wholly comforting, as the site’s content is entirely user-generated, and therefore almost certainly contains other illegal content.

I wish I had uplifting, humanitarian solutions. Sadly, we’ve reached this position behind a raft of causes: male economic obsolescence, rapidly changing gender roles, diminishment of hierarchy and violence as mandatory social organizing tools, and more. We neglected the causes until they became a crisis. History warns that the situation is unlikely to reverse itself now, unless something brutal upsets the apple cart.

Until then, all of us, male and female, will continue paying the revolting price.

Tuesday, April 21, 2026

The Other King James Version

Gareth Russell, The Six Loves of James I

James Stewart (Stuart) was crowned King of Scotland, probably illegally, at only ten months old, inheriting from Mary, the mother he barely met. He was raised under spartan conditions, with the expectation that he’d eventually command in battle against the English, the French, or his own barons. But by disposition, he was better suited to scholarship, theology, and poetry. Desperate and lonely, he sought companionship wherever he could find it.

Anglo-Irish author and historian Gareth Russell has written multiple biographies of European monarchs’ private lives. Russell admits he chose the “six loves” angle as a deliberate parallel to Henry VIII’s six wives, but pinning down exact numbers is difficult. He has to reconstruct James’ private life through diaries, letters, and other contemporary documents, many written in coded language. Because, unlike Henry’s, James’ multiple dalliances were with men.

Russell describes James’ strict, cloistered education, under Presbyterian clerics who despised anything even slightly feminine. They taught him to distrust his mother’s legacy, women’s advice, and anything nurturing or fair within himself. (Poetry, back then, was highly manful, and James became an accomplished lyricist.) They also mostly denied him any friends his own age. His childhood seems bleak and lonely; no wonder he rebelled when he came of age.

Even before reaching adulthood, James began the royal prerogative of keeping “favorites.” These were all men; he showed little interest in women, either erotic or platonic. Rumors of James’ relationships ran rampant, though his contemporaries described them only obliquely, as addressing them directly would’ve been unseemly. Courtiers described James’ male favorites as “minions,” a word that had judgmental connotations that modern English has largely lost.

Understanding James’ relationships in modern terms is difficult. In early modern Scotland and England, sexuality was an action, not a state of being; the word “homosexual” wouldn’t be coined for centuries. Russell includes a detailed appendix on the processes of translating Jacobean-era social descriptions into modern English, because like race, sexual identity is a social construct, one which James’ contemporaries, like ours, often deployed maliciously.

In this book, Russell attempts to write a strict biography of James’ private life, not a history of his political reign. This proves difficult. Back when monarchs had actual political power, it was often difficult to separate kings’ private and public lives, especially when a king like James plied his favorites with the one gift a chronically cash-strapped monarch could give: aristocratic titles. Private life and public power were inextricably entwined.

Gareth Russell

James broadly favored good-looking men in their early to middle twenties. Most shared his scholarly inclinations, though in Russell’s telling, at least one favorite was a doofus whom James thought he could rehabilitate. At this late date, it seems painfully naïve to pretend James didn’t have sexual relationships with men, though his association with the King James (Authorized) Bible has made admitting that difficult for some commentators.

But again, James wasn’t homosexual in the current sense. He apparently never loved his queen consort, Anna of Denmark, though he certainly respected and trusted her. Sometimes he seems to have even liked her. They had seven children, and when Anna died, James was legitimately grief-stricken. Russell also identifies at least one, possibly two women James had as mistresses, affairs noted for their passion but not depth or durability.

From the monarchy of Scotland, James graduated to the monarchy of England. The English crown had less power under constitutional standards, but more prestige, and becoming King of England entangled James in European power politics. James’ willingness to trust male favorites with court authority left him vulnerable to aristocratic criticism, especially as he disfavored foreign wars, which contemporaries disparaged as terminally feminine.

Then as now, calling a man “feminine” was the highest insult.

In Russell’s telling, James’ private life seems a balance of contrasts. His strict Calvinist upbringing, with its disdain for women and femininity, probably influenced his relationships with men, in ways his teachers never intended. He wrote extensively on theology, but was ambivalent toward faith. He was proudly Scottish, but barely visited the nation after becoming King of England. He loved his favorites so deeply that he jeopardized his kingdom.

Most important, there’s no separation between King James and James Stewart. He trusted his wife, his lovers, and his sons with remarkable power. He experienced passionate love but jealously guarded the royal prerogatives, eager not just to be king, but to be seen as majestic. All subsequent British monarchs have descended from James, through a distaff line. He wasn’t always prudent, but he was always King.

Saturday, April 18, 2026

Last Star, Straight Onto Mourning

Catriona Ward, Nowhere Burning

Poor adolescent Riley needs to escape her abusive foster home, and the impossible girl outside her second-floor window offers the sanctuary she needs. All Riley has to do is run away across the Colorado Rockies and learn to fly. As unlikely as that sounds, it’s better than staying put. But when she reaches Nowhere, the strange ruin of mid-century grandeur overlooking Boulder, she finds a compound of frightened fugitives like her, all somehow permanently children.

It wouldn’t be accurate to describe Catriona Ward’s latest book as a retelling of Peter Pan. More like a self-conscious homage. Ward, whose previous works have relied upon sudden reveals and last-minute surprises, offers three converging narratives building toward a secret she’s previously kept. But this time, the reveals don’t feel earned; they aren’t natural extensions of the ongoing story. It feels like she’s deliberately lied in order to blindside us at the last minute.

In the first narrative, Riley escapes her abusive home situation, dragging along her brother Oliver, too young to understand what’s going on. They hike to Nowhere, the remains of a palatial mansion that burned years ago. There, a commune of adolescents has established a stable society without adults. Riley feels both drawn to and repulsed by their self-reliance, backed by simple, useful roles and their leader’s home-brew religion. They worship something that constantly needs appeasing.

The second narrative follows Adam. An architect and builder, Adam contracts with prestigious actor Leaf Winham to build improvements on his Colorado mansion, called Nowhere. Leaf is charming and acclaimed, but distrusts fame, and prefers to keep his secrets. Adam feels drawn to Leaf, to the point where he abandons his life, including his pregnant girlfriend. Only when Leaf controls Adam, and he has nothing to return to, does Adam begin uncovering Leaf’s dark secrets.

Finally, documentarian Marc and his camera operator, Kimble, have decided to investigate the urban legends surrounding Nowhere and its cult of children. They want to become the first adults to approach the ruined mansion in several decades, and capture its secrets on camera. But the closer they approach the building, the more friction starts emerging from Marc’s deeply buried past, and it becomes increasingly clear that he’ll hurt his closest friends to keep his secrets.

Catriona Ward

It’s obvious, early on, that these narratives unfold out of sequence. Since Adam’s story takes place in a Nowhere untouched by fire, it clearly precedes the other two. We read in expectation of how the building’s secrets, clearly known in the other threads, will come out in Adam’s past. Also, how exactly do the other two narratives relate? Marc clearly knows more than he tells Kimble, despite calling her the sister he’d always wished he had.

The problem arises here. I’ve read two previous Catriona Ward novels, where plot points revolve around information the characters have, but don’t share. In Little Eve, the narrator is self-consciously telling the story and playing with narrative conventions, in the Agatha Christie style. In The Last House on Needless Street, the secrets are buried under facts the characters take for granted, and therefore haven’t thought about in years. We feel surprised without ever feeling deceived.

Here, the characters clearly know, and often think about, the secrets motivating their choices. They just don’t tell us. The revelations come as raw info dumps, sometimes several paragraphs long. Once the characters reveal the secrets they’ve nurtured, we don’t feel surprised or illuminated, as we have in previous Ward novels; we feel lied to. We can forgive that once, because people lie. But as lie after lie gets revealed, we feel manipulated, not enlightened.

This hurts because Ward’s set-up is so good. Besides Peter Pan, previous critics have compared Ward’s premise to The Shining and Lord of the Flies. Ward isn’t merely imitative, though; she uses these time-honored influences to question how good people with honorable intentions make, and constantly re-make, civilization. Leaf Winham, the charming narcissist, and the children’s religious rituals, are just two forms of community building that work well for adherents, until the moment they don’t.

Reading, I felt like Ward had devised characters, situations, and a nonlinear form that served her psychological writing style well. But she hadn’t figured out how to tie the multiple threads together, so she pulled a Hail Mary and hoped we wouldn’t notice. Maybe I wouldn’t have noticed, if her previous books hadn’t been so good that they set my expectations so high. Sadly, the product feels like a good premise, finished with a cheap rug-pull.

Friday, April 10, 2026

In the Hidden Corners of My Hometown

This West Coast modernist design just sprouts in the middle of a post-WWII development.

Flailing my way through protracted unemployment, I recently started driving DoorDash to get cash coming in. My community is too small to produce enough business for me to live off my gig, but it brings in enough to keep groceries on the table. The gig has provided another important education I didn’t realize I needed: despite living in one small city for over twenty years, I’ve discovered how much of town I just don’t know.

My central Nebraska city has a population slightly above 30,000 people. By current American standards, that’s dinky, but on a historical basis, actually quite large. Legendary ancient cities like Chichen Itza or Babylon topped out around 20,000 people, the practical maximum for societies where the majority needed to farm, and urban infrastructure had to primarily support pedestrians and mule carts. Modernity can support much larger populations, though, mostly because of cars, electricity, and Portland cement.

Modernity has also produced something that ancient cities could’ve never supported: single-use zoning. When cars put much larger distances within easy reach, it makes less sense for citizens to build a business in front of the house, a stable in back, and extra rooms for an inn on the side. American communities are now built in sprawling, monolithic ways that discourage visitors. There’s little reason to visit huge swaths of one’s own city without a prior invitation.

This results in acres upon acres, streets upon streets, that I’ve never visited—until now. DoorDash invites me into single-use residential neighborhoods I’ve never previously had purpose or permission to enter. Visiting these quarters for the first time, I witness eclectic architecture, some of it deliberately either minimalist or rococo, and differing ideas about how large the surrounding yard should be. I’ve also noticed that the newer the community, the less likely it is to contain sidewalks.

Very large lawns, without sidewalks or parks, encourage children to play close to home. Current urban design (which, often, means no design, just vibes) discourages children from one of childhood’s primal impulses, the desire to explore. Wandering away from home may be impractical in new developments and, depending on traffic patterns, unsafe. This means children only have opportunities to meet friends and make connections in officially approved spaces, mainly school and, for some, religious congregations.

Just one of a development of identical crackerbox duplexes with postage stamp lawns,
no sidewalks, and no curbside parking—completely hostile to visitors or teenagers.

The extreme opposite, I’ve observed, is small houses, mainly duplexes, on small lots. These are single-story houses with attached garages, a design whose footprint consumes nearly the entire lot. However, these developments also lack sidewalks, which means not only no pedestrians, but no curbside parking for guests. These houses seemingly go mainly to young families as starter homes, so maybe they don’t entertain much. But it dampens their ability to perform time-honored neighborhood rituals of group bonding through hospitality.

Small starter homes have no parking and no place to set up picnic tables. Larger homes for established families have space and parking, but are so far away that neighbors can scarcely see one another. Either way, these designs discourage traditional neighborhood activities, like block parties or tenants’ unions, and functionally prevent neighbors from getting to know one another. The McMansions, in particular, look awkward, flexing their design flourishes to impress neighbors they’ll never meet.

Traveling to shared spaces, like work or school, requires either an overland hike without sidewalks, or car rides that create traffic jams. My city is small enough that “jams” are fleeting annoyances. But larger unplanned cities like Houston, which is over forty percent paved, can be dangerous during the morning commute. Ambulances trapped in rush-hour traffic have become a notable part of the Houston experience. So has the city’s inability to drain after Hurricane Harvey.

Current urban design standards divide routine activities. This isn’t entirely awful, as most people wouldn’t want to live beside a lead smelter, kimchi cannery, or hog abattoir. But most people also can’t walk to restaurants, shops, or even their neighbors’ houses. All daily business happens enclosed in hermetically sealed, climate-controlled metal capsules. Ordinary people have diminished opportunities to make friends, discover quirky experimental businesses, or, as I’ve learned recently, see most of their own town.

Old cities like central London, Paris, or New York south of Houston Street are designed around human needs: useful sidewalks, homes designed to double as business sites, and multi-story structures that utilize vertical space as assertively as horizontal. We can’t just regress, because history proceeds, even when we wish it wouldn’t. But we can look to older spaces for inspiration, finding innovative ways to design newer spaces that aren’t hostile to visitors.

Thursday, April 9, 2026

The Deep, Dark Mines of the Uncanny Valley

T. Kingfisher, What Stalks the Deep

Shellshocked veteran Lt. Alex Easton’s sole qualification to investigate unexplained phenomena is that they’ve seen it before without flinching. But where they previously fought ineffable monsters in their native Gallacia, a mysterious Eastern European land of dismal swamps and forests primeval, this time, they’ve been invited to America. But then, if there’s a place as old and as hostile to humankind as Gallacia, it must surely be Southern Appalachia.

T. Kingfisher’s “Sworn Soldier” novellas, starring Alex Easton, whose unique gender identity doesn’t translate into English, each delve into different horror subgenres. The first retold a Poe classic, highlighting a theme Poe introduced, but didn’t explore. The second followed the conventions of folk horror. This third unpacks a theme popular in recent movies: the legend of mysterious humanoids dwelling in the caverns and mines permeating America’s eastern mountains.

Dr. James Denton, a supporting character from Easton’s first story, has telegrammed Easton for their help. He admits Easton isn’t particularly qualified, except that they’ve faced similar conflicts before, and he needs a partner who won’t ask stupid questions. So Easton crosses the ocean, rides America’s rails, and walks into West Virginia’s dark, forested mountains, a terrain from which more intrepid explorers have frequently failed to return.

Many American folk myths speculate that something dark and mysterious dwells underground, a horrible monster which we’ll uncover by mining for hydrocarbons or even just spelunking. This monster is usually whispered to be older than humankind, and eager for small provocations to resurge and take America from us. Of course, this is coded language. We “Americans” know who we stole this land from, and why they deserve to reclaim it.

Kingfisher salts these themes with a Lovecraftian influence which she acknowledges in her afterword, but which she doesn’t hammer needlessly. Rather, she describes two war-torn old souls, walking wounded, who investigate a land older than human conception. There, they discover a cavern that cannot possibly exist, guarded by a force so close to human, that its very existence personifies the uncanny valley. But that force is holding something worse back.

T. Kingfisher (a known and public
pseudonym for Ursula Vernon)

Reading this novella, I’m reminded of Stanisław Lem’s signal classic, Solaris. Both stories feature humans encountering an intelligence so different from themselves that they cannot truly communicate. Though Easton and Denton have more success than Kelvin in making peace, they struggle with some of the same problems. What does it mean to “communicate” with an intelligence that isn’t human? Or to speak individually with a collective intelligence?

But our protagonists bring something to the story that neither Lovecraft nor Lem considered: capitalists’ willingness to burn everything that doesn’t turn a profit. Lovecraft’s shoggoths and Lem’s ocean planet encounter humans primarily through scientists and explorers. Kingfisher’s primordial intelligence comes to light because humans dynamited the mountains and uncorked Earth’s mantle in search of power and money. Therefore, “first contact” means not curiosity, but pain.

I’ve become a particular Kingfisher fan because she reverses widespread cultural expectations. In this case, besides Easton’s blunt defiance of the Anglophonic gender binary, Easton also sees America as exotic and foreign, reading America back to Kingfisher’s audience. Burned out on conflict, Easton sees American glorification of the Spanish-American war as bizarre and uncivilized. America’s much-bandied national youth seems ridiculous amid Appalachia’s uncountable antiquity.

One could continue unpacking Kingfisher’s themes. Cartesian dualism versus the Freudian psyche, perhaps, or the failures of technological triumphalism in the face of Earth’s unimaginable age. Kingfisher plays with these thematic contrasts and reversals like Lego bricks, creating a whole that readers recognize from previous books, but which is entirely her own. Her ability to use common strategies to tell an uncommon story is why I’ve become a Kingfisher fan.

Although this story remains short, it’s the longest yet of Kingfisher’s Sworn Soldier novellas, over 170 pages plus back matter. This gives the story space not only to investigate its themes, but also for Easton to confront the monster. But this story also has perhaps the largest company of characters yet, and Kingfisher doesn’t give everyone full development. Easton’s loyal batman Angus, in particular, gradually disappears from the story, which is disappointing.

That said, this story largely maintains the momentum of the previous “Sworn Soldier” novellas. Though I might wish the story was about fifty pages longer, to give every character the space they deserve, that would’ve changed the novella-reading experience. Kingfisher’s distinct voice and nonconformist attitude remain visible and keep the narrative popping. It reads like a slice of popular literature, just seen through a lens like you’ve never read before.

Thursday, April 2, 2026

The White Privilege Party, Part 3

This essay is a follow-up to Dinner and Drinks at the White Privilege Party and The White Privilege Party, Part 2.
Woody Guthrie

If the fash hate one thing, it’s being called fash. Or even told they’ve done something fashy. Even when faced with their overwhelming fascism, or with subject experts like Timothy Snyder or Jason Stanley demonstrating their fashy tendencies, they become angry and defensive. President Taco’s claim to be “the least racist person there is” has become the tragicomic emblem of fascists’ need to be seen as nevertheless good.

Returning this series to where it began, the question remains of whether protestors should use confrontational chants while challenging the current administration. Specifically, whether they should use Woody Guthrie-type songs to call fascists “fascist” to their faces. In conservative, semi-rural, and racially homogenous places, such boldness will precipitate conflicts, which discourages White protestors from getting involved.

“Fascism” is a notoriously slippery concept, since it adapts itself to local conditions. Snyder and Stanley have useful, but often inconsistent, definitions. For our purposes, let’s define fascism as the hardened and intolerant extreme of the hierarchy I described last time. Fascism requires some people to remain powerless so others can be powerful, divides that power racially, and enforces this mandatory division through arbitrary violence.

The history of hierarchical violence reveals something remarkable. As theologian James Cone writes, Jim Crow racial violence didn’t happen primarily to kill its targets. It happened to remind survivors that the perpetrators would face no consequences, because they owned the system. Likewise, the Roman church didn’t burn witches and heretics to force conversions in early Christendom. It only burned nonconformists in the Renaissance, once its political power was unquestionable.

Put briefly, hierarchical violence happens when perpetrators know they’ll face no meaningful punishment. In my lifetime, Kyle Rittenhouse, George Zimmerman, and Bernard Goetz knew or suspected that the racially slanted justice system wouldn’t hold them accountable for shooting Black people or their White sympathizers. So they strapped on guns and went hunting on American streets.

We’ve watched “red states” legalize driving cars into protestors. We’ve watched them refuse to prosecute bullies attacking children. We’ve watched the current administration target harmless dissidents on camera, knowing they won’t be prosecuted, or even meaningfully reprimanded. The deferral of each consequence basically ensures that the next street-level fash will feel authorized to attack, maybe even to kill.

Equally importantly, perpetrators don’t see themselves as villains in this arrangement. Fashy narratives reinforce the belief that hierarchies are necessary, and therefore equality is oppressive. Any attempt to fix unfairness is innately unfair to those who benefit, or think they do. Therefore those protected by the status quo, even the poor and forgotten, are all too likely to violently defend what dwindling privilege they have.

The term “extinction burst” has become modish recently. Once you remove reinforcement from previously rewarded behavior, the behavior becomes more extreme and calcified before it disappears. Recent discussions spotlight violence specifically, as America’s overall culture no longer rewards racism, homophobia, and other bigotry as openly as before. But that exact change puts protestors in conservative areas at greater risk.

Please don’t misunderstand: I know these forces are contradictory. People are violent because they know nobody will hold them accountable, but they know nobody will hold them accountable in the exact places where their dying ideology still matters. Florida, which legalized driving cars into protests, has one of America’s oldest median resident ages. Nebraska, where prosecutors won’t charge men who attack kids, remains substantially isolated from the larger economy.

This paradox underlies Critical Race Theory. CRT founder Derrick Bell claimed, with evidence, that racism has proven infinitely elastic as its successive justifications become obsolete. Violent economic necessity justified slavery, but morphed into organized bigotry under Jim Crow. Once the state withdrew support, bigotry became disorganized, like background noise. With each morph, the system excommunicates its former defenders.

The three vigilantes I named—Rittenhouse, Zimmerman, and Goetz—all retreated into anonymity after their acquittals, and became parodies of their prior selves, because who they were never mattered. They claimed “self-defense,” but their selves were an afterthought. Their supporters abandoned them because once they bolstered the narrative that White (or White-adjacent) people owned the system, that system no longer needed them.

White progressives fear angering the fash by calling them fashy to their faces, not only because fashies are violent, but because they’re as much displaced by the cultural shifts happening around them as the conservatives are. They’ll hang onto their illusions that they can persuade the fash, because the alternative is plunging headlong into uncertainty. The old system is dying, and to those accustomed to winning, that’s terrifying.

Wednesday, April 1, 2026

The White Privilege Party, Part 2

This essay is a follow-up to Dinner and Drinks at the White Privilege Party.
Striking teachers in the West Virginia statehouse, 2018 (CNN photo)

Political commentators conventionally date the decline of American labor unions to President Reagan mass-firing the PATCO strikers in 1981. But I think the process started much sooner. After peaking in the 1950s, union membership has declined steadily. Though reliable statistics go back to only 1983 (everything prior is estimates and probabilities), union membership rates have halved in that time. This decline has correlated with another powerful social force.

Desegregation.

Ian Haney López dates union desegregation to 1973, and claims that the battles surrounded seniority. White laborers, López claims, would rather relinquish all union protections than surrender the senior standing they achieved under racially biased rules. Tacit within this refusal, though, is the corollary that White workers refused to negotiate alongside Black workers. Too many White workers would rather suffer than see Black people share their protections.

I cannot verify this 1973 date; FDR desegregated defense contractors by executive order during World War II, while Truman desegregated the military in 1948. The American Federation of Labor recognized its first majority-Black union, the Brotherhood of Sleeping Car Porters, in 1925. Union desegregation seems more gradual than abrupt. The point remains, however, that the more inclusive unions became, the more White workers abandoned them.

I’ve begun this essay with labor unions because they’re quantifiable. And of course, correlation doesn’t equal causation; White workers might’ve decided they didn’t need union protection and also that they didn’t want to work alongside Black co-workers coincidentally. But the third prong of the trident, the election of softball racist Ronald Reagan, of “strapping young buck” fame, suggests that racism directed White workers’ economic choices, not vice versa.

This pattern recurs throughout American history. Critics have condemned Nikole Hannah-Jones and her 1619 Project for suggesting the Founding Fathers created America specifically to protect their racial hierarchy. But the fact remains that, after the American Revolution, nine of the thirteen original states, including New York and New Jersey, still practiced slavery. White Americans who talked up liberty and autonomy needed ninety years to fully stop enslaving Black Americans.

And then Jim Crow began.

Bull Connor looses the dogs on protesters in Birmingham, Alabama, on May 3rd, 1963

Nobel Prize-winner Toni Morrison wrote that American values have long valorized individualism and autonomy; but such values have weight only to the extent that they’re denied to some Americans. Me being unfettered only means something while someone else remains restrained. Morrison, a novelist, meant this specifically in literary terms, because in fiction, we can abstract such values to broad moral precepts. But the same principle applies to society writ large.

In today’s America, “peace” doesn’t mean the stability necessary to pursue our physical and spiritual well-being, it means the absence of war. “Wealth” doesn’t mean physical comfort and a full belly, it means the power necessary to employ other people to look after your stuff. “Law” doesn’t mean reliable systems of social order, it means violent crackdowns on nonconformists and the poor. We define our shared values oppositionally.

And, as Morrison writes, we often use race as mental shorthand for this opposition. Sure, sometimes we signify “the other” with other external signs, like hair or piercings. But if White punk rockers want acceptance from the squares, they can shave their Mohawks and remove their tongue studs. Black and Hispanic people can’t stop being Black and Hispanic, and therefore can’t stop being shorthanded as “less than.”

It’s easy, considering American public mythology, to forget that when Dr. King was assassinated in Memphis in 1968, he wasn’t there to mobilize for racial justice. He was there to help unionize the city’s sanitation workers. Sure, those sanitation workers were overwhelmingly Black, but Dr. King had recognized the inextricable bond between American racism and economic injustice. Poverty and Blackness occupy the same headspace in the American imagination.

Concisely put, America organizes itself into in-groups and out-groups, then racializes the groups to simplify remembering who belongs where. The same redlining practices that preserve segregated neighborhoods have also segregated labor forces. The minute Black people wanted union protections, White workers began embracing myths of radical individualism, even as such individualism left them broke and powerless against billionaire business owners.

Better broke than Black, amirite?

We’re seeing some of this rolled back. Black deaths caught on camera have ignited a sense of justice in some White Americans, though not yet enough. But it’s carried its own pushback. Capitalists like Elon Musk and Larry Ellison have sought political power that would’ve made Cornelius Vanderbilt or Andrew Carnegie blush. But it’s all for the same goal: maintaining the hierarchy of haves and have-nots. Which is, usually, racial.

Concluded in The White Privilege Party, Part 3

Tuesday, March 31, 2026

Dinner and Drinks at the White Privilege Party

Citizens protest the continued ICE presence in Minnesota, January 2026

What moral obligations do White people have, especially White men, to risk death when protecting the powerless in American society? This question confronted me this week when I answered a social media question. Somebody described her White friend in semi-rural Pennsylvania trying to organize a No Kings protest. She wanted to incorporate conventional anti-fascist songs and chants, but her White co-organizers feared becoming “too confrontational” and alienating their neighbors.

I admit I fumbled my answer. I said something mealy-mouthed about regions where somebody arriving with a Gadsden flag and a gun was entirely likely. Progressives living in broadly conservative areas know that those threatening our organized activities face few consequences. Just last month, in my state, an adult man attacked a line of protesting high school students—literal children—and local prosecutors declined to file charges.

Somebody answered my response by saying I failed to understand the Black American experience. Which, as a White cishet man, I probably do. This respondent pointed out that Black Americans face violent pushback for even the most anodyne protest. In fairness, I’ve shared enough protest space with Black and Hispanic people to have witnessed this firsthand, but like anything merely witnessed, that’s not the same as experience.

In all things, I strive to remain fair and broad-minded, and if I’m wrong, I want to amend my ways. So I’ve thought about this response. I could, if I wanted, mumble something about the limits of social media. Especially on platforms which cap character counts, like Xitter, Threads, and Bluesky, it’s impossible to make nuanced arguments. Conditional experiences, including the Black, queer, disabled, or womanist experience, will get elided.

But that’s merely an excuse. Several deeper issues conspired to reach this moment. First, humans naturally desire to stay alive. My sense of moral outrage at persistent American racism arose because racialized violence contravenes the human desire to survive, and places a sliding scale on human life. If some people deserve to remain alive more than others, autocrats could eventually use that relative deserving against me. Or you.

Accused vigilante Kyle Rittenhouse, captured on a cellphone video

This isn’t new or unique. The “Women and children first” ethic made famous on the RMS Titanic was written after literally no women or children survived the SS Arctic disaster. Men on board wanted to survive so badly that they literally elbowed smaller, weaker women aside. Human-made moral codes like chivalry, law of the sea, and bushido generally arise to restrain powerful people’s tendency to value their lives over others.

Honestly, I don’t have to die for justice. As a White man, I could passively acquiesce, and pay little price. This, then, is the obverse of my respondent’s claim that Black people live with constant threats of violence. If White people can walk away from threats and survive, they will walk away. They won’t persevere when their lives are jeopardized, unless the cost of walking away is worse.

Forgive me bringing up old stuff, but here’s where Kyle Rittenhouse enters the discussion. Even I misunderstood the meaning when it happened, focusing on the red-herring language of “self-defense.” But Kyle Rittenhouse’s real lesson was that, if White people stand up for Black lives, human law won’t protect them. White lives don’t matter if they don’t prostrate themselves before a White political apparatus.

In John’s Gospel, Jesus Christ says: “Greater love hath no man than this, that a man lay down his life for his friends.” (15:13 KJV) But Jesus never says anything unless its opposite flits through the common ethos. If we think morality should result in material reward, then dying for others is a failure. Only when justice is communal, not individual, does dying for the cause become an accomplishment.

And right now, in America’s White spaces, communitarian justice just doesn’t exist. That’s why private-sector labor unions are dying, protests are special occasions, and, as Charles M. Blow writes, White people can abandon racial justice actions when the weather becomes harsh. Community justice, once a shared value in America’s factories and farms, has dwindled as those spaces have become racialized, and White people live in atomized suburbs and single-family homes.

Under such conditions, asking White people to risk death simply because it’s right is a non-starter. Not until our political, religious, and social leaders reclaim a vision of shared consequence will that change. Unfortunately, that won’t happen while voters, congregants, and citizens don’t reward it—the ultimate circular reasoning. The feedback loop won’t break until, as the fash inevitably do, they turn against the people who first gave them power.

Continued in The White Privilege Party, Part 2 and The White Privilege Party, Part 3

Wednesday, March 25, 2026

How To Read the Constitution Like a Scholar

Jill Lepore, We the People: A History of the U.S. Constitution

When the militants behind the American Revolution wanted to build a government, the idea of a “constitution” already existed, but was mainly abstract. European countries like France and Spain derived constitutions from scattered law, tradition, and judicial practice; to this day, the British constitution remains unwritten, and high court proceedings often include debates about what, exactly, the constitution is. America’s Founders pioneered another idea: writing the constitution down.

Harvard historian Jill Lepore has written about the social and political forces which shape American politics. With this volume, she focuses specifically on the forces which shape our Constitution: not only the text itself, but legal interpretations, public debates, and the amendment process. Though Lepore doesn’t say it, she tacitly acknowledges that America’s constitution far exceeds the document, comprising also the institutions and handshake conventions created to make the document enforceable.

Americans once loved the constitutional process, Lepore writes. Not only the national, but the local. Ratification of the current Constitution was the subject of lengthy, sometimes combative public discussions, and the original text as written satisfied nobody, though it became the text Americans could live with. Meanwhile, for over two centuries, state constitutional conventions happened, on average, once every eighteen months, and state governments almost aggressively amended themselves.

Then we stopped. America hasn't seen a state constitutional convention since 1986 and, although the states ratified the 27th Amendment in 1992, it was a procedural asterisk; the federal Constitution hasn't been meaningfully amended since 1971. Certainly we can't say that the need for developing institutions has dwindled; if anything, events of the 2010s and 2020s revealed how fatally outdated and unresponsive our Constitution has become. What caused the change?

Lepore answers that question through the debates which surrounded the original Constitution and its amendments, successful and unsuccessful. The Founders, mostly Enlightenment rationalists, believed government could operate smoothly as a machine if removed from frail human hands, and when the original Articles of Confederation proved unsuccessful, the 1787 Convention proceeded with the attitude of social engineers. Lepore compares the 1787 Convention with concurrent developments in clockwork technology.

Dr. Jill Lepore

Almost immediately, though, Americans began demanding amendments. The original Constitution was almost entirely procedural, and omitted the moral imperatives which drove the Revolution and Shays’ Rebellion. The first Congress wanted to shepherd through a Bill of Rights, but Article V didn’t even include instructions for “correct” amendments: should changes be incorporated into the original text, or tacked on as appendices? Congress chose the latter, after some contention.

As written, the amendment process proved cumbersome. Savvy news and history readers already know this. But Lepore delves into procedural hurdles that well-meaning lawmakers, Left and Right, have faced, and how they overcame them. Sadly, one tool for overcoming intransigence is, apparently, war. After the first twelve amendments ironed out procedural and rights quirks, subsequent amendments have mostly happened in clusters surrounding the Civil War, World War I, and Vietnam.

Despite the Founders’ vision, the state machine didn’t prove immune to human influence. Lepore describes how intervening events, like the Civil War or the annexation of Hawaii, changed the Constitution’s meaning. The text didn’t vary, except where amended, but as circumstances made Americans reevaluate themselves, we also reevaluated our unifying text. America’s political leaders changed their constitutional reading to allow, say, annexing Hawaii whole, which changed our shared identity.

Likewise, powerful people—mostly unelected—changed the Constitution by changing relevant practice. Supreme Court cases like Plessy and Roe read certain interpretations into procedure; Brown and Dobbs read them back out. Philosophies like “Originalism,” which arose in tandem with changing opinions about abortion, created interpretive lenses which courts used to create or abolish rights, until they didn’t. The text hasn’t changed in 55 years, but the Constitution has changed wildly.

Reading this book, I recall constitutional scholar Mary Anne Franks, who compared constitutional adherence to religious fundamentalism. If the Constitution has become holy writ, then Lepore’s telling reads like a history of hermeneutics, the processes of scriptural interpretation. Just as Christians have read and reinterpreted the Bible considering surrounding cultural influences, Americans have reinterpreted the Constitution to reflect the conditions in which our country lives.

This religious comparison isn’t flippant. Late in the book, Lepore writes that nations treat new constitutions as tools, but old constitutions, not just America’s, become venerated. The American Constitution was once esteemed so lightly that the original sheepskin parchment got misplaced; now it’s a sacred relic of state sacrament, hardened against nuclear attack. If Americanism is a religion, then changing hermeneutics deserves serious, almost monastic study.

Wednesday, March 18, 2026

Reading and Thinking in a Paranoid Age

Johann Hari

Yesterday, as I write, I finished reading a book. Once upon a time, this wouldn’t have merited an announcement; I did it as regularly as breathing. But this has become more rare and remarkable, and as a book blogger, I have concrete evidence that this is the first time I closed a book and proclaimed “Finished!” in nearly three months. Not that I haven’t read, but I haven’t followed one book through to the end.

Nor am I alone. Anecdotally, my friends report a massive increase in doomscrolling, perhaps the most passive activity which modernity permits. One sits with a small, pocket-sized computer, flipping listlessly through two or three orphan apps, hoping something jumps out urgently enough to fill the spiritual void we all apparently share. Nothing arrives, of course. But the hope of finding something provides a greater sense of reward than getting up and doing something constructive can.

Johann Hari synopsizes the multi-pronged science behind this decline. Some of it comes innately from just getting older, as it becomes harder to create new synaptic connections. Activities which come easily to youth and young adults, like reading, studying, or handicrafts, just grow harder for adults, and we need to develop discipline enough to overcome this. So yes, if reading and art seem more difficult than when you were small, that isn’t just rosy-eyed nostalgia.

But the problem isn’t wholly internal. Technology critics note that our smartphones, tablets, and other screen technology have addictive qualities. App developers maximize the hypnotic quality of their interfaces, utilizing design principles that make us want to stare. Streaming content on platforms like Netflix and Disney+ has more camera cuts and other jolt moments than the broadcast television I grew up watching, which triggers the reptile brain to keep watching, scanning for further life-saving inputs.

I cringe, though, at the word “addictive.” The concept of addiction gets misused in government PR and middle-school “Just Say No” curricula. Often, to describe something as “addictive” implies almost magical properties, like a cursed object that weakens and destroys its owner. This isn’t so. Not everyone who tries cannabis or cocaine becomes addicted, just as not everyone who fiddles on social media on their phones becomes addicted. Something deeper and more primal happens first.

Dr. Gabor Maté

As addiction specialist Gabor Maté writes, addictions develop under specific circumstances. Some people become addicted after life-shaping traumas: childhood abuse and neglect mold children’s brains in ways that protect them as kids, but are maladaptive in adults. Addicts consume their product to numb their trauma scars. Other addicts have more fleeting issues. The second-leading cause of addiction is loneliness, which addicts can overcome through sociability. For AA participants, spirituality arguably matters less than the meetings.

What, then, turns screens addictive? Returning to Hari, he writes that certain life experiences create trauma-like effects on the brain. This includes certain forms of uncertainty, including poverty, homelessness, and war. Many American soldiers notoriously became heroin addicts in Vietnam, then cleared up when they returned to civilian life. I grew up believing that people became homeless because they were addicts, but that’s backward; they become addicts because they’re homeless. Substances take the fear away.

America’s economy has created unprecedented prosperity, but hasn’t distributed it equitably. Elon Musk is currently angling to become our first trillionaire, while uncountable underemployed Americans rely on multiple part-time jobs and gig work to stay afloat. I’ve bounced among short-term jobs for three years, often panicking to keep rent paid and lights on. When I pause between hustles, that allows time for thoughts to emerge, reminding me of every bill about to go into arrears.

Hari and Maté agree that such uncertainty warps the brain. In conditions of constant fear, the limbic system, and especially the amygdala, gets bigger, while the prefrontal cortex withers. A larger amygdala makes us highly reactive, even downright paranoid. An atrophied cortex means less self-discipline, and just as importantly, less ability to empathize with strangers. Both of these make us too impatient for the detail work and contemplative pace of reading or of creating art.

Uncertainty and paranoia have become our standard of life. Not just economic uncertainty, but street violence and wars of choice permeate our daily lives. This results in a population more primed for fear, snap reactions, and restlessness. Into that circumstance, streaming TV media increasingly gives us very loud, aggressive, juddery content that sates our need for stimulation. Something as sedate as reading or listening to classical music seems quaint. So no, it’s not just me.

Tuesday, March 17, 2026

T. Kingfisher and the Kingdom of Free Women

T. Kingfisher, Nettle & Bone

Princess Marra of the Harbor Kingdom is a spare daughter, never to inherit, whose only hope for advancement is to wed a prince, someday. Until then, she’s foisted onto a provincial convent while her older sister gets the prestigious marriage. But she discovers the truth: her sister is a political pawn, abused and terrified, reduced to a walking shadow. Naturally, Marra decides to organize a campaign to assassinate the patriarchy.

In the last year, I’ve become a massive fan of T. Kingfisher’s novellas. She channels classic literature and folklore, refashioning the background noise of our dreams into insightful dark fantasy. This is the first of Kingfisher’s full-length novels I’ve read, and instead of remaking a specific story, she uses images cherry-picked extensively from Grimm’s Fairy Tales. The product turns childhood mythology into a grown-up fable of power, resistance, and self-reliance.

Marra’s story begins when she’s already past thirty. She chastises herself for being an adult and still believing the legendry of “happy ever after.” Her sister’s marriage to a handsome prince, solemnized by a literal fairy godmother, has proven disastrous. Perhaps Marra’s awakening comes late, but it nevertheless comes. So she leaves the religious cloister and begins walking, seeking the magical assistance that will help her liberate her family.

Psychologist Bruno Bettelheim wrote that, in the realm of fairytale, the bond between siblings matters more than that between spouses. That certainly applies here. Marra and Kania had a contentious relationship as children, but as adults, their mutual trust and self-reliance give them strength when faced with duplicitous adulthood. Kingfisher’s narrative maps so perfectly onto Bettelheim’s Jungian prototype that it’s tempting to psychoanalyze her story.

However, this is a false temptation. Kingfisher creates a dreamlike atmosphere, appropriately devoid of proper nouns. Many characters are identified only by their roles: the king, Sister Apothecary, the dust-wife. When characters merit names, it’s only first names, usually Anglo-Saxon: Marra, Agnes, Fenris, Miss Margaret. Even countries have names like the Harbor Kingdom and the Northern Kingdom. (One country has a name, but it’s distant and half-mythic, like Avalon.)

T. Kingfisher (a known and public
pseudonym for Ursula Vernon)

Characters and places lack names, here, because they belong only to a stage in Marra’s life. Bettelheim’s map of fairytale describes children transitioning into adulthood, with accompanying adult roles, like marriage and family. But Kingfisher describes a subsequent transition, where adults finally shed the conditioning of childhood storytelling. Princess Marra was first conditioned by the royal court, then by the convent. Now she must at last become herself.

Prince Vorling of the Northern Kingdom, Marra’s brother-in-law, is indeed handsome and charming. He’s also violent, domineering, and jealous. He maintains power, over both his kingdom and his family, through exaggerated displays of male swagger, and he sacrifices all relationships to maintaining the illusion of control. He truly desires to be a fairy-tale prince, and he’ll brook no intrusion on that story from annoying human foibles.

Therefore, Marra literally walks away from her society’s twin institutions of power: the royal court and religion. She spent over thirty years appeasing the dual threats of state violence and eternal judgement. Now she must obey the only instrument more true than either the kingdom or the gods, her own conscience. If that means striking a dagger to the power structures of two kingdoms, well, so be it.

Along the way, she assembles her company: the dust-wife (a vaguely defined sorceress), her mousey fairy godmother, and a massive, gentle-hearted warrior. Oh, and Bonedog, the company mascot, whose name says it all. He’s a dog resurrected from reassembled bones. If this sounds like somebody’s Dungeons & Dragons campaign, I won’t disagree, and the story has the semi-improvisational feel of a dungeon master trying to wrangle the players back on track.

Kingfisher’s product invites comparison to Tolkien, Michael Moorcock, and Andrzej Sapkowski, writers who mix dreamlike whimsy with painful grown-up realizations. Kingfisher’s characters march against the arrayed ceremony of kingdom and state religion, knowing death is likely, simply because it’s right. Princess Marra doubts herself and, without her companions’ support, would probably back out. But together, they form their own countercultural fellowship, linked by morality and trust.

Please don’t misunderstand. I’ve deployed terminology from psychology and lit-crit, but one could read Kingfisher’s narrative as a rollicking adventure. Like the best literature, though, the story exists on multiple levels. Kingfisher uses playful genre boilerplates to make her message acceptable. But she also reminds us, in this post-MeToo culture, that “happily ever after” relies on the honor system. If Prince Charming lacks honor, then sisters must stand together.

Other reviews of T. Kingfisher books:
Man You Should’ve Seen Them Kicking Edgar Allan Poe
Secrets Buried in the World’s Darkest Corners
The Sleeper and the Beauty of Dreams