Wednesday, April 1, 2026

The White Privilege Party, Part 2

This essay is a follow-up to Dinner and Drinks at the White Privilege Party.
Striking teachers in the West Virginia statehouse, 2018 (CNN photo)

Political commentators conventionally date the decline of American labor unions to President Reagan’s mass firing of the PATCO strikers in 1981. But I think the process started much earlier. After peaking in the 1950s, union membership has declined steadily. Though reliable statistics go back only to 1983 (everything earlier is estimates and extrapolations), union membership rates have halved in that time. This decline has correlated with another powerful social force.

Desegregation.

Ian Haney López dates union desegregation to 1973, and claims that the battles centered on seniority. White laborers, López claims, would rather relinquish all union protections than surrender the senior standing they achieved under racially biased rules. Implicit in this refusal, though, is the corollary that White workers refused to negotiate alongside Black workers. Too many White workers would rather suffer than see Black people share their protections.

I cannot verify this 1973 date; FDR desegregated defense contractors by executive order during World War II, while Truman desegregated the military in 1948. The Brotherhood of Sleeping Car Porters, a majority-Black union, organized in 1925 and won a full charter from the American Federation of Labor a decade later. Union desegregation seems more gradual than abrupt. The point remains, however, that the more inclusive unions became, the more White workers abandoned them.

I’ve begun this essay with labor unions because they’re quantifiable. And of course, correlation doesn’t equal causation; White workers might’ve coincidentally decided both that they didn’t need union protection and that they didn’t want to work alongside Black co-workers. But the third prong of the trident, the election of softball racist Ronald Reagan, of “strapping young buck” fame, suggests that racism directed White workers’ economic choices, not vice versa.

This pattern recurs throughout American history. Critics have condemned Nikole Hannah-Jones and her 1619 Project for suggesting the Founding Fathers created America specifically to protect their racial hierarchy. But the fact remains that, after the American Revolution, nine of the thirteen original states, including New York and New Jersey, still practiced slavery. White Americans who talked up liberty and autonomy needed ninety years to fully stop enslaving Black Americans.

And then Jim Crow began.

Bull Connor looses the dogs on protesters in Birmingham, Alabama, on May 3rd, 1963

Nobel Prize-winner Toni Morrison wrote that American values have long valorized individualism and autonomy; but such values have weight only to the extent that they’re denied to some Americans. My being unfettered means something only while someone else remains restrained. Morrison, a novelist, meant this specifically in literary terms, because in fiction, we can abstract such values to broad moral precepts. But the same principle applies to society writ large.

In today’s America, “peace” doesn’t mean the stability necessary to pursue our physical and spiritual well-being, it means the absence of war. “Wealth” doesn’t mean physical comfort and a full belly, it means the power necessary to employ other people to look after your stuff. “Law” doesn’t mean reliable systems of social order, it means violent crackdowns on nonconformists and the poor. We define our shared values oppositionally.

And, as Morrison writes, we often use race as mental shorthand for this opposition. Sure, sometimes we signify “the other” with other external signs, like hair or piercings. But if White punk rockers want acceptance from the squares, they can shave their Mohawks and remove their tongue studs. Black and Hispanic people can’t stop being Black and Hispanic, and therefore can’t stop being shorthanded as “less than.”

It’s easy, considering American public mythology, to forget that when Dr. King was assassinated in Memphis in 1968, he wasn’t there to mobilize for racial justice. He was there to support the city’s striking sanitation workers. Sure, those sanitation workers were overwhelmingly Black, but Dr. King had recognized the inextricable bond between American racism and economic injustice. Poverty and Blackness occupy the same headspace in the American imagination.

Concisely put, America organizes itself into in-groups and out-groups, then racializes the groups to simplify remembering who belongs where. The same redlining practices that preserve segregated neighborhoods have also segregated labor forces. The minute Black people wanted union protections, White workers began embracing myths of radical individualism, even as such individualism left them broke and powerless against billionaire business owners.

Better broke than Black, amirite?

We’re seeing some of this rolled back. Black deaths caught on camera have ignited a sense of justice in some White Americans, though not yet enough. But that awakening has carried its own pushback. Capitalists like Elon Musk and Larry Ellison have sought political power that would’ve made Cornelius Vanderbilt or Andrew Carnegie blush. But it’s all for the same goal: maintaining the hierarchy of haves and have-nots. Which is, usually, racial.

To Be Concluded

Tuesday, March 31, 2026

Dinner and Drinks at the White Privilege Party

Citizens protest the continued ICE presence in Minnesota, January 2026

What moral obligations do White people have, especially White men, to risk death when protecting the powerless in American society? This question confronted me this week when I answered a question on social media. Somebody described her White friend in semi-rural Pennsylvania trying to organize a No Kings protest. The friend wanted to incorporate conventional anti-fascist songs and chants, but her White co-organizers feared becoming “too confrontational” and alienating their neighbors.

I admit I fumbled my answer. I said something mealy-mouthed about regions where somebody arriving with a Gadsden flag and a gun was entirely likely. Progressives living in broadly conservative areas know that those threatening our organized activities face few consequences. Just last month, in my state, an adult man attacked a line of protesting high school students—literal children—and local prosecutors declined to file charges.

Somebody answered my response by saying I failed to understand the Black American experience. Which, as a White cishet man, I probably do. This respondent pointed out that Black Americans face violent pushback for even the most anodyne protest. In fairness, I’ve shared enough protest space with Black and Hispanic people to have witnessed this firsthand, but like anything merely witnessed, that’s not the same as experience.

In all things, I strive to remain fair and broad-minded, and if I’m wrong, I want to amend my ways. So I’ve thought about this response. I could, if I wanted, mumble something about the limits of social media. Especially on platforms which cap character counts, like Xitter, Threads, and Bluesky, it’s impossible to make nuanced arguments. Conditional experiences, including the Black, queer, disabled, or womanist experience, will get elided.

But that’s merely an excuse. Several deeper issues conspired to create this moment. First, humans naturally desire to stay alive. My sense of moral outrage at persistent American racism arose because racialized violence contravenes the human desire to survive, and places a sliding scale on human life. If some people deserve to remain alive more than others, autocrats could eventually use that relative deserving against me. Or you.

Accused vigilante Kyle Rittenhouse, captured on a cellphone video

This isn’t new or unique. The “women and children first” ethic made famous on the RMS Titanic gained moral force after literally no women or children survived the 1854 SS Arctic disaster. Men on board wanted to survive so badly that they elbowed smaller, weaker women aside. Human-made moral codes like chivalry, the law of the sea, and bushido generally arise to restrain powerful people’s tendency to value their own lives over others’.

Honestly, I don’t have to die for justice. As a White man, I could passively acquiesce, and pay little price. This, then, is the obverse of my respondent’s claim that Black people live with constant threats of violence. If White people can walk away from threats and survive, they will walk away. They won’t persevere if their lives are jeopardized, unless the cost of leaving is worse.

Forgive me for bringing up old news, but here’s where Kyle Rittenhouse enters the discussion. Even I misunderstood its meaning at the time, focusing on the red-herring language of “self-defense.” But Kyle Rittenhouse’s real lesson was that, if White people stand up for Black lives, human law won’t protect them. White lives don’t matter unless they prostrate themselves before a White political apparatus.

In John’s Gospel, Jesus Christ says: “Greater love hath no man than this, that a man lay down his life for his friends.” (15:13 KJV) But Jesus never says anything unless its opposite flits through the common ethos. If we think morality should result in material reward, then dying for others is a failure. Only when justice is communal, not individual, does dying for the cause become an accomplishment.

And right now, in America’s White spaces, communitarian justice just doesn’t exist. That’s why private-sector labor unions are dying, protests are special occasions, and, as Charles M. Blow writes, White people can abandon racial justice actions when the weather becomes harsh. Community justice, once a shared value in America’s factories and farms, has dwindled as those spaces have become racialized, and White people live in atomized suburbs and single-family homes.

Under such conditions, asking White people to risk death simply because it’s right is a non-starter. That won’t change until our political, religious, and social leaders reclaim a vision of shared consequence. Unfortunately, that won’t happen while voters, congregants, and citizens don’t reward it—the ultimate circular reasoning. The feedback loop won’t break until the fash, as they inevitably do, turn against the people who first gave them power.

To Be Continued

Wednesday, March 25, 2026

How To Read the Constitution Like a Scholar

Jill Lepore, We the People: A History of the U.S. Constitution

When the militants behind the American Revolution wanted to build a government, the idea of a “constitution” already existed, but was mainly abstract. European countries like France and Spain derived constitutions from scattered law, tradition, and judicial practice; to this day, the British constitution remains unwritten, and high court proceedings often include debates about what, exactly, the constitution is. America’s Founders pioneered another idea: writing the constitution down.

Harvard historian Jill Lepore has written about the social and political forces which shape American politics. With this volume, she focuses specifically on the forces which shape our Constitution: not only the text itself, but legal interpretations, public debates, and the amendment process. Though Lepore never says so outright, she tacitly acknowledges that America’s constitution far exceeds the document, comprising also the institutions and handshake conventions created to make the document enforceable.

Americans once loved the constitutional process, Lepore writes. Not only the national, but the local. Ratification of the current Constitution was the subject of lengthy, sometimes combative public discussions, and the original text as written satisfied nobody, though it became the text Americans could live with. Meanwhile, for over two centuries, state constitutional conventions happened, on average, once every eighteen months, and states amended their constitutions almost aggressively.

Then we stopped. America hasn’t seen a state constitutional convention since 1986 and, although the states ratified the 27th Amendment in 1992, it was a procedural asterisk; the federal Constitution hasn’t been meaningfully amended since 1971. Certainly we can’t say that the need for developing institutions has dwindled; if anything, events of the 2010s and 2020s revealed how fatally outdated and unresponsive our Constitution has become. What caused the change?

Lepore answers that question through the debates which surrounded the original Constitution and its amendments, successful and unsuccessful. The Founders, mostly Enlightenment rationalists, believed government could operate as smoothly as a machine if removed from frail human hands, and when the original Articles of Confederation proved unsuccessful, the 1787 Convention proceeded with the attitude of social engineers. Lepore compares the 1787 Convention with concurrent developments in clockwork technology.

Dr. Jill Lepore

Almost immediately, though, Americans began demanding amendments. The original Constitution was almost entirely procedural, and omitted the moral imperatives which drove the Revolution and Shays’ Rebellion. The first Congress wanted to shepherd through a Bill of Rights, but Article V didn’t even include instructions for “correct” amendments: should changes be incorporated into the original text, or tacked on as appendices? Congress chose the latter, after some contention.

As written, the amendment process proved cumbersome. Savvy news and history readers already know this. But Lepore delves into procedural hurdles that well-meaning lawmakers, Left and Right, have faced, and how they overcame them. Sadly, one tool for overcoming intransigence is, apparently, war. After the first twelve amendments ironed out procedural and rights quirks, subsequent amendments have mostly happened in clusters surrounding the Civil War, World War I, and Vietnam.

Despite the Founders’ vision, the state machine didn’t prove immune to human influence. Lepore describes how intervening events, like the Civil War or the annexation of Hawaii, changed the Constitution’s meaning. The text didn’t vary, except where amended, but as circumstances made Americans reevaluate themselves, we also reevaluated our unifying text. America’s political leaders changed their constitutional reading to allow, say, annexing Hawaii whole, which changed our shared identity.

Likewise, powerful people—mostly unelected—changed the Constitution by changing relevant practice. Supreme Court cases like Plessy and Roe read certain interpretations into procedure; Brown and Dobbs read them back out. Philosophies like “Originalism,” which arose in tandem with changing opinions about abortion, created interpretive lenses which courts used to create or abolish rights, until they didn’t. The text hasn’t changed in 55 years, but the Constitution has changed wildly.

Reading this book, I recall constitutional scholar Mary Anne Franks, who compared constitutional adherence to religious fundamentalism. If the Constitution has become holy writ, then Lepore’s telling reads like a history of hermeneutics, the processes of scriptural interpretation. Just as Christians have read and reinterpreted the Bible in light of surrounding cultural influences, Americans have reinterpreted the Constitution to reflect the conditions in which our country lives.

This religious comparison isn’t flippant. Late in the book, Lepore writes that nations treat new constitutions as tools, but old constitutions, not just America’s, become venerated. The American Constitution was once esteemed so lightly that the original sheepskin parchment got misplaced; now it’s a sacred state relic, hardened against nuclear attack. If Americanism is a religion, then its changing hermeneutics deserve serious, almost monastic study.

Wednesday, March 18, 2026

Reading and Thinking in a Paranoid Age

Johann Hari

Yesterday, as I write, I finished reading a book. Once upon a time, this wouldn’t have merited an announcement; I did it as regularly as breathing. But finishing has become rarer and more remarkable, and as a book blogger, I have concrete evidence that this is the first time in nearly three months I’ve closed a book and proclaimed “Finished!” Not that I haven’t read, but I haven’t followed one book through to the end.

Nor am I alone. Anecdotally, my friends report a massive increase in doomscrolling, perhaps the most passive activity which modernity permits. One sits with a small, pocket-sized computer, flipping listlessly through two or three orphan apps, hoping something jumps out urgently enough to fill the spiritual void we all apparently share. Nothing arrives, of course. But the hope of finding something provides a greater sense of reward than getting up and doing something constructive can.

Johann Hari synopsizes the multi-pronged science behind this decline. Some of it comes innately from just getting older, as it becomes harder to create new synaptic connections. Activities which come easily to youths and young adults, like reading, studying, or handicrafts, just grow harder for adults, and we need to develop enough discipline to overcome this. So yes, if reading and art seem more difficult than when you were small, that isn’t just rosy-eyed nostalgia.

But the problem isn’t wholly internal. Technology critics note that our smartphones, tablets, and other screen technology have addictive qualities. App developers maximize the hypnotic quality of their interfaces, utilizing design principles that make us want to stare. Streaming content on platforms like Netflix and Disney+ has more camera cuts and other jolt moments than the broadcast television I grew up watching, which triggers the reptile brain to keep watching, scanning for further life-saving inputs.

I cringe, though, at the word “addictive.” The concept of addiction gets misused in government PR and middle-school “Just Say No” curricula. Often, to describe something as “addictive” implies almost magical properties, like a cursed object that weakens and destroys its owner. This isn’t so. Not everyone who tries cannabis or cocaine becomes addicted, just as not everyone who fiddles with social media on their phone becomes addicted. Something deeper and more primal happens first.

Dr. Gabor Maté

As addiction specialist Gabor Maté writes, addictions develop under specific circumstances. Some people become addicted after life-shaping traumas: childhood abuse and neglect mold children’s brains in ways that protect them as kids, but are maladaptive in adults. Addicts consume their product to numb their trauma scars. Other addicts have more fleeting issues. The second-leading cause of addiction is loneliness, which addicts can overcome through sociability. For AA participants, spirituality arguably matters less than the meetings.

What, then, turns screens addictive? Returning to Hari: he writes that certain life experiences create trauma-like effects on the brain, including certain forms of uncertainty, like poverty, homelessness, and war. Many American soldiers notoriously became heroin addicts in Vietnam, then largely recovered when they returned to civilian life. I grew up believing that people became homeless because they were addicts, but that’s backward; they become addicts because they’re homeless. Substances take the fear away.

America’s economy has created unprecedented prosperity, but hasn’t distributed it equitably. Elon Musk is currently angling to become our first trillionaire, while countless underemployed Americans rely on multiple part-time jobs and gig work to stay afloat. I’ve bounced among short-term jobs for three years, often panicking to keep rent paid and lights on. Whenever I pause between hustles, thoughts emerge, reminding me of every bill about to fall into arrears.

Hari and Maté agree that such uncertainty warps the brain. In conditions of constant fear, the limbic system, and especially the amygdala, gets bigger, while the prefrontal cortex withers. A larger amygdala makes us highly reactive, even downright paranoid. An atrophied cortex means less self-discipline, and just as importantly, less ability to empathize with strangers. Both of these make us too impatient for the detail work and contemplative pace of reading or of creating art.

Uncertainty and paranoia have become our standard of life. Not just economic uncertainty, but street violence and wars of choice permeate our daily lives. This results in a population more primed for fear, snap reactions, and restlessness. Into that circumstance, streaming TV media increasingly gives us very loud, aggressive, juddery content that sates our need for stimulation. Something as sedate as reading or listening to classical music seems quaint. So no, it’s not just me.

Tuesday, March 17, 2026

T. Kingfisher and the Kingdom of Free Women

T. Kingfisher, Nettle & Bone

Princess Marra of the Harbor Kingdom is a spare daughter, never to inherit, whose only hope for advancement is to wed a prince, someday. Until then, she’s foisted onto a provincial convent while her older sister gets the prestigious marriage. But she discovers the truth: her sister is a political pawn, abused and terrified, reduced to a walking shadow. Naturally, Marra decides to organize a campaign to assassinate the patriarchy.

In the last year, I’ve become a massive fan of T. Kingfisher’s novellas. She channels classic literature and folklore, refashioning the background noise of our dreams into insightful dark fantasy. This is the first of Kingfisher’s full-length novels I’ve read, and instead of remaking a specific story, she uses images cherry-picked from across Grimm’s Fairy Tales. The product turns childhood mythology into a grown-up fable of power, resistance, and self-reliance.

Marra’s story begins when she’s already past thirty. She chastises herself for being an adult and still believing the legendry of “happy ever after.” Her sister’s marriage to a handsome prince, solemnized by a literal fairy godmother, has proven disastrous. Perhaps Marra’s awakening comes late, but it nevertheless comes. So she leaves the religious cloister and begins walking, seeking the magical assistance that will help her liberate her family.

Psychologist Bruno Bettelheim wrote that, in the realm of fairytale, the bond between siblings matters more than that between spouses. That certainly applies here. Marra and Kania had a contentious relationship as children, but as adults, their mutual trust and self-reliance give them strength when faced with duplicitous adulthood. Kingfisher’s narrative maps so perfectly onto Bettelheim’s psychoanalytic prototype that it’s tempting to psychoanalyze her story.

However, this is a false temptation. Kingfisher creates a dreamlike atmosphere, appropriately devoid of proper nouns. Many characters are identified only by their roles: the king, Sister Apothecary, the dust-wife. When characters merit names, it’s only first names, usually Anglo-Saxon: Marra, Agnes, Fenris, Miss Margaret. Even countries have names like the Harbor Kingdom and the Northern Kingdom. (One country has a name, but it’s distant and half-mythic, like Avalon.)

T. Kingfisher (a known and public
pseudonym for Ursula Vernon)

Characters and places lack names here because they belong only to a stage in Marra’s life. Bettelheim’s map of fairytale describes children transitioning into adulthood, with accompanying adult roles, like marriage and family. But Kingfisher describes a subsequent transition, where adults finally shed the conditioning of childhood storytelling. Princess Marra was first conditioned by the royal court, then by the convent. Now she must at last become herself.

Prince Vorling of the Northern Kingdom, Marra’s brother-in-law, is indeed handsome and charming. He’s also violent, domineering, and jealous. He maintains power, over both his kingdom and his family, through exaggerated displays of male swagger, and he sacrifices all relationships to maintaining the illusion of control. He truly desires to be a fairy-tale prince, and he’ll brook no intrusion on that story from annoying human foibles.

Therefore, Marra literally walks away from her society’s twin institutions of power: the royal court and religion. She spent over thirty years appeasing the dual threats of state violence and eternal judgment. Now she must obey the only authority truer than either the kingdom or the gods: her own conscience. If that means driving a dagger into the power structures of two kingdoms, well, so be it.

Along the way, she assembles her company: the dust-wife (a vaguely defined sorceress), her mousey fairy godmother, and a massive, gentle-hearted warrior. Oh, and Bonedog, the company mascot, whose name says it all. He’s a dog resurrected from reassembled bones. If this sounds like somebody’s Dungeons & Dragons campaign, I won’t disagree, and the story has the semi-improvisational feel of a dungeon master trying to wrangle the players back on track.

Kingfisher’s product invites comparison to Tolkien, Michael Moorcock, and Andrzej Sapkowski, writers who mix dreamlike whimsy with painful grown-up realizations. Kingfisher’s characters march against the arrayed ceremony of kingdom and state religion, knowing death is likely, simply because it’s right. Princess Marra doubts herself and, without her companions’ support, would probably back out. But together, they form their own tight-knit counterculture, linked by morality and trust.

Please don’t misunderstand. I’ve deployed terminology from psychology and lit-crit, but one could read Kingfisher’s narrative as a rollicking adventure. Like the best literature, though, the story exists on multiple levels. Kingfisher uses playful genre boilerplate to make her message palatable. But she also reminds us, in this post-MeToo culture, that “happily ever after” relies on the honor system. If Prince Charming lacks honor, then sisters must stand together.

Other reviews of T. Kingfisher books:
Man You Should’ve Seen Them Kicking Edgar Allan Poe
Secrets Buried in the World’s Darkest Corners
The Sleeper and the Beauty of Dreams