Thursday, April 2, 2026

The White Privilege Party, Part 3

This essay is a follow-up to Dinner and Drinks at the White Privilege Party and The White Privilege Party, Part 2.
Woody Guthrie

If the fash hate one thing, it’s being called fash. Or even being told they’ve done something fashy. Even when faced with their overwhelming fascism, or with subject experts like Timothy Snyder or Jason Stanley demonstrating their fashy tendencies, they become angry and defensive. President Taco’s claim to be “the least racist person there is” has become the tragicomic emblem of fascists’ need to be seen as nevertheless good.

Returning this series to where it began, the question remains of whether protestors should use confrontational chants while challenging the current administration. Specifically, whether they should use Woody Guthrie-type songs to call fascists “fascist” to their faces. In conservative, semi-rural, and racially homogenous places, such boldness will precipitate conflicts, which discourages White protestors from getting involved.

“Fascism” is a notoriously slippery concept, since it adapts itself to local conditions. Snyder and Stanley offer useful, but often inconsistent, definitions. For our purposes, let’s define fascism as the hardened and intolerant extreme of the hierarchy I described last time. Fascism not only requires some people to remain powerless so that others can be powerful, and divides power racially, but enforces this mandatory division through arbitrary violence.

The history of hierarchical violence reveals something remarkable. As theologian James Cone writes, Jim Crow racial violence didn’t happen primarily to kill its targets. It happened to remind survivors that the perpetrators would face no consequences, because they owned the system. Likewise, the Roman church didn’t burn witches and heretics to force conversions in early Christendom. It only burned nonconformists in the Renaissance, once its political power was unquestionable.

Put briefly, hierarchical violence happens when perpetrators know they’ll face no meaningful punishment. In my lifetime, Kyle Rittenhouse, George Zimmerman, and Bernard Goetz knew or suspected that the racially slanted justice system wouldn’t hold them accountable for shooting Black people or their White sympathizers. So they strapped on guns and went hunting on American streets.

We’ve watched “red states” legalize driving cars into protestors. We’ve watched them refuse to prosecute bullies attacking children. We’ve watched the current administration target harmless dissidents on camera, knowing they won’t be prosecuted, or even meaningfully reprimanded. The deferral of each consequence basically ensures that the next street-level fash will feel authorized to attack, maybe even to kill.

Equally importantly, perpetrators don’t see themselves as villains in this arrangement. Fashy narratives reinforce the belief that hierarchies are necessary, and therefore equality is oppressive. Any attempt to fix unfairness is innately unfair to those who benefit, or think they do. Therefore those protected by the status quo, even the poor and forgotten, are too likely to violently defend what dwindling privilege they have.

The term “extinction burst” has become modish recently. Once you remove reinforcement from previously rewarded behavior, the behavior becomes more extreme and calcified before it disappears. Recent discussions spotlight violence specifically, as America’s overall culture no longer rewards racism, homophobia, and other bigotry as openly as before. But that exact change puts protestors in conservative areas at greater risk.

Please don’t misunderstand: I know these forces are contradictory. People are violent because they know nobody will hold them accountable, but they know nobody will hold them accountable precisely in the places where their dying ideology still matters. Florida, which legalized driving cars into protests, has one of America’s oldest median resident ages. Nebraska, where prosecutors won’t charge men who attack kids, remains substantially isolated from the larger economy.

This paradox underlies Critical Race Theory. CRT founder Derrick Bell claimed, with evidence, that racism has proven infinitely elastic as its successive justifications become obsolete. Violently enforced economic necessity justified slavery, then morphed into organized bigotry under Jim Crow. Once the state withdrew support, bigotry became disorganized, like background noise. With each morph, the system excommunicates its former defenders.

The three vigilantes I named—Rittenhouse, Zimmerman, and Goetz—all retreated into anonymity after their acquittals, and became parodies of their prior selves, because their persons didn’t matter. They claimed “self-defense,” but their selves were an afterthought. Their supporters abandoned them because once they bolstered the narrative that White (or White-adjacent) people owned the system, that system no longer needed them.

White progressives fear angering the fash by calling them fashy to their faces, not only because fashies are violent, but because they’re as much displaced by the cultural shifts happening around them as the conservatives are. They’ll hang onto their illusions that they can persuade the fash, because the alternative is plunging headlong into uncertainty. The old system is dying, and to those accustomed to winning, that’s terrifying.

Wednesday, April 1, 2026

The White Privilege Party, Part 2

This essay is a follow-up to Dinner and Drinks at the White Privilege Party.
Striking teachers in the West Virginia statehouse, 2018 (CNN photo)

Political commentators conventionally date the decline of American labor unions to President Reagan mass-firing the PATCO strikers in 1981. But I think the process started much sooner. After peaking in the 1950s, union membership has declined steadily. Though reliable statistics go back to only 1983 (everything prior is estimates and probabilities), union membership rates have halved in that time. This decline has correlated with another powerful social force.

Desegregation.

Ian Haney López dates union desegregation to 1973, and claims that the battles centered on seniority. White laborers, López claims, would rather relinquish all union protections than surrender the senior standing they achieved under racially biased rules. Tacit within this refusal, though, is the corollary that White workers refused to negotiate alongside Black workers. Too many White workers would rather suffer than see Black people share their protections.

I cannot verify this 1973 date; FDR desegregated defense contractors by executive order during World War II, while Truman desegregated the military in 1948. The American Federation of Labor recognized its first majority-Black union, the Brotherhood of Sleeping Car Porters, in 1925. Union desegregation seems more gradual than abrupt. The point remains, however, that the more inclusive unions became, the more White workers abandoned them.

I’ve begun this essay with labor unions because they’re quantifiable. And of course, correlation doesn’t equal causation; White workers might’ve decided they didn’t need union protection and also that they didn’t want to work alongside Black co-workers coincidentally. But the third prong of the trident, the election of softball racist Ronald Reagan, of “strapping young buck” fame, suggests that racism directed White workers’ economic choices, not vice versa.

This pattern recurs throughout American history. Critics have condemned Nikole Hannah-Jones and her 1619 Project for suggesting the Founding Fathers created America specifically to protect their racial hierarchy. But the fact remains that, after the American Revolution, nine of the thirteen original states, including New York and New Jersey, still practiced slavery. White Americans who talked up liberty and autonomy needed ninety years to fully stop enslaving Black Americans.

And then Jim Crow began.

Bull Connor looses the dogs on protesters in Birmingham, Alabama, on May 3rd, 1963

Nobel Prize-winner Toni Morrison wrote that American values have long valorized individualism and autonomy; but such values have weight only to the extent that they’re denied to some Americans. Me being unfettered only means something while someone else remains restrained. Morrison, a novelist, meant this specifically in literary terms, because in fiction, we can abstract such values to broad moral precepts. But the same principle applies to society writ large.

In today’s America, “peace” doesn’t mean the stability necessary to pursue our physical and spiritual well-being, it means the absence of war. “Wealth” doesn’t mean physical comfort and a full belly, it means the power necessary to employ other people to look after your stuff. “Law” doesn’t mean reliable systems of social order, it means violent crackdowns on nonconformists and the poor. We define our shared values oppositionally.

And, as Morrison writes, we often use race as mental shorthand for this opposition. Sure, sometimes we signify “the other” with other external signs, like hair or piercings. But if White punk rockers want acceptance from the squares, they can shave their Mohawks and remove their tongue studs. Black and Hispanic people can’t stop being Black and Hispanic, and therefore can’t stop being shorthanded as “less than.”

It’s easy, considering American public mythology, to forget that when Dr. King was assassinated in Memphis in 1968, he wasn’t there to mobilize for racial justice. He was there to help unionize the city’s sanitation workers. Sure, those sanitation workers were overwhelmingly Black, but Dr. King had recognized the inextricable bond between American racism and economic injustice. Poverty and Blackness occupy the same headspace in the American imagination.

Concisely put, America organizes itself into in-groups and out-groups, then racializes the groups to simplify remembering who belongs where. The same redlining practices that preserve segregated neighborhoods have also segregated labor forces. The minute Black people wanted union protections, White workers began embracing myths of radical individualism, even as such individualism left them broke and powerless against billionaire business owners.

Better broke than Black, amirite?

We’re seeing some of this rolled back. Black deaths caught on camera have ignited a sense of justice in some White Americans, though not yet enough. But it’s carried its own pushback. Capitalists like Elon Musk and Larry Ellison have sought political power that would’ve made Cornelius Vanderbilt or Andrew Carnegie blush. But it’s all for the same goal: maintaining the hierarchy of haves and have-nots. Which is, usually, racial.

Concluded in The White Privilege Party, Part 3

Tuesday, March 31, 2026

Dinner and Drinks at the White Privilege Party

Citizens protest the continued ICE presence in Minnesota, January 2026

What moral obligations do White people have, especially White men, to risk death when protecting the powerless in American society? This question confronted me this week when I responded to a query on social media. Somebody described her White friend in semi-rural Pennsylvania trying to organize a No Kings protest. She wanted to incorporate conventional anti-fascist songs and chants, but her White co-organizers feared becoming “too confrontational” and alienating their neighbors.

I admit I fumbled my answer. I said something mealy-mouthed about regions where somebody arriving with a Gadsden flag and a gun was entirely likely. Progressives living in broadly conservative areas know that those threatening our organized activities face few consequences. Just last month, in my state, an adult man attacked a line of protesting high school students—literal children—and local prosecutors declined to file charges.

Somebody answered my response by saying I failed to understand the Black American experience. Which, as a White cishet man, I probably do. This respondent pointed out that Black Americans face violent pushback for even the most anodyne protest. In fairness, I’ve shared enough protest space with Black and Hispanic people to have witnessed this firsthand, but like anything merely witnessed, that’s not the same as experience.

In all things, I strive to remain fair and broad-minded, and if I’m wrong, I want to amend my ways. So I’ve thought about this response. I could, if I wanted, mumble something about the limits of social media. Especially on platforms which cap character counts, like Xitter, Threads, and Bluesky, it’s impossible to make nuanced arguments. Conditional experiences, including the Black, queer, disabled, or womanist experience, will get elided.

But that’s merely an excuse. Several deeper issues conspired to reach this moment. First, humans naturally desire to stay alive. My sense of moral outrage at persistent American racism arose because racialized violence contravenes the human desire to survive, and places a sliding scale on human life. If some people deserve to remain alive more than others, autocrats could eventually use that relative deserving against me. Or you.

Accused vigilante Kyle Rittenhouse, captured on a cellphone video

This isn’t new or unique. The “women and children first” ethic made famous on the RMS Titanic was codified only after literally no women or children survived the SS Arctic disaster. Men on board wanted to survive so badly that they literally elbowed smaller, weaker women aside. Human-made moral codes like chivalry, the law of the sea, and bushido generally arise to restrain powerful people’s tendency to value their own lives over others’.

Honestly, I don’t have to die for justice. As a White man, I could passively acquiesce, and pay little price. This, then, is the obverse of my respondent’s claim that Black people live with constant threats of violence. If White people can walk away from threats and survive, they will walk away. They won’t persevere if their lives are jeopardized, unless the threat of leaving is worse.

Forgive me bringing up old stuff, but here’s where Kyle Rittenhouse enters the discussion. Even I misunderstood the meaning when it happened, focusing on the red-herring language of “self-defense.” But Kyle Rittenhouse’s real lesson was that, if White people stand up for Black lives, human law won’t protect them. White lives don’t matter if they don’t prostrate themselves before a White political apparatus.

In John’s Gospel, Jesus Christ says: “Greater love hath no man than this, that a man lay down his life for his friends.” (15:13 KJV) But Jesus never says anything unless its opposite flits through the common ethos. If we think morality should result in material reward, then dying for others is a failure. Only when justice is communal, not individual, does dying for the cause become an accomplishment.

And right now, in America’s White spaces, communitarian justice just doesn’t exist. That’s why private-sector labor unions are dying, protests are special occasions, and, as Charles M. Blow writes, White people can abandon racial justice actions when the weather becomes harsh. Community justice, once a shared value in America’s factories and farms, has dwindled as those spaces have become racialized, and White people live in atomized suburbs and single-family homes.

Under such conditions, asking White people to risk death simply because it’s right is a non-starter. That won’t change until our political, religious, and social leaders reclaim a vision of shared consequence. Unfortunately, that won’t happen while voters, congregants, and citizens don’t reward it—the ultimate circular reasoning. The feedback loop won’t break until the fash, as they inevitably do, turn against the people who first gave them power.

Continued in The White Privilege Party, Part 2 and The White Privilege Party, Part 3

Wednesday, March 25, 2026

How To Read the Constitution Like a Scholar

Jill Lepore, We the People: A History of the U.S. Constitution

When the militants behind the American Revolution wanted to build a government, the idea of a “constitution” already existed, but was mainly abstract. European countries like France and Spain derived constitutions from scattered law, tradition, and judicial practice; to this day, the British constitution remains unwritten, and high court proceedings often include debates about what, exactly, the constitution is. America’s Founders pioneered another idea: writing the constitution down.

Harvard historian Jill Lepore has written about the social and political forces which shape American politics. With this volume, she focuses specifically on the forces which shape our Constitution: not only the text itself, but legal interpretations, public debates, and the amendment process. Though Lepore doesn’t say it, she tacitly acknowledges that America’s constitution far exceeds the document, comprising also the institutions and handshake conventions created to make the document enforceable.

Americans once loved the constitutional process, Lepore writes. Not only the national, but the local. Ratification of the current Constitution was the subject of lengthy, sometimes combative public discussions, and the original text as written satisfied nobody, though it became the text Americans could live with. Meanwhile, for over two centuries, state constitutional conventions happened, on average, once every eighteen months, and state governments almost aggressively amended themselves.

Then we stopped. America hasn’t seen a state constitutional convention since 1986 and, although the states ratified the 27th Amendment in 1992, it was a procedural asterisk; the federal Constitution hasn’t been meaningfully amended since 1971. Certainly we can’t say that the need for developing institutions has dwindled; if anything, events of the 2010s and 2020s revealed how fatally outdated and unresponsive our Constitution has become. What caused the change?

Lepore answers that question through the debates which surrounded the original Constitution and its amendments, successful and unsuccessful. The Founders, mostly Enlightenment rationalists, believed government could operate smoothly as a machine if removed from frail human hands, and when the original Articles of Confederation proved unsuccessful, the 1787 Convention proceeded with the attitude of social engineers. Lepore compares the 1787 Convention with concurrent developments in clockwork technology.

Dr. Jill Lepore

Almost immediately, though, Americans began demanding amendments. The original Constitution was almost entirely procedural, and omitted the moral imperatives which drove the Revolution and Shays’ Rebellion. The first Congress wanted to shepherd through a Bill of Rights, but Article V didn’t even include instructions for “correct” amendments: should changes be incorporated into the original text, or tacked on as appendices? Congress chose the latter, after some contention.

As written, the amendment process proved cumbersome. Savvy news and history readers already know this. But Lepore delves into procedural hurdles that well-meaning lawmakers, Left and Right, have faced, and how they overcame them. Sadly, one tool for overcoming intransigence is, apparently, war. After the first twelve amendments ironed out procedural and rights quirks, subsequent amendments have mostly happened in clusters surrounding the Civil War, World War I, and Vietnam.

Despite the Founders’ vision, the state machine didn’t prove immune to human influence. Lepore describes how intervening events, like the Civil War or the annexation of Hawaii, changed the Constitution’s meaning. The text didn’t vary, except where amended, but as circumstances made Americans reevaluate themselves, we also reevaluated our unifying text. America’s political leaders changed their constitutional reading to allow, say, annexing Hawaii whole, which changed our shared identity.

Likewise, powerful people—mostly unelected—changed the Constitution by changing relevant practice. Supreme Court cases like Plessy and Roe read certain interpretations into procedure; Brown and Dobbs read them back out. Philosophies like “Originalism,” which arose in tandem with changing opinions about abortion, created interpretive lenses which courts used to create or abolish rights, until they didn’t. The text hasn’t changed in 55 years, but the Constitution has changed wildly.

Reading this book, I recall constitutional scholar Mary Anne Franks, who compared constitutional adherence to religious fundamentalism. If the Constitution has become holy writ, then Lepore’s telling reads like a history of hermeneutics, the processes of scriptural interpretation. Just as Christians have read and reinterpreted the Bible considering surrounding cultural influences, Americans have reinterpreted the Constitution to reflect the conditions in which our country lives.

This religious comparison isn’t flippant. Late in the book, Lepore writes that nations treat new constitutions as tools, but old constitutions, not just America’s, become venerated. The American Constitution was once esteemed so lightly that the original sheepskin parchment got misplaced; now it’s a sacred relic of state sacrament, hardened against nuclear attack. If Americanism is a religion, then changing hermeneutics deserves serious, almost monastic study.

Wednesday, March 18, 2026

Reading and Thinking in a Paranoid Age

Johann Hari

Yesterday, as I write, I finished reading a book. Once upon a time, this wouldn’t have merited an announcement; I did it as regularly as breathing. But finishing has become rarer and more remarkable, and as a book blogger, I have concrete evidence that this is the first time I’ve closed a book and proclaimed “Finished!” in nearly three months. Not that I haven’t read, but I haven’t followed one book through to the end.

Nor am I alone. Anecdotally, my friends report a massive increase in doomscrolling, perhaps the most passive activity which modernity permits. One sits with a small, pocket-sized computer, flipping listlessly through two or three orphan apps, hoping something jumps out urgently enough to fill the spiritual void we all apparently share. Nothing arrives, of course. But the hope of finding something provides a greater sense of reward than getting up and doing something constructive can.

Johann Hari synopsizes the multi-pronged science behind this decline. Some of it comes innately from just getting older, as it becomes harder to create new synaptic connections. Activities which come easily to youth and young adults, like reading, studying, or handicrafts, just grow harder for adults, and we need to develop discipline enough to overcome this. So yes, if reading and art seem more difficult than when you were small, that isn’t just rosy-eyed nostalgia.

But the problem isn’t wholly internal. Technology critics note that our smartphones, tablets, and other screen technology have addictive qualities. App developers maximize the hypnotic quality of their interfaces, utilizing design principles that make us want to stare. Streaming content on platforms like Netflix and Disney+ has more camera cuts and other jolt moments than the broadcast television I grew up watching, which triggers the reptile brain to keep watching, scanning for further life-saving inputs.

I cringe, though, at the word “addictive.” The concept of addiction gets misused in government PR and middle-school “Just Say No” curricula. Often, to describe something as “addictive” implies almost magical properties, like a cursed object that weakens and destroys its owner. This isn’t so. Not everyone who tries cannabis or cocaine becomes addicted, just as not everyone who fiddles on social media on their phones becomes addicted. Something deeper and more primal happens first.

Dr. Gabor Maté

As addiction specialist Gabor Maté writes, addictions develop under specific circumstances. Some people become addicted after life-shaping traumas: childhood abuse and neglect mold children’s brains in ways that protect them as kids, but are maladaptive in adults. Addicts consume their product to numb their trauma scars. Other addicts have more fleeting issues. The second-leading cause of addiction is loneliness, which addicts can overcome through sociability. For AA participants, spirituality arguably matters less than the meetings.

What, then, turns screens addictive? Returning to Hari, he writes that certain life experiences create trauma-like effects on the brain. This includes certain forms of uncertainty, including poverty, homelessness, and war. Many American soldiers notoriously became heroin addicts in Vietnam, then got clean when they returned to civilian life. I grew up believing that people became homeless because they were addicts, but that’s backward; they become addicts because they’re homeless. Substances take the fear away.

America’s economy has created unprecedented prosperity, but hasn’t distributed it equitably. Elon Musk is currently angling to become our first trillionaire, while uncountable underemployed Americans rely on multiple part-time jobs and gig work to stay afloat. I’ve bounced among short-term jobs for three years, often panicking to keep rent paid and lights on. When I pause between hustles, that allows time for thoughts to emerge, reminding me of every bill about to go into arrears.

Hari and Maté agree that such uncertainty warps the brain. In conditions of constant fear, the limbic system, and especially the amygdala, gets bigger, while the prefrontal cortex withers. A larger amygdala makes us highly reactive, even downright paranoid. An atrophied cortex means less self-discipline, and just as importantly, less ability to empathize with strangers. Both of these make us too impatient for the detail work and contemplative pace of reading or of creating art.

Uncertainty and paranoia have become our standard of life. Not just economic uncertainty, but street violence and wars of choice permeate our daily lives. This results in a population more primed for fear, snap reactions, and restlessness. Into that circumstance, streaming TV media increasingly gives us very loud, aggressive, juddery content that sates our need for stimulation. Something as sedate as reading or listening to classical music seems quaint. So no, it’s not just me.