Wednesday, March 25, 2026

How To Read the Constitution Like a Scholar

Jill Lepore, We the People: A History of the U.S. Constitution

When the militants behind the American Revolution wanted to build a government, the idea of a “constitution” already existed, but was mainly abstract. European countries like France and Spain derived constitutions from scattered law, tradition, and judicial practice; to this day, the British constitution remains unwritten, and high court proceedings often include debates about what, exactly, the constitution is. America’s Founders pioneered another idea: writing the constitution down.

Harvard historian Jill Lepore has written about the social and political forces which shape American politics. With this volume, she focuses specifically on the forces which shape our Constitution: not only the text itself, but legal interpretations, public debates, and the amendment process. Though Lepore doesn’t say it, she tacitly acknowledges that America’s constitution far exceeds the document, comprising also the institutions and handshake conventions created to make the document enforceable.

Americans once loved the constitutional process, Lepore writes. Not only the national, but the local. Ratification of the current Constitution was the subject of lengthy, sometimes combative public discussions, and the original text as written satisfied nobody, though it became the text Americans could live with. Meanwhile, for over two centuries, state constitutional conventions happened, on average, once every eighteen months, and state governments almost aggressively amended themselves.

Then we stopped. America hasn’t seen a state constitutional convention since 1986 and, although the states ratified the 27th Amendment in 1992, it was a procedural asterisk; the federal Constitution hasn’t been meaningfully amended since 1971. Certainly we can’t say that the need for developing institutions has dwindled; if anything, events of the 2010s and 2020s revealed how fatally outdated and unresponsive our Constitution has become. What caused the change?

Lepore answers that question through the debates which surrounded the original Constitution and its amendments, successful and unsuccessful. The Founders, mostly Enlightenment rationalists, believed government could operate smoothly as a machine if removed from frail human hands, and when the original Articles of Confederation proved unsuccessful, the 1787 Convention proceeded with the attitude of social engineers. Lepore compares the 1787 Convention with concurrent developments in clockwork technology.

Dr. Jill Lepore

Almost immediately, though, Americans began demanding amendments. The original Constitution was almost entirely procedural, and omitted the moral imperatives which drove the Revolution and Shays’ Rebellion. The first Congress wanted to shepherd through a Bill of Rights, but Article V didn’t even include instructions for “correct” amendments: should changes be incorporated into the original text, or tacked on as appendices? Congress chose the latter, after some contention.

As written, the amendment process proved cumbersome. Savvy news and history readers already know this. But Lepore delves into procedural hurdles that well-meaning lawmakers, Left and Right, have faced, and how they overcame them. Sadly, one tool for overcoming intransigence is, apparently, war. After the first twelve amendments ironed out procedural and rights quirks, subsequent amendments mostly happened in clusters surrounding the Civil War, World War I, and Vietnam.

Despite the Founders’ vision, the state machine didn’t prove immune to human influence. Lepore describes how intervening events, like the Civil War or the annexation of Hawaii, changed the Constitution’s meaning. The text didn’t vary, except where amended, but as circumstances made Americans reevaluate themselves, we also reevaluated our unifying text. America’s political leaders changed their constitutional reading to allow, say, annexing Hawaii whole, which changed our shared identity.

Likewise, powerful people—mostly unelected—changed the Constitution by changing relevant practice. Supreme Court cases like Plessy and Roe read certain interpretations into procedure; Brown and Dobbs read them back out. Philosophies like “Originalism,” which arose in tandem with changing opinions about abortion, created interpretive lenses which courts used to create or abolish rights, until they didn’t. The text hasn’t changed in 55 years, but the Constitution has changed wildly.

Reading this book, I recall constitutional scholar Mary Anne Franks, who compared constitutional adherence to religious fundamentalism. If the Constitution has become holy writ, then Lepore’s telling reads like a history of hermeneutics, the processes of scriptural interpretation. Just as Christians have read and reinterpreted the Bible in light of surrounding cultural influences, Americans have reinterpreted the Constitution to reflect the conditions in which our country lives.

This religious comparison isn’t flippant. Late in the book, Lepore writes that nations treat new constitutions as tools, but old constitutions, not just America’s, become venerated. The American Constitution was once esteemed so lightly that the original sheepskin parchment got misplaced; now it’s a sacred relic of state sacrament, hardened against nuclear attack. If Americanism is a religion, then changing hermeneutics deserves serious, almost monastic study.

Wednesday, March 18, 2026

Reading and Thinking in a Paranoid Age

Johann Hari

Yesterday, as I write, I finished reading a book. Once upon a time, this wouldn’t have merited an announcement; I did it as regularly as breathing. But reading has become rarer and more remarkable, and as a book blogger, I have concrete evidence that this is the first time in nearly three months that I closed a book and proclaimed “Finished!” Not that I haven’t read, but I haven’t followed one book through to the end.

Nor am I alone. Anecdotally, my friends report a massive increase in doomscrolling, perhaps the most passive activity which modernity permits. One sits with a small, pocket-sized computer, flipping listlessly through two or three orphan apps, hoping something jumps out urgently enough to fill the spiritual void we all apparently share. Nothing arrives, of course. But the hope of finding something provides a greater sense of reward than getting up and doing something constructive can.

Johann Hari synopsizes the multi-pronged science behind this decline. Some of it comes innately from just getting older, as it becomes harder to create new synaptic connections. Activities which come easily to youth and young adults, like reading, studying, or handicrafts, just grow harder for adults, and we need to develop discipline enough to overcome this. So yes, if reading and art seem more difficult than when you were small, that isn’t just rosy-eyed nostalgia.

But the problem isn’t wholly internal. Technology critics note that our smartphones, tablets, and other screen technology have addictive qualities. App developers maximize the hypnotic quality of their interfaces, utilizing design principles that make us want to stare. Streaming content on platforms like Netflix and Disney+ has more camera cuts and other jolt moments than the broadcast television I grew up watching, which triggers the reptile brain to keep watching, scanning for further life-saving inputs.

I cringe, though, at the word “addictive.” The concept of addiction gets misused in government PR and middle-school “Just Say No” curricula. Often, to describe something as “addictive” implies almost magical properties, like a cursed object that weakens and destroys its owner. This isn’t so. Not everyone who tries cannabis or cocaine becomes addicted, just as not everyone who fiddles with social media on their phone becomes addicted. Something deeper and more primal happens first.

Dr. Gabor Maté

As addiction specialist Gabor Maté writes, addictions develop under specific circumstances. Some people become addicted after life-shaping traumas: childhood abuse and neglect mold children’s brains in ways that protect them as kids, but are maladaptive in adults. Addicts consume their product to numb their trauma scars. Other addicts have more fleeting issues. The second-leading cause of addiction is loneliness, which addicts can overcome through sociability. For AA participants, spirituality arguably matters less than the meetings.

What, then, turns screens addictive? Returning to Hari: he writes that certain life experiences create trauma-like effects on the brain. This includes certain forms of uncertainty, including poverty, homelessness, and war. Many American soldiers notoriously became heroin addicts in Vietnam, then cleared up when they returned to civilian life. I grew up believing that people became homeless because they were addicts, but that’s backward; they become addicts because they’re homeless. Substances take the fear away.

America’s economy has created unprecedented prosperity, but hasn’t distributed it equitably. Elon Musk is currently angling to become our first trillionaire, while uncountable underemployed Americans rely on multiple part-time jobs and gig work to stay afloat. I’ve bounced among short-term jobs for three years, often panicking to keep rent paid and lights on. When I pause between hustles, that allows time for thoughts to emerge, reminding me of every bill about to go into arrears.

Hari and Maté agree that such uncertainty warps the brain. In conditions of constant fear, the limbic system, and especially the amygdala, gets bigger, while the prefrontal cortex withers. A larger amygdala makes us highly reactive, even downright paranoid. An atrophied cortex means less self-discipline, and just as importantly, less ability to empathize with strangers. Both of these make us too impatient for the detail work and contemplative pace of reading or of creating art.

Uncertainty and paranoia have become our standard of life. Not just economic uncertainty, but street violence and wars of choice permeate our daily lives. This results in a population more primed for fear, snap reactions, and restlessness. Into that circumstance, streaming TV media increasingly gives us very loud, aggressive, juddery content that sates our need for stimulation. Something as sedate as reading or listening to classical music seems quaint. So no, it’s not just me.

Tuesday, March 17, 2026

T. Kingfisher and the Kingdom of Free Women

T. Kingfisher, Nettle & Bone

Princess Marra of the Harbor Kingdom is a spare daughter, never to inherit, whose only hope for advancement is to wed a prince, someday. Until then, she’s foisted onto a provincial convent while her older sister gets the prestigious marriage. But she discovers the truth: her sister is a political pawn, abused and terrified, reduced to a walking shadow. Naturally, Marra decides to organize a campaign to assassinate the patriarchy.

In the last year, I’ve become a massive fan of T. Kingfisher’s novellas. She channels classic literature and folklore, refashioning the background noise of our dreams into insightful dark fantasy. This is the first of Kingfisher’s full-length novels I’ve read, and instead of remaking a specific story, she uses images cherry-picked extensively from Grimm’s Fairy Tales. The product turns childhood mythology into a grown-up fable of power, resistance, and self-reliance.

Marra’s story begins when she’s already past thirty. She chastises herself for being an adult and still believing the legendry of “happy ever after.” Her sister’s marriage to a handsome prince, solemnized by a literal fairy godmother, has proven disastrous. Perhaps Marra’s awakening comes late, but it nevertheless comes. So she leaves the religious cloister and begins walking, seeking the magical assistance that will help her liberate her family.

Psychologist Bruno Bettelheim wrote that, in the realm of fairytale, the bond between siblings matters more than that between spouses. That certainly applies here. Marra and Kania had a contentious relationship as children, but as adults, their mutual trust and self-reliance give them strength when faced with duplicitous adulthood. Kingfisher’s narrative maps so perfectly onto Bettelheim’s Freudian prototype that it’s tempting to psychoanalyze her story.

However, this is a false temptation. Kingfisher creates a dreamlike atmosphere, appropriately devoid of proper nouns. Many characters are identified only by their roles: the king, Sister Apothecary, the dust-wife. When characters merit names, it’s only first names, usually Anglo-Saxon: Marra, Agnes, Fenris, Miss Margaret. Even countries have names like the Harbor Kingdom and the Northern Kingdom. (One country has a name, but it’s distant and half-mythic, like Avalon.)

T. Kingfisher (a known and public
pseudonym for Ursula Vernon)

Characters and places lack names, here, because they belong only to a stage in Marra’s life. Bettelheim’s map of fairytale describes children transitioning into adulthood, with accompanying adult roles, like marriage and family. But Kingfisher describes a subsequent transition, where adults finally shed the conditioning of childhood storytelling. Princess Marra was first conditioned by the royal court, then by the convent. Now she must at last become herself.

Prince Vorling of the Northern Kingdom, Marra’s brother-in-law, is indeed handsome and charming. He’s also violent, domineering, and jealous. He maintains power, over both his kingdom and his family, through exaggerated displays of male swagger, and he sacrifices all relationships to maintaining the illusion of control. He truly desires to be a fairy-tale prince, and he’ll brook no intrusion on that story from annoying human foibles.

Therefore, Marra literally walks away from her society’s twin institutions of power: the royal court and religion. She spent over thirty years appeasing the dual threats of state violence and eternal judgement. Now she must obey the only instrument more true than either the kingdom or the gods, her own conscience. If that means striking a dagger to the power structures of two kingdoms, well, so be it.

Along the way, she assembles her company: the dust-wife (a vaguely defined sorceress), her mousey fairy godmother, and a massive, gentle-hearted warrior. Oh, and Bonedog, the company mascot, whose name says it all. He’s a dog resurrected from reassembled bones. If this sounds like somebody’s Dungeons & Dragons campaign, I won’t disagree, and the story has the semi-improvisational feel of a dungeon master trying to wrangle the players back on track.

Kingfisher’s product invites comparison to Tolkien, Michael Moorcock, and Andrzej Sapkowski, writers who mix dreamlike whimsy with painful grown-up realizations. Kingfisher’s characters march against the arrayed ceremony of kingdom and state religion, knowing death is likely, simply because it’s right. Princess Marra doubts herself and, without her companions’ support, would probably back out. But together, they form their own small counterculture, linked by morality and trust.

Please don’t misunderstand. I’ve deployed terminology from psychology and lit-crit, but one could read Kingfisher’s narrative as a rollicking adventure. Like the best literature, though, the story exists on multiple levels. Kingfisher uses playful genre boilerplates to make her message acceptable. But she also reminds us, in this post-MeToo culture, that “happily ever after” relies on the honor system. If Prince Charming lacks honor, then sisters must stand together.

Other reviews of T. Kingfisher books:
Man You Should’ve Seen Them Kicking Edgar Allan Poe
Secrets Buried in the World’s Darkest Corners
The Sleeper and the Beauty of Dreams

Monday, March 2, 2026

Will We Ever Get Tired of Re-Fighting Old Battles?

Promo still from the last time someone dragged The X-Files out of the deep freeze

This weekend’s illegal American bombing of Iran arrives hand-in-glove with another cultural announcement: Hulu is relaunching The X-Files. Preliminary announcements call it a “reboot,” but deeper reportage suggests it’s more a soft reboot, a continuation with new leads. Simultaneously, reports suggest there might be a long-awaited season five for Veronica Mars. (This is more ambiguous, maybe misreading the series being acquired by Netflix; wording is fuzzy.)

I’ve complained before about the cultural currents behind constant reboots. Pop culture is always behind the times anyway, and the flood of streaming media has made the biggest entertainment conglomerates more timid, not less. But this feels different. The resurrection of two popular franchises, thirty-three and twenty-four years old respectively, amid a “Make America Great Again” culture feels more than timid. It feels like a hasty retreat from reality.

Throughout the Current President’s 2016 campaign, he decried urban violence and burning cities, despite such violence being at near-historic lows. But his rhetoric makes sense in his life context, as the Bronx famously caught fire in the late 1970s, the same time he moved into Manhattan real estate with his purchase of the former Commodore Hotel. The poor future President was simply trapped in the sociopolitical milieu of his thirties, unable to grow.

Similarly, this weekend’s bombing of Iranian civilian targets mirrors the President’s unhealed past. Consider his inability to stop heaving accusations against the Central Park Five, nearly a quarter century after they were exonerated. This President retains grudges and political interpretations molded by a privileged youth and segregated social set. In context, he likely bombed Iran, not really for its nuclear program, but as payback for the 1979 Hostage Crisis.

This has become the default for much American politics. We aren’t facing the past, we’re relitigating the past. In the 1980s, both political discourse and mass media desperately wanted to re-fight the Vietnam War, but correctly this time. Franchises like Iron Eagle, Rambo, and Top Gun promised to purge America’s Vietnam disgrace. More recently, Call of Duty and James Bond try to tweak our memory of the Cold War.

Caught in the interregnum between the Cold War and the Global War on Terror, the Clinton decade offered enforced cheerfulness, a frothy meringue of Empire Records and Ben Stiller’s early career. The X-Files directly countered that, maintaining post-Reagan cynicism toward America’s surface culture. Scully and (especially) Mulder walked through neon-soaked midnight landscapes, uniquely able to see the venality that made that era’s party ethos possible.

Kristen Bell in the original network run of Veronica Mars

Veronica Mars pushed this contrast to the extreme. Read superficially, the series presented a stereotyped Southern California panorama, all hypersaturated colors and loud, jangly indie pop soundscapes. Only Veronica and her father—and, eventually, those trapped in their decaying orbit—understood the vulgar horse-trading and human commodities that subsidized Neptune, California’s skin-deep glamour. Veronica, like Mulder, was ready to expose the lie, damn the consequences.

Both franchises took dim opinions of power structures. Veronica Mars fought plush-bottomed police as often as criminals, while Mulder and sometimes Scully brought official corruption to light despite, not because of, the law. But both presented a morally distinct, binary universe. Neptune’s Sheriff Lamb and the Smoking Man were clearly evil, and needed to be exposed to a public which their shows depicted as passive and sheep-like, desperate for an underdog hero.

Unfortunately, the political tenor has changed. From the impotent government depicted in the 1970s, to the malignant one of the 1990s, the problem has been presented as siloed at the top. The disclosure of the Epstein documents, like the Panama Papers before them, has revealed a network of politicians, capitalists, entertainers, academics, and scientists colluding to support an otherwise decrepit system. The “secret” isn’t secret anymore.

While politicians and media captains want to refight the battles of their, or our, childhood, rapidly unfolding news reveals their vision of the problem as charmingly naïve. Hardly a top-tier capitalist or government insider failed to share information with Epstein. Public intellectuals like Noam Chomsky and Richard Dawkins had their hands in his pockets. The rot isn’t an isolated, partisan tumor. Everyone, everywhere in the system, has been proved complicit.

Veronica Mars and The X-Files helped define a generation’s idea of acceptable villains. They showed our lawkeepers were complicit with lawbreakers in the anarchy most people felt in their ordinary lives. But reality has overtaken the scope these shows made possible. Bringing back the monsters of my twenties is worse than quaint. It offers audiences my age an excuse to avoid the monsters that have revealed themselves in reality.

Saturday, February 14, 2026

Lee Brice in Country Music’s Nostalgia Pits

Lee Brice (promo photo)

Lee Brice debuted his new song, “Country Nowadays,” at the TPUSA Super Bowl halftime show on February 8, and it was… disappointing. Brice visibly struggled to fingerpick and sing at the same time, and gargled into the microphone with a diminished rasp that, presumably, was meant to evoke J.J. Cale. The product sounded like an apprentice singer-songwriter struggling through an open-mike night in a less reputable Nashville nightclub.

More attention, though, has stuck to Brice’s lyrics. The entire show ran over half an hour, but pundits have replayed the same fifteen seconds of Brice moaning the opening lines:

I just want to cut my grass, feed my dogs, wear my boots
Not turn the TV on, sit and watch the evening news
Be told if I tell my own daughter that little boys ain’t little girls
I’d be up the creek in hot water in this cancel-your-ass world.

Jon Stewart, that paragon of nonpartisan fairness, crowed that nobody’s stopping Brice from cutting his grass, feeding his dogs, or wearing his boots. Like that’s a winning stroke. Focusing on Brice’s banal laundry list misses the point, that Brice actively aspires to be middle-class and nondescript. But he believes that knowing and caring about other people’s issues makes him oppressed in a diverse, complex world.

One recalls the ubiquitous 2012 cartoon which circulated on social media with its attribution and copyright information cropped off. A man with a military haircut and Marine Corps sleeve stripes repeatedly orders “Just coffee, black.” A spike-haired barista with a nose ring tries to upsell him several specialty coffees he doesn’t want. Of course, nobody’s ever really had this interaction, but many people think they have.

Both Lee Brice and the coffee cartoonist aspire to live in a consistent, low-friction world. If your understanding of the recent-ish past comes from mass media, you might imagine the world lacked conflict, besides the acceptable conflict of the Cold War. John Wayne movies, Leave It to Beaver, and mid-century paperback novels presented a morally concise and economically stable world, in which White protagonists could restore balance by swinging a fist.

The coffee cartoon, with its unreadable signature

By contrast, Brice and the coffee cartoonist face the same existential terror: the world doesn’t center me anymore. Yes, I said “existential terror.” What Brice sings with maudlin angst, and the cartoon plays for yuks, is a fear-based response, struggling to understand one’s place in the world. We all face that terror when becoming adults, of course. But once upon a time, we Honkies had social roles written for us.

I’ve said this before, but it bears repeating: “bein’ country,” as Brice sang, today means being assiduously anti-modern. Country music’s founders, particularly the Carter Family and Jimmie Rodgers, were assiduously engaged with the current events of the Great Depression. This especially includes A.P. Carter, who couldn’t have written his greatest music without Esley Riddle, a disabled Black guitarist. Country’s origins were manifestly progressive.

But around 1964, when the Beatles overtook the pop charts, several former rockers with Southern roots found themselves artistically homeless. Johnny Cash, Conway Twitty, Jerry Lee Lewis, and others managed to reinvent themselves as country musicians by simply emphasizing their native twang. But their music shifted themes distinctly. Their lyrics looked backward to beatified sharecropper pasts, peacefully sanded of economic inequality and political friction.

In 2004, Tim McGraw released “Back When,” a similar (though less partisan) love song to the beatified past. McGraw longs for a time “back when a Coke was a Coke, and a screw was a screw.” I don’t know whether McGraw deliberately channeled Merle Haggard’s 1982 song “Are the Good Times Really Over,” in which he sang “I wish Coke was still cola, and a joint was a bad place to go.”

Haggard notably did something Brice and McGraw don’t: he slapped a date on the “good times.” He sang: “Back before Elvis, or the Beatles.” That is, before 1954, when Haggard turned 17 and saw Lefty Frizzell in concert. Haggard, like McGraw or Brice, doesn’t yearn for any specific time. He misses the stage of personal development when he didn’t have to make active choices or take responsibility for his actions.

Country musicians, especially men, love to cosplay adulthood, wearing tattered work shirts and pitching their singing voices down. Yet we see this theme developed across decades: virtue exists in the past, when life lacked diversity or conflict, and half-formed youths could nestle in social roles like a hammock. Lee Brice’s political statement, like generations before him, is to refuse to face grown-up reality.