Tuesday, December 17, 2024

Witches of the World, Unite!

Alix E. Harrow, The Once and Future Witches

The three Eastwood sisters carry old resentments, and their household witchcraft is fairly lackluster, just enough to let them eke out a living in 1893 America. But, after seven years of estrangement, they bump into one another in the busiest square in New Salem. Their unexpected reunion coincides with the emergence of a fortress unseen since the age of myth. The Eastwood sisters must ask themselves: are they the chosen ones to restore American witchcraft?

Alix E. Harrow, who was a professor of American and African American Studies before becoming a full-time novelist, does something similar here to what Susanna Clarke did with her breakout novel, Jonathan Strange and Mr. Norrell. Harrow combines the trappings of modern fantasy with the great, socially engaged novels of the 19th Century. Harrow’s take is, unsurprisingly, more American in tenor, but it accomplishes the same goals with comparable aplomb.

Harrow creates an alternate America where magic actually exists, and the great witch-hunters of colonial antiquity had a point. (She plays somewhat loose with historical dates, so plan your response accordingly.) The Salem Witch Trials ended in a massacre, the entire village razed to ferret out the relatively small number of actual witches. The survivors hurried to create New Salem, their moral utopia of Christian privilege and mechanized industry.

Into New Salem stumble the Eastwood sisters. Hedge witches from the agrarian hinterlands, they have accepted lives of compromise in New Salem’s patriarchal system. But their forced reunion causes the entire city to glimpse Avalon, the fabled bastion where the storied St. George purged the last true witches. The sisters attempt to escape what appears to be Fate forcing their hands, but every sidestep draws them closer together.

But a specter looms over New Salem. Gideon Hill, an avaricious political candidate, promises to purge witchcraft, trade unionism, moral decay, and the kitchen sink. His stump speeches combine rhetorical nods to Christianity with a laundry list of grievances for White citizens feeling threatened by rapid change. Taken by himself, Hill is greasy and unpleasant, but not dangerous. Except he’s riding a wave of public umbrage to the mayor’s office.

Alix E. Harrow

In some ways, Harrow writes a standard fantasy narrative. The Eastwood sisters resemble heroes like Frodo Baggins or Geralt of Rivia, true believers who must resist a rising tide of injustice, even when they’ve grown fatigued. Mass-market fantasy loves its beleaguered underdogs. But, removed from Neverland and placed in a milieu American readers will remember from high school history class, the themes become exceptionally poignant for current audiences.

These themes of alienation and moralistic terror could describe 1893 or today. Harrow laces her narrative with allusions to Dickens, Marx, Upton Sinclair, and others, but not fatuously. For Harrow, these writers describe the American experience amid rapid change, an experience that remains unsettled 130 years later. Powerful people resist change because it threatens their authority, and they seek ways to make the populace complicit in their oppression.

Harrow demonstrates that hierarchies of power rely on equal measures of force and deceit. The Eastwood sisters must resist Gideon Hill’s instruments of physical force, but they must also unlearn messages of fear and self-doubt that they’ve internalized throughout their lifetimes. They must fight injustice, even when they’re tired, even when they’re ready to have normal human-scale relationships, because the fight is right, and because there’s nobody else.

We feel for the sisters, in their struggle to liberate Avalon from the patriarchy, because they are human. Yes, the truth of Avalon is vast and metaphysical. But their story is ultimately about people: about the jobs we accept to pay rent, the relationships that make the battle worthwhile. Therefore when the sisters rise up against tyrannical bosses, pietistic politicians, and toxic partners, we undertake that journey with them.

Further, Harrow avoids facile answers to difficult problems. She has at least three moments that, in conventional genre fiction, would’ve signaled the story’s culmination and the sisters’ ultimate triumph. But in Harrow’s telling, there is no grand culmination, no moment of eternal transcendent victory. Instead, the story keeps changing; the conflict evolves as the characters’ complex world evolves with them.

By combining the nostalgia of historical fiction with the splendor of paperback fantasy, Harrow creates a story we readers can immerse ourselves in, with characters who feel like our friends. But she also addresses, from another angle, themes that the great (male) writers of American literature introduced. We can enjoy this engaging story of complicated characters. Or we can recognize ourselves, and our struggles, amid Harrow’s urgent themes.

Friday, December 13, 2024

Luigi Mangione and Political Messianism

The martyrdom of Luigi Mangione

When Luigi Mangione allegedly gunned down UnitedHealthcare CEO Brian Thompson last week, the initial response was surprisingly bipartisan. Even we who abhor violence as a political instrument nevertheless acknowledged that the wealthy, protected by law and supported by our political establishment, need consequences. After all, the only thing rich people love more than money is being alive to spend (or hoard) it.

Mangione represents the latest manifestation of a popular phantom haunting American politics: the yearning for a secular Messiah. Americans long for a unique individual who will, like Jesus before the money changers, sweep uncleanliness from our sacred places and restore the hope we all believed America had in 11th Grade American Civics class. This powerful unitary individual always seems just across the horizon—and, like the horizon, never quite arrives.

Political messianism has a dual nature: it supposedly galvanizes people around moral principles, but it does so through a singular personality. This may mean the speculation, repeated on basic cable and social media, that Mangione’s actions will awaken American class consciousness. Or it may mean attributing to one man the kingdom and the power, forever, as QAnon purists believe Donald Trump will purge the halls of power.

This yearning is bipartisan, or perhaps nonpartisan. I’ve written before that both parties look to presidential candidates for deliverance, especially when the other party controls the Oval Office. We saw something similar with what Democrats and dissident Republicans expected from the Mueller Report. In each case, those standing outside the political establishment expected a singular personality to purge “sin” and restore the Eternal Kingdom.

Sometimes this means imputing moral clarity to a conveniently absent figurehead. Political junkies either credit Ronald Reagan with rescuing America from catastrophic moral decline, or blame him for ruining everything with cack-handed mismanagement. Either way, they selectively remember Reagan’s Administration. His adherents elide Iran-Contra and the disgrace it cast over his final years in office; his critics forget he performed necessary triage on outdated FDR-era programs.

Importantly, political messianism, like the religious variety, heavily emphasizes either the past or the future. Many messianic figures, like Jesus, Socrates, the Buddha, and Confucius, left no written record during their lifetimes; their followers recorded their teachings only posthumously. Likewise, messianic movements, from Hasidism to John’s Revelation to Marx’s Grand Synthesis, await a future where sin is somehow expunged, and humanity made pure.

Sin always exists in the present. The past, whether the Hebrew Eden or the Greek Golden Age or Taoist Pangu, is simple, morally clear, and benevolent. Likewise, the Revolution taught in American creation myths lacked doubt or nuance; it was inarguably good. The emergence of sin corresponds with the emergence of complexity. The more subtlety and finesse necessary to explain doctrine, the more tainted it becomes with doubt and sin.

Against this complexity, religions consistently promise a messiah. Besides Jesus of Nazareth, other proclaimed messiahs include Simon bar Kokhba, Sabbatai Zevi, and Cyrus the Great. Islam promises the future appearance of the Mahdi, whose military purge will precede the final judgment. Modern Judaism promises not an individual messiah, but a messianic age of moral clarity—which, again, disturbingly resembles Marx’s promised Grand Synthesis.

Whenever somebody promises Donald Trump will “drain the swamp,” or insists that Kamala Harris will “save democracy,” that’s messianic language. Similarly, whenever social media pundits gush over Luigi Mangione’s blows against capitalist resource hoarding, or consider him an emblem of class consciousness, they channel their own moral principles through his person. Sin is abstract; Mangione’s actions are concrete. Violence is harsh, but it at least makes sense.

Such reasoning stumbles, however, over the one messiah whose promise outlasted his person: Jesus of Nazareth. Though his message gave hurting peasants hope during an epoch of imperial conquest, his blows against the empire were philosophical, not militant. Violent uprisings against kleptocrats usually end in crackdowns and purges, and functionally strengthen the status quo. Durable rebellions embrace complexity; they don’t replace it with the simplicity of a gun.

Jesus’ messianic message had a class component. He blessed the poor, fed hungry masses, and prayed that our debts be forgiven, as we forgive our debtors. From a secular reading, Jesus’ teachings underscore messages of class solidarity, especially in Luke’s Gospel. But he acknowledged, at his arrest, that warriors don’t live amid their victory; they die, and others reap the profit. He accepted his death so others could receive the benefit.

Luigi Mangione might yet engender such a messianic legacy, but I doubt it. No individual will save us from the conditions we’ve created collectively for so long.

Tuesday, December 10, 2024

Knowledge That Died in the War

Jayne Anne Phillips, Night Watch: a Novel

Young ConaLee comes from a part of West Virginia hill country where people don’t need, or know, one another’s last names. Therefore it isn’t strange that she doesn’t know hers, or her mother’s Christian name. When the aggressive interloper that ConaLee knows only as Papa (though he isn’t her father) tires of ConaLee’s family, he deposits them at the lunatic asylum in Weston. There, ConaLee must maintain the illusion of post-Civil War respectability that she’s mastered.

Author and professor Jayne Anne Phillips’ novels focus on lonely souls wandering an America they don’t understand. She won the Pulitzer Prize for this novel, which focuses on the loss of knowledge that follows war. Phillips’ characters spend the story pursuing information, and the healthy closure that comes with it, and several times come perilously close to finding it. They never know how close, though, because unlike us readers, they have only a limited perspective.

Yanked out of the only life she’s ever known, ConaLee wants to protect her mother from more harm than she’s already experienced. ConaLee blames herself for her failure to ward off Papa, a Confederate deserter and sexual predator. This self-blame is certainly unfair, since she’s only thirteen. But the wider world ConaLee experiences at the Weston lunatic asylum makes her realize how small and uninformed she is, leaving her desperate for any momentary understanding.

Her mother passes as Miss Janet, a well-to-do lady who guards her secrets zealously. Glimpsed from her perspective, though, the story changes. Her husband, ConaLee’s father, enlisted at the start of the Civil War, believing that valorous service would grant him status. They ran from ignominious beginnings, after all, and live in constant fear of capture. Service would grant both of them a legal name and freedom from the hunt. Sadly, he just never came back.

John O’Shea, the asylum’s Night Watch, knows that isn’t his real name. Wounded at some distant battle, he lost all memory of his life before the War. He acquired a pseudonym and discovered a talent for helping those who, like him, lost mental capacity through trauma or abuse. He continues searching for his past identity, feeling the gnawing sensation that someone, somewhere, waits for him. We know, as readers, who that is, but his wounded memory remains slippery.

Jayne Anne Phillips

Overseeing everything is Dearbhla (pronounced “Dervla”), a patient watchwoman who is half doting grandmother, half Irish swamp witch. She longs to restore ConaLee’s sundered family and exorcise Papa’s damage, but without better skills, she remains an observer. She wanders throughout the Virginias, seeking the lines of knowledge which war severed, always one step removed from finding them. Readers see how close she comes, always doomed to mishear a valuable clue or to miss something important.

Phillips’ narrative might meet the criteria of “postmodernism,” since it deals with the finitude of human knowledge. Her characters stumble blindly, always just barely failing to glimpse the truth, because they don’t understand their place in the narrative. Because they don’t know it’s a narrative. We readers understand we’re reading a novel, and therefore we grasp the importance of the many missed clues. But meaning is something readers impute, not something these characters naturally have.

Novels like this turn on degrees of disappointment. Characters are condemned to dance right up to the precipice of understanding, then dance away again, never realizing how close they came. We wait on tenterhooks to see when the characters will realize what’s obvious to us, knowing that when they do, some other form of disappointment will follow. The limits of human perspective, and the fallibility of human memory, keep them blind.

The narrative voice reads more like a prose poem than a novel. Or like several braided poems. ConaLee, home-schooled on the books her mother can afford, mostly Dickens and the Bible, speaks in a lyric voice which differentiates her from more pragmatic characters, like the asylum doctor. O’Shea, a complete tabula rasa, has a plainspoken patter, a strict noun-verb voice bereft of ornament. War has changed how characters speak, leaving them with outdated, peculiar voices.

Human beings, Phillips implies, exist within a broader tapestry. But seen from inside, we never grasp the part we play, the thread we leave behind. Meaning comes only when we view the story from outside, which individuals can never do. Knowledge is something we create, not something that exists. And, as characters change names like shirts, even our identities come from our actions, not our beings. Someday, looking back, we’ll glimpse what it all meant.

Wednesday, December 4, 2024

Capitalism and the Winners’ Society

Jeopardy host Ken Jennings

I just did something I used to do frequently, but haven’t done for years: I watched a full episode of Jeopardy. Television has lost its allure in recent years, and Jeopardy’s appeal to shallow knowledge of inconsistent topics has become an emblem of modern society. Too many people know surface-level flotsam about nearly everything, giving us all a false sense of expertise on topics where we’re profoundly ignorant.

One player, working with speed and confidence, managed to rack up the largest cash haul on the stage, then he hit a Daily Double. He decided to risk it all on a question about corporate ad slogans. Yes, he risked it all--and lost it all. He went from being in the lead, to having nothing.

American capitalism loves winners. I’ve witnessed journalists, economists, and social media stans tripping over themselves to lavish praise on billionaire CEOs. It’s often unintentionally hilarious to watch White boys without two dimes to rub together invent phony narratives to explain why Elon Musk, Jeff Bezos, and Mark Zuckerberg are epoch-making geniuses, not just winners of America’s greenback lottery.

Business writers I've reviewed on this blog have lavished praise on society's winners. They will perform seriocomic contortions to justify why, say, Jack Welch or Sam Walton are intrepid adventurers and the epitome of moneyed masculinity. To critics like me, these paragons of capitalism are highway robbers and guttersnipes who hoard the wealth created by others. But their fluttering groupies insist their favorite CEOs generated their wealth by rubbing two sticks together in the woods.

Elon Musk

Among these praises, one seems pointedly recurrent: billionaires deserve their riches because they risked their own seed capital. Risk looms large in billionaire mythology. Michael J. Sandel quotes philosopher Robert Nozick as saying that all wealth is the product of bold risk-taking and unique innovation, the outcome of intrepid individuals who somehow exist in an economic vacuum.

However, even if we accept the risk mythology, it falters on one level: we evaluate risks by their success. It's easy to praise Bill Gates or Steve Jobs for their successes when we don't simultaneously evaluate why the founders of Sun Microsystems and Commodore International flopped so ignominiously. When we only look at those who risked and won, it's easy to think risk leads to reward.

But for every world-renowned success, there are uncountable failures. Eighty percent of American companies fail within five years. A handful of actors become million-dollar stars, but most limp along on day jobs before leaving the industry altogether. For every Facebook, there's a bloodbath of MySpaces, Geocities, and Friendsters. Failure, not success, is the usual outcome for risky courage.

Watching Jeopardy, that one player who lost everything didn't give up. He slowly won back the money he lost, then in Final Jeopardy, gambled everything again. He turned out to be the only player to recognize a question about Dashiell Hammett, and finished the game with almost four times as much as the second-place finisher. In under thirty TV minutes, he went from wearing egg on his face, to becoming the reigning champion.

Bill Gates

But that outcome wasn't inevitable. He won, not only because of his own wide-ranging knowledge, but because the rules called a halt to the game before he had a chance to lose everything again. He benefited from clue writers whose categories happened to match his backlog of trivia knowledge. And he handled the buzzer effectively--something many past players have said isn't easy.

When Elon Musk and his giddy evangelists call him a self-made billionaire, they overlook the environment that created his wealth. As Giblin and Doctorow write, the monopsony economy that makes Musk's companies possible results from public policy and social order. Regulations written to prevent past economic abuses become barriers to entry for small start-up entrepreneurs. Inequality festers unchecked.

This doesn't mean risk-takers don't deserve reward. Musk, Bezos, Gates, and others did indeed risk their, or their investors’, seed capital, and could've lost it all. But that doesn't mean they worked alone. They used others’ skills, labor, and time. They benefited from technicians trained in state schools. Many, like Musk, received direct government subsidies to offset the costs their risks carried.

American rhetoric in support of capitalism emphasizes the goodness of taking a risk. But in practice, our economy doesn't reward those who take a risk, but those who win. And victory is always socially conditioned. Success means what customers pay for, not what billionaires prefer. There's literally no difference between Microsoft and Sun Microsystems, except that one guessed right.

Monday, November 25, 2024

Artists and the Schizoid Personality

J.K. Rowling—great artist, bad person

I’m trying to write a vampire novel. It’s a cheap grab for marketable attention, yes, but I need to restart my long-delayed writing career. Therefore I’m trying to overstuff the novel with cultural motifs of capitalist resentment, misplaced sexuality, and status envy, the three benchmarks of mass-market vampire fiction. I find myself facing a problem, though. I simply don’t know enough about nonstandard sexual expression to write about it comfortably.

Sure, I know something about the fear, anger, and desperation which all monsters and their victims share. But since at least Bram Stoker, and certainly since Anne Rice, authors have used vampires as metaphors for sexual appetites so socially repellent that one must squelch them. Vampires are, by nature, dominant, violent, and voracious, preying on trust and innocence—all things I’m not. I’m doubting my ability to finish this project.

I’ve had this problem before. I’m more than competent at organizing words in ways that readers find pleasing, and multiple trusted sources assure me I should seek wider audiences. But it’s a writing truism that stories require conflict, and every protagonist with a goal requires some antagonist willing to do anything necessary to impede that goal. Somebody must always, from the storytelling perspective, be the villain.

Especially in this age of Marvel craptaculars and Starwoid blockbusters, readers feel dissatisfied with ordinary human-scale conflicts. MFA workshops continue producing countless “literary” conflicts of the Kramer vs. Kramer style, which indie presses publish for prestige, and which seldom get a second printing. To snag enough readers to buy groceries, authors must mass-produce generational traumas, planet-destroying weapons, and truly demented monsters.

Roman Polanski—great artist, lousy person

And to write those monsters with the full dimensionality audiences have grown accustomed to, writers must occupy the monstrous headspace. One cannot write monsters without, at least occasionally, thinking like a monster. After all, truly terrible creatures like Dracula, Iago, and Lord Voldemort aren’t mere mustache-twirling villains in the Snidely Whiplash mold; their evil comes from someplace, and serves some motivation deep within the character.

Crafting fully fleshed monsters, though, means the monster’s thoughts live within the author. We see this sometimes in what authors affirm, or what they deny. J.K. Rowling, for instance, created Lord Voldemort, a thinly coded Hitler analogue. But in creating that character, she also incorporated Churchill-era war propaganda about an assertively fair-haired Britain becoming overrun with foreigners and, ahem, bankers. Then these beliefs spilled into her private life.

Shall I continue? Cinematic genius Roman Polanski wrote, directed, and starred in The Tenant, a thriller about encroaching urban paranoia and isolation, in 1976. In 1977, he was accused of—and confessed to—drugging and raping a minor, and has remained a fugitive from American justice ever since. Knowing what we do about sexual violence, the crime for which he got caught almost certainly wasn’t his first. Personal secrets undoubtedly drove artistic paranoia.

Many of Neil Gaiman’s best stories involve hurting women. Coraline, Stardust, and several Sandman plots all involve women fearing for their lives, going insane, or being held in captivity. Though Gaiman never went the full Green Lantern and stuffed women in refrigerators, he nevertheless motivated many of his best stories by causing fictional women pain. We now know that torture didn’t exist entirely in his imagination.

Neil Gaiman—great artist, terrible person

Art doesn’t emerge from whole, balanced, healthy minds. Like an oyster creating a pearl, the object of beauty begins with irritation and injury. Worse, in order to produce enough content to make a living in this media-saturated world, artists must do more than tolerate the irritation; we must prod it, feed it, and rip previously healed wounds open again. Even when that process doesn’t produce literal violence, it leaves artists unbalanced people.

When I describe artists as “schizoid personalities,” I don’t mean in the clinical sense. I mean in the older Greek sense, where the “schiz-” prefix refers to something torn or cut in two. The prefix in schizophrenia shares its root with the word schism, signifying something thoroughly torn, in ways it can never be completely repaired. Artists must tear themselves asunder to create fully realized conflicts.

And I must ask myself: am I willing to risk becoming torn likewise? Do I love art enough to create those thoughts within myself, knowing that I’m not immune to their lure? What if I create the monster capable of expressing such eloquent trauma—and then, like Rowling, Polanski, and Gaiman, I can’t control it? If artists aren’t unbalanced people when they begin, most will be so before they finish.

Sunday, November 24, 2024

RFK Jr. is Maybe, Slightly, Right

Robert F. Kennedy, Jr.

It’s only a matter of time before Robert F. Kennedy, Jr., and Elon Musk create an irreparable rift within the upcoming Trump administration. Musk believes that engineers and their technology will solve society’s problems with efficiency beyond anything government can accomplish, a philosophy called cyberlibertarianism. RFK, by contrast, distrusts science and technology, and wants to roll back 200 years of progress in physiology and medicine.

Left-leaning quarters of the internet have begun mocking RFK as part of our routine anti-Trump rhetoric. We ridicule his disdain for vaccines in the immediate wake of a pandemic made absurdly worse by an administration that hampered any efforts to curtail the spread. We disparage his fear of pharmaceuticals as indicative of his now-infamous brain worm, which maybe could’ve been prevented by medicine. Everything RFK says is automatically tainted.

Yet I suggest there’s something to his persistent appeal. People are drawn to RFK, and to others who automatically distrust science, because he isn’t entirely wrong. RFK taps a fertile vein of public sentiment that realizes we’ve heard lies from people and institutions for years, clothed in the vestments of science, technology, and mathematics. Americans have become distrustful of authority, and not without reason. RFK simply identifies that distrust.

We’ve witnessed how industrialized pharmaceutical companies, the storied “Big Pharma,” have yanked patients’ chains for years. Nearly a decade ago, so-called “Pharma Bro” Martin Shkreli jacked up the price of Daraprim, a life-saving antiparasitic medication that was neither rare nor still under patent, simply to boost his own shoddy past investments. Malcolm Gladwell writes how Purdue Pharma created the “opioid epidemic” by targeting, not patients in pain, but doctors desperately lonely for human validation.

These highly visible forms of public manipulation only remain in public memory because they’re uncomplicated. More nuanced issues, like the sale of over-the-counter amphetamines as diet aids, or the whole fen-phen debacle, are harder to understand, and therefore harder to remember. We’ve wondered at shifting standards for diet and exercise: a few years ago, doctors told us small quantities of red wine were healthful. Now we’re told no quantity of alcohol is safe.

Elon Musk

What’s more, many of RFK’s concerns about food additives and processing reflect our own concerns. Many packaged and convenience foods are fortified with synthetic preservatives and wheat flour base to remain shelf-stable. Many are also laced with synthetic flavor compounds, because the processing leaches out natural flavor. As Rampton and Stauber write, these additives are presumed safe only because they haven’t been proven unsafe, and are only lightly regulated.

And our government kowtows to the companies which produce these foods, pharmaceuticals, and other substances we put into our bodies. Across many years and both major parties, administrations have let industrial conglomerates like Pfizer, Unilever, Con-Agra, and Bayer run roughshod, only occasionally constrained when public outrage grows vocal enough to jeopardize politicians’ reelection chances. Almost as if representatives care more about their donors’ demands than their constituents’ health.

This isn’t new, of course. Anybody who even fleetingly reads American history knows that, before the rise of antitrust regulation, drug manufacturers fortified their patent medicines with arsenic, cocaine, and opium. Before Congress created the FDA, food companies stretched their packaged foods with sawdust and urine. Government stepped in to enforce baseline safety standards, but now hides behind regulations as opaque as the companies it purportedly regulates.

So yeah, RFK actually understands an important thread in American political discourse: the terror Americans feel at how much power these opaque corporations and government institutions have in our lives. We fear these corporate conglomerates, many of which own significant shareholder stakes in one another. We likewise fear the regulatory institutions which often appear as likely to protect as prosecute the corporations hurting us.

That doesn’t mean RFK is right, of course. His belief in the superiority of “natural immunity” over vaccines, to cite just one example, suggests he thinks people once shrugged off polio, smallpox, and the plague. This is goofy. Like belief in homeopathy or crystal resonances, belief in “natural immunity” only makes sense with a complete unawareness of the history of science and medicine. Thus RFK Jr. is worse than ignorant.

However, progressives ignore RFK’s underlying message at great political cost. Just as we disparaged the Wuhan lab-leak hypothesis simply because Trump said it, we’re now ignoring a history of institutional abuse of science, simply because RFK says it. If progressives want to reclaim these voters from the kooks who have hijacked their concerns, we need to start by acknowledging those concerns. Because they aren’t wrong; powerful people have lied to us.

Wednesday, November 20, 2024

East Coast Jazz-Age Gothic Extravaganza

Riley Sager, The Only One Left: a Novel

The mansion known as Hope’s End has enough terrible stories to make Disney’s Haunted Mansion look sedate and staid. Amid the waning Jazz Era, a shocking triple homicide left an entire family of Back East elite dead and mutilated. Now Lenora Hope, the only survivor, paraplegic and mute, has decided to tell her story. Working at an outdated manual typewriter, she begins unloading on a poor nurse unprepared for everything that follows.

This is Riley Sager’s seventh thriller under this byline, and eleventh overall, in only fourteen years—a veritable assembly line of paperback chills. Working at such a pace, it’s perhaps unsurprising that Sager uses genre stereotypes. Diving into this novel, I quickly recognized allusions, some more overt than others, to Edgar Allan Poe, Daphne du Maurier, and D.H. Lawrence. And that turned out to be just for starters.

In 1983, Kit McDeere knows the Hope’s End murders as merely a lingering urban legend. Pursued by her own demons, Kit accepts a caregiver job for Lenora Hope, mostly because she can’t find anything else. When Lenora proves to be, not a daffy patient just winding down, but a crafty schemer ready to divulge secrets she’s kept hidden since 1929, Kit simply isn’t prepared for the mass of horrors dumped in her lap.

Once lavishly appointed and luxurious, Hope’s End has become a time capsule of Gatsby-era nostalgia. Lenora’s dwindling household staff preserves the house, but also basically keeps her prisoner inside an upper-story suite overlooking the Atlantic. Kit gets drawn into the building’s faded grandeur, but also into Lenora’s story, which Lenora types painstakingly with only her left hand. The secrets, and the decaying house, take on labyrinthine proportions in Kit’s mind.

Sager emphasizes the contradiction between the house’s aspiration and its condition. It’s beautiful, but also literally falling apart. This proves to be a metaphor for the agreed-upon fictions that keep everyone behaving politely toward one another. The staff’s autocratic matron, Mrs. Baker, works hard to maintain the illusion that nothing has changed since 1929. This illusion becomes harder and harder to maintain.

Everything exists on multiple levels. Just as the house is a metaphor for the gentlemen’s agreements binding polite society together, the dirty secrets Lenora wants to divulge are metaphors for the traumas people need to express, but can’t. Kit reads Lenora’s long-buried secrets and becomes a crusader for justice denied for over fifty years, but that’s a metaphor for the doubts and self-blame she can’t address within herself.

Riley Sager

Within these interlocking symbols, one rule of mass-market fiction is: the more assiduously characters believe something on Page One, the more thoroughly they’ll find their expectations inverted on the closing page. We readers have become accustomed to twist endings and big reveals. Therefore, authors will wedge multiple twists into the narrative, because they know we’re keeping suspect lists and testing them against accumulating evidence.

Yes, Sager does that too. I mentioned three writers above whose work Sager channels. As the story accelerates, though, it becomes increasingly clear Sager expects his readers to know story boilerplates mostly from Hollywood, and his narrative assumes a three-act structure familiar from Syd Field screenwriting manuals. I started noticing narrative conventions pinched from filmmakers like Hitchcock, Billy Wilder, and Robert Aldrich.

I don’t say this to disparage Sager’s storytelling… except when I do. In later chapters, Sager’s chapter breaks start occurring on big plot reveals or cliffhangers, sometimes so pointed that readers can practically hear the soundtrack orchestra playing a dramatic flourish. Seasoned readers start observing what film critic Roger Ebert called his Law of Conservation of Characters: every named character serves a purpose, and if we don’t know it, we just don’t know it yet.

Individual readers will decide whether this bothers them. The novel’s filmic qualities give it a galloping pace that keeps readers curious, wanting to have our unresolved questions answered. We find ourselves genuinely caring about these characters, and hurting when they’re disappointed. But as cantilevered revelations accumulate, they start straining readers’ credulity. We start wondering, sometimes out loud: could they really keep such secrets for fifty-four years?

Probably not.

Readers’ ability to enjoy this novel will match the degree to which they’re able to remind themselves it’s a novel, not reality. If readers can enjoy the allusions to classic Gothic romanticists of yore, and the movies they may still have gathering dust in their VHS collection, they’ll probably lose themselves in Sager’s contrivances. The minute we start asking questions about plausibility, we’ll yank ourselves abruptly out of the book.

Monday, November 18, 2024

What Forgiveness Is, What Forgiveness Is Not

Lysa TerKeurst, Forgiving What You Can't Forget: Discover How to Move On, Make Peace with Painful Memories, and Create a Life That’s Beautiful Again

Forgiveness is one of the most necessary, and one of the most difficult, aspects of the Christian experience. When neighbors, enemies, and earthly powers affront us, the Gospel calls us to forgive generously; but our human impulse is to nurse grudges and seek vindication. Essayist Lysa TerKeurst found this in her personal life, when her husband’s infidelity nearly imploded her marriage. So she went in search of what the Bible actually says about granting forgiveness.

I find myself divided about TerKeurst’s findings. Her conclusions are biblically sound, extensively sourced, and balanced by personal experience. She found that, whenever she couldn’t bring herself to forgive, her resentments turned malignant, wounding her far beyond the original transgressions she suffered. When she opened herself to the experience of Christlike forgiveness, she didn’t need to excuse the harm done, or compromise her boundaries. She just stopped carrying her old resentments around in her pockets.

However, I quickly noticed several elements missing from TerKeurst’s exegesis. For starters, though she describes insights she gleaned from her therapist, she cites nothing from science, and little from any extra-biblical sources. She name-drops St. Augustine, C.S. Lewis, and Charles Spurgeon in the text, hardly rigorous scientific sources. Her insights come mostly from personal anecdotes, and her text reads more like a memoir than therapeutic guidance. She assumes you can learn from her personal journey.

I also noticed the near-complete absence of one word from TerKeurst’s text: “repentance.” I don’t recall that word appearing until an appendix. Christians must frequently forgive someone who hasn’t repented or sought to amend their transgressions, because it’s more important to stop carrying that stone ourselves. But I’ve frequently observed that powerful people demand forgiveness before they’ve demonstrated a whit of repentance, placing the burden of transgression on the one wronged, and excusing the transgressor.

We’ve seen this recently in churches. Floods of accusations, not only against religious leaders who have socially or sexually abused their parishioners, but also against church institutions that papered over the abuse, have revealed decades of unhealed trauma. Insurrectionists bearing Christian insignia besieged the American government, then urged voters and legislators to “just move on.” Forgiveness has become an obligation the powerful impose on the masses, not a gift freely given to us by Christ.

Lysa TerKeurst

TerKeurst’s larger text contains important pointers and tools to enact forgiveness in our lives. Again, she roots these insights in her personal experience rather than larger psychological research, but set that aside. Consider her suggestion to begin the forgiveness process by writing down the original transgression and its long-term impact. After reading TerKeurst’s direction, I applied this exercise myself. I found that crystallizing the hurt into words makes it manageable, not vast and insuperable.

She also expounds on what forgiveness is not. Though TerKeurst accepted the struggle to reconcile with her husband, reconciliation isn’t an obligatory component of forgiveness. Sometimes Christians must unburden ourselves of others’ transgressions, but that doesn’t mean allowing those who hurt us back into our lives unconditionally. There’s a wide gulf between forgiveness and being a doormat. TerKeurst dedicates an entire chapter to creating and enforcing boundaries to ensure the offender doesn’t hurt us again.

Perhaps the greatest shortcoming in TerKeurst’s reasoning reveals itself in one fact: after this book shipped, her husband returned to old habits, and she reluctantly admitted her marriage was over. I don’t say this to gloat. Rather, I want to emphasize that the White Protestant fondness for forgiveness, separate from repentance, has consequences. God is loving and merciful, but God is also just, and Christians who elide the need for repentance miss part of the journey.

In the New Testament, the Greek word metanoia is variously translated as both “repentance” and “conversion.” In either case, metanoia signifies a transformation of mind, a complete reorientation of outlook in service of a renewed life. Metanoia doesn’t happen instantaneously, and it isn’t something someone professes verbally. Rather, repentance makes itself known in a life realigned to serve higher goals. Apologizing and accepting responsibility are good first steps, but repentance comes in a reorganized life.

Don’t misunderstand me. Though TerKeurst sets out to write a self-help book, she actually gives us a good memoir of spiritual struggle, one which yields valuable insights, even if—we now know—her struggle wasn’t complete. If we read it that way, we have plenty to learn from her experiences. But one of the necessary lessons is that forgiveness without repentance creates a downward spiritual spiral. Don’t carry burdens unnecessarily, but don’t rush to forgiveness either.

Friday, November 15, 2024

For the Kingdoms of the Earth

“And you will cry out on that day before the king you chose for yourselves and he will not answer you on that day.”
—1 Samuel 8:18, Robert Alter translation
Artistic representation of King David

Samuel, the Hebrew leader who oversaw Israel’s transition from the age of judges to the age of prophets, specifically warned Israel what would happen if they selected a king. The monarch would seize and redistribute the best farmlands, a massive injustice to an agrarian society. He would seize Israel’s farm implements to reforge them into weapons of war. Kings would spend tax revenue on palaces while farmers squatted in huts.

This maintains a pattern recurrent throughout the Bible, the declaration that power hierarchies are inevitably unjust. To theistic minds of the post-Bronze Age Levant, human hierarchies seize power that belongs uniquely to God. As Robert Alter writes in his extensive footnotes, the God of Samuel was explicitly a Hebrew deity, and had literal political dominion over Israel and Judah. Power belonged to God; humans could only act as God’s deputies.

God’s sovereignty worked well while Israel remained poor, agrarian, and simple. A loose confederation of hill-dwelling tribes working the land with bronze implements, Israel needed little, and God provided commensurately. But the Levant witnessed the rise of political empires like Assyria, Babylon, and Egypt. Straddling the land bridge between Africa and Asia, the Levant became a necessary possession for any empire hoping to expand.

It's easy to forget, centuries removed from the “Divine Right of Kings,” that king isn’t originally a political title; it’s a military rank. Kings occasionally made and enforced laws, time permitting. But civilian laws basically existed to organize the population for military purposes: strong sons for recruitment, tradesmen to make arms, crops to resupply the front lines. As I’ve written before, the political state exists fundamentally to bolster the military.

We’re witnessing this in our time. As Israel’s pummeling of Gaza continues after over a year, international Jews loom large among those protesting the violence. While the Israeli state channels national resources toward killing despised outsiders, those who define themselves according to Jewish traditions and values are among the state’s most vocal opponents. The cleft falls along loyalty to the state versus loyalty to Judah.

As Samuel prophesied, King Saul became tyrannical and paranoid—but not without reason. As his military needs became increasingly prominent, he needed constant resupply of resources. Though Saul didn’t undertake many significant military adventures away from the Israelite homeland, his defenses against Egyptians, Philistines, and other flatlander empires became increasingly costly. In the end, Saul died defending the homeland.

Benjamin Netanyahu, acolyte of his country's secular Priesthood

Meanwhile, as Saul became increasingly despotic, David became increasingly popular. The description of young David in 1 Samuel seems remarkably like a combination of Robin Hood and Joan of Arc, a folk hero rallying common folk against the occupying despot. David roams the Levant, gathering followers, but notably never attacking God’s anointed king. Only when Saul’s own overreach gets him killed does David’s rabble army seize power.

My childhood Sunday School tracts always depicted this David: not necessarily rebellious, but certainly young, a friend to commoners, active and popular. The David described in 2 Samuel barely exists in those tracts, because the longer David holds power in Israel, the more he resembles Saul. He’s arguably worse than Saul, because at least Saul died manning the fortifications. David instead sends others to fight, staying home himself and having sex with a loyal soldier’s wife.

International Jews lived in diaspora for two millennia before Zionists reestablished the state of Israel. Jewish tradition holds that, eventually, the Israelite homeland will return, but throughout scripture, that’s always in the future. Diasporic Jews suffered massive oppression for centuries, and in places still do; but they didn’t have to absorb the moral compromise of governing an earthly kingdom. Now some do, and that’s made them massively unpopular.

Consider other world leaders chosen for their outsider status. Barack Obama and Boris Johnson both achieved national power by promising to break with stultifying political conventions. Both accomplished mere shadows of what they promised. In America, Republicans running on anti-statist platforms, like Kevin McCarthy, Paul Ryan, and Donald Trump, all needed to compromise their values to actually govern.

Oh Samuel, you warned us. Kings may require power to defend us, but power will always turn the powerful into the instruments they hated. And now it’s too late; we can’t return to our hill-country farms and our uncomplicated agrarian lifestyles. Because we, too, have become what we once hated: subjects of an occupying force that values only itself.

Sunday, November 10, 2024

Autopsy for an Institution

Vice President Kamala Harris

Conventionally, political parties conduct postmortems following every Presidential election. These self-reflections especially matter after a loss. If parties and their voters can accurately identify what led to the outcomes, they can reverse their losses later. When Mitt Romney failed to unseat Barack Obama in 2012, the Republican National Committee determined their message was insufficiently inclusive. How that led to the pugnacious, bigoted Donald Trump, I cannot figure.

In 2024, Democrats lost the Presidency and the Senate. The House of Representatives remains uncalled, but a Democratic upset appears unlikely. After a campaign anchored on promises to hurt POC, queer people, and dissidents, American voters decided they preferred that over a Democrat with a proven, but workmanlike, track record. With a stranglehold on American government, Republicans stand poised to unleash epoch-making pain on ordinary citizens.

Democrats probably won’t begin their postmortem until January, between the Congressional and Presidential inaugurations. However, I believe it’s necessary to commence now, while feelings remain high. Why, with so much at stake, did millions of Democratic voters stay home? Trump gained almost nothing in absolute vote totals over 2020; his majority margin apparently consists of Biden voters who simply sat out the Harris campaign. How did that happen?

(There’s no concrete evidence of voter fraud; the difference apparently consists entirely of voter apathy.)

Edit: in light of new evidence, it appears that Trump did not win an outright majority. Though he came first in the popular vote, continuing counts indicate that he fell short of the 50% threshold.

Though every election has its own character, one recurrent thread remains evident throughout my lifetime: Democrats desperately want to prevent another 1968 Democratic National Convention. Bipartisan anger at Lyndon Johnson’s mishandling of the Vietnam War, coupled with Hubert Humphrey’s general unpopularity, caused streets to erupt in violence. The brutality made the Democrats look slovenly and dangerous, which handed the general election to Richard Nixon.

Former President Jimmy Carter

Beginning arguably with Jimmy Carter, the Democrats started fleeing their legacy of FDR’s New Deal and Johnson’s Great Society. Teddy Kennedy primaried Carter from the Left for exactly this reason. Unfortunately, this challenge fractured the Democratic coalition and helped hold the door for Ronald Reagan. In autopsying their 1980 drubbing, Democrats decided their lesson was that they needed to tack harder to the center and abandon the New Deal.

Thus commenced the Democrats’ centrist fixation, beginning with Walter Mondale’s historically milquetoast 1984 campaign. 1988 should’ve been Gary Hart’s year, until he imploded following the notorious Monkey Business photo, kicking the nomination to Michael Dukakis. But Mikey-D proved inept and subjected himself to multiple humiliations, handing Republicans their third consecutive Presidential victory, the only time since World War II that the same party has won three in a row.

Dukakis, a self-identified “liberal,” tanked in 1988, but centrist Bill Clinton won in 1992. To this day, many Democrats insist that this proves voters prefer centrists. Democrats decided they needed, again, to chase the center and abandon the New Deal. I’d contend that Democrats’ 1992 victory reflects less a centrist at the top of the ticket than the absence of Lee Atwater from inside the Republican National Committee.

Moreover, though Bill Clinton won twice, he never carried a majority. He probably wouldn’t have carried the plurality if Ross Perot, a former leading Nixon donor, hadn’t split the fiscal conservative vote twice. Facing a visibly fatigued George H.W. Bush in 1992 and an unsmiling Bob Dole in 1996, Clinton was less the popular choice of the 1990s than the candidate Americans could live with. Which isn’t saying much.

Failed candidate Michael Dukakis

Al Gore won a razor-thin plurality in 2000, but lost in the Electoral College. Like Clinton, though, Gore never got a majority. Both Gore in 2000 and John Kerry in 2004 appeared starchy and joyless on camera, which shouldn’t matter, but does. Since campaigns today are highly visual endeavors, driven by television and YouTube, candidates need visual dynamism, which Barack Obama, for all his many faults, had.

Obama, another centrist, won two straight majorities, the first Democrat to do so since FDR. But consider his opponents: John McCain and Mitt Romney, who ran two of history’s greatest room-temperature campaigns. Hillary Clinton, another centrist, got more airtime for her frequent verbal gaffes than her policies. Clinton lost to the more entertaining Donald Trump, who only lost reelection after shitting in the metaphorical swimming pool on national TV.

2024 could offer Democrats the opportunity to shed the illusion of the phantom centrist voter, but it probably won’t. The party squandered precious momentum chasing crossover voters who have probably never existed. So here’s my postmortem, not only for the Democrats, but also for the entire United States:

Doing what we’ve done for fifty-six years hasn’t worked. Let’s change course before it’s too late.

Saturday, November 9, 2024

Anatomy of the Un-Free Mind

Jason Stanley, Erasing History: How Fascists Rewrite the Past to Control the Future

Fascists, of both the small-f and large-F varieties, have a curiously adversarial relationship with history. Their entire political movement depends on myths of past national greatness, which is almost always presented as lost, but which they promise to restore. But they generally despise historians, and attempt to squelch nuanced or conflicting narratives. Briefly, they adore the idea of history, but despise the practice, especially if it requires any self-reflection.

Yale philosophy professor Jason Stanley has written multiple books about how fascists, propagandists, and spin doctors use language and knowledge to strangle public discourse. This book’s title appears to promise a look at how authoritarian regimes rewrite history generally, but in practice, it focuses primarily on academia. Stanley examines how regimes inculcate a spirit, not only of ignorance, but also incuriosity, among citizens at a formative age.

First, tempting though it might be, Stanley stays substantially clear of large-F Fascists. He talks somewhat about Hitler, less about Mussolini. But he mainly focuses on current strongman authoritarian regimes, especially Putin’s Russia and Netanyahu’s Israel. He spends some time on the British colonial empire in Africa and India, his father’s scholarly specialization. And he unambiguously aims his harshest criticisms at Donald Trump’s American brand of anti-intellectualism.

In Stanley’s telling, fascists begin by redefining the purpose of historical education. Their reasoning starts from an intended conclusion—instilling a love of country and an adherence to hierarchy—and retrospectively determines how to achieve that goal. This means having institutional control of textbooks, administration, and personnel. Conservatives have made firing educators and replacing trustees a cornerstone of their recent campaigns.

The process of controlling the historical narrative closely resembles the process of creating imperial colonies; this isn’t coincidental. Autocrats create a hierarchy that, they contend, has always existed. They instill a central imperial language, and make it illegal to speak indigenous languages; not for nothing did British colonialists force the Kikuyu of Kenya onto reservations, exactly as America did to its native population. Because indigeneity is necessarily anti-authoritarian.

Jason Stanley

Here, I wish Stanley went deeper into how administrations silence history among adults. He describes how administrations use schools to prevent passing local autonomy and traditional identities onto the next generation. But how, other than by armed force, do autocrats control adults? Stanley is vaguer here, perhaps because academics began studying the process in earnest only after the atrocities of World War II. Traditional knowledge disappeared quickly, and I’m unsure how.

Mythical history looms large. That might mean presenting Germans as the genetic descendants of ancient Greece, as the Reich did (they’re not), or presenting George Washington as blameless, honest, and certainly not a slaveholder, as American schoolbooks do. Either way, it presents an innocent past that enthrones the dominant population as necessarily deserving power. This mythic past presents history as a constant decline from prelapsarian goodness, which politics must promptly reclaim.

Many critics respond by insisting that “classical education” counters authoritarian overreach. But Stanley counters that there’s no single magic bullet. Classical education can empower intellectual curiosity and resistance to tyranny, if teachers focus on the questions the ancients raised, and if teachers address ways that our morality has changed. But authoritarians love using “classical education” to teach mindless adoration for the dead, which only compounds state-centered mythological ignorance.

Although Stanley focuses on history, he acknowledges this applies to all disciplines. He quotes Toni Morrison, who wrote that choosing the canon of literature is very much about choosing the national culture. When science serves the purpose of politics and industry, rather than inquiry and discovery, scientists always arrive at state-sponsored conclusions. The conventional liberal arts can improve human experience, or they can tie us to autocrats. Fascists know this.

Stanley makes no bones about his motivation. Donald Trump used executive authority to propound a national history curriculum that elided slavery, native extermination, and crackdowns on organized labor. The most extreme forms of American conservatism use the same techniques of historic erasure used to justify Putin’s imperialism or Britain’s conquest of India. Informed, politically invested citizens have a responsibility to reclaim history, both its glories and its tragedies, for the commonwealth.

This breakdown is chilling, certainly for those of us who believe in learning and inquiry, but hopefully also for anyone who just has kids, or loves a free society. Knowing history isn’t just a moral good; it’s a commitment to liberty and democracy. When governments decide what citizens may know, they control electoral outcomes. But the darkness notwithstanding, Stanley’s breakdown assures us that ignorance can be resisted. If we try.

Thursday, November 7, 2024

I Can’t Trust Americans Anymore

Vice President Kamala Harris approaches the podium for her concession speech, 11/6/2024

The Associated Press called the 2024 Presidential election for Donald Trump at 3:30 a.m. where I live, and I’m ashamed to say I was awake for it. And this time, he won without the asterisk that followed his name in 2016: he became the first Republican President-Elect to win with a simple majority since George H.W. Bush in 1988. Of the Americans who could be arsed to vote, a majority voted for this friggin’ guy.

This means that 71 million Americans (as I write) watched him promise to make it harder to be Black, Brown, gay, disabled, a woman, or a dissident, and decided they wanted that. They heard him promise to use the military to purge citizens he considered disloyal, and considered it acceptable. They heard him threaten to shoot members of his own party in the face, and said “Okay.” They watched him fellate a microphone before a mixed-age crowd, and bought what he was selling.

Edit: in light of new evidence, it appears that Trump did not win an outright majority. Though he came first in the popular vote, continuing counts indicate that he fell short of the 50% threshold.

Even beyond the policies he’s promised to enact, policies already costing lives, his comportment in public should be disqualifying. His revolting language about women, minorities, and nonconformists all but guarantees that he’s expressed hatred toward someone you know, possibly someone you love. And tens of millions of Americans considered that acceptable, handing him the nuclear codes. Nothing he’s done in public dissuaded American voters.

Setting aside the question of whether he will, or even could, do everything he’s promised to do, I’m left staring at my fellow Americans, wondering what possessed us to accept this. Because he received a simple majority, a relative rarity in Presidential politics, over half of voters willingly put their names behind Trump’s actions. Shielded by the anonymity of the ballot box, they gave their endorsement to everything he’s done and said for nine years.

In light of this endorsement, I’m forced to ask myself: how can I trust anyone I meet again? When meeting an adult American now, I’ll forever remain conscious that there’s a better-than-even chance this person voted for Donald Trump. Forevermore, I’ll shake hands with potential employers, contractors, landlords, new friends, dates, and ask myself: did this person vote to force my gay friends into conversion therapy? To kick my disabled friends off the payroll?

To grant the police qualified immunity in shooting my Black friends?

Okay, in fairness, not everybody will have equal odds in this sweepstakes. We know, for instance, that men were more likely to support Trump than women. We know that White people, including White women, supported Trump by wide margins. This only increases my tendency, growing since my middle twenties, to reflexively distrust White men. And I say this as a White man: I belong to perhaps the least trustworthy demographic in America today.

Exit polls have shown several demographic breakdowns, though readers should handle such results cautiously, considering how many voters openly distrust media and pollsters. Age, race, sex, educational attainment, population density, and economic class all played into the result. Broadly stated, the older, whiter, more rural, and less financially secure someone was, the more likely they were to support Trump and his anger-based campaign pledges.

Again, I’m incriminating myself. I belong to most of the demographic categories that increase one’s likelihood of supporting Trump, which means I’m arguing against my own interest here. I could easily lapse into tranquility, go with the flow like a dead fish, and do okay under the upcoming administration. People who look like me probably won’t face federal pushback if I placidly participate. Only my willingness to oppose puts me at meaningful risk.

But from that privileged position, I regard my protected status as a responsibility, not a cocoon. Too many of my fellow pasty-faced honky dudes see their position as something that needs defending, a bastion against constant attack by barbarian hordes who want our creature comforts. Given the opportunity to use our gifts to improve the world for everyone, White men have chosen to retrench, and to live in a state of constant paranoia. And it shows.

I’ve read Robert O. Paxton and Timothy Snyder. Within the sloppy, vague boundaries of small-f fascism, Donald Trump meets the definition. And, as a fascist, Trump has achieved something neither Mussolini nor Hitler ever did: he won a straight majority. We American voters, mostly White, mostly male, and disproportionately Christian, have thrown our support behind a fascist in ways completely unprecedented. No matter what happens, we won’t walk this back easily.

Tuesday, November 5, 2024

The Shallow State, Part Two

Keri Russell (left) and Rufus Sewell as Kate and Hal Wyler, in The Diplomat Season Two
This essay follows the prior review The Shallow State.

The first season of Netflix’s series The Diplomat turned heavily on its relationship with then-current events. A career American foreign service officer gets appointed to manage the relationship between an aged American President, terrified of appearing old, and an oafish British Prime Minister who opportunistically seizes on a catastrophe to improve his public image. In the eighteen months since Season One dropped, global politics have shifted violently.

First, Rishi Sunak’s Tory administration imploded, capping a decade-long train wreck that included such questionable luminaries as Boris Johnson and Liz Truss. Almost simultaneously, Joe Biden removed himself from consideration for reelection as U.S. President. This set American politics up for a contest between a highly competent but anodyne Democrat, and a charismatic Republican spouting talking points plagiarized from Weimar Germany. Politics stands idle for nobody.

The Diplomat foregrounds the unelected professionals who make American and British government offices run. On the American side, this mainly includes career foreign service officer Kate Wyler (Keri Russell), who didn’t want the ambassadorship to the United Kingdom, but accepted it because it’s right. Wyler has built her career preventing impending wars and violence. The State Department thinks this makes her a good potential political candidate; she disagrees.

Season One ended with Wyler and her chief ally, British Foreign Secretary Austin Dennison (David Gyasi), believing they’ve discovered a conspiracy running through Britain’s government. Anybody who reads or watches thrillers regularly knows that the more fervently characters believe something in Act One, the more thoroughly Act Three will dash their beliefs. Our only questions are: how will their expectations be upended? And what will replace them?

This matters because the British Prime Minister isn’t directly elected by British voters. Though the PM traditionally sits in Parliament, even that isn’t legally mandatory, just expected. The monarch appoints as PM whoever commands the confidence of the House of Commons, which in practice means the leader of the majority party. This gives the PM extraordinary power and, as Boris Johnson proved, tragically little oversight. Government conspiracies can spread quickly with little impediment.

Season Two runs two episodes shorter than Season One, primarily because it dispenses with character-building. Creator Debora Cahn assumes you remember the characters and their relationships; she introduces few new characters this season, and no new core ensemble members. This lets her dive straight into the action, a move made possible because Season One ended with an explosion, and lingering questions about who survived.

Allison Janney as Vice President Grace Penn

Therefore, for a show driven substantially by dialog, the pacing never feels slow or talky. Every conversation carries weight, and nobody speaks flippantly; the terse, telegraphic language charges every interaction as characters talk bullets at one another. The show bespeaks the influence of Aaron Sorkin’s similarly dialog-driven The West Wing. Probably not coincidentally, this season introduces West Wing alum Allison Janney as Vice President Grace Penn.

But this creates a difficult dynamic with the show’s real-world inspiration. Two seasons’ worth of events have happened in just weeks of story time, while Anglo-American politics has whipsawed drastically over eighteen months. The aspersions cast on President Biden’s age, which Season One name-checked without mimicking, seem dated now. As Kamala Harris tries to sustain Biden’s legacy, the character of Grace Penn seems unexpectedly pointed, and potentially dangerous.

This series emphasizes an important Platonic principle: the people who most fervently desire power over others deserve it least. One achieves political power in modern democracies by showing the people an amiable public face while engaging in backroom negotiations and cutting deals that push the boundaries of legality. Prime Minister Nicol Trowbridge (Rory Kinnear) is an effective leader, insofar as he is one, precisely to the degree that he’s a terrible person.

The same goes for Grace Penn. Season One established that the American government wanted to remove Penn over a scandal. This season establishes that Penn knows it, and seems willing to cultivate Kate as her replacement. However, Kate quickly learns that Penn faces consequences only for the scandal in which she’s been caught. Like Trowbridge, Penn scaled the heights of American politics by sacrificing her morals.

Anyone who follows politics, American or international, learns quickly that purity of heart is for fools. Situations necessary for the common good are often deeply unfair to particular individuals. Life in politics requires candidates to question which of their principles they’ll willingly abandon under pressure. This series forces Kate Wyler, a career civil servant driven by high morals, to ask these questions of herself.

And by extension, it asks us, the audience, what price we’d willingly place on our souls.

Friday, November 1, 2024

The Dark Art of Nebraska Realism

Reina de los Comodines, A History of Bad Men

Cat Taylor loves to spin stories about his romantic Bayou Country heritage, but in reality, he’s lived his life in deep Midwestern disappointment. A stereotypical pretentious drunk, Cat doesn’t speak with his nearly grown kids, but he still aspires to build a relationship with Martha, the downstate girl he met on a dating app. He doesn’t realize that he’s walked into a netherworld he may never escape.

Once upon a time, novelists published their works serially, dropping them chapter by chapter into high-gloss magazines and penny chapbooks. Charles Dickens, Alexandre Dumas, and even Hunter S. Thompson published their best-known works this way, which allowed them to adapt their storytelling to readers’ demands. But since television displaced magazines as the true mass medium, this tradition has largely disappeared from print. Reina de los Comodines wants to resurrect the form.

Big River, Nebraska, is only a two-hour drive for Cat, who lives in the college town of Fetterman, but for Nebraskans, that’s a pretty wide gulf. Martha and Cat meet in The Bar, which, in this narrative, represents Nebraska’s id. Inside The Bar, Cat meets an ensemble cast of working-class Nebraskans who’ve seemingly trauma-bonded over living in a city that modernity forgot. Reina de los Comodines writes herself into this cast.

According to Reina’s pre-release publicity, this novel is a roman à clef, and most of her intended audience will recognize themselves. This probably undersells the actual story. The real Reina was a semi-public figure in the IRL equivalent of Big River, but chose to return to anonymity, as much as media-saturated modernity allows. This lets her depict her bar, and her Nebraska, as a highly symbolic mélange of aspiration and disappointment.

(As an aside, the real Reina lives in Big River, and I live in Fetterman. We met on a dating app. I’m trying not to take it personally.)

In the first two chapters, Cat and Martha try to have their first date, but it starts off rocky. For almost the entirety of those two chapters, The Bar’s denizens have a donnybrook about whether Jason Isbell is real country music. Then Chapter Three takes a sudden turn, leaping several months forward to find Cat and “Martie” on the outs. The story also takes an abrupt tonal shift into magic realism.

Reina de los Comodines

Reading the chapters together, one suspects this later tone more accurately reflects the story Reina prefers to tell. The symbolism her first chapters conceal in subtext becomes overt in Chapter Three. Her authorial self-insert character offers Cat the guidance he needs, but one gets the feeling, reading the nuanced complexity with which Cat responds, that this give-and-take is more internal than Reina admits.

When I say the author writes a self-insert, I don’t mean that as an aspersion or a denigration. She gives the character her own pseudonym, and describes the character exactly as she depicts herself on social media. By writing herself into her story, Reina takes the initiative to tell the characters around her the truth they clearly need to hear, and to receive from them the criticism she needs in return.

Historically, magic realism has enjoyed its greatest popularity in former colonies. Jorge Luis Borges and Edwidge Danticat write from a worldview predicated on the distrust that follows conquest. They present a world in which the Freudian subconscious, which citizens of industrialized empires seek to silence, is both present and real in a physical sense. In the magic realist narrative, language creates reality, and symbols have mass.

That’s what happens in Reina’s third chapter. The argument about whether Jason Isbell is real country music is actually about who gets to control identity in the hinterlands. Do the residents of forgotten agrarian communities like Big River decide for themselves, or do they purchase their identity from corporate music publishers? In the first two chapters, this is subtext. In Chapter Three, it becomes the focus.

It may seem like I’m harping on about just three chapters. Because of this novel’s serial nature, I suspect Reina is still developing themes as she writes. However, I’m eager to see where this story goes, and for her to keep writing, she needs an audience. Therefore I’m willing to review, in real time, a novel that’s still finding its feet, because I feel it’s off to a promising start.

I postponed writing this review because I hoped to read Chapter Four, which was due to drop. However, Reina has a job and a kid, and deadlines are elastic. I only hope to send her work the audience it deserves.

Tuesday, October 29, 2024

“Chemicals,” Food, and You

“3-methyl butyraldehyde is a compound in a blueberry. Think about that.”

Somebody threw this into the ether recently in an argument about whole foods. You know how wise and restrained online debaters are. This person seriously believed they’d made a meaningful point about why people who insist on whole foods and minimal processing were wrong. Because whole foods contain compounds with hard-to-pronounce names, this person apparently believed all arguments for plant-based whole foods are, a priori, wrong.

In full fairness, the other party in this debate (not me) said something equally unfounded. “If you can’t pronounce an ingredient,” this person wrote, “DON’T EAT IT!” This debater apparently believed in the honest wholesomeness of “natural” ingredients, presuming that naturally occurring, plant-based substances must necessarily be healthful. The first debater, meanwhile, placed complete trust in science and technology.

I’ve written about this before; this double-sided fallacy doesn’t bear another unpacking.

However, the 3-methyl butyraldehyde argument deserves some exploration. This person, hidden behind an anonymous sign-on handle and a cartoon avatar, claims that the abstruse chemical constituents within whole foods are essentially equivalent to the additives used in manufacturing processed foods. 3-methyl butyraldehyde, which has both naturally occurring and synthetic forms, appears in many commercial foods, both whole and processed.

Blueberries have several naturally occurring chemical constituents. Some are easy to pronounce, including protein, fat, and especially water. Others are more abstruse, such as hydroxylinalool, linoleic acid, and terpinyl acetate. Though most of these chemical compounds are harmless in naturally occurring proportions, some can be harmful if isolated and hyperdosed. Like most organisms, blueberries comprise a subtle, nuanced combination of substances.

However, no combination of these substances, in any quantity, will come together and form a blueberry, not with current science or technology. One can only grow a blueberry by carefully cultivating a blueberry bush, a commitment of time and effort, as blueberry bushes only produce fruit after two or three years. Chemical fertilizers can sometimes hasten fruiting, but at the cost of starchier fruit, in which starch displaces both nutrients and flavor.

One recalls the common counterargument whenever hippies complain about “chemicals.” Some wag, occasionally but not often a scientist, responds: “Everything is chemicals!” For that argument to work, the respondent must not know, or must pretend not to know, that people say “chemicals” as a synecdoche for synthetic chemicals of unknown provenance, which, under America’s light-touch regulatory regime, are assumed safe until proven otherwise (cf. Rampton & Stauber).

Though the FDA tests and regulates pharmaceuticals (for now), many food additives, cosmetics, chemicals used in haircare products and clothing, and other things we put on our bodies are merely presumed safe, despite years of evidence that this isn’t good practice. Ethylene glycol, cyclamate, and several food dyes were regularly used in American foods before being demonstrated unsafe.

Even beyond safety concerns, reducing whole foods to their chemical constituents perpetuates a dangerous idea. Futurists once posited that food scientists would eventually isolate the basic nutrients in food, effectively replacing the tedium of cooking and eating with the simplicity of gelatin capsules. One finds this reasoning behind the mythology of vitamin supplements, now known to be useless for most people most of the time.

Human digestion doesn’t simply extract chemical nutrients from food like a Peterbilt burning diesel. We require the complexity of food, including fats, fiber, and limited amounts of sugar. I generally side with Michael Pollan’s ubiquitous advice: “Eat food. Not too much. Mostly plants.” Food doesn’t mean chemical constituents. You don’t make a blueberry smoothie by adding 3-methyl butyraldehyde; you make it by adding blueberries.

Please don’t misunderstand. I want to avoid the trap of assuming that “natural” equals good. Reasonable adults know you shouldn’t pick unfamiliar wild mushrooms or handle poison ivy. That’s an exaggeration, but the point remains: nature requires respect, like any other tool. But human agronomists have selectively bred food crops for millennia to maximize healthful content, and apart from occasional allergies, agriculture is broadly trustworthy.

And pretending that food consists only of its chemical compounds is a bad-faith argument. You wouldn’t describe your friend by listing his tissues and internal organs, because humans are more than the sum of our parts. The same applies to food, including fresh ingredients. Cooking natural ingredients, then processing them with synthetic additives to make them tasty and shelf-stable, does change the food.

Pretending not to understand the other person is smarmy and disrespectful, and if your argument requires it, your argument is probably bad.