Tuesday, December 20, 2022

In Dispraise of Realism

George R.R. Martin

A confession: I have only watched a few episodes of Game of Thrones, and none of its spinoff, House of the Dragon. I tried reading George R.R. Martin’s novels, and couldn’t. Nothing against either Martin’s writing, which has a strong voice and command of nuance, or the adaptation’s execution; I particularly like how they took Peter Jackson’s LotR ethos and made it sootier. Martin and his adapters do their work well.

Rather, I’ve struggled with the underlying ethic. Martin is probably the foremost proponent of a fantasy subgenre called “grimdark.” The neologism is sloppily defined; like porn, we know it when we see it. Rather than firm parameters, grimdark fantasy has a core ethos: a pessimistic, petty, captious interpretation of human nature. Grimdark writers read humans as base, vile creatures, and depict this belief luridly.

Martin’s tone isn’t unique. Writers like Genevieve Valentine, Glen Cook, and Joe Abercrombie similarly presume that humans lack core values, that honor doesn’t exist, and that the best we can expect is to find the least awful antihero. Grimdark often exists in direct rejection of J.R.R. Tolkien, who aggressively dismissed religious interpretations of his novels, but who nevertheless wrote from an essentially Catholic belief that justice must ultimately prevail.

This pessimism often gets characterized as “realism.” Both inside and outside literature, many people who pat themselves on the back for their supposed clear-eyed ability to see the truth actively seek the bleakest, most hopeless interpretation of everything. Recall Tolstoy’s famous thesis statement of realism in Anna Karenina: “All happy families are alike; each unhappy family is unhappy in its own way.” Then he sets out to prove it.

Similarly, in politics, “realism” often justifies history’s worst atrocities. Otto von Bismarck pioneered realpolitik, a PoliSci thesis contending that everybody’s a bad actor, and the wisest course is to destroy other nations before they destroy you. It’s easy, with historical hindsight, to draw a straight line between Bismarck and two world wars. Today, American politics is tainted by “race realism,” a primarily conservative principle that race-based conflict is inevitable.

J.R.R. Tolkien

It strikes me, however, that Tolkien and his weird shadow, C.S. Lewis, weren’t ignorant of the worst possible interpretations. Tolkien fought in the trenches at the Somme. Lewis’s allegorical Pevensie family stumbled into Narnia while literally fleeing the London Blitz. Tolkien and Lewis lived through the Twentieth Century’s most ignoble and hideous moments. And despite that, they believed goodness, Christianity, and capital-T Truth still existed.

Lewis himself, in describing Chretien de Troyes’ Arthurian romances, noted that Chretien set his stories in a distant land, at a distant time. (Chretien was French, but in the 12th Century, overland travel was ponderous, and undertaken only with great purpose.) Because, Lewis writes, true virtue is always in another land, at another time. Historically, fantasy has shared that belief, whether Tolkien’s iron-age severity, or Lewis’ playful mythic candyland.

Contrasting the sociopolitical forces that steered Tolkien and Martin, I notice two divergences. First, though both men were baptized and confirmed Catholic, Martin lost his faith as he moved into adulthood. He stopped believing that capital-T Truth existed, or at least that it was knowable. But no substitute source of moral confidence took Christianity’s place. Faced with a panoply of philosophical and theological moralities, Martin essentially surrendered to indecision.

Meanwhile, just as Martin lacked a moral compass, he also lacked a context of urgency. Lewis and Tolkien were shaped by literal war with fascism, which was itself the final form of imperialism. (America take note.) Martin and his grimdark cohort, by contrast, grew up amid the Cold War. They didn’t have the option to win or die trying. Instead, they lived beholden to powerful governments that were deaf to the people’s entreaties.

Actual violence, actual war, convinced Tolkien and Lewis that truth claims matter, and virtue exists. By contrast, the lingering threat of violence, with the pervasive reality that they could neither join in nor flee, convinced Martin et al. that nothing really matters, and humanity is necessarily execrable. It seems belief in truth and virtue emerges, butterfly-like, from the need to act. But the persistent inability to resist evil starves virtue.

And in both cases, True Believers cast their belief in virtue, or lack thereof, onto a distant land unreachable across countless miles and centuries. Because closer to home, we have to acknowledge that neither virtue nor vice is absolute. Real life is sloppy, and doesn’t move according to theory. In either case, Christian idealism or agnostic pessimism, the truth only exists in a mythic land.

Friday, December 16, 2022

The Dark Side of the Street, Part 2

This essay is a follow-up to The Dark Side of the Street.
Catriona Ward

In Catriona Ward’s novel The Last House on Needless Street, major protagonist Ted Bannerman is abjectly terrified of “Mommy.” Ted tiptoes around the house, afraid of transgressing Mommy’s many unwritten rules, or damaging anything that belongs to Mommy. Though he assures us that Mommy’s wrath is sudden and direct, he never actually describes her actions; he only reports her dialog through the filter of carefully edited memory.

So be warned, this commentary will include spoilers for The Last House on Needless Street.

Ted Bannerman draws immediate comparisons to another notorious mama’s boy, Norman Bates. Like Ted, Norman performs elaborate rituals to appease his mother, whose love is contingent on Norman remaining permanently childlike. For Ted and Norman alike, adulthood and the outside world are frightening and uncertain; staying home is infantilizing, but at least it’s reliable. So both accept their mother’s violent control as a tolerable compromise.

Cultural critic Sady Doyle writes that mothers frequently get blamed for sons’ violent behavior. Besides Norman Bates, Doyle cites evil mothers in the Texas Chainsaw franchise, the X-Files episode “Home,” and real-life weirdo Ed Gein. Unlike evil daughters in franchises like The Exorcist, who are presented as solely culpable for their own actions, evil sons are ostensibly molded by their mothers, who are responsible for their sons’ choices.

Doyle wrote before Ward published her novel, so I’m left to speculate how Doyle would handle this “evil mother” fable. Considering how Ward demonstrates familiarity with horror standards, and subverts them to serve her own mission, she probably wrote the Norman Bates parallels deliberately. We watch Ted struggle to refer to “Mommy” using adult language, despite his age, and we’re coached to expect a Bates-like snap in Ted’s psyche.

Alfred Hitchcock (working from Robert Bloch’s novel) presents a mentally ill man as a ticking time bomb, a perpetrator of violence who leaves a carefully orchestrated trail of bodies behind. Sixty-two years later, I’m probably breaching no confidence to reveal that the “Mother” Norman constantly appeases lives entirely inside his head. He’s created an entire secondary identity, and invested it with his own insecurities and terrors of the outside world.

Sady Doyle

Catriona Ward has done something similar. “Mommy” lives entirely inside Ted Bannerman’s head, reinforcing his fears. But Ward subverts that paradigm twice. First, from the beginning, first-person narrator Ted acknowledges that Mommy isn’t present. He knows already that her persistent voice is a self-made delusion, one he attempts to control with pharmaceuticals and beer. Mommy’s exact whereabouts are one of the novel’s lingering mysteries.

Second, Ward acknowledges something other writers either don’t know or don’t admit: the mentally ill are more likely to be victims of violence than perpetrators. Both Alfred Hitchcock and, more recently, M. Night Shyamalan have presented mentally ill people as incipient monsters, who assuage their tortured inner lives by inflicting that torture outward. Ted, by contrast, desperately wants to be normal. He just doesn’t know how.

American culture shuns mental divergence. While Hitchcock and Shyamalan have presented people with mental disorders as violent masterminds, our mental health community has long marginalized even people with minor mental illnesses. As Oliver Sacks writes, in the 1960s, simply admitting to “hearing voices” was sufficient legal grounds to forcibly institutionalize people—even though most people hear disembodied voices occasionally.

Meanwhile, a growing number of Americans, fueled by what are derisively called “mommy blogs” (paging Sady Doyle!), have avoided vaccinating their children. Their stated motivation is avoiding autism, a poorly understood developmental disorder. These parents, derided as “mommies,” have permitted polio and measles, diseases once nearly eradicated, to recur in modern America. Sure, the diseases are real, but the conflation of mental illness with mother-phobic language is concerning.

Ted Bannerman demonstrates how mentally ill people are regularly acted upon by a world that fears them. Rather than monsters and manhunters, the mentally ill are usually traumatized, and that trauma usually happens in childhood. In the final reveal, we discover that Mommy has committed literally every crime for which the world blames Ted. Mommy is crafty, capable of elaborate planning and terrible violence. Ted is small, weak, and traumatized.

I suspect Sady Doyle wouldn’t appreciate the stereotype of shifting blame onto Mommy. In our post-Hitchcock era, that smacks of laziness. Yet Doyle might appreciate how Ward subverts that stereotype by using it to demonstrate the consequences childhood abuse has on actual children. Unlike Norman Bates or Ed Gein, Ted Bannerman reflects how the outside world continues repeating the adult child’s trauma. Unlike Norman, Ted Bannerman just wants to heal.

Thursday, December 15, 2022

The Dark Side of the Street

Catriona Ward, The Last House On Needless Street

Ted Bannerman lives alone with his daughter and his cat, craving the ordinariness of small-town life. But his past won’t let him be ordinary. Eleven years ago, the police tore Ted’s house apart looking for a little girl who disappeared at a nearby tourist trap. Years later, that little girl’s sister hasn’t relinquished the search, and that search brings her to Needless Street, and the Pacific Northwest forest just beyond.

Sometimes I wonder what consequences M. Night Shyamalan has wrought on art. Genre audiences have come to expect twist endings and jolting revelations as their due. We (and this includes me) read or watch distractedly, browsing ahead to claim we accurately anticipated the twist. American-born British author Catriona Ward apparently knows this, and writes with the expectation that readers want a twist. Then she turns that smug expectation against us.

From the beginning, Ward dribbles out cues that horror readers have been carefully programmed to anticipate. We notice which information our unreliable narrators omit, and what inconsistencies they include. We notice, for instance, that Ted lavishes affection upon his daughter Lauren, but she doesn’t live with him full-time. Where does she go when she isn’t with him? Why does she swing abruptly between childlike glee and violent outbursts?

Ward’s story alternates between three viewpoint characters, and the world they describe is inconsistent. Ted is sweet and sympathetic, a big-hearted man-child who simply wants to love and be loved. But he also uses alcohol and pills to quiet a gnawing darkness, and hides indoors to avoid snooping judgment. He’s haunted by his absent but nagging mother, whom he still calls Mommy, despite being in his middle thirties.

Meanwhile, Ted’s cat Olivia wanders the spacious halls of Ted’s Victorian wedding-cake house. Her absolute, undying love for Ted isn’t merely sweet: Olivia believes the Lord commissioned her specifically to provide Ted the care he needs, though Olivia’s actual responsibilities are weird and contrary. Olivia reads her Bible and believes she has everything figured out, until she begins hearing a desperate voice calling from beneath the kitchen floor.

Catriona Ward

Outside Ted’s narrow world, Dee has spent eleven years seeking her missing sister Lulu. Her sister’s disappearance cost her everything: her art-school scholarship, her parents’ marriage, and eventually her father’s life. Finding Lulu has become Dee’s only mission, and her mission brings her to Washington, to Needless Street, to Ted. She’s convinced Ted knows more than he’ll admit. But discrepancies start creeping into Dee’s almost-religious narrative.

I’ll admit something: I almost didn’t finish this book. Indeed, I almost flung it aside early on, when Ted began describing “going away,” an apparent fugue state where hours, even days, disappear from his memory. The pairing of horror with mental illness is a shopworn tchotchke beloved by genre authors. But Ward presents Ted as so heartworn, almost loveable, that I felt compelled to see how this resolved itself.

I’m glad I did. Ted’s illness—given Ward’s storytelling approach, I don’t think revealing it counts as a spoiler—is subverted in ways that prove both sympathetic and insightful. Ted proves capable of both intense love and intense wrath. But exactly who receives Ted’s wrath matters, in ways that aren’t necessarily obvious at first. Ted apparently operates under deeper directions, and his anger serves specific psychological ends.

Because ultimately, Ward uses psychology to tell a complex, humane story. The world Ted, Olivia, and Dee share appears inconsistent to us outsiders because these characters lie to themselves so often, so persistently, that reality has become a footnote. Sure, Ward couches these lies in Jungian archetypes which horror readers will find familiar. But their lies matter less than the truths they strive, and fail, to conceal.

In the final reveal, we understand that Ward has concealed these truths through a certain amount of polite hand-waving. To overextend the Shyamalan metaphor, she’s used camera cuts and restricted POV to withhold information from us, the audience. But again, Ward realizes we readers expect that from our authors, and utilizes that expectation to tell a story that, eventually, exceeds the perceived commercial limits of her genre.

Some potential buyers may be asking themselves whether the book is scary. It’s marketed to horror audiences, after all. Yes, it has the gripping, nuanced dread of masters like Lovecraft and Hitchcock. Ward kept me up past my bedtime. But she also asks what monsters we’re willing to live with, in order to avoid bringing new monsters inside. These characters live with demons, sure. But who, exactly, invited them in?

Wednesday, December 14, 2022

Hello Darkness, My Old Friend

When I was a kid, I was afraid of the dark, as kids frequently are. My father sneered at this fear and refused to give me a nightlight. “There’s nothing there in the dark that isn’t there in the light,” he declared, insisting this ended the discussion. I asked him to prove this; he did so by turning on the light. The disjunction there apparently never occurred to him.

Back in 2019, short-story writer Amber Sparks wrote an engaging essay about the recent rise in credence given to magical thinking, particularly among women. If astrology, divination, and Wicca aren’t more popular lately, they’re certainly more talked about. To Sparks, scientistic thinking is an outreach arm of the patriarchy. Magical thinking may not necessarily be “true,” but to Sparks, it empowers the chronically disenfranchised, and that’s what matters.

Thinking about fear of darkness, I see a more literal manifestation of this exact premise. Fear of darkness is common among those who lack power in our society. This may mean metaphorical power: children and the poor, for whom being out of sight of others means being vulnerable to abuse. I’ve also seen paralyzing fear of darkness among well-off adults who’ve been subject to violence, especially relationship violence.

But absence of power can be more literal, too. Without electric light, the night swarms with forces eager to kill you, or anyway cause you harm. We ordinary able-bodied humans receive the largest fraction of our information about the world through our eyes, which don’t work in the dark. Without light, the night could be riddled with wolves, bats, bears, and more. Campfires aren’t only for warmth; they keep predators away, too.

Amber Sparks (see also)

Think about it: fear of darkness is most aggressively sneered at by people with access to light. We adults subject ourselves to degrading work conditions to keep the electric bill paid. City dwellers flood every corner of their communities with light. We grown-ups punish children for being afraid of the dark, but permit darkness into our lives only under controlled circumstances; I can’t rise to pee at midnight without turning numerous lights on.

Amber Sparks notes accurately that, for most people, most of the time, scientific truths don’t matter much. “Knowing about how a hand moves doesn’t stop it from covering your mouth.” The same applies to darkness. I know, intellectually, that darkness doesn’t invite monsters into my closet. But only shining light into dark corners proves that definitively. My father turned on the light to prove darkness wasn’t scary, apparently without irony.

Because, seriously, maybe there’s nothing present in the darkness that isn’t there when you turn the light on. But without light, how would you know? Even inside my own house, with modern central climate control and light, I’ve ventured out without turning the light on and stepped on prickly burs, cat barf, and a garter snake that got inside somehow. Darkness didn’t put them there, but it caused me to find them with my feet.

Modernist thinking tells us that science, reason, and limitless electric light have banished fears of darkness, fears it calls superstition. Neither the wolves of the primordial forest, nor the demons of medieval folklore, can withstand modernity and its intellect. If we simply swallow our base animal fears, and live in the glorious light of modernity, we have nothing to fear. Wisdom, reason, and manmade light abound everywhere.

Except we know that modernity’s benefits haven’t been distributed equally. Modernism has traveled hand-in-glove with patriarchy, racism, and war. The German pogroms in Poland, and Soviet pogroms in Ukraine, demonstrate how those who think of themselves as thoroughly hip and modern are willing to inflict massive devastation on poor and powerless peoples in order to continue propagating their diseased vision of modernity.

Nearly three years ago, I spent one night trapped in darkness and cold beside a rural Nebraska highway. Afterward, I waxed rhapsodic about my various insights, because I had nothing but my brain for company. But then, I returned to modernity, to electric light and central heat, and within days, forgot nearly everything I’d learned. I resumed abasing myself before my boss, to avoid spending another night in the dark.

My message being: fear of darkness is natural, even good. Just as Amber Sparks claims magical thinking empowers women amid patriarchy, fear of darkness empowers the poor. Those who would chastise your fear are the same people who’d demand your subservience to capitalist hierarchies, and that isn’t coincidental. Fear is your source of power; don’t let “them” steal it.

Wednesday, December 7, 2022

Digital Art and Human Development

Art produced by Dall-E (my collection)

Friends widely embraced the Lensa AI art app this week for one simple reason: it could create faces. This year has seen remarkable advances in computer-drawn art, beginning with Dall-E, which took social media by storm this summer. But Dall-E couldn’t draw faces. Human likenesses came out pinched, distorted, with nightmarish proportions. Another popular generator, Midjourney, did better faces, but they were cartoon-like and whimsical.

Lensa, by contrast, could read users’ selfies and return artistic renderings. Though the pictures aren’t yet realistic, the program made value choices, producing finished portraits that emphasized certain features the way trained artists would. My friends uploaded their selfies, knowing they were relinquishing rights to a massive AI clip-art pool, because the reward was painterly renderings of themselves as wizards, spacemen, and cats.

Pushback began almost immediately. Not only did Lensa pinch your uploaded images, the claims read, but it also pinched legitimate artists’ hand-made work. The app created painterly renderings because it modeled itself on actual painters, the techniques they used, the decisions they made. AI programs are already putting human artists out of work, though not yet in huge numbers. The ethics were already creating headaches, and they’re likely to get worse.

I’d like to focus on a different ethical consideration. Yes, the human economic challenge is serious: just as American industrial jobs moved to Japan and China, where labor was cheaper and supply lines were shorter, AI art is likely to displace human artists because the renderings are nearly instantaneous. People are unlikely to pay more for slower human-made art just because doing so is right. But the problem begins much earlier than the point of sale.

Art produced by Midjourney
(Wikimedia Commons)

Many schools begin teaching art in preschool. We teach children to draw with charcoal and Conte crayon, to play “Alouette” on a plastic recorder, and to sing in unison, because these represent important developmental milestones. Art and music teach children important skills of hand-eye coordination, the ability to sit still, and patient dedication to tasks where the reward isn’t always immediately obvious. Children need all these skills in multiple disciplines.

As I’ve gotten older, though, and spent time on both sides of the classroom, I’ve realized students learn even more from art. When drawing, students learn what to include, and what to omit. The entire process begins by reducing the forms we’re rendering—landscapes, buildings, human bodies—to a few essential lines. Then the artist makes choices about which shapes, colors, and textures to incorporate. These are important value choices.

Anybody who’s witnessed the products of a freshman life-drawing class knows that a roomful of art students will produce very different images of one model. Photographers may get slightly different images from the same model, depending on choices like shutter speed, angle, light saturation, and Photoshop aftereffects. But human artists, looking at the same model, can produce wildly divergent paintings of the same base human form.

These choices are subjective, but not value-neutral. The proportions of the human form, like the proportions of musical notes in a composition, have mathematical relationships, and artists make choices about how much they value harmony, synchrony, and continuity. These choices speak to the artist’s inner process. But they’re also external, and speak to what kind of reaction the artist hopes to produce in the chosen audience.

Art produced by Lensa (promo image)

Someone must make similar choices with AI art. The person writing Lensa’s prompts, of course, chooses whether they like the image produced. But some programmer deep within the bowels of a Silicon Valley industrial park also chooses what proportions, textures, and colors the program will emphasize. That programmer made these choices months, even years, before the artwork was produced. But that person made these choices, on your behalf.

We already know that recommendation algorithms on streaming services like Netflix and Spotify both empower and circumscribe our choices. We receive access to more musicians and filmmakers whose work we might never have discovered. But we also wind up encountering fewer kinds of artists, and wind up watching or listening to the same genres repeatedly. Our tastes become deeper, but narrower.

If programs replace artists, as seems likely, these AI art generators will have similar effects on students’ ability to make choices. As art becomes instantaneous rather than a process, students are likely to lose the ability even to see the choices being made, much less implement them in their own work. Their worlds will become narrower. If choices are instantaneous, why make any choices at all?

Friday, December 2, 2022

Take This Badge Off Of Me

1001 Movies To Watch Before Your Netflix Subscription Dies, Part 47
Sam Peckinpah, Pat Garrett and Billy the Kid

Pat Garrett (James Coburn) swaggers into the old Spanish mission where the outlaws congregate, showing off his newly minted badge. Not long ago, he was one among this anarchic bunch, but the territorial government has empowered him to bring law to the frontier. Garrett warns his old friend William Bonney (Kris Kristofferson) that the government wants Garrett specifically to bring down Bonney. Here’s your first and only warning, Garrett says: get out or get killed.

I remember first watching this movie as a teenager, impelled by the reputation of director Sam Peckinpah, and by its famous soundtrack, composed and recorded by Bob Dylan. The classic “Knockin’ On Heaven’s Door” comes from this movie. As a kid, I was appalled. I watched as it ticked down a list of traditional Western character types, all of whom died in the kind of bloodbath for which Peckinpah is famous.

What I didn’t understand, watching it with a kid’s wide-eyed situational ignorance, was the context in which this movie was made. Peckinpah himself had become an unlikely countercultural hero. Though twenty years older than his principal audience, he understood their Vietnam-era malaise. He also understood the self-destructive violence which drove characters in mercilessly explosive classics like The Wild Bunch and Straw Dogs, as he feuded with production houses and self-medicated to control his bipolar disorder.

William Bonney, good-looking and rife with charm, ignores his friend’s warnings. He came west to escape Back-East law, and reinvented himself as Billy the Kid. For him, outlaw status isn’t a failure to obey the law; it’s an act of obedience to his truest self. He believes the frontier myth of complete autonomy; he sees the West as Rousseau’s “state of nature.” Billy scorns Garrett, leaves the mission, and returns to his career of rampage.

But if Billy has become the amoral extreme of libertarian thinking, Garrett has become the opposite. By accepting a badge, he also accepts the Back-East government’s strict Calvinist interpretation of law. Humankind, he now believes, is irredeemably sinful, and must be restrained by law. As a lawman, his right to enforce order upon everyone else is absolute, and he doesn’t care who gets hurt in pursuing that goal. Law is Garrett’s goal, not a means.

Kris Kristofferson (left) and James Coburn in Pat Garrett and Billy the Kid

Here’s where my teenage self reacted with initial revulsion. Raised in an atmosphere of country music, John Wayne movies, and Zane Grey novels, I believed American Western myths of hard-bitten individualism and frontier pluck. Peckinpah chooses the most extreme versions of that mythology: the strict lawman who believes the frontier must be tamed, and the wild-eyed outlaw who glories in the absence of law. The two must fight, and destroy everyone who comes between them.

I failed to understand the context. Peckinpah released this movie in 1973, the same year America’s involvement in Vietnam juddered to an unsatisfying halt. The American myth had devolved into extremes, not dissimilar to those depicted onscreen, and likewise destroyed everything that came between them. Even Bob Dylan’s involvement, as both composer and strange, enigmatic character, served to repudiate the entire previous decade. He’d been reduced to “Alias,” a character with no name or loyalties.

And yes, Peckinpah destroyed every Western stereotype. I saw that correctly, but utterly misinterpreted it. In one memorable scene, veteran cowboy actor Slim Pickens plays a sheriff conscripted into Garrett’s posse. A gunfight ensues, and Pickens is gut-shot. The camera lingers over him trying to hold his entrails in, eyes wide, while his wife tries to comfort him. The iconic Western character is dying, and knows it.

Pickens would parody this trope the next year, in Blazing Saddles.

Unlike in Peckinpah’s earlier films, made when he was relatively sober, these characters have limited motivations. Unlike, say, David Sumner in Straw Dogs, Billy and Garrett don’t learn or change with the story; they simply possess absolute morals, and kill to support them. But like jazz, this movie isn’t about what it’s about. Billy and Garrett don’t learn, but everyone caught between them certainly does. The lessons are cold, and usually final.

I misunderstood this at 17, because my upbringing strictly refused to accept the lessons of the post-Vietnam era. Worse, looking around today, watching my country getting shredded by a similar adherence to absolute morals, I see I wasn’t the only one who didn’t learn. Today’s moralists even use the same cowboy imagery I grew up with. And now, like then, those who follow absolute morals aren’t hurt, but those caught between them are getting killed.

Wednesday, November 30, 2022

The Violence Metaphor in Modern Politics

Mike Lindell in an uncharacteristically composed and businesslike moment

Mike Lindell, “the My Pillow Guy,” has announced his intent to run for the chairmanship of the Republican National Committee. Lindell, 61, has no prior experience in elective office or party organizing, and apparently no particular political experience beyond campaigning for Donald Trump. He’s spent the last two years yelling directly into the microphone on all-night basic cable about how Trump got ripped off, and in his mind, this apparently qualifies him for political authority.

Lindell’s on-camera outbursts started out comical, and for some people, still are. NBC comedian Seth Meyers still mimics Lindell’s grotesque behavior for cheap yuks. But more than two years later, his tantrums have crossed the line into something else, something downright tragic. Either he continues mishandling the microphone because he knows True Believers find his emotional outbursts persuasive, or something far darker is going on. It’s possible he has nothing left besides his spitting anger.

Over four years ago, I wrote that anger has become modern conservatism’s only emblem of seriousness. I wrote this after both Brett Kavanaugh and Lindsey Graham pulled Lindell-like antics inside the Senate Chamber. But four years ago, such anger was an isolated display. Lindsey Graham’s prior political career was predicated on displays of folksy charm that belied his deep party machine connections. And Brett Kavanaugh was previously (and has largely returned to being) publicly anonymous.

Since then, these displays have become something much more persistent. Mike Lindell’s two-year rage bender is perhaps an extreme example, crossing the line from pathos to bathos. But earlier this year, in the state where I live, Nebraska, an entire cadre of political neophytes seized control of the state Republican party, driven largely by displays of rage heaved clumsily into the microphone. This “leadership” then got Jim Pillen elected governor by stoking fears of CRT.

On the local level, another political novice, Paul Hazard, got elected to my city’s school board. Before running for office (he previously failed to get elected to the city council), Hazard was a state trooper—which, I shouldn’t have to specify, isn’t the same as being an educator. His qualification (singular) for office is his track record of getting screaming mad during school board meetings and town halls and, yes, shouting directly into the microphone.

Yeah, it's safe to say Justice Kavanaugh resembles Senator Graham

All these candidates pale, however, before the tragicomic spectacle that is Herschel Walker. Like his idol, Donald Trump, Walker has difficulty stringing together coherent sentences or staying on topic. This is perhaps understandable in Walker’s case: after fifteen years of professional football, he probably has Chronic Traumatic Encephalopathy, a common brain injury in retired football (hand-egg) players. Also like Trump, Walker shows no prior familiarity with, or interest in, government or the entire legislative process.

More important than Walker’s tragic incompetence, however, is his behavior. Herschel Walker is a demonstrably bad person, with multiple credible accusations of domestic abuse, child abandonment, and attempting to induce various girlfriends to get abortions. Public progressives question how Republicans, the “party of family values,” can get behind a man who’s abandoned so many children and paid for so many abortions. Walker, like Trump, spits in the face of the values they claim to uphold.

However, placing Walker in the context of Donald Trump, Brett Kavanaugh, and Mike Lindell, a definite pattern emerges. Today’s Republican Party sees anger and violence as leadership qualities. Displays of macho savagery are more important to Republican leaders than competence, experience, or familiarity with the law. A leader, to today’s Republican party, isn’t someone who unites or gives vision. A leader is somebody who socks enemies in the jaw—even “enemies” like his own ex-wife.

Seriously, look at whom Republicans have supported recently. Donald Trump’s entire career was predicated on the promise to bring the hammer down on dissidents. Paul “ACAB” Hazard pledged to bring the same roided-out anger to the school board as he did to the highway patrol. And neither Mike Lindell’s nor Herschel Walker’s candidacies are predicated on policy, as they have none. They simply promise to shout down, or punch, anybody who dares to challenge them.

And unfortunately, this seems to work. Republican voters keep getting behind this series of reality TV stars, athletes, and camera-huggers. Herschel Walker is currently in a statistical tie with Raphael Warnock, an incumbent Senator, minister, and career civil activist. Real or promised violence has become the motivation of today’s Republicans, and it works. Faced with a panoply of digital-age problems, one major party pledges to answer with Stone-Age force, and their voters clamor for more.

Monday, November 28, 2022

Religion vs. Empire, Then and Now

Christian Scripture provides two birth narratives for Jesus Christ, and they’re kind of similar, but not the same. Matthew depicts Jesus born in Bethlehem, but says little about what came before; the dominant earthly power in Matthew’s Gospel is Herod the Great. Luke provides lengthy pre-birth anecdotes about Mary and her cousin Elizabeth, Mary’s interaction with the archangel Gabriel, and Mary and Joseph’s relocation to Bethlehem. Luke emphasizes Rome’s earthly authority, embodied in Quirinius, governor of Syria.

Either way, this narrative emphasizes that Jesus was born into an occupied nation. Stephen Prothero describes Israelite religion and national identity as defined by occupation, exile, and the promise of return. Matthew, a Jew, and Luke, a Greek, disagree over which occupation matters more, the domination of an Idumean client king or Roman conquest. What matters, though, is that Jesus didn’t arrive during the brief window of Maccabean independence, but under the shadow of Empire.

Empires carry laws with them, and always try to standardize morality. Consider American history. The nation couldn’t survive with slavery in some regions and not others; we fought our bloodiest war to finally end the institution. Afterward, we couldn’t survive with sectional racism, so, using nonviolent means like case law and legislation, the government managed to distribute segregation and Jim Crow nationally. This national distribution was so effective that we’ve been unable to reverse it.

Laws exist to standardize national morality, and unfortunately, America demonstrates how effective that is. That’s why it matters that Jesus came during a time of national occupation: because Roman law inevitably made changes upon Jewish morality. Maybe not ordinary Jews, or not all ordinary Jews anyway, as sects arose devoted entirely to anti-Roman resistance. But those who maintained power over ordinary Jews, in fixed buildings in large cities, needed to assimilate or get cut down.

Herod’s Temple in Jerusalem served the same ecclesial role as American megachurches today: a gathering place where ethnically and philosophically diverse peoples reasserted their one shared identity. Though Jews were nominally descended from Abraham, Isaac, and Jacob, in practice, anybody who followed the Levitical law became Jewish. Likewise, “Christian” describes a panoply of religious and philosophical premises; but in churches, we gather together, speak the Apostles’ Creed and Lord’s Prayer, and become one.

Yet the comparison between Herod’s temple and American megachurches carries dark implications. As Obery M. Hendricks points out, the temple priesthood wasn’t just religious leadership, it was a proxy government. The Antonia Fortress, the command garrison of the occupying Roman legions, was built into the temple walls, and the priesthood had police authority over the Jewish citizenry. Though ordinary Jews lived with, or sometimes resisted, Roman occupation, the temple priesthood actively enforced occupying Roman law.

In the last two years, as I write, hard-right Christians have gone from denying Christian Nationalism exists, to openly embracing the term. Authors like Stephen Wolfe have published their blueprints for a religious state dominated by clergy with billy clubs. Wolfe presents this as Christian dominion, and promises to use police authority to enforce religious law, punishing heretics and atheists. But his list of supposed religious “heretics” includes feminists, BLM supporters, and even Christian Socialists.

Wolfe demonstrates how religion under occupation comes to resemble the occupying power. Gone from Wolfe’s faith, and that of other Christian Nationalists, is the prominent anti-state stance taken by earlier public Christians, from William Lloyd Garrison to MLK. Religion for Christian Nationalists, as for the temple priests in Herod’s Jerusalem, becomes a matter of obeying laws, not of doing right by those least able to defend themselves. Righteousness and justice become whatever the state-friendly priesthood says they are.

Throughout his ministry, Jesus remained an observant Jew. He taught several of his most important lessons in synagogues, in the Temple, and at potluck dinners. Yet he disdained observance of law as its own moral good. Religious leaders, who earned their living enforcing laws, frequently castigated Jesus and his followers for flouting cleanliness rituals, food protocols, and Sabbath observance. To the priesthood, law had become its own justification, regardless of its impact on the poor and defenseless.

Versus the law, Jesus presents something altogether more difficult: facing each situation as it exists, and seeking the truth. The truth for the Rich Young Ruler was that he needed to part from his possessions. For the Samaritan Woman at the Well, the truth was that she needed to stop hiding from her past. Knowing the truth isn’t easy; it means going within, spending time in study, and taking a risk. But truth, not law, saves.

Monday, November 21, 2022

Nonpartisan Jesus vs. the Apolitical Church

Andy Stanley, Not In It To Win It: Why Choosing Sides Sidelines the Church

Atlanta megachurch pastor Andy Stanley endured intense criticism, some of it borderline violent, when he paused in-person worship during the pandemic. He never wanted to become a political lightning rod, but when angry parishioners confronted him with cable-TV talking points, he realized that's exactly what he'd become. To Stanley, political alignment is an abdication of the gospel message. I see where he's coming from, but I can't bring myself to agree.

American Christianity, in Stanley's mind, has become too invested in winning. Whether this means winning arguments, or winning elections, or winning the "culture wars," Christians seemingly care more about worldly victory than eternal truth. When we focus on winning America, Stanley says, we lose Americans. Earthly victory is eternal loss; we gain the world, as Jesus said, and lose our souls. Not exactly a fair trade.

This strictly partisan interpretation of Christianity doesn't jibe with historical Christ followers. Jesus spoke of the "kingdom of God," and importantly, he meant it. Ancient Rome didn't distrust Christians because they prayed more, but because they pledged allegiance to another King. The idea of Christians taking partisan sides in the frequent Roman civil wars would've made no sense, because Christians were already deemed disloyal to Earthly kings.

So far, Stanley and I agree. Jesus didn't come to endorse partisan alignment, and certainly not to defend existing power hierarchy. This applies across political parties, since both parties want power; the Messiah who promises to "make all things new" wouldn't support either party that wants power at others' expense. Jesus didn't call us to win, but to love, which we can't do if we're busy divvying the world into winners and losers.

Stanley goes even further. Not only did Jesus not come to win, but Jesus specifically came to "lose," by Earthly standards. Because this world is full of losers: the poor, the disenfranchised, the outcast, the ritually unclean. Stanley spends several chapters exploring this from different angles, but his upshot is basically that Jesus loves losers, not because they're especially holy, but because they're losers. We can't follow Jesus' example if we're busy winning.

Pastor Andy Stanley

Okay, in broad strokes, Stanley and I agree. I support his overarching thesis, because I disapprove of not only the recent rise of Christian Nationalism, but also the progressive Christian response that hides in government's skirts. Both options accord more power to human institutions, which necessarily robs power from "the least of these," the sheep that Jesus called Peter to feed.

But Stanley and I diverge on what that means in practice. Stanley evidently believes Christ's church should remain, in his words, "apolitical." Though he admits he began writing this book because he was appalled by the vitriolic partisan response to his pandemic policy, he seems to think that the response is to stand above the fray. The mere fact that he couldn't, that his attempt to avoid politics was interpreted politically, doesn't change his mind.

Millions of Black American Christians have learned that simply being alive, and loving as Christ first loved us, is a political act. That's true when Christians organize deliberate resistance to worldly injustice, as Dr. King and Howard Thurman proved. But the 16th Street Baptist Church showed us that, when your existence threatens the status quo, this world's powers don't need legal justifications to hate and destroy.

As Stanley himself says, becoming a Christ follower should mean a radical reorientation of life. We're called to love those this world doesn't love. That may mean, like James and John, leaving our family; it may mean, like Matthew and Zaccheus, leaving the protection that government authority grants. Loving others threatens "the world," which thrives by creating in-groups and exploiting our fear of strangers.

What Stanley seems not to grasp is that the radical love he advocates is indeed political. It knowingly subverts the world's widespread desire to name enemies and seek victory, which powerful people use to preserve their advantages. Refusing to play the world's political game isn't "apolitical," despite Stanley's claim. Jesus refused to either acknowledge or deny Pilate's authority, an action which spat in Roman authority's eye.

Again, I agree with Stanley, broadly. When Christians strive to "win," we play into Earthly power structures that divide the world into winners and losers, an unloving, anti-Christian attitude. He's right that Christ calls us to lose the world. But Stanley is wrong to describe this as "apolitical." The anger political insiders show when we refuse demonstrates how wholly political that refusal is. We can't stand aloof from the consequences.

Saturday, November 19, 2022

The Role of Art in a Divided Society

A still from Robert Wise and Jerome Robbins’ 1961 film of West Side Story

Sometime in the 1990s, I’ve forgotten exactly when, my sister’s high school theater program staged the classic musical West Side Story. Because of course they did; it’s standard theatrical repertoire. The only problem was, her school (she and I attended different high schools) was overwhelmingly White. The drama of urban tension between Hispanic and Irish communities was played by farmers’ kids of mainly German and Czech heritage.

This meant, as you’d expect, brownface. Students playing the Puerto Rican Sharks gang dyed their hair, darkened their skin, and affected Latino accents. The White Jets, meanwhile, learned a stereotyped “New Yawk” accent and got ducktail haircuts. These students, who were entirely White and had lived in Nebraska for most or all of their lives, immersed themselves in playing ethnically mixed East Coast characters, not always in the most sensitive ways.

Around twenty-five years later, my sister recalls that performance with a visible cringe. Troweling on makeup to play ethnically clichéd characters, which seemed broadly acceptable then, is patently unacceptable today. Nobody, except a few high-profile heel-draggers like Megyn Kelly, would pretend otherwise. But without the willingness to play characters who didn’t resemble themselves, I contend, these students would’ve deprived themselves, and their community, of something important.

West Side Story remains important theater, sixty-five years after its debut, because it addresses an important American cultural problem. The Jets and Sharks, defined by their race, attend the same high school and walk the same streets. But they never communicate, because they believe long-held bigoted myths about one another. When Tony and Maria dare fall in love, it transgresses one of America’s most cherished internal borders, the color line.

I’ve written before that teaching youth the humanities matters, because through art and literature, students see other people as fully dimensional human beings, with thoughts, feelings and dreams equal to their own. West Side Story reminds us that anybody, raised on such myths, could wind up believing them, and embracing the violence such division brings. Racism, this play reminds us, isn’t inevitable; it’s a choice we make, and keep making.

Arguably, that’s why White actors playing Brown characters is usually pretty dubious. If my sister’s high school had sufficient Hispanic actors to play the Sharks, they should’ve cast accordingly. No matter how sympathetically those student actors attempted to portray characters who were culturally or racially different from themselves, they’d inevitably resort to stereotypes, sometimes hurtful ones, of people groups they’d never actually met.

A still from Steven Spielberg’s 2021 film of West Side Story

But simultaneously, if the school refused to perform this play, nobody would’ve had the opportunity to receive its message. Not the student actors, who needed to stretch beyond their limited small-town experience, nor the audience who, in Western Nebraska, seldom get to witness world-class art. Beyond the high school, getting to see top-tier theater means traveling to Omaha or Denver, and most people can’t spare that much money or time.

This elicits the question: is the message important enough to accept less-than-optimal messengers? I don’t want to be mistaken for advocating brownface; the specific event I’m remembering belongs to its own time and place, and should remain there. But the event gave students and the community an opportunity to see people whose lives and experiences were wildly different from anything experienced locally. Even if those “people” were actors.

Questions like this will become more important in coming years. In 1957, when West Side Story debuted, Manhattan’s Upper West Side was predominantly working-class, racially mixed, and volatile. Within five years, the combined forces of gentrification and White Flight changed local demographics. By the 1980s, the Upper West Side was heavily populated with yuppies, while the ethnic communities celebrated onstage had been forced into dwindling enclaves.

The White small town where my sister attended high school has experienced something similar: there are now considerably more Hispanic residents, and even a few Black residents. Because the Hispanic residents are mostly agricultural workers, though, they seldom mix substantially with the community. Interactions with what locals call “Mexicans” happen in public places, like grocery stores; the actual community members seldom get to know one another beyond nodding hello.

Artistic expressions like West Side Story will matter more soon, as American society becomes more segregated, more hostile, more like the Sharks and Jets. Opportunities to see “the Other” as equally human to ourselves might make the difference between peace and violence. And sadly, not everybody will have access to racially representative casting choices. Cross-racial casting isn’t ideal, but it’s better than denying audiences the art they need to see.

Friday, November 18, 2022

Thanksgiving and the American State Church

Not the original photo (source)

I fear that, somewhere near Albany, New York, a TV station still has news footage of seven-year-old me wearing fake Native American war paint. I’d made a war bonnet from construction paper and an old terry-cloth headband, and wore it to the second grade Thanksgiving reenactment at Howe Elementary School, in Schenectady. I was the only student there representing the Native American side.

Every year, countless American grade schoolers make black conical “Puritan” hats out of construction paper and craft glue and replay the “first Thanksgiving” in mid-November. These performances are crinkum-crankum, and for good reason. The first Thanksgiving is part of American state religion, and reenacting it serves exactly the same purpose as children’s Nativity pageants on Christmas Eve: it forces us to verbally commit ourselves to the faith and morality represented.

Except, that faith isn’t equally represented. In every grade-school Thanksgiving pageant I remember, nearly everybody dressed as English Pilgrims. The boys wore uniformly somber men’s costumes, with buckles on their hats and shoes, while the girls bundled their hair into off-white bonnets and carried fall flowers against their pinafores. Nearly every year, the Wampanoag Indians were verbally acknowledged, but not present.

In 1982, in consultation with my parents, I decided somebody needed to represent the Indians. We didn’t really know what that meant. Thanksgiving history usually focuses on Pilgrims surviving a tumultuous winter, then learning (in passive voice) to plant maize and hunt wild turkey. In seasonal art, the Wampanoag are usually represented by one or two shirtless Brown men with feathers in their hair; the art emphasizes White people and their massive chuckwagon spread.

My parents are generally conservative, never-Trump Republicans, but they’ve always had a soft spot for Native American history. In the 1980s, though, their idea of Native Americans wasn’t differentiated by nations and regions; they believed a broad pan-American indigenous myth that mostly resembles Plains Indians. So that was our pattern, and I attended that year’s Thanksgiving pageant dressed as a White boy’s homemade idea of a Ponca warrior.

Forty years later, I struggle with this. By any reasonable standard, this was cultural appropriation: I, a White person, took it upon myself to tell the BIPOC story. But if I didn’t, who would? There was literally nobody else willing to speak that truth, that the Wampanoag existed and participated in that pageant. Without my clumsy, stereotyped mannequin, the Native American voice would’ve been completely excluded from that American myth.

A common clip art of the First Thanksgiving, with benevolent
Englishmen and highly stereotyped Native Americans

Our Thanksgiving pageant was considered newsworthy, and broadcast on regional TV, because our class partnered with the Special Education classroom down the hall. We were deemed a beacon of inclusiveness. Though both classrooms were entirely White (with an asterisk: several Jewish students), regional media wanted to praise our efforts. Camera crews, helmed by a pretty young human interest journalist, captured the whole event.

Because I was a kid, and this happened forty years ago, I don’t remember the event itself at all. My one clear memory is watching the news from Albany that evening to see our story. At one key moment, the camera zoomed in on me, the only Pretendian in the room, with my brightly colored acrylic “warpaint” and my war bonnet held together with hot glue. The journalist didn’t say anything. My presence was sufficient.

My family and I felt pretty good about that. Somebody, however feebly, stood up for the Native American presence at an important White mythological event. Forty years later, I can only remember that moment with a combination of pride and cringe. A White kid, amid forty other White kids, dressed as a Plains Indian in a Massachusetts harvest festival? The cheek of it! But… but it matters that somebody said it.

By today’s standards, that tin-eared display of cultural goulash was wildly inappropriate. But I also stood in the assembly to remind everyone, in this moment of American state church, that our mythology needed to be broader than it was. We not only preached that counter-myth to two second-grade classrooms, but with media assistance, our message carried regionally: Native Americans were there, and deserve representation.

I wouldn’t do that again, certainly. And if I had kids, I’d think long and hard before encouraging them to do likewise. But for all its ham-handed stereotyping and cultural appropriation, I also wouldn’t undo that event. Somebody needed to say it. Somebody needed to remind the American state church that its mythology has excluded too many people for too long. Maybe I was a clumsy, childlike prophet, but at least I said it.

Wednesday, November 16, 2022

Thoughts on the Importance of Creation Myths

Michelangelo's The Creation of Adam, from the Sistine Chapel ceiling

Plant ecologist Robin Wall Kimmerer begins her book, Braiding Sweetgrass, by comparing the Potawatomi and Christian creation myths. Kimmerer positively contrasts Skywoman, who builds the Earth from music and faith and communion with animals, with Eve, whose only described accomplishment is failure and exile. Not only does Skywoman promote harmony with nature, Kimmerer believes, but Eve encourages attitudes of fatalism and misogyny.

I won’t say Kimmerer is wrong, because she isn’t. But the more versed I become in comparative religion, the more I believe her correctness is conditional. The Skywoman narrative describes the creation of a people defined by their relationship with one place and the land. Adam and Eve describes a people defined by exile and return. The Hebrew Masoretic Text is bookended by Israel’s exiles in Egypt and Babylon, and their respective returns.

Religious absolutists generally take their creation myths seriously, and one of the Twentieth Century’s greatest controversies has been how to teach, for instance, science in light of seven-day creationism. But I contend that religious creation myths are only literally true for those who have forgotten why those myths were written. Creation myths don’t pretend to accurately describe how the world came into being; rather, they describe their authors’ identity.

Adam and Eve are doomed to wander, not because their creation myth is fatalistic, but because Israelite history is one of resident aliens amid occupying nations. Arguably, Adam and Eve, who are vaguely defined characters, are less important as creation archetypes than Cain. Christian interpretations of Cain and Abel characterize Cain as the antagonist. But perhaps Cain, both exiled and protected by God, is the actual Israelite ancestor.

The Native American creation myths I’ve read generally spotlight either an Animal, such as Coyote in many Southwestern myths, or a woman, such as Kimmerer’s Skywoman, or the Corn Woman common in many narratives. Either an animal spirit, a maternal spirit, or both brings forth reality. Humans, in these stories, are afterthoughts. Our world isn’t a habitation, as in Abrahamic religions, but a responsibility, one which White invaders habitually shirk.

While American schools cope with how, and whether, to teach Abrahamic creationism, the real mythological battle takes place in history classes. Politicians and educators feud mightily over how to teach American history, because the narrative we learn in public (state) schools—the only narrative some students ever learn—defines how we perceive ourselves as a nation. The official history has become a creation myth.

The frontier as depicted by Currier and Ives (click to enlarge)

This isn’t metaphorical, either. We literally learn a sanitized version of history because, like Eve and Cain, the narrative is about us, the living. Where I live, in Nebraska, we learn just-so stories about plucky settlers who walked overland, Conestoga wagons in tow, to claim and domesticate an unsettled prairie, pulling crops from uncooperative soil. Agrarian industriousness is Nebraska’s state religion, Laura Ingalls Wilder’s Little House books our scripture.

Except.

This holy writ is compromised from the beginning. As LSU historian Nancy Isenberg writes, the original sodbusters who “domesticated” this soil were despised, and were chased off the land once it became lucrative for Back-East speculators. Conestoga settlers and their immediate heirs, the cowboys, were valorized in American mythology only once they were safely dead and couldn’t challenge our beliefs about our ancestral greatness.

Besides which, as Yale historian Greg Grandin writes, White settlers didn’t “domesticate” the prairie. My ancestors only crossed the frontier line years after the U.S. Cavalry cleared Native Americans off the soil. As I've noted elsewhere, Wilder’s Little House books weren’t really history; they were Libertarian myth-making, heavily rewritten by her daughter, Rose Wilder Lane. As myth-makers have always known, true virtue always exists in a distant, morally scrubbed past.

Though the White sodbuster narrative contains nuggets of truth, it’s as much mythology as Zeus on Olympus. That’s why American history classrooms have become as hard-fought as Martin Luther pleading his case before Cardinal Cajetan, because the “history” we’re fighting over is American state religion. We aren’t fighting over how to teach facts, because facts are ancillary. We’re fighting over which story we use to define our shared national identity.

Perhaps that’s why progressives struggle in this debate. They believe they’re laying out “facts,” when what the debate needs is a counternarrative, an alternate myth. Historian James Loewen notes that American classroom history is presented as an unbroken arc from triumph to triumph, which precludes both backsliding and penitence. What Americans need isn’t facts, it’s a more nuanced story, a creation myth that includes room to admit mistakes and learn.

Monday, November 14, 2022

I Am Vengeance, I Am the Night—And So Much More!

It’s been touching to watch tributes roll in for American voice actor Kevin Conroy, following his passing this past Thursday at age 66. Fan loyalty to an actor they knew almost entirely through his voice speaks volumes to how important that symbol remains for so many who were young in the 1990s and early 2000s. I suspect Conroy knew his importance to a generation; his enthusiastic reception at fan conventions is the stuff of legends.

Live-action superhero actors come and go; cinema is currently on its sixth or seventh Batman, depending on how you count. Fans greet every new Batman actor with hostility, though public sentiment usually adjusts quickly (pipe down, George Clooney). Behind them all, Kevin Conroy persisted; he voiced Batman from 1992 to 2019, when he finally played the character in live-action on an episode of Batwoman. Twenty-seven years associated with the role.

However, I can’t help noticing how every tribute focuses on one role. Besides Batman, Conroy played a handful of other television and movie roles, particularly supporting roles on daytime soaps and crime dramas; few got any particular traction. But his theater career was extensive, and focused heavily on Shakespearean roles. Conroy studied at Juilliard, under John Houseman, and his classmates included Robin Williams and Kelsey Grammer.

As a sometime actor myself, I wonder about the implications of being so closely associated with one role. Like Jeremy Brett, whose once-storied career collapsed entirely into the role of Sherlock Holmes, Conroy is remembered in every eulogy for a single role. His entire career has been compressed into something that happened in a sound studio, while the mostly anonymous animators worked around the needs of his voice.

I don’t want to disparage Conroy, or the influence his raspy, war-torn performance had on his intended audience. For two generations, his performance encapsulated not only Bruce Wayne’s willingness to fight for his beliefs, but the price he paid for continuing that fight. Conroy was the first gay actor to portray Batman (though this fact wasn’t initially known), and with his hard-chiseled features and intense stare, it’s amazing he didn’t get live-action screentime sooner.

Kevin Conroy pictured with frequent co-star Mark Hamill

But actors with diverse range and untapped capabilities often get their careers reduced to one iconic role when they die, or even retire. When Ian Holm passed away in 2020, dozens of obituaries named his appearance in only two franchises: Alien and Lord of the Rings. When David Letterman retired from nightly TV, The New Yorker ran an illustration of him throwing pencils at the camera—which he hadn’t done in over twenty years.

Likewise, Kevin Conroy’s Shakespearean career largely vanished. So did his dedication to public service: following the terrorist attacks of September 11th, 2001, Conroy cooked and served meals for first responders sifting the rubble of the World Trade Center. In an era dominated by public attention-seekers like Kanye West and Elon Musk, Kevin Conroy happily worked hard, gave back, and let the results speak for themselves.

Conroy persistently remained conscious of his public role as an actor. As a gay man performing during a time when being publicly out could submarine a man’s career, Conroy took seriously theatre roles like “Peter” in Richard Greenberg’s Eastern Standard, a closeted entertainment executive who didn’t dare live his truth, for fear of imploding his career. In interviews, Conroy described such roles as an important moral statement.

Therefore I find myself torn. Like everyone, I think it’s fitting to celebrate the role that had such wide-ranging influence on a generation, and mourn the fact that this role has now ended. Yes, some other voice actor will certainly portray Batman, and probably mimic, to some degree, Conroy’s performance; but it’ll never be Kevin Conroy. Like Adam West before him, Conroy’s Batman mattered, and now it’s over.

Yet that isn’t Conroy overall. Like Jeremy Brett as Sherlock Holmes, or Harry Corbett as Harold Steptoe, Conroy’s career has largely vanished into one role. I don’t suppose Conroy minded, considering his active embrace of the organized fan community. (Unlike Corbett, who despised his iconic role and tried to distance himself from it.) Fan tributes to Conroy are remarkably one-note, and though it’s an awesome note, it isn’t a symphony.

Batman’s message matters, and the animated performance, uncluttered by the studio interference that reportedly hamstrung Tim Burton, conveys that moral complexity. Kevin Conroy will always be vengeance and the night for an entire generation, and he rightfully should be. But he was so much more than that, and, actor to actor, I fear the rest is getting lost.

Friday, November 11, 2022

Quacking In My Boots

Marjorie Taylor Greene is so routine in her spiteful rhetoric and over-the-top claims that I sometimes mistake her for a Saturday Night Live character. From the moment she fumbled ass-backward onto the national stage, spouting QAnon theories and mangling her sentences, she’s played like a slightly sexist satire of bottle-blonde conservative women. Her racist statements, foot-in-mouth moments, and love of shouting have endeared her to the sensationalist media.

This week’s tweet about “our enemies… quacking in their boots” seems apropos. I admit having mocked her myself, because it’s consistent with Greene’s oeuvre of public gaffes, including “gazpacho police” and “peach tree dishes.” I’m having second thoughts, though, because unlike those notorious spoken blunders, this has a simpler explanation. Greene tweeted from her iPhone, and got AutoCorrected. Anybody who’s ever inadvertently typed “duck this pizza ship” knows that feeling.

Greene’s AutoCorrect error has, unfortunately, overshadowed the revealing information she tweeted out on purpose. “Quacking” is funny, yes. But the sentence’s real meat is the word “enemies.” Like President Trump before her, she characterizes her opposition as hostile adversaries, as foes who need to be defeated, as though politics were a real-time game of Dungeons & Dragons, and Greene sees herself as a paladin. That’s a painful insight into Greene’s moral calculus.

I remember learning, in 12th-grade American Civics, how democratic politics rests on certain shared suppositions. Different political parties may disagree on the most efficient way to organize an economy or levy taxes, for instance. Such disagreements can even be beneficial, since they result in debates and evidence testing to refine first-blush ideas. But small-d democratic participants have to agree on one precept: the process itself.

For democracy to function, all participants must agree that functioning democracy is, itself, a good. They must regard elections as desirable, fellow elected officials as peers, and office as service, not power. That’s why, in Congressional debates, representatives who disagree with one another on fundamental issues of power and government are supposed to refer to one another as “my esteemed colleague” or “the honorable Representative.”

Rep. Marjorie Taylor Greene (R-GA)

This veneer of respect is, certainly, often gossamer-thin. In 1856, Charles Sumner, an abolitionist Massachusetts senator, harangued the Senate chamber, calling pro-slavery senators a string of ugly personal names. South Carolina Representative Preston Brooks responded by beating Sumner with a cane. Historians regard this failure of rhetoric to resolve deep regional differences as evidence that the Civil War was, by then, inevitable.

Please don’t misunderstand me. Greene’s sloppy, high-handed rhetoric isn’t evidence that a Brooks-style physical attack is imminent. However, the characterization of ideological opponents as enemies, rather than as fellow participants in the democratic process, is a sign that procedural norms are failing. Greene, Trump, and those who agree with them have abandoned the pretense of agreement. Governance, for them, is a fight to win, not a debate to resolve.

We’ve witnessed this in, for instance, the way Greene notoriously harassed gun-control advocate David Hogg. Greene eschewed standards of procedural debate and literally chased Hogg down the street. Greene’s defenders will note that she wasn’t yet elected, and her actions had no official governmental standing. But she permitted herself to be recorded, and used the resulting footage in her Congressional campaign, an action which reflects her intent.

The fact that Greene was elected to a second term this week, even as she’s continued such high-profile antics in her official status as a Representative, speaks to more than just her. It tells me that voters, at least in Georgia’s mostly-White 14th Congressional District, actually like this behavior. Greene’s voting base sees her performing such ridiculous stunts, frequently with undisguised malign intent, and says: we’ll have more of that.

Greene was one of several Representatives elected in 2020 on the promise, not to govern responsibly, but to vanquish supposed enemies. While North Carolina Representative Madison Cawthorn got turfed out in the primaries, Colorado Representative Lauren Boebert is, at this writing, likely to win a second term by a whisker-thin margin. The 2024 Republican presidential nomination is likely to become a contest between culture warriors Donald Trump and Ron DeSantis.

Democrats continue playing the game soberly, using titles like “the honorable” and inviting the opposition party to debate. But that doesn’t work anymore. In my state, Nebraska, governor-elect Jim Pillen refused to debate the Democrat, Carol Blood, and Pillen won. Small-d democratic precepts are currently failing in America. If we don’t face that fact soon, we’ll face it when the governing party starts suspending elections and civil rights laws.

Monday, November 7, 2022

On Losing and Regaining My Love For Science

Tom Baker as the Fourth Doctor

I first wanted to become a “scientist” in second grade, not long after discovering Doctor Who reruns on PBS. I’m sure it wasn’t a coincidence. The Doctor, then played by Tom Baker, presented himself as a scientist, and frequently expounded on difficult scientific topics in layman’s language to advance the story. But for him, science was a journey, an opportunity to meet new people and have new experiences and, frequently, confront injustice at the root.

For years, whenever anybody asked grade-school Kevin what he wanted to be when he “grew up,” he insisted he wanted to be a “scientist.” I read books on science history for kids, which often presented science in metaphor: Louis Pasteur’s early vaccination experiments, for instance, were presented as armed soldiers posting pickets around a weakened body and defending it against an invading army. Science became a source of adventure.

Not until middle grades did I actually study science as a distinct discipline. Then, we began performing “experiments” demonstrating important concepts like, say, the states of matter, the function of liquid capillarity, or the complexity of vertebrate vascular systems. Fun stuff, in isolation. Except we performed each “experiment” one time, and if we didn’t achieve the preordained outcome, we flunked. This “science” was remarkably rote and cheerless.

Where, I wondered, was the adventure which The Doctor encountered, and equally importantly, the moral purpose? We weren’t venturing into unknown countries to gather new evidence and fight the scourge of ignorance that kept entire populations enslaved. We were repeating experiments so crinkum-crankum that the results were foreordained. While we each certainly learned new facts, the facts we learned were vetted and ratified in advance by authority figures.

Before going further, let me emphasize: I don’t blame individual teachers for this. Teachers must face bureaucratic intransigence, work with textbooks pre-approved by those same authority figures, and teach to the test. As Dana Goldstein writes, America’s school systems are organized around cost efficiency, not learning outcomes. Many top-tier teachers resist monolithic book learning, but can only accomplish so much when fighting the system.

Louis Pasteur, discoverer of multiple medical procedures

But the effect was the same: the sense of moral adventure which Doctor Who promised came sideways against an educational system which only permitted experimental results which were absolutely true. There was no venturing off the map in school science. I now know, as I couldn’t have known in middle school, that this wasn’t accidental. Powerful people, and the legislators they purchase, want all “learning” to result in predictable outcomes which discourage questions.

In my childhood, science was the battlefield for control of the public discussion. Important religious leaders actively torpedoed any inquiry which would verify the theory of evolution (and, in some places, still do). Today, that battle has shifted to history, where teachers are required to teach bland myths and scrub history of any ambiguity or fault. In both cases, the underlying philosophy remains unchanged: prevent questions by excluding doubt.

During college, I discovered physics, and felt jolted. Before college, my limited understanding of “science” basically bifurcated into either chemistry or biology, both of which deeply disappointed me. Physics, by contrast, held the same qualities I found in science fiction adventure stories: degrees of uncertainty, reasoning through analogy, and an element of faith. In physics, all explanations are provisional, and failure is embraced in ways high school chemistry rejects.

Had I discovered physics earlier, my life might look different today. Surely some teacher somewhere introduced the discipline, but amid the crush of mandatory points which state boards required them to hit, the information got lost. By college, I’d shifted to literature, the discipline which promised the moral purpose which “science” no longer offered. Also, without a scientific goal, my math scores had languished beyond repair.

Mathematician Paul Lockhart writes about teaching middle-school math by ripping away students’ reliance on absolutely correct answers. When uncertainty becomes common again, students reinvest themselves in the process, and fall in love with learning as an adventure. A history teacher I know does something similar, capping his course with a role-play about rebuilding civilization after an EMP. Doubt becomes central to students’ intellectual investment.

I embraced the idea of “science” in childhood because it seemed bold and adventurous. But by eighth grade, I’d abandoned that ambition because it became tedious and repetitive. Only in adulthood did I discover how that tedium was engineered by powerful people to support their own power. We citizens need to reject the narrative, in any discipline, that questions are bad. Because bad people profit from our lack of answers.

Wednesday, November 2, 2022

Why Doesn’t Tarzan Have a Beard?

Johnny Weissmuller as Tarzan

Somebody presented this to me as a head-scratcher recently: why is Tarzan, who lives in the jungle and has never encountered a razor, clean-shaven? In saying “Tarzan,” of course, the asker meant Johnny Weissmuller, the gold medal-winning Olympic swimmer who played Tarzan in twelve feature films from 1932 to 1948. But seriously, the same applies to Buster Crabbe, Gordon Scott, and Alexander Skarsgård: Tarzan is portrayed without facial or body hair.

Weissmuller’s Tarzan remains the character’s iconic depiction, with the curved muscles and sleek skin of somebody who trained his body to resist water drag. But checking photos, I realize Weissmuller wasn’t just clean-shaven. His hair is also neatly barbered, slicked back in the “RKO Pictures Means Business” style that might, maybe, have reflected jungle sweat, but is clearly Brylcreem. Sure, apes groom one another, but it doesn’t look like that!

My immediate response was: same reason Elizabeth Taylor as Cleopatra has a fashionable bob. Because these movies are never really about what they’re about; they’re about the people who make and watch them. This stock answer could apply to countless historical or mythological epics. Cinematic depictions of Hercules, Abraham Lincoln, Gandalf, or King Richard III always say more about us than about the characters.

Thinking about that answer, however, I’ve become increasingly dissatisfied with it. Weissmuller’s moderately muscled, glossy Tarzan isn’t a statement about the people who make or consume those movies, any more than the more absurdly muscled depictions of Hugh Jackman as Wolverine or Chris Hemsworth as Thor really reflect us. These characters aren’t who we, the audience, are; they’re lectures about who we, the audience, should be.

Abandoned from infancy, Tarzan grows to adulthood in an Edenic jungle politely untainted by ordinary old Black Africans. He innately understands Euro-American standards of personal grooming, fitness, and hygiene, which travel hand-in-glove with his instinctive ability to fashion tools and shelter. His ability to command animals is interesting, but incidental. His real accomplishment is bending the “wilderness” to suit his distinctly industrial-era demands.

Elizabeth Taylor as Cleopatra

Tarzan is the perfect colonial agent. He shapes nature to his expectations, but he also, on first encountering White people, recognizes their superiority, and longs for assimilation. Sure, in the movies he always returns to Africa, because if he ever permanently leaves, the franchise ends and RKO loses money. But once there, he consistently aids White imperialists and never once sullies himself with boring old Africans.

(I know, the movie with Alexander Skarsgård attempted to subvert this and make Tarzan more inclusive. That movie also tanked. All the perfumes of Arabia can’t wash the stink of colonialism off the franchise.)

Taylor’s Cleopatra tells a very different story. Released as the post-WWII generation hit adulthood, with the industrial excesses and pop-culture liberation that 1963 entailed, Cleopatra was no less a moralistic lecture. Surrounded by riches, adoration, and power, Cleopatra represented postwar American splendor. But she also represented deep distrust of powerful women. The movie repeatedly moralizes about how destructive imperial power becomes in feminine hands.

In 1963, women like Wanda Jackson, Lesley Gore, and even Elizabeth Taylor herself stopped accepting men’s shit. They demanded autonomy, which they weren’t always willing to state as explicitly sexual, though Taylor, at 31, was already on her fourth marriage. Meanwhile, Cleopatra hit cinemas the same year Betty Friedan’s The Feminine Mystique dropped, pushing second-wave feminism into America’s mainstream. This wasn’t coincidental.

Cleopatra is presented as commanding, imperial, regal, but also doomed. The movie depicts her openly consuming male adoration, setting her own sexual terms, and demanding recognition. But we, the audience, know she’s already doomed. She’s going to embrace the wrong war, backed by the wrong allies, and will eventually choose suicide to avoid the ignominy of capture. We already know this, and implicitly, so does she.

Both movies arise from cultural contexts. Tarzan appeared, first in Burroughs’ short novels, then onscreen, as European empires in Africa and India were disintegrating, but America was establishing colonies in the Asian Pacific. Tarzan’s African jungle was transferable to American soldiers in the Philippine rainforests. Cleopatra subsequently emerged as women began challenging a male-dominated social order.

So no, I realize, these characters don’t really reflect us. Rather, they establish moralistic models for how we should or shouldn’t behave. Tarzan bespeaks the values of White empire, while Cleopatra warns about the perils of female ambition. Both characters serve a White male power hierarchy. One buys in, and is rewarded; the other rebels, and is punished. They aren’t us; they’re who Hollywood’s elite wants us to be.