Wednesday, November 30, 2022

The Violence Metaphor in Modern Politics

Mike Lindell in an uncharacteristically composed and businesslike moment

Mike Lindell, “the My Pillow Guy,” has announced his intent to run for the chairmanship of the Republican National Committee. Lindell, 61, has no prior experience in elective office or party organizing, and apparently no particular political experience beyond campaigning for Donald Trump. He’s spent the last two years yelling directly into the microphone on all-night basic cable about how Trump got ripped off, and in his mind, this apparently qualifies him for political authority.

Lindell’s on-camera outbursts started out comical, and for some people, still are. NBC comedian Seth Meyers still mimics Lindell’s grotesque behavior for cheap yuks. But more than two years later, his tantrums have crossed the line into something else, something downright tragic. Either he continues mishandling the microphone because he knows True Believers find his emotional outbursts persuasive, or something far darker is going on. It’s possible he has nothing left besides his spitting anger.

Over four years ago, I wrote that anger has become modern conservatism’s only emblem of seriousness. I wrote this after both Brett Kavanaugh and Lindsey Graham pulled Lindell-like antics inside the Senate Chamber. But four years ago, such anger was an isolated display. Lindsey Graham’s prior political career was predicated on displays of folksy charm that belied his deep party-machine connections. And Brett Kavanaugh was previously (and has largely returned to being) publicly anonymous.

Since then, these displays have become something much more persistent. Mike Lindell’s two-year rage bender is perhaps an extreme example, crossing the line from pathos to bathos. But earlier this year, in the state where I live, Nebraska, an entire cadre of political neophytes seized control of the state Republican party, driven largely by displays of rage heaved clumsily into the microphone. This “leadership” then got Jim Pillen elected governor by stoking fears of CRT.

On the local level, another political novice, Paul Hazard, got elected to my city’s school board. Before running for office (he previously failed to get elected to the city council), Hazard was a state trooper—which, I shouldn’t have to specify, isn’t the same as being an educator. His qualification (singular) for office is his track record of getting screaming mad during school board meetings and town halls and, yes, shouting directly into the microphone.

Yeah, it's safe to say Justice Kavanaugh resembles Senator Graham

All these candidates pale, however, before the tragicomic spectacle that is Herschel Walker. Like his idol, Donald Trump, Walker has difficulty stringing together coherent sentences or staying on topic. This is perhaps understandable in Walker’s case: after a fifteen-year professional football career, he probably has Chronic Traumatic Encephalopathy, a degenerative brain disease common among retired football (hand-egg) players. Also like Trump, Walker shows no prior familiarity with, or interest in, government or the legislative process.

More important than Walker’s tragic incompetence, however, is his behavior. Herschel Walker is a demonstrably bad person, with multiple credible accusations of domestic abuse, child abandonment, and pressuring various girlfriends to get abortions. Public progressives question how Republicans, the “party of family values,” can get behind a man who’s abandoned so many children and paid for so many abortions. Walker, like Trump, spits in the face of the values they claim to uphold.

However, when we place Walker in the context of Donald Trump, Brett Kavanaugh, and Mike Lindell, a definite pattern emerges. Today’s Republican Party sees anger and violence as leadership qualities. Displays of macho savagery are more important to Republican leaders than competence, experience, or familiarity with the law. A leader, to today’s Republican party, isn’t someone who unites or gives vision. A leader is somebody who socks enemies in the jaw—even “enemies” like his own ex-wife.

Seriously, look at whom Republicans have supported recently. Donald Trump’s entire career was predicated on the promise to bring the hammer down on dissidents. Paul “ACAB” Hazard pledged to bring the same roided-out anger to the school board as he did to the highway patrol. And neither Mike Lindell’s nor Herschel Walker’s candidacies are predicated on policy, as they have none. They simply promise to shout down, or punch, anybody who dares to challenge them.

And unfortunately, this seems to work. Republican voters keep getting behind this series of reality TV stars, athletes, and camera-huggers. Herschel Walker is currently in a statistical tie with Raphael Warnock, an incumbent Senator, minister, and career civil activist. Real or promised violence has become the motivation of today’s Republicans, and it works. Faced with a panoply of digital-age problems, one major party pledges to answer with Stone-Age force, and their voters clamor for more.

Monday, November 28, 2022

Religion vs. Empire, Then and Now

Christian Scripture provides two birth narratives for Jesus Christ, and they’re similar, but not identical. Matthew depicts Jesus born in Bethlehem but provides little of what came before; the dominant earthly power in Matthew’s Gospel is Herod the Great. Luke provides lengthy pre-birth anecdotes about Mary and her cousin Elizabeth, Mary’s interaction with the archangel Gabriel, and Mary and Joseph’s relocation to Bethlehem. Luke emphasizes Rome’s earthly authority, embodied in Quirinius, governor of Syria.

Either way, this narrative emphasizes that Jesus was born into an occupied nation. Stephen Prothero describes Israelite religion and national identity as defined by occupation, exile, and the promise of return. Matthew, a Jew, and Luke, a Greek, disagree over which occupation matters more, the domination of an Idumean client king or Roman conquest. What matters, though, is that Jesus didn’t arrive during the brief window of Maccabean independence, but under the shadow of Empire.

Empires carry laws with them, and always try to standardize morality. Consider American history. The nation couldn’t survive with slavery in some regions and not others; we fought our bloodiest war to finally end the institution. Afterward, we couldn’t survive with sectional racism, so, using nonviolent means like case law and legislation, the government managed to distribute segregation and Jim Crow nationally. This national distribution was so effective that we’ve been unable to reverse it.

Laws exist to standardize national morality, and unfortunately, America demonstrates how effective that is. That’s why it matters that Jesus came during a time of national occupation: because Roman law inevitably reshaped Jewish morality. Maybe not the morality of ordinary Jews, or not all of them anyway, since sects arose devoted entirely to anti-Roman resistance. But those who maintained power over ordinary Jews, in fixed buildings in large cities, needed to assimilate or get cut down.

Herod’s Temple in Jerusalem served the same ecclesial role that American megachurches serve today: a gathering place where ethnically and philosophically diverse peoples reasserted their one shared identity. Though Jews were nominally descended from Abraham, Isaac, and Jacob, in practice, anybody who followed the Levitical law became Jewish. Likewise, “Christian” describes a panoply of religious and philosophical premises; but in churches, we gather together, speak the Apostles’ Creed and Lord’s Prayer, and become one.

Yet the comparison between Herod’s temple and American megachurches carries dark implications. As Obery M. Hendricks points out, the temple priesthood wasn’t just religious leadership, it was a proxy government. The Antonia Fortress, the command garrison of the occupying Roman legions, was built into the temple walls, and the priesthood had police authority over the Jewish citizenry. Though ordinary Jews lived with, or sometimes resisted, Roman occupation, the temple priesthood actively enforced occupying Roman law.

In the last two years, as I write, hard-right Christians have gone from denying Christian Nationalism exists, to openly embracing the term. Authors like Stephen Wolfe have published their blueprints for a religious state dominated by clergy with billy clubs. Wolfe presents this as Christian dominion, and promises to use police authority to enforce religious law, punishing heretics and atheists. But his list of supposed religious “heretics” includes feminists, BLM supporters, and even Christian Socialists.

Wolfe demonstrates how religion under occupation comes to resemble the occupying power. Gone from the faith of Wolfe and other Christian Nationalists is the prominent anti-state stance taken by earlier public Christians, from William Lloyd Garrison to MLK. Religion for Christian Nationalists, like for temple priests in Herod’s Jerusalem, becomes a matter of obeying laws, not of doing right by those least able to defend themselves. Righteousness and justice become whatever the state-friendly priesthood says they are.

Throughout his ministry, Jesus remained an observant Jew. He taught several of his most important lessons in synagogues, in the Temple, and at potluck dinners. Yet he disdained observance of law as its own moral good. Religious leaders, who earned their living enforcing laws, frequently castigated Jesus and his followers for disdaining cleanliness rituals, food protocols, and Sabbath observance. To the priesthood, law had become its own justification, regardless of its impact on the poor and defenseless.

Versus the law, Jesus presents something altogether more difficult: facing each situation as it exists, and seeking the truth. The truth for the Rich Young Ruler was that he needed to part from his possessions. For the Samaritan Woman at the Well, the truth was that she needed to stop hiding from her past. Knowing the truth isn’t easy; it means going within, spending time in study, and taking a risk. But truth, not law, saves.

Monday, November 21, 2022

Nonpartisan Jesus vs. the Apolitical Church

Andy Stanley, Not In It To Win It: Why Choosing Sides Sidelines the Church

Atlanta megachurch pastor Andy Stanley endured intense criticism, some of it borderline violent, when he paused in-person worship during the pandemic. He never wanted to become a political lightning rod, but when angry parishioners berated him with cable-TV talking points, he realized that's what he had become. To Stanley, political alignment is an abdication of the gospel message. I see where he's coming from, but I can't bring myself to agree.

American Christianity, in Stanley's mind, has become too invested in winning. Whether this means winning arguments, or winning elections, or winning the "culture wars," Christians seemingly care more about worldly victory than eternal truth. When we focus on winning America, Stanley says, we lose Americans. Earthly victory is eternal loss; we gain the world, as Jesus said, and lose our souls. Not exactly a fair trade.

This strictly partisan interpretation of Christianity doesn't jibe with historical Christ followers. Jesus spoke of the "kingdom of God," and importantly, he meant it. Ancient Rome didn't distrust Christians because they prayed more, but because they pledged allegiance to another King. The idea of Christians taking partisan sides in the frequent Roman civil wars would've made no sense, because Christians were already deemed disloyal to Earthly kings.

So far, Stanley and I agree. Jesus didn't come to endorse partisan alignment, and certainly not to defend existing power hierarchy. This applies across political parties, since both parties want power; the Messiah who promises to "make all things new" wouldn't support either party that wants power at others' expense. Jesus didn't call us to win, but to love, which we can't do if we're busy divvying the world into winners and losers.

Stanley goes even further. Not only did Jesus not come to win, but Jesus specifically came to "lose," by Earthly standards. Because this world is full of losers: the poor, the disenfranchised, the outcast, the ritually unclean. Stanley spends several chapters exploring this from different angles, but his upshot is basically that Jesus loves losers, not because they're especially holy, but because they're losers. We can't follow Jesus' example if we're busy winning.

Pastor Andy Stanley

Okay, in broad strokes, Stanley and I agree. I support his overarching thesis, because I disapprove of not only the recent rise of Christian Nationalism, but also the progressive Christian response that hides in government's skirts. Both options accord more power to human institutions, which necessarily robs power from "the least of these," the sheep that Jesus called Peter to feed.

But Stanley and I diverge on what that means in practice. Stanley evidently believes Christ's church should remain, in his words, "apolitical." Though he admits he began writing this book because he was appalled by the vitriolic partisan response to his pandemic policy, he seems to think that the response is to stand above the fray. The mere fact that he couldn't, that his attempt to avoid politics was interpreted politically, doesn't change his mind.

Millions of Black American Christians have learned that simply being alive, and loving as Christ first loved us, is a political act. That's true when Christians organize deliberate resistance to worldly injustice, as Dr. King and Howard Thurman proved. But the bombing of the 16th Street Baptist Church showed us that, when your existence threatens the status quo, this world's powers don't need legal justifications to hate and destroy.

As Stanley himself says, becoming a Christ follower should mean a radical reorientation of life. We're called to love those this world doesn't love. That may mean, like James and John, leaving our family; it may mean, like Matthew and Zacchaeus, leaving the protection that government authority grants. Loving others threatens "the world," which thrives by creating in-groups and exploiting our fear of strangers.

What Stanley seems not to grasp is that the radical love he advocates is indeed political. It knowingly subverts the world's widespread desire to name enemies and seek victory, which powerful people use to preserve their advantages. Refusing to play the world's political game isn't "apolitical," despite Stanley's claim. Jesus refused either to acknowledge or to reject Pilate's authority, an action which spat in Roman authority's eye.

Again, I agree with Stanley, broadly. When Christians strive to "win," we play into Earthly power structures that divide the world into winners and losers, an unloving, anti-Christian attitude. He's right that Christ calls us to lose the world. But Stanley is wrong to describe this as "apolitical." The anger political insiders show when we refuse demonstrates how wholly political that refusal is. We can't stand aloof from the consequences.

Saturday, November 19, 2022

The Role of Art in a Divided Society

A still from Robert Wise and Jerome Robbins’ 1961 film of West Side Story

Sometime in the 1990s, I’ve forgotten exactly when, my sister’s high school theater program staged the classic musical West Side Story. Because of course they did, it’s standard theatrical repertoire. The only problem was, her school (she and I attended different high schools) was overwhelmingly White. The drama of urban tension between Puerto Rican and White working-class communities was played by farmers’ kids of mainly German and Czech heritage.

This meant, as you’d expect, brownface. Students playing the Puerto Rican Sharks gang dyed their hair, darkened their skin, and affected Latino accents. The White Jets, meanwhile, learned a stereotyped “New Yawk” accent and got ducktail haircuts. These students, who were entirely White and had lived in Nebraska for most or all of their lives, immersed themselves in playing ethnically mixed East Coast characters, not always in the most sensitive ways.

Around twenty-five years later, my sister recalls that performance with a visible cringe. Troweling on makeup to play ethnically clichéd characters, which seemed broadly acceptable then, is patently unacceptable today. Nobody, except a few high-profile heel-draggers like Megyn Kelly, would pretend otherwise. But without the willingness to play characters who didn’t resemble themselves, I contend, these students would’ve deprived themselves, and their community, of something important.

West Side Story remains important theater, sixty-five years after its debut, because it addresses an important American cultural problem. The Jets and Sharks, defined by their race, attend the same high school and walk the same streets. But they never communicate, because they believe long-held bigoted myths about one another. When Tony and Maria dare to fall in love, it transgresses one of America’s most cherished internal borders, the color line.

I’ve written before that teaching youth the humanities matters, because through art and literature, students see other people as fully dimensional human beings, with thoughts, feelings and dreams equal to their own. West Side Story reminds us that anybody, raised on such myths, could wind up believing them, and embracing the violence such division brings. Racism, this play reminds us, isn’t inevitable; it’s a choice we make, and keep making.

Arguably, that’s why White actors playing Brown characters is usually so problematic. If my sister’s high school had had sufficient Hispanic actors to play the Sharks, they should’ve cast accordingly. No matter how sympathetically those student actors attempted to portray characters who were culturally or racially different from themselves, they’d inevitably resort to stereotypes, sometimes hurtful ones, of people they’d never actually met.

A still from Steven Spielberg’s 2021 film of West Side Story

But simultaneously, if the school refused to perform this play, nobody would’ve had the opportunity to receive its message. Not the student actors, who needed to stretch beyond their limited small-town experience, nor the audience who, in Western Nebraska, seldom get to witness world-class art. Beyond the high school, getting to see top-tier theater means traveling to Omaha or Denver, and most people can’t spare that much money or time.

This raises the question: is the message important enough to accept a less-than-ideal messenger? I don’t want to be mistaken for advocating brownface; the specific event I’m remembering belongs to its own time and place, and should remain there. But the event gave students and the community an opportunity to see people whose lives and experiences were wildly different from anything experienced locally. Even if those “people” were actors.

Questions like this will become more important in coming years. In 1957, when West Side Story debuted, Manhattan’s Upper West Side was predominantly working-class, racially mixed, and volatile. Within five years, the combined forces of gentrification and White Flight changed local demographics. By the 1980s, the Upper West Side was heavily populated with yuppies, while the ethnic communities celebrated onstage had been forced into dwindling enclaves.

The White small town where my sister attended high school has experienced something similar: there are now considerably more Hispanic residents, and even a few Black residents. Because the Hispanic residents are mostly agricultural workers, though, they seldom mix substantially with the community. Interactions with what locals call “Mexicans” happen in public places, like grocery stores; the actual community members seldom get to know one another beyond nodding hello.

Artistic expressions like West Side Story will matter more soon, as American society becomes more segregated, more hostile, more like the Sharks and Jets. Opportunities to see “the Other” as equally human to ourselves might make the difference between peace and violence. And sadly, not everybody will have access to racially representative casting choices. Cross-racial casting isn’t ideal, but it’s better than denying audiences the art they need to see.

Friday, November 18, 2022

Thanksgiving and the American State Church

Not the original photo

I fear that, somewhere near Albany, New York, a TV station still has news footage of seven-year-old me wearing fake Native American war paint. I’d made a war bonnet from construction paper and an old terry-cloth headband, and wore it to the second grade Thanksgiving reenactment at Howe Elementary School, in Schenectady. I was the only student there representing the Native American side.

Every year, countless American grade schoolers make black conical “Puritan” hats out of construction paper and craft glue and replay the “first Thanksgiving” in mid-November. These performances are crinkum-crankum, and for good reason. The first Thanksgiving is part of American state religion, and reenacting it serves exactly the same purpose as children’s Nativity pageants on Christmas Eve: it forces us to verbally commit ourselves to the faith and morality represented.

Except, that faith isn’t equally represented. In every grade-school Thanksgiving pageant I remember, nearly everybody dressed as English Pilgrims. The men wore uniformly somber costumes, with buckles on their hats and shoes, while the women bundled their hair into off-white bonnets and carried fall flowers against their pinafores. Nearly every year, the Wampanoag Indians were verbally acknowledged, but not present.

In 1982, in consultation with my parents, I decided somebody needed to represent the Indians. We didn’t really know what that meant. Thanksgiving history usually focuses on Pilgrims surviving a tumultuous winter, then learning (in passive voice) to plant maize and hunt wild turkey. In seasonal art, the Wampanoag are usually represented by one or two shirtless Brown men with feathers in their hair; the art emphasizes White people and their massive chuckwagon spread.

My parents are generally conservative, never-Trump Republicans, but they’ve always had a soft spot for Native American history. In the 1980s, though, their idea of Native Americans wasn’t differentiated by nations and regions; they believed a broad pan-American indigenous myth that mostly resembles Plains Indians. So that was our pattern, and I attended that year’s Thanksgiving pageant dressed as a White boy’s homemade idea of a Ponca warrior.

Forty years later, I struggle with this. By any reasonable standard, this was cultural appropriation: I, a White person, took it upon myself to tell the BIPOC story. But if I didn’t, who would? There was literally nobody else willing to speak that truth, that the Wampanoag existed and participated in that pageant. Without my clumsy, stereotyped mannequin, the Native American voice would’ve been completely excluded from that American myth.

A common clip art of the First Thanksgiving, with benevolent
Englishmen and highly stereotyped Native Americans

Our Thanksgiving pageant was considered newsworthy, and broadcast on regional TV, because our class partnered with the Special Education classroom down the hall. We were deemed a beacon of inclusiveness. Though both classrooms were entirely White (with an asterisk: several Jewish students), regional media wanted to praise our efforts. Camera crews, helmed by a pretty young human interest journalist, captured the whole event.

Because I was a kid, and this happened forty years ago, I don’t remember the event itself at all. My one clear memory is watching the news from Albany that evening to see our story. At one key moment, the camera zoomed in on me, the only Pretendian in the room, with my brightly colored acrylic “warpaint” and my war bonnet held together with hot glue. The journalist didn’t say anything. My presence was sufficient.

My family and I felt pretty good about that. Somebody, however feebly, stood up for the Native American presence at an important White mythological event. Forty years later, I can only remember that moment with a combination of pride and cringe. A White kid, amid forty other White kids, dressed as a Plains Indian in a Massachusetts harvest festival? The cheek of it! But… but it matters that somebody said it.

By today’s standards, that tin-eared display of cultural goulash was wildly inappropriate. But I also stood in the assembly to remind everyone, in this moment of American state church, that our mythology needed to be broader than it was. We not only preached that counter-myth to two second-grade classrooms, but with media assistance, our message carried regionally: Native Americans were there, and deserve representation.

I wouldn’t do that again, certainly. And if I had kids, I’d think long and hard before encouraging them to do likewise. But for all its ham-handed stereotyping and cultural appropriation, I also wouldn’t undo that event. Somebody needed to say it. Somebody needed to remind the American state church that its mythology has excluded too many people for too long. Maybe I was a clumsy, childlike prophet, but at least I said it.

Wednesday, November 16, 2022

Thoughts on the Importance of Creation Myths

Michelangelo's The Creation of Adam, from the Sistine Chapel ceiling

Plant ecologist Robin Wall Kimmerer begins her book, Braiding Sweetgrass, by comparing the Potawatomi and Christian creation myths. Kimmerer positively contrasts Skywoman, who builds the Earth from music and faith and communion with animals, with Eve, whose only described accomplishment is failure and exile. Skywoman, Kimmerer believes, promotes harmony with nature, while Eve encourages attitudes of fatalism and misogyny.

I won’t say Kimmerer is wrong, because she isn’t. But the more versed I become in comparative religion, the more I believe her correctness is conditional. The Skywoman narrative describes the creation of a people defined by their relationship with one place and the land. Adam and Eve describes a people defined by exile and return. The Hebrew Masoretic Text is bookended by Israel’s exiles in Egypt and Babylon, and their respective returns.

Religious absolutists generally take their creation myths seriously, and one of the Twentieth Century’s greatest controversies has been how to teach science, for instance, in light of six-day creationism. But I contend that religious creation myths are only literally true for those who have forgotten why those myths were written. Creation myths don’t pretend to accurately describe how the world came into being; rather, they describe their authors’ identity.

Adam and Eve are doomed to wander, not because their creation myth is fatalistic, but because Israelite history is one of resident aliens amid occupying nations. Arguably, Adam and Eve, who are vaguely defined characters, are less important as creation archetypes than Cain. Christian interpretations of Cain and Abel characterize Cain as the antagonist. But perhaps Cain, both exiled and protected by God, is the actual Israelite ancestor.

The Native American creation myths I’ve read generally spotlight either an Animal, such as Coyote in many Southwestern myths, or a woman, such as Kimmerer’s Skywoman, or the Corn Woman common in many narratives. Either an animal spirit, a maternal spirit, or both, brings forth reality. Humans, in these stories, are afterthoughts. Our world isn’t a habitation, as in Abrahamic religions, but a responsibility, one which White invaders habitually shirk.

While American schools cope with how, and whether, to teach Abrahamic creationism, the real mythological battle takes place in history classes. Politicians and educators feud mightily over how to teach American history, because the narrative we learn in public (state) schools—the only narrative some students ever learn—defines how we perceive ourselves as a nation. The official history has become a creation myth.

The frontier as depicted by Currier and Ives

This isn’t metaphorical, either. We literally learn a sanitized version of history because, like Eve and Cain, the narrative is about us, the living. Where I live, in Nebraska, we learn just-so stories about plucky settlers who walked overland, Conestoga wagons in tow, to claim and domesticate an unsettled prairie, pulling crops from uncooperative soil. Agrarian industriousness is Nebraska’s state religion, Laura Ingalls Wilder’s Little House books our scripture.

Except.

This holy writ is compromised from the beginning. As LSU historian Nancy Isenberg writes, the original sodbusters who “domesticated” this soil were despised, and were chased off the land once it became lucrative for Back-East speculators. Conestoga settlers and their immediate heirs, the cowboys, were valorized in American mythology only once they were safely dead and couldn’t challenge our beliefs about our ancestral greatness.

Besides which, as Yale historian Greg Grandin writes, White settlers didn’t “domesticate” the prairie. My ancestors only crossed the frontier line years after the U.S. Cavalry cleared Native Americans off the soil. As I've noted elsewhere, Wilder’s Little House books weren’t really history, they were Libertarian myth-making, heavily rewritten by her daughter, Rose Wilder Lane. As myth-makers have always known, true virtue always exists in a distant, morally scrubbed past.

Though the White sodbuster narrative contains nuggets of truth, it’s as much mythology as Zeus on Olympus. That’s why American history classrooms have become as hard-fought as Martin Luther pleading his case before Cardinal Cajetan, because the “history” we’re fighting over is American state religion. We aren’t fighting over how to teach facts, because facts are ancillary. We’re fighting over which story we use to define our shared national identity.

Perhaps that’s why progressives struggle in this debate. They believe they’re laying out “facts,” when what the debate needs is a counternarrative, an alternate myth. Historian James Loewen notes that American classroom history is presented as an unbroken arc from triumph to triumph, which precludes both backsliding and penitence. What Americans need isn’t facts, it’s a more nuanced story, a creation myth that includes room to admit mistakes and learn.

Monday, November 14, 2022

I Am Vengeance, I Am the Night—And So Much More!

It’s been touching to watch tributes roll in for American voice actor Kevin Conroy, following his passing this past Thursday at age 66. Fan loyalty to an actor they knew almost entirely through his voice speaks volumes to how important that symbol remains for so many who were young in the 1990s and early 2000s. I suspect Conroy knew his importance to a generation; his enthusiastic reception at fan conventions is the stuff of legends.

Live-action superhero actors come and go; cinema is currently on its sixth or seventh Batman, depending how you count. Fans greet every new Batman actor with hostility, though public sentiment usually adjusts quickly (pipe down, George Clooney). Behind them all, Kevin Conroy has persisted; he voiced Batman from 1992 until 2019, when he finally played the character in live action on an episode of Batwoman. Twenty-seven years associated with the role.

However, I can’t help noticing how every tribute focuses on one role. Besides Batman, Conroy played a handful of other television and movie roles, particularly supporting roles on daytime soaps and crime dramas; few got any particular traction. But his theater career was extensive, and focused heavily on Shakespearean roles. Conroy studied at Juilliard, under John Houseman, and his classmates included Robin Williams and Kelsey Grammer.

As a sometime actor myself, I wonder about the implications of being so closely associated with one role. Like Jeremy Brett, whose once-storied career collapsed entirely into the role of Sherlock Holmes, every eulogy for Kevin Conroy remembers one role. His entire career has been compressed into something that happened in a sound studio, while the mostly anonymous animators worked around the needs of his voice.

I don’t want to disparage Conroy, or the influence his raspy, war-torn performance had on his intended audience. For two generations, his performance encapsulated not only Bruce Wayne’s willingness to fight for his beliefs, but the price he paid for continuing that fight. Conroy was the first gay actor to portray Batman (though this fact wasn’t initially known), and with his hard-chiseled features and intense stare, it’s amazing he didn’t get live-action screentime sooner.

Kevin Conroy pictured with frequent co-star Mark Hamill

But actors with diverse range and untapped capabilities often get their careers reduced to one iconic role when they die, or even retire. When Ian Holm passed away in 2020, dozens of obituaries named his appearance in only two franchises: Alien and Lord of the Rings. When David Letterman retired from nightly TV, The New Yorker ran an illustration of him throwing pencils at the camera—which he hadn’t done in over twenty years.

Likewise, Kevin Conroy’s Shakespearean career largely vanished. So did his dedication to public service: following the terrorist attacks of September 11th, 2001, Conroy cooked and served meals for first responders sifting the rubble of the World Trade Center. In an era dominated by public attention-seekers like Kanye West and Elon Musk, Kevin Conroy happily worked hard, gave back, and let the results speak for themselves.

Conroy persistently remained conscious of his public role as an actor. As a gay man performing during a time when being publicly out could submarine a man’s career, Conroy took seriously theatre roles like “Peter” in Richard Greenberg’s Eastern Standard, a closeted entertainment executive who didn’t dare live his truth, for fear of imploding his career. In interviews, Conroy described such roles as an important moral statement.

Therefore I find myself torn. Like everyone, I think it’s fitting to celebrate the role that had such wide-ranging influence on a generation, and mourn the fact that this role has now ended. Yes, some other voice actor will certainly portray Batman, and probably mimic, to some degree, Conroy’s performance; but it’ll never be Kevin Conroy. Like Adam West before him, Conroy’s Batman mattered, and now it’s over.

Yet that isn’t Conroy overall. Like Jeremy Brett as Sherlock Holmes, or Harry H. Corbett as Harold Steptoe, Conroy’s career has largely vanished into one role. I don’t suppose Conroy minded, considering his active embrace of the organized fan community. (Unlike Corbett, who despised his iconic role and tried to distance himself from it.) Fan tributes to Conroy are remarkably one-note, and though it’s an awesome note, it isn’t a symphony.

Batman’s message matters, and the animated performance, uncluttered by the studio interference that reportedly hamstrung Tim Burton, conveys that moral complexity. Kevin Conroy will always be vengeance and the night for an entire generation, and he rightfully should be. But he was so much more than that and, actor to actor, I fear that legacy getting lost.

Friday, November 11, 2022

Quacking In My Boots

Marjorie Taylor Greene is so routine in her spiteful rhetoric and over-the-top claims that I sometimes mistake her for a Saturday Night Live character. From the moment she fumbled ass-backward onto the national stage, spouting QAnon theories and mangling her sentences, she’s played like a slightly sexist satire of bottle-blonde conservative women. Her racist statements, foot-in-mouth moments, and love of shouting have endeared her to the sensationalist media.

This week’s tweet about “our enemies… quacking in their boots” seems apropos. I admit having mocked her myself, because it’s consistent with Greene’s oeuvre of public gaffes, including “gazpacho police” and “peach tree dishes.” I’m having second thoughts, though, because unlike those notorious spoken blunders, this has a simpler explanation. Greene tweeted from her iPhone, and got AutoCorrected. Anybody who’s ever inadvertently typed “duck this pizza ship” knows that feeling.

Greene’s AutoCorrect error has, unfortunately, overshadowed the revealing information she tweeted out on purpose. “Quacking” is funny, yes. But the sentence’s real meat is the word “enemies.” Like President Trump before her, she characterizes her opposition as hostile adversaries, as foes who need to be defeated, as though politics were a real-time game of Dungeons & Dragons, and Greene sees herself as a paladin. That’s a painful insight into Greene’s moral calculus.

I remember learning, in 12th grade American Civics, how democratic politics rests on certain shared suppositions. Different political parties may disagree on the most efficient way to organize an economy or levy taxes, for instance. Such disagreements can even be beneficial, since they result in debates and evidence testing to refine first-blush ideas. But small-d democratic participants have to agree on one precept: the process itself.

For democracy to function, all participants must agree that functioning democracy is, itself, a good. They must regard elections as desirable, fellow elected officials as peers, and office as service, not power. That’s why, in Congressional debates, representatives who disagree with one another on fundamental issues of power and government, are supposed to refer to one another as “my esteemed colleague” or “the honorable Representative.”

Rep. Marjorie Taylor Greene (R-GA)

This veneer of respect is, certainly, often gossamer-thin. In 1856, Charles Sumner, an abolitionist Massachusetts senator, harangued the Senate chamber, calling pro-slavery senators a string of ugly personal names. South Carolina Representative Preston Brooks responded by beating Sumner with a cane. Historians regard this failure of rhetoric to resolve deep regional differences as evidence that the Civil War was, by then, inevitable.

Please don’t misunderstand me. Greene’s sloppy, high-handed rhetoric isn’t evidence that a Brooks-style physical attack is imminent. However, the characterization of ideological opponents as enemies, rather than as fellow participants in the democratic process, is a sign that procedural norms are failing. Greene, Trump, and those who agree with them have abandoned the pretense of agreement. Governance, for them, is a fight to win, not a debate to resolve.

We’ve witnessed this in, for instance, the way Greene notoriously harassed gun-control advocate David Hogg. Greene eschewed standards of procedural debate and literally chased Hogg down the street. Greene’s defenders will note that she wasn’t yet elected, and her actions had no official governmental standing. But she permitted herself to be recorded, and used the resulting footage in her Congressional campaign, an action which reflects her intent.

The fact that Greene was elected to a second term this week, even as she’s continued such high-profile antics in her official status as a Representative, speaks to more than just her. It tells me that voters, at least in Georgia’s mostly-White 14th Congressional District, actually like this behavior. Greene’s voting base sees her performing such ridiculous stunts, frequently with undisguised malign intent, and says: we’ll have more of that.

Greene was one of several Representatives elected in 2020 on the promise, not to govern responsibly, but to vanquish supposed enemies. While North Carolina Representative Madison Cawthorn got turfed out in the primaries after a single term, Colorado Representative Lauren Boebert is, at this writing, likely to win a second term by a whisker-thin margin. The 2024 Republican presidential nomination is likely to come down to a contest between culture warriors Donald Trump and Ron DeSantis.

Democrats continue playing the game soberly, using titles like “the honorable” and inviting the opposition party to debate. But that doesn’t work anymore. In my state, Nebraska, governor-elect Jim Pillen refused to debate the Democrat, Carol Blood, and Pillen won. Small-d democratic precepts are currently failing in America. If we don’t face that fact soon, we’ll face it when the governing party starts suspending elections and civil rights laws.

Monday, November 7, 2022

On Losing and Regaining My Love For Science

Tom Baker as the Fourth Doctor

I first wanted to become a “scientist” in second grade, not long after discovering Doctor Who reruns on PBS. I’m sure it wasn’t a coincidence. The Doctor, then played by Tom Baker, presented himself as a scientist, and frequently expounded on difficult scientific topics in layman’s language to advance the story. But for him, science was a journey, an opportunity to meet new people and have new experiences and, frequently, confront injustice at the root.

For years, whenever anybody asked grade-school Kevin what he wanted to be when he “grew up,” he insisted he wanted to be a “scientist.” I read books on science history for kids, which often presented science in metaphor: Louis Pasteur’s early vaccination experiments, for instance, were presented as armed soldiers posting pickets around a weakened body and defending it against an invading army. Science became a source of adventure.

Not until middle grades did I actually study science as a distinct discipline. Then, we began performing “experiments” demonstrating important concepts like, say, the states of matter, the function of liquid capillarity, or the complexity of vertebrate vascular systems. Fun stuff, in isolation. Except we performed each “experiment” one time, and if we didn’t achieve the preordained outcome, we flunked. This “science” was remarkably rote and cheerless.

Where, I wondered, was the adventure which The Doctor encountered, and equally importantly, the moral purpose? We weren’t venturing into unknown countries to gather new evidence and fight the scourge of ignorance that kept entire populations enslaved. We were repeating experiments so crinkum-crankum that the results were foreordained. While each of us certainly learned new facts, the facts we learned were vetted and ratified in advance by authority figures.

Before going further, let me emphasize: I don’t blame individual teachers for this. Teachers must face bureaucratic intransigence, work with textbooks pre-approved by those same authority figures, and teach to the test. As Dana Goldstein writes, America’s school systems are organized around cost efficiency, not learning outcomes. Many top-tier teachers resist monolithic book learning, but can only accomplish so much when fighting the system.

Louis Pasteur, discoverer of multiple
medical procedures

But the effect was the same: the sense of moral adventure which Doctor Who promised ran up against an educational system that only permitted experimental results which were absolutely true. There was no venturing off the map in school science. I now know, as I couldn’t have known in middle school, that this wasn’t accidental. Powerful people, and the legislators they purchase, want all “learning” to result in predictable outcomes which discourage questions.

In my childhood, science was the battlefield on which the fight to control public discussion was waged. Important religious leaders actively torpedoed any inquiry which would verify the theory of evolution (and, in some places, still do). Today, that battle has shifted to history, where teachers are required to teach bland myths and scrub history of any ambiguity or fault. In both cases, the underlying philosophy remains unchanged: prevent questions by excluding doubt.

During college, I discovered physics, and felt jolted. Before college, my limited understanding of “science” basically bifurcated into either chemistry or biology, both of which deeply disappointed me. Physics, by contrast, held the same qualities I found in science fiction adventure stories: degrees of uncertainty, reasoning through analogy, and an element of faith. In physics, all explanations are provisional, and failure is embraced in ways high school chemistry rejects.

Had I discovered physics earlier, my life might look different today. Surely some teacher somewhere introduced the discipline, but amid the crush of mandatory points which state boards required them to hit, the information got lost. By college, I’d shifted to literature, the discipline which promised the moral purpose which “science” no longer offered. Also, without a scientific goal, my math scores had languished beyond repair.

Mathematician Paul Lockhart writes about teaching middle-school math by ripping away students’ reliance on absolutely correct answers. When uncertainty becomes common again, students reinvest themselves in the process, and fall in love with learning as an adventure. A history teacher I know does something similar, capping his course with a role-play about rebuilding civilization after an EMP. Doubt becomes central to students’ intellectual investment.

I embraced the idea of “science” in childhood because it seemed bold and adventurous. But by eighth grade, I’d abandoned that ambition because it became tedious and repetitive. Only in adulthood did I discover how that tedium was engineered by powerful people to support their own power. We citizens need to reject the narrative, in any discipline, that questions are bad. Because bad people profit from our lack of answers.

Wednesday, November 2, 2022

Why Doesn’t Tarzan Have a Beard?

Johnny Weissmuller as Tarzan

Somebody presented this to me as a head-scratcher recently: why is Tarzan, who lives in the jungle and has never encountered a razor, clean-shaven? In saying “Tarzan,” of course, the asker meant Johnny Weissmuller, the gold medal-winning Olympic swimmer who played Tarzan in twelve feature films from 1932 to 1948. But seriously, the same applies to Buster Crabbe, Gordon Scott, and Alexander Skarsgård: Tarzan is portrayed without facial or body hair.

Weissmuller’s Tarzan remains the character’s iconic depiction, with the curved muscles and sleek skin of somebody who trained his body to resist water drag. But checking photos, I realize Weissmuller wasn’t just clean-shaven. His hair is also neatly barbered, slicked back in the “RKO Pictures Means Business” style that might, maybe, have reflected jungle sweat, but is clearly Brylcreem. Sure, apes groom one another, but it doesn’t look like that!

My immediate response was: same reason Elizabeth Taylor as Cleopatra has a fashionable bob. Because these movies are never really about what they’re about; they’re about the people who make and watch them. This stock answer could apply to countless historical or mythological epochs. Cinematic depictions of Hercules, Abraham Lincoln, Gandalf, or King Richard III always say more about us than about the characters.

Thinking about that answer, however, I’ve become increasingly dissatisfied with it. Weissmuller’s moderately muscled, glossy Tarzan isn’t a statement about the people who make or consume those movies, any more than the more absurdly muscled depictions of Hugh Jackman as Wolverine or Chris Hemsworth as Thor really reflect us. These characters aren’t who we, the audience, are; they’re lectures about who we, the audience, should be.

Abandoned from infancy, Tarzan grows to adulthood in an Edenic jungle politely untainted by ordinary old Black Africans. He innately understands Euro-American standards of personal grooming, fitness, and hygiene, which travel hand-in-glove with his instinctive ability to fashion tools and shelter. His ability to command animals is interesting, but incidental. His real accomplishment is bending the “wilderness” to suit his distinctly industrial-era demands.

Elizabeth Taylor as Cleopatra

Tarzan is the perfect colonial agent. He shapes nature to his expectations, but he also, on first encountering White people, recognizes their superiority, and longs for assimilation. Sure, in the movies he always returns to Africa, because if he ever permanently leaves, the franchise ends and RKO loses money. But once there, he consistently aids White imperialists and never once sullies himself with boring old Africans.

(I know, the movie with Alexander Skarsgård attempted to subvert this and make Tarzan more inclusive. That movie also tanked. All the perfumes of Arabia can’t wash the stink of colonialism off the franchise.)

Taylor’s Cleopatra tells a very different story. Released as the post-WWII generation hit adulthood, with the industrial excesses and pop-culture liberation that 1963 entailed, Cleopatra was no less a moralistic lecture. Surrounded by riches, adoration, and power, Cleopatra represented postwar American splendor. But she also represented deep distrust of powerful women. The movie repeatedly moralizes about how destructive imperial power becomes in feminine hands.

In 1963, women like Wanda Jackson, Lesley Gore, and even Elizabeth Taylor herself stopped accepting men’s shit. They demanded autonomy, which they weren’t always willing to state as explicitly sexual, though Taylor, at 31, was already on her fourth marriage. Meanwhile, Cleopatra hit cinemas the same year Betty Friedan’s The Feminine Mystique dropped, pushing second-wave feminism into America’s mainstream. This wasn’t coincidental.

Cleopatra is presented as commanding, imperial, regal, but also doomed. The movie depicts her openly consuming male adoration, setting her own sexual terms, and demanding recognition. But we, the audience, know she’s already doomed. She’s going to embrace the wrong war, backed by the wrong allies, and will eventually choose suicide to avoid the ignominy of capture. We already know this, and implicitly, so does she.

Both movies arise from cultural contexts. Tarzan appeared, first in Burroughs’ pulp novels, then onscreen, across the decades when European empires in Africa and India slid toward collapse and America was establishing colonies in the Asian Pacific. Tarzan’s African jungle mapped easily onto the Philippine rainforests where American soldiers served. Cleopatra subsequently emerged as women began challenging a male-dominated social order.

So no, I realize, these characters don’t really reflect us. Rather, they establish moralistic models for how we should or shouldn’t behave. Tarzan bespeaks the values of White empire, while Cleopatra warns about the perils of female ambition. Both characters serve a White male power hierarchy. One buys in, and is rewarded; the other rebels, and is punished. They aren’t us; they’re who Hollywood’s elite wants us to be.