Monday, November 30, 2020

Hellfire and Damnation (the Lite Version)

Garth Ennis (writer) and Steve Dillon (artist), Preacher: Book One

Reverend Jesse Custer shepherds a small West Texas congregation, but one gets the impression he doesn’t have much faith. One Sunday, he mounts the pulpit, still hung over from a rage-fueled Saturday bender at the local tavern, when a massive fireball surges up the aisle and into his soul. When he regains consciousness, Reverend Custer can speak with the voice of angels. But he still doesn’t know what to say.

This graphic novel, a reprint of the first twelve issues of the monthly comic by writer Garth Ennis and principal artist Steve Dillon, comes with a reputation among comics fans. Sadly, I just don’t see it. Ennis and Dillon supposedly ask important questions about what words like “God” and “salvation” mean in a world where Christianity seems increasingly tangential. But this questioning never gets beyond a Goth-ish middle-grade level.

Poor Reverend “Just Call Me Jesse” Custer’s quest begins with an important discovery. The being that possesses him is a runaway spirit, with powers so vast and ambiguous that it threatens God’s very dominion. An archangel informs Jesse and his compatriots that God has fled this spirit in terror; the throne of eternal verity sits unoccupied. Only Jesse and his friends have power enough to put this situation right.

Unfortunately, not everybody wants God restored to glory. Before he’s even gotten all his facts organized, Jesse finds powers, both human and transcendent, arrayed against him with drawn weapons and nihilistic arguments. Apparently, in a world wracked with division and pain, some people would rather embrace eternal nothingness than face judgement from God. Who, after all, created the nonsense we currently suffer through?

Watching Jesse and his allies, Tulip the assassin and Cassidy the vampire, confront their existential quest, I got the impression that writer Ennis, an atheist from Ireland, thinks he’s the first unbeliever to postulate these questions. He clearly has no conception of theodicy, the centuries-old struggle to reconcile a loving God with a world full of evil and suffering. He’s hardly the first unbeliever I’ve met who thinks nobody ever, ever faced doubt before.

This lack of familiarity with Christian history comes across in how artist Dillon depicts Jesse. When he preaches, he wears a collarless pastel suit, reminiscent of disgraced 1980s televangelist Jimmy Swaggart. After the runaway spirit, code-named Genesis, immolates that suit, Dillon re-clothes him in a cowboy shirt with silver collar points and a bolo tie. These creators, neither of them American, evidently tie Christianity together with Southern American cultural excess.

Promo art for Preacher

The first half of this volume, collecting the first six issues of the comic, is set in Texas and mostly involves exposition. Our protagonists get to know one another, while piecing together the circumstances which made God go missing. Meanwhile, a literally unstoppable foe emerges, dressed like a villain in a Sergio Leone B-movie. The Saint of Killers has only one objective: stop Jesse’s gang at any cost.

By the second half, with the throat-clearing finished, our protagonists actually commence their quest for the missing God. This story couples our chicken-fried heroes with a parody of 1990s Manhattan crime dramas, including a character who helpfully narrates his story in voice-over captions. Reading along, it becomes increasingly clear our creators only know America from prime-time network TV.

Sometimes I enjoy media constructed from scraps of previous pop culture; other times I despise it. The difference generally boils down to one question: does the artist appear to be having any damn fun? In this case, I respond with “meh.” Like, our creators apparently enjoy what they’re creating, but not enough to conceal their unfamiliarity with their topic. It’s not fun enough to sweep me past their glaring flaws.

British anthropologist (and adult convert to Catholicism) E.E. Evans-Pritchard wrote, in his 1965 book Theories of Primitive Religion, that the discipline of comparative religion suffered because too many theorists had no faith. Because they couldn’t comprehend the experience of believing in something, their theories reflected their prejudices, not facts. Evans-Pritchard didn’t prescribe any specific religion, but suggested that faith, as an experience, is necessary to the study of others’ religions.

That, I fear, describes my experience reading this book. Ennis and Dillon hold religion in undisguised contempt. Therefore they don’t realize the questions they raise are centuries old, or that their characters are little more complex than paper dolls. They just hold the characters, and their faith, up to mockery and derision, and think they’ve created a story. They interject moments of fun and complexity, but largely, they address religion like petulant children intolerant of doubt.

Saturday, November 28, 2020

Deep Canyon Blues and the 12-String Revolution

L-R: Regina Spektor, Jakob Dylan, Beck, and Cat Power
rehearse their arrangement in Echo in the Canyon

Andrew Slater (director), Echo in the Canyon

In 1965, as America struggled to handle the British Invasion, an answering sound emerged from the margins of Los Angeles. Though the Beach Boys pioneered the West Coast Sound, the real movement began when Roger McGuinn, a disaffected folkie, quit Greenwich Village and moved west. Back then, young artists could still afford to starve in La-La Land. Many of them found their way to the same place, an incubator of ambition and innovation: Laurel Canyon.

Fifty years later, several Los Angeles recording professionals organized a concert and accompanying cover album to memorialize Laurel Canyon’s impact on musical history. This documentary, with son-of-the-times Jakob Dylan interviewing several of Laurel Canyon’s surviving veterans, lingered in post-production purgatory for years, but it creates an atmosphere to help audiences, jaded on a half-century of intervening history, understand just how momentous these few years really were.

It’s hard to define the Laurel Canyon sound. It was characterized, in part, by complex, layered arrangements, poetic lyrics sometimes derived directly from East Coast folk music, and dense multi-part vocal harmonies. This documentary quickly chucks any attempt to define the sound, preferring instead to identify its most influential proponents. There are fleeting references to the Monkees, Joni Mitchell, and Frank Zappa, but the greatest screen time goes to four acts: the Byrds, the Mamas & the Papas, the Beach Boys, and Buffalo Springfield.

Young Dylan interviews a cast of thousands to reconstruct the culture and climate of 1965. Luminaries like Roger McGuinn and David Crosby of the Byrds, Stephen Stills of Buffalo Springfield, the Beach Boys’ Brian Wilson, and Michelle Phillips of the Mamas & the Papas recount what they brought into the Canyon scene. Eric Clapton, John Sebastian, Graham Nash, and Ringo Starr recount what they brought out of it. Their recollections are hectic and sometimes contradictory, but brimming with classic rock spirit.

It’s important to note that nobody onscreen purports to reveal the only true account. David Crosby admits his youthful arrogance made him pugnacious, and he frequently didn’t get along with other Canyon artists. McGuinn and Stills are remarkably forthcoming about the quantity of drugs they consumed, something they were cagier about previously. Phillips describes how band members frequently became so isolated from the outside world that their perceptions became distorted, their memories unreliable.

Between these interviews, we get music. Vintage TV performances and rare studio footage depict original artists performing their most important tracks. But the emphasis lies on younger artists recreating the music. Jakob Dylan is joined, alternately in studio and on the Orpheum Theatre stage, by luminaries like Regina Spektor, Cat Power, Fiona Apple, and Beck. These recordings and concert performances don’t just mimic the classics; they recreate how new, dangerous, and exciting the hits felt.

The Echo in the Canyon house band performs before footage of the Byrds

Taken together, the mingling of interviews and concert performance resembles Scorsese’s The Last Waltz. Like Robbie Robertson in that documentary, these vintage artists are desperate to curate how survivors remember their influence after they go. (The gushing fan tributes include, heart-wrenchingly, Tom Petty’s last onscreen interview. He visibly has trouble moving.) And like The Band’s concert segments, these artists don’t let age stand between them and providing the most muscular performances possible.

Upon release, this documentary received warm reviews, and enjoys an overwhelmingly positive Rotten Tomatoes score. Within months, though, critics began reassessing their opinions. Some began fault-finding, criticizing Jakob Dylan for not exercising more journalistic rigor in his interviews. Others complained about the Laurel Canyon artists omitted from the roster: Joni Mitchell gets a single fleeting mention, while Jim Morrison, Linda Ronstadt, and the Nitty Gritty Dirt Band never get mentioned at all.

These nitpicks, however, are pretty unfair. Considering that the documentary was organized to support the concert, the included bands were probably just the ones from whom they could get performance rights. Track selection probably also reflects who they could secure for on-camera interviews: the surviving Doors are notoriously media-shy, and the Dirt Band seldom plays their early Laurel Canyon songs anymore. What some critics see as “a missed opportunity” probably stems from simple logistical limits.

My opinion, though, is biased. These bands, particularly the Byrds and Buffalo Springfield, are the acts who sprang me from a conservative youth dominated by slick, chart-friendly music. Even though these bands dissolved before I was born, their lyrical and instrumental complexity opened new vistas for me. Then, like everybody else, I got accustomed to them and forgot. This documentary doesn’t just recount what the original acts sounded like. It reminds me how innovative, even revolutionary, their music once sounded.

Monday, November 23, 2020

Is Democracy a Moral Good?

Joe Biden

President-elect Joe Biden snagged a narrow majority in this month’s elections, but what’s that worth? As the lame-duck president’s continuing legal challenges keep the transition to another administration in paperwork limbo, I start hearing chants from the left, demanding that electors honor the people’s will. Give us democracy, they demand. The people voted for Joe Biden, and Biden should become President!

My distrust of “the will of the people” comes from experience. In 1992, the first year I could legally vote, I lived in a Nebraska town so far west, we were more culturally beholden to Denver than Lincoln or Omaha. For readers too young to remember 1992 politics, that was the year Colorado passed Amendment 2, a voter-backed initiative that wrote a clause into the state constitution declaring that sexual orientation would not be a protected status for civil rights purposes.

Back before the Internet and social media carefully pre-screened everyone’s news sources, I got my world and national news in 1992 mainly from the Rocky Mountain News, a now-defunct paper. Therefore I had opportunities to witness this debate unfold in real time, but no opportunity to participate. This distant, almost god-like perspective has influenced my opinions of what constitutes “democracy” and “justice” ever since.

The arguments regarding Amendment 2 broke down into two camps, who couldn’t resolve their differences because they weren’t speaking compatible languages. Defenders of the amendment claimed to represent democracy. The initiative passed, they repeatedly reminded the public, by six points, wide enough to be considered a “mandate.” Why, they demanded to know, can’t Colorado just honor the will of the people?

Opponents didn’t engage this argument. Rather, they insisted that the amendment was simply wrong on constitutional grounds, holding that it contravened the 14th Amendment of the U.S. Constitution, which guarantees equal protection under the law to all persons. Amendment 2, they admitted, didn’t make anti-gay discrimination mandatory; it simply held that legal protections didn’t apply to LGBT persons. And that was wrong.

Thankfully, unable to vote in Colorado, I could watch the debate unfold with cool detachment, because I genuinely couldn’t reconcile the two arguments. Raised in a conservative household, where the political language of the Cold War remained hotly reactive, I believed in America’s two great guiding principles, Democracy and Justice. And the two seemed irreconcilably opposed in Colorado from 1992 to 1995. Until somebody said something that broke the tie.

“The will of the people in George Wallace’s Alabama was for segregation, too.”

A 1993 protest regarding Amendment 2 (source)

This statement, in an unsigned Letter to the Editor in the Rocky Mountain News, flipped a switch for me. I realized, in ways I hadn’t before, that Democracy is only a second-tier moral good. The will of the people only matters when it promotes justice. Sadly, history—and not just American history—teems with evidence that democracy doesn’t necessarily honor justice. The people’s will can bend toward iniquity.

Conservative writers love observing that the word “democracy” doesn’t appear in the Constitution. That’s because America’s Founders distrusted democracy, which, as classically educated aristocrats, they associated with ancient Greek city-states. Early Greek democracies were little better than well-organized street gangs, ruled by whimsy. That, I realized, had happened in Wallace’s Alabama, and in 1992 Colorado.

Amendment 2 became a benchmark in LGBT rights, producing the movement’s first major victory at the U.S. Supreme Court: Romer v. Evans, decided in 1996. In an unusual 6-3 ruling, the court sided with the 14th Amendment argument, holding that the popular will comes second. The people cannot vote for discrimination, bias, or exclusion; justice, to exist, must include all Americans. Though courts and legislatures have wavered in practice, this principle remains definitively American.

Watching the current debacle, I see the sides have reversed. Progressives, eager to dispatch the lame duck, shout: “We won, it’s over, move on.” Conservatives want to prolong the debate with claims, mostly debunked, of widespread injustice. The ramifications of this reversal are weird and bleak, with conservatives promising to uphold justice by demolishing democracy, which hardly seems a reasonable compromise. But still.

The unintended consequence here is that Democrats are relinquishing the argument from justice, while Republicans redefine “justice” by keeping specious claims alive long enough to seem reasonable. Both sides are making themselves weaker by failing to engage the debate. Those who already fundamentally agree with one side have no reason to change their minds, because the other side keeps talking past them.

We teach children daily that America is a country of “liberty and justice for all.” It’s time we start acting like one.

Friday, November 20, 2020

Innocence, Experience, and Justice on Netflix

The squadron of pilots from Netflix’s Voltron

Did you ever wonder how the overeducated wanderer’s mind reacts to isolation? As my second week of COVID-induced quarantine drags on, I’m ashamed to confess the quantity of Netflix I’ve consumed. But two titles stand out: Voltron: Legendary Defender, the reboot of the 1980s Americanized anime, and Longmire, based on Wyoming novelist Craig Johnson’s mystery series. They represent two different interpretations of the classic generational conflict.

Voltron starts by loosely retreading the 1980s story: five human teenagers encounter an ancient world dominated by dark magic and arcane technology. The supreme melding of the two is Voltron, a creature half robot, half demigod, controlled by its human pilots. Voltron stands between a just universe, balanced and beneficent in nature, and hungry, conquering evil, embodied in Emperor Zarkon. This Manichaean conflict evokes the battle between natural balance and original sin.

In the season one finale, however, an unexpected plot wrinkle emerges: Zarkon, who has conquered unceasingly for ten millennia, was once a Voltron pilot himself. His conquests have simply displaced his desire to reclaim his status as captain of justice, a desire he’s nursed since before humans began building stone pyramids. While the wet-behind-the-ears human squadron struggles to control their beloved warrior robot, Zarkon’s familiarity with Voltron is chilling.

On Longmire, meanwhile, title character Walt Longmire has served as sheriff of Absaroka County, Wyoming, for years. He’s manifestly effective, but hidebound, a throwback to frontier justice. Sure, he drinks to numb the pain lingering from his wife’s murder, and sometimes he’ll bloody his knuckles getting confessions from swaggering cowboys; but by nightfall, he’s cornered his suspect, and law always prevails.

Walt’s deputy, Branch Connally, unexpectedly announces an election challenge, believing Longmire is past his prime. Walt and Branch are both White, rugged, and handsome, the before-and-after of cowboy justice. But Branch is clean-shaven and photogenic. Walt is rumpled and stubbly, and always looks like he maybe slept with his Stetson pulled over his eyes. Though the two men are largely interchangeable, their generational conflict culminates in a punch-up along a Wyoming gravel road.

These two shows have several important overlaps. Both believe justice exists, and humans should strive to achieve it, though evil forces would pervert it for their own ends. Both believe a judicious application of violence is the most efficient means to achieve justice. They ask similar questions about how much of themselves leaders must sacrifice for the common good. How much can lawkeepers protect themselves while still protecting others from atrocity?

The entire sheriff's department on Longmire

The Voltron Force is defined by their youth. They struggle to control their super-weapon, and one recurring theme throughout the seasons is the incremental gains they achieve in mastering themselves and their skills. Their central opponent, however, has enjoyed power for ten thousand years. In a remarkably static empire, Zarkon has grown to accept power as his due. In his mind, he’s entitled to control the universe’s greatest weapon.

Conversely, Longmire’s defining characteristic is age. He’s earned his scars, studied under previous masters, and, samurai-like, now defends the law as his domain. The show makes clear that Branch Connally is a good deputy, but one with unpaid dues. Even Branch admits Walt still has lessons to teach him, if only he weren’t too impatient to learn. Justice, in Longmire’s world, comes from experience and seniority, not youthful ideals.

What’s remarkable is the respective attitudes to change. Voltron’s universe has evolved little over millennia, until young humans, willing to learn, disrupt the ossifying empire. Change, for Voltron, is necessary to unseat fat, entitled evil. For Longmire, change is decline. “I remember when I could count the number of murders in this county on one hand,” Walt growls into his beer, as the body count accumulates. “Two at most.”

In brief, Voltron believes justice comes from uprooting the fat, decrepit entitlements of age. Longmire believes justice comes from preserving the lessons taught by experience. Neither position is absolute: the Voltron Force has older mentors beside them, and takes frequent opportunities to learn from aliens they encounter. Longmire, likewise, requires his upstart deputies’ frequent challenges to maintain his competitive edge. But each has specific attitudes toward youth and age.

These different shows have different audiences, of course, and remind their respective target demographics that society values their contributions. Neither youthful idealism, nor hard-won seniority, is enough to bring justice by themselves. Both also charge their audiences to remember that justice exists, but isn’t passive. We must never forget that seniority can cause corruption, and youth can cause chaos. It’s important to keep up the fight, at any age.

Wednesday, November 18, 2020

Vaccines, Economics, and Individualism


“Imagine living in a world so spoiled by lack of serious illness,” the message board statement read, “that we forget the benefits of vaccines and start to reject the same vaccines that brought us to this point.” If anyone asks, I tell them I avoid anonymous Chan-style discussion boards, because they encourage ignorance, self-righteousness, and boorish behavior. But I keep coming back, because sometimes, like here, somebody says something so right, it deserves further amplification.

As I approach two weeks without face-to-face contact with another human being, COVID-induced cabin fever reducing me to thoroughly unproductive jelly, this message seemed especially pointed. We Americans, alongside Britain, India, and a few other highly populous countries, continue struggling with the balance between liberty and health. What form of unfairness do we, collectively, consider acceptable? Which moral precept do we value higher? And to what extent do we even exist as separate individuals?

These questions don’t matter merely in the abstract. They also matter because, as the original comment observed, we didn’t achieve the level of widespread health which humanity now enjoys without many sacrifices along the way. The discovery of vaccines tamed once-virulent diseases like polio, smallpox, and even potentially lethal tetanus. Penicillin became the first of several wonder drugs. Even handwashing, once deeply controversial, became a lifesaver. Living into old age became commonplace.

Now a noisy minority, deaf to history’s lessons, is determinedly kicking down the ladder their ancestors painstakingly ascended. Vaccine denial is just one component; science denial generally has become a hallmark of wealthy modernity. Widespread anecdotal reports of patients denying COVID-19 exists, even as they’re dying of it, are only the latest manifestation. Faced with concrete, painful evidence that the way we’ve always done things doesn’t work anymore, some people just deny.

What worries me most, though, is that this problem isn’t unique to science. Reading the pro-vax comment which got me thinking in this direction, I realized I’d seen the argument elsewhere, almost verbatim: Gar Alperovitz and Chuck Collins make the same case regarding economics. Post-WWII, Americans enjoyed a period of unmatched prosperity and security, made possible by government subsidies and public-private cooperation. Then we destroyed the network, convinced it provided others with an “unfair advantage.”

Admittedly, this prosperity and security wasn’t universal. The New Deal and its Eisenhower-era successors had racism baked into their structure, and the benefits accrued mainly to White Americans. This applied to health equally: Black and Latinx Americans have always had more limited access to preventative medicine, like vaccines, and are more likely to live downwind from smoke-belching factories than White Americans. But in the aggregate, Americans enjoyed government-backed prosperity, right up until we destroyed it.

COVID-19 provides a microcosm of why we seemingly can’t solve persistent problems in America. Just as a lead smelter cannot discharge waste into the air and water and assert that “it’s my property,” I cannot breathe into the shared air and claim “it’s my lungs.” It clearly isn’t mine alone. Yet this belief, which permeates American politics and economics, that we’re all disconnected individuals drifting along, separated from others, is disproved by a contagious disease.

Certainly there are areas where we exist as separate beings. If I enjoy science fiction, and you enjoy romance novels, let’s just read our respective books privately and not bother one another. But that doesn’t apply to circumstances that clearly belong to the community. Our society’s wealthy and powerful, who can afford to purchase legislators at fire-sale prices, have looked at centuries of collective stewardship of shared resources, and decided: nah. Kick the ladder down.

Humanity has achieved widespread levels of health and prosperity when we’ve acknowledged our shared responsibilities. When we admit I have a responsibility to get vaccinated against smallpox, not only so I don’t contract it, but also so I don’t transmit it to others. When I accept an obligation to steward my land, not just because it’s mine, but because the invasive species which flourish here spread seeds elsewhere. When ownership derives from trust, not arrogance.

Some people look at their health, prosperity, and comfort, and think: I did this. Certainly, these individuals contributed to it, since heedless people tend to squander money and get sick from needless risks. But to believe you, individually, created your comfort, without any contribution from those around you, or before you, shows profound ignorance of history. If we inherit the dividends of others’ efforts, we inherit the responsibility to keep fighting their good fight onward.

Friday, November 13, 2020

The Church of Zealous Americanism

Egyptian icon of Akhenaten as priest of the sun god

I had an interesting discussion recently with a pastor, who also teaches comparative religion as a sideline. Someone made a flippant comment about the ways recent politics have turned into cults of personality. As we watch, many American conservatives openly refuse to believe reported election outcomes unless they support the incumbent. Meanwhile, I personally watched friends walk away from the Democratic Party because it wouldn’t provide their perfect candidate.

This principle, that the candidate is, or anyway should be, a perfect individual embodying the people’s values, perhaps makes sense, to a degree. We want elected officials, and the Executive in particular, to take point in transforming our moral precepts into law. However, when the individual becomes so important that we’ll relinquish democracy to support the individual, this becomes less like politics and more like religion, even a cult.

My pastor friend says this isn’t as unprecedented as Americans might perceive it. She cites ancient societies, like pharaonic Egypt, which believed their kings were literal gods in human flesh. In medieval times, God or the gods remained more distant from government, but kings still believed they received their authority from God. When (and if) Prince Charles eventually inherits the British throne, he’ll receive the crown from the Archbishop of Canterbury.

Meanwhile, contemporary one-party states create state personality cults around their Dear Leaders. Many scholars have spilled much ink over the ways North Korea’s government has created religious devotion to the Kim dynasty. Even when governments are nominally altogether secular, they often engage in religious posturing. The Soviet Union engaged in organized iconoclasm which distinctly resembled that of the German Peasants’ War of 1524.

So on one level, the cult of personality which drives much American politics today isn’t unprecedented. But it feels icky in a society which is officially secular. On paper, American political authority derives from the people, and the government exists to enact (within limits) the people’s will. Absolute, and frequently violent, dedication to an individual seems more consistent with Charles Manson or David Koresh than a term-limited president.

Nevertheless, as I’ve written before, American presidential politics since 2000 has been, explicitly or implicitly, a contest between duelling messiahs. Americans believe a President from their political party will save America from whatever moral depredations the previous President inflicted upon us. Electing the correct president will, through means vague and frequently undescribed, make us good. The definition of goodness is, unfortunately, not fixed.

Try not to cringe

Thinking about it, I realized this isn’t innovative. The White-washed version of American history I received in public school frequently had a religious foundation. The First Thanksgiving roughly corresponds with “In the Beginning,” with the Constitutional Convention of 1787 matching Moses bringing the Tablets down Mount Sinai. Waiting for the messianic presidency is merely a culmination of prophecy taught in American secular religion.

Nor is this religion entirely unfair. Any government needs underlying principles; otherwise it becomes a tool for powerful people to enrich themselves at others’ expense—as we’ve seen whenever we’ve believed contingencies required us to pause certain principles. Any government, even a democratically elected one, cannot practice justice without some prior agreement of what justice actually is.

Reverend Jim Wallis, who criticizes American politics from the Christian Left, frequently cites Proverbs 29:18—“Where there is no vision, the people perish.” Reverend Wallis believes America, and any nation really, needs unifying national moral vision, in order to remain together. That vision needn’t necessarily be religious, though it often has religious origins. (Professor Kevin M. Kruse would argue, further, that this origin is more PR than religion anyway.)

Therefore, presidential elections become a referendum on national morals. Which party’s values better represent America overall? The President doesn’t just enforce legislation and command the military; he becomes invested with our moral aspirations. When the President’s morality strays, from Clinton lying about getting a BJ, to Trump imprisoning children in a disused Walmart, these actions aren’t personal; they’re a moral judgement upon the nation.

I’ve written before that a religion requires certain traits. Capital-T Truth must exist, and that Truth must be revealed by a prophet or messiah who is, also, conveniently dead and therefore immune from cross-examination. But I’ll add another stipulation: a religion needs a future. We must expect the Truth will return, whether to judge us in the afterlife or to restore justice in this world. But it must never arrive.

Thus, we’re witnessing America transition to another temporary messiah. We’ll adjust our morals appropriately, fight over where our future lies, and do it all again.

Wednesday, November 11, 2020

Why Capitalism Requires Christianity (or Something Like It)

Martin Luther

When I entered the job market thirty years ago, my parents gave me advice which I’ve attempted to live by since then: don’t, they said, apply for the manager’s job first. Nobody starts at the top of the ladder. Take time, pay your dues, and the organization will recognize your contributions and reward you appropriately. I believed them then and, despite thirty years’ evidence that it doesn’t work that way, I still heed that warning.

Recently, reading religious history, I encountered important parallels with contemporary economics. The medieval Roman church insisted that salvation was essentially catechistic: that is, one achieved salvation through right thinking, which one could achieve through study. Salvation was a classroom exercise. The two great Northern European reformers, Luther and Calvin, disagreed. They saw salvation as a matter of right faith, which one achieved only directly from God. Their salvation, though more scripturally sound, was much lonelier.

Smarter scholars than me, particularly Max Weber, tie the Protestant Reformation, and the resulting Catholic Counter-Reformation, to nascent capitalism. Just as salvation became a purely private matter, Weber writes, so did disposition of money, land, and other formerly shared resources. The checklist to salvation, like the checklist to prosperity, became arcane and murky, hidden from the masses, visible only to those able to see truth. We hope to go to heaven, without knowing what heaven is.

In capitalism, those of us who internalized the economic precepts young can cite the cardinal virtues: hard work, loyalty, perseverance, self-sacrifice. These, we’re told, are keys to worldly success. Yet though I’ve long internalized these ethics, I need only witness the world around me to see that isn’t true. Our economy rewards financiers, actuaries, and attorneys, while chronically underpaying nurses, teachers, and construction workers. I know this because I have eyes; yet I can’t shake the internalized belief.

We’ve learned, from infancy, that hard work produces reward. And this isn’t entirely wrong, as hard work is necessary to reward; workshy layabouts will spend their inheritance quickly and finish with nothing. But hard work isn’t sufficient for reward; in British writer George Monbiot’s much-cited words, “If hard work were all it took to get rich, every woman in Africa would be a millionaire.” So clearly, success in capitalism requires hard work plus something else.

Exactly what that missing ingredient is causes much controversy. When somebody comes from nowhere (or nominally nowhere) to achieve runaway success, as Jeff Bezos did, friends and foes alike will scrutinize his history for causes of success. Bezos’ supporters will assert that he created a platform marketplace, fine-tuned it for customer needs, and outperformed his competition. His critics will counter that he came from wealth, that his parents loaned him seed money, and that he underpays his workers.

John Calvin

For ecclesiastical purposes, however, exact one-for-one explanations don’t matter. John Calvin noted that not everyone who believes right and does right will necessarily reach Heaven when they die. Some people will just succeed, for reasons known only to God; Calvin called these people “the Elect.” We cannot know, throughout this life, whether we’re among the Elect. One day we’ll simply open our spiritual eyes and either see God, or not, and that’s all that matters.

Jeff Bezos, Mark Zuckerberg, and other massively successful capitalists didn’t achieve capitalist salvation because of their earthly merits; they were simply among the Elect. Why did Facebook flourish and MySpace founder? Why did VHS beat Betamax? And why do some hard workers become managers and entrepreneurs, and others die in the salt mines? Scholars retroactively construct deterministic explanations, but fundamentally, the reasons don’t matter. The capitalist god, which is the Market, simply chooses the Elect.

Recently, over dinner, my parents turned to me and, with tones of disbelief, commented that the idea of merit-based pay raises and promotions seems like an ideal of the past. Why, they said, it’s almost like hard work doesn’t produce worldly reward! I swallowed my initial “told-ya-so” response and talked them through their reasoning. In short, they had finally realized, in their seventies, that hardworking people of good character were making others rich and gleaning no reward.

But that’s exactly what capitalism requires. The system cannot work if people don’t believe they might get to Heaven. Like Calvinists, we simply accept that we cannot see the evidence of reward during this life, because this life is sinful. We’ll know our efforts have been rewarded after the rewards arrive: after wealth, honor, and responsibility accrue to us. And if they don’t, well, that’s how we’ll know that we were never among the Elect.

Wednesday, November 4, 2020

The Horror That Originates Inside

Ensemble cast photo from The Haunting of Hill House

The third episode of Netflix’s The Haunting of Hill House features a subplot involving sick kittens. Shirley, eldest daughter of Hill House’s resident Crain family, finds an abandoned litter, so small and frail that their eyes haven’t opened yet, and takes them in. But these kittens were dying before Shirley found them, and one by one, they die in her care. The series connects this childhood trauma with Shirley’s adult occupation as a mortician.

I struggled to watch this plot unfold. Mere months earlier, I had similarly found an abandoned litter at work, and struggled to feed and nurture them, without success. Fortunately, when I failed, I reached my local PD’s animal control office, which whisked my kittens to a foster home. So Shirley’s doomed efforts touched me directly, and I almost couldn’t watch the story, because I anticipated every worst-case scenario, just as I had back in April.

I’m glad I watched it, though. In the final analysis, the show’s dominant theme proves to be the consequences which the Crain family endures for refusing to face trauma directly. The characters personify different forms of denial, evasion, repression, and pig-headed stubbornness. They inflate the possible awfulness of the situation until it becomes epic in their minds, far out of proportion to the actual facts.

Freddy Krueger
My parents didn’t let me watch horror movies as a child. While my peers reveled in the excesses of Freddy and Jason, and brought their favorite slasher knickknacks to school, my parents believed I was too sensitive for such fare. They probably weren’t altogether wrong, as I still take minor slights way too personally, and tend to catastrophize every tiddling setback. I couldn't understand that, though; I only knew my peers shared an experience which remained off-limits to me.

But just because I didn’t watch horror movies doesn’t mean I didn’t internalize their implications. During the great slasher film boom of the 1980s, images of soulless bloodthirsty monsters permeated pop culture. My brain combined these floating images, and children’s common fears of abandonment, hopelessness, and being ignored, with the ruthless efficiency of a Cuisinart. My childhood nightmares were beauties.

This pattern continues into adulthood. My fears have adapted to reality, and Freddy Krueger doesn’t stalk my dreams anymore. Instead, I have nightmares of failing in life, in careers and relationships, and getting sent back to places where I previously lived and suffered through loneliness. I fear human rejection, and having to spend the rest of my life alone with myself.

Believe me, that seems like a pretty horrific life sentence.

Now as then, though, my real fear stems not from reality, but from anticipation. My brain is capable of conjuring terrible consequences out of stray moments and minor images. Poor Shirley and her kittens faced something awful, certainly. Every child’s first hands-on encounter with death is traumatic. But if I hadn’t watched it directly, my brain would’ve inserted something far worse into the story. It was already trying to.

Shirley’s kittens reminded me of a seemingly tangential parallel: Robin Thicke’s stanky number-one hit “Blurred Lines.” When I learned YouTube had banned Thicke’s video for obscenity and violating community standards, I immediately started imagining how awful it must be. My imagination got so florid that I had to watch, because it had already planted a seed in my mind. Imagine my relief to discover the video was merely lewd, and not an enactment of the depths of human depravity.

Jason Voorhees
Since I started watching horror films recently, I’ve realized the real horror stems, not from what happens, but from what could happen. We, the audience, anticipate the myriad disasters that could befall the characters, filling in the blanks when the camera lingers over an ominously empty graveyard or locked door. When the worst finally occurs, it’s actually a relief, because it’s seldom as bad as it could’ve been.

At this late date, it’s foolish to second-guess my parents’ child-rearing decisions. But I wonder if my life might’ve unfolded differently if I’d watched horror movies when my peers did. My adulthood has often been characterized by me inventing the worst possible outcome if I applied for my dream job, pressed for some reward I’d rightfully earned, or asked the girl I liked for a date.

Might my life have turned out differently if I’d learned earlier that my worst possible imaginings seldom came to pass? It’s probably too late to know. But it’s at least a possibility. And yes, I’m definitely middle-aged now, but perhaps it’s not too late to learn.

Monday, November 2, 2020

Sherlock Holmes and the Contemporary Victorian Mess

Millie Bobby Brown (center) as Enola Holmes, with Henry Cavill (left)
as Sherlock, and Sam Claflin as Mycroft

Enola Holmes, the latest confection rush-released from surprise content factory Netflix, is a surprisingly good Holmes movie. I say “surprisingly,” because amid the recent deluge of Holmesiana, it’s pretty difficult to say anything particularly new or innovative. Like King Arthur or Robin Hood, whose most recent onscreen adventures landed with a distinct thud, Sherlock Holmes has been significantly overexposed recently. Yet somehow, he remains new and relevant.

So why is a Victorian character, whose usual narrative arc is so predictable that his own creator grew to despise him, so durable? Sherlock Holmes has appeared onscreen more, supposedly, than any other character: more than Dracula or Miss Marple, more by some estimates than Jesus. He evolves to suit the times; I question whether any but the most dedicated fans have actually read Conan Doyle’s notoriously turgid Holmes adventures.

Robert Downey, Jr., in Guy
Ritchie's Sherlock Holmes
When Guy Ritchie’s reinvention of the character, entitled simply Sherlock Holmes, dropped in 2009, I initially refused to watch it. I’d recently seen two attempts to create a Holmes more contemporary and relevant for 21st-century audiences. These included a TV film in which Vincent D’Onofrio got top billing for playing Moriarty, the camera lingered over Holmes having drunken sex with prostitutes, and Watson performed an autopsy live onscreen. It was pretty bad.

For contemporary audiences, the Holmesian appeal lies partly in the distant setting. Victorian London seems well removed, and we’ve been suffused with its images of rococo splendor. Conan Doyle wrote, after all, during the decades when the British Empire enjoyed (if that’s the word) its greatest success, as measured by wealth and plunder. He couldn’t know that, within a generation, World War I would begin the empire’s undoing.

Yet despite the supposed Victorian wealth and comfort, that London was thoroughly rotten. Holmes often wandered into East End flophouses, opium dens, and other scenes of what Victorians would’ve considered moral degradation. Conan Doyle didn’t signpost this class struggle, mostly because he didn’t need to. Like his contemporary, Oscar Wilde, he used coded language that seems opaque to modern readers. But his Victorian audience knew exactly what he meant.

The first screen adaptation I remember engaging seriously with class and poverty was the legendary Granada TV version starring Jeremy Brett. Launching in 1984, it never commented directly on Victorian poverty, but it included many street scenes with Holmes and Watson walking through mud, past street vendors selling live chickens and rabbits. I remember one scene where, before entering a building, Watson paused to pick carriage-horse shit off his brogues.

While the TV film with D’Onofrio that I hated attempted to shock the audience, it didn’t linger over the Victorian division between wealth and poverty, between White English middle-class values and the supposedly morally degraded immigrants and sailors living in the East End. It failed to acknowledge what Victorian London had in common with today. The moralistic justification of English imperial wealth in the 1880s sounds painfully familiar in 2020.

Recent successful Holmes adaptations have taken one of two tracks. Some have embraced the shocking poverty of Victorian England: Enola Holmes wanders into Limehouse pursuing clues, where a hired thug repeatedly pushes her into the dung-filled streets. Ritchie’s Sherlock flees from machine guns that prefigure the trauma of two world wars. Brett’s Sherlock never commented upon class divisions, but nevertheless visibly lived among them.

Benedict Cumberbatch in Sherlock
Other adaptations move Holmes into a dreamland. Both Moffat and Gatiss’ Sherlock, and its American contemporary, Elementary, are set in 2010s cities, London and Manhattan respectively. But both are ceremoniously scrubbed, tourist brochure-friendly versions of those cities, with sleek architecture, merrily jostling streets, and almost no filth. Sure, people get murdered, but not mugged or even much discomforted; the cities are remarkably anodyne.

These adaptations could also learn another lesson from Conan Doyle: knowing when to stop. The author’s later Holmes stories, a crinkum-crankum mess of spiritless finger exercises, reflect how much Conan Doyle hated his cash cow. Similarly, the fourth season of Sherlock was jeered so badly that there’ll probably be no fifth, while the seventh and final season of Elementary ran as a summer replacement and disappeared quietly.

Basically, Holmes remains relevant because he provides succinct commentary upon today’s world, while remaining notably apart from it. The veneer of escapism lets us examine today’s injustices at enough of a remove that we don’t get emotionally agitated by them. Like Holmes himself, we’re able to keep our cool when confronted by manifest evil around us. As Victorian as he is, Holmes and his stories are ultimately still about us.