Monday, June 29, 2020

Fear of a Disabled President


President Trump has evident difficulty drinking water from a glass. Everyone from late-night comedians to basic cable pundits to America’s social media clusterfuck has reminded us that the President occasionally struggles to drink gracefully. His intermittent need to double-fist a drinking glass has become, alongside his famous struggle to descend a ramp, a metaphor among the President’s opponents for his struggles to execute his constitutional duties in office.

One year ago, I probably would’ve joined this pile-on. I would’ve participated when left-leaning media figures mocked his evident difficulty, comparing him to a toddler with a sippy cup. I would’ve jeered at the Twitter video macro that called him “the greatest president ever” when, at his June 20th Tulsa rally, he demonstratively drank water with only one hand. I would’ve laughed along when that great authority on political decorum, Fonzie, showed up the President’s water-drinking panache.

Because it really is funny, in isolation. President Trump likes mocking any perceived weakness or shortcoming in his opponents. During the 2016 campaign, his attacks on Jeb Bush and Hillary Clinton for being “low energy” and lacking “stamina” were clear metaphors for what he considered their moral weaknesses: if they couldn’t be constantly frenetic, he implied, they lacked the moral fiber to lead. The water-drinking jibes simply turn that back on him, and reversal jokes are funny.

But life circumstances have recently made me conscious of public ableism. When we use people’s physical shortcomings to belittle their moral character, I’ve realized, that redounds on us. I don’t dare get into too much detail about what’s made me conscious of this, since it’s not entirely my story. In brief, though: when people with serious physical disabilities struggle to perform common tasks while still trying to remain useful to society, how we treat them is on us.


Like most of my generation, I grew up considering disability as synonymous with paraplegia. I was among the first generation who had to internalize the reality that you mustn’t park in the designated “handicapped spot,” a term I realize is now considered offensive, but still commonly used. And the “handicapped spot” was signified by a wheelchair icon, so the lesson stuck: disabled people must necessarily have immobile or missing limbs.

Worse, if anybody uses the “handicapped spot” who doesn’t visibly, obviously need it, my generation internalized the attitude that this person is committing a moral offense. Reserving the “handicapped spot” isn’t just a positive good, it’s a moral imperative. So if somebody emerges from their car in that spot, and walks away, we feel moral outrage akin to witnessing sexual harassment or public urination: this person has transgressed the unwritten law.

Thus disability becomes a moral indicator. People who transgress our idea of what constitutes a disability have committed a sin. But what about people who have invisible disabilities? Fibromyalgia, neuromuscular disorders, connective tissue problems, and chronic pain place severe limits on people’s ability to perform routine acts. They can’t necessarily work eight consecutive hours, carry their groceries unaided, or walk across a long, hot parking lot.

Watching the President struggle to drink water, I see a man facing common musculoskeletal problems which come with age. Most of us past age thirty recognize that we can’t always perform physical tasks with the grace and equanimity we once had. The man might have a minor age-related disability which needs insignificant accommodation. We could make serious accusations against this administration; struggling with drinking isn’t one of them.

Humans, though, think in metaphor. Outsiders who watch Trump struggle with his water glass, and who, unlike me, haven’t been jolted into awareness of how society treats invisible disabilities, see an external manifestation of moral culpability. It’s difficult to grasp the years-long “kids in cages” scandal, cozying up to nuclear-armed dictators, or doubling down on systemic racism. These issues are big and decentralized. Drinking-glass difficulty is small and easy to see.

So, when small-screen pundits belittle Trump’s difficulty drinking water, they mean their insults poetically. His real scandals are so big, and hard to see, that our outrage dissipates. So we pull focus onto his physical infirmities. But in so doing, we reinforce the message that everyone facing invisible disabilities is actually concealing moral failures. We’re bolstering the idea that the disabled are just bad people.

There are multiple serious accusations to make against this administration. In pulling focus, paid pundits are trying to create a capsule argument for Trump’s moral degradation. But they’re doing it in a way that paints with an unsustainably broad brush. And whether they realize it or not, they’re creating an environment that’s going to be even more hostile to people facing regular, day-to-day difficulties. That doesn’t help anyone.

Friday, June 26, 2020

The First Church of Charles Darwin

Nick Spencer, Darwin and God

Charles Darwin studied for the priesthood in his youth. Many readers already know this, and have framed this fact to explain or dismiss the idea that Darwin entertained secret religious convictions. Nick Spencer, a British public academic specializing in the intersections of religion and politics, wondered what Darwin’s history meant in context. Fortunately, Darwin wrote extensively on the topic, leaving behind copious primary sources for future scholars.

Darwin was descended from two categories of religious dissidents. His mother’s family, the Wedgwoods, were industrialists with Unitarian beliefs, while his father harbored doubts and became a sort of sceptic, though he encouraged young Charles not to voice his questions too loudly, because they were rude. Caught between these forces, and unsuccessful in his medical studies, Darwin shrugged his way into Anglicanism, and started studying theology as a career track.

In his youth, Darwin professed “a sort of Christian” belief, a quote Spencer uses generously, because it reflects Darwin’s conflicted position. The future natural philosopher repeated the motions of early Victorian upper-class Christianity, which meant diligent study of books and public politeness, but not necessarily feeding the hungry and clothing the naked. Christianity, in the early 19th Century, was simply expected of Britain’s entrepreneurial classes.

But he never had any particular faith. His letters and private journals, which Spencer uses as his chief source, reveal that Darwin questioned whether he could undertake the ceremony of ordination with a clear conscience, since it required him to attest a strongly felt spiritual calling. Darwin was squeamish about telling such a bold lie. When the opportunity to sail on HMS Beagle as Captain FitzRoy’s confidant became available, Darwin jumped on it.

Spencer describes these early years only loosely, because Darwin left fewer notes than he would later, and Spencer appears reluctant to speculate too far beyond his primary sources. The Beagle voyage, however, opens new vistas for Spencer’s documentation. During this voyage, Darwin began keeping detailed personal notebooks, and sending lengthy letters home to England. His writings began showing increasing doubts about polite Victorian public theology.

Spencer’s remarkably brief (under 130 pages) book doesn’t spend much time on Darwin’s scientific accomplishments; he apparently believes that, if you’re reading him, you’re already aware of Darwin’s science. Rather, he shows foremost interest in how the increasing friction between Darwin’s childhood beliefs, and the evidence he collected from the surrounding world, became something he couldn’t reconcile. One or the other eventually had to go away.

Charles Darwin
Copious correspondence with his future wife, with leading British and American public intellectuals, and with British gentleman clergy, reveals Darwin’s mind brimming with questions. Not necessarily “doubts,” as that would imply moving away from deeply held moral convictions, which he never particularly had. However, he increasingly needed closure on questions about life, suffering, and death, which Christianity couldn’t provide. Especially after Annie Darwin, his favorite daughter, died.

According to Spencer, Darwin never became an atheist, unlike his most vocal adherents today. Indeed, though he saw the enforced cheerfulness of Victorian Christianity as untenable, he couldn’t entirely reject the idea that a universe as complex as ours needed a Creator somehow. He just couldn’t accept upper-class Britain’s dominant theology. He happily accepted the newly minted term “agnostic,” because he had no alternate theory to explain his tumultuous world.

Spencer describes Darwin’s move away from religion as paralleling the movements in science in the 19th Century. Previously a spare-time hobby for gentlemen, many of them clergy, science was instead becoming a discipline in its own right. This parting of ways meant that investigations of the physical world needed to emerge from a pursuit of truth, not from one’s existing beliefs. This was a transition which many Anglican churchmen resented.

Despite that tension, Darwin retained an amicable relationship, not only with scientists and academics, but with theologians. He counted many churchmen as friends, corresponded professionally with others, and used these relationships to test his still-changing religious beliefs. He had his children christened in church, and gave generously to the parish; he remained, in short, a proper Victorian gentleman. He just couldn’t believe in any kind of personal God anymore.

In Spencer’s description, Darwin’s falling-off isn’t a “crisis of faith,” because he never much had faith. He simply stopped giving intellectual assent to a particular kind of Christianity, and felt no inner moral pull to any other religion. Told this way, substantially in Darwin’s own words, Spencer makes Darwin’s struggle feel timely and relevant today. Because the old practices are falling away, and there’s nothing much to take their place.

Wednesday, June 24, 2020

The Problem With Geniuses

Jeff Bezos
With this week’s announcement that the Segway corporation will discontinue its namesake vehicle, I find myself struck by something important. Those of us old enough to remember the Segway’s launch in 2001 will remember the massive fanfare. Its backers, including Jeff Bezos, withheld what it actually did, spurring massive speculation. They told us it would revolutionize whatever it did. Then it came out, and it was… a scooter.

Capitalists, or more accurately “Capitalists,” want to ballyhoo the insights of business geniuses. We love celebrating Bezos, Steve Jobs, Mark Zuckerberg, or Sergey Brin and Larry Page, who supposedly reinvented their particular industries by pioneering something that should’ve transformed our lives. But we celebrate them retrospectively, highlighting the good they’ve already accomplished. We call them geniuses, but in so doing, we predict the past.

We even have a name for that: “Disruptive Innovation.” Coined by Harvard business professor Clayton M. Christensen, Disruptive Innovation claims that true commercial revolution, like revolution in science, completely upturns anything which came before. The geniuses who create something truly new, who coincidentally always have massive bankrolls and teams of engineers supporting them, are to capitalism what Robespierre was to democracy.

The Segway was touted as innovative, even subversive. My fellow science fiction aficionados got excited, wondering what Ray Bradbury whizz-bang we could expect. What we received, however, was slower than a car, more cumbersome than walking, harder to park than a bicycle, and almost as expensive as a down payment on a house. It took multiple functions and made them all more unpleasant. And for that, we got publicity comparable to another Star Wars film.

Steve Jobs
Contrast this brouhaha with the introduction of the iPhone. Its functions were never secret; Apple wanted to put the functionality of a computer onto a cellular phone. It seems pretty obvious now. While I admit having dragged my heels on smartphones, looking back, their success seems inevitable. It combines two activities people already do: calling their friends and dicking around on the Internet. It also made texting convenient for fat-fingered typists like me.

But notice two terms I used above: “looking back” and “seems inevitable.” This is a straightforward case of what I previously called predicting the past. We look at something which appears successful in hindsight, describe its success as obvious, and draw generalizations from the particular example. These generalizations seldom carry. As sociologist Duncan J. Watts writes, we don’t explain the success, we only describe it.

In my favorite example, American historian Jill Lepore writes that Clayton Christensen, the theorist behind Disruptive Innovation, asserted that the iPhone would tank, because it didn’t fill any existing market. We already had phones and computers, Christensen asserted; why combine them in an uncomfortable pocket brick? But by 2015, the Harvard Business Review, house magazine of Christensen’s school, called the iPhone “a sustaining innovation in the smartphone market.”

Rather than making reliable predictions from evidence, this model simply rewards the already well-rewarded. Steve Jobs, or more accurately his Apple engineering team, gambled on their slightly innovative technology, and won. Rather than calling them fortunate, business theorists call them geniuses, and assert how their success was inevitable. These engineers weren’t just industrious and lucky; they were insightful, even prophetic.

What about failed innovations, though? Consider technologies that stunk up the joint, including Google Glass, Microsoft Zune, and Solyndra. That’s saying nothing of technologies that were initially successful, but failed to stay relevant, like MySpace, BlackBerry, and AOL. Were the engineers and PR professionals behind these technologies inherently stupid? Or did they just guess wrong, where others guessed right? Duncan J. Watts and I have our theory.

Duncan J. Watts
These supposed geniuses, propped up by the multi-million-dollar rollout events which Steve Jobs and others made famous, and furthered by the hype machine that includes Bloomberg and CNBC, sell themselves as an ultimate product. If you’ll recall, when it came out that Steve Jobs was dying, market specialists wondered whether the company could continue without him. Because ultimately, Apple didn’t sell products; it sold the perception of bottled genius.

The engineers behind Segway failed to sell themselves. I couldn’t name any individual behind the Segway besides Bezos, and, twenty years later, it’s impossible to sort the Internet morass to find contemporary information. The Segway’s problem was never the Segway itself, proven by the fact that the company will persevere without its founding product. Its producers just never managed to sell themselves as portable geniuses like Bezos or Jobs.

It’s almost like being a genius has no relation to being smart.

Monday, June 22, 2020

Racism in the Workplace: a Memoir (Epilogue)

The U.S. Supreme Court building
PART ONE
PART TWO
PART THREE
“What does SCOTUS stand for?” my coworker asked, gazing solemnly into his phone.

“Supreme Court of the United States,” I replied, driving a screw into a surface. We weren’t supposed to check our phones during work hours, but most of us do, to inject the illusion of meaning into tedious tasks.

“Says here the SCOTUS just upheld DACA,” he continued, pointing at his screen. “What does that even mean?”

“Deferred Action for Childhood Arrivals,” I explained. “It means people who come into America undocumented as minors, usually under age sixteen, and don’t have criminal records, are kind of low-priority in the deportation process.”

As we had this discussion, word was only just trickling out that the Supreme Court, with the assistance of nominally conservative Chief Justice John Roberts, had ruled the Trump Administration couldn’t euthanize the DACA program, a linchpin of President Obama’s legacy. To me, this seems like an obvious win. Hardworking young adults, who mostly don’t speak the language of their supposed homelands, can continue adding to America’s greatness, rather than fighting tedious court battles against deportation.

“Okay, mmm-hmm,” he said, nodding, without looking up from his phone. “So in other words we’re getting fucking soft.”

I swallowed hard. This is the same coworker who wondered aloud what made George Floyd so special. Since then, he’s insisted Americans should go easy on the police who shot Rayshard Brooks, saying: “If people understood the stress cops are under, they wouldn’t be so judgmental.” His complaints have a persistent tone: on wearing masks against COVID-19 he’s said “Nobody asked me if I wanted to participate.” After taking his wife to see Ocean’s 8 at the cinema, he claimed he got tricked into seeing a “chick movie.”

Since I’ve known him, he’s harbored strong opinions, which he doesn’t hesitate to share. And those opinions always, somehow, redound to the benefit of white, male, heterosexual, native-born Americans. He delights in airing his opinions, and if anyone disagrees, he treats that dissent like a personal attack and comes back swinging. I try not to give his self-pitying bloviation unnecessary air.

But in recent weeks, he seems increasingly strident. We work in an environment of constant noise and conversation, so at breaktimes, I prefer to read and keep quiet; but recently, he’s become insistent on talking about politics and current events. He especially wants to talk to me, and he uses racist language in doing so. It’s become so pointed, I suspect he’s reading this essay series and recognizes himself.

I intended this series to run three installments, and stop. I’d made my point, to my own satisfaction. Yet my coworker’s comment about “getting fucking soft” drove something home for me.

Historically, conservatives regard all change as decline. Slang is always degraded language, pop music is always worse than it used to be, and politics is always venal in a way it didn’t used to be. As a jaded ex-conservative myself, I understand this belief, because I once shared it. But I changed, and for a simple reason: I realized the past I lionized, embodied in classic rock, didn’t objectively exist. It was curated, and the curators stood to profit.

My coworker, to whom I often feel friendly but whom I wouldn’t necessarily consider a friend, is definitely racist. But he isn’t a committed bigot, would never join White Power organizations, and isn’t more than superficially committed to racism. Rather, for him, racism is part of a massive background collage that exemplifies the way things simply are. Racial justice, women’s issues, police reform, and wearing a mask, all demand he change.

That brought everything together for me. From the outright bigotry I see in blue-collar work, to the subtle, systemic exclusion I witnessed in teaching, it’s never been directly about race. It’s always been about the reality that those who are comfortable and well-protected, don’t want to sacrifice their comfort and protection, even for the common good. I don’t know exactly why I willingly embraced change when I did. But most people must be forced to change.

I turned to my coworker, stifling a sigh. Challenging him on the facts wouldn’t change his mind. Instead I used my Hank Williams, Jr., argument, which I’ve described before: conservatives once saw refugees as a sign America was doing something right. If they’re coming here, it’s because we uniquely have something they need.

He nodded at that, and I saw him biting his lips thoughtfully. Clearly I gave him a viewpoint that defended his worldview while making change possible. I probably didn’t change his mind.

But I can hope that I planted a seed which will bloom in the future.

Friday, June 19, 2020

You Should Be Reading Victor LaValle

1001 Books To Read Before Your Kindle Battery Dies, Part 105
Victor LaValle, The Devil in Silver and The Ballad of Black Tom


A monster roams the psychiatric unit of New York’s New Hyde Hospital, terrorizing the patients least prepared to defend themselves. A guy known only as Pepper has been committed to New Hyde without a trial. At first, Pepper wants only to escape the hospital’s narrow confines. But when the violent bison-headed monster comes within inches of killing him, Pepper adopts a new mission: find and destroy the monster.

Victor LaValle isn’t the first novelist to use genre fiction conventions to create more self-conscious literature. Indeed, as his text name-drops writers like Stephen King and Ken Kesey, it’s difficult to read him without thinking about Jorge Luis Borges or Salman Rushdie. Reading LaValle, I felt like I’d fallen for a bait-and-switch, but in a good way: his back cover copy implies a downmarket paperback thriller. His text is sophisticated, character-driven, and literary.

In The Devil In Silver, Pepper enters the hospital, thinking himself a sane man committed unjustly. He never completely shakes that belief. But his narrow-minded obsession with escape, which he increasingly realizes means passing through the Devil’s domain, starts hurting other people. He soon has the ward in open revolt. Somehow, even when people keep dying, Pepper finds ways to rationalize that it isn’t his fault.

Because the Devil exists, trapped in a no-access hallway behind a locked silver door. The staff know it’s there, the patients know, even the police see the Devil rampaging through New Hyde’s hallways and remain willfully blind. Pepper knows he can’t leave without first facing it. Yet every challenge to the hierarchy which protects the Devil, results in Pepper getting hit with high-dose pharmaceuticals. They’d rather dose the rebellion away.

Surprisingly, the Devil itself, and the story’s supernatural elements, remain largely offstage throughout most of this novel. It isn’t about the monster, primarily; LaValle spends his greatest length on the ways his characters come to grips with themselves, or more often, the ways they don’t. Pepper thinks he’s Randle McMurphy, leading an insurgency against an unjust administration. Only slowly does he realize his fellow patients seriously need their treatment.

The twist, though, is: the administration really is unjust. Simply because the patients tenuously depend on their medications, doesn’t mean a tyrannical administration and a moribund staff culture aren’t keeping them down. Pepper desperately tries to mediate the three-way battle between the bureaucracy, the patients, and the Devil. Somehow, he never completely realizes he’s possibly part of the problem.

The Ballad of Black Tom is more unambiguously dark fantasy, a direct rewrite of H.P. Lovecraft’s notorious short story, “The Horror at Red Hook.” But even when explicitly rebuilding a genre classic, LaValle keeps emphasis on characters and literary weight. Charles Thomas Tester, a Jazz Age son of Harlem, makes bank delivering cursed items for supernatural customers. He’s moved so many artifacts, their wizardry has rubbed off on him.

Victor LaValle
As a sideline, “Tommy” Tester is a mediocre guitarist. Brooklyn eccentric Robert Suydam hires Tommy as entertainment for a darkly themed banquet. Suydam really wants Tommy’s paranormal aura; apparently Suydam, a student of ancient arcana, intends his dinner guests as mass sacrifice to revive a sleeping God. Personal tragedy, though, renders Tommy less than docile, and at a key moment, he interrupts the spell.

LaValle clearly intends this novella to comment upon its source, the story that probably most directly voices Lovecraft’s racism. Lovecraft lived briefly in Red Hook, Brooklyn, but couldn’t stand mingling with its racially diverse population. He called it “a maze of hybrid squalor,” a term LaValle recycles twice. Lovecraft’s story, though genuinely terror-inducing, is also supremely bigoted, so LaValle interjects a character poised to resist.

Lovecraft centered the story on Thomas Malone, an Irish-American detective. LaValle splits that character in two. Detective Malone, in LaValle’s telling, becomes a rather obvious cypher for Lovecraft himself, a sensitive but bigoted seeker after truth. Tommy Tester inherits Lovecraft’s more muscular tendencies, but also a tragic history perfectly suited for today’s BLM culture. These two stand poised for a cataclysmic confrontation, and LaValle doesn’t disappoint.

These stories share themes of exclusion, bureaucracy, and willful blindness. But they aren’t retreads of the same story. The Devil In Silver is long, with an ensemble cast and multiple subplots; The Ballad of Black Tom is a concise novella with a singular, uncluttered through-line (but two POV characters). LaValle, like most authors, has themes he treasures, but he doesn’t simply repeat himself. His characters’ darkness isn’t just a gimmick.

Truth, for LaValle, is found only by passing through the dark.

Wednesday, June 17, 2020

Racism in the Workplace: a Memoir (Part Three)

PART ONE
PART TWO

One day three of us sat down to lunch in the company breakroom. I usually take lunches in my truck to avoid exactly the kind of encounter that was about to happen, but I haven’t had a working motor vehicle in months now, so I was forced to share space with the other guys. A subcontractor invited himself to join us, as they sometimes do—I’ve gotten used to forced familiarity with near-strangers.

Then he decided to tell a joke. It was too off-color to repeat on a family blog, but it was about the shortcomings of the Affordable Care Act. Fine, whatever, I turn selectively deaf whenever my coworkers talk politics. Except rather than dividing the two guys in the joke into the Democrat and the Republican, or the Rich Guy and the Poor Guy, he divided them into the Black Guy and the White Guy. The punch line: Black people are poor, and Obamacare condescends to them.

In Stamped From the Beginning, his history of American race relations, Dr. Ibram X. Kendi writes that racism didn’t justify slavery; slavery justified racism. Economic policies generally come first, and bigoted attitudes follow afterward, a post facto justification. Kendi, a historian, cites generous primary sources to prove this, and I believe him. American history isn’t unequal because we’re inherently racist; we’ve become racist to live with our inequality.

However, the fact that one White worker could tell that joke, and two other White workers could laugh at it, tells me attitudes don’t necessarily follow policy. Once people consider themselves protected by the economic structure, they won’t relinquish that protection just because policies shift to ban outright discrimination. My coworkers have internalized racist attitudes that make “Black equals poor” funny, and will resist changing their minds, whatever the cost.

As I wrote this series, the Rayshard Brooks murder grabbed international headlines. My co-worker I wrote about previously, the one who wondered why George Floyd was so famous, also groused aloud about the attention poured on Brooks. Even after acknowledging that police shot Brooks in the back, while he was running away, and that he posed no threat to anybody, this guy still complained: “If people knew the stress police were under every day, they wouldn't complain so much.”


My conscience reverberates whenever anybody says anything like that. I know I should protest, that I should challenge the attitude that any stress ever justifies official violence against nonviolent civilians. But I’ve previously been made to feel unsafe at work, even felt it necessary to rip political stickers off my truck to protect my safety. My coworkers’ attitudes are no longer about justifying official policy. They’ve internalized an attitude of inequality.

It strikes me, however, that society’s nominal protection, accorded to people on the “good” side of social divides—the protection sometimes called “White privilege”—doesn’t often protect these people. Sure, they have the reassurance that they won’t get murdered under color of authority for nonviolent offenses. But most of the men I work with are living paycheck to paycheck, with little expectation of doing better. We’ve become a semi-permanent White underclass.

Worse, that underclass is spreading. My former university colleagues are increasingly disadvantaged by the status quo, their pay scales frozen, their department dominated by adjuncts, and their discipline openly disparaged by STEM advocates in the administration. They, too, respond by buying deeper and deeper into a hierarchy that keeps the department lily-white. Blue-collar or white-collar, workers would rather commit to the lie than face the truth.

Because the truth is, the problem we face isn’t transitory. The entire system favors those who have over those who strive. Blue-collar management positions and skilled tradesman jobs go to those who have expensive academic credentials, not those who work hardest. Scholarly jobs move toward those to whom academia comes easily, which mostly means coming from backgrounds wealthy enough to afford book collections and college prep courses.

Fixing the problem will require dismantling the entire system, possibly by force. And that will hurt everybody in different ways. First, revolution always displaces the poor immediately. Later, though, we’ll have to scrap the entire superstructure, meaning anybody who’s played by the rules and gained any scrap of seniority, will see their accomplishments taken away. To put it another way, the current system is unfair, but fixing it will also inevitably be unfair.

So my coworker, who’s only marginally protected now, would rather defend that protection than risk losing everything to improve everyone’s lot. And somehow, that’s funny.

Tuesday, June 16, 2020

Is This the Best Right-Wing Film Ever?

1001 Movies To Watch Before Your Netflix Subscription Dies, Part 40
John Milius (writer/director), Red Dawn

You already know the story: Soviet paratroopers descend upon an unsuspecting Colorado town during the local high school’s first period. The invasion is swift and decisive. Six teens, representing six personality types—jocks, rich kids, brains, et cetera—flee into the Rocky Mountain foothills, equipped with rudimentary survival gear and assault rifles. After a brief period of indecision, they begin their insurgency against the invaders. Which tired, combat-hardened kids will survive to see a liberated America?

It’s sometimes difficult to appreciate exactly how conservative Hollywood’s 1980s Brat Pack movement was. The trend was spearheaded by screenwriter John Hughes, in movies like Pretty in Pink and The Breakfast Club, pictures that didn’t exactly wear their conservatism externally. These pictures, however, significantly trusted tradition and authority; they only feuded over which traditional authorities deserved teen audiences’ loyalty. This movie remains unique in broadcasting its political motivations, explicitly asserting American greatness is under threat.

The plot almost doesn’t bear recapitulation. The six teens (joined by two girls whose major contribution is being girls) organize a grassroots insurgency against the Soviet invaders and their Latin American allies. The Soviets respond with reprisals against civilian targets. Each teenager endures some traumatic personal loss, usually a dead or turncoat parent, but rather than capitulating, the kids reload and keep firing. The didactic story plays with the inevitability of a medieval morality play.

Writer-director John Milius was a classmate of George Lucas and Steven Spielberg. He arose from the same New Hollywood background that informed their earliest work, and even did uncredited script-doctor work on Spielberg’s Jaws. However, while those directors drifted increasingly into broad, downmarket blockbusters, Milius became increasingly enrapt with politics. His career peak was characterized by increasingly militant right-wing pictures. This one probably stands as his personal pinnacle, and conservative Hollywood’s Reagan-era high water mark.

Importantly, Milius doesn’t pretend his story isn’t instructive. From the opening crawl, he establishes that America has grown soft through dependence on NATO. The Soviets are able to track the teenage insurgents, who christen themselves the Wolverines (their high school mascot), through ATF paperwork recovered from the sporting goods store. In his most famous visual, Milius’ camera pans from an NRA “Cold Dead Hands” sticker to its owner, smoking pistol in his cold, dead hand.

This morality isn’t accidental. Since at least Edmund Burke, philosophical conservatism has insisted that society is occupied, all goodness colonized by pervasive sin. Good people, conservative leaders insist, must constantly refine and purge their characters through violence, literal or metaphorical. Burke despised the French Revolution, but considered it necessary, as the ancien regime had grown lenient and squishy. Teddy Roosevelt suggested having wars every generation, to mold young men’s characters. Modern conservatism is inherently warlike.


For Milius, this war isn’t institutional, it’s personal. The Wolverines find an American fighter pilot surviving behind enemy lines, almost certainly a nod to Lord of the Flies. This pilot gives the teens military training; he also narrates the developments of World War III, which unfolds as pure background. The entire war exists to teach these boys the values they must defend against Bolshevik wickedness. It’s all about individuals.

Surprisingly, for both its genre and its era, this movie doesn’t shy from hurting the characters. Where John Hughes taught teenagers important lessons about society and values by shaming them in school, John Milius outright tortures his kids. They watch their parents tortured and murdered. One kid, the mayor’s son, discovers his father is a collaborator, proving the failures of conventional politics. Where many action films constantly protect their protagonists, Milius kills his heroes onscreen.

We cannot avoid Milius’ conclusion, which he signposts without stating it outright: we must destroy Cold War America to save it. Milius appears further Right than Ronald Reagan, who at least nominally supported diplomacy and negotiation. Where political leaders talk and make horse trades, Milius (channeling Hobbes rather than Burke) asserts that society is a bellum omnium contra omnes, and values, including American greatness, must arise from savagery. Moral goodness comes to anyone willing to destroy evil violently.

Certainly, this movie is dated. It makes assumptions about the Soviet Union’s alliances which, we now know, were patently untrue. Its battalion of teenage archetypes represents 1980s ideals that haven’t aged well. A 2012 remake, recasting the enemy as North Korea, died on arrival. We must watch this movie as an artifact of its time; but within that context, it’s a taut, well-paced introduction to conservative philosophy. It concisely forecasts the America we inherit today.

Monday, June 15, 2020

Racism in the Workplace: a Memoir (Part Two)

PART ONE
I helped build the new Kearney (Nebraska) High School

When I started working construction, I told everybody: “This is undoubtedly the most segregated workplace I’ve ever seen.” I’d only worked the field for a couple of days before I realized we had one White masonry crew, and one Black masonry crew, and they didn’t work together. Despite hiring a couple of Hispanic junior project managers, my general contractor was almost entirely White. And, at least in my area, I discovered White people don’t hang drywall or install suspended ceilings.

Yes, I told anybody who’d listen how segregated my workplace was. Only after doing this for several months did I realize “anybody who’d listen” was almost exclusively White: the friends I made volunteering at the community theatre; my former colleagues at the University; the friends who’d carried over from other workplaces and my school days. The people I had any opportunity to speak with outside work were pretty solidly White.

It’s not like my community has the lily-white complexion of a pre-Norman Lear sitcom, like Mayberry or something. Though my small-ish Great Plains community is Whiter than America overall, we have a pretty sizable Hispanic community, and our African American population has increased in the last four or five years. But my spaces—my leisure activities, my commerce, my watering hole, my church—don’t overlap with theirs.

Most importantly, as I became aware of this, I also became aware that construction probably wasn’t my most segregated workplace. The race-based crew divisions were visibly jarring in construction, because they were visible. Most of my workplaces haven’t mirrored the complexions of the communities in which they were built, because employers tend to be self-selecting. Nowhere was that more pointed than in education.

Teaching English, I had three non-White colleagues: a Black man, a South Asian woman, and a Native American woman. I loved all three, had good collegial relations with each, and remain in contact with all of them a decade later. But by the time my teaching hitch ended, two had already moved elsewhere, and the third was winding down. Who knows what happened later, but when I left teaching, the department had become more White, not less.

In the unlikely event that my former colleagues are reading this, I must strenuously emphasize: this doesn’t mean you’re racist, not in the sense of personal bigotry or spewing the N-word. As I emphasized in Part One, I’ve seen that ugliness exposed in my current workplace. The racial division I saw in teaching was invidious, but also impersonal. No individual caused it. Chances are, no individual could’ve stopped it, either.

A promo photo from my old university’s teaching endorsement program

Instead, the racism I experienced in teaching established itself long before the University. Black and Hispanic people have fewer opportunities to attend high-achieving post-secondary school, and even fewer to attend graduate school. The educational system sorts what traits and accomplishments it considers worthy of reward; and those accomplishments generally resemble the leadership’s own. That leadership has, historically, been White.

Further, university admissions generally reward student accomplishments at the high school level. Throughout most of America, public schools are funded largely through local property taxes. This means well-off neighborhoods have competitive schools with the latest resources; and, as Sheryll Cashin writes, the most well-off in America are overwhelmingly White. Poorer, less-White neighborhoods soldier on with dilapidated buildings, underpaid staff, and outdated technology.

Worse, as American educator Jonathan Kozol writes in Savage Inequalities, even citizens who consider themselves progressive react unthinkingly when asked to consider spending wealthy communities’ funds in poorer school districts. This ensures that education, American society’s great economic leveler, keeps poor kids in poor schools. And it ensures few people have opportunities to meet across racial lines, except in books, which are good but insufficient.

Therefore higher education, which trains the next generation of schoolteachers, continues to reward a system of accomplishments which are overwhelmingly White, middle class or wealthier, and unresponsive to changing national demographics. Construction industry racism involves dropping the N-word and segregating the work crews. Education racism involves ignoring non-White accomplishments, often without realizing that’s what we’re doing. Both preserve an existing, unfair power dynamic.

Education, as a work field, isn’t immune to change and improvement. Within living memory, the field assumed women should only teach children, and men would handle the heavy lifting of college. Thankfully, this attitude hasn’t survived, and many schools have overturned this gender hierarchy. We could do the same with racial hierarchy, if we wanted. But it would involve acknowledging invisible segregation, which White people notoriously hate doing. Still, the time has clearly come.


TO BE CONCLUDED

Thursday, June 11, 2020

Racism in the Workplace: a Memoir (Part One)


“What makes George Floyd so special that they made him famous?” my co-worker grumbled without looking up from his phone. “I mean, you never hear about all the cops getting killed by Black people, and then one cop fights back and wins, and he becomes a celebrity. I mean what the fuck?”

I was desperately trying to eat lunch and read my book with minimal interference. Working construction, I spend my entire day surrounded by human-made noise, the constant racket of tools and equipment and radios and conversation. Lunch often is my only opportunity for relative quiet; but people who work construction generally don’t treasure silence like I do. They talk, sometimes constantly, to fill the silence.

Like everyone else, their lives currently are suffused with two topics: COVID-19 and the protests sparked by George Floyd’s killing. So naturally, when they chatter amiably, their conversations run toward these topics. I wind up hearing their opinions, which they share generously, confident in the knowledge that everyone essentially agrees with them. They get their information, and their filters, from mostly the same places, after all.

“God, did you see the pictures of the protests?” this same co-worker asked me earlier in the week. “There was this police horse with, like, blood running down its face, where some asshole threw a brick at it.” No, I told him, I hadn’t seen this; I asked where he got this from, knowing it’d probably be the same source, always relayed with a shrug that says the answer is obvious. “It was on the Fox News app.”

No sir, I hadn’t seen any pictures of a bleeding horse. When I Googled it later, I found multiple sources sharing the same three pictures of a black horse, wearing police colors, bandaged around its muzzle. Every source was explicitly linked to right-wing politics and included exhortations against the evils of protestors. None of the sources were impartial journalists.

However, the same Google searches yielded another statement from actual journalists: the mayor of Houston apologizing for footage showing a mounted officer knocking over a pedestrian, apparently on purpose. I’d already seen that footage, melded into a montage showing various police atrocities. Cops in riot gear, linking shields and advancing into crowds like an Athenian hoplite phalanx. Pepper-spraying protestors. Targeting journalists.

I don’t pretend for one minute that my sources are impartial. I tend to distrust centralized authority, and like most human beings, I tend to seek sources that share my beliefs. Therefore, I shouldn’t complain too much that my co-workers also seek sources that ratify their pre-existing assumptions about reality. However, there’s a distinct difference that comes across in the language which results.

I say “Fuck the police,” but not out loud at work.

My co-worker, the one who complained about the fame attaching itself to George Floyd, that same day also used a word I won’t repeat to describe Hispanic subcontractors.

He has also dropped the N-bomb.



Remember that image, which has gotten wide traction, of a police officer aiming a pepper-ball rifle directly at a television camera? Man, that made me angry. But to make my anger go away, that police officer need only do one thing: surrender his weapon and stop being a cop. Because I’m angry at him for a thing he does.

When my co-worker describes other people, right there on-site, using the N-word or the S-word, he’s showing contempt for them over something they are. And he isn’t alone. When he complained about how unfairly put-upon the poor, innocent police are by the horde of Black people, our boss, eating his own lunch not far away, nodded sagely and said “Mmm-hmm.”

Which, for my workplace, is remarkably mild. As I've written before, management at my company is as likely to instigate racist language as anybody else. And they certainly won’t quell it. About a year ago, I reached my limit and voiced what I thought was an anonymous complaint about the HVAC professionals playing Rush Limbaugh and Alex Jones at maximum volume. My boss told the HVAC guys exactly who complained, and I spent months paying for it.

So when my co-worker uses racial language, I no longer complain. I bury my nose in my book and become studiously deaf. The entire workplace culture is organized against change, indeed against basic fairness. Much as I’d prefer to brazen out others’ judgements to support what I believe is right, I have no fallback for rent and groceries. So I swallow my objections and turn deaf.


TO BE CONTINUED

Monday, June 8, 2020

Dick Wolf, of Law & Order Fame, Can Suck It

Dick Wolf, creator of TV's Law & Order
The headline in Variety magazine’s online edition says everything: “Dick Wolf Fires Writer From ‘Law & Order’ Spinoff for Threatening to ‘Light Up’ Looters.” The brief subsequent article announces that writer-producer Craig Gore posted photos of himself wearing a face mask and carrying a large-bore rifle, threatening to “light up” protesters who menaced his property. In response, Dick Wolf, the Paradigm talent agency, and others dropped Gore from their rosters.

I understand Wolf’s urgency to publicly denounce Gore, and subordinates like him, who threaten to exacerbate violence during an already tense time. Gore’s high-profile presence on a lucrative media franchise makes him potentially explosive, and Wolf, a noted donor to center-left political causes, needs to visibly place distance between himself and such language. The quantities of money riding on Wolf’s decisions make polite, reserved dignity an impossible option.

However, I question the sincerity of Wolf’s actions. Not the sincerity of his decision to drop Craig Gore, but the sincerity of attempting to create another Law & Order property, particularly one featuring returning star Christopher Meloni as Elliot Stabler. Despite his moderately left-wing politics, Wolf’s star franchise has long favored “hero cops” who consider themselves too important for boring old rules, and Stabler ranks near the top of that category.

According to the Law & Order fan wiki, Stabler has killed six people, two offscreen. The same wiki lists Stabler having longstanding authority issues, including a refusal to seek mental health treatment following officer-involved shootings and other traumatic incidents. The show has historically shown Stabler roughing up not only suspects, but witnesses and bystanders who display his same anti-authoritarian tendencies. The storyline takes pride in Stabler’s “loose cannon” ways.

Anthropologist David Graeber, who taught at Yale, notes in his book The Utopia of Rules that American police dramas tend to favor police who disdain authority. Stabler comes from the same stock as Dirty Harry, Murtaugh and Riggs, and John McClane, rogue forces operating outside the official sphere of supervision, often without organizational support. Graeber notes that it’s common for “hero” cops, in media, to turn in their badges and go hunting.

Stabler never surrendered his badge, not in any episode I remember. (Yes, I’m an audience member of the show I’m condemning. I’m not deaf to the disjunction.) However, his adversarial relationship with Internal Affairs was such an integral part of the show’s arc that his investigating officer became a recurring guest star. In one episode, Stabler and his investigating officer sat opposite one another, sarcastically complaining about a fabricated charge.

The scene was played for laughs.

Christopher Meloni as Elliot Stabler
In early seasons, the series established Stabler as damaged by his occupation, but ultimately honorable and reliable, a force for good. However, as often happens in series television, the character became increasingly defined by a smaller and smaller number of characteristics, which became exaggerated by overuse: his tendency toward violence, disregard for procedure, and thrill-seeking behavior. He became cartoonish and, worse, he became as destructive as the criminals he hunted.

This partly reflects fan service: the behaviors audiences respond well to, generally become prominent through repetition. That keeps audiences hooked, and ad revenue rolling in. But it also reflects something deeper underlying the franchise’s philosophy: that law is worth defending, even when it’s demonstrably unjust. As political scientist Ian Haney López writes, “law and order” is a longstanding dog-whistle for suppressing dissidents and protesters.

Historically, Stabler has tap-danced around the spirit of the law, while remaining friendly with the letter. However, he used techniques, onscreen, that, in a military interrogation, would count as war crimes. Stabler’s behavior became so caricatured that, when actor Christopher Meloni moved laterally to the series Happy, that show’s writers’ room played the same behavior for low comedy.

Therefore, when Dick Wolf acts publicly outraged when his writer emulates the same behavior, I have little interest in his crocodile tears. Wolf helped glamorize the idea that police would torture suspects, utilize physical violence, and get away unscathed. He is, partially, responsible for the cultural moment which produced George Floyd’s murder, and countless other murders by police who expect bureaucrats, and society generally, to accept this behavior.

I don’t mean Dick Wolf caused this moment. He exists in a continuum with previous Hollywood police writers who directly glamorized official violence and top-level lawlessness; for him, it’s simply a job. However, he needs to recognize that his job helped create the cultural conditions which made current events possible. His official protests of innocence don’t interest me. And firing one writer is a case of “too little, too late.”

Thursday, June 4, 2020

Why I Still Believe In Voting

A Libertarian meme on the theme
Every four years, give or take, Americans hear the same argument repeated: “If you don’t vote, you don’t get to complain.” Even discarding what bullshit it is to make free speech contingent upon participating in a quadrennial bureaucratic exercise, I think this argument distorts what rule by small-D democracy really means. Do we really relinquish the liberty to inveigh against injustice because we didn’t check a box on an arbitrary November day? Be serious, please.

But I’m seeing an even more invidious argument starting to arise from America’s hard-line progressive circles. (Yes, I realize America’s “hard-line progressives” are centrists globally. Save that argument for later, please.) An increasing number claim that voting is not just ineffective, it’s counter-revolutionary. To participate in the current order’s leader-selection process, they assert, makes you part of that order. In other words, I’m hearing leftists say not to vote, because it slows the revolution.

We’ve heard that low voter turnout supposedly helps conservatives retain power, an intuitive claim that doesn’t necessarily withstand scrutiny. But certain elements in progressivism want this outcome, because consolidated conservative power, during a time when America’s youth are shifting leftward, supposedly makes revolution more likely. They want uprisings against the power structure, which they see as so innately corrupted that it cannot be redeemed. Voting, they fear, steals their movement’s revolutionary vigor.

First, I’m not persuaded revolutions are necessarily good. We lionize the American Revolution, but it discarded the English aristocracy in favor of a colonialist slaveholding aristocracy; and though we removed slavery, nearly ninety years later, we’ve never completely removed that aristocracy. That’s saying nothing of the disasters which followed other revolutions, like Oliver Cromwell’s Irish purges, Julius Nyerere’s village programs, and the famines caused by collectivized farming in China and the Soviet Union. Revolutions kill.

Admittedly, tinkering around the edges of the existing system hasn’t produced anything better. We’re suffering through massive environmental catastrophe, partly, because bureaucratic infighting has prevented America from passing any new climate legislation since 1991. Barack Obama promised us “Change You Can Believe In,” but left a country less economically equitable than when he was inaugurated. The violence rending American cities now bespeaks the essential powerlessness of even “progressive” governments. Things just keep not getting better.

A Marxist meme on the theme
So, if revolutions hasten disasters, but small-D democracy does nothing better, what options do Americans have? During times of civilian violence unseen since Dr. King was assassinated in 1968, staying the course doesn’t exactly seem desirable. It took a week of nationwide protests, including riots, to indict four cops who killed a man for a non-violent offense. Surely, I imagine you saying, we can’t stick with what we have? Placate ourselves by casting a ballot?

That seems a false dichotomy, though. Participating in America’s democratic norms doesn’t preclude participating in demonstrations demanding other reforms at the same time. We don’t have to choose either voting or protesting. Indeed, if we take protesting seriously, I suggest we need to cultivate a government which listens to protesters, believes their demands come in good faith, and doesn’t treat street protests like insurrection. We need a government that is sympathetic to its people’s anger.

To achieve that, we need to participate in democracy. We need to utilize the rights we have, even if those rights frequently feel truncated and vestigial. Even if we sometimes feel we have to choose the least awful option (and, speaking personally, that certainly sounds like 2020 to me), we nevertheless have the ability to pick the best possible status quo to fight against. Admittedly, that doesn’t make a ringing endorsement. But what’s your alternative?

I understand the impulse to retain moral purity by refusing to participate altogether. It seems, superficially, like if we vote, we’re getting the bureaucratic stain upon us. Certain religious groups, like the Jehovah’s Witnesses, refuse to vote or join the military for exactly that reason: once you participate, you become part of this world, and you’re stained with its tarnish forever. Yes, I appreciate that nobody wants to horse-trade their morals. It feels like treason.

Refusing to vote, though, entrenches the existing power system, which has demonstrated itself willing to kill dissidents. While our president isn’t necessarily a tyrant yet, he’s clearly allied himself with a slicked-up, high-tech fascist movement, and stacked the deck with insiders. Certain progressives’ fantasies about revolution would create a power vacuum, into which, history tells us, someone worse would probably step. That’s why I still vote: to pick the power I’m ready to fight against.

Tuesday, June 2, 2020

Being a Peacemaker in a Violent World

“Blessed are the peacemakers, for they will be called children of God.”
—Matthew 5:9 (NIV)
As a Christian, I have struggled with this message. It seems like a desirable message, that those who bring peace and mercy to this world have God’s calling. And peace seems especially prominent for those who believe. “Go in peace,” Jesus says, “and be freed from your suffering.” And, “Peace be with you! As the Father has sent me, I am sending you.” And, “I have told you these things, so that in me you may have peace.”

But what does that mean? When we offer to bring peace, we probably all have our own definitions of “peace,” just as we all have our own definitions of “God.” What did Jesus mean when He spoke about peace? His first-generation audience, almost entirely Jewish, would’ve understood Him to mean shalom, in the Hebrew tradition. Considering the destruction wrought by prophets like Elijah and Samuel, that doesn’t sound too peaceful.

In recent years, Christianity has come under increased scrutiny because the “peace” advocated in the Bible, particularly the Hebrew scripture, doesn’t accord with today’s warm, low-friction definition of peace. Too often, in our multicultural, scientistic world, we call for “peace” as the reduction of tension, as the elimination of violence. The Biblical YHWH certainly doesn’t, though, and that includes His Son, who said: “I did not come to bring peace, but a sword.”

Biblical peace often seems confrontational, even violent. God seems frequently willing to kick the beehive, or have His people kick it. Our late-modern, hug-a-bear definition of peace doesn’t accord well with a Messiah who would flip tables in the Court of the Gentiles. That definition of peace seems downright terrifying, not least because many churchgoing Christians would find ourselves underneath that flipped table.

Time spent in prayer and contemplation, however, led me to the Epistle of James. Written by Jesus’ reputed brother, this book was so controversial that Martin Luther supposedly wanted it removed from the Bible. Yet I find great comfort, as Luther didn’t, in James’ insistence that our actions testify to our faith. I find particular comfort in his injunction in James 2:15-16:
Suppose a brother or a sister is without clothes and daily food. If one of you says to them, “Go in peace; keep warm and well fed,” but does nothing about their physical needs, what good is it?
James, who presumably knew Jesus’ message pretty intimately, understood peace, not as eliminating tension, but as eliminating deprivation. When Jesus charges Christians to bring peace, He doesn’t mean smoothing over tensions or making eruptions of violence go away. Indeed, as the “not peace but a sword” message conveys, Jesus understood that bringing peace could frequently mean confrontation.

That’s hard for me. I’m naturally averse to confrontation. I’ve joined several anti-war organizations since I moved away from my conservative, semi-nationalist upbringing, but I haven’t always been brave enough to participate in direct action. I certainly haven’t shared the bravery of public Christians like Dr. King or Father Daniel Berrigan, who repeatedly endured arrests and beatings to pursue their mission of peace.

Yet where we lack peace in this world, we don’t lack it because we have violence. That’s easy thinking, a simple solution available to White people. Peace comes from confronting injustice, and injustice comes, most often, from some form of self-seeking. When the rich seek wealth, or the powerful seek glory, the result is usually some form of injustice wrought upon the poor and the powerless. We’ve all seen it.

Peace, then, isn’t the absence of tension, and peacemaking isn’t just smoothing over high feelings or calming another’s outrage. Peacemaking must involve confronting the source of injustice. In James’ account, it requires me to bring hot food and a coat to my naked, starving fellows. But it also requires me to ask why some people, who work hard and love their families, nevertheless remain naked and starving, year after year after year.

Again, that’s hard for me. Prayer and study have convicted me that I haven’t been fully Christian, that my journey still has miles to go. Yet I believe, as one who honors peace, that I now have a clearer vision of my mission. Certainly I haven’t answered every question, not for me, or for anybody else. But I have comfort in knowing that, if I have a mission of peace, that mission must begin with justice.

Maybe you all would like to join me on this journey?

Monday, June 1, 2020

The Danger of Worshiping Saint Martin

Dr. King, center, commences a march

White people love trotting Doctor King out of retirement whenever civil unrest happens in America. Like, say, now. We love pictures of him crossing the Edmund Pettus Bridge, linking arms with a mixed-race coalition, resolute and heady. We especially love tossing orphaned quotes around heedlessly, stripped of context: “content of our character,” perhaps, or “Hate cannot drive out hate; only love can do that.” We love that.

Having grown up surrounded by anodyne White suburbia, in neighborhoods and schools with minimal diversity, I was exposed to a sanitized, low-risk version of Martin Luther King, Jr. I heard some of his speeches, like “I Have a Dream” and “I’ve Been to the Mountaintop,” soul-stirring examples of rhetoric which fired a boy’s imagination. I grew up never not knowing who Doctor King was, for which I’m certainly thankful.

However, the version of Doctor King I knew was sanitized and made into a simple morality narrative. This version challenged racism, embodied in naked bigotry of Jim Crow segregation and Bull Connor crackdowns, and won. The stories placed an individual of surpassing morality and personal rectitude, against a system so riven by internal rot that its defeat was written into its structure. The end was always inevitable.

In short, the schoolbook version of Dr. King I learned was a saint.

This version excluded the great complexity of King’s struggle. For instance, I too saw the pictures of King crossing the Edmund Pettus Bridge. Not until college did I see photos of what happened next: Alabama police hitting his marchers with batons and loosing German shepherds on protesters who’d already fallen. I learned in fourth grade that King received the Nobel Peace Prize. In college I learned that many Americans protested this award.

Nor did King’s complexity end externally. He also had significant problems within his own soul. Famous for his forward-thinking engagement on racial issues, King held retrogressive attitudes about women, and kept organizational positions vacant for years rather than let women take charge. This gendered thinking manifested in his now-extensively documented adultery, and accusations of far worse.

Dr. King depicted as a literal saint,
Martin Luther King of Georgia
Further, as Ibram Kendi demonstrates, the centrist, cooperative King beloved by White moderates, from early in his career, is often at odds with the frequently darker, more confrontational King of his later career. Early King often soft-pedaled White abuses, and urged Black Americans to live up to impossibly high White standards to achieve an ever-elusive level of acceptance. He later acknowledged this mistake, but it was already written down.

Even his strengths were far from perfect. While he made great inroads against American racism, and saw many forms of outright oppression banned, he didn’t really end racism; it merely went underground. Plus, his concerns didn’t end with racism. Later in life, he described the Giant Triplets of Evil: Racism, Militarism, and Materialism. Remember, he died in Memphis, where he had gone to support striking sanitation workers; his economic concerns remain largely untouched.

This places an important rift between Saint Martin, the man whose bold stands make clear moral lessons for today’s children, and Dr. King, the man whose struggles remained largely unresolved upon his death. The schoolbook version of King, trotted out in Internet memes whenever Black Americans become restive, is definitely the sainted version, always beneficent, never ruffled. The real man became frequently angry and frustrated.

Sainthood can provide powerful instruction in moral goodness. When somebody has accomplished something so outstanding that their work becomes memorable after their deaths, we can study what they achieved, and how. Then we can mimic them until we internalize their moral strengths and become able to act independently. That, presumably, is why churches canonize saints and other holy figures.

But sainthood also freezes people in moments without context. The rush to canonize, say, Mother Teresa sidestepped important questions raised in her journals, published only posthumously. We now know, as her contemporaries didn’t, that she struggled with deep doubts, wondering not only whether God noticed her actions, but even whether God existed. Her records show she didn’t find the answers she sought in this life.

Freezing Dr. King this way, eliminating his dark side and ignoring the fights he didn’t win, teaches today’s audiences the wrong lessons. It makes us perceive setbacks as permanent, doubts as disqualifying, and sins as irredeemable. If even Dr. King, the great saint of my childhood textbooks, could hold awful opinions, and lose his most important battles, it means my efforts, however thwarted, still matter. That’s much more valuable to me than Saint Martin.