Saturday, October 31, 2020

Grammar and Race, or, Should You Capitalize “White”?

Another “tempest in a teapot” erupted in my local newspaper recently: an angry Letter to the Editor complained about how its copy editors standardize the spelling of racial terms. Many style sheets, including the one favored by the Kearney Hub, assert that “Black” should be capitalized when referring to race, a standard practice since at least the Black Power Movement of the 1960s. However, “white,” as a race, frequently isn’t.

My local newspaper’s letter questioned whether capitalizing one race, and not another, might accord respect to one or, worse, bestow disrespect upon the other. A friend responded by pointing out that White people (I capitalize races, for reasons I’ll describe later) have distinct ethnic heritages, like Irish or Italian. Black people, as a legacy of American slavery, don’t. Therefore being White is categorically different from being Black.

I find this logic unsatisfactory, for one reason: it assumes most White Americans come from ethnically homogeneous backgrounds and communities, and know their background. This just isn’t always true. Many White Americans, including me, have only a murky understanding of our heritage, or have heritage so muddled that we can’t claim anything clearly. Indeed, the concept of Whiteness exists specifically to erase ethnic or national ties.

This isn’t to say that Americans don’t have ethnic heritage. Back-East cities like New York and Boston are notorious for their checkerboard of ethnic neighborhoods, planted in the 19th Century when immigrants used old-country ties to smooth the process of expatriation. Garrison Keillor made Minnesota’s broadly Scandinavian culture famous. Nebraska, where I live, has a patchwork of German, Czech, and even Welsh communities.

By now, however, these terms mean little. As immigrant generations died off, and American-born children grew up speaking primarily English, many felt tenuous roots in their communities, and moved around frequently to pursue work and education. Many intermarried, creating a White American culture not bound to older heritage. Observance of a town’s ethnic background became mere civic niceness, like singing the National Anthem before a softball game.

Personally, I have no particular ethnicity. I’ve casually adopted Irish, because Green Linnet CDs helped me through a lonely period in the 1990s, but I have scanty evidence of any ethnicity, and what I know is irretrievably mixed. My grandmother told a story from her childhood, of a teacher instructing students to ask their parents about their ethnic background. “You tell them you’re American,” her parents responded, “that’s all that matters.”

So yes, White Americans have ethnicity, insofar as we’ve preserved it, but that’s ceremonial, without much substance. In practice, White Americans with Celtic, French, and Slovak surnames can intermingle freely, with little impediment beyond the occasional “how do you pronounce that?” My German surname has never stopped me venturing freely into any American city, because I have no ethnicity. I’m White.

White, like Black, refers not to any genetic or cultural absolute, but to how people treat one another. Writers like Ian Haney López and David Roediger have documented the maneuvers which legislators, captains of industry, and public intellectuals have used to give Whiteness a legally defensible quality it never had before. Whiteness doesn’t simply exist; it was consciously, painstakingly created throughout the 19th and 20th Centuries.

For years, I had no consistent orthography for writing “White.” Though capitalizing “Black” was standard English before I was born, usage of “White” remains widely inconsistent. Earlier blog entries show me sometimes capitalizing “White,” sometimes not. I only started dependably capitalizing “White” after first reading Ibram Kendi in 2018. Only then did I realize the lengths taken to squeeze White Americans together—and squeeze Black Americans out.

Therefore I capitalize White, when describing race, because it doesn’t describe a color. American Whiteness, like American Blackness, isn’t value neutral; it describes a shared history of power relationships, civic struggle, and economics. Where Europeans have class consciousness, and define themselves according to wealth struggles, Americans have race consciousness. Whiteness is, for good or ill, a shared heritage.

And yes, I anticipate the rejoinder, that White American heritage is artificial. But all heritages are artificial. James C. Scott describes the feats which, say, Parisian engineers undertook to expunge regional identities and replace them with pan-French nationalism. Almost all ethnic heritages represent instruments of control, designed to standardize humans into a governable mass. That includes whatever ethnic heritage you hold.

Grammar, like race, doesn’t just objectively exist. Human beings create, and re-create, both constantly. Capitalizing words has more in common with etiquette than science. And spelling rules, like salad fork rules, change to suit the changing world around them.

Wednesday, October 28, 2020

Vampire Cowboy Cyborg Racist Firestorm

Paul Bettany (left), Maggie Q (right), and Cam Gigandet (rear) in Priest

When the nameless Priest threatens to kill his niece, Lucy, if he believes she’s been “infected,” the whole picture comes together. I realize why Scott Stewart’s action-horror film Priest, which died on arrival in 2011 despite an all-star cast and elaborate technical prestige, bothers me. Because, despite its attempts to look new-fangled and edgy, this movie is essentially a remake.

The original is John Ford’s 1956 cowboy, ahem, “classic,” The Searchers.

This leads me to an important question: is it possible for a movie which never mentions race or racial concerns to nevertheless be racist? John Ford supposedly intended his Western to plumb the depths of White racism, which so permeated Texas cowboy culture that it became unnoticed, simply normal. However, many audiences watching The Searchers have suggested it fails that goal, instead serving to glorify the racism which often exudes from John Wayne movies.

In Priest, the titular protagonist, veteran of a war which civilized people would rather forget, gets contacted by a frontier lawman. Seems the vampires have attacked Priest’s brother’s homestead, killed his family, burned his home, and kidnapped Priest’s niece. Only Priest has the skills necessary to outwit the vampires and bring Lucy home safely. But Priest worries about something far worse: what if Lucy has been miscegenated?

Seriously. Replace the word “vampire” with “Comanche,” and it’s the same story.

Researching this essay, I discovered director Scott Stewart and screenwriter Cory Goodman included these references deliberately. They considered it a homage. Which isn’t entirely unfair: many auteurs, including George Lucas and Martin Scorsese, have picked over The Searchers for artistic influence. From Luke Skywalker’s burning homestead to David Lean’s long desert pan shots in Lawrence of Arabia, Ford’s movie casts a long shadow over cinema generally.

As I’ve written elsewhere, literature becomes classic if it speaks both to the time it was created, and to our time. The Searchers reflects a time when racism appeared everywhere in American society, and when challenged on it, the White response was to double down. When John Wayne, as Ethan Edwards, speaks vile slurs against the Comanche, we today feel uncomfortable, not because this behavior happened in the 1870s or 1950s, but because it happens today.

John Wayne’s Ethan Edwards is racist. Not only does he despise Native Americans, and the Comanche in particular, but he so despises their influence that he announces his intention to murder his own kin if he believes the Comanche have changed her—which, implicitly, means if they’ve had sex with her. Maintaining Debbie’s purity becomes Ethan’s obsession. Her White identity, evidently, can be irreparably marred by the Indians.

Priest maintains all these themes: Ethan’s obsession with family and purity, and his willingness to kill his own blood rather than see her made impure. It just erases any reference to race. Indeed, despite the movie’s broad use of frontier homesteader themes, including railroads and “reservations,” the movie’s only non-White character is played by mixed-race Maggie Q, former protégée of Jackie Chan. The movie avoids racial discussions by whitewashing the frontier.

Yet despite scrubbing any reference to race, this movie remains littered with what look like racist dog whistles. The “vampires” live on “reservations,” tended by “familiars,” humans who’d rather live among the vampires than in human towns. At one point, the protagonists pursue the vampires to their homeland, which looks suspiciously like a giant African termite mound. The horrible, monstrous vampires even swarm like insects.

So unlike The Searchers, which, its defenders insist, was critical of Ethan’s racism, Priest simply omits race altogether. Yet where The Searchers foregrounds Ethan’s frequent bigoted statements, subjecting them to critical scrutiny, Priest buries racial issues in symbolism. Therefore, the latter movie’s defenders (who, admittedly, are few) could charge me with being the real racist, for even bringing race into the discussion.

Let’s not kid ourselves: despite its ambiguous dealings with racism, The Searchers is a good movie, an adept and difficult portrait of one man’s obsessions, which he uses as coping mechanisms to bury his guilt over supporting the Civil War’s losing side. Priest, by contrast, is lousy, a ragbag of science fiction and horror boilerplates assembled hastily to make a dollar. Maybe analyzing it grants the movie credence it doesn’t deserve.

Yet racism remains widespread in America today, as recent events have exposed. Papering over Ethan Edwards’ racism doesn’t make it go away. Indeed, Priest’s White triumphalism, coupled with “othering” images, is arguably more vexing, since it doesn’t permit us to address its hero's racism directly. It just lets us continue pretending nothing’s wrong.

Monday, October 26, 2020

Calvinism, America, and Work

John Calvin (etching by Konrad Myer)

I didn’t do one goddamn thing all weekend.

I had sincere intentions this weekend. I planned to get so much writing done, and perform some household repairs, and clean the living room, and maybe get my unused second bedroom into a condition where I could comfortably have guests over. Then I disregarded my alarm both days, lay in bed, read books, watched a YouTube church service on Sunday, and ordered pizza. By contemporary standards, it was a wasted weekend.

As a Christian, I struggle with conflicting impulses. Remembering the Sabbath and keeping it holy is so important that God wrote it among his Top Ten, alongside not killing and not committing adultery. Refraining from work for one day out of seven seems pretty important. That importance was underscored even by non-theistic regimes, like Revolutionary France and Soviet Russia, both of which rushed to reinstitute lite-beer Sabbaths when workers became widely sick with overwork.

However, Christianity also strictly condemns idleness, considering it a form of impiety. Though we’re enjoined to reserve one day in seven for doing nothing, we’re ordered to keep the other six cluttered with effort, an unending struggle to demonstrate God’s creative impulse made manifest in ourselves. Christian scripture brims with exhortations to keep busy, work hard, and remain ever productive. Sitting down and catching your breath is considered wasteful.

This contradiction becomes most manifest in Calvinism. In contrast to the preceding medieval Roman tradition, John Calvin insisted Christians have a divine imperative to constantly improve themselves and their world. This improvement meant preaching the Gospel, certainly (though Calvin was squishy on actually feeding the hungry), but also human industriousness. Calvin thought God demanded we constantly work, build, and manufacture.

Although a dominant branch of conservative thinking preaches America as a Christian nation, we don’t commonly think of America as structurally religious. Yet the Calvinist tradition comes into America through our secular worship of the New England Puritans. This combination of religious lip-service and secular myth-making creates a distinct goulash of politics, religion, and economy which we, in America, call “capitalism.”

I’m not unique in drawing this conclusion. German sociologist Max Weber, writing in the early Twentieth Century, saw an almost straight-line connection between Calvinism and capitalism. America’s economic structure couldn’t survive if a critical mass of Americans didn’t believe work makes people morally good. Weber asserted that Calvinist Christianity is absolutely necessary for a thriving capitalist economy.

Max Weber
Thus, by blaming myself for not working harder this weekend, I’m doing the bidding of my capitalist clerics, apparently.

I must note, however, that this Calvinist ethic has come unmoored from its religious roots. I’m Lutheran myself, a tradition that doesn’t share Calvin’s insistence on work as a moral imperative. I have friends from many religious traditions, and no religious tradition, who also insist on this idea. One agnostic friend says outright that seven-day work weeks are mandatory if you mean to do a good job; days off only inculcate laziness and a slovenly work ethic.

Thus, because of secular myth-making, Calvinism has escaped the religious confines which cultivated it, and has become simply “American.” Peons, like myself, absolutely demand of ourselves a “work ethic” that involves complete self-abnegation and deriving our identity from our employment. We don’t become whole persons through relationships, family, or community anymore; we become human when some institution grants us a paycheck.

Since joining the blue-collar workforce, I’ve noticed this widely. Though not all my co-workers are religiously observant, the language of Christianity nevertheless permeates workplace discussions. The more a person’s workplace language references God, the more resistant they are to collective organizations like trade unions. Going it alone makes people morally good, the thinking runs, and therefore sufferings like poor pay and scanty benefits are a necessary sacrifice.

Many Americans have internalized this ethic. We believe doing something with our time, something upon which we can slap a price tag, is mandatory and constant. Spending a well-earned weekend putting our feet up, chatting with friends, and reading books, is time wasted, time which we’ll never recover and successfully monetize. And I, a lowly worker, sit around punishing myself for catching my breath.

And I’m still working. I can only imagine the internal strife plaguing Americans sidelined by COVID-19, what struggles they must endure because the pestilence continues to devalue work. After a weekend spent doing what I love, rather than what makes money, I feel lost, alienated, and inconsiderate. If I punish myself thus for sitting around for two days, how do people handle being sidelined for six months?

Thursday, October 22, 2020

Mjolnir vs. Plato: Comic-Book Philosophy


If Thor, the Norse god and Marvel Comics superhero, set his hammer, Mjolnir, down on a boat, would the boat sink? A stranger recently asked this question on an Internet discussion board, and I initially thought it a silly question. In the second Avengers movie, Age of Ultron, Thor clearly sets Mjolnir on a coffee table, and the table isn’t crushed. Therefore clearly the difficulty in lifting Mjolnir isn’t about weight.

Another stranger, though, complicated the question: leaving Mjolnir in the boat, could you pull the boat ashore? Could you successfully row the boat with Mjolnir aboard? Only Thor can lift Mjolnir, but could others move Mjolnir indirectly, as by moving whatever it’s sitting on? When Thor sets Mjolnir on Loki’s chest, in his first movie, Loki is incapacitated, but not destroyed. What force, then, makes mortals unable to lift Mjolnir?

Philosophically, these questions seem trivial. Except I’d argue they’re not. Plato, in The Republic, uses the Ring of Gyges myth to test theories of human morality. Fables, including comic book fables, have the capacity to push moral principles to their breaking point, without the complicating friction that reality inevitably provides. Questions about Mjolnir may have no practical value, but they open doors for other, more useful philosophical inquiry.

Start with the question of merit. In the first Thor movie, Odin exiles Thor from Asgard, and casts his hammer to Earth, proclaiming: “Whosoever holds this hammer, if he be worthy, shall possess the power of THOR!” This enchantment includes no meaningful definition of “worthy.” In the movie, Thor sacrifices his life to protect humanity, and is resurrected as an Asgardian. Worthiness, therefore, apparently means willingness to die on principle.

Except several heroes who die fighting, like Iron Man or Black Widow, can’t lift Mjolnir. Being worthy of Thor’s power evidently also involves sharing Thor’s principles, something that, in the final battle, only Captain America can match. Thor, apparently, deserves Thor’s powers, because he’s the being who most completely resembles Thor. Thus a common problem with purely theistic morality: God is righteous because God is God.

Being divine and untinged by human venality, Thor can lift Mjolnir. Tony Stark, who uses alcohol and promiscuity to plug his daddy issues, can’t. On balance, maybe that makes sense. Applied to inanimate objects, this morality then explains why Mjolnir doesn’t crush that coffee table, crash the SHIELD helicarrier, or plunge through multi-story buildings. Inanimate objects simply exist; matter alone is morally neutral.


This puts MCU morality in opposition to Greek gnosticism. Matter cannot be evil, because matter isn’t purposeful, only existent. This would confirm both Steve Rogers’ Christianity and Tony Stark’s atheism, both of which see matter as simply what is. That hypothetical boat would never sink, because it has no motives or purposes in itself. Though built by human hands, that boat, while floating idly, has no necessary morality.

The moment humans attempt to move that hypothetical boat, though, the movement (though not the boat itself) gains moral direction. Machines, technology, and manufactured products don’t have morality, but humans do. We can use our built environment to improve humanity and protect nature, or we can willfully cause harm. That moral judgement doesn’t accrue to the technology used, though tech might make immorality easier; judgement only applies to humans.

Our hypothetical boat, therefore, wouldn’t be immobile. It would drift on a river’s current, or rise on an ocean’s tide. If it hit a sandbar, it would still be grounded. Nature, being matter enacted by principles of physics, wouldn’t impede that boat’s progress. Mjolnir isn’t an anchor, holding that boat in one place. If it did, think how awful the consequences of Earth’s rotation would be!

Humans attempting to move that boat, however, would incur moral judgement. Either pulling the boat ashore, or rowing, would be futile efforts, except for the minority of humans pure enough to share Thor’s worthiness. Every action humans perform has purpose, even if that purpose is pre-conscious or transitory. Therefore every such action incurs judgement. In the MCU, where effect very closely follows cause, morality is always imminent.

Sadly, this creates more questions. If humans dam a river, have we imputed morality onto the altered current? Well, if the current destroys somebody’s home, perhaps. Matter may simply exist, but humans change it; that’s our nature. Therefore matter isn’t morally neutral, once humans exist. Thor’s hammer might not destroy a coffee table, but what about setting it aboard a Nazi battleship? I feel a headache coming on.

Wednesday, October 21, 2020

An Argument Against Free Speech

I consider myself a free-speech absolutist. But like most absolutists, I’m not that absolute. You shouldn’t, for instance, use speech to incite violence, a position long supported by the U.S. Supreme Court. Certain speech acts, such as whipping a crowd into a Nuremberg Rally-style violent froth, leave the realm of “speech” altogether and cross into the realm of “action”; and actions are frequently regulated, for good reason. For a more detailed analysis, see Mick Hume.

We’re witnessing, right now, an example of what happens when we consider the right to free speech too absolute. The debate, or more accurately “debate,” over whether cloth masks prevent the spread of COVID-19, has dragged on for seven months, while American statistics show over eight million diagnoses, and nearly a quarter-million avoidable deaths. Sadly, this is no longer a debate. We’re only arguing about how to minimize casualty counts aboard the Titanic.

Anti-mask advocates drag the debate on indefinitely, using squishy tactics. They dredge up orphaned quotes from Anthony Fauci and other public health officials, some from as early as March, keeping outdated ideas alive after the science has moved on. They recycle Cold War claims of Marxist evils, claims easily debunked by anybody who watched Chernobyl and saw the bureaucrats send miners into harm’s way without PPE. Anything they can imagine to keep the debate alive, they’ll deploy.

Because, as long as Americans keep arguing, nothing gets done. As Rampton and Stauber point out, defenders of the status quo don’t need to win debates to achieve victory; they only need to preserve the illusion that the debate remains unresolved. If a critical fraction of the population believes there are still questions unanswered, they’re unmotivated to participate in meaningful solutions, or to demand collective action from the people we’ve elected to represent our interests.

Nebraska, where I live, currently has the fifth-highest Coronavirus infection rate in America. Despite this, Governor Pete Ricketts persists in refusing to issue a statewide mask mandate. Historically, he’s stated that he sees mandates as “the big, heavy hand of government,” indicating he believes freedom from government interference outweighs the biggest public health crisis in our lifetimes. (He has, however, endorsed America's most restrictive abortion laws, so his anti-government sentiments are apparently contingent.)

COVID masks have become a thumbnail sketch of why America, and the world generally, can’t organize to address the major, crippling crises of our age. We persist in believing we need to resolve every debate, permanently and definitively, before taking any collective action. Public pundits continue arguing about whether anthropogenic global warming is real, even as the entire western United States is on fire, because the science is still evolving. Let’s just wait and see.

Except, that’s literally not how science works. Nothing is ever permanently resolved. Isaac Newton published his theory of gravity in 1687, and physicists continue testing and refining his principles over 330 years later. Gravity is a concept so ubiquitous that it seems impossible to question, a force so axiomatic that “what goes up, must come down” is fundamental to Western philosophy. If even that remains subject to question, a disease that didn’t exist twelve months ago won’t be resolved in one sitting.

At this point, somebody will inevitably bring Michael Crichton into the discussion. The American science fiction author gets exhumed whenever discussions like this happen, particularly his famous quote: “Whenever you hear the consensus of scientists agrees on something or other, reach for your wallet, because you're being had.” Personally, though, I’d prefer to see Crichton banned from these discussions. His entire premise consists of hating everything and impeding every discussion. Philosophically, Crichton is a liability.

So, keeping debates alive, while perhaps an admirable Socratic effort in theory, does immeasurable harm in practice. Sometimes, as with COVID masks and global warming, we should declare the fundamental debate closed, and move on to action. Sure, we can continue arguing over the fine details: how often does my mask need washing? Is carbon dioxide or methane a worse greenhouse gas? But the core facts are so robust that the debate needs to just stop.

However, in reality, this opens other questions. Who gets to declare debates closed? The President? This President would happily declare all debates closed, and all progress unfair, rolling back society to whenever he thinks things were good, which is apparently 1964. Scientists and doctors might declare debates closed on their side, but run up insurmountable bills, while bean counters would do the opposite. Some debates need to close. But who gets to close them?

Saturday, October 17, 2020

Philosophy for Writers: Free Will vs. the Plot

Plato and Aristotle, as painted by Raphael

Back in graduate school, we read Aristotle’s Poetics, and several of us rolled our eyes. Wow, we Creative Writing types thought, what a load of hooey! Not only did Aristotle waste the audience’s time on windy discursions about linguistic rhythm, which only make sense in classical Greek, but his hierarchy of qualities in well-written stories made plot the most important element. Can you imagine, we scoffed in unison.

When I attended grad school in the 2000s, the predominant idea held that character, not plot, drove the narrative. Standard academic precepts agreed for years before I formally studied the field; I remember encountering the idea that plot arose from character interaction in textbooks from the 1980s, and I presume the idea predates that. Characters don’t serve the story, standard textbooks insisted; the story arises from the characters.

Imagine my surprise, then, when ads for plot generation tools and software began crossing my social media feed with increasing frequency. Selling books, and more recently software, to struggling or apprentice writers has a long history. Whatever these tools peddle, I suspect, reflects prevailing attitudes about what makes commercially successful content. To judge by the ads, the trend is moving back toward plot-driven narrative, with character retreating to second place.

Admittedly, that’s a generalization, but it piqued my curiosity. Ruminating on the question, I realized an important cultural variable. Exactly how much characters are beholden to plot, and vice versa, possibly reflects how beholden we, the reading public, believe ourselves to be to outside forces. The debate, then, isn’t between plot and character; it’s between free will and determinism. And that debate is shifting during our lifetimes.

Marcus Aurelius

Aristotle didn’t critique “literature” abstractly. His Poetics addressed specifically the Homeric epics, and the classic tragedies performed at the City Dionysia of Athens, a major act of civic religion. The sharing aloud of Homeric epic, and the public performance of tragedies, served to bind the community in shared values, an act of liturgy almost completely missing from modern society. Imagine the Super Bowl, with literally all Americans attending.

As acts of religion, these performances reinforced the Greek traditions, and reminded everyone of classical cosmology. Ancient Greeks believed the Three Fates spun destiny for everyone, humans and gods alike, before they were born. Oedipus Rex, which Aristotle praises for its tightly woven plot, stars a man beholden to destiny, though he resists. That play reminds us that the gods bring rebellious and willful humans to heel.

Greek thought, therefore, is deterministic. Humans live out a narrative pre-scripted by supernatural beings. Character doesn’t matter, because our stories are already written. The fact that Aristotle believes plot dominates good writing reflects the cultural acceptance of deterministic order. In other words, characters in Greek tragedy aren’t free, because Greeks don’t perceive themselves as free.

Christianity monkey-wrenched this belief. Admittedly, Christianity holds that humans are beholden to Original Sin, forever doomed to fall short of God’s glory; but Christianity doesn’t perceive humanity as necessarily deterministic. Christ stands at the door and knocks, giving humans the option whether to answer. Medieval Christian literature, like Piers Plowman and The Book of Margery Kempe, depicts humans torn between individual freedom and divine destiny.

Roland Barthes
The rise of the novel as literature’s highest form, in about the 18th century, corresponds roughly with the retreat of religion into a subordinate social role. Human agency, not divine predestination, became the dominant social model. Though some systems, like John Calvin’s aristocracy of the elect, remain active, they generally haven’t mattered much to how we actually live for a long time. Humans, today, are regarded as essentially free.

So are our literary characters, unyoked from plots which Roland Barthes disparaged, appropriately, as “theological.”

Except that’s reversing. A rising philosophical tide, driven by neuroscientists like Sam Harris, and philosophers like Daniel Dennett, calls free will into question. Though Harris and Dennett are outspoken atheists, their form of biological determinism grants authority to external forces which serve the god-like role Aristotle accorded the Olympians. Contemporary gods are impersonal, vague, and soulless, but the outcomes are, for all practical purposes, identical.

The recent rise in “plot generation” tools for writers reflects the resurgence of determinism in moral philosophy. Exactly how free fictional characters are, or how beholden to plot, reflects our human belief in how free our wills are. As quasi-theological determinism makes a comeback, so does the theological narrative plot. The debate’s resolution is hardly a foregone conclusion. But perhaps writers, not philosophers, are the bellwether of how the debate is actually going.

Wednesday, October 14, 2020

What Does “Law” Mean to the Powerless?

Richard Delgado and Jean Stefancic, Critical Race Theory: an Introduction, Third Edition

You probably discovered Critical Race Theory recently, like I did, through the news. For some reason, in the late summer of 2020, right-wing media pundits rapidly coalesced around Critical Race Theory as a major force undermining stable American government, threatening to throw society into disarray. As often happens with media-driven moral panics, the great spokespeople never defined the monster they inveighed against. They just screamed bloody murder until their complaints seemed mainstream and commonplace.

University of Alabama law professors Richard Delgado and Jean Stefancic wrote the first edition of their introductory textbook during the Clinton Administration, when CRT (as they abbreviate it) was emerging from the scholarly circles which created it. The third edition dropped as the Obama years transitioned into Donald Trump, which presumably isn’t coincidental. But calling it a “textbook” makes it seem falsely imposing and threatening: it’s little more than a pocket-sized pamphlet with discussion questions.

“Critical theory” carries the whiff of philosophy and important literature. But CRT originated in law schools in the 1970s, and by the 1980s it had spread rapidly throughout the social sciences. It dealt specifically with the interaction between law, as an ideal of social order, and the ways ordinary Americans lived their lives. (Its origin was specifically American, though it eventually spread internationally.) That interaction, CRT’s proponents insist, is often harsh, lopsided, and laden with friction.

Delgado and Stefancic trace CRT’s origins to Derrick Bell, the Harvard Law School professor who influenced both Barack Obama and Ian Haney López. Bell asked pointed and timely questions about how the law, an abstract and theoretical entity, interacted with non-White citizens at ground level. Then, having asked such questions, he began proposing answers. His solutions often remain controversial, partly because they involved forms of direct action that had fallen into disfavor by the 1980s.

However, anybody who’s ever studied critical theory knows that no theory is ever unitary and self-contained. Just as Professor Bell proposed important interpretations of law, others, including Haney López, and even Delgado himself, identified other interpretations, valuable inconsistencies, and questions beneath questions. Therefore, this text doesn’t present a list of closed debates and clearly defined right answers. It describes the controversies which define CRT, and the interaction between race issues and the law in general.

Our authors give very brief summaries of what they call CRT’s “hallmark themes,” which largely orbit two questions: how ingrained is racism in American law, and what policy-based remedies should America undertake? In later chapters, they delve into issues like how personal empathy, however well-meaning, fails to address underlying structural problems, and how the narratives which minority citizens provide can steer policy debate. They also question how to address the partisan backlash CRT has received.

Richard Delgado, left, and Jean Stefancic: official University of Alabama photos

Right-wing opposition to CRT probably derives from two points which Delgado and Stefancic explore. First, CRT assumes that racism is fundamentally constructed into American social structure, and therefore cannot be expunged by persuading individuals to not be bigoted. In other words, racism isn’t an individual behavior or character failing; it’s written instrumentally into American law. Fixing American racism will mean major structural alteration to American law—and exactly which structures need altering isn’t necessarily clear.

Second, CRT is fundamentally deterministic. Yes, our authors use the word “determinism” frequently, a term pinched from philosophy, though they cite legal scholarship to justify it. That is, our authors believe, as CRT does, that human will is circumscribed, and our choices are determined by economics, social pressures, and other external forces. Therefore, people cannot simply choose to obey the law, especially when the law opposes the external forces which determine their scarce available choices.

Again, the authors intend this book for classroom application, though it’s written in non-specialist English and reasonably priced. Therefore, they include discussion topics and classroom exercises. Unlike the main text, the classroom exercises seem more pointedly partisan, written to steer participants toward a determined outcome. As a sometime teacher, I feel squeamish about these exercises, and would probably write my own. However, the included exercises offer a useful jumping-off point for self-scrutiny and personal study.

CRT isn’t only one subject, our authors remind us. It’s a rubric for historical questions from the latter Civil Rights era, and a guidepost into important burgeoning questions about the differing needs of different races, for instance, or where LGBTQIA+ concerns overlap race. As the title implies, this brief volume only introduces a more complicated topic. However, for anyone interested in the friction between power and people, it provides a compelling synopsis for future studies.

Monday, October 12, 2020

David Tennant in: Cash-and-Carry Justice!

Dean Devlin (director), Bad Samaritan

Feckless young stoner Sean Falco (Robert Sheehan) parks cars outside an upscale Portland, Oregon, restaurant. But he runs a side gig: once customers trust him with their keys, he burglarizes their homes while they’re dining. He’s gotten good, too, at selecting subtle loot that nobody misses. Until the day slick, upscale Cale Erendreich (David Tennant) gives Sean his keys, and Sean discovers a battered girl chained in Cale’s home office.

Only Dean Devlin’s second directorial outing, Bad Samaritan opened to lukewarm reviews and dismal receipts. Those who saw it gave it somewhat warm, but not overwhelming, reviews; but not many people saw it. This isn’t entirely unfair, given its straight-to-DVD characteristics: much of the action is squarely centered and unsubtle, as if the director expected audiences to watch with one eye while cooking dinner. This isn’t cinema as high literature.

But within that limit, it nevertheless makes an interesting commentary on American justice and unequal economics. Cale Erendreich has everything Americans have learned to expect from prosperity, including a glamorous house, numerous girlfriends, and virtual impunity. He also tortures and kills women. Sean Falco is poor, strung-out, and a petty criminal, but as the only witness to Cale’s depravity, he’s desperate to be perceived as honest.

Sean attempts to report Cale’s crimes to Portland PD, twice. And twice, the fuzz disbelieves him. One detective even sits at Cale’s kitchen island, drinking coffee and chatting amiably, while Cale lies like a rug. Worse, returning to the station, the detective threatens Sean, based on his prior history of broken-windows offenses. Apparently, Sean’s vandalism arrests and other petty convictions rank worse than Cale’s disdain for humanity.

Women everywhere can probably sympathize.

Once Cale recognizes Sean’s intrusion into his carefully controlled world, he organizes ways to control and dominate Sean. He gets Sean’s parents fired from their honest, blue-collar jobs, demolishes Sean’s relationship with his out-of-his-league girlfriend, and destroys Sean’s half-restored vintage car. Piece by piece, he dismantles Sean’s life, leaving him alone and defenseless against a city that doesn’t care.

Here’s where this movie earns some of its harsh criticism. Cale’s deconstruction of Sean’s life follows, almost note-for-note, the pattern described in Blake Snyder’s screenwriting guide, Save the Cat!. Though Sheehan, Tennant, and a sterling supporting cast of inordinately good-looking performers give their all, and Tennant maintains a remarkably good American accent, we quickly realize the story is beholden to a beat sheet. The characters are simply carried along.

David Tennant sadistically enjoys toying with his prey in Bad Samaritan

However, in parallel to this beat-sheet story format, one character stands out. Character actor Tracey Heggins, as a young FBI agent eager to break her first case, chooses to ignore her superior officers’ advice and take Sean’s accusations seriously. She admits his story doesn’t sound altogether plausible, but it at least remains consistent, which sets him above the run-of-the-mill crank. She doesn’t necessarily believe Sean, but takes him seriously.

Desperate to protect his loved ones, Sean pursues Cale using any tools available. He believes the entire law-enforcement establishment opposes him, and knows that both the law and criminals far more skillful than he is will demolish him should he falter. That doesn’t matter; he only knows there’s a helpless girl, a family who doesn’t understand, and somebody who takes pleasure in others’ suffering. He only wants to put things right.

Cale openly boasts that his money makes him immune to consequences. Throughout most of this movie, that’s true. I cannot help comparing this movie to another socially motivated horror thriller, Jordan Peele’s Get Out, in that it foregrounds a White villain whose wealth distorts the value of justice. Where Peele makes his story about race, Devlin makes his about wealth. The difference probably doesn’t matter much to the desperate protagonists.

Audiences can probably perceive this movie in one of two ways. The cat-and-mouse suspense narrative definitely leaves something to be desired. As stated, it follows the beat sheet included in a mass-market screenwriting guide. Tennant, as Cale, comes across as a poor man’s Hannibal Lecter. Sean is no Clarice. It’s not boring, but it does play by the numbers, reaching through standard confrontations toward a conclusion we feel is probably inevitable.

Simultaneously, the movie makes clear comments on whom the law actually serves. At multiple points, it reminds us how police obey when rich people call, yet reflexively consider the poor untrustworthy. We watch Sean desperately telling the truth, while the “justice system” sweeps real, substantive crimes under the rug. As a thriller, this movie is okay, but not groundbreaking. As social commentary, it has something to say about cash-and-carry justice.

Sunday, October 11, 2020

Too Many Channels, Too Few Ideas

Game of Thrones, while it was in the process of collapsing

CBS All Access, the American network’s online-only second channel, is producing a miniseries of Stephen King’s The Stand. Um, wait, didn’t somebody already do that? Yes, a quick Google search reveals ABC, another American network, broadcast a miniseries of The Stand in 1994, to warm reviews and a robust sweeps-month audience. We’ve already seen The Stand on TV, to some success. Why on earth is it worth remaking?

I’m among the last generation which grew up with limited TV options. Most American markets had three network affiliates, two unaffiliated local channels, and PBS. The widespread adoption of cable TV coincided with my grade-school years. But even then, cable granted us twenty or thirty channels, not the hundreds in today’s market. For a small fee, you’d also get HBO, the only source that piped cusswords directly into our houses.

After Netflix switched from mailing DVDs to streaming original content, we had a brief Wild West period where streaming services performed stunts to attract customers. Your favorite shows, now available on Hulu! Or Sling, or Amazon Prime, or YouTube TV! All for a nominal rate. Studios, like Disney, began skipping networks and cinemas, sending their content directly into our networked devices. Apple and AT&T became content creators.

The result, paradoxically, has been a strange two-pronged paralysis. While I’m loath to subscribe to any fee-demanding streaming service, content creators have become fearful of creating anything terribly new. Besides The Stand, CBS All Access includes two Star Trek series. Netflix has the third Chronicles of Narnia adaptation. Warner Bros. has announced the third adaptation of Dune. Doctor Dolittle, Call of the Wild, and Stephen King’s It have all been remade, repeatedly, in recent years.

The protagonists of The X-Files, looking as tired as I feel

With the massive profusion of media markets, the studios have become massive content recycleries. They seem afraid to create anything that hasn’t already shown success as a comic book, classic novel, or previous movie. Importantly, if Internet buzz is reliable, most of these adaptations aren’t very good. Dismal responses to The Invisible Man and Artemis Fowl, and negative advance responses to Terry Pratchett’s The Watch, speak volumes.

I’ve never watched The Mandalorian. As someone who grew up amid the generation which defined itself according to Star Wars, this confession feels almost shameful. But I’ve also never seen The Walking Dead, Modern Family, or The Masked Singer. Numerous fans tried to steer me onto Game of Thrones and Lost, right up until both franchises collapsed into flaming heaps of widespread disappointment. Suddenly, my aversion seemed justified.

But in today’s state of near-constant online connection, barraged by social media, we all receive daily messages encouraging us to adopt some new TV phenomenon. Baby Yoda pictures and “This Is the Way” memes keep me largely abreast of The Mandalorian without my ever needing to watch it. I imagine you, like me, find yourself constantly immersed in media you’ve never actually consumed. We’re simultaneously trapped in media silos and flooded with media.

Certainly innovative ideas exist; working writers produce new ideas, or new spins on classics, constantly. My friend Bishop O’Connell produced a damned good novel trilogy recently, but without PR support, it’s gone out of print. With a little skillful location work and minor CGI, Netflix or Disney could translate his novels into streaming TV content and make a mint. Instead, they keep rehashing content first popularized in the 1970s.

One of many attempts to keep Star Trek alive and relevant

Maybe it reflects my age (okay, definitely), but the more TV options exist, the less TV I watch. Other than Doctor Who and PBS Masterpiece, I watch virtually no TV anymore. As content becomes both ubiquitous and timid, and streaming services mistake cusswords for edginess, nothing draws my attention anymore. The profusion of choices keeps growing faster than the market does. I find myself withdrawing, reading books.

Admittedly, I recognize the contradiction between my recent praise for different takes on Arthur Dent, and my aversion to different takes on Stephen King. Perhaps I’ll reconcile the difference later. I just know that, while Arthur Dent evolved with his creator, Stephen King’s remakes keep happening without him; he’s busy creating new content, while studios limply revisit his back catalog. He gets rich, while corporations get lazy.

This disconnection, between too many content sources and too few ideas, leaves me pessimistic. The gatekeepers of American (and international) culture have created gates so narrow, that ambitious and original writers can’t get through. Perhaps that’s the paradox of modernity. In art, politics, business, and economics, the more venues there are for marginal voices, the more blandly restrictive the mainstream inevitably becomes.

Friday, October 9, 2020

Will the Real Arthur Dent Please Stand Up?

Simon Jones as Arthur Dent in 1981

I re-watched the 1981 television adaptation of The Hitchhiker’s Guide to the Galaxy recently, for the first time since VHS days, and something struck me about Arthur Dent. Through the years, I’ve read him described as the story’s protagonist, antihero, viewpoint character, or voice, but importantly, never as the hero. Watching it again, I found it glaring what a wet rag Arthur Dent is. One wonders how such a gormless character can have such lasting appeal.

Simon Jones played Arthur Dent on TV, and also radio, audiobooks, and occasionally onstage. Writer Douglas Adams reputedly wrote Arthur specifically for Jones, with whom he performed in the Cambridge Footlights; and Jones ducked into the role, off-and-on, for twenty-five years, from 1978 to 2003. Only thirty-one when he played Arthur on TV, Jones is tall and good-looking, with broad shoulders and lustrous hair; yet he makes Arthur look childlike, a black hole of charisma.

Contrast Martin Freeman, who played Arthur in Garth Jennings’ 2005 HHGttG movie. Freeman is almost studiously average: in height, looks, body type. Freeman’s Arthur is a mildly altered version of Tim Canterbury, the role from The Office that made Freeman famous: amiable and kind, but largely forgettable. Yet, like Freeman’s other major big-screen role, Bilbo Baggins, this Arthur grows into his newfound role. He starts relishing confrontations with aliens, and eventually embraces the hitchhiking life.

Perhaps these different Arthurs reflect their creator, Douglas Adams, a man who struggled with identity throughout his career. In the introduction printed in most omnibus editions of his HHGttG novels, Adams describes conceiving the story’s first core while hitchhiking aimlessly around Europe after college, a nominal adult nevertheless lacking direction, both literally and figuratively. The story only took form, though, during a period of prolonged pessimism, when he wrote it specifically to destroy planet Earth.

Adams’ original Arthur reflects both Adams’ frequently purposeless life, and Britain a generation after World War II. Witnessing its global empire enduring its death throes, Britain produced new literary heroes: venal monsters fighting international tyranny, like John le Carré’s George Smiley, or pathetic, nebulous, frequently comedic entities like Kingsley Amis’ Lucky Jim. Clearly, Arthur Dent falls into the latter category: his individual wandering and lack of direction reflect the United Kingdom’s national sense of futility.

Martin Freeman as Arthur Dent in 2005
For Arthur, in the original radio and TV series, this directionlessness manifests as frequent whining. “I seem to be having tremendous difficulty with my life-style,” Jones’ Arthur mewls, almost directly into the camera. Presumably, Adams’ first-generation British audience would’ve understood why that complaint mattered—and also why, when that complaint passed into a wormhole, it would’ve offended two warlike races preparing for battle. Because such a whimper certainly would have offended Brits of Churchill’s generation.

Notably, that whine doesn’t appear in the 2005 movie. Though the movie dropped four years after Adams’ passing, Adams was involved in the movie’s production for twenty years, and the final production used a lightly doctored version of Adams’ own script. (The movie’s development hell caused Adams’ lack of productivity through the 1990s.) Presumably Adams himself excised that moment from the story. It reflected a moment long past, both for Britain generally, and Adams personally.

My theory is that Adams himself changed. Adams cranked out two radio series of HHGttG and four novels in quick succession, from 1978 to 1984, then went suspiciously quiet. Notoriously bad at deadlines, Adams only produced the fifth novel eight years later, in 1992. That novel, Mostly Harmless, is markedly different from the previous four. Largely dry, often angry, and lacking Adams’ trademark wordplay, it features Arthur living mostly in one place, his hitchhiking days over.

It also features Arthur raising the teenage daughter he never knew he had. Adams married in 1991, and though his only daughter wasn’t born until 1994, surely he understood fatherhood was a possibility. Forced to take stock, Adams probably decided that wandering through life, lacking purpose and goals, wasn’t acceptable anymore. A lifelong atheist, Adams couldn’t derive purpose from transcendence; therefore he had to manufacture it internally. Aimless wandering became a journey towards a destination.

That, I believe, conditions these two different Arthurs. Simon Jones’ Arthur reflects a new Britain, born of wartime conditions, trapped in protracted adolescence. Martin Freeman’s Arthur reflects Douglas Adams specifically, a man who got away with living like a teenager well into his forties, suddenly accepting adulthood. Had Adams lived, one wonders how the character would’ve continued evolving: Adams would be pushing seventy now. We, his audience, have the opportunity to continue evolving with him.

Monday, October 5, 2020

The Matthew Effect and the American Senate

The San Francisco skyline: once the hub of all California

When California became a state in 1850, the legislature floated a bill that would’ve returned the state’s entire southern half to the federal government as an unorganized territory. Back then, the state’s population base centered on San Francisco, while Los Angeles and San Diego were small, distant towns. Most of the Southern California population consisted of Spanish-speaking rancheros, whom White Californians regarded as mostly foreigners.

I didn’t know this fact until reading Bill Kauffman. Despite having spent nearly half my public-school years in various California cities, and having taken multiple “love your home state” history courses, I never learned that early White Californians were so racist that they’d rather unload half the state than live with rural Mexicans as fellow Californians. Yet the older I get, the more I realize California speaks for Americans overall, in ways we shouldn’t take pride in.

A recent Vox article opens with this telling complaint: “Currently, over half the country lives in just nine states, which means that less than half of the population controls 82 percent of the Senate.” The article uses this to complain about “the malapportionment of the Senate,” a legitimate concern. But the article never questions why the population has become so concentrated. Why are America’s demographics so unequally distributed?
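For readers who want to check that arithmetic, a quick back-of-the-envelope version follows. It assumes only the Vox framing (nine most-populous states) and the constitutional rule of two senators per state; the figures are illustrative, not fresh census data.

\[
50 - 9 = 41 \text{ remaining states}, \qquad 41 \times 2 = 82 \text{ senators}, \qquad \tfrac{82}{100} = 82\%.
\]

In other words, the nine states housing more than half the population seat only 18 of the 100 senators, leaving the less-populous half of the country with 82 percent of the chamber.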

Media outlets, with their headquarters in the largest cities of America’s largest states, decry the unjust apportionment of Senate seats, and with them Electoral College votes, every election cycle, because that’s when the problem hits them. But they always state the question in terms of why the system treats large states so unfairly. Seldom, if ever, do they question why other states are so small. Therein, perhaps, lies the missing equation.

As a Nebraska resident, I’m downright bored hearing regular cries from legislators wanting to reverse the state’s patterns of outmigration. But they never ask why promising young strivers leave Nebraska. What will they find in, say, Chicago or Kansas City or Seattle, that they can’t find here? Anybody who’s lived in Nebraska, outside the Lincoln-to-Omaha corridor, can tell you exactly what youth hope to find elsewhere:

Work.

It’s more complex than that, certainly. People seek new experiences, an improved quality of life, and a chance to meet partners who share their values, all opportunities more readily available in large cities. But those opportunities exist, in concentrations enough to matter, in places where people have good-paying, stable employment. People flock to Manhattan, Chicago, and coastal California because these places disproportionately create America’s economic prosperity.

California's population, divided into quartiles, by geographic area.

Nowadays, of course, California’s population base is in L.A., and Los Angeles County’s population represents one-quarter of the state. The county has a larger population than forty-one entire states. Yet not only have California’s state lines remained unchanged since 1850, but the Los Angeles County line hasn’t moved since 1889. California’s political jurisdictions have a chillingly 19th-Century aura. We’re saddled with lines drawn during steam locomotive days.

This imbalance between Nebraska’s available space, and California’s available work, creates what sociologists call “The Matthew Effect,” the accumulation of opportunities in places where opportunities are already abundant. This flocking effect has good and bad consequences. It means young adults know where to look for lucrative, upwardly mobile work. But it also leads directly to the overbuilding which exacerbates California’s seasonal wildfires.

In the electoral field, it rewards small-state voters who elect to continue with existing, ineffective policies, over large states which favor change. The legislative and Electoral College advantages accrued by driving youth into ever-more-concentrated markets mean that policies for change suffer crib death. In Nebraska, the corn board wants to increase ethanol production, not for environmental reasons, but to create a stable market for bloated mechanized agriculture.

The national mass media’s quadrennial whine about the unfair Senate lets journalists feel engaged and busy. But it papers over the reality: yoked with state lines that, in the Lower 48, haven’t changed since the 19th Century, ambitious young people will move to follow opportunity. They always have. If large states, like New York and California, want to reverse this trend, they need to participate in creating economic opportunity in small, substantially rural states.

I’ve historically defended America’s unfair Senate, as a place where large states must face smaller states as equals. I still believe this. But by preserving state lines unchanged since the California Gold Rush, while encouraging policies that increase geographic concentration of economic prosperity, large states have encouraged their own unfair treatment. If they’re serious about changing this trend, the policy correction needs to start closer to home.

Friday, October 2, 2020

Dietrich Bonhoeffer and the King of America

Dietrich Bonhoeffer
Dietrich Bonhoeffer, the famed German theologian and Nazi resister, had his first run-in with Hitler’s administration almost immediately. Only two days after the Nazis took power in 1933, Bonhoeffer delivered a radio address warning Germans that their desire for a Führer, a leader with broad and unquestionable power, was cult-like, even idolatrous. Bonhoeffer’s address was silenced mid-sentence. His biographer, Eberhard Bethge, admitted it might’ve been a technical error, not censorship, but nobody really believes that.

I recalled Bonhoeffer’s clash with authority earlier this week, when online… er… personality Kaitlin Bennett, the self-described Kent State “Gun Girl,” posed on Twitter wearing a shirt reading “Trump Is My King.” Apparently Bennett, an apprentice to notorious loose screw Alex Jones, sells these products through her husband’s online store, and expects fellow American conservatives to purchase them. A disillusioned ex-conservative myself, I think most right-wingers will feel rightly incensed here. But not all will.



Conservatives, historically, have sought the appeal of a unifying moral authority. Kings, popes, and potentates give nations an identity and vision, without the sloppy hand-wringing which often accompanies debate. Democracy, as even its most avid defenders will admit, is slow-moving, and often subject to public whimsy. It also leads to an irresolvable pluralism. When everybody’s position is, theoretically, as valuable as anybody else’s, the nation lacks a single, unifying vision, and nobody makes binding decisions.

Therefore, in free nations which historically overthrew their monarchs, there’s often a tendency to restore kings, directly or indirectly. First Rome exiled its final king, then the Roman Republic gave monarchical power to the Caesars. Germans cheered when Kaiser Bill fled in disgrace, then floundered for a generation, before the Brownshirts provided a suitable replacement. Russia went from tsars to premiers to Vladimir Putin, because, as awful as each was, at least they brought order.

Currently, there’s no reason to believe Kaitlin Bennett’s desire to anoint King Trump has any support outside her husband’s website. American conservative identity anchors heavily in the mythology of powder-wigged ancestors rebelling against a king. Investing an American monarch would contradict their own story and, I think, create cognitive dissonance they couldn’t resolve. However, there’s already a move afoot to grant the presidency significant powers, which the officeholder could exercise with monarchical impunity, a back-door coronation.

Political theorist Corey Robin, in his book The Reactionary Mind, postulates a simple theory for right-wing thinking. Conservatives essentially believe society has a naturally occurring power hierarchy, and those possessing moral weight and personal merit will naturally gravitate upward. Therefore, any attempt to redress structural inequalities, Robin suggests, will raise up undeserving people, while tearing down the meritorious. How can we tell who deserves power? Easy: they already have it. Power is its own justification.

If Robin’s explanation holds water, then power isn’t merely political; it’s moral. Therefore, power can legitimately remain only so long as the powerful individual remains moral. That’s why public Christians like Franklin Graham and Eric Metaxas feel so compelled to demonstrate Trump’s supposed Christianity, despite his documented infidelities and vulgarity. Power, in service to protecting the established order, requires moral authority, which it then distributes with papal infallibility. The powerful are always right, because right is always powerful.

Jerry Falwell, Jr., and Donald Trump: the unholy union of piety and power

Audiences like me may feel squeamish at this correlation between morality and power. History is replete with evidence that powerful people regularly behave immorally. It’s absurd to deny, under the current weight of evidence, that Donald Trump is a racist, sexist, philandering tax cheat. Yet his defenders would insist that, if Trump is himself morally good enough to achieve power, anything he does is perforce acceptable. This logic is downright Nietzschean in its looping complexity.

Here’s where Germany returns. As Bonhoeffer noted about the Führer cult, power doesn’t derive from morality; morality derives from power. We look for the best possible explanations of powerful people’s actions, to justify our own morality. That’s why much of the political Left happily looked the other way when President Obama expanded the drone warfare program, sold licenses to drill petroleum in the Arctic Ocean, and more. We justify power when it serves our ends.

A fringe element of American conservatism wants to crown Trump. Frankly, that’s predictable and expected. But maybe, by doing it so brazenly, Kaitlin Bennett has done Americans a favor. If we take this opportunity to unyoke power from morality, we can prevent the catastrophes we’ve witnessed under the last three presidents. When we stop thinking government is the arbiter of moral goodness, we can pursue what we know is right. Let’s free ourselves from Caesar.