Wednesday, October 30, 2019

Star Trek, the Frontier, and the Borg


“We are the Borg,” the faceless cube intones in the movie First Contact. “Your culture will adapt to service us. Resistance is futile.” These words have struck such fear into a generation of science fiction fans that the final sentence has escaped the genre pen and become a larger cultural touchstone. Simply say the words “Resistance is futile,” and most people, sci-fi fans or not, will immediately understand exactly what you mean, and fear appropriately.

Throughout genre history, American science fiction has frequently had a frontier element. Whether it’s Han Solo promising to move quantities of goods to market without revenuer interference (“selling whiskey to Indians”), Commander Adama plotting a course beyond the Red Line (“light out for the territories”), or Malcolm Reynolds’ all-around cowboy ethic, science fiction has pilfered generously from America’s frontier mythology. That goes double for Star Trek, which Gene Roddenberry pitched explicitly as a “space western.”

Captain Kirk announced the original Star Trek by describing himself traversing the “final frontier.” He made it his goal to encounter undiscovered peoples, several of whom, in the original series, are depicted as having Native American qualities. (These qualities are broad stereotypes, and the characters are often played by White actors in heavy makeup, though that’s too many themes to address here.) Though the Prime Directive stops Kirk from becoming an out-and-out conqueror, he’s nevertheless frequently a White Savior.

Perhaps that’s what made Picard’s contact with the Borg so terrifying: colonists and empires don’t like having their own actions thrown back at them. In the original series, Kirk and the Federation dealt intermittently with the Klingon threat, a metaphor for America’s interactions with the Soviet “menace,” but never investigated that in detail, perhaps because, in light of the ongoing conflict between two global superpowers, creators couldn’t face the implications. America was an empire, too.

As the Cold War wound down, though, such investigation became available to Picard. The Next Generation included more scrutiny of the Federation and its operant principles, which were primarily White, male, and expansionist. The series still held itself back in service to the tastes of its time, and Roddenberry’s humanist philosophy, but we began seeing rot and disorder among the Federation hierarchy. White humanity’s innate goodness wasn’t something the Federation could take for granted anymore.

A Borg drone (Hugh) from the episode "I, Borg"
White America’s history with indigenous peoples will never escape the taint of racism. Forced removals and the Trail of Tears; state-sponsored missionary activity designed, in no uncertain terms, to abolish native culture; the Carlisle Indian School. And that’s just our interactions with Native Americans. The phrase “Your culture will adapt to service us” accurately describes the philosophy underlying how White people viewed Native American culture until very recently—and, in too many quarters, still does.

Funny enough, while America saw itself as a bastion of rugged individualism, it painted the “Evil Empire” as a faceless collective bent on assimilating citizens’ identities into a top-down social stratum. It’s almost like, viewing from within, we saw ourselves as individuals with goals and personalities, while viewing our “opponents” from without, we considered their people the sum total of their government’s belligerent rhetoric. One wonders how Native Americans saw the Cavalry crossing the frontier.

American frontier myth, to work, always requires viewing from the White side. Whether it’s children playing Cowboys and Indians, or NASA pledging to establish colonies on the Moon and Mars, somehow the White settlers always win. But science fiction, which regularly abandons moorings in the putatively “real” world, has liberty to speculate on big themes which we’re often denied in reality, including the possibility that, someday, somebody new may cross another frontier and find us.

That’s what happens with the Borg. After decades of Roddenberry’s mythology assuming humans, mostly White males, would win the colonial enterprise, the Borg upend this assumption. They enter our space. They want to conquer us, and make us part of their, ahem, enterprise, rather than us inviting strangers into our Federation. And though, pursuant to the needs of episodic television, the humans always successfully resist the alien invaders, they nevertheless remind us, victory isn’t certain.

Our frontier myth has frequently tainted America’s interaction with the wider world, as historians like Greg Grandin have noted. The myth has faltered significantly in the 21st Century, but still basically drives our foreign policy. That’s why we need something like the Borg, to remind us that, viewed from outside, we probably don’t look like the Federation. We probably look like an anonymous mass of technological monstrosities, bent on yoking other peoples to our vision.

Sunday, October 27, 2019

Jewish Jesus

1001 Books to Read Before Your Kindle Battery Dies, Part 101
David H. Stern, Ph.D., Restoring the Jewishness of the Gospel: a Message For Christians


Rabbi Yeshua ben Yosef was Jewish. So were his Apostles and earliest converts. Even Paul, the “Apostle to the Gentiles,” never stopped calling himself Jewish. Though Yeshua’s ministry included stops among Samaritans and Romans, his outreach centered on fellow Jews, and included extensive citations from the Hebrew Tanakh. But after the last Apostles died, Yeshua’s ministry became dominated by Gentiles. During Bar-Kochba’s Rebellion, Jews and Yeshua-followers split permanently.

Dr. David Stern, a Princeton-trained economist and lifelong Jew, became persuaded in middle age that Yeshua was the Messiah of prophecy, the promised one to reunite the scattered people of Israel. He has written about this conviction since the 1970s, becoming a leader in the Messianic Jewish community. But, half a century later, confusion remains around what Messianic Judaism means. Dr. Stern hopes to solve some of that confusion.

History hasn’t been kind to Jews who believe Yeshua fulfills the Covenant prophecy. Such Jews were rejected by fellow Jews, particularly amid the violence enacted by Roman conquerors. Then Gentile Christians required Jews who wanted to follow Yeshua to abjure their Judaism, contrary to Scriptural messages from Jesus and Paul. Only since the mid-1800s has the Venn diagram of Judaism and Yeshua-belief reclaimed the overlap that once dominated.

Messianic Judaism isn’t Christianity. Stern emphasizes that, the few times the word “Christian” appears in Scripture, it always describes Gentile converts. Instead, Messianic Jews remain Jews, with all the controversy of identity that this entails (for instance, should we keep kosher? Stern and his family do, but he refuses to criticize others who don’t). As such, the Hebrew Covenant continues to govern his people, who continue living under the Law of Moses.

This causes some confusion, since Paul’s letters, and to a lesser degree the Gospels (particularly John’s), proclaim believers’ freedom from the law. Doesn’t that mean the Hebrew Covenant doesn’t apply? No, Stern insists, and he uses the historical context in which Paul and the Evangelists wrote to prove his point. He makes a persuasive case that, while Gentile converts live under a New Covenant, the Hebrew Covenant still governs Yeshua-believing Jews.

Jesus as a young Jew, painted by Rembrandt
Moreover, he writes, Christians owe Jews heavily. Stern describes what he calls “olive-tree theology,” based on an analogy Paul writes in Romans, Chapter 11, that all Israel is a cultivated olive tree, onto which new believers are grafted. New Yeshua-believers needn’t necessarily become Jewish (see Acts 15), but based on statements from Paul and Yeshua, Stern believes Gentiles are grafted onto Israel, and eventually both will become one tree.

(It’s worth asserting here that some organizations calling themselves “Messianic Jewish” aren’t Jewish in any true sense. Though they use Jewish liturgical language and selectively apply kashrut law, their theology is Evangelical Protestant. Dr. Stern doesn’t address these organizations; his interest is in actual Jews who consider Yeshua the Messiah of prophecy, and his message is unambiguous: Jews who become Yeshua-followers never stop being Jewish.)

As such, since new believers become joined into the Hebrew Covenant, Jewish believers have a different relationship with Yeshua than Gentile Christians do. Gentiles are invited into a covenant which begins with a relationship with Yeshua. But Jews already have their covenant; they don’t need to be invited in. Instead, for them, Yeshua becomes the culmination of a covenant they already possess, the fulfillment of Hebrew prophecy, and the unity of the Jewish nation.

Based on this, Stern believes it’s anti-Semitic for Christians to not extend outreach to Jews. Many Christians, he acknowledges, are justifiably reluctant to evangelize Jews, because historically, Christians have used high-handedness, even violence, to forcibly convert Jews. But Stern asserts this doesn’t mean we should refuse to evangelize; it only means our evangelism must begin in humility, willing to seek forgiveness for our forebears’ failure to follow their own teachings.

This synopsis does Stern’s theology a disservice. Though brief, his book is dense with references to Jewish and Christian history, deep dives into Scripture, and careful analysis of ways Yeshua and (especially) Paul speak from categorically Jewish foundations. He carefully translates these concepts into laypersons’ language, aware that he’s writing for a diverse audience. But he doesn’t shy away from complicated theological arguments.

Recently, I’ve heard much about Gentile Christians becoming curious about their religion’s Jewish foundations. Not long ago, the firewall between Christianity and Judaism was difficult to breach, but not anymore. Dr. Stern writes for Gentiles willing to learn more about where their religion originated, and Jews eager to discover the figure Yeshua who claims to be the Messiah. Both audiences will find much to discover.

On a related topic: Indigenous Jesus

Friday, October 25, 2019

The Real Cost of Fake Statistics

Jerry Z. Muller, The Tyranny of Metrics

We know the horror stories: schoolteachers ignoring larger goals and teaching for the test. Universities needing to create new administrative positions to handle the paperwork, then offsetting the cost by packing departments with adjunct instructors. Financiers booking profits for this quarter, while punting debits to the next quarter. Police failing to report serious crimes. Tales of trusted professionals manipulating statistics have become downright legendary.

Economic historian Jerry Z. Muller doesn’t unilaterally oppose metrics and other attempts to quantify abstract goals. They have important value, he writes, when applied appropriately. When used for internal analysis, say, or in low-stakes testing to determine progress on clearly defined goals, statistics can have powerful beneficial effects. People love metrics because, at first, they provide important information that improves our ability to pursue our purposes.

Problems arise, Muller writes, because people who aren’t familiar with important fields believe they can stretch the short-term benefits of metrics infinitely into the future. When small, underfunded schools’ funding, even their existence, gets threatened because students cannot ace standardized tests written in another state, for instance. When politicians intimidate police into keeping crime stats low. When hospitals falsify medical records to avoid Medicaid decertification.

Modern technocratic societies began demanding elaborate metrics to measure the immeasurable, according to Muller, for completely understandable reasons. New research in psychology and behavioral economics has demonstrated the subjectivity of human judgement. The demand for “transparency” in public affairs means people want abstruse topics translated into layperson language. And little seems more objective and transparent than plain, old-fashioned numbers.

In practice, though, this isn’t true. First, the metrics most often measure the areas easiest to measure, not what matters most. Schools measure questions with obviously right answers, not students’ ability to reason on nuanced or debatable topics. Financiers receive bonuses for sheer money made, regardless of stability. Militaries count bodies killed, not hearts and minds won. This results in very short-term thinking and reaching for low-hanging fruit.

This distorts the purposes served by “transparency,” and its close cousin, “accountability.” (Muller makes a persuasive case that transparency frequently isn’t desirable anyway, since it’s the opposite of nuance.) Knowing their funding and careers hang precariously, officials tasked with maintaining the metrics start gaming the system, chasing desirable outcomes. Safe streets, healed patients, and an educated citizenry become secondary to cramming the numbers.

Jerry Z. Muller
Further, this demand for metrics creates a massive mandatory bureaucracy. Government demands for more numbers from schools, universities, and hospitals have forced them to hire actuaries whose sole job is to write and refine government statistics. It’s become axiomatic recently to complain about the increasing number of administrators in public institutions. But they need those administrators to fulfill their accountability mandate—and the government needs bureaucrats to read the resulting numbers.

This comes at the expense of seniority and experience. Muller writes, correctly, that serious science calls human judgement abilities into question. But metrics are scarcely better, since they apply the judgement of the metrics writers—who frequently are outsiders to the field they measure—over the judgement of those who dedicate their lives to work, study, or practice in a field. For all its flaws, human discretion is invaluable when it originates from time-tested skills.

And don’t cite “run government like a business” here. Muller demonstrates, extensively, that corporations are just as procedural, and just as prone to distort numbers for indefensible reasons.

I’d go beyond Muller. When enormous bureaucratic institutions demand conformity to numbers, those numbers generally reflect the beige middle ground. Thus public schools with substantially Black or Hispanic student bodies, American aid agencies working in non-Western countries, and doctors treating patients ill-equipped to monitor their own wellness, eventually force the poor, the non-white, and the international to conform to middle-class White American standards that don’t apply.

Muller isn’t entirely pessimistic about metrics. From the beginning, he studiously distinguishes misapplications of statistics, the kind of horror stories we see on the nightly news, from correctly applied numbers. He praises appropriately applied numbers, particularly those used to measure internally, as read by experienced minds. If metrics measure what matters, and people accept metrics’ limits, numbers can have powerful beneficial effects.

Muller extensively cites Campbell’s Law, which, paraphrased, states that statistics, when misapplied, always distort what they measure. Though recent studies have demonstrated this apparent truism, we keep forgetting it. Clear back in the Victorian era, the poet and educator Matthew Arnold decried how top-down measurement efforts damage what they pretend to support. Maybe, by easing up on misapplied statistics, we can improve outcomes for everybody.

Wednesday, October 23, 2019

What Is This Thing We Call “Lynching”?

Gregory Peck as Atticus Finch
Several years ago, I had a small supporting role in a community theater production of To Kill a Mockingbird. As part of my role, I joined the crowd that came to lynch Tom Robinson. I had no lines, just some improv behind the speaking parts. This quiet anonymity let me watch events unfold from behind, permitting me to notice that this was an unrealistic depiction of a lynching.

With the President’s Twitter declaration yesterday that the perfectly legal and Constitutionally mandated impeachment proceedings against him are a “lynching,” perhaps it’s time to reconsider what that word means. Because, although the tail end of the “lynching era” remains within living memory, that’s getting further and further away. Fewer people remember what a lynching really meant, and it’s getting misrepresented by people with a dishonest message to peddle.

To Kill a Mockingbird depicts a crowd of lower-class toughs from the wrong side of town rushing the prison. Their courage boosted by liquor, they work one another into a lather of rage until, propelled by their sense of moral outrage, they distract the sheriff with false leads, thinking they’ll have an undefended jailhouse door. Except they get there and find Atticus Finch guarding it with his lantern, Diogenes-like. And, of course, his rifle.

This misrepresents lynchings in multiple ways. First, it distances both Harper Lee and her intended audience of Northern White leftists from the event by making lynchers into po’ white trash. Except the history of lynching makes clear that people from every social stratum attended lynchings, including members of the professional and upper classes. In some cases, lynchings became community events, complete with picnics and music.

Besides, lynchings didn’t happen under cover of darkness, and didn’t require calling law enforcement on a “snipe hunt.” The very nature of lynchings required them to happen in broad daylight, and the sheriff was as likely to hold the door as bar it. Because although lynchings were considered “extra-legal,” they weren’t really forbidden. The whole point was to showcase how the perpetrators of unlawful killings had no fear of consequences.

The President's original tweet.
By contrast, the Democratic leadership has been paralyzed by fear of consequences. Despite having a robust House majority and a President who seemingly can’t open his mouth without confessing yet another crime, Nancy Pelosi dithered for months before officially beginning impeachment inquiries, because she believed her party would suffer at the polls. This despite the proceedings being so completely legal, they’re the only trial proceeding defined in the Constitution.

Theologian James H. Cone calls lynching the closest American society had to the Roman tradition of crucifixion. Like the death of Jesus, the deaths of countless Black Americans weren’t simply intended to kill the target; they were intended to humiliate the target. And the real audience wasn’t the victim, it was the survivors, who received the unquestionable message: your lives are in our hands.

To understand how thoroughly unafraid of consequences White lynchers really were, realize that lynchings were documented with cameras. Not only did perpetrators photograph the bodies after they’d been killed, they photographed themselves with the bodies, and sold the resulting pictures as souvenir postcards. Lynching perpetrators literally boasted about their participation in violent murders, and shared images of themselves doing it, like we share Instagrams of our dinner.

In Mockingbird, the lynchers desist from their efforts when Scout Finch recognizes one of them and calls him by name. Certainly, if we acknowledge that lynchers photographed themselves with the body, getting recognized wouldn’t likely convince them to quit. But even if Harper Lee misrepresented lynching thus, it means she believed one thing: people who committed lynchings were capable of shame. And that’s where the analogy breaks down.

When we consider who, in the current arrangement, boasts about crimes, shares images of their transgressions, and turns a profit on the effort, it’s hard to avoid the conclusion that the lynching isn’t being perpetrated by the Democrats. Trump and Giuliani have gleefully confessed multiple crimes, sometimes on live TV, confident they’ll face no consequences. Meanwhile, the “checks and balances” fail to do anything, because they’re beholden to extralegal pressures.

To Kill a Mockingbird is possibly America’s most widely read book about race and racism; I have difficulty determining exact numbers, but I’ve read estimates that it’s read in as many as 75% of American schools. But it sanitizes racism by shifting the onus onto poor Whites clinging to society’s margins. It depicts lynching unrealistically. And this leads directly into the President misrepresenting lynching toward himself.

Friday, October 18, 2019

Scorsese and “New Hollywood” Part 2

Martin Scorsese's longing for the auteur-driven Hollywood culture which birthed his career is, as I wrote yesterday, naïve and anachronistic. But it isn't wrong. Because the conditions which assassinated New Hollywood around 1982 currently exist again. Only this time, it'll be much, much harder to dislodge them, for two reasons. And Scorsese's abhorred Marvel Comics movies, which caused the current controversy, are only part of one reason.

I stand behind my statement that franchise spectaculars like Star Wars provided a necessary countervailing force to New Hollywood's somber introspection, while also bankrolling more esoteric fare. Initially. But it didn't take long for the economic weight of these movies to change the companies that owned them. George Lucas hadn't finished his first trilogy before abandoning his planned story and turning Return of the Jedi into a toy commercial.

However, this rapid descent of the blockbuster ethos into silliness corresponded with the first reason it’ll be hard to fix. Under the Reagan Administration, the deregulation of Hollywood meant studios were merging, while also buying controlling interests in the distribution networks and cinema chains which got their products in front of eyeballs. Dying of ennui in 1974, the studios were flush with cash by 1984, and began using it to consolidate their market position.

Today the big-screen market has few controls: studios own each other, their distributors, the cinemas, and now streaming services. Disney, which Michael Eisner famously saved from extinction in the mid-1980s, has become a monolith, controlling Lucasfilm, Marvel, and now Fox. Amazon, once a pinch point in distribution, has become a studio in its own right. A few corporations so thoroughly control the supply end of the curve that, in essence, the entertainment market isn’t free.

This hasn’t been magic for the studios. The apotheosis of vertical integration happened with Roland Emmerich’s 1998 Godzilla remake, which opened on an estimated one-quarter of America’s cinema screens, and still died. The pattern has repeated recently, with Universal Studios’ planned Dark Universe dying in its cradle, and the DC Extended Universe apparently following suit. Though pitching to the middle is supposedly lucrative, audiences are provisionally leery.

But, as Scorsese notes, Marvel movies remain a cash cow. They’ve forestalled the terminal malaise that doomed other series, like Jason Bourne or Pirates of the Caribbean. Audiences still love them, which is the second reason studios have little motivation to abandon the blockbuster model. If, as I stated yesterday, audiences embraced blockbusters because New Hollywood became too repetitive, I can’t explain why this problem dooms some franchises, but not others.

Though the standard model holds that audience preferences drive the industry, that isn’t always so. People are naturally drawn to what’s already familiar, and resist the truly innovative. Movies now regarded as classics, from Citizen Kane and Night of the Hunter to The Big Lebowski and The Shawshank Redemption, initially died at the box office, because audiences found them too different. Risky movies often become successful only in retrospect.

So, between the entertainment industry’s monopoly practices, and audiences’ love for the familiar, there’s little motivation for studio bean-counters to approve anything too innovative. Studios once provided homes to executives like Alan Ladd, Jr., whose gut instinct on unproven properties like Star Wars proved insightful. Today, their decisions are controlled by algorithms which test whether proposed new properties sufficiently resemble what’s been successful in the past.

And because the studios also overwhelmingly own the distributors, they have manifest disincentive to permit smaller, independent producers to get into the business. Upstarts like Blumhouse Productions occasionally break into the distribution business, giving self-financed filmmakers glimmers of false hope; but overall, without a contract with the majors, don’t expect your film to ever go anywhere.

So, combining what I said yesterday with what I’ve written here, where does this leave us? Martin Scorsese’s naïve yearning for New Hollywood is a non-starter, because audiences don’t want that, and the film studios have become too consolidated to do something risky. Industry thinking has become very short-term. But the success of Marvel movies notwithstanding, box office receipts suggest “blockbuster fatigue”; even Star Wars is showing diminishing returns.

Conditions exist for something new to emerge. It probably won’t resemble either New Hollywood, with its self-consciously artsy flavor, or the blockbuster era, dependent on low-risk franchises. Because the five major studios have a stranglehold on the industry, it’ll take some executive willing to defy algorithms and accountants… which is rare currently. Maybe Scorsese can find and cultivate such an executive. Maybe we’re ready.

Thursday, October 17, 2019

Martin Scorsese and the Ghosts of “New Hollywood”

Martin Scorsese
Martin Scorsese, one of the few cinematic directors from his generation still making lucrative movies, garnered attention this weekend when he publicly dumped on Marvel Studios. Extending upon previous comments, he referred to comic book-based action movies as “not cinema” and said, “We shouldn’t be invaded by it.” Street-level commentators and bloggers like me have dogpiled on Scorsese for his comments. But on further consideration, I have my doubts.

I cannot help focusing on his phrase “not cinema.” After fifty-one years directing movies, surely this fellow understands what “cinema” means. Scorsese is among the final survivors of “New Hollywood,” an auteur-driven movement in filmmaking when studios, feeling unable to compete with television, handed unaccountable stacks of cash to ambitious directors and granted permission to go crazy. But does that give him permission to define cinema for everybody else?

New Hollywood began, critics largely agree, with Warren Beatty’s notorious over-the-top craptacular, Bonnie and Clyde. It was ultimately done in by one of its staunchest adherents, George Lucas… though we’ll return to that. During its generation, observers writing New Hollywood’s obituary actually blamed Michael Cimino, whose bloated ego vehicle Heaven’s Gate lost so much money, it killed United Artists. I contend, though, New Hollywood could’ve survived that debacle.

Undoubtedly, New Hollywood had serial flaws: as studios increasingly trusted auteurs, movies became longer and slower, prone to sententiously lecturing the audience and following protagonists on somber internal journeys. Not surprisingly, these protagonists were played by performers who also wrote and directed their own stories. Many such films are rightly regarded as classics now, but in the moment, they became massively repetitive, and audiences wanted something more.

The problem is, after fifteen years of unchallenged box-office supremacy, these auteurs thought they owned the concept of “cinema.” Influenced by movies emerging from postwar Europe and Japan, where pacing and visual effects were limited by the hobbled economy, these auteurs thought “real art” happened in the moments of contemplation where mostly male heroes lived inside their own heads. Relationships, action, and women characters were subordinate to that introspection.

Please don’t misunderstand me, I enjoy several New Hollywood classics. Robert Altman’s MASH, William Friedkin’s The Exorcist, and Clint Eastwood’s The Outlaw Josey Wales are among the best movies ever made. But taken together, the movement’s output becomes overwhelmingly uniform; these movies are enjoyable today because we can dip into earlier and later historical periods as necessary. Imagine how oppressive this uniformity must’ve felt in the moment.

George Lucas
Into this milieu came George Lucas. Though his first two features, THX1138 and American Graffiti, are firmly New Hollywood, both films, the latter especially, radiate stylistic nostalgia for the less self-conscious cinema that happened between the World Wars. Raised in an agrarian community with television as his lifeline, Lucas understood the world through more retro content broadcast on Saturday afternoons. This became the spine of his runaway breakout, Star Wars.

Here, not in comic books, is where the “invasion” Scorsese abhors began. Star Wars commenced the era of tentpole franchises. Its massive box office receipts also subsidized the technological innovations that made today’s intensely realistic screen graphics not only possible, but affordable. Perhaps most importantly, his characters acted rather than ruminating; even Yoda’s famously sententious homilies are made possible by the physical nature of his training regimen.

Long before Marvel, Hollywood discovered the lucrative nature of franchises like James Bond, Harry Potter, and Star Trek. Even films conceived as one-off enterprises, like Die Hard and Rocky, couldn’t withstand the demand for sequels to capitalize on existing momentum. The surfeit of sequels and series films, which critics have lamented throughout my lifetime, begins in 1977, when Hollywood realized Star Wars could bankroll their more esoteric projects.

Though it’s tempting to cite Scorsese’s age against his criticisms (he’s currently 76 years old), fuddy-duddiness doesn’t explain it, for one reason: his movies continue making money. Though his last wide-release film, the 2016 religious drama Silence, thudded on arrival, his recent CV includes such successes as The Wolf of Wall Street, Shutter Island, and The Departed. Clearly, unlike many of his contemporaries, Scorsese’s best work isn’t finished yet.

But when I heard his anticipated upcoming film, The Irishman, is three hours and nineteen minutes long, I cringed. No wonder he can’t handle Marvel films. Like Heaven’s Gate, which ran 3:39, Scorsese is directing for an audience that wants to sit for a really long time, an audience I’m not sure really exists anywhere. He’s continuing to write for New Hollywood. Which, sadly, is old news.


To Be Continued

Monday, October 14, 2019

Capitalism and the Common Cold

My friend Sarah caught an upper respiratory infection off a coworker recently. Like millions of Americans, this coworker, “Rachel,” felt compelled to ignore her illness, go to work, and potentially expose everyone else. To other workers, it’s probably a common cold—a nuisance, admittedly, but nothing catastrophic. But owing to asthma and a systemic hypermobility-related condition, Sarah has limited ability to fight routine infections. Colds, for her, often turn into bronchitis, and she’s out for weeks.

This got me thinking about the times I’ve bucked medical advice, chugged Day-Quil, and gone in sick anyway. Like millions of hourly workers, I don’t have compensated sick days; if I don’t clock in, I don’t get paid. And believe me, I’ve tried forgoing rent and groceries, with little success. Unless I’m too impaired to move, I have no choice but to ignore my illness and work. The same holds, sadly, for most nurses, fry cooks, and other low-paid workers in fields where infections spread easily.

During my factory days, one of only two times I got a stern talking-to about my work ethic involved attendance. I breathed a lungful of dust off some chemically treated paper, and spent a week flat on my back. My supervisor called me into a conference room and informed me that, notwithstanding my doctor’s note from the company clinic, I had missed what they considered a substantial amount of time, and was now officially on warning.

(My other stern talking-to involved getting angry at my supervisor, throwing down my safety gloves, and walking out. That’s a discussion for another time.)

My supervisor warned me that, even beyond the pinch I’d inflicted on my company, I had imposed upon my fellow line workers, who needed to offset my absence. Clearly, this warning conveyed, I had a moral obligation to ignore the signals my body sent me, and come to work. This was only one among many times when the messages I got from family, school, employment, and others told me that work was more urgent than protecting my bodily health.

Clearly Rachel got the same message, because she even lied to Sarah about how contagious she was. Even while continuing to sneeze on Sarah and other coworkers, Rachel insisted she was past the contagious stage. At this writing, Sarah has been housebound for a week, hooked to her nebulizer and struggling to feed herself. All because Rachel felt the social cue to not spread her cold mattered less than the moral imperative to keep working.

I cannot separate this morality from the capitalist ethic. Like me, you’ve probably spent your life bombarded by messages that work makes us happy, productive, and well-socialized members of society. Conversely, staying home, even when wracked with wet phlegmy coughs, makes us weak, lazy, and otherwise morally diminished. Our bodies aren’t something to respect and listen to; they’re impediments that need to be silenced so we can become happy contributors to the economy.

(As an aside, Sarah has already written about this experience. She and I discussed this experience, and tested ideas on one another; while she and I don’t say exactly the same thing, there are significant overlaps. My take is slightly less first-person.)

But who benefits when we power through and work sick? I certainly don’t; I feel miserable and sluggish, and also feel guilty for my inability to produce at accustomed levels. My employer doesn’t benefit, because he must pay standard wages for diminished outcomes—indeed, as I can’t rest and recuperate, he must pay more for my illness than if he offered paid sick time. And considering I must pay higher deductibles for off-hours doctor visits, my illness imposes on everyone.

In short, by making my continued attendance morally mandatory, I diminish everyone’s outcomes. Plus I infect everyone around me, including people who, like Sarah, can’t shrug off a cold. But I keep working, so hey, I benefit the capitalist class, right? So I accept the requirement to work, the risk gets socialized onto everyone around me, and my employer privatizes the gains. This is a distorted morality that literally prioritizes money over individual and public health.

Perhaps you think I’m overstating things, that we don’t really value economic outcomes over health. If so, try telling your employer that hourly workers deserve compensation so they can avoid infecting one another without missing rent. See how your boss reacts with moral outrage. More importantly, see how you feel the gut-clench of wracking guilt before you even speak. That’s the capitalist ethic trying to silence you. Because we’ve made common colds literally immoral.


Also on capitalist morality:
Capitalism, Religion, and the Spoken Word

Sunday, October 13, 2019

Indigenous Jesus

1001 Books To Read Before Your Kindle Battery Dies, Part 100
Steven Charleston, The Four Vision Quests of Jesus

What is Christianity to a colonized people? Can Jesus reach the descendants of those who have been forcibly converted in His name? Reverend Steven Charleston asked himself these questions as a young seminarian; he heard God's voice telling him to keep working, and that the answer would come to him. This book is the culmination of his life's work, and the resolution God has granted him.

Steven Charleston is an Episcopal priest and a citizen of the Choctaw nation. This double path colors his interpretation of Scripture. Half spiritual autobiography, half work of Christian theology, this book describes how Charleston came to understand what he calls "Native Jesus" by studying the four times He took friends with Him and undertook a classic Native American vision quest.

Charleston's people have been Christian since before Andrew Jackson chased them off their homelands. In his telling, the Choctaw invited Presbyterian missionaries into their communities, investigated their claims, and deemed their theology compatible with Choctaw beliefs. One suspects the history was somewhat rockier, but let's accept Charleston's account. His people know Jesus from the Native angle, a belief undimmed by subsequent violence performed by Whites wearing sacramental vestments.

In Matthew's Gospel, Jesus left the crowds four times, accompanied by very few friends, to have an intimate and personal experience with God. Charleston names these times as the Wilderness, the Mount of Transfiguration, Gethsemane, and Golgotha. Each time, Jesus received an important message from the Father. And each time, he returned to share it with the People.

To understand Christ's vision quests through Native eyes, Charleston first had to unlearn Christianity's burden of European cultural baggage. So, he contends, must we. Too often, Christian missions have been tacit imperialism (the word "propaganda" comes from efforts to convert Natives in South America). But when we shed European blinders, Native Jesus teaches us something new and magnificent.

Native Americans, like Jews, were a covenant people with a sacred relationship to their land. But, like Jews, Natives were conquered by a foreign empire, forced into exile, and continue to live in Diaspora. Jews and Natives use rituals to maintain their identities, while striving to protect their language from assimilation. The last century saw both peoples driven to the brink of extinction.

Reverend (Bishop) Steven Charleston
Viewed this way, Charleston writes, Jesus comes as the fulfillment of both Covenants. His coming, prophesied by John the Baptist, reflects the relationship between the Clown and the Prophet in Pueblo and Plains tradition. His anguish at Gethsemane is His version of the Lakota Sun Dance. And His death on Golgotha opens Him up to all the spirits of humanity combined.

Don't mistake this, though: Charleston clarifies that the Native and Jewish Covenants are not interchangeable. Natives don't believe humans sinned at the moment of Creation, for instance, so Jesus's death cannot be seen as substitutionary penance, as European theologians paint it. No, Native Jesus does something different in that moment, something so complex and revolutionary that I'm scared to cheapen it through synopsis.

From the beginning, Charleston identifies this as his personal theology, achieved through his own vision quests. He doesn't claim to speak for Native Americans generally. However, he cites diverse traditions, especially Lakota and Hopi; names the experiences of historical figures like Sitting Bull, Pocahontas, and Wovoka; and justifies his Native Christianity through the various perspectives of North America. What he writes is both intensely personal and equally applicable to others.

It's also eye-opening for non-Native readers. We've grown up surrounded by a Christian message that, at times, doesn't come from the Son of Man. We interpret Christ's mission through a cultural prism that certainly makes it comprehensible to ourselves, but distorts the real message when speaking across cultures. When we believe that we have uniquely universal understanding of what Jesus accomplished, we become arrogant, which can be the first step toward colonialism.

By paring the cultural sediment off Christ's actions, Steven Charleston doesn't create something new. Instead he reveals the glory that has remained unvisited underneath, reminding readers like me that we don't own Jesus. In creating a theology for Native Americans, Charleston also prohibits me from becoming comfy in a self-granted salvation. He reminds me that God, not I, decides what is Truth.

This book runs remarkably short, barely 160 pages plus back matter. Yet reaching the end, we feel we've undergone an intense journey, and emerged transformed. No, this book isn't a vision quest itself. But reading it, I feel Charleston prepares us for our quests, reminding us that our vision matters.

Saturday, October 12, 2019

15-Minute Egyptian Chess

David G. Royffe (game designer), Pylos

We sometimes hear the phrase “three-dimensional chess” as a metaphor for complicated thinking. But most attempts I’ve seen at creating actual three-dimensional chess variations have fallen short. I like the idea of a game that forces players to think both vertically and horizontally: it increases the complexity while remaining within the bounds of human comprehension. With Pylos, I’m one step closer to finding real 3-D chess.

The board is slightly less than twelve inches on a side. The pieces are small spheres, each slightly larger than a shooter marble, fifteen each in light and dark colors. Players arrange these thirty spheres in a pyramid shape; the winner is whoever places their sphere at the apex. The rules are so brief, they fit on one page. Sounds simple, right? Well, like Go, the simplicity conceals layers of nuanced strategic thinking.

Promotional photo

Sadly, that Go comparison only carries so far. The much smaller board and fewer pieces result in much more circumscribed options for strategy; with practice, I would assume your greatest advantage comes from learning to read your opponent. Trying to anticipate another player’s moves in three dimensions creates more subtlety than the pieces alone provide. That, I suspect, makes this game well-suited to family game nights.

I have mixed feelings about this game. I’ve enjoyed playing it, and it does have enough complexity to unfold in different ways and create several variations. However, speaking as a beginner, it doesn’t feel like it takes “a lifetime to master,” as promotional literature claims. Having played it a few times, I find my hands falling into a comfy place. Unlike chess, Go, or Onitama, this game has a finite feel that I cannot quite shake.

Overall, I enjoy playing this game; it’s quick but complex, easy to learn but difficult to master. But like poker, I suspect the greatest complexity comes from the other player, not the game itself. This isn’t a criticism; I certainly don’t intend to stop playing anytime soon. It’s just a recognition that the board places limits on the game. The real challenge is your perceptions, not really the game.

Wednesday, October 9, 2019

Capitalism, Religion, and the Spoken Word


Every shift at the factory began with our line supervisor reading a sheet of exhortations. She’d begin by belting out, in a voice to beat the machinery: “What are our top three goals?” And we’d respond in unison: “Safety! Quality! Productivity!” The sheet then transitioned into a list of instructions, things like “Check your machines are in good working order before using them,” and “Keep your workspace clean.” Basic stuff, common to most industrial workplaces.

I worked at this factory for months before I realized this sheet wasn’t busywork. By making everybody participate simultaneously, and requiring us to chant parts of the script in unison, management was steering everybody’s thoughts toward the requirements of work at the beginning of the shift. The unified participation forced us to leave outside obligations outside, unify our thoughts, and shift our brain rhythms toward work. We have a word for this. It’s called “liturgy.”

Liturgy is the verbal assertion of what religious people believe. Perhaps it seems silly comparing industrial labor to church, but bear with me. The order of worship in Christian churches; Islam’s five daily prayers; the tightly scripted mealtime recitations on Jewish High Holy Days—all these are liturgy, and they serve to unify everyone involved in one goal. By reciting liturgy together, believers stop being individuals, and become one coalescent body. Many souls become Soul.

Religions encourage this unity because individuals are necessarily arrogant. The untethered mortal frequently becomes an instrument of appetite, consuming and consuming without ever becoming full. Our culture likes the archetype of the nonconformist bohemian, but it’s an ideal very seldom realized; most people, including myself, can’t be trusted as individuals. We need community and the shared experience of others to restrain our animal desires and become completely human; liturgy is one way to achieve that.

Émile Durkheim wrote, clear back in 1912, that liturgy makes participants speak their values aloud, together. It isn’t enough to privately affirm our beliefs, and treasure their truths in our hearts; anybody can do that, but life’s constant strains force us to compromise our values. We’re all occasionally hypocrites. Speaking our values aloud, together, reminds us not only what we believe, but that we don’t struggle alone. Religions with sturdy liturgy see very little apostasy.


Many non-religious groups recognize this unifying power in reasserting what we believe in public, in unison. That’s why public schools require students to recite the Pledge of Allegiance, and why it caused such controversy when football players elected not to participate in the National Anthem. For national and governmental purposes, refusing to participate in these acts resembles refusing to speak the Apostles’ Creed and sing the Kyrie Eleison in church. Such refusals risk generating widespread national apostasy.

For capitalists to embrace liturgical practice serves two purposes. First, it gets everybody in a work mindset immediately. At the factory, we needed to reset our mental rhythms to the pace of the assembly line, without hesitation. Our shared chant, with its almost Shakespearean cadence, accomplished that. At my current job, we have no such liturgy, and getting started on any meaningful work thus requires thirty minutes of grumbling and fumbling as our brains realign.

Second, capitalist liturgy forces us to accept, on some level, capitalism itself. Sure, not everybody who speaks the Kaddish or the Nicene Creed believes the words, but they at least give some level of assent to the principles, making themselves bearers of the words’ value. Likewise, workplace chants, company songs, and the tradition (most common in Japan) of calisthenics at the top of the shift, make workers leave their identities outside and become, temporarily, Employees.

In other words, workplace liturgies, like religious liturgies, make us subjugate our identities to The Other. Whether that Other is God or Capitalism matters only sub-structurally; both approaches get us to stop being individuals. The structure of Church and Capitalism bears remarkable similarity. That doesn’t mean the sub-structural qualities don’t exist; Church calls us to stop being individuals to serve humankind, while Capitalism wants us to serve Capitalists. But they structure it the same way.

Perhaps that’s what makes Capitalism so difficult to unseat, even as we workers look outside and see our labors making someone else rich. Our conscious minds know we aren’t achieving the promise of Capitalism, but we’ve liturgically committed ourselves to the capitalist ideal. Changing our minds now wouldn’t make us merely non-capitalists; it would make us apostates. Just as leaving religion can be terminally painful, abandoning Capitalism forces us to abandon the words we’ve spoken.

Wednesday, October 2, 2019

Not Gonna Take It Anymore

Eric Blanc, Red State Revolt: The Teachers’ Strike Wave and Working-Class Politics

2018 saw a sudden upsurge in teacher strikes and other labor actions in several American states, mostly states that went heavily for Donald Trump. This strike wave defied multiple accepted theories among the punditocracy: that strikes are outmoded, that the “white working class” represents an ideological monolith, or that labor action does no good. What made 2018 special? Can American labor do it again?

NYU sociologist and former high-school teacher Eric Blanc was commissioned by Jacobin magazine to cover the 2018 teachers’ strikes. He focused on three: West Virginia, Oklahoma, and Arizona, the states where multi-day walkouts resulted in significant concessions from conservative governments. What Blanc finds has eye-opening implications for organized labor. But I question how portable these insights are.

Schoolteachers, like other skilled professionals throughout the American economy, accepted austerity as the necessary condition following the 2008 financial crash. But ten years later, despite putative recovery, their wages, benefits, and working environment remained locked in post-crash conditions. School districts granted waivers to put non-credentialed teachers in front of classrooms. Insurance was getting adjusted downward amid a supposedly hearty economy.

However, as Blanc observes early, “Economic demands are rarely only economic.” Schools in many states, especially states with historic Republican governments, have been long neglected, with class sizes exceeding what qualified teachers can handle, physical plants in disrepair, and an adversarial relationship between legislatures and teachers. Educators didn’t only strike for improved pay and insurance; they felt the state had denied them the authority to teach.

The action began in earnest in West Virginia. Donald Trump won this state with a two-thirds share, and Republicans with an anti-organized-labor stance controlled the statehouse. Admittedly, West Virginia has a longstanding union tradition, dating back to the Coal Wars of the 1890s. It was once a Democratic Party stronghold. But like many formerly Democratic states, West Virginia grew disgusted with Democrats running on center-left promises and governing on right-wing principles.

Blanc provides generous evidence that, since at least Jimmy Carter, Democrats have consistently fielded a lite-beer version of the Republican economic agenda. Both parties have repeatedly cut public education funding, mandated standardized tests written by private contractors, and shifted financial responsibilities onto local communities unprepared for the burden. Teachers’ unions have complied with this trend, apparently on a devil-you-know basis.

Striking teachers in the West Virginia statehouse, 2018 (CNN photo)

West Virginia’s strike combined old-school organization with innovative grassroots action. The state’s teachers were divided among three competing unions (which isn’t uncommon in right-to-work states), so coordination had to begin with the membership. While union leaders feared upsetting the apple cart, educators and, importantly, support staff organized online, including on much-despised social media, to create pressure from below. It ultimately worked.

America hadn’t seen a statewide teachers’ strike since 1990. Accepted wisdom for an entire generation held that strikes created bad blood and undermined communications between labor and management— and sometimes, they do. But compliance with authority hadn’t produced any better results, either. When union membership pushed a strike authorization vote, almost ninety percent of West Virginia teachers supported a walkout. The die was cast.

Inspired by West Virginia, teachers in Oklahoma and Arizona elected to strike. But these states learned largely opposite lessons from the experience. Oklahoma gave remarkable power to non-union firebrands who had great energy, but no organizing experience. Importantly, Oklahoma forgot to include support staff in their organizing efforts. Arizona fared better, taking time to lay groundwork for an unprecedented strike action in possibly America’s most Republican state.

Blanc basically provides an oral history of the three movements. Why did West Virginia and Arizona succeed, while Oklahoma resulted in a split decision at best? And why did other states with similar grievances, like Kentucky and Colorado, manage just one-day walkouts with only symbolic effects? The answers to these questions largely exist in participants’ testimonies, and in the Facebook groups where grassroots members created momentum.

Also, Blanc acknowledges significant limitations to the model he describes. Teachers were able to engage community support because their local connections cross class boundaries in ways, say, auto workers cannot. Blanc admits the strikes avoided addressing race issues, which isn’t insignificant; as Ibram Kendi points out, labor unions have long been bastions of White protectionism. Ignoring race might work for one strike, but isn’t sustainable.

Still, even if Blanc’s account doesn’t create a blueprint for revitalized American unionism, it points toward ways workers can build countervailing power against capitalist might. Teachers broke the taboo against labor stoppages, and proved that sheer numbers can reverse intransigence. They didn’t solve everything, certainly. But they proved change is possible.