Friday, February 28, 2020

Robert McNamara's Very Long Afterlife

1001 Movies To Watch Before Your Netflix Subscription Dies, Part 37
Errol Morris, The Fog of War: Eleven Lessons From the Life of Robert S. McNamara


President-elect John Kennedy tapped Robert McNamara as Secretary of Defense mere months after he’d become the first Ford Motor Company CEO unrelated to the Ford family. McNamara’s experiences in World War II qualified him as America’s leading civilian authority on military matters, but, McNamara admits, it was his business successes that really attracted JFK’s attention. His attitudes as a corporate bean-counter originated in the Pacific Theatre, and carried over into Vietnam.

Documentarian Errol Morris pioneered an interview technique in which subjects speak directly into the camera, Morris himself is mostly silent, and he permits his subjects to keep speaking until they reveal something true and awful about themselves. McNamara, eighty-five years old when Morris interviewed him, proves well-suited for Morris’ technique. In post-production, Morris supplements McNamara’s interviews with advanced graphics, rare archive footage, and a stirring Philip Glass soundtrack.

McNamara unfolds his story thematically, rather than sequentially. In his telling, the story begins not with Vietnam, nor his years spent fighting in the Pacific, but with the Cuban Missile Crisis. These thirteen days define his memories of government service, because they demonstrate the give-and-take that crisis demands, and also because they demonstrate that even the most rational actors will behave in ways that seemingly defy reason. War, McNamara discovers, turns sane people irrational.

From there, he unfolds his life backward and forward. He briefly touches on his early life, and his Harvard teaching career, before diving headlong into World War II. There he served under General Curtis LeMay, one of history’s most effective commanders. LeMay taught McNamara important lessons about efficiency, about computing relevant data to achieve desirable outcomes for his side. Despite his tough-talking rugged reputation, LeMay was an early technocrat.

But LeMay, with his officers’ complicity, also pioneered techniques of Total War which targeted civilian populations. McNamara confesses to organizing a bombing sortie over Japan that, he says, killed over 100,000 civilians in one night. “Were you aware this was going to happen?” Morris asks. McNamara replies: “Well I was, I was part of a mechanism that in a sense recommended it.” He continues: “He [LeMay], and I’d say I, were behaving as war criminals.”

Only after this lengthy preamble does McNamara graduate to the conflict everyone associates with his leadership: Vietnam. He describes a conflict premised on false ideas, political saber-rattling, and useless patriotic fervor. President Johnson justified bombing North Vietnam as retaliation for an attack which, McNamara reveals, later proved never to have happened. “Believing and seeing,” McNamara says sanctimoniously, “are both often wrong.”

Errol Morris (left) and Robert McNamara

In interviewing McNamara, Morris reveals a man riven by incompatible desires. McNamara wants to take an accounting of his life’s accomplishments, good and ill; yet he repeatedly kicks responsibility for his greatest failures up the chain of command. He believes in efficiency, data, and accountability, yet also distrusts rationality and evidence. He desires to be completely honest, yet stands behind the importance of lies told forty years earlier.

Throughout his tenure, McNamara describes conflicts inside the administration. He and President Johnson had very different visions of how to prosecute the war. Curtis LeMay, by this time Air Force Chief of Staff, believed in completely wiping out any resistance, and advocated nuclear annihilation of any opposition. McNamara hated this idea, believing that you can learn from mistakes in conventional war, but not nuclear war.

Occasionally, McNamara displays humility enough to recognize the times he believes his own propaganda. Besides admitting his war-criminal behavior in Japan, he also describes meetings in the 1990s with Fidel Castro and members of the North Vietnamese government, when he discovered his opponents believed almost the opposite of what the war information machine insisted. Rather than realize what he didn’t know, McNamara often accepted his own agitprop, with catastrophic consequences.

Working together, Morris and McNamara distill his experiences, conflicts, and doubts into eleven portable lessons. (They actually found twenty-one, but needed to cut for time; if you watch the DVD, the other ten are buried in special features.) Taken together, these lessons display a worldview that appears optimistic for the long term— McNamara sometimes sounds remarkably dove-ish— but bleakly fatalistic about the present.

Released in 2003, when America was getting into its biggest overseas conflict since Vietnam, this movie was clearly intended to comment on subjects outside itself. It certainly does that. Like the best literature, it ultimately isn’t about its nominal subject; it’s about us, the audience. It’s about what we accept and tolerate, and what we consider finally intolerable. And what, like McNamara, we’re willing to paper over.

Wednesday, February 26, 2020

When Rural News Becomes World News



Nebraska generally only makes national and world news when something awful happens. Charles Starkweather, for instance. But the most common reason Nebraska, where I live, makes the national headlines is extreme weather: blowing snows closing Interstate 80, one of America’s major transcontinental arteries, or droughts threatening meat prices. In spring of 2019, record snows followed by record rains destroyed a huge portion of my state.

I recall this not to elicit your sympathy, or to tell anyone they should pay Nebraska better attention, but because my state isn’t alone. Just days ago, I discovered that similar history-making rains are currently destroying eastern Kentucky, a state with a similar economic backbone of agriculture and light industry. I remember, in 2019, complaints from flooded areas that mainstream journalists were ignoring the devastation. This year it’s happening again.

Many major journalism outlets don’t even have stringers positioned in the regions hit worst by environmental destruction like this. Sometimes, on those rare occasions when Nebraska makes national headlines for weather-related devastation, national outlets will buy printed reports or camera footage from locally based reporters. The AP apparently keeps a paid stringer in Lincoln, in case something newsworthy happens. He doesn’t file very often.

Not that national venues ignore rural America altogether. Barely three weeks ago (as this posts), national-grade journalists flooded Iowa for the 2020 Presidential caucuses. They slavishly followed candidates (mostly Democrats) around the state, recording them knocking on doors, shaking hands, and ginning up morale among volunteers. Then the caucuses happened, the candidates moved on, and the journalists folded their tents and followed. The Midlands retreated to insignificance.

St. Louis-based journalist Sarah Kendzior writes that Americans living in “flyover country” grow accustomed to getting routinely ignored. Life in the middle of the map requires a reëvaluation of what we consider “fair,” Kendzior writes, because the difference between the richest and the meanest citizens is increasingly small. Those who muster enough money, or have so little they can affordably walk away, leave this region for coastal cities, which have something ours don't:

Jobs.

Please understand, this isn’t sour grapes. I understand that national and world news happens in major cities, which have access to money, instant media technology, and populations ready to attend camera-friendly events. Cities create a reciprocal relationship between reporters and events: things happen in places which have journalists ready to report. And journalists stand ready to report where things happen. This feedback loop makes simple sense.

But in the near future, what events demand reporting will probably change. Until now, human-made circumstances, like economics and technology, have demanded journalists’ attention, and journalists have obliged. Going forward, what we euphemistically term “acts of God” will make national and global headlines, and places with sparser populations will get hit first and hardest. Nobody in those areas is prepared to report; the media apparently expect somebody competent to rush in later.

This pattern has already begun. Besides Nebraska, the entire Missouri and Mississippi River Valley system saw record-setting floods in 2019; the Mid-Atlantic region was socked by massive floods in 2018; the Lower Mississippi was devastated in 2017; and, to quote USA Today, “U.S. had more floods in 2016 than any year on record.” Devastating weather events are becoming normal… if you see and recognize the pattern.



A pattern which even I, a dedicated news-follower, completely missed until it hit my home last year. Floodwaters from the 2019 Nebraska floods came within eight feet of my front door. I was personally fortunate, because the water never actually got inside my building. Several friends weren’t so lucky; some were driven from their homes when water entered. One friend had her home flooded a second time while still making repairs from the previous flood.

In 2019, Nebraskans complained on social media that our state’s devastating floods got ignored in national media. This was perhaps an exaggeration; sources like the New York Times and Wall Street Journal did, indeed, run wire-service reports on the flooding. But the reporting usually got buried in “regional news,” ran only two or three paragraphs on websites, and otherwise received nodding acknowledgement without much real, attention-grabbing coverage.

Now it’s happening again. Floods are socking eastern Kentucky, an area journalists ignore except during massive coal-mining disasters. The pattern becomes clear to those who see it: the places where the vanguard of environmental damage is happening are least likely to have coverage. Thus it becomes easy to pretend life continues unchanged. Because destruction is happening, but nobody tells us about it.

Monday, February 24, 2020

Machiavelli, in His Time and Ours

Patrick Boucheron, Machiavelli: The Art of Teaching People What To Fear

Sometime around 1513, disgraced Florentine diplomat Niccolò Machiavelli wrote a short treatise on government. Though published only posthumously, The Prince gained such influence that it attracted ire from Counter-Reformation clergy, and the Catholic Church banned it for centuries. Modern critics still argue about how seriously to take the book’s precepts. Everyone seemingly has an opinion, regardless of whether they’ve read it, and Machiavelli’s name has become a political byword.

French historian Patrick Boucheron thinks Machiavelli, as a man, means something different than his most famous book implies. Machiavelli wrote amid tempestuous times, when the Renaissance, the Reformation, and the Italian Wars made life chaotic and unpredictable. And, Boucheron believes, we face similar tumult today. So time has come again to read Machiavelli, understanding his entire corpus, within his historical context. Boucheron accomplishes this eloquently.

Machiavelli emerged from Republican Florence, born into what we’d consider today the upper middle class: common citizens, that is, but relatively comfortable. But within his lifetime, the Medici family privatized Florence’s public domains, advancing the cause of nascent capitalism. Machiavelli used family connections to land a lucrative appointment to the chancery of the Grand Council, giving him a bird’s-eye view of civic governance— and of democracy’s rapid decline.

Even before writing the political texts which made him immortal, Machiavelli left copious written evidence through his personal letters, state documents, and other writings. Boucheron reconstructs Machiavelli’s biography from the innumerable records he left, many of which survive in his own handwriting. Being neither a nobleman nor a true commoner, Machiavelli saw most of the civic order with an outsider’s perspective, a view furthered by his extensive diplomatic journeys and international embassies.

But the Italian Wars saw Machiavelli’s beloved Grand Council abolished, and Machiavelli himself exiled to his ancestral estates. Reduced to country yeomanry, he rediscovered his childhood love of learning. He spent hours engaged in liberal arts studies, as Boucheron quotes him, conversing with the greats of ancient Greece and Rome. He would alternate between reading the classics, and writing his own books, which sought the connection between ancient literature and the Florentine Renaissance.

Patrick Boucheron
Boucheron combines biography with literary criticism to guide audiences through the complex thorn-bush of understanding Machiavelli’s work. Surely the man himself understood how difficult and morally complex his writings appeared. Famous among the populace for comedies like The Mandrake and The Golden Ass, Machiavelli preferred to couch his philosophy for commoners in art and imaginative literature; ordinary Florentines probably considered him a reclusive belletrist.

Meanwhile, political works like his authoritarian The Prince and his small-r republican The Discourses circulated among the city’s intelligentsia in manuscript format. Machiavelli attracted a following among the upper crust, and apparently conducted salons, discussing Latin classics for the moneyed aristocracy. (Boucheron notes that Machiavelli never savvied Greek, which excluded him from true membership in Italy’s burgeoning patrician Humanist movement.)

Despite being an École normale scholar himself, and writing about one of history’s most controversial authors, Boucheron keeps everything in vernacular language, staying completely away from academic jargon. Perhaps this reflects the book’s origins as thirty short weekly broadcasts on French educational radio: form follows function. It may also reflect award-winning translator Willard Wood, who keeps both academic accuracy and linguistic clarity tied for number-one position.

Notwithstanding one false hope, Machiavelli never regained his political standing within Florence, as it flip-flopped between autocratic principality and madcap democracy. Since his most important works got printed for wider distribution only posthumously, we can only speculate how Florentine aristocracy perceived him. After all, as Boucheron notes, once his diplomatic dispatches stopped, the major source for Machiavelli’s life became his personal journal, which reflects internal turmoil more than external acclaim.

Therefore Machiavelli descends to current readers, not as a person, but as a legacy preserved in others’ controversies. Critics debate exactly how literally to take his political positions, especially since he contradicted himself from one manuscript to another. Boucheron doesn’t eliminate this controversy; he only guides readers to participate in the debate better informed about the author. Even he admits that Machiavelli frequently intended less to be taken seriously than to provoke an otherwise complacent audience.

This book’s main body runs barely 130 pages, and many pages are illustrated; few chapters exceed three pages. Eager readers could consume it in one ambitious Saturday, though I’d recommend spending longer in thought and rumination. An introduction written for the American edition admits Boucheron chose this subject specifically to address the Trump influence in global politics. Boucheron believes Machiavelli speaks to times as ungovernable as his own. Surely this counts as one such time.

Friday, February 21, 2020

The Enemy of My Enemy's Enemy is My... What?

Bernie Sanders
I shouldn’t pretend I ever believed the American Left was a monolith. Despite growing up Republican, believing anything more progressive than Spiro Agnew was a threat to American greatness (or something), I’ve always understood that Leftism wasn’t a unified political front. But watching the 2020 primary has really revealed how little American Leftists have in common. We’re unified by what we oppose, not what we favor, and that scares me.

Like many ex-Republicans, I’ve long understood conservatives weren’t a united front. American conservatism is a loose coalition of religious traditionalists, Ayn Rand agnostic libertarians, strong-state war hawks, small-state fiscal restrictionists, and others. These conservative groups share occasional core beliefs, like the idea, explained by CUNY professor Corey Robin, that society’s power hierarchy exists naturally, and shouldn’t be monkeyed with. But overall, they’re as divided as they are unified.

But somehow, it’s only becoming obvious to me now that American progressives have this same division. Though we share an idea that fairness exists, and our economic and social order should be amended to reflect our values of fairness, we lack any through-line of what fairness actually means. The Democratic Party debates have highlighted this underlying lack of agreement, to say nothing of those too far left for the Dems.

What vision of fairness unifies the Left? I struggle to identify any consistent through-line. I’d like to imagine economic fairness looms large, since the last Democrat to win the presidency with a simple majority, Barack Obama, promised sweeping change during the 2008 housing bubble meltdown. Sure, his policies were somewhat less than revolutionary. But his promises of economic transformation fired the base, and got frequent non-voters to the polls.

Elizabeth Warren
Economic fairness, though, proves totally elusive. Trade unionists, the backbone of the Democratic Party’s coalition under FDR, fled during the Johnson and Carter administrations, leaving Democrats flailing for a generation. The group that organized specifically around demanding fairness in pay and workplace conditions currently has limited apparent interest in progressive voting. This probably reflects Reagan-era union-busting, but I think it runs deeper.

American trade unions have traditionally been White, male, and dominated by citizens. Despite the AFL being founded by an immigrant, Samuel Gompers, unions have frequently held an unacknowledged subtext of racism, sexism, and nativism. Beginning in the 1960s, when Lyndon Johnson threw the Democratic Party’s support behind Civil Rights, while tying it to an increasingly unpopular Asian war, unionists began leaving the party.

Thus, arriving at my construction job daily, I see manual laborers with hourly wages—the people most likely to be displaced by the current administration’s economic policies—wearing red ballcaps, or having “Trump 2020” hand-painted on their hard hats with acrylic paint. Because the party traditionally dedicated to economic fairness has also affiliated itself with other forms of fairness, which its traditional base sees as an attack.

Racial fairness, for instance, requires letting Black and Brown people into jobs which unions historically protected as all-White. Fairness for homosexuals, transgender people, and other Queer groups, can feel threatening to people who define themselves as assertively male, heterosexual, and nuclear-family-oriented. And, not incidentally, the political Left has long attracted atheists and agnostics, while White traditionalists often treat religion, especially Christianity, as a core identity issue.

A brief overview of Leftism reveals a coalition deeply sundered. Karl Marx’s writings reveal deep-seated prejudices against not only obvious enemies, like the rich, but also less-obvious foes. Marx hated smallholders and farmers, whom he called “lumpenproletariat.” He openly described non-Whites as “primitive” and “savage.” Marx’s later followers, including the Black Panther Party, have needed to read his writings selectively, to expunge his deep social conservatism.

Donald Trump
Right now, the two leading Democratic contenders talking about economic fairness, Bernie Sanders and Elizabeth Warren, are members of groups which traditional unionists marginalize. Sanders is Jewish and agnostic; Warren is a woman. Though I suspect many of my coworkers would nod sagely if I read these candidates’ platforms aloud, once I mentioned their names, my coworkers would retreat into their nativist cocoons.

No wonder centrists and lite-beer Republicans, like Biden and Buttigieg, persist so adamantly.

Democrats will probably win in November, because opposition to the spray-tanned monstrosity in office will coax frequent non-voters to the polls. But that isn’t enough to build a persistent coalition. Unless American progressives find ways to link their various definitions of fairness together, 2020 will prove a fluke. Leftists are linked, right now, by the walking embodiment of what we hate. But we’ll never survive unless we can unify over what we actually love.

Monday, February 17, 2020

I’m Officially Sick of the Oxford Comma!

I’d never heard the term “Oxford comma” before my final year of college. Looking back, that seems weird, because as a student who wrote at an advanced level from an early age, I’d gotten front-loaded into upper-grade writing classes and taught a panoply of grammatical rules throughout school. Only as a college senior, writing a paper for a theatre history course, did a professor circle my missed comma and ding me points for unclear writing.

Since the advent of social media, though, the Oxford comma has become an issue of serious, or anyway po-faced, contention. On one side, Oxford comma proponents shout loudly that it’s the only barrier between meaningful writing and complete gibberish. On the other, “common use” grammarians call this rule pretentious and unnecessary. The most common pro-Oxford argument derives from internet memes, because memes are good for simplistic arguments. One common meme looks like this:


Except for one thing: no reasonable person would think the writer meant Washington and Lincoln are rhinoceroses. My fifth-grade spelling textbook explained list commas thus: within the list structure, commas hold the place otherwise occupied by the word “and.” Since the final two list items are already separated by the word “and,” the comma is strictly optional. Not that it was necessary, nor that it was wrong; it simply boiled down to aesthetic choice.

In my experience, there are two types of grammatical rules. The first is rules we identify by describing how ordinary people use language in ordinary circumstances. We’d call it “a rule,” for instance, that English word order is fundamentally Subject-Verb-Object, “I kick the ball.” We’d note occasions which reverse that order (“The ball is kicked by me”), but we’d consider this non-standard, because it’s wordy and awkward, and we’d give it a special name, “passive voice.”

The second type is rules we decree from above, and expect others to observe. Rules like “You can’t end sentences with a preposition,” or “You can’t split infinitives,” are invented by grammar mavens to create a more specific and precise language. But we feel strong aversion to these rules, partly because we understand they’re artificial: I know you can split infinitives, because I’ve done so many times. But important people tell us we can’t.

Recently, an article crossed my desk entitled Take That, AP Style! Court of Law Rules the Oxford Comma Necessary. The argument isn’t new; the link was nearly three years old. It’s common knowledge that legal documents are ruled by rigid grammar, which courts enforce to eliminate any suggestion of ambiguity. Iron-clad rules are necessary in law. But do we want lawyer-ese to set our civilian writing standards? Must all writing resemble your student loan agreement?

My friend who shared this link wrote: “THE OXFORD COMMA IS CIVILIZATION. This is a hill I will die on, or gladly make *others* die on. And now it is a part of legal record!” I’d question whether my friend’s own writing matches his supposed style, since internet writing encourages hyperbole. Surely my friend doesn’t literally intend to kill anyone over punctuation. Yet if legal contracts are our grammatical model, he’s required to gird himself for battle.

This isn’t academic vapor, friends: battles over grammatical rules matter. Specifically, they matter because they determine who gets to participate in civil debate and leadership. Historically, grammatical “rules” are invented by aristocracy, or those who consider themselves aristocracy, to make language so obscure that ordinary people, with their public-school educations, can never understand it. By this technique, the poor, second-language learners, and others who will never savvy all the aristocracy’s rules, will never ascend to leadership.

In fairness, my friend, a schoolteacher, will say he must enforce stricter rules than I, because he must instruct children in the expectations they’ll face in careers, media, and technocracy. As a former teacher myself, I appreciate that. But as I told my students, grammar “rules” aren’t rules like baseball, where we follow rules because they’re rules. Grammar rules are more like etiquette, knowing how to show others respect by living up to their expectations.

When one damn comma becomes life-or-death, and we shame people (or, as in the meme, treat them like they’re too stupid to understand context) over a literal stroke of the pen, we’re functionally silencing those who don’t conform. We’re telling the poor, and people with learning disabilities, that they have no place in public life. We’re creating a more hostile public sphere. And that’s something I, a teacher at heart, can never accept.



See also:
The Grammar Police State
me write essay on talk nice for you

Friday, February 14, 2020

Jesus Christ, Master Speechwriter

Lois Tverberg, Reading the Bible With Rabbi Jesus: How a Jewish Perspective Can Transform Your Understanding

The Bible wasn’t written for Westernized Europeans living in the 21st Century. Let’s start with that premise. Like most literature, it was written for an intended audience that shared certain cultural touchstones and spoke a shared language. The Scriptures from which Jesus Christ and the Apostle Paul quoted extensively were written in Hebrew, a language which doesn’t share our modern correlation between “word” and “meaning.” And that’s saying nothing of non-literal meanings hidden below the surface.

Lois Tverberg isn’t the first Hebraicist to publish on themes of cultural meaning which Jesus’ original Jewish apostles would’ve understood, but which modern Christians have forgotten. She cites Athol Dickson and E. Randolph Richards, to name just two. This isn’t even her first book on the topic, though it’s her first to focus substantially on language rather than action. Christianity’s Hebrew roots have gotten voluminous coverage in the last twenty years.

However, just because Tverberg’s topic isn’t new doesn’t mean her information is well-known. Many Christians remain unaware of Judaism’s collectivist impulse, as Tverberg writes, and misinterpret Jesus’ promises as purely personal salvation. They see the New Covenant as completely negating the Old Covenant, an idea Jesus rejected, both explicitly and implicitly. And they see Hebrew Scripture as a rough draft of Christianity, a view which, she demonstrates, Jesus did not share.

Tverberg identifies the problem as “Greek thinking,” a form of Westernized rationalism based on literal language and if-then reasoning. Importantly, Tverberg doesn’t insist Greek thinking is wrong; she just considers it the incorrect framing to understand the Hebrew Scriptures, written in metaphor and poetry. We must recognize, Tverberg writes, that in Hebrew, words get their meanings from situations, and messages come from story, not syllogism.

Working from this premise, Tverberg explicates several situations where Jesus, working from Hebrew Scripture, weaves stories, speaks in poetry, and plays off single words’ double meanings. Jesus especially quoted from Isaiah and the Psalms, but Tverberg shows ways he alluded widely to Micah, Jeremiah, the Torah, and elsewhere throughout Hebrew tradition. After all, Jesus taught during Judaism’s great Mishnah period of intense oral tradition.

From the beginning, an immersion in Jewish tradition provides a distinct look at meanings. Modern Christians have worked hard to create definitions of “Christ” which encompass all the theological weight we expect Jesus to bear (while excluding historical or current religious trends we find distasteful). But to Jews living in the First Century CE, “Christ” had very specific meanings. Without that history, we lose understanding of what Jesus himself intended.

Lois Tverberg
Further, Hebrew tradition was considerably more collective than modern Euro-American culture. We miss important threads because English doesn’t have a plural “you,” leading today’s readers to perceive Jesus’ directions and promises as very individualist. But many times Jesus and the prophets say “you,” they mean “you all.” The great prophetic promises are intended for the people united, not individuals. (In my experience, this weakness is more common in White than Black churches.)

Jews understood Jesus’ extensive scriptural quotations, Tverberg avers, because they memorized Scripture in ways Christians don’t. Jesus’ ability to create immediate recognition using just one or two words meant something powerful to his first-generation audience. Without that ability to call entire prophetic books to mind by rote, Christians miss huge swaths of what Jesus actually intended, because his allusions are both subtle and frequent.

Throughout this book, Tverberg returns to Emmaus as her metaphor. In Luke’s Gospel, the resurrected Jesus teaches two disciples how the Hebrew prophetic tradition pointed toward Jesus’ ministry. This relationship between the prophets and Christianity has been substantially lost to contemporary Christians, because we don’t memorize Scripture, don’t think collectively, and don’t remember the Messianic promise woven into Jewish liturgy.

However, Tverberg warns us, don’t mistake what Emmaus means. Just because the prophets all pointed toward Jesus doesn’t mean the entire Hebrew Scripture is a dead letter, intended for prooftexting Jesus’ ministry. Jesus, Paul, and the other apostles understood Christ’s ministry as a continuation of the prophetic continuum, and themselves as heirs of Jewish tradition. That’s why Jesus and Paul quote Isaiah and Jeremiah widely, and Plato and Aristotle never.

Jesus and his first-generation apostles had important rabbinical assumptions wrapped up in their language, assumptions hidden behind what, to modern readers, look like simple turns of phrase. Tverberg and other Hebraicists like her want to reclaim this heritage for modern believers. Because without Jesus’ Hebrew thinking, we receive only an abridged version of Jesus’ message. Tverberg admits she doesn’t cover Jesus’ every Jewish allusion. But she opens our minds to a Truth of surpassing beauty.

Wednesday, February 12, 2020

Fear of Darkness: Part Three

This essay follows Fear of Darkness: Part One and Fear of Darkness: Part Two
So, to recap: stuck in a broken-down truck beside a rural highway, miles from help, with no cell signal and no way to contact humanity, I had two realizations. First, darkness and isolation dismantle the distractions of my senses, forcing me to confront truths I’ve long known but couldn’t process. Second, human perceptions of time are illusory; “now” only exists later, when I think about it. Only what has happened, and what could happen, really exist.

Trapped and isolated, forced to sit still and listen, these first two realizations led to my third and final realization of the night: that only action really matters. We make excuses for the past, and promises for the future, because fundamentally, we humans know what only darkness made me realize, that “now” doesn’t exist, and therefore doesn’t matter. Instead, we focus on what we did, what we could’ve done, and what we hope to do going forward.

This realization puts me at odds with two of humanity’s greatest religious traditions. Christianity teaches that we have an intimate relationship with transcendence, that our ability to commune with God defines our souls, and the ultimate disposition of our undying essence. Whether that means the catechistic salvation of Catholic tradition, or the Reformation’s promise of “salvation by grace through faith,” we are saved by transcending ourselves, not by doing anything.

Simultaneously, Buddhism teaches that living outside the moment creates suffering. Chaining ourselves to the past reminds us of our limitations, and keeps us anchored, unable to exist currently; while casting our hopes onto the future keeps us striving after goals which cannot literally exist, because the future never wholly arrives. Though different Buddhist schools disagree on how to achieve their goals, they agree: only the present really, meaningfully exists.

I wandered into my personal wilderness, like Jesus seeking temptations at Lent or Siddhartha renouncing his palace; yet my conclusions differ wildly from theirs. Still, I don’t believe I’ve contravened their realizations, either. Because Jesus offers salvation, while Buddha offers Nirvana, and I offer neither. Indeed, I offer nothing, because I don’t believe I’ve uncovered truths that apply to anybody but myself. And that’s my biggest reward for a dark, tumultuous evening.


Huge swaths of my life have been defined entirely by the desire to avoid causing offense. Sometimes I’ve wondered why this is. Was it my parents, encouraging me to pursue life goals consistent with earning a living rather than accomplishing personal fulfillment; teachers who harped on every shortcoming in my work, making me so afraid to screw up that I’m left paralyzed; my rootless youth, needing to reinvent myself every two or three years? Who knows?

Whatever the reason, the future became a source of terror, the past a reservoir of regret. My entire present has been an effort to avoid hurting anybody’s feelings or making anybody think poorly of me; as a result, I’ve remained perpetually unfulfilled. Ironically, in attempting to keep people thinking well of me, I’ve avoided doing anything that would’ve built deeper human connections, like getting married or putting down roots. I’ve become a ghost.

Rather than avoiding offense, I need to act boldly, acknowledging that some outcomes remain beyond my control. Yes, I will inevitably do something to hurt or pique others. Because I’m fundamentally human, and mistakes simply happen. That’s what it means when the present only exists retrospectively: I cannot know the effects my actions have upon others. But look what avoiding those effects has accomplished. That isn’t the second-best choice.

Instead, I must simply do something, anything. And, if I cause pain or moral injury to another, I must seek forgiveness and learn from my mistakes. I must make myself a better person, and make the world around me better for my having been present; and if, in doing so, I hurt others, I must seek atonement with the same boldness with which I’ve acted. I must own my future, and in doing so, I must own my past.

If my nighttime lesson is true, the present exists only for me; future and past exist for everyone together. This hasn’t been an easy lesson to internalize. In the days since receiving this personal truth, I’ve frequently wanted to retreat into the comfy habits which existed before that night. I have to consciously remind myself to keep acting toward the future. But I believe, if I keep practicing, my nighttime truth-visit will make me, and everyone around me, better for the experience.

And that, friends, is kind of scary.

Monday, February 10, 2020

War, and the Memory of War

1001 Movies To Watch Before Your Netflix Subscription Dies, Part 36
Ted Kotcheff (Director), First Blood


An unshaven stranger wanders into a Pacific Northwest town, with an American flag patch on his jacket and a Bowie knife on his belt. The local sheriff mistakes him for a common vagabond and shows him the far side of town. But the wanderer comes back into town, stolidly refusing to explain his unwillingness to leave, so the sheriff arrests him. Inside a subterranean holding cell, the prisoner suffers his first violent flashback to Vietnam.

Through the 1980s, movie studios struggled to reconcile John Rambo, and his massive popularity, with the stories Americans told ourselves about our Vietnam experience. They turned Rambo into a musclebound antihero of Cold War exceptionalism, a picture of single-minded virility willing to continue fighting America’s battles after the government abandoned him. This image has become so pervasive that we forget he wasn’t created that way; he was laconic, unwanted, and a receptacle for America’s doubts.

At the movie’s beginning, we know nothing about Rambo. We witness him trying to find his unit’s only other surviving veteran, only to discover that Agent Orange finally took him; the weak smile we see on his face during that scene never recurs, as he realizes nobody but him remembers what his unit survived. Without the war, nothing gives his life structure. So he resumes the only task which postwar life has provided him: walking.

Rambo’s 1982 big-screen debut followed ten years of development purgatory after David Morrell’s novel dropped, while the war was still going on. Morrell presented Rambo (no first name in the novel) as a villain, a killing machine which America’s government built, then discarded. Wandering his homeland without direction, Rambo becomes a force of destruction, because violence gives him meaning. The movie changes this characterization, reflecting how America’s self-justifying Vietnam narrative had evolved over ten years.

Sheriff Will Teasle (Brian Dennehy) doesn’t hate Rambo initially. He simply prizes order and cleanliness over fairness and justice. Rambo looks, to Teasle, like a stereotypical drifter with no means of support; Teasle assumes that he’ll wind up panhandling downtown, undercutting local businesses. He bears Rambo no animosity; he just wants the scruffy vagrant gone. But Rambo refuses to leave, for reasons entirely his own—he’s persistently taciturn. So Teasle arrests Rambo on specious charges.

You probably already remember what happens next: an overzealous deputy, feeling authorized by Teasle’s unthinking attitudes, attempts to shave Rambo with a straight razor. Rambo suffers a flashback, attacks the massed deputies, and escapes to the wilderness. (In the novel, he kills several deputies.) The law commences a massive manhunt, carrying military-grade assault weapons, but Rambo, trained in wilderness survival, manufactures simple weapons and stays one step ahead. So Teasle calls in the National Guard.

Sylvester Stallone (left) and Brian Dennehy in First Blood

This encapsulates how Rambo represents America’s struggle with itself after Vietnam. The law wants to bury him, because if he disappears, so does the narrative of their missteps. Rambo wants only to survive. Later movies would transform Rambo into an icon of America’s individualism mythology, but in this movie, he isn’t an individualist; he’s a soldier, trained and awaiting orders, who gets dropped by the government that should’ve controlled him. He’s become an unwanted memory.

In one key scene, several National Guard “weekend warriors” preen for the camera before a mine shaft they’ve just detonated with hand artillery, believing they’ve killed Rambo inside. One of them shouts: “Now take one for Soldier of Fortune!” That magazine, which arose in Vietnam’s immediate aftermath, helped spread the belief that America’s government betrayed troops in Vietnam, popularizing the stab-in-the-back myth. These fluffy-bearded kids apparently miss the irony of killing a decorated Vietnam veteran.

Stallone heavily rewrote the story, making Rambo a more sympathetic character, ending the story with hope of future redemption. In Morrell’s novel, Rambo and Teasle kill one another. Stallone, by contrast, gave Rambo a final monologue that helped coalesce, for American imagination, how war caused trauma for wide-eyed soldiers who trained under the belief that they were doing good for world democracy. The changed ending reflects changes in how Americans understood our shared Vietnam experience.

Don’t misunderstand: this movie teems with myths Americans weren’t ready to unpack about how we brought “heroes” home and expected them to reintegrate into society. But as an artifact of a struggle Americans were only beginning to publicly confront, it remains a landmark of self-scrutiny. Sadly, just three years later, the studio flinched from continuing that scrutiny, sending Rambo back to re-fight the war. Thus we return to this moment again, after every subsequent war.

Thursday, February 6, 2020

Fear of Darkness: Part Two

This is a follow-up to Fear of Darkness: Part One
Alone in the dark, isolated from artificial distractions and desperate for sleep, two kinds of thoughts began plaguing me. First I had the commonplace nuisance thoughts that keep everyone awake occasionally: remember that embarrassing thing you did in fourth grade? Why did that strange woman look weirdly at you the other day? Here’s some song lyrics you haven’t bothered to remember in twenty years.

These thoughts passed quickly, though. Lingering on nuisance thoughts is the luxury of somebody secure in a warm bed, with rent paid and no fear of starvation. The minute conditions get cold, and you have legitimate reason to worry that you might lose all the financial security you’ve spent the last year struggling to recover, the internal monologue changes. You become aware that life really is precarious, even when you aren’t necessarily dying.

It began when I remembered that Sarah expected me in Lawrence shortly before midnight. With no cell signal, I had no possibility of contacting her and explaining the situation. I couldn’t reassure her that I was essentially safe, despite the uncertainty now hanging around my head. Would she start worrying? Would she assume the worst and contact the police, fearing I was injured or dead?

(As it turned out, yes, that’s exactly what she did. Though I wouldn’t know that until the next afternoon.)

Something occurred to me: I was wondering what Sarah was doing right now. But does the concept of “right now” mean anything in these situations? Certainly, the world continues moving, and people continue living their lives, even when we cannot see one another. But when we speak of things happening simultaneously, it becomes very difficult to calibrate the movement of time. “Right now” is a very localized phenomenon.

Let me explain, as the thoughts came to me, late Friday night and into Saturday morning: I can say that “right now,” I am lying across the front seat of my pickup truck, using my duffel bag as a pillow, trying to find a comfortable position where the seat belt buckle doesn’t stab into my kidneys. I have the necessary information to know exact events happening within the range of my senses. But outside? Beyond the limits of my senses?

Another vehicle could be approaching beyond the next rise. Or it might not. That vehicle’s presence or absence means nothing until it’s confirmed—and it’ll only be confirmed when it actually crests the rise. I can only pinpoint another vehicle’s presence by working backward. Therefore, even though it definitely exists when I cannot see it, nevertheless I cannot describe its actions “right now” until it no longer is “right now.”


Therefore, while Sarah is definitely doing something “right now,” I cannot know what that is until much later. She exists, but that doesn’t mean anything. Two events, like me struggling to sleep and her contacting the Buffalo County Sheriff’s Department, only happen simultaneously later. Without information, I cannot possibly process two separate events together. “The Present” is circumscribed by my senses.

Saying two things happen simultaneously might make sense to an omniscient narrator in a novel, or a benevolent god watching from above. For everyone else, time is an illusion created by presence, or absence, of knowledge. As a technological society, we have constant streams of information available from satellites and LTE signals and fiber optic cables, but that information is always at least a few milliseconds old.

It always creates a gap between us and “right now.”

Trapped inside my broken-down truck in the winter countryside, in the dark, with no data signal and no other humans, my information field has collapsed. My immediate bubble is less than six feet wide, with some secondary data from outdoors, where I dare not actually venture because it’s cold. The constant, digitally streaming present with which Americans surround ourselves is, temporarily, lost to me.

“Right now,” my cats might be eating or sleeping or wrestling at home. Sarah might be worrying herself crazy, or she might’ve fallen asleep. Strangers might be falling in love or breaking up. Senior citizens are dying, taking a lifetime of stories with them, while babies are being born, commencing stories so vast, I’ll never comprehend them. I know so much is happening “right now.”

Yet even as “right now” exists, it simultaneously doesn’t, until we communicate it with one another. Each of us remains trapped inside our bubble of unknowing, inside our own private broken-down Silverado, isolated and lonely. That’s what it means to be truly in the dark.

See also: Fear of Darkness: Part Three

Wednesday, February 5, 2020

The Conquest of the Kingdom

Mark Charles and Soong-Chan Rah, Unsettling Truths: The Ongoing, Dehumanizing Legacy of the Doctrine of Discovery

The era of European “discovery” didn’t begin in 1492, despite what my American History textbooks claimed. In 1452, Pope Nicholas V issued charters giving European kingdoms authority to “discover” lands currently occupied by non-Christians, especially those of an off-white hue, and claim that territory for Christianity. These bulls identified discovery specifically as the act of taking possession of land, even land already charted and well-documented.

Reformed (Calvinist) theologians Mark Charles and Soong-Chan Rah take vehement exception to the Doctrine of Discovery, an accumulation of religious precepts investing White imperial nations with moral authority to convert other peoples by choice. They don’t come by this position lightly. Mark Charles is of mixed Navajo and European heritage, and Soong-Chan Rah is a first-generation immigrant. Both authors know the destructive impulse inherent in imperial authority.

Christianity’s Doctrine of Discovery didn’t happen overnight. It represents an accumulation of principles, drawn originally from papal bulls, later from Reformation theological tracts, based on the innate goodness of earthly power. It derives from the notion of Christendom, the belief that nations with Christian populations will govern with Christian authority. And like all political authority, the Doctrine of Discovery, however well-intentioned, serves mainly to preserve itself against all challenges.

But despite its origins in 1452, our authors believe the Doctrine of Discovery originates much earlier in Christian history. They trace what they call “the heresy of Christendom” to early Christian philosophers, particularly Eusebius and Augustine, who sought relief from early Roman persecutions by giving religious authority to secular rulers, especially the Emperor Constantine and his successors. This meant yoking the church to the empire, and paying the cost for that.

Our authors trace the Doctrine of Discovery’s effects on native inhabitants of North America, which they call Turtle Island. (From the earliest pages, our authors acknowledge this is mainly Mark Charles’ book, and it includes many long autobiographical discursions.) The belief that God, through Christ, has invested moral authority in technologically superior European powers gives White Christians license to do immense damage to Native American peoples. Not a revelation, admittedly.

This isn’t abstract or incidental. Using both religious and secular sources, our authors find the words “doctrine of discovery” used in American government documents as recently as 2005, denying Native Americans land claims, despite even the government’s own admission that it seized the land illegally. The Doctrine of Discovery continues justifying the moral abasement of Native American bodies, to say nothing of souls, into the present day.

Mark Charles (left) and Soong-Chan Rah

Besides its negative effects on non-White populations, the Doctrine of Discovery has warping effects on White people, too. The desire to invest Christian authority in secular rulers creates the impulse toward a Messianic ruler. Our authors note that Eusebius, in his final books, describes the Church prostrating itself before the Messianic Emperor Constantine. White Americans, both religious and secular, have created our own White Messiah in Abraham Lincoln.

In later chapters, our authors surprise me by redefining the non-White relationship with the descendants of White colonists. They describe how non-White Americans, such as Mark Charles’ family on the Navajo Reservation, show signs of inherited PTSD-related trauma. But, in rejecting widespread explanations of either White supremacy or White fragility, they observe White Americans also show signs of inherited cultural trauma. Empire, they suggest, damages the imperialists, too.

Thus, our authors come full circle, from a poor Galilean Messiah who offered to restore a broken world, to a modern Christianity based on acknowledging how worldly power has broken Christendom. Our authors’ history of Western colonialism is dark and damning. But they don’t leave things by condemning the powerful; they recognize that powerful people need healing as badly as the powerless, and they can’t get that healing from the power that first injured them.

Despite what I’d expected, this isn’t a book of Indigenous Christian spirituality. If that’s what you want, consider other authors, like Steven Charleston. Rather, this book addresses how Christians need to face the powers of this world, how Jesus calls true believers to reconcile their faith with a broken and dangerous world. Because, as our authors remind us, Jesus stood before Pilate and said His Kingdom wasn’t of this world.

Yes, this book is definitely political. That may bother many Christians, especially those who see faith as personal and internal. But Jesus confronted tax collectors and centurions, challenging them to change their ways and focus on the True Kingdom. Christianity, I contend, is never partisan; but it is deeply political, because it always calls believers to allegiance to a different kind of Kingdom.

Tuesday, February 4, 2020

Fear of Darkness: Part One


This past Friday, January 31st, my truck broke down on a rural highway, twelve miles from the nearest town. This happened at 10:00 p.m. on a cloudy night, so I couldn’t see what happened; I wouldn’t know until the next morning that I’d cracked a ball joint. The only thing I knew for certain was that I’d heard a loud “bang” from my undercarriage, and suddenly I couldn't keep going in a straight line.

As I said, this happened at 10:00 at night, so it was pitch black outside without my headlights. To my horror, this happened in a cell phone dead zone, so I had no way of reaching roadside assistance, or letting my loved ones know what had happened to me. It was thirty degrees Fahrenheit, and I was twelve miles from town, so walking wasn’t an option, especially since I wasn’t dressed for the temperatures. I had no choice but to bunk down and wait.

I spent the entire first night in my truck. When it became clear nobody was going to stop, I dug out an emergency blanket and prepared for a night of shallow sleep. I was completely alone. Without a cell signal, I couldn’t talk or text with anybody, but I also couldn’t check social media, listen to music, or indulge in the other electronic distractions Americans have grown accustomed to, to stave off boredom and loneliness. It was so dark, I couldn’t even read.

Sometimes I’ve encountered people who question why “darkness,” in holy texts like the Bible or the Buddhist sutras, is associated with evil. I have many reasons why I disagree with this read, particularly since “evil” is a man-made moral concept not found in most early religions; things like violence and death simply exist, alongside love and birth. But it’s true, these texts consistently consider “darkness” humanity’s greatest moral shortcoming.

I’ve explained this construction away by suggesting that, in days before cheap electric light, getting trapped away from home after nightfall was treacherous. Without light, humans are vulnerable to predators, bandits, or even pitfalls of terrain, simply because we cannot see. Darkness, to low-tech humanity, signifies one’s complete helplessness before a world that’s generally fair to populations, but blind to the needs of individuals.

Yet here I found myself in actual darkness. Yes, I could imagine coyotes outside my truck, possibly inches from my sleeping body. Yes, I felt helpless, knowing mortality is a thin shell protecting me from something completely unknown. But that night, I felt something besides fear: I felt the keen awareness of myself. I was alone. I had literally nothing but my thoughts to defend against the yawning void of nothingness, not outside, but within.


Our modern society provides copious distractions to prevent humans having to face ourselves. Television, the great diversion of recent generations, has given way to the Internet, with its constant streaming on-demand amusements, custom calibrated to ensure we never have to encounter ideas or entertainments we find particularly dangerous. When I grow bored of my smartphone or computer, I switch to books and music. I’m constantly surrounded by anything that prevents me having to look inward.

Darkness robs me of that. Unable to read, scroll, or otherwise make the time go away, I sat in my hermetically sealed cab, listening to the wind through some farmer’s corn stubble, and realized I was, for the first time in a long time, completely alone with myself. I had to take stock of recent decisions, something most of us avoid doing, and realize where life has brought me, or rather, where I’ve brought myself.

No wonder so many religious practices, like Buddhist meditation or Christian centering prayer, involve sitting still and doing nothing. The world provides constant outside stimuli to prevent us being completely in the moment. Even with friends, we mostly talk about what we’ve done recently, or our future plans; even among my oldest, closest friends, I seldom exist entirely in the present. We humans do anything to avoid existing as we are, and facing present reality.

We don’t fear darkness because we can’t see predators (though that matters, too). We fear darkness because light gives us constant stimuli to occlude our thoughts, permitting us to flee from our souls. The angels came to the shepherds by night, and Gabriel to Mohammed in a shadowed cave, because darkness ensures we have nothing to look at but the truth. And we know, without having to analyze, that truth is something we humans avoid.

See also: Fear of Darkness: Part Two and Fear of Darkness: Part Three