Monday, November 12, 2018

God's Not Dead, but the Church Might Be Sick

Promotional photo from the film God's Not Dead

I’ve never seen the movie God’s Not Dead or its sequels. I have no intention of ever doing so. I realize many people like these movies and draw hope from them, and I also recognize that they represent the leading edge of a Christian indie movie industry that’s massively lucrative: the first God’s Not Dead film grossed $62 million on a $2 million production budget. Yet, as a Christian myself, these films irk me endlessly.

And I’ll tell you why.

When my sister was seventeen and shopping for colleges, she settled on two. She ultimately wound up attending the state university a few hours’ drive from home, partly because that’s what the family could afford. But she really had her heart set on attending Northwestern College of Orange City, Iowa. This private liberal arts college is institutionally affiliated with the Reformed Church in America and has Christianity in its curriculum.

My sister (whose permission I don’t actually have to retell this story) was massively excited by Northwestern’s explicitly Christian content. She loved that Orange City, itself a heavily Dutch Calvinist community, largely closed down on Sundays so everyone could attend church. The idea of surrounding herself with all Christianity, all the time, while pursuing her liberal arts credentials, got her downright giddy. She really wanted this entirely Christian education.

Her bank account, let’s say, didn’t.

When she discovered there was no chance whatsoever of affording private-college tuition, she dissolved into tears. I remember her standing in the family kitchen, weeping like her dog had died, half-screaming: “If I go to a state university they’ll tell me to go stand over in the corner and keep quiet and never say anything because I’m a Christian.” The mix of rage and grief in her outburst was palpable. So was the baloney.

Architectural drawing of the new Learning Commons at
Northwestern College, Orange City, Iowa

I was more conservative then than I am today, certainly. But I’d also perused both schools’ catalogues, and knew the state university had religious options available. Certainly, as a public institution, it couldn’t offer theological or seminary training, and was too small to host a religious studies program. But it had courses in biblical literature, the social science of religion, religious psychology, and more. And it hosted, though it obviously couldn’t sponsor, several on-campus ministries.

Yet for years, in our small-town congregation, we’d gotten barraged with messages about the inherent depravity of secular education. Well, all worldly institutions, really, but state-sponsored schooling was the nexus through which everybody passed, and therefore, the first test of godless secularizing mind control the good Christian had to withstand. Getting through high school, in many people’s understanding, meant walking through a purifying fire and emerging holy. Going back for more public education? I never.

Please understand, we weren’t part of some hyper-partisan fundamentalist sect. We attended the United Methodist Church, a centrist denomination that, among other things, tried (unsuccessfully) to censure Jeff Sessions for his conduct in office. Yet, as often happens in small communities, a lack of diversity meant people became more extreme and intolerant in their opinions. From politics and religion to loyalty to the Denver Broncos and country music, everyone generally became more rigid, not less.

But this moment forced my first meaningful break with my childhood certainties. (Childhood, yeah. I was 23.) Seeing my sister paralyzed with grief because she had to attend public university, like three-quarters of Americans and most future pastors, struck me as beyond odd, especially as she’d had a fairly successful campus tour. She’d internalized the popular narrative of modern Christian persecution. And in her mind, months in advance, it had already begun happening to her.

Asia Bibi, a woman who actually, literally fears for her life because of her Christianity

Please don’t misunderstand me. I know Christians in this world are still persecuted. As I write, Asia Bibi, a Pakistani Christian, has narrowly evaded the death sentence for “insulting the Prophet,” an accusation the high court admits is probably fake; she still lives with constant death threats. There are places in the world where Christians have to fear violence daily. America isn’t one of them. Having to attend a public university isn’t a human rights violation.

Yet a constant propaganda assault has convinced some people, unable to comprehend their own blinders, that that’s exactly what’s happening today. Mass media like God’s Not Dead convince believers to see conflicting views not as challenge and learning, but as attack and oppression. Years later, my sister doesn’t regret her state university education. But could she convince another small-town kid, raised on persecution media, that a state school won’t silence them for their views? I don’t know.

Monday, November 5, 2018

That One Film! Starring That One Guy!

Guy Pearce in (left to right) Iron Man 3, Jack Irish, and The Adventures of Priscilla,
Queen of the Desert

I’m so far behind in watching big-ticket superhero films that I didn’t realize five entire years had already passed since Iron Man 3 debuted. Honestly, Wonder Woman was the only superhero film I’d seen in cinemas since Iron Man 2. So when Iron Man 3 appeared in my On-Demand menu, I figured, what the hey, let’s get caught up. Except, once I pressed play, a whole other question occupied my mind:

Why do I recognize Aldrich Killian?

The fella looked so damned familiar that I focused my attention on trying to place him, to the exclusion of events onscreen. (I really should re-watch the film. Eventually.) Only during the closing credits did I realize: it’s Guy Pearce, a British-born Australian actor probably best-known to international audiences as Felicia, the flamboyant young drag performer in The Adventures of Priscilla, Queen of the Desert.

Despite appearing in high-profile productions like The Time Machine and Prometheus, Pearce hasn’t achieved the immediate recognition of his sometime castmate, Hugo Weaving. He has liberty to dive into projects he chooses, remaking himself each time, unlike Weaving, who has become so visible that he cannot enter new productions without carrying the aura of Agent Smith or Lord Elrond with him. To different degrees, each actor carries his past characters along.

I’ve witnessed this at various stages in today’s massively media-saturated culture. When, say, George Takei entered the TV series Heroes, back when anybody could still watch it, nobody could separate him from his role in Star Trek, and the series producers didn’t even try; they gave his limousine the license plate number NCC-1701, the registry number of the starship Enterprise.

But as modern media has become increasingly ubiquitous, it’s also become increasingly homogenous. Production companies use tentpole franchises to purchase (or steal) audience loyalty, and as a result, they persistently overstretch a story, resulting in disasters like Peter Jackson’s ridiculously overlong Hobbit trilogy, or a Han Solo movie nobody really wanted. And actors, by sheer repetition, become associated with one character.

George Takei in Star Trek (left) and Heroes

Be honest. You can’t see Patrick Stewart without seeing Captain Picard and/or Professor Charles Xavier, possibly both superimposed in your mind simultaneously. Stewart has tried playing against that role, in shows like The Eleventh Hour and Blunt Talk, but audiences have refused to accept him, because they cannot separate his face from the extreme rectitude they associate with his characters.

Once actors become sufficiently high-profile, many complain about getting typecast, and only being offered variations on the same role. This works for some, as it’s impossible to imagine Alec Guinness as Obi-Wan Kenobi without his roles as men of great integrity, sometimes played straight, as in The Bridge on the River Kwai, or subverted, as in his classic Ealing Studios comedies.

Other actors, however, suffer for this typecasting. Jack Nicholson played characters known for their unique moral code in films like One Flew Over the Cuckoo’s Nest or The Missouri Breaks, but Stanley Kubrick forced him into an extreme version of that character for The Shining. And sadly, he never returned from that, becoming more and more exaggerated. On learning of Heath Ledger’s death after playing the Joker, Nicholson reportedly said: “Well, I warned him.”

Guy Pearce has in some ways avoided these extremes. Because he’s tough to recognize from role to role, he can accept divergent characters like the flamboyant Felicia or villainous Aldrich Killian without disrupting ongoing roles like the shabby but heroic, oversexed Jack Irish. He’s just famous enough for me to vaguely recognize him, but has liberty enough to try new roles without carrying old ones with him.

Speaking as somebody who’s done a bit of acting himself, I admire Pearce’s flexibility. I suspect most actors want that freedom. Many good actors start off that way: Johnny Depp famously spent fifteen years only taking roles he found challenging, before going broke and descending into the great tomb of ego that is Pirates of the Caribbean. Does Depp want that freedom back? He’s diplomatically circumspect, but I suspect yes.

Once an actor becomes sufficiently famous, I suspect he’s trapped by his roles. Whether it’s Nicholson’s feigned madness, Hugo Weaving’s relentless seriousness, or Adam Sandler’s unending cascade of dick jokes, actors become prisoners of whatever people liked in the last role. It’s tough to imagine Robert Downey, Jr., without Iron Man anymore. Yet the fortunate few, like Guy Pearce, somehow manage to avoid that.

There’s something inspiring about actors retaining that freedom. One wonders whether it’s portable to other career fields too.

Friday, November 2, 2018

The Magic “I” and the World

1001 Books To Read Before Your Kindle Battery Dies, Part 92
Matthew Hutson, The 7 Laws of Magical Thinking: How Irrational Beliefs Keep Us Happy, Healthy, and Sane, and
Margaret Heffernan, Willful Blindness: Why We Ignore the Obvious At Our Peril

We’ve all seen it: people who see something that isn’t there, from attributing intentions to their puppy dog or the economy, to believing their domestic rituals and behaviors can change the outcomes of football games and wars. Or people who steadfastly cannot see something patently obvious, whether it’s the effect their behavior has on others, or the way their work has consequences down the line. People who see too much, or else see too little.

Serious psychologists have spent decades researching the ways ordinary people, with no malign intent, distort their perceptions of the world so it conforms with their needs. Two journalists, British-American multimedia specialist Margaret Heffernan and former Psychology Today contributor Matthew Hutson, have compiled that research into plain-English books for general audiences. The two volumes, published almost simultaneously, have complementary themes which reward being read together.

Heffernan unwinds long, sometimes dire tales of ways people permit themselves to selectively filter reality. Several of her most important stories involve a few recurrent examples: how a 2005 explosion at a Texas City oil refinery should’ve warned its parent company, BP, of systemic flaws, for instance, long before the Deepwater Horizon disaster—but didn’t. Or how W.R. Grace’s notorious vermiculite mine poisoned Libby, Montana, for decades.

Hutson’s approach is more episodic. He organizes the research on magical thinking into categories, then finds relevant examples to demonstrate principles drawn from research. This makes Hutson’s book more useful for dipping into when seeking evidence in later arguments, though it means he doesn’t give themes the same opportunity to mature that Heffernan does. Hutson’s approach is more discrete; Heffernan’s is more holistic.

To give just a couple of examples: Hutson describes how belief in luck, a seemingly irrational precept, nevertheless gives people power. When individuals believe themselves naturally lucky, they feel emboldened to take risks, push limits, and innovate productively. Likewise, Heffernan describes how a culture of silence and self-censorship blew up in BP’s face, but protects intelligence officers at Britain’s Chatham House, headquartered just down the same London street.

These books appeared as the world economy recovered from the subprime mortgage disaster of 2007-2008. Not surprisingly, both books make generous use of examples from the financial services sector, though they stress these concepts are portable to the larger culture. Heffernan describes ways money mutes personal ethics, and the “bystander effect” prevents anybody from saying anything. Hutson shows how belief in luck can both empower people to act and prevent them from thinking through their actions.

Margaret Heffernan (left) and Matthew Hutson

Perhaps the most important theme both books share is the absolute importance of thoughts we might automatically consider harmful. As Hutson describes how humans attribute meaning, personality, and even soul to the world, he also describes how people who don’t do this often struggle with routine activities. And Heffernan admits humans cannot possibly see and process everything; selective blindness is necessary to survive. Surprisingly, neither magical thinking nor willful blindness is completely wrong.

They are, however, ways people see the world. Both authors acknowledge their desire isn’t to abolish magical thinking or willful blindness; rather they want us to understand how these forces influence us, and steer ourselves to the most useful forms of thinking. In Heffernan’s words, she wants us to “see better.” These authors don’t want readers to abandon habits which, they admit, actually work under certain conditions. They want us to use these habits more consciously.

Between the authors, Hutson intrudes more into his narrative. He admits his seven “laws” aren’t hard scientific principles like, say, Newton’s Three Laws; he deliberately imposes arbitrary but satisfying forms on a sprawling field of research. Heffernan lets themes unveil themselves more sequentially. Hutson, who admits a material existentialist philosophy, also lets that visibly color some of his conclusions. Unlike Heffernan, he believes in only one right way to see.

So, um, he engages in some magical thinking, to which he’s willfully blind. Oops. Still, his facts withstand scrutiny.

Both authors’ use of real-world examples should hit readers close to home. Though the exact circumstances that destroyed Countrywide Financial and started the Second Palestinian Intifada don’t currently exist, the context of laws, religions, and money that created those circumstances remains. So everything Heffernan and Hutson describe remains relevant to the world we live in. And every disaster they describe could recur.

René Descartes supposedly said: “We do not describe the world we see, we see the world we describe.” That quote encapsulates these two powerful books. We create our worlds afresh daily. Let’s make sure we create the best world we possibly can.

Thursday, November 1, 2018

Why Do We Hate The People Who Handle Our Food?

Promo still from the award-winning 2007 indie film Waitress

Why are food service workers paid so poorly? Everybody eats. Nearly everybody would rather eat well than eat some mush slapped mindlessly on a plate. Yet not only is food service among the most poorly paid work in America, it's among the least likely to offer paid sick days and comprehensive health insurance from one's employers. Meaning the likelihood you've been served steak with a side of pneumonia is high.

The Fight for $15 has banded together food service, healthcare, and childcare workers with other groups historically poorly paid, and provided a massive public voice for paying America’s most despised workers a better living wage. We’ve heard the moral arguments for why these groups deserve a better wage. But we, or anyway I, haven’t really heard the argument against paying these people better, except some vague mentions of supply-and-demand theory.

So I spent some time thinking furiously about the reasons we refuse food service workers better wages. It’s easy to dismiss some of the more commonly trotted-out counterclaims. The argument, for instance, that food service is a starter job for teenagers, runs immediately aground on the fact that we can order food at noon on weekdays, and at other times when teenagers aren’t legally allowed to work. Let’s discard those arguments immediately.

(Let’s also discard recent evidence that our economy holds doors open for certain teenagers while slamming them on others. That’s an entirely different ballgame.)

By keeping focus on the work itself, and the social context in which the work takes place, two arguments made themselves available to me. Then they fell apart. First, supply-and-demand tells us that very common jobs command low wages, and food service is one of America’s few growing industries since manufacturing collapsed. There are too many food service jobs. Except those jobs are ubiquitous because demand is high, so oopsie!

Where I live, I’ve seen entire communities which basically exist to sell food, gasoline, and hotel service to travelers on the highway. Since the agricultural economy has dwindled, and growing crops doesn’t bring money in like it once did, servicing the very high-demand traveling business remains the only wealth generator in much of the heartland. Classical economics would say this demand should make the labor supplying it very valuable, but economics fails.

Edmonton, Alberta-based chef Serge Belair at work

If supply-and-demand doesn’t work, the other classical economic argument for poor pay is that food service is a low-skilled job. Skills are valuable, and therefore create wages. Except, as I’ve bounced among industries after college, I’ve observed that there’s no such thing as “low-skilled jobs.” Some jobs demand skills workers bring with them, from machinists to physicians, and other jobs require skills learned while working. Food service is the latter.

So simple arguments don’t work, and classical economic arguments don’t work. I’m left with one approach: what moral attitudes do people have toward food service workers? Here I run into something more complex, and more disgusting.

On one hand, we glamourize food. We court potential future spouses over food, especially food somebody else made, to showcase our ability to splurge. We clinch important business deals over food—and, if you’ve ever watched a “business lunch” in progress, you know that food mostly doesn’t get eaten. Most of our important religious and secular celebrations, like holidays and weddings, involve conspicuous overeating. We love showing off our food.

On the other hand, food is slightly icky. We enjoy eating certain kinds of food, like roasted meats or rich desserts in sauce, for their intense sensual pleasure. But we eat other foods, like starches and cruciferous vegetables, because we need the nutrients. We’ll even invent cooking methods, like drizzling them with sauce, to make them less nasty. We must eat these foods in order to survive, to withstand death.

Food memorializes human mortality. Eating reminds us how perilously close to dying even the wealthiest and most successful among us remain. Food forms a key link in a continuum: we eat food to survive, and convert it into poop. That poop becomes nutrients for future food. This reminder of mortality disgusts industrialized Westerners so much, we break the cycle, chemically treating our poop and dumping it outside the food chain.

Food service workers handle our future poop. They remind us, by simply existing, that we’re going to die, that we could die tonight. Our dependence on food service workers challenges the Western technological myth that we’ll eventually beat gravity, physicality, and death. Food is, in essence, gross. So we punish the people who provide food by denying them better wages. Because how dare they remind us we’re going to die.

Monday, October 29, 2018

What Can One Man Do About Workplace Racism?

Unknown individuals wave the flags of Honduras and Mexico above the caravan

“What else could I have done?” I heard myself screaming, desperate to be taken seriously. “What would you have done? I have to make a living!” I realized I was waving my hands in front of me, Bernie Sanders-like, scared and desperate. Because I knew I had been wrong, but I didn’t know what other choice I had.

I was talking with a friend, over Facebook remote video chat, about something that had happened at work the previous day. I had been one of four guys installing tornado-proof outdoor furniture at a bank branch in a central Nebraska town, a process that wasn’t difficult but was tedious and time-consuming. And as dudes will do when bored, we started chatting.

The topic turned to current events. I’ve learned not to broach politics at work unless someone else brings it up, and never to offer my opinions, no matter how founded on facts and evidence, because doing so gets me in trouble. In Nebraska, and especially in blue-collar work, partisan allegiance isn’t a matter of discussion, it’s a matter of group identity. Dissent doesn’t mean you debate, it means you’re an outsider, or worse.

As the other three guys chatted about politics, and I kept my head down pretending to be selectively deaf, one guy asked another guy’s opinion about the caravan. As an estimated 7,000 migrants, mostly Hondurans, walk slowly toward the United States, planning to claim political asylum at a port of entry, probably Del Rio or Eagle Pass, Texas, this caravan has become America’s hottest political dividing line. This isn’t accidental.

“I don’t know much about this caravan,” one guy said, while the other two nodded like Solomon. “But I know, when you have a stampeding herd, you shoot one or two animals at the front, the rest of the herd will scatter.”

That’s what had me screaming down a Facebook video at my friend the next day. She insisted I had a moral obligation to speak against such dehumanizing language. I said I couldn’t, because when I’ve tried before, the blowback has been too vicious, and I’ve found myself ostracized for days, in a job where communication is paramount.

“You should’ve gotten the boss,” she said. “He has an obligation, by law, to provide a workplace free of that kind of hostility and discrimination.”

“I apparently haven’t made myself clear,” I replied. “That was the boss. That was the site supervisor.”

The caravan passes from Guatemala into southern Mexico

This is something I’ve encountered repeatedly since taking a construction job over three years ago. Racism is widespread in this business. I’ve been forced multiple times to swallow my objections while other workers, including my supervisors, have stood around running down Blacks, “Mexicans,” and other groups. Racist bullshitting is basically a form of group bonding.

This leaves me conflicted. I know keeping silent serves to empower the oppressors in our society. Jim Crow laws were only overturned when White people joined with Black people to call injustice unjust. When White people previously swallowed their objections, going along to get along, racists felt empowered to make laws even more unjust.

But, as I told my friend, I have to make a living. Construction isn’t a job anybody can do in isolation. If nobody will talk to me, if I find myself ostracized for speaking against group identity issues, like most workers’ shared conservatism, I can’t do my job. So sometimes I do what I know is wrong, keep quiet, and let people say things I consider morally odious.

This carries extreme risks, though. When people only speak to other people who already share their views, they tend to emerge from such discussions with more extreme versions of the views they already have. Psychologists call this phenomenon “group polarization,” though the military has a superior term for it: “incestuous amplification.”

I really feel I witnessed incestuous amplification happening last Friday at work. As otherwise good, hard-working White people stood around furiously agreeing with one another, their views became more extreme before my eyes. A guy whose expertise I respected suddenly compared brown people to herd animals and suggested shooting them for doing something perfectly legal.

Now I have to return to work Monday. I have to return knowing I’ll hear language that bad again, or worse; knowing co-workers have Alex Jones InfoWars and “America For Americans” bumper stickers on the trucks where they carry their tools, knowing White people regularly write racist graffiti in shared outhouses. And I don’t know what to do if this happens again. Because it will.

Tuesday, October 23, 2018

What Is a “Race” In America Anyway?

Elizabeth Warren
Senator Elizabeth Warren permanently squandered much of my support last week. Her publication of a genetic breakdown “proving” her Native American heritage uncovered two massive flaws. First, it demonstrated her vulnerability to Donald Trump’s “whataboutism” and his desire to distract Americans from important issues. It also revealed an essentialist thinking that plagues not only her, but White Americans generally.

Essentialism is the assumption that something subtle, ethereal, and transcendent lies beneath all forms of reality. Philosopher David Livingstone Smith draws a fundamental example from humanity itself: any definition of “human” necessarily leaves something important out. Bipedal? Some humans can’t walk, or lose their legs. Sentient? Not everybody. Yet we know a human being when we see one. An intangible human essence binds us all together.

I have no problem with this essentialism. Sometimes it’s necessary to hold everything stable: when, for instance, White Nationalist groups define humanity in ways that exclude certain people from human rights protections, this human essence gives our counter-argument important legs. However, when this essence calcifies, freezing humanity—or whatever we’re discussing—in one place, it becomes burdensome, not defining.

Evolutionary biologist Richard Dawkins writes elsewhere that when essences become inflexible, they become an impediment to understanding. He uses the example of a rabbit. Multiple rabbit species and subspecies exist, but we understand them as all “rabbits” because we see their common essence. But when that essence becomes not a description, but a definition, we stop seeing each rabbit individually. We cannot perceive evolution, because evolution drifts away from the essence.

Richard Dawkins
Writing in The Republic, Plato postulated 2,400 years ago that the reality we see doesn’t really exist. It’s a mere shadow of a perfect world, where everything is clearly defined, every class is reliable, and every individual is a perfect instance of its group. The concept of “chair” doesn’t merely refer to the British throne, a mass-manufactured La-Z-Boy recliner, and the chairs I build from pallet wood; every chair in the real world is a shadow of the perfect chair concept.

Except Plato assumes these perfect concepts objectively exist. A “chair” can only be perfect if the chair concept precedes humans, which it cannot, because chairs are designed for the posture in which human beings sit down. If our bodies differed, so would our chairs. Therefore there cannot be a perfect chair anywhere.

This doesn’t mean there isn’t a chair essence. We can recognize the similarities in design, shape, and use between my handmade chairs and the British throne. But that essence exists inside us, and not, as Plato believes, outside the universe. The human capacity for seeing patterns and making meaning, drives the similarity between different kinds of chairs, like the similarity between different kinds of humans.

Senator Warren has accepted the belief that the difference between human racial groups is innate. I can imagine no other explanation for her insistence that her genetic profile defines her Native American ancestry. As sociologist Richard J. Perry writes, races aren’t distinct enough to have unique genetic profiles; the geneticists promising to tell you who you really are can only identify certain patterns, all of which are subjective and contingent.

Just as the essence of “chair” comes from our perceptions, not from the chair itself, the essence of “race” comes from how we treat one another. It’s completely feasible that Senator Warren, like many White people born in Oklahoma, has some Native American blood quantum. Most Americans are, to some degree, genetically mixed. But unless somebody actually treated Senator Warren as Native American, that isn’t part of her identity.

David Livingstone Smith
Remember that genotyping TV ad where the guy concludes by saying “I traded my lederhosen for a kilt”? I’ve always wondered whether that dumbass really believes he can trade his learned identity for another. Population groups historically moved widely throughout Europe, indeed throughout the world. The guy thought he was pure German, but found genetic markers consistent with Scotland, for whatever that’s worth. But have there ever been pure Germans anywhere?

That’s the danger being peddled by for-profit geneticists today. They preach a fixed and intangible essence, a human nature that does not change. While they may preach some commercially acceptable form of human inclusion, they ultimately say that your genes define your identity immutably. Which is the foundation of racism.

I’m not calling Senator Warren racist. She still has time to back away. But I’m saying she stands at the head of a slippery slope and gives it credence by calling it another name. Race is a man-made concept, and as Ibram Kendi points out, it’s made to serve other people’s economic demands. And Senator Warren cannot plausibly preach economic fairness while lending credence to other people’s false strata.

Friday, October 19, 2018

The Autobiography of the Last Free-Born African Slave

Zora Neale Hurston (ed. Deborah G. Plant), Barracoon: the Story of the Last “Black Cargo”

Back in December 1927, Zora Neale Hurston, a Columbia University-trained anthropologist and Harlem Renaissance darling, traveled from New York to Mobile, Alabama. There, in an outlying region, she met Kossula, the last known survivor of the Transatlantic Slave Trade. The man she found was often taciturn, more interested in tending his garden than recounting his story. But when she finally coaxed him to talk, she found him eager to share nearly a century’s accumulated experience.

Kossula (alternately spelled Kossola or Kazoola) was captured by Dahomey warriors, aged about 19, and sold to American smugglers, who got him and over 100 other Africans into Alabama in 1859. This despite America having banned the importation of Africans in 1808. Had the smugglers been caught, they could’ve been hanged, though few ever were. Kossula’s ship was the last known slave vessel to reach America before the Civil War; his “owners” held him less than three years.

Hurston, born relatively poor in segregated Florida, proved an apt student, eventually studying under “The Father of American Anthropology,” Franz Boas. In 1927, she hadn’t published any books yet, only some scholarly articles. However, according to editor Deborah Plant, an independent scholar specializing in Hurston’s works, she already chafed at scholarship’s restrictions. Interviewing Kossula offered her an opportunity to immerse herself in the stories she considered most important, a hallmark of her later classic work.

When Hurston bribed Kossula with fresh fruit and country ham, he opened up. (Early pages also stress that he appreciated being called by his African name, rather than his Americanized name, Cudjoe Lewis.) He begins telling his story, mostly in order, though Hurston probably edited that for print. His voice is distinctly reminiscent of the rhythms of folktales and traditional songs, and often has neatly packaged morals; he admits being a well-regarded teller of “parables.”

The product is essentially Kossula’s autobiography, merely prompted along, Boswell-like, by Hurston’s probing questions. He recounts an interesting childhood in Bantè, a little-known region of Benin ultimately subsumed by Dahomey, and later France. He describes a childhood of relative privilege, raised by a minor nobleman’s family, and training for initiation into manhood and soldiery. Sadly, he cannot tell the entire story; Dahomey forces sacked his village for lucrative slaves before he reached the final initiation.

Kossula (Cudjoe Lewis) as photographed by Zora Neale Hurston,
left, and Hurston herself

Kossula tells extensively about the forced march from his village to the barracoons (slave barracks) of the Benin coast. Those among his people who weren’t enslaved were murdered outright in the war. He also talks at length about the shipment in a shallow-keeled blockade runner across the Atlantic Ocean, where villagers who’d known each other for years found themselves divided among plantation owners who didn’t respect that history. Surprisingly, he talks little about slavery itself.

Liberated as suddenly, and as violently, as they were enslaved, Kossula’s people first wanted to return to Africa. However, the whites who liberated them cared little afterward, and they couldn’t afford the return passage. So they created “Africatown,” a village neighboring Mobile, Alabama, where they recreated their African lifestyle wherever possible. Kossula recounts building his people’s first church and school, having legal troubles with the railroad, and eventually outliving his wife and all five children.

Unfortunately, this volume never found a publisher during Hurston’s lifetime. Scholars speculate why: perhaps because Hurston spells Kossula’s dialect phonetically, already considered borderline racist in the 1930s, or because it implicated fellow Africans in slave trading. Though scholars have known multiple typescripts of this book exist, and have studied it extensively, opaque areas of copyright law kept it from being published until 2018, even as Hurston has become posthumously recognized as a major American writer.

Deborah G. Plant provides valuable front and back matter to place Kossula’s story in its historical context. Hurston’s draft runs barely 100 pages, very short even for oral history, and though she provides some explanatory endnotes, they’re sparse, sometimes contradictory, and based on outdated scholarship. As an anthropologist, Hurston was a remarkable storyteller and literary stylist; but, compared to her landmark researches into African American folk religion, this volume is very much a journeyman effort.

Despite very minor shortcomings, this book provides welcome insights into America’s past. As Hurston writes, slavery’s history usually comes from a White perspective: “All these words from the seller, but not one word from the sold.” Kossula, the last storyteller capable of closing that gap, does so with poetic grace and dignity. This volume also helps cement Hurston’s role as an eminent Black scholar and stylist, and hopefully a new generation will read her works.

Thursday, October 18, 2018

In Praise(-ish) of Conformity

David Sloan Wilson
In the school where I attended second grade, our classroom was two doors down from a kindergarten class. The kindergartners had to walk past our door to reach theirs. Several of my classmates had a favorite taunt they employed whenever the kindergartners wandered too close:
Kindergarten babies!
Stick your head in gravy!
Wash it off with bubblegum
and send it to the Navy!
I resisted singing along as long as possible. First, because it seemed just mean, running little kids down for being little. Hell, I'd been a kindergarten baby just two years earlier. Then, because I'd just moved into that area myself, and had as little in common with my classmates as with the kindergartners.

Yet before long, the dirty looks from my classmates became overwhelming. My silence marked me as an outsider. And be real, I had to interact with my classmates daily, while the kindergartners remained virtually strangers. What else could a kid with few friends do? To my later shame, I started singing along with the bullies’ taunt.

We're accustomed to thinking of “conformity” as something weak-minded people do, a zombie-like behavior. We often couple conformity with the word “mindless.” Yet evolutionary biologist David Sloan Wilson, in his book Darwin's Cathedral, lists conformity as a necessary precondition to build human society. We can't get along unless we accord with others’ behavior and expectations.

Several benign actions serve to advance productive (rather than mindless) conformity. Small talk is one, though I cringe to admit it. Clichés in speech and writing are another, since they let speakers share a background of reference. As any football fan, science fiction convention-goer, or political party devotee knows, engaging in chants and songs is a powerful group-building act.

We see this in religious songs. When Lutherans sing “A Mighty Fortress Is Our God,” or Methodists sing “O For a Thousand Tongues To Sing,” they confirm their group identity. These songs contain the germinal forms of their group theology, but for religious purposes, the lyrics are secondary. The point is, we sing them together.

Colin Kaepernick
Nobody would mistake “Kindergarten Babies” for secular hymnody, but it serves the same point. By singing it together, we confirmed we'd passed beyond the ignorance of infancy (“those who dwelt in darkness have seen a great light”). We also confirmed our identity as mature, diverse minds prepared for life's strange and dangerous exigencies. Duh, we were seven!

One of today's most inflammatory issues deals with the correct way to handle a national identity song. Must we all, as one side contends, stand to attention in absolute unison? Or may we, as the other side contends, kneel and pray as our conscience dictates?

This isn't a thought experiment. The two sides feud for control of how we express our group identity. One side says we're a martial people defined by our loyalty to the hierarchy (remember, the national anthem is a military song). The other says we're a people of morals and principle, and sometimes we're most American when we defy the American state.

Arlie Russell Hochschild, in her book Strangers In Their Own Land, interviews several people living in strongly conservative areas. She discovers that many have what, to her, sound like progressive values. Some are committed to environmental protection, others to economic fairness, others to their own causes. Yet in the voting booth, time and again, they vote for the party that opposes their pet issues.

Arlie Russell Hochschild
Hochschild, a scholar, avoids attributing intent to this disconnect. I have no such restraint. Like me, swallowing my principles to sing “Kindergarten Babies,” they'd rather get along with the people they have to live with every day, than be morally pure and lonely. This uniformity makes these individuals into a people.

But does it make them a people they'd like to live with?

The difference between productive conformity, and mindless conformity, is often visible only at a distance. I don't mean physical distance, outsiders standing around passing judgement. I mean time distance: I now regret singing “Kindergarten Babies” because I'm an adult who knows the difference between building community, and buying fifteen minutes’ peace. Once-popular actions, like Operation Iraqi Freedom, mean something similar for the nation.

In short, we need conformity to survive. But we're lousy judges, in the moment, of the difference between productive and mindless conformity. We need constant guidance and reminders, and even that isn't foolproof. I have no answers yet. But I think I have better questions, and that's maybe more important than facile answers at this point.

Monday, October 8, 2018

Is a Senate Hearing Really a Job Interview?

The Brett Kavanaugh hearings have generated two metaphors: the job interview and the trial. Those supporting Kavanaugh’s SCOTUS nomination repeatedly trumpet the “what happened to innocent until proven guilty” argument, insisting that the relative paucity of evidence shouldn’t disqualify him from a lifetime appointment to America’s highest court. Opponents counter by saying “this isn’t a trial, it’s a job interview, and the standards are much lower.”

I’d like to consider the latter metaphor. If a Senate confirmation hearing really does resemble a job interview, what forces go into similar interviews? Anybody who’s looked for work recently knows that tension exists between what hiring directors claim a job interview consists of and what actually happens. Hiring professionals want us to believe they impartially consider an applicant’s qualifications, credentials, experience, and temperament, and choose the most qualified person.

That’s the theory anyway.

In practice, job interviews turn on invisible qualities, qualities that are both subjective and completely anecdotal. After being denied the same promotion three times at my last job, I asked HR what had happened. They said that because I didn’t take my breaks in the company breakroom (which was crowded, noisy, and by Friday often smelled like a locker room), they didn’t believe I was a “team player” who would participate in group decision making.

Many researchers have dubbed the most important factor “affinity.” This basically means that hiring professionals select applicants who most resemble themselves: shared values, common experiences, even physical resemblances. If you attended the same kind of college as the decision maker, or have a similar economic background, your chances improve markedly. This is also why men hire men, white people hire white people, and Harvard grads hire Harvard grads.

Yeah, it's safe to say Justice Kavanaugh resembles Senator Graham

Scholars have written extensively about the affinity effect. Two sources should be sufficient: here and here. The continued similarity of ethnic, racial, sexual identity, and gender outcomes in American business reflects that hiring still gets done by white, cishet, middle-class men with college educations. And these HR directors mostly pick people who resemble themselves. Fairness makes an admirable goal, but remains mostly unattainable.

We all do it. It’s hard not to. Chances are, your co-workers, best friends, and spouse all resemble you in age, race, economic background, and (within limits) gender. As researcher Alison Wolf writes, outside the single top economic quintile, most job fields are starkly segregated by gender. Most towns with multiple racial populations know weekends are heavily segregated: there tend to be White, Black, and Hispanic bars, restaurants, and churches.

This applies to Congress, too. Though the current 115th Congress is “the most racially diverse in history” according to USNews.com, that isn’t saying much. The current Senate is less than one-quarter female, and more than ninety percent white. Neither number reflects America’s actual racial or gender breakdown. Though an exact mirror of America’s demography is likely impossible, the gap nevertheless is remarkable.

Brett Kavanaugh resembles the Senate that narrowly confirmed him: White, male, heterosexual, Christian, college-educated, and relatively well-off. Based on the affinity principle, we shouldn’t be surprised. Of the nine current Justices, six are men; seven are White. Six are Christian (five Catholic), three are Jewish. The court has never had a professing atheist, Mormon, or Muslim; it’s had exactly one self-described agnostic, Benjamin Cardozo, from 1932-1938.

So yeah, the Court resembles the Senate. Kavanaugh is the beneficiary of that continuing affinity.

In fairness, the Senate isn’t dominated by accused sex criminals. Al Franken, the only sitting Senator in the 115th Congress accused of sexual misconduct, resigned under massive bipartisan pressure. But a willingness to overlook sexual allegations has become an American political standard since 2016. President Donald Trump faces seventeen pending allegations of unwanted sexual contact or peeping Tom-ism.

If a Senate hearing really resembles a job interview, therefore, that doesn’t instill much hope in me. Real-life job interviews since the collapse of 2007-2008 have seen me rejected as both underqualified and overqualified from jobs with no listed mandatory qualifications. They’ve seen me rejected as insufficiently extroverted from jobs that had nothing to do with being gregarious. They’ve seen me… well, the list continues. Basically, I don’t resemble the interviewers enough.

Future appointments like Kavanaugh’s need a better metaphor. He wasn’t on trial, so that tight standard doesn’t apply, but job interviews have loose, sloppy standards that also don’t apply to something so important as a Supreme Court seat. I don’t readily know what metaphor would make better sense. But we’d better decide that soon, because Justices Thomas and Ginsburg aren’t getting any younger.

Thursday, October 4, 2018

Does “Nature” Really Exist?

Papa Pigeon hunts for scraps amid a refurbishment job

A family of pigeons has made its nest in a disused air duct at work. Mama Pigeon stays up high with her nestlings, while Papa Pigeon, a handsome specimen with beautifully marbled black-and-white plumage, wanders the premises, hunting scraps to bring back for the young. We’ll eventually have to turn the ventilation back on. But for now, there's an unspoken agreement to leave the birds alone until the young are ready to fly.

Nobody would mistake our jobsite for a natural environment. An air duct isn't a verdant branch; the dumpsters and trash cans make a poor analog to the forests pigeons once scavenged for food. Yet our environment provides shelter, warmth, protection from weather, and abundant cheap nutrients. Animals that adapt to live among humans, from rats and pigeons to dogs and cats, flourish and get fat, while their wild cousins struggle.

How much can wild animals adapt to human-made conditions and still remain natural? To get really pointy-headed about it, we haven’t yet created a meaningful definition of the word “natural.” When does this piece of wood stop being a natural tree and become an artificial object? When the tree is felled? When the lumber is milled? When the carpenter turns it into a table? You see the problem. The word “natural” means something, but we can't agree what.

We know that “artificial” describes what happens when humans get involved in our world. Houses, streets, and cities are clearly artificial. But wild influences inevitably make their way into our artificial environment, from crabgrass and ants to feral cats and, yes, pigeons. Some living beings flourish in environments moderated by human artifice, without being necessarily domesticated. Are these influences natural?

Mama Pigeon guards her nest from the intrusive photographer

I’m inclined to say yes, crabgrass and feral kittens are natural forces in an artificial environment, because they’re neither planned for nor controlled. We make desultory efforts to control both, spraying lawns with harsh chemicals to ensure only desirable plants grow, and trapping feral animals for rehabilitation or removal from the environment. But these efforts are minor and don’t stem the flood. Nature persistently clings to the artificial space.

So. If humans and their built environment are artificial, but nature adapts itself to the artificial environment, then humanity is no longer strictly artificial. We’ve become a moderating force on nature. Scientists acknowledge this influence when they speak of the Anthropocene, the proposed geological moment since around 1750, during which human activity has exceeded wind and water as the greatest force shaping Earth’s surface.

Nature, then, is an artificial thing. We cannot separate what exists before human involvement from what exists after. Even in places where humans have little or no involvement, our influence alters the environment, from pollutants in the air and water, to the sounds generated by our machines. Any hunter or outdoorsman knows the frustration of going into nature to escape humanity, only to find litter and noise scattered everywhere.

Our human illusion of separateness gets spoiled whenever we try to escape. Even just studying nature fixes it in a form, creating “laws” which reality must supposedly obey. Yet reality isn’t an algorithm; we cannot list nature’s laws and expect coherence. Whether we tromp out into nature to study it, or watch nature infiltrate our built spaces and adapt itself to us, we witness a supposedly non-human world adapting itself to human forces.

Papa Pigeon takes flight

So nature does not exist. If nature is whatever humans haven’t influenced, then we’ve never seen such a thing. The human influence on non-human space is pervasive, and we carry it with us wherever we go. Pigeons living inside a half-refurbished public building are one easy example of this, since a nesting family is adorable. The extinctions of passenger pigeons and western black rhinoceroses are more grim examples.

If nature adapts to humanity, it is no longer free of human influence; it is artificial. Humans have created the natural world around us. So far, we’ve done so mostly heedlessly, assuming wild species will simply accept the intrusion of our cities, long-distance roads, and carbon-burning technology with peace and equanimity. Which, of course, they haven’t. We’ve fallen ass-backward into a changed world without planning anything.

Therefore, if humans create nature, we need to start doing so consciously. We need to keep ourselves aware of the influences we force upon the world, the ripples our actions cause on everything. We need to study the non-built world so we can steward it accordingly. The next creature moving into our space might not be as cute as a pigeon.

Tuesday, October 2, 2018

The Conservative Anger Litmus Test

Brett Kavanaugh
This numbskull at work plays right-wing talk radio way too loud. And by “way too loud,” I mean much louder than necessary for him to hear it at ordinary sound levels, but not loud enough to hear over power tools and equipment. Clearly, unless he’s suffering severe hearing loss, he doesn’t need the radio at this volume. I wondered for weeks why he played his radio so loud. Then I realized: he does it for me.

He hopes I, or someone like me, will complain about him playing Rush Limbaugh, Sean Hannity, and other shouting nabobs of partisan hackery. Personally, I don’t mind that this guy’s politics disagree with mine. I don’t even mind that he seeks sources that encourage a more extreme and divisive iteration of what he already believes. Everyone is entitled to their sources. I mind that the sources he chooses are always shouting.

During last week’s Senate testimony, a literal “she said, he said” where Dr. Christine Blasey Ford stated her case, then Judge Brett Kavanaugh called her a liar, we heard lots of shouting. Blue Facebook and Blue Twitter held virtual postmortems where they reminded fellow thinkers that, in a two-sided debate, whoever starts shouting first is usually wrong. Defensiveness, belligerence, and wrath are refuges for liars and cads.

Except conservative Americans didn’t perceive things that way. Point out that Judge Kavanaugh started screaming and crying even during his prepared opening statement, and they’ll respond: “But she accused him on national TV.” Note that he responded to ordinary routine questions with petulance and spleen, and they’ll answer: “Wouldn’t you get angry if somebody said things about you?” Rage, I’ve observed anecdotally, is their only reasonable response.

Lindsey Graham
Nor was Judge Kavanaugh alone in his fury. Professional hand-wringers in the punditocracy have made bank parsing the outraged displays from Republican Senators like Lindsey Graham and Orrin Hatch. The nominee to be one of America’s top judges gets angry at accusations, rather than trusting that facts will exonerate him, and legislators echo his choler. The people we expect to be rational debaters think shouting proves them right.

I’m reminded of linguist and political commentator George Lakoff. In his book Don't Think of an Elephant, Lakoff describes the mental framework separating conservative and progressive Americans in terms of family dynamics. Progressives favor the “nurturing parent” model, where parents encourage children to do more and better with their lives and choices. This dynamic, not gender specific, believes in rewarding fledglings for leaving the nest.

Conservatives, however, favor what Lakoff calls the “strict father” model. A stern, singular lawgiver, usually but not necessarily male, provides the source of moral authority, and brings the hammer down on anyone who strays from righteousness. This strict father might reward good and honorable behavior, but exists mainly to punish wrongdoers. The orderly, obedient home, is the source of justice. This is the dynamic of “wait till your father gets home” parenting.

To a certain form of highly public conservatism, indignation and rage aren’t deflections or shelters from responsibility. They’re expressions of paternal righteousness. If you aren’t angry, you aren’t honest, and more importantly, you aren’t serious. To this mental framework, fathers default to anger because anger teaches children the ways of righteousness. Being the first to become angry doesn’t make you weak or wrong, it makes you fatherly.

Alex Jones
Consider those talk-radio pundits shouting down the airwaves. Limbaugh and Hannity aren’t shouting at someone who disagrees with them; their core audience shares their opinions. Alex Jones is famous for becoming so outraged, while reaching an audience who already agrees with him, that he’s reduced to incoherent, wordless brays, screaming “Aaahhhhh!” into the microphone. (Some women, like Jeanine Pirro and Tomi Lahren, also share this quick-to-anger dynamic. But they’re outliers.)

The voting base that favors high-profile, demonstrative conservatism didn’t see Judge Kavanaugh’s outrage as deflection or retreat from facts. They saw a display of honesty and moral confidence. Psychology might say this interpretation doesn’t jibe with research and observation, but Kavanaugh’s intended audience doesn’t care. Anger, to their mindset, stands for truth and courage. Only the truly angry have moral courage to lead.

While progressives mock Judge Kavanaugh’s display as unbecoming of a judge, polls indicate that voters already inclined to believe this “strict father” model see Kavanaugh as more trustworthy. Pollsters are reluctant to attribute reason for this outcome, probably because more than one reason applies. I, however, feel confident in saying that, at least partly, conservatives love Kavanaugh because they endorse his willingness to get angry.

Tuesday, September 25, 2018

Growing Up While Going Nowhere

Jennifer Handford
Last week I reviewed Jennifer Handford’s third novel, The Light of Hidden Flowers, and I hated it. It’s a novel about a grown woman’s failure to do anything, to break outside the pre-written script she established as a sophomore in high school. Though I soft-pedaled that opinion in the review itself (the writing world is small and exclusive, and I still hope to publish), I’ll say now: this book wasn’t done baking.

But it got me thinking. The heroic journey, a sort of narrative manifestation of the psychological journey we all undertake to become adults, has been an obsession of mine since I first read Joseph Campbell’s Hero With a Thousand Faces in graduate school. We simply assume, reading novels, that our characters will undertake some journey; even if they remain rooted in place, the changes they undergo internally will fill the role quests played in medieval myth.

Though Campbell didn’t craft a writers’ guide, many writers have used his book thus. George Lucas famously wrote the original Star Wars with two books on his desk: a dictionary and Campbell. Although Campbell only intended to describe patterns he and others identified in comparative religion and world folklore, subsequent readers have found his description sufficiently insightful that they’ve consciously mimicked what past storytellers did unconsciously.

Except…

In his introduction, Campbell quotes several case studies in psychological literature of people who, for whatever reason, failed to become adults. They retained childhood identities, repeated patterns they established in high school, never transcended the family dynamic they began with. Symbolic dreams of oedipal inclinations, wounds that replicate Christ or the Fisher King, and other Jungian forms abound. Campbell’s point is that adulthood remains astonishingly rare.

Joseph Campbell
See, Campbell really did have prescriptive intent. He didn’t write for writers, but for psychologists, hoping to provide ancient insights into the modern phenomenon of people who, lacking religious adulthood rites, stayed trapped in childhood. People like Jennifer Handford’s protagonist, Missy, who never had a clear break from adolescence, and therefore continues enacting household roles that brought comfort and satisfaction when she was fifteen.

Maybe I missed Jennifer Handford’s point. Maybe her book isn’t about a character who fails to grow up. Maybe her book is about the protracted adolescence that defines modernity. French philosopher Alain Badiou recently wrote that permanent adolescence has become modernity’s default setting, especially for men: “The adult becomes someone who’s a little better able than the young person to afford to buy big toys.”

That’s Handford’s story. Missy can afford a big house, lavish dinners, a nice car. She has her own office with an elaborate IT setup and her own receptionist. She inherits her father’s status as Richmond’s leading voice in financial planning for the extremely well-heeled. Yet somehow, she never does anything; resplendent gold-plated inaction defines her life. She spends hundreds of pages failing to start… as, arguably, do we.

Because that’s life today, isn’t it? College or trade school provides a chute to transition us from dependence on parents to dependence on employers. The rise of automation means fewer low-skilled jobs even exist, while technology races ahead so fast that high-skilled workers need constant retraining. An IT specialist I know tells me that, without regular continuing education, his skills become obsolete and unmarketable within eighteen months.

Several years ago, I critiqued Harry Potter for having a journey that is largely internal. Harry goes to school, and enemies and monsters assail him there; his journey doesn’t involve actually going anywhere. Even in book seven, when he finally does journey, or more accurately meander, his path returns him to school, where he confronts the bugbears of adolescence, avenges his parents, and apparently, marries his high school sweetheart.

J.K. Rowling
I intended this as commentary, not criticism. Hey, I figured, it’s a new kind of journey. But Rowling, like Handford, understands something I largely missed: today’s society isn’t about the journey. Though we’re more mobile than ever, with cars and air travel and space tourism, we’re rooted, from adolescence, in an identity and role we never wholly shake. If I had to describe modernity in one word, it would certainly be “stationary.”

This doesn’t excuse Handford’s writing style. As her protagonist narrates hundreds of pages of waffle, I struggled to care. She tells a story of somebody who sabotages herself, then seeks our sympathy for it. But setting aside Handford’s book as an artifact, maybe she understands something we willful myth-makers keep missing: that life today isn’t about the journey. Somebody else needs to finish the thought, but Handford’s gotten it started.

Friday, September 21, 2018

The Verses of War and Fatherhood

Martin Ott, Lessons in Camouflage: Poetry

Themes of “who I am” regularly permeate Martin Ott’s poetry and fiction. As a writer, a father, and a former soldier, he has alternated among identities with the urgency of an actor trying on roles. So, like many of us, he sits down quietly with himself, as poets have to, and he doesn’t know exactly who he’s sat down with. This struggle becomes the driving force behind his quiet, introspective verse.

The tapestry of identities Ott draws upon to create this collection may seem familiar, especially to anyone who’s read his previous books. The rural Michigander living in the city; the working-class boy in a creative-class job; the quiet introvert with an energetic family. As in previous collections, though, Ott’s history as an Army interrogator looms large: the man assigned to extract truth, like a tumor, in situations of hostility and violence.
A retired interrogator walks
into a bar with himself, and asks for bold spirits,
untraceable in the lineage
of fevered fermentation.
Who is greater than gods,
creator of zealots and fools,
apocalypse of every shade,
architecture of storm and awe,
maker of mountainous tombs?
(“Riddle”)
Saying a poetry collection turns on themes of “identity” has become almost a cliché, since poets write for self-selecting audiences rather than mass publics. Everybody writes about identity, because they write about themselves. But Ott takes this a step further. The question-and-answer tone of the poem above permeates this book. Many of his verses stride forth boldly, then interrupt themselves with questions that reverse everything that came before them.

Martin Ott
This probably reflects his own rapid transitions in life. At various times he’s needed to nurture and to kill, to discern truth and to obfuscate, to create and to destroy. Who hasn’t, of course; even Solomon wrote something similar. But Ott served in the military at a time when the moral certitudes of the World Wars had fled us, and this conflict between his present and his past forces him to constantly re-evaluate himself. The past isn’t gone, but the present changes it:
The older I get, the less well I do at hide
and seek, my kids able to see the bulges
poking out, fewer places for me to disappear,
the essence of fatherhood to be in plain view.
(“33 Lessons in Camouflage”)
Most of this collection’s early poems deal explicitly with Ott’s military experience, littered with references to basic training, maneuvers and orders, the disciplines necessary in war. After the first twenty or so pages, this theme recedes, becoming not a driving force, but an implicit piece of background radiation. Like a musical theme in a symphony, it becomes a necessary part of a larger composition, no longer demanding attention, but fundamentally part of the structure.

This happens with several concepts throughout this collection. Themes introduced in one poem achieve maturity in another. Hide and seek, mentioned in the stanza quoted above, which appears near the end of the collection, refers back to another poem near the beginning. In that one, he writes about being so good at the game, in childhood, that even police tracking dogs couldn’t find him. This seems a momentary blip, until Ott unexpectedly completes the arc, over thirty pages and twenty poems later.

Readers weaned on the way poetry is taught in high school, with each poem essentially a separate specimen considered in complete isolation, may require some time to get accustomed to this. (Hell, I have a graduate degree, and it threw me at first.) For Ott, poetry collections like this aren’t anthologies of individual verses, written separately and brought together for publishing purposes. He constructs his poetry collections as consciously as any novelist.
When I was a boy, my family and I took
long forays into the woods for berries,
Dachshund in tow, pinging our haul
into pails, sometimes searching for morels.
Mom’s body is pale, tumors nestled between
windpipe and heart, five days since she collapsed.
(“Morels”)
Motifs of gravel, and fire, and morals/morels crop up throughout the collection. They seem to have the randomness of everyday life. Yet suddenly they’ll come together in an explosion of clarity, sometimes in a poem’s closing lines, sometimes later. Like Beethoven’s Ninth, this collection progresses toward its final movement, in this case the mini-epic that gives the book its title.

Like ours, Ott’s identity isn’t monolithic. It comes together in a sudden explosion of insight, not always looked for, but forever impending. We wait for clarity, and aren’t disappointed. And we’re grateful Ott invited us along on his personal journey.

Wednesday, September 19, 2018

The Biggest Rock Stars in Nashville

1001 Albums To Hear Before Your iPod Battery Dies, Part 12
The Byrds, Sweetheart of the Rodeo

Critics sometimes floated the Byrds as America’s answer to the British Invasion, especially the Beatles, and this wasn’t completely unfair. They gave songwriters Bob Dylan (“Mr. Tambourine Man”) and Pete Seeger (“Turn! Turn! Turn!”) their first number-one chart hits. They pioneered the folk-rock, psychedelia, and moody singer-songwriter genres. So when their sixth album dropped in 1968, fans probably expected more of the same.

Yet this album swerves so seriously from anything that came before, it gave listeners audio whiplash. Critics loved the album, then and now, and it had a terrific influence on other musicians, but audiences didn’t know how to interpret a serious rock band careening into another genre, one often regarded as at war with rock. Music historians occasionally call this the birth of “country rock,” but by any serious standard, it’s a full-on country music album.

This album opens with “You Ain’t Going Nowhere,” a Bob Dylan composition from the Basement Tapes. But this arrangement, layered with steel guitars and twangy electric lead, sounds little like either a Bleecker Street folk club or Roger McGuinn’s distinctive jangly Rickenbacker 12-string. Fans couldn’t comprehend what they’d heard. Despite Dylan’s famously non-linear lyrics, this tune would’ve fit nicely, sonically, on most AM country radio back then.

By 1968, the Byrds were down to only two original members. David Crosby had found another home singing close harmonies, while Gene Clark had become a solo singer-songwriter, critically lauded but commercially mediocre. Original drummer Michael Clarke had become a session musician, and would never regain his prior fame. Only lead singer McGuinn and bassist Chris Hillman remained. Desperate to record, they hired Hillman’s cousin Kevin Kelley on percussion.

Most significantly, however, they hired Gram Parsons as a guitarist. McGuinn considered him a sideman for a planned one-disc history of American music. Parsons, though, had a passion for country music dating from when a friend played him a George Jones record years earlier. He found a kindred spirit in Chris Hillman, whose music career began by picking mandolin in various bluegrass outfits. Teaming up, Parsons and Hillman lobbied successfully for a serious country record.

The Byrds in 1968 (l-r: Kevin Kelley, Roger McGuinn, Gram Parsons, Chris Hillman)

Besides two Dylan tracks, this album includes one each by the Louvin Brothers, Merle Haggard, and Woody Guthrie. McGuinn and Hillman composed no original tracks for this album; despite being award-winning musicians, even at the Byrds’ height, they didn’t write much, trusting Clark and Crosby for original material. However, they include two Parsons compositions: “One Hundred Years From Now,” a concert barn-burner, and Parsons’ trademark song, “Hickory Wind.”

Parsons’ love of George Jones drives the sound. It has a rough-hewn fifties honky-tonk texture, but the rhythm section of Chris Hillman and Kevin Kelley gives the band a well-defined low end, much like the “Bakersfield Sound,” pioneered by Merle Haggard and Buck Owens, that was contemporaneously transforming country music. This complexity does borrow heavily from rock music, but bears little resemblance to later country rockers like Lynyrd Skynyrd or the Marshall Tucker Band.

The Byrds’ longstanding fans regarded this diversion as too extreme. Rock-and-rollers saw country music as reactionary music for white trash, while country fans considered the Byrds ignorable long-haired hippies. The album died on arrival. However, as often happens when something artistic breaks new ground, much of the reaction simply reflected discomfort with something too new and different. Once forgotten, this album now enjoys an influential cult following.

Besides core band members, this album includes contributions from several veteran Nashville session musicians, including Earl P. Ball, Jaydee Maness, and Clarence White. (White would later become a band member after Parsons quit.) This complex, layered sound gives the album a legitimate country music vibe. Ten to twenty years after this album’s release, its influence was clearly audible in most Nashville country music.

Despite Parsons’ influence, Columbia Records discovered at the eleventh hour that a contract with a prior band prevented them from using Parsons’ vocals. Roger McGuinn rushed into the studio and re-recorded six songs Parsons had already sung, mimicking Parsons’ style. On the Louvins’ “The Christian Life,” this sounds almost like parody, as if McGuinn held country music at arm’s length. By “Hickory Wind,” however, he’d found, and embraced, Parsons’ voice.

This album basically consigned the Byrds to permanent “cult” status; they never regained mainstream prominence. But it marked a seismic shift; less than ten years after its release, groups like the Ozark Mountain Daredevils and Pure Prairie League had Hot-100 hits, while Waylon Jennings covered rock gods like Neil Young. Two warring genres began collaborating, and it happened right here. Music would never be the same.