Monday, November 12, 2018

God's Not Dead, but the Church Might Be Sick

Promotional photo from the film God's Not Dead

I’ve never seen the movie God’s Not Dead or its sequels. I have no intention of ever doing so. I realize many people like these movies and draw hope from them, and I also recognize that they represent the leading edge of a Christian indie movie industry that’s massively lucrative: the first God’s Not Dead film grossed $62 million on a $2 million production budget. Yet, as a Christian myself, these films irk me endlessly.

And I’ll tell you why.

When my sister was seventeen and shopping for colleges, she narrowed her choices to two. She ultimately wound up attending the state university a few hours’ drive from home, partly because that’s what the family could afford. But she really had her heart set on attending Northwestern College in Orange City, Iowa. This private liberal arts college is institutionally affiliated with the Reformed Church in America and weaves Christianity throughout its curriculum.

My sister (whose permission I don’t actually have to retell this story) was massively excited by Northwestern’s explicitly Christian content. She loved that Orange City, itself a heavily Dutch Calvinist community, largely closed down on Sundays so everyone could attend church. The idea of surrounding herself with all Christianity, all the time, while pursuing her liberal arts credentials, got her downright giddy. She really wanted this entirely Christian education.

Her bank account, let’s say, didn’t.

When she discovered there was no chance whatsoever of affording private-college tuition, she dissolved into tears. I remember her standing in the family kitchen, weeping like her dog had died, half-screaming: “If I go to a state university they’ll tell me to go stand over in the corner and keep quiet and never say anything because I’m a Christian.” The mix of rage and grief in her outburst was palpable. So was the baloney.

Architectural drawing of the new Learning Commons at Northwestern College, Orange City, Iowa

I was more conservative then than I am today, certainly. But I’d also perused both schools’ catalogues, and knew the state university had religious options available. Granted, as a public institution, it couldn’t offer theological or seminary training, and it was too small to host a religious studies program. But it offered courses in biblical literature, the social science of religion, religious psychology, and more. And it hosted, though it obviously couldn’t sponsor, several on-campus ministries.

Yet for years, in our small-town congregation, we’d been barraged with messages about the inherent depravity of secular education. Well, all worldly institutions, really, but state-sponsored schooling was the nexus through which everybody passed, and therefore the first test of godless secularizing mind control the good Christian had to survive. Getting through high school, in many people’s understanding, meant walking through a purifying fire and emerging holy. Going back for more public education? I never.

Please understand, we weren’t part of some hyper-partisan fundamentalist sect. We attended the United Methodist Church, a centrist denomination that, among other things, tried (unsuccessfully) to censure Jeff Sessions for his conduct in office. Yet, as often happens in small communities, a lack of diversity meant people became more extreme and intolerant in their opinions. From politics and religion to loyalty to the Denver Broncos and country music, everyone generally became more rigid, not less.

But this moment forced my first meaningful break with my childhood certainties. (Childhood, yeah. I was 23.) Seeing my sister paralyzed with grief because she had to attend public university, like three-quarters of Americans and most future pastors, struck me as beyond odd, especially as she’d had a fairly successful campus tour. She’d internalized the popular narrative of modern Christian persecution. And in her mind, months in advance, it had already begun happening to her.

Asia Bibi, a woman who actually, literally fears for her life because of her Christianity

Please don’t misunderstand me. I know Christians in this world are still persecuted. As I write, Asia Bibi, a Pakistani Christian, has narrowly escaped a death sentence for “insulting the Prophet,” an accusation the high court admits was probably fabricated; she still lives under constant death threats. There are places in the world where Christians have to fear violence daily. America isn’t one of them. Having to attend a public university isn’t a human rights violation.

Yet a constant propaganda assault has convinced some people, unable to comprehend their own blinders, that that’s exactly what’s happening today. Mass media like God’s Not Dead convince believers to see conflicting views not as challenge and learning, but as attack and oppression. Years later, my sister doesn’t regret her state university education. But could she convince another small-town kid, raised on persecution media, that they won’t be silenced for their views? I don’t know.

Monday, November 5, 2018

That One Film! Starring That One Guy!

Guy Pearce in (left to right) Iron Man 3, Jack Irish, and The Adventures of Priscilla, Queen of the Desert

I’m so far behind in watching big-ticket superhero films that I didn’t realize five entire years had already passed since Iron Man 3 debuted. Honestly, Wonder Woman was the first superhero film I’d seen in cinemas since Iron Man 2. Or since, for that matter. So when it appeared in my On-Demand menu, I figured, what the hey, let’s get caught up. Except, once I pressed play, a whole other question occupied my mind:

Why do I recognize Aldrich Killian?

The fella looked so damned familiar that I focused my attention on trying to place him, to the exclusion of events onscreen. (I really should re-watch the film. Eventually.) Only during the closing credits did I realize: it’s Guy Pearce, a British-born Australian actor probably best-known to international audiences as Felicia, the flamboyant young drag performer in The Adventures of Priscilla, Queen of the Desert.

Despite appearing in high-profile productions like The Time Machine and Prometheus, Pearce hasn’t achieved the immediate recognition of his sometime castmate, Hugo Weaving. He has the liberty to dive into whatever projects he chooses, remaking himself each time, unlike Weaving, who has become so visible that he cannot enter a new production without carrying the aura of Agent Smith or Lord Elrond with him. To different degrees, each actor trails his past characters behind him.

I’ve witnessed this at various stages in today’s massively media-saturated culture. When, say, George Takei joined the TV series Heroes, back when anybody could still watch it, nobody could separate him from his role in Star Trek, and the series producers didn’t even try; they gave his limousine the license plate NCC-1701, the registry number of the starship Enterprise.

But as modern media has become increasingly ubiquitous, it’s also become increasingly homogenous. Production companies use tentpole franchises to purchase (or steal) audience loyalty, and as a result, they persistently overstretch a story, resulting in disasters like Peter Jackson’s ridiculously overlong Hobbit trilogy, or a Han Solo movie nobody really wanted. And actors, by sheer repetition, become associated with one character.

George Takei in Star Trek (left) and Heroes

Be honest. You can’t see Patrick Stewart without seeing Captain Picard or Professor Charles Xavier, possibly both superimposed in your mind simultaneously. Stewart has tried playing against type, in shows like Eleventh Hour and Blunt Talk, but audiences have refused to accept him, because they cannot separate his face from the extreme rectitude they associate with his characters.

Once actors become sufficiently high-profile, many complain about getting typecast, and only being offered variations on the same role. This works for some: it’s impossible to imagine Alec Guinness as Obi-Wan Kenobi without his earlier roles as men of great integrity, sometimes played straight, as in The Bridge on the River Kwai, sometimes subverted, as in his classic Ealing Studios comedies.

Other actors, however, suffer for this typecasting. Jack Nicholson played characters known for their unique moral codes in films like One Flew Over the Cuckoo’s Nest and The Missouri Breaks, but Stanley Kubrick forced him into an extreme version of that character for The Shining. Sadly, he never quite returned from that role, his performances becoming more and more exaggerated. On learning of Heath Ledger’s death after playing the Joker, Nicholson reportedly said: “Well, I warned him.”

Guy Pearce has in some ways avoided these extremes. Because he’s tough to recognize from role to role, he can accept divergent characters like the flamboyant Felicia or the villainous Aldrich Killian, while continuing recurring roles like the shabby but heroic and oversexed Jack Irish. He’s just famous enough for me to vaguely recognize him, but has liberty enough to try new roles without carrying old ones with him.

Speaking as somebody who’s done a bit of acting himself, I admire Pearce’s flexibility. I suspect most actors want that freedom. Many good actors start off that way: Johnny Depp famously spent fifteen years only taking roles he found challenging, before going broke and descending into the great tomb of ego that is Pirates of the Caribbean. Does Depp want that freedom back? He’s diplomatically circumspect, but I suspect yes.

Once an actor becomes sufficiently famous, I suspect he’s trapped by his roles. Whether it’s Nicholson’s feigned madness, Hugo Weaving’s relentless seriousness, or Adam Sandler’s unending cascade of dick jokes, actors become prisoners of whatever audiences liked in their last role. It’s tough to imagine Robert Downey Jr. without Iron Man anymore. Yet the fortunate few, like Guy Pearce, somehow manage to avoid that.

There’s something inspiring about actors retaining that freedom. One wonders whether it’s portable to other career fields too.

Friday, November 2, 2018

The Magic “I” and the World

1001 Books To Read Before Your Kindle Battery Dies, Part 92
Matthew Hutson, The 7 Laws of Magical Thinking: How Irrational Beliefs Keep Us Happy, Healthy, and Sane, and
Margaret Heffernan, Willful Blindness: Why We Ignore the Obvious At Our Peril

We’ve all seen it: people who see something that isn’t there, from attributing intentions to their puppy dog or the economy, to believing their domestic rituals and behaviors can change the outcomes of football games and wars. Or people who steadfastly cannot see something patently obvious, whether it’s the effect their behavior has on others, or the way their work has consequences down the line. People who see too much, or else see too little.

Serious psychologists have spent decades researching the ways ordinary people, with no malign intent, distort their perceptions of the world so it conforms with their needs. Two journalists, British-American multimedia specialist Margaret Heffernan and former Psychology Today contributor Matthew Hutson, have compiled that research into plain-English books for general audiences. The two volumes, published almost simultaneously, have complementary themes, and they reward being read together.

Heffernan unwinds long, sometimes dire tales of ways people permit themselves to selectively filter reality. Several of her most important stories involve a few recurrent examples: how a 2005 explosion at a Texas City oil refinery should’ve warned its parent company, BP, of systematic flaws, for instance, long before the Deepwater Horizon disaster—but didn’t. Or how W.R. Grace’s notorious vermiculite mine poisoned Libby, Montana, for decades.

Hutson’s approach is more episodic. He organizes the research on magical thinking into categories, then finds relevant examples to demonstrate principles drawn from research. This makes Hutson’s book more useful for dipping into when seeking evidence for later arguments, though it means he doesn’t give his themes the same opportunity to mature that Heffernan does. Hutson’s approach is more discrete; Heffernan’s is more holistic.

To give only limited examples: Hutson describes how belief in luck, a seemingly irrational precept, nevertheless gives people power. When individuals believe themselves naturally lucky, they feel emboldened to take risks, push limits, and innovate productively. Likewise, Heffernan describes how a culture of silence and self-censorship detonated BP, but protects intelligence officers at Britain’s Chatham House, headquartered just down the same London street.

These books appeared as the world economy recovered from the subprime mortgage disaster of 2007–2008. Not surprisingly, both books make generous use of examples from the financial services sector, though they stress these concepts are portable to the larger culture. Heffernan describes ways money mutes personal ethics, and how the “bystander effect” prevents anybody from saying anything. Hutson shows how belief in luck can both empower people to act and prevent them from thinking about their actions.

Margaret Heffernan (left) and Matthew Hutson

Perhaps the most important theme both books share is the absolute importance of thoughts we might automatically consider harmful. As Hutson describes how humans attribute meaning, personality, and even soul to the world, he also describes how people who don’t do this often struggle with routine activities. And Heffernan admits humans cannot possibly see and process everything; selective blindness is necessary to survive. Surprisingly, neither magical thinking nor willful blindness is completely wrong.

They are, however, ways people see the world. Both authors acknowledge their desire isn’t to abolish magical thinking or willful blindness; rather, they want us to understand how these forces influence us, and steer ourselves toward the most useful forms of thinking. In Heffernan’s words, she wants us to “see better.” These authors don’t want readers to abandon habits which, they admit, actually work under certain conditions. They want us to use these habits more consciously.

Of the two authors, Hutson intrudes more into his narrative. He admits his seven “laws” aren’t hard scientific principles like, say, Newton’s Three Laws; he deliberately imposes arbitrary but satisfying forms on a sprawling field of research. Heffernan lets her themes unveil themselves more gradually. Hutson, who admits to a material existentialist philosophy, also lets that philosophy visibly color some of his conclusions. Unlike Heffernan, he believes in only one right way to see.

So, um, he engages in some magical thinking, to which he’s willfully blind. Oops. Still, his facts withstand scrutiny.

Both authors’ use of real-world examples should hit close to home for readers. Though the exact circumstances that destroyed Countrywide Financial and started the Second Palestinian Intifada don’t currently exist, the context of laws, religions, and money that created those circumstances remains. So everything Heffernan and Hutson describe remains relevant to the world we live in. And every disaster they describe could recur.

René Descartes supposedly said: “We do not describe the world we see, we see the world we describe.” That quote encapsulates these two powerful books. We create our worlds afresh daily. Let’s make sure we create the best world we possibly can.

Thursday, November 1, 2018

Why Do We Hate The People Who Handle Our Food?

Promo still from the award-winning 2007 indie film Waitress

Why are food service workers paid so poorly? Everybody eats. Nearly everybody would rather eat well than eat some mush slapped mindlessly on a plate. Yet not only is food service among the most poorly paid work in America, it's among the least likely to offer paid sick days or comprehensive employer-provided health insurance. Meaning the likelihood you've been served steak with a side of pneumonia is high.

The Fight for $15 has banded together food service, healthcare, and childcare workers with other groups historically poorly paid, and provided a massive public voice for paying America’s most despised workers a better living wage. We’ve heard the moral arguments for why these groups deserve a better wage. But we, or anyway I, haven’t really heard the argument against paying these people better, except some vague mentions of supply-and-demand theory.

So I spent some time thinking furiously about the reasons we refuse food service workers better wages. It’s easy to dismiss some of the more commonly trotted-out counterclaims. The argument, for instance, that food service is a starter job for teenagers, runs immediately aground on the fact that we can order food at noon on weekdays, and at other times when teenagers aren’t legally allowed to work. Let’s discard those arguments immediately.

(Let’s also discard recent evidence that our economy holds doors open for certain teenagers while slamming them on others. That’s an entirely different ballgame.)

By keeping focus on the work itself, and the social context in which the work takes place, two arguments made themselves available to me. Then they fell apart. First, supply-and-demand tells us that very common jobs command low wages, and food service is one of America’s few growing industries since manufacturing collapsed. There are, in other words, too many food service jobs. Except those jobs are ubiquitous because demand is high, and high demand should push wages up, not down. So oopsie!

Where I live, I’ve seen entire communities which basically exist to sell food, gasoline, and hotel service to travelers on the highway. Since the agricultural economy has dwindled, and growing crops doesn’t bring money in like it once did, servicing the high-demand travel business remains the only wealth generator in much of the heartland. Classical economics would say this demand should make the supply very valuable, but in practice it doesn’t.

Edmonton, Alberta-based chef Serge Belair at work

If supply-and-demand doesn’t work, the other classical economic argument for poor pay is that food service is a low-skilled job. Skills are valuable, the argument goes, and therefore command higher wages. Except, as I’ve bounced among industries since college, I’ve observed that there’s no such thing as a “low-skilled job.” Some jobs demand skills workers bring with them, from machinists to physicians, and other jobs require skills learned while working. Food service is the latter.

So simple arguments don’t work, and classical economic arguments don’t work. I’m left with one approach: what moral attitudes do people have toward food service workers? Here I run into something more complex, and more disgusting.

On one hand, we glamourize food. We court potential spouses over food, especially food somebody else made, to showcase our ability to splurge. We clinch important business deals over food—and, if you’ve ever watched a “business lunch” in progress, you know that food mostly doesn’t get eaten. Most of our important religious and secular celebrations, like holidays and weddings, involve conspicuous overeating. We love showing off our food.

On the other hand, food is slightly icky. We enjoy eating certain kinds of food, like roasted meats or rich desserts in sauce, for their intense sensual pleasure. But we eat other foods, like starches and cruciferous vegetables, because we need the nutrients. We’ll even invent cooking methods, like drizzling them with sauce, to make them less nasty. We must eat these foods in order to survive, to stave off death.

Food memorializes human mortality. Eating reminds us how perilously close even the wealthiest and most successful among us remain to dying. Food forms a key link in a continuum: we eat food to survive, and convert it into poop. That poop becomes nutrients for future food. This reminder of mortality disgusts industrialized Westerners so much that we break the cycle, chemically treating our poop and dumping it outside the food chain.

Food service workers handle our future poop. They remind us, by simply existing, that we’re going to die, that we could die tonight. Our dependence on food service workers challenges the Western technological myth that we’ll eventually beat gravity, physicality, and death. Food is, in essence, gross. So we punish the people who provide food by denying them better wages. Because how dare they remind us we’re going to die.

Monday, October 29, 2018

What Can One Man Do About Workplace Racism?

Unknown individuals wave the flags of Honduras and Mexico above the caravan

“What else could I have done?” I heard myself screaming, desperate to be taken seriously. “What would you have done? I have to make a living!” I realized I was waving my hands in front of me, Bernie Sanders-like, scared and desperate. Because I knew I had been wrong, but I didn’t know what other choice I had.

I was talking with a friend, over Facebook remote video chat, about something that had happened at work the previous day. I had been one of four guys installing tornado-proof outdoor furniture at a bank branch in a central Nebraska town, a process that wasn’t difficult but was tedious and time-consuming. And as dudes will do when bored, we started chatting.

The topic turned to current events. I’ve learned not to broach politics at work unless someone else brings it up, and never to offer my opinions, no matter how well founded on facts and evidence, because doing so gets me in trouble. In Nebraska, and especially in blue-collar work, partisan allegiance isn’t a matter of discussion; it’s a matter of group identity. Dissent doesn’t mean you debate; it means you’re an outsider, or worse.

As the other three guys chatted about politics, and I kept my head down pretending to be selectively deaf, one guy asked another guy’s opinion about the caravan. As an estimated 7,000 mostly Honduran migrants walk slowly toward the United States, planning to claim political asylum at a port of entry, probably Del Rio or Eagle Pass, Texas, the caravan has become America’s hottest political dividing line. This isn’t accidental.

“I don’t know much about this caravan,” one guy said, while the other two nodded like Solomon. “But I know, when you have a stampeding herd, you shoot one or two animals at the front, the rest of the herd will scatter.”

That’s what had me screaming down a Facebook video at my friend the next day. She insisted I had a moral obligation to speak against such dehumanizing language. I said I couldn’t, because when I’ve tried before, the blowback has been too vicious, and I’ve found myself ostracized for days, in a job where communication is paramount.

“You should’ve gotten the boss,” she said. “He has an obligation, by law, to provide a workplace free of that kind of hostility and discrimination.”

“I apparently haven’t made myself clear,” I replied. “That was the boss. That was the site supervisor.”

The caravan passes from Guatemala into southern Mexico

This is something I’ve encountered repeatedly since taking a construction job over three years ago. Racism is widespread in this business. I’ve been forced multiple times to swallow my objections while other workers, including my supervisors, have stood around running down Blacks, “Mexicans,” and other groups. Racist bullshitting is basically a form of group bonding.

This leaves me conflicted. I know keeping silent serves to empower the oppressors in our society. Jim Crow laws were only overturned when White people joined with Black people to call injustice unjust. When White people previously swallowed their objections, going along to get along, racists felt empowered to make laws even more unjust.

But, as I told my friend, I have to make a living. Construction isn’t a job anybody can do in isolation. If nobody will talk to me, if I find myself ostracized for speaking against markers of group identity, like most workers’ shared conservatism, I can’t do my job. So sometimes I do what I know is wrong: I keep quiet, and let people say things I consider morally odious.

This carries extreme risks, though. When people only speak to other people who already share their views, they tend to emerge from such discussions with more extreme versions of the views they already have. Psychologists call this phenomenon “group polarization,” though the military has a superior term for it: “incestuous amplification.”

I really feel I witnessed incestuous amplification happening last Friday at work. As otherwise good, hard-working White people stood around furiously agreeing with one another, their views became more extreme before my eyes. A guy whose expertise I respected suddenly compared brown people to herd animals, and suggested shooting them for doing something perfectly legal.

Now I have to return to work Monday. I have to return knowing I’ll hear language that bad again, or worse; knowing co-workers have Alex Jones InfoWars and “America For Americans” bumper stickers on the trucks where they carry their tools, knowing White people regularly write racist graffiti in shared outhouses. And I don’t know what to do if this happens again. Because it will.