Thursday, November 29, 2018

The American Message Machine



Hank Williams, Jr., released a song in the late 1980s entitled “USA Today.” He never released it as a single, and it never charted, but it nevertheless got some limited radio airplay at a time when his Lynyrd Skynyrd-lite sound dominated country music airwaves. I remember hearing it distinctly: the verses laid out a laundry list of supposed American cultural failings, not unlike his oft-repeated hit “A Country Boy Can Survive.” Then he launched into the chorus:
It's true we've got our problems, Lord knows we make mistakes
And every time we solve one, ten others take its place
But you won't see those refugees, headin' the other way
Welcome, to the U.S.A. today
I couldn’t help remembering these words last week, when news trickled in that American forces massed along our southern border had fired tear gas canisters into Mexico to disperse refugees, mostly from Honduras. This struck me as wrong on multiple levels. Since when does America keep standing troops along our land borders? Since when do we fire CBRN ordnance across the border into non-combatant nations?

And since when do we want to discourage refugees?

I’ve written before that I grew up conservative, surrounded by the right-wing values of family, church, and country music. We believed that government derived from moral authority, and therefore had moral authority itself. We believed that full-time work deserved a living wage. And we believed that the continued movement of refugees into America meant we were doing something right, that other nations were doing wrong.

This isn’t an exaggeration or metaphor. We believed that people coming into America to escape the hardships and terrors of life in their homelands meant America was uniquely blessed to handle complicated world situations like diplomacy and war. Yes, America had, for instance, been soundly drubbed in Vietnam. But the continued existence of Vietnamese boat people after the war proved we’d won the peace. Anyway, that’s what we told ourselves.

Equally important, it meant we knew the world was watching us, and we took pride in what they saw. We held ourselves up as paragons of moral virtue, economic justice, and opportunity. People came to America for various reasons, such as religious liberty, freedom from ethnic persecution, or simply to get a job. And we congratulated ourselves for being the one nation where all peoples and creeds believed they could find these things.

It didn’t hurt that the Cold War was happening. Professor Ibram X. Kendi, in his award-winning history Stamped From the Beginning, notes that when John Kennedy and Lyndon Johnson threw their support behind civil rights in the 1960s, they didn’t do so because it was right. Johnson in particular had previously been aggressively anti-civil-rights, until he did an abrupt 180. And he did so because he knew the Soviet Bloc was watching.


America’s victory in the Cold War was a PR maneuver as much as anything else. We showed our ability to juggle economic liberty, military supremacy, cultural diversity, and a willingness to change when we were wrong, which the Soviet Bloc just couldn’t. Peoples throughout the world witnessed America’s dexterity, decided they wanted to be like us, and fled to our shores, calling for asylum. And, for a while, we willingly gave it.

Our current government came to power through promises of restored greatness and national identity. Exactly when this greatness happened, or what it looked like, we couldn’t say, but we knew America loomed over the world somewhere in history, and we yearned to recapture that moment. Yet this government has forgotten that greatness stemmed not just from some in-group identity, but from the message we showed the world.

That message was that our actions followed our words. If we wanted European powers to divest their African and Asian colonies, and grant liberty to refugees fleeing to European shores, we had to do likewise. During the Reagan years we took pride (often grudgingly, but still) in accepting refugees from America’s involvement in countries like Vietnam, Guatemala, and Iran. We even lauded some refugees as heroes of bedrock American values.

It hurts to say that global peoples still witness America, and know exactly what messages we carry. We tout economic liberty, but cut our poorest loose. We launch missile strikes on Syria, then tell Syrians to go back home and apply for refuge at the embassy. Our PR machine is broken. We need to start caring what message we’re selling the watching world—and ourselves.

Tuesday, November 27, 2018

The Force That Was Jimi Hendrix

1001 Movies To Watch Before Your Netflix Subscription Dies, Part 29
Joe Boyd, John Head, & Gary Weis (directors), Jimi Hendrix


It’s difficult to believe, given his outsized influence in multiple musical genres, that Jimi Hendrix’s mainstream success lasted only four years. He went from the blustering rebel smashing his guitar at Monterey Pop, to a solemn icon playing with quiet intensity at the Winterland Ballroom, to an early grave, faster than some artists today produce their second album. He left behind a lifetime’s output, and he changed lives along the way.

Filmmakers Boyd, Head, and Weis began creating this rockumentary shortly after Hendrix’s abrupt death. They weave original and vintage interviews together with concert footage to eulogize an artist whose passing was still recent. The fresh hurt on many interviewees’ faces is palpable. But the music remains powerful; the licks from tracks like “Machine Gun” and “Purple Haze” still influence musicians today.

This documentary begins with the assumption you know who Jimi Hendrix is. It launches with concert footage of Hendrix deliberately overwhelming his speaker stacks, getting the well-modulated pop and whine that made his guitar work groundbreaking. (Hendrix didn’t invent feedback, Ike Turner did, but Hendrix exploited it for everything it was worth.) Then, when you’re good and rocked out, it shifts into serious journalism mode.

Interview subjects include people who knew Hendrix personally, before his early fame, and also musicians, mostly British, who felt his impact on their careers. Little Richard, in whose band Hendrix paid dues; Pete Townshend, who viewed Hendrix as a rival at Monterey; Eric Clapton, who started off seeing Hendrix as somebody who stole his licks, then became an ardent admirer. Hendrix clearly changed every life he touched.

But though it’d be easy and cost-effective to interview only well-known musicians, these filmmakers throw the net wider. They interview people Hendrix knew and worked with, people who loved him before he became famous, people who guided his musical sound as he guided theirs. Childhood friends, Army platoon mates, influential DJs, old lovers, bandmates. Diverse voices combine to tell Hendrix’s story in ways he, being absent, cannot.

Jimi Hendrix sets fire to his already-smashed guitar at Monterey Pop
Al Hendrix, Jimi’s father, provides important context for Jimi’s struggling childhood. He discusses how Jimi, a desperately shy child, found identity through playing guitar; if ordered to clean his room, young Jimi would strum his broom, Al says. But Al, a notorious alcoholic with a temper, also struggles through his narration, his words audibly slurring. It’s difficult not to wonder how reliable a narrator poor Al actually is.

Our filmmakers match their interviews with well-chosen concert footage. When Pete Townshend talks about his battle with Hendrix over playing order at Monterey Pop, the camera jumps directly to Hendrix’s legendary performance, where he played “Wild Thing,” culminating in smashing his guitar. Early girlfriend Fayne Pridgeon talks about almost getting evicted over Jimi playing Bob Dylan at full volume, leading into “Like a Rolling Stone” at the Isle of Wight.

And what concert footage! Hendrix remains famous for early, high-energy recordings like “Wild Thing,” but that represents only a fraction of his output. His performance of “Machine Gun” at Winterland, in which he mostly stands still and plays with understated gravity, is downright entrancing. And a vintage jam where he plays “Hear My Train a’Coming,” a concert staple he never recorded to his satisfaction, on an enormous acoustic twelve-string, brings chills.

Ironically, Hendrix himself probably wouldn’t have approved this eulogy. Archival interviews show his disdain for Q&A repartee. On a panel with Dick Cavett, he subverts every question Cavett asks, leaving the interviewer stumped. Another anonymous off-camera interviewer questions whether smashing his guitar is a “gimmick”; Hendrix disparages the idea of gimmicks altogether, and says destroying a guitar is neither better nor worse than destroying a Vietnamese village with napalm.

This film substantially reflects its time. Released in late 1973, just months after Operation Homecoming basically ended the cultural moment we call the 1960s, its ethic was already outdated upon debut. Interviews with Buddy Miles, Noel Redding, and multimedia pioneers Arthur and Albert Allen, bespeak a street-fighting attitude that probably made sense one year earlier. Various attempts at hippie-era pop philosophy reflect how the Woodstock era was already dying.

Yet it also reflects how much Hendrix himself breathed life into that period. Some of his live performances and interviews included here were recorded mere weeks before his sudden passing in 1970. Though Hendrix’s survival wouldn’t have prevented the 1960s ending, it’s tempting to wonder exactly how culture might have changed. But, like a supernova, the light of Hendrix’s burning continues shining long after the source itself has burned out.

Tuesday, November 20, 2018

Building an Economy From the Soil Up

1001 Books To Read Before Your Kindle Battery Dies, Part 94
Wendell Berry, What Matters? Economics for a Renewed Commonwealth

What would a fair and just economy look like? This isn’t a new question. It isn’t even new since the Great Recession, when reckless speculation proved much of American economics was founded on air. People of wisdom and learning have asked that question since at least Adam Smith and Karl Marx, and come no closer to an answer that satisfies everyone. Poet and farmer Wendell Berry suggests we’ve been looking in the wrong direction.

Berry, who has worked the same stretch of Kentucky highland his entire life, grounds his economy in judicious management of resources; and for him, the foremost resource is land. His use of “land” broadly encompasses water and air, forests and pastures, which humans must manage, not merely use. Humans arise from land, and humans create money; any economy that places money first inverts, and thus destroys, the natural order.

America, and the world generally, has fallen under the sway of “autistic industrialism,” in Berry’s words, a laser-focused belief that man-made technologies will solve everything. This finds its apotheosis in a financial services industry that sees its dollar-sign output as superior to whatever it places a price on. And it works exclusively by creating ever-increasing demands: Berry writes, “Finance, as opposed to economy, is always ready and eager to confuse wants and needs.”

Likewise, this economic model has concentrated land-use decisions in the hands of putative experts who don’t work the land. Chemical companies, government bureaucrats, and absentee landlords uniformly overrule the hard-won experience of people who once owned and husbanded the land they worked. This creates a vast gulf between who receives the short-term benefits of land exploitation, and who pays the long-term price.

Wendell Berry
This book comprises two sections. The first, shorter section collects five essays and speeches Berry delivered in the immediate wake of the 2008 financial services disaster. His primary target, however, isn’t bankers; it’s a culture-wide malaise that systemically mistakes money for value. This produces a breakdown that isn’t merely industrial, but moral: “We tolerate fabulous capitalists who think a bet on a debt is an asset.”

Berry’s second section gathers older essays, dated from 1985 to 2000, about the intimate connection he sees between American economic values and our respect for the land. He witnesses how agricultural mismanagement has resulted in massive topsoil erosion; how mineral mismanagement has strip-mined coal, leaving behind hollow mountains and toxic rivers; how community mismanagement has separated individuals from the neighbors they live among.

These aren’t incidental losses, either. We cannot replace lost topsoil by dumping new dirt and saturating it with petroleum-based fertilizers, because topsoil is a complex relation of earth, organisms, and organic matter that we don’t really understand. But those who profit from mismanagement care little for understanding, Berry writes: “The advocates and suppliers of agri-industrial technologies have encouraged us to think of agriculture as an enterprise occurring on top of the ground.”

Berry admits that revising America’s economy to reflect the values it has already consumed won’t be easy. In some places he concedes he doesn’t know how economists could implement his vision. In others he offers some insight, but acknowledges the needed changes are drastic. They mostly involve changes in ownership and allotment: local businesses, local people, and smallholding farmers who own the land they manage.

This necessarily involves placing value on the invaluable. Currently, goods like family and community, or the preservation of heirloom skills, have no economic value, because they carry no GDP price. But we’re now witnessing the long-term consequences of their loss, in devaluation of land, business, and family; in rising health costs for people worked like machines; in food substantially lacking nutritional value.

This book doesn’t involve much factual data-wrangling; Berry’s interest lies in philosophical foundations, not actuarial spreadsheets. However, I cannot ignore statistical evidence demonstrating his accuracy. Journalist George Pyle has written that postage-stamp-sized farms pay their own bills, avoid chemical runoff, and remain in families better than massive, 10,000-acre industrial farms. Small, distributed agricultural economies have measurable value, but only when we measure in the long term.

Berry’s essentially Distributist model would seem familiar to G.K. Chesterton, whose motto, “three acres and a cow,” makes sense here. (“Forty acres and a mule” wouldn’t be inappropriate.) To Berry, like Chesterton, economic justice requires humans to directly control, and make decisions about, their living, and both philosophers see such control beginning with how we manage land. The ripples, though, extend throughout the economy.

Thursday, November 15, 2018

When the Fires of Anger Burn Out



I had intended to write today about the notorious Baraboo, Wisconsin, high school photo, which shows nearly the entire male class of 2019 flashing a straight-armed Nazi salute. I wanted to discuss how teenagers think asshole behavior is funny, and lack the breadth of view to understand their actions in historical context. I wanted to say we could demonize these children, possibly for the rest of their lives, or have a classic teachable moment and maybe improve society overall.

Then something happened I didn’t expect: America forgot the story.

Seriously. Two days ago, you couldn’t move on Blue Facebook or Blue Twitter without tripping over a half-dozen reposts of the photo, usually with captions of undiluted rage. How could anybody, especially in a post-Charlottesville environment, think such behavior was funny? Then it stopped. As I write this, another story hasn’t erupted sufficiently to really replace Baraboo as headline fodder, yet somehow, this story just petered out.

What happened? The story didn’t go anywhere. Baraboo police still promise a thorough investigation, which I applaud, since it clearly means all real violent crime in Wisconsin has been resolved. (C’mon, is this really the best use of police resources?) A Google news search reveals new details dribbling out about the story, mostly in regional outlets. But in the collective memory emblazoned on social media feeds, this story has already found a quiet corner to die.

Has our communal attention span become so brief that stories burn this quickly and vanish? Yes. This isn’t news to me. I’ve written before about how our capacity for outrage has become not only short-lived, but weirdly selective. We burn white-hot with indignation, spew largely crinkum-crankum bile in public places, and exhaust ourselves within, apparently, minutes. We lack the capacity to keep stories burning low and constant long enough to actually do anything.

This causes further problems because it depletes our common capacity to keep real stories alive. While Baraboo sucked all the oxygen from the room, mainstream media apparently forgot, say, Jamal Khashoggi, whose murder, seemingly ordered at the highest levels by one of America’s biggest trading partners, demonstrates just how dangerous the truth-telling industry remains in today’s world. Yet that developing story has become page-eight filler news.

(Yes, that’s a newspaper reference. I’m old.)

Yet perhaps that’s the problem. The American President has openly daydreamed about physically attacking journalists, called them “the enemy of the people,” and lavished praise on politicians who enact his violent fantasies. While the worst his administration has directly done is yank a journalist’s credentials over bullshit accusations, the implication nevertheless remains: we retain the option of capping your ass, Khashoggi-style, if you step out of line. We can make you wish you were dead.

This photo, by contrast, is safe. Everyone from MSNBC to Breitbart has covered this story, because they know moral outrage sells, but also because the story is low-risk. Disparaging American Nazis has been low-hanging fruit at least since The Blues Brothers used them as doofus villains in 1980. Media know they can gin up outrage by thrusting stock villains under our noses, which accomplishes what they really want: to sell advertising space to the highest bidder.

And Baraboo itself serves a bicoastal narrative. A small town in a small state that few outsiders ever visit, Baraboo, Wisconsin, has under one-tenth the population of Manhattan’s Washington Heights neighborhood, and is primarily famous as the former headquarters of the defunct Ringling Brothers organization. National corporate media waving around the youth of a small city in an outlying province of “flyover country” reminds coastal city-dwellers of their unique privilege and its accompanying noblesse oblige.

I get frustrated when people tell me certain stories are mere smokescreens. Two years ago, it chapped my ass when pundits told me not to worry my pretty little head about the President-Elect’s theatre tweets. Yet that exact response is appropriate now, because the widespread popularity of the Baraboo story right now, when journalists are dying for telling the truth and an accused rapist sits on the Supreme Court, is the epitome of a smokescreen.

If social media means anything, we politically engaged citizens must start being more discerning in what stories merit our scrutiny. Truth isn’t criterion enough anymore. Humans’ ability to pay attention is finite, and if we expend it on low-stakes stories that burn out overnight, taking real journalism with them, the people who dominate our culture will eventually be able to get away with murder. Which, if current events are any guide, isn’t a mere metaphor.

Monday, November 12, 2018

God's Not Dead, but the Church Might Be Sick

Promotional photo from the film God's Not Dead

I’ve never seen the movie God’s Not Dead or its sequels. I have no intention of ever doing so. I realize many people like these movies and draw hope from them, and I also recognize that they represent the leading edge of a Christian indie movie industry that’s massively lucrative: the first God’s Not Dead film grossed $62 million on a $2 million production budget. Yet, as a Christian myself, these films irk me endlessly.

And I’ll tell you why.

When my sister was seventeen and shopping for colleges, she settled on two. She ultimately wound up attending the state university a few hours’ drive from home, partly because that’s what the family could afford. But she really had her heart set on attending Northwestern College of Orange City, Iowa. This private liberal arts college is institutionally affiliated with the Reformed Church in America and builds Christianity into its curriculum.

My sister (whose permission I don’t actually have to retell this story) was massively excited by Northwestern’s explicitly Christian content. She loved that Orange City, itself a heavily Dutch Calvinist community, largely closed down on Sundays so everyone could attend church. The idea of surrounding herself with all Christianity, all the time, while pursuing her liberal arts credentials, got her downright giddy. She really wanted this entirely Christian education.

Her bank account, let’s say, didn’t.

When she discovered there was no chance whatsoever of affording private-college tuition, she dissolved into tears. I remember her standing in the family kitchen, weeping like her dog had died, half-screaming: “If I go to a state university they’ll tell me to go stand over in the corner and keep quiet and never say anything because I’m a Christian.” The mix of rage and grief in her outburst was palpable. So was the baloney.

Architectural drawing of the new Learning Commons at
Northwestern College, Orange City, Iowa

I was more conservative then than I am today, certainly. But I’d also perused both schools’ catalogues, and knew the state university had religious options available. Certainly, as a public institution, it couldn’t offer theological or seminary training, and was too small to host a religious studies program. But it had courses in biblical literature, the social science of religion, religious psychology, and more. And it hosted, though it obviously couldn’t sponsor, several on-campus ministries.

Yet for years, in our small-town congregation, we’d gotten barraged with messages about the inherent depravity of secular education. Well, all worldly institutions, really, but state-sponsored schooling was the nexus through which everybody passed, and therefore, the first test of godless secularizing mind control the good Christian had to withstand. Getting through high school, in many people’s understanding, meant walking through a purifying fire and emerging holy. Going back for more public education? I never.

Please understand, we weren’t part of some hyper-partisan fundamentalist sect. We attended the United Methodist Church, a centrist denomination that, among other things, tried (unsuccessfully) to censure Jeff Sessions for his conduct in office. Yet, as often happens in small communities, a lack of diversity meant people became more extreme and intolerant in their opinions. From politics and religion to loyalty to the Denver Broncos and country music, everyone generally became more rigid, not less.

But this moment forced my first meaningful break with my childhood certainties. (Childhood, yeah. I was 23.) Seeing my sister paralyzed with grief because she had to attend public university, like three-quarters of Americans and most future pastors, struck me as beyond odd, especially as she’d had a fairly successful campus tour. She’d internalized the popular narrative of modern Christian persecution. And in her mind, months in advance, it had already begun happening to her.

Asia Bibi, a woman who actually, literally fears for her life because of her Christianity

Please don’t misunderstand me. I know Christians in this world are still persecuted. As I write, Asia Bibi, a Pakistani Christian, has narrowly evaded the death sentence for “insulting the Prophet,” an accusation the high court admits is probably fake; she still lives with constant death threats. There are places in the world where Christians have to fear violence daily. America isn’t one of them. Having to attend public education isn’t a human rights violation.

Yet a constant propaganda assault has convinced some people, unable to comprehend their own blinders, that that’s exactly what’s happening today. Mass media like God’s Not Dead convince believers to see conflicting views not as challenge and learning, but as attack and oppression. Years later, my sister doesn’t regret her state university education. But could she convince another small-town kid, raised on persecution media, that she won’t be silenced for her views? I don’t know.

Monday, November 5, 2018

That One Film! Starring That One Guy!

Guy Pearce in (left to right) Iron Man 3, Jack Irish, and The Adventures of Priscilla, Queen of the Desert

I’m so far behind in watching big-ticket superhero films that I didn’t realize five entire years had already passed since Iron Man 3 debuted. Honestly, Wonder Woman was the first superhero film I’d seen in cinemas since Iron Man 2. Or since. So when it appeared in my On-Demand menu, I figured, what the hey, let’s get caught up. Except, once I pressed play, a whole other question occupied my mind:

Why do I recognize Aldrich Killian?

The fella looked so damned familiar that I focused my attention on trying to place him, to the exclusion of events onscreen. (I really should re-watch the film. Eventually.) Only during the closing credits did I realize: it’s Guy Pearce, a British-born Australian actor probably best known to international audiences as Felicia, the flamboyant young drag performer in The Adventures of Priscilla, Queen of the Desert.

Despite appearing in high-profile productions like The Time Machine and Prometheus, Pearce hasn’t achieved the immediate recognition of his sometime castmate, Hugo Weaving. He has liberty to dive into projects he chooses, remaking himself each time, unlike Weaving, who has become so visible that he cannot enter new productions without carrying the aura of Agent Smith or Lord Elrond behind him. To different degrees, each actor carries his past characters with him.

I’ve witnessed this at various stages in today’s massively media-saturated culture. When, say, George Takei entered the TV series Heroes, back when anybody could still watch it, nobody could separate him from his role in Star Trek, and the series producers didn’t even try: they gave his limousine the license plate number NCC-1701, the registry number of the starship Enterprise.

But as modern media has become increasingly ubiquitous, it’s also become increasingly homogenous. Production companies use tentpole franchises to purchase (or steal) audience loyalty, and as a result, they persistently overstretch a story, resulting in disasters like Peter Jackson’s ridiculously overlong Hobbit trilogy, or a Han Solo movie nobody really wanted. And actors, by sheer repetition, become associated with one character.

George Takei in Star Trek (left) and Heroes

Be honest. You can’t see Patrick Stewart without seeing Captain Picard and/or Professor Charles Xavier, possibly both superimposed in your mind simultaneously. Stewart has tried playing against that role, in shows like Eleventh Hour and Blunt Talk, but audiences have refused to accept him, because they cannot separate his face from the extreme rectitude they associate with his characters.

Once actors become sufficiently high-profile, many complain about getting typecast, and only being offered variations on the same role. This works for some: it’s impossible to imagine Alec Guinness as Obi-Wan Kenobi without his roles as men of great integrity, sometimes played straight, as in The Bridge on the River Kwai, or subverted, as in his classic Ealing Studios comedies.

Other actors, however, suffer for this typecasting. Jack Nicholson played characters known for their unique moral code in films like One Flew Over the Cuckoo’s Nest or The Missouri Breaks, but Stanley Kubrick forced him into an extreme version of that character for The Shining. And sadly, he never returned from that, becoming more and more exaggerated. On learning of Heath Ledger’s death after playing the Joker, Nicholson reportedly said: “Well, I warned him.”

Guy Pearce has in some ways avoided these extremes. Because he’s tough to recognize from role to role, he can accept divergent characters like the flamboyant Felicia or villainous Aldrich Killian, without interrupting ongoing roles like the shabby but heroic and oversexed Jack Irish. He’s just famous enough for me to vaguely recognize him, but has liberty enough to try new roles without carrying old ones with him.

Speaking as somebody who’s done a bit of acting himself, I admire Pearce’s flexibility. I suspect most actors want that freedom. Many good actors start off that way: Johnny Depp famously spent fifteen years only taking roles he found challenging, before going broke and descending into the great tomb of ego that is Pirates of the Caribbean. Does Depp want that freedom back? He’s diplomatically circumspect, but I suspect yes.

Once an actor becomes sufficiently famous, I suspect he’s trapped by his roles. Whether it’s Nicholson’s feigned madness, Hugo Weaving’s relentless seriousness, or Adam Sandler’s unending cascade of dick jokes, actors become prisoners of whatever people liked in the last role. It’s tough to imagine Robert Downey, Jr, without Iron Man anymore. Yet the fortunate few, like Guy Pearce, somehow manage to avoid that.

There’s something inspiring about actors retaining that freedom. One wonders whether it’s portable to other career fields too.

Friday, November 2, 2018

The Magic “I” and the World

1001 Books To Read Before Your Kindle Battery Dies, Part 93
Matthew Hutson, The 7 Laws of Magical Thinking: How Irrational Beliefs Keep Us Happy, Healthy, and Sane, and
Margaret Heffernan, Willful Blindness: Why We Ignore the Obvious At Our Peril

We’ve all seen it: people who see something that isn’t there, from attributing intentions to their puppy dog or the economy, to believing their domestic rituals and behaviors can change the outcomes of football games and wars. Or people who steadfastly cannot see something patently obvious, whether it’s the effect their behavior has on others, or the way their work has consequences down the line. People who see too much, or else see too little.

Serious psychologists have spent decades researching the ways ordinary people, with no malign intent, distort their perceptions of the world so it conforms with their needs. Two journalists, British-American multimedia specialist Margaret Heffernan and former Psychology Today contributor Matthew Hutson, have compiled that research into plain-English books for general audiences. The two volumes, published almost simultaneously, have complementary themes which reward being read together.

Heffernan unwinds long, sometimes dire tales of ways people permit themselves to selectively filter reality. Several of her most important stories involve a few recurrent examples: how, for instance, a 2005 explosion at a Texas City oil refinery should’ve warned its parent company, BP, of systemic flaws long before the Deepwater Horizon disaster—but didn’t. Or how W.R. Grace’s notorious vermiculite mine poisoned Libby, Montana, for decades.

Hutson’s approach is more episodic. He organizes the research on magical thinking into categories, then finds relevant examples to demonstrate the principles involved. This makes Hutson’s book more useful for dipping into when seeking evidence in later arguments, though it means he doesn’t give themes the same opportunity to mature that Heffernan does. Hutson’s approach is more discrete; Heffernan’s is more holistic.

To give just two examples: Hutson describes how belief in luck, a seemingly irrational precept, nevertheless gives people power. When individuals believe themselves naturally lucky, they feel emboldened to take risks, push limits, and innovate productively. Likewise, Heffernan describes how a culture of silence and self-censorship detonated BP, but protects intelligence officers at Britain’s Chatham House, headquartered just down the same London street.

These books appeared as the world economy recovered from the subprime mortgage disaster of 2007-2008. Not surprisingly, both books make generous use of examples from the financial services sector, though they stress these concepts are portable to the larger culture. Heffernan describes ways money mutes personal ethics, and how the “bystander effect” prevents anybody from saying anything. Hutson shows how belief in luck can empower people to act even as it prevents them from examining their actions.

Margaret Heffernan (left) and Matthew Hutson

Perhaps the most important theme both books share is the absolute importance of thoughts we might automatically consider harmful. As Hutson describes how humans attribute meaning, personality, and even soul to the world, he also describes how people who don’t do this often struggle with routine activities. And Heffernan admits humans cannot possibly see and process everything; selective blindness is necessary to survive. Surprisingly, neither magical thinking nor willful blindness is completely wrong.

They are, however, ways people see the world. Both authors acknowledge their desire isn’t to abolish magical thinking or willful blindness; rather, they want us to understand how these forces influence us, and steer ourselves toward the most useful forms of thinking. In Heffernan’s words, she wants us to “see better.” These authors don’t want readers to abandon habits which, they admit, actually work under certain conditions. They want us to use these habits more consciously.

Between the two authors, Hutson intrudes more into his narrative. He admits his seven “laws” aren’t hard scientific principles like, say, Newton’s three laws of motion; he deliberately imposes arbitrary but satisfying forms on a sprawling field of research. Heffernan lets her themes unveil themselves more gradually. Hutson, who admits to a materialist existentialist philosophy, also lets that philosophy visibly color some of his conclusions. Unlike Heffernan, he believes in only one right way to see.

So, um, he engages in some magical thinking, to which he’s willfully blind. Oops. Still, his facts withstand scrutiny.

Both authors’ use of real-world examples should hit readers close to home. Though the exact circumstances that destroyed Countrywide Financial and started the Second Palestinian Intifada don’t currently exist, the context of laws, religions, and money that created those circumstances remains. So everything Heffernan and Hutson describe remains relevant to the world we live in. And every disaster they describe could recur.

René Descartes supposedly said: “We do not describe the world we see, we see the world we describe.” That quote encapsulates these two powerful books. We create our worlds afresh daily. Let’s make sure we create the best world we possibly can.

Thursday, November 1, 2018

Why Do We Hate The People Who Handle Our Food?

Promo still from the award-winning 2007 indie film Waitress

Why are food service workers paid so poorly? Everybody eats. Nearly everybody would rather eat well than eat some mush slapped mindlessly on a plate. Yet not only is food service among the most poorly paid work in America, it's among the least likely to offer paid sick days and comprehensive health insurance from one's employer. Meaning the likelihood you've been served steak with a side of pneumonia is high.

The Fight for $15 has banded food service, healthcare, and childcare workers together with other historically underpaid groups, and provided a massive public voice for paying America’s most despised workers a living wage. We’ve heard the moral arguments for why these groups deserve a better wage. But we, or anyway I, haven’t really heard the argument against paying these people better, except some vague mentions of supply-and-demand theory.

So I spent some time thinking furiously about the reasons we refuse food service workers better wages. It’s easy to dismiss some of the more commonly trotted-out counterclaims. The argument, for instance, that food service is a starter job for teenagers, runs immediately aground on the fact that we can order food at noon on weekdays, and at other times when teenagers aren’t legally allowed to work. Let’s discard those arguments immediately.

(Let’s also discard recent evidence that our economy holds doors open for certain teenagers while slamming them on others. That’s an entirely different ballgame.)

By keeping focus on the work itself, and the social context in which the work takes place, two arguments made themselves available to me. Then they fell apart. First, supply-and-demand theory tells us that oversupplied jobs pay poorly, and food service is one of America’s few growing industries since manufacturing collapsed. There are too many food service jobs. Except those jobs are ubiquitous because demand is high, so oopsie!

Where I live, I’ve seen entire communities which basically exist to sell food, gasoline, and hotel service to travelers on the highway. Since the agricultural economy has dwindled, and growing crops doesn’t bring money in like it once did, servicing the very high-demand traveling business remains the only wealth generator in much of the heartland. Classical economics would say this demand should make the supply very valuable, but here the theory fails.

Edmonton, Alberta-based chef Serge Belair at work

If supply-and-demand doesn’t work, the other classical economic argument for poor pay is that food service is a low-skilled job. Skills are valuable, and valuable skills command wages. Except, as I’ve bounced among industries since college, I’ve observed that there’s no such thing as a “low-skilled job.” Some jobs demand skills workers bring with them, from machinists to physicians, and other jobs require skills learned while working. Food service is the latter.

So simple arguments don’t work, and classical economic arguments don’t work. I’m left with one approach: what moral attitudes do people have toward food service workers? Here I run into something more complex, and more disgusting.

On one hand, we glamourize food. We court potential future spouses over food, especially food somebody else made, to showcase our ability to splurge. We clinch important business deals over food—and, if you’ve ever watched a “business lunch” in progress, you know that food mostly doesn’t get eaten. Most of our important religious and secular celebrations, like holidays and weddings, involve conspicuous overeating. We love showing off our food.

On the other hand, food is slightly icky. We enjoy eating certain kinds of food, like roasted meats or rich desserts in sauce, for their intense sensual pleasure. But we eat other foods, like starches and cruciferous vegetables, because we need the nutrients. We’ll even invent cooking methods, like drizzling them with sauce, to make them less nasty. We must eat these foods in order to survive, to withstand death.

Food memorializes human mortality. Eating reminds us how perilously close even the wealthiest and most successful among us remain to dying. Food forms a key link in a continuum: we eat food to survive, and convert it into poop. That poop becomes nutrients for future food. This reminder of mortality disgusts industrialized Westerners so much, we break the cycle, chemically treating our poop and dumping it outside the food chain.

Food service workers handle our future poop. They remind us, by simply existing, that we’re going to die, that we could die tonight. Our dependence on food service workers challenges the Western technological myth that we’ll eventually beat gravity, physicality, and death. Food is, in essence, gross. So we punish the people who provide food by denying them better wages. Because how dare they remind us we’re going to die.