Saturday, August 19, 2017

Should Free Society Limit Free Speech?

Charlottesville protester Peter Cvjetanovic claims this photo doesn't represent his real
personality. But he's chanting Nazi slogans which inherently threaten violence against
others based on their demographic group. Which is the Fundamental Attribution Error,
though we'll talk about that later.
This week, I’ve gotten drawn into multiple debates about when and whether a free society places limitations on free expression. Which isn’t surprising. For many of us, the boundaries of free speech were primarily thought experiments in philosophy courses. White Americans, especially, consider free speech our native birthright, and expect nobody will circumscribe our ability to say anything. Events this week sorely test that assumption.

For most, the stereotypically unacceptable extremes of free speech are limited to obscenity and incitement. Language (or images) so vulgar and transgressive that it undermines social order should, courts agree, be squelched. And no matter how crude my language, I can disparage individuals or groups until I threaten their bodily safety. We accept these limits tacitly. But I suggest we go further: we should consider both what language says and what it does.

Imagine a cranky drunk. Well into his fifth pint, he blurts: “Christ Almighty, I hate Jews.” He might even say: “That Doctor Goldstein across town overcharges for routine shots, what a Kike.” As thoughtful people, I’d hope we agree that’s hurtful, possibly even damaging language. But it serves a purpose, even if only to signal we should avoid this guy, since most people say drunk what they think sober. We know he thinks offensive thoughts.

Things change if that guy says: “Somebody ought to put a cap in Doctor Goldstein’s ass.” We could consider that a threat. Depending on our drunk’s history, we might toss him in a cell until he dries out, for his own and Dr. Goldstein’s safety. But if he says “Get your pitchforks, boys, we’re gonna teach Goldstein a lesson,” we’d agree he’d crossed a line. Because his language no longer described something, it actually constituted an action.

When Queen Elizabeth smashes a bottle of gourmet plonk against a battleship’s prow and says, “I christen this ship the HMS Insurmountable,” her words don’t merely describe something which already exists. Her words create a new reality. She bestows upon the vessel, and its subsequent crew, a concrete sense of identity. She welcomes the ship into membership in the Royal Navy. She initiates the ship into a maritime valor tradition.

Rhetoricians call this “performative language,” meaning the words change something. The change may have only symbolic value or change the mental landscape rather than making physical changes: for instance, naming your baby doesn’t change the baby’s physical structure. But it grants the baby an identity, which permits it to participate in human civilization. Your unnamed baby is interchangeable with other groups of human cells and organs; naming bestows individuality.

This picture was taken in Charlottesville, Virginia, on 12 August 2017. The Nazi salute
is a prime example of performative language, since it pledges allegiance to a particular
ideology, which itself killed sixteen million unarmed civilians.

Incitement language does something similar. In Charlottesville, crowds surrounded a Black church, brandishing torches. This admits multiple interpretations, since firelight vigils have a lengthy Christian tradition. Without actual violence occurring, how do we interpret this event? By language. These crowds didn’t pray, they chanted racist slogans meant to cause fear, and keep parishioners contained. The words spoken, not the actions taken, define what happened, and what makes it execrable.

Sadly, I hear racist language regularly. Working construction, I’ve heard people drop N-bombs, and other language I consider wholly offensive. Construction is the most thoroughly segregated workplace I’ve ever seen, physically and rhetorically. But these racist spewings aren’t performative language. When my colleague says hurtful, mostly uninformed things about African Americans, he reveals himself, but doesn’t change anything. And I’ve seen him work with Black people just fine.

The dynamic would change if my colleague changed his language, however. When he drops N-bombs, his hurtful words drive wedges between himself and others. Which is jerkish, but tolerable. If he recited Nazi or Klan slogans, language historically associated with racially motivated violence, he wouldn’t be merely speaking, he’d be willfully causing others to fear for their bodily safety. Violent language creates a violent interaction, even without physical bodily violence.

Therefore, we can divide language into two categories: descriptive and performative language. Descriptive language, even when vile, is ultimately harmless. That’s why anybody can say “I hate Donald Trump.” Performative language gets further divided into benevolent and harmful performances. “I love you” is benevolent performative. “I hope Trump is assassinated” is harmful, and deserves every backlash it receives.

Language that describes even the most vile depths of human behavior deserves every protection, because slippery slopes do exist. But language that says something differs, measurably, from language that does something. Society should scrutinize all performative language to ensure it doesn’t harm others or undermine society. Which, oh wait, it already does, with obscenity and incitement.

Friday, August 18, 2017

The Art and Science of American Racism

We stumbled into the Lawrence Arts Center at the tail end of an exhausting weekend. The events in Charlottesville had hypnotized the country for days, and Sarah and I desperately needed something quiet, something aesthetic, not carrying the stench emanating from the upper echelons of American power. We thought surely an art gallery would ease the tensions. We had no idea Iola Jenkins’ textile art would greet us inside.

Jenkins, a self-taught African-American outsider artist, uses a mixture of quilting and embroidery to create dynamic, multicolored images based on her beliefs and experiences. Her textile art includes portraits of Whoopi Goldberg’s character from The Color Purple and activist professor (and sometime fugitive) Angela Davis. She also has pictures of African market days and folk scenes. The depth Jenkins extracts from what look like discarded scraps makes mere paint look one-dimensional and wan.

Sarah and I paused before twinned portraits of Barack and Michelle Obama. Rendered in exceptionally bright colors, the images had a certain quality of a 1990s comic book, but that didn’t undermine their depth. They looked like photographs taken under tinted stage lights, or possibly run through an oversaturated Instagram filter. They captured the energy and potency African-Americans like Jenkins must’ve felt seeing a couple who resembled themselves in Washington.

We stood there, dumbstruck, for several minutes. Both quilts were dated 2015, when President Obama was riding his highest crest of popular support, before the primary election campaign heated up. Jenkins must’ve made these textiles when Donald Trump still looked like a longshot protest candidate, a spoiler meant to agitate a wounded but angry Republican base. She might’ve made them before Trump declared his candidacy. I don’t know.

I threw my hat on the floor.


Many protesters in Charlottesville were unambiguous: they felt emboldened to express their vile opinions because Donald Trump became President. Though it’s overgeneralizing to say Trump caused the Charlottesville violence, his discourse—calling Mexicans rapists, spouting decades-old stereotypes about “the inner city”—emboldened whites who already had racist tendencies to express them. Trump’s failure to condemn people toting swastikas as Nazis hasn’t helped.

Unlike many American prairie communities, Lawrence, Kansas, didn’t spring up spontaneously. Activists from the New England Emigrant Aid Company deliberately founded Lawrence as an abolitionist colony during the Bleeding Kansas fighting, to provide anti-slavery forces an added edge in determining the territory’s future. The city’s main downtown corridor, Massachusetts Street, reflects the city’s abolitionist heritage. As often happens, contemporary attitudes mainly reflect historical foundations.

Historian Ibram X. Kendi writes that beliefs tend to follow policy. By this he means that public opinion, on multiple issues but especially race, tends to reflect the ideas floating from the top of politics, economics, and culture. Racism, as we experience it in America, didn’t really exist in pre-colonial Europe. People we’d now consider “white” hated one another and fought violently: French versus English, Spanish versus Portuguese, Germans versus Germans.

After plague and warfare decimated the Native American population, rendering North America fit for colonization, Europe started dumping its undesirable denizens on distant shores. According to Nancy Isenberg, America’s first English settlers weren’t called heroic pioneers at home. The English used colonies in Virginia and Massachusetts to unload what they called “the offscourings of the land.” The word “offscourings” refers to shit that clings to your ass and needs to be wiped off with paper.

England bound this goulash of beggars, debtors, thieves, and other outcasts together, by telling them: well, at least they weren’t Black. Parliament created policies forcibly separating white settlers from slaves in work, residence, and even food. England initially regarded Indians as whites with extreme tans, until Native pushback against English adventurism turned violent; then policies changed to separate Black, white, and Red. White beliefs accommodated these policies.

So racism became a response to public policy. After the Revolution, when Northern states couldn’t reconcile their rhetoric of freedom with slaveholding, they changed policies to emancipate their Black slaves and white indentured servants. But once the policy of race took hold, nobody could undo it. Northerners still saw people as Black, white, and Red. Even abolitionists progressive enough to colonize Lawrence, Kansas, carried the idea of race into their new homes.


Iola Jenkins made her art during a time when it appeared the legacies of colonial policy might finally disintegrate. But electing a Black President, a political moderate with big-tent views and an even bigger smile, couldn’t reverse the trend. As Charlottesville proved, the vile colonizers simply moved underground, awaiting their chance. The persistence of abolition in Lawrence, and of racism in the Trump administration, proves boldly: problems don’t go away because head operators change.

They simply take new form. And the fight, physical and policy alike, must adjust appropriately.

Thursday, August 17, 2017

American Flags and the History of Violence

The offending banner, in its natural environment

So as we sat down to eat at a perfectly pleasant restaurant with an Early American decor theme, I happened to glance to my left and see a banner. Not really a flag, the proportions were all wrong; fly it in the wind, on a battlefield where the infantry needs to know which colors to rally around, and it would hang limply. The only correct word for this red, white, and blue confection is “banner.”

It had a ring of thirteen stars on a blue union, and stripes in red and white. It also had a fourteenth star inside the ring. And I thought it looked uncomfortably familiar. Like any good resident of the Third Millennium, I reached for my smartphone, because what’s the point of carrying a massively powerful networked computer in your pocket if you can’t occasionally use it to Google things? So I did, and I immediately found what I feared:

This banner sure looked like the Confederate national flag.

We’re accustomed to associating the Confederate States of America with a blue St. Andrew’s Cross bearing white stars on a solid red field, a design often mislabeled the “Stars and Bars.” But this wasn’t the national flag of the Confederacy; it was the battle flag of Robert E. Lee’s Army of Northern Virginia. That flag gained prominence during the Civil Rights Movement, as a militant white pushback against the idea that black Americans deserve equal standing. But that’s a later addition to the Confederate myth, and one that serves modern rather than historical purposes.

The Confederate national flag had thirteen stars in a ring on the blue union, and three stripes: two red and one white. The banner flying at my left had five stripes, three red and two white, besides the puzzling fourteenth star. I told Sarah that I thought we’d spotted a Confederate flag, just three days after the violence in Charlottesville, Virginia. As people sympathetic with civil rights and racial equality, I wondered aloud whether we could eat here.

So naturally Sarah went to speak with the manager. She needed to know whether she was about to give her money to a business that openly advertised sympathies with a treasonous pseudo-nation that fought against the United States to protect slavery. But while she did that, I kept probing, and discovered I’d possibly made a terrible mistake. This wasn’t a Confederate flag, it was a Betsy Ross flag.

The Betsy Ross flag (above) and the Confederate national flag (below)

Both flags have thirteen white stars, a blue union, and red and white stripes. At a brief glance, the only distinguishing characteristic between the two flags is the number of stripes. This probably represents a Confederate attempt to usurp American mythology. As historian Nancy Isenberg writes, Confederates and Northerners sniped incessantly about which represented the real American heritage, calling each other Crackers and Mudsills, respectively.

With five stripes, and the inexplicable fourteenth star, the offending banner clearly was neither a Betsy Ross flag nor a Confederate flag. And when Sarah returned with the manager, who explained the banner was left up as part of her store’s post-July 4th decorations, I realized there were other pieces of patriotic kitsch hanging around. I’d wandered into a Hall of Americana for brunch without realizing it.

Sarah and I apologized profusely to the manager. Clearly, in light of recent events, we’d made a significant mistake. Yet had we? Images of America’s slaveholding past linger everywhere. If you count colonial times, and historians do, we were slaveholding longer than we’ve been free. Movies like Gone With the Wind, which openly extols slaveholding society, are considered classics. And many of our Founding Fathers were slaveholders.

President Trump, in the video linked above, makes an equivalency between Stonewall Jackson and Robert E. Lee, and George Washington and Thomas Jefferson. This isn’t unfair, since all defended and profited from slavery (though Lee never owned slaves himself). But the equivalency is false. Jackson and Lee fought for a nation that existed to preserve and extend slavery. Washington and Jefferson set the process of liberation in motion… though they failed to enact it themselves.

We ate our French toast in relative peace, surrounded by a comfortingly multiracial dining room that clearly didn’t share our offense at the possibly Confederate banner. We probably hadn’t stumbled into a den of covert racism. Yet I still couldn’t wash the bad aftertaste from my mouth. American society, for all its virtues, still fails to redress its explicitly racist past. People are still dying for the cause. If I’m not willing to speak up, what am I?

Thursday, August 10, 2017

The Other Boy Who Could Fly

John Leonard Pielmeier, Hook's Tale: Being the Account of an Unjustly Villainized Pirate Written By Himself

First, his name isn’t Hook. James Cook, great-grandson of the explorer James Cook, is press-ganged into the Queen’s Navy, aged 14, ending his London childhood and Eton education forever. But rumors of treasure lead to mutiny, and Cook finds himself sailing under the Black Flag. Soon his ship crosses the line into a mysterious land where nobody, not even little boys dressed in tattered leaves, ever grows up.

American author John Leonard Pielmeier is probably best-known for his play, and later film adaptation, Agnes of God. Since that classic, he’s become an in-demand screenwriter, especially for adaptations of heavy, difficult literature. But he admits that J.M. Barrie’s Peter Pan first awakened his interest in reading, and in his first novel he returns to Neverland, retelling the story from the forsaken antihero’s perspective.

Cook finds himself orphaned, expelled, and pressed in quick succession. A comforting life of middle-class London innocence surrenders to harsh sailors’ compromises. Under his captain’s Puritanical supervision, Cook toughens his skin, practices his Latin, and conquers his ignorance. Soon he’s a real sailor. Then the mutiny forces him to choose between honesty and survival. And, on a distant Neverland shore, he finds a castaway who remembers Cook’s long-lost father.

If Peter Pan is the Boy Who Wouldn’t Grow Up, James Cook is the Boy Who Has Adulthood Thrust Upon Him Violently. There’s a Luke Skywalker quality to Cook’s transition, but he often learns the wrong lessons. He abandons his post to discover more about his father. He nurses petty grudges and pursues vengeance so far, he inadvertently injures himself. He admits lying to achieve his ends—then demands we trust him, not Barrie, to tell the real story.

John Leonard Pielmeier
Peter Pan, meanwhile, proves himself capricious, controlling, and worse. Marooned by his shipmates, Cook meets Peter, and both are overjoyed to finally make friends their own age. But Cook doesn’t want to stay fourteen forever. He faces a monster so terrible, even Peter can’t stomach it, and in so doing, wins Tiger Lily’s heart. Peter, jealous that his friend doesn’t live in the eternal present, murders her. Or so Cook says.

Pielmeier strips Barrie’s Edwardian sensationalism. Cook repeatedly insists he’s no pirate, but an orphan caught in something beyond his control. He’s certainly not Blackbeard’s bo'sun. The Piccaninnies aren’t a stereotyped Plains Indian tribe, they’re a proud Polynesian nation, the Pa-Ku-U-Na-Ini. And Neverland isn’t a haven of eternal innocent irresponsibility, it’s a land of Lotus-Eaters where all time gets compressed into Yesterday, Today, and Tomorrow.

Repeatedly, Cook insists he’s no villain. Yet he’s exactly that, if accidentally: everywhere he goes, his presence disrupts the balance. Gentleman Starkey initiates the mutiny because he finds Cook’s treasure map. Peter and the Pa-Ku-U-Na-Ini live in peaceful rapport until Cook interrupts their religious ceremony, breaks Tiger Lily’s prior engagement, and leaves Peter friendless. He even accidentally hastens Wendy Darling’s kidnapping.

Critics have seen, in Barrie’s Peter Pan, an enactment of the Oedipal conflict, as Peter battles the piratical father-figure and must choose between three ideals of womanhood. I see, in Pielmeier’s Cook, a dark mirror of Campbell’s Heroic Journey metaphor. Pielmeier hits every marker: the Call to Adventure, the Threshold, the Road of Trials, the Temptress, even the Return. But unlike Campbell’s hero, at every opportunity, Cook makes the wrong choice.

Cook insists he’s innocent. But everywhere he goes, he leaves a trail of broken souls and dead bodies. He insists upon his own honesty, and gives a detailed accounting of his actions, while he admits lying to achieve selfish ends. Though book-smart and crafty, he lacks wisdom, perhaps because his lifetime’s experiences don’t match his bodily appearance. Thus, instead of achieving enlightenment, he becomes driven by vengeance and rage.

Maria Tatar writes, of Barrie’s original play and novel, that the dominant theme is futility. The Lost Boys, Piccaninnies, and pirates pursue one another in a permanent clockwise pattern around the island, perpetually enacting time, though they never age. Pielmeier disrupts that: Cook enters a magic archipelago where time means nothing, yet he brings change. He brings mortality into a land without age. But he never understands this.

Pielmeier isn’t the first author to rewrite Hook’s backstory. Besides Barrie himself, recent entries have included J.V. Hart, Christina Henry, and Dave Barry. However, I particularly like Pielmeier’s psychological depth and emotional complexity. Pielmeier’s Cook is a master schemer, but also a master of self-deception. He successfully complicates Barrie’s original story, but only at great cost to himself, which he clearly hasn’t begun to understand.

Tuesday, August 8, 2017

A Goddess's Guide to Folk Rock Stardom

1001 Albums To Hear Before Your iPod Battery Dies, Part Seven
Ani DiFranco, Living In Clip


Ani DiFranco gained attention for her DIY music ethos in the 1990s, as probably the most successful musician to found her own label and release her own albums. That’s how I first encountered her. In the final fifteen years when record sales still mattered, her ability to control her own sound, marketing, and image made her legendary. Frequently, this forward-thinking creative control overshadowed how profound her music actually was.

This recording showcases DiFranco’s uncompromising musicianship. Recorded over the previous two years, these songs display a performer notorious for her assertion that she lived to play before a live audience. Her ability to respond to audience energy, and the audience’s willingness to answer her cues, show a reciprocal relationship between both sides of the divide. Her intensely autobiographical lyrics clearly touch listeners through their immediate intimacy.

Though famous for her entrepreneurial ethic, DiFranco’s music was equally ambitious, a mix of acoustic austerity with indie rock drive. Though she never got much radio play, lacking connections to distribute payola, occasional songs like “32 Flavors” or “Untouchable Face” got airplay from radio programmers rebelling against the then-nascent ClearChannel monopolism. Her independence apparently rubbed off on gung-ho individualists, college students, and other freethinkers.

She certainly conveys this independence in her live recordings. Though she self-identifies as a folksinger (and remains in frequent rotation at venues like FolkAlley.com), her style combines folk introspection with punk clarity. She drives her own sound with just her voice and guitar, backed mostly by a rhythm section. She doesn’t invest in ornamentation or ensemble complexity—with exceptions, as she does front the Buffalo Philharmonic Orchestra on two tracks.

Ani DiFranco
But mostly, she carries her own weight onstage. She plays with a modified clawhammer strum, the same basic style used by Bob Dylan and John Lennon. (In interviews around this time, she described teaching herself guitar with a Beatles songbook.) Her evident love of playing comes across when she doesn’t stop strumming during stage banter. And banter she does: she uses a Lenny Bruce-style conversational rapport to establish, and respond to, her audience’s desires.

Despite her acoustic folk roots, DiFranco shows herself comfortable with innovation. Tracks like “Not So Soft” or “The Slant” utilize a hip-hop recitative style which punctuates her lyrical urgency. On other tracks, like “Sorry I Am” or “Fire Door,” she allows her sound operator to loop her vocals, permitting her to harmonize with herself, in a style other acoustic artists wouldn’t embrace for a decade after this album’s release.

DiFranco has often been the most vocal and strenuous critic of her own studio recordings, describing them as “sterile” or worse, despite serving as her own producer and arranger. This is often unfair, as anyone who’s heard albums like 1996’s Dilate can attest; she’s a masterful stylist who uses studio effects without overusing them. However, even her best studio recordings do have a certain lack of immediacy about them.

Not so this recording. Her mostly acoustic performances, with session drummer Andy Stochansky and bassist Sara Lee, showcase her power as a live performer. In an essay reprinted in the Utne Reader in 2002, DiFranco admitted she mostly made albums to publicize her live tours, largely the opposite of the then-accepted music business standard. She invested studio time to justify her passion for playing before a live audience.

Despite her personal lyrics, her writing is often intensely political too. DiFranco, an admitted pansexual agnostic, adopted opinions too liberal even for most mainline progressives back then, embracing her sexual inclusivity on songs like “Adam and Eve,” and confessing gender-based personal traumas with “Letter to a John” and “Tiptoe.” She was too aggressive even for most feminists: at her 1990s peak, she declined Lilith Fair, though she could’ve headlined, calling it too timid.


This landmark album pushed DiFranco into mainstream consciousness, drawing listeners’ attention to her muscular, unapologetic live performances. She dared audiences to join her introspective journey, and that largely self-selecting audience followed. Her mainstream acceptance followed, including larger venues and ten Grammy nominations in ten years. Though never a superstar, this album ushered in DiFranco’s moment of greatest artistic and commercial triumph.

DiFranco’s particular stretch of the 1990s produced several iconic women singer-songwriters, from fresh-faced ingenues like Fiona Apple to seasoned geniuses like Tori Amos. Like them, DiFranco saw her commercial star marginalized by the artistically anodyne stylings of the middle 2000s, and she’s returned to headlining the specialized circuit she once loved. She’s probably better for it. These pre-fame recordings display an artist most comfortable with intimacy and vision.

Friday, August 4, 2017

Dollar Store Jesus

Mitch Kruse with D.J. Williams, Street Smarts From Proverbs: How to Navigate Through Conflict to Community

The other day I reviewed a business book that had an underlying moral message. This time, it goes the other way: I have a Christian book that’s essentially an encomium to capitalism. And I have the same basic responses to both: they’re okay, provided you don’t push either to its logical extreme. Tempered with moderation and accountability, either could be uplifting and game-changing; unmoored from community, either could lead to self-sanctification and arrogance.

Reverend Mitch Kruse paid for his seminary education through proceeds from selling his Internet start-up to eBay. This duality, the transition from capitalist self-marketing to Christian humility, inflects this, his second book. He has a biblically solid exegesis, based on a fairly consistent conservative evangelical read of Proverbs and, to a lesser degree, other Scripture. But the emphasis is very first person singular. It’s about I, me, my relationship with God… which isn’t what Proverbs is about.

Kruse believes monthly rereadings of Proverbs will instruct open students in judgement, discretion, and restraint. This, for Kruse, makes a working definition of “wisdom,” a form of thinking in which faithful believers sublimate their human reason to God’s will. Because Proverbs has thirty-one chapters, reading one per day will increase opportunities for learning and self-correction. Through repetition, devoted readers will acquire the habits of right thinking and self-control which Proverbs makes available to open minds.

Pursuing this goal, Kruse is a master maker of lists and other mnemonic devices. The largest part of this book delves into what Kruse calls the Twelve Words, important themes which recur in Proverbs and define its overarching goal, Wisdom. These include Righteousness, Justice, Discipline, and Learning— words which mean something different in Biblical contexts than in conversational English. Kruse defines these terms using a mix of scholarship and anecdote, in the classic sermonist style.

Mitch Kruse
My problem isn’t Kruse’s writing. He hews to a familiar homiletic style which I assume Christian writers must learn in seminary, because it recurs regularly. I might wish Kruse broke from a mold I find boring in Christian pop nonfiction, because I’ve seen it so often that my eyes skim the page, but that’s my personal problem. Rather, his choice of Proverbs, probably the most first-person-singular book of the Bible, leaves me scratching my head.

Proverbs is part of what Hebrew scholars call Wisdom Literature, four books (seven in the Septuagint, called the Catholic Apocrypha) that differ from other Biblical books. Neither history, like the Torah, nor exhortations of the people, like the Prophets, Wisdom Literature mostly involves gnomic poems and sayings about restraint, humility, and godliness. Unlike most Hebrew Scripture, Proverbs isn’t intended for the whole people; it’s specifically for kings or (as in the case of Ecclesiastes) sons of kings.

That’s why mainline liturgical churches often avoid Proverbs in pulpit ministry. We can’t agree on how it applies to most believers today. We consider it inspired, and like all scripture, useful for instruction. But to call it controversial is underselling the situation. Kruse often has to perform theological gymnastics to make Proverbs yield the communitarian thesis he promises in his title, an approach that, I’ve noticed, doesn’t much include direct citations from Scripture, including Proverbs.

Thus we’re faced with a Christian book that seldom cites the principal Christian source, a communitarian book about hierarchies, a book about Hebrew Scripture that largely eschews the Hebrew prophetic tradition. Kruse trades primarily in contradictions, which I think even he doesn’t always see. His most recurring theme holds that Christians need to subjugate their will and intellect to God’s, yet he almost entirely emphasizes individual salvation, not what God would have us actually do.

My biggest problem turns on Kruse’s distribution of rewards. His anecdotes generally follow a reliable pattern: a person Kruse knows, or knows about, ventures into willfulness and self-seeking, which ends badly. (He particularly dreads substance abuse.) That person rediscovers God’s purpose, surrenders to divine will, and gets restored. This often ends with some earthly reward: a university degree and a family, lucrative speaking gigs, media stardom. Heavenly salvation, in Kruse’s theology, generally brings earthly rewards.

Kruse never says anything I find altogether wrong. He often shares meaningful, uplifting, theologically sound precepts. But with his emphasis on individual salvation and the journey from poverty to riches, literal or metaphorical, he’s essentially sharing a Christianized Horatio Alger story. Though I often like Kruse’s message, I cannot escape the reality that, throughout, he never stops thinking like a businessman. Christianity is, for Kruse, a transaction, where the faithful hope to make a profit.

Wednesday, August 2, 2017

The Trouble With “Isms”

John Oliver, the grumpy uncle
of pay-cable comedy
John Oliver, a man who’s described his accent as being like “a chimney sweep being put through a wood chipper,” has made Net Neutrality a personal pet issue. After pushing it successfully during the Obama administration, he returned to the issue after President Trump appointed Ajit Pai chair of the FCC. For a polemicist famous for his wide-ranging interests, revisiting any topic is significant.

For those unfamiliar with Net Neutrality, the concept is simple. Under current regulations, internet service providers have to make all web content equally available. Whether dialing up, say, an obscure blog by a struggling provincial writer (let’s just say), or the web’s most successful commerce sites, speed and accessibility shouldn’t vary. Service providers shouldn’t charge site owners for faster or more reliable consumer access.

Seems simple enough. I thought I supported the principle undividedly. Then a friend, an outspoken Libertarian who believes everything would improve if all top-level rules vanished tomorrow, claimed that Net Neutrality rules were government micromanagement, interfering with a system that had worked just fine without bureaucratic meddling. Let whomever charge whatever to whoever they want! It’s not government’s place to get involved.

To be clear, I don’t buy the anti-neutrality argument. If service providers could charge content providers for superior access, massive corporate systems like Facebook and Google, which between them control half the online advertising revenue for the entire earth, or mega-commerce sites like Amazon, could afford extortionate rates, while struggling artists and shoestring entrepreneurs would get shunted onto second-string connections and forgotten. That includes my friend, a strictly regional business owner.

Jeff Bezos, whose Amazon controls
half the Internet commerce in the world
(Full disclosure: this blog appears on a platform owned by Google. And my book reviews link to Amazon. I couldn’t afford this service if I had to pay out-of-pocket.)

But thinking about it, I realized: I’m protecting the very big against the very big. And so is my friend. As stated, half of all online ad revenue travels through two corporations and their subsidiaries. Google owns YouTube, Zagat, Picasa, and the Android operating system; Facebook owns Instagram, Oculus, and WhatsApp. Just for starters. I’m protecting the already massive from paying to get their product onto my computer screen.

Some Google subsidiaries, like Boston Dynamics, actually produce marketable product. But Google mostly sells ads on their search engine—ads that frustratingly often lead customers to something they already wanted to find. They’re already charging small operators, like those artists and entrepreneurs I mentioned, for access to get seen by people like me. Same for Facebook: it’s primarily an ad vendor. And if you don’t buy these two companies’ ads, you probably won’t get seen.

So the Net isn’t Neutral right now.

Nearly a century ago, G.K. Chesterton wrote that, for most citizens, the difference between communism and capitalism is vanishingly small. The only choice is whether we prefer to be ruled by government bureaucrats, or corporate bureaucrats. That’s clearly happening here. On careful consideration, Net Neutrality is essentially protecting the very large corporations, who are imposing their own rules on smaller companies, rules that are proprietary and therefore both invisible and arcane.

So we’re faced with the choice between protectionism, which will ensure Google, Facebook, and to a lesser degree Amazon (which controls half of all Internet commerce) can charge small operators like me to get seen by anybody whatsoever; or Libertarianism, which… um… will make these companies pay Comcast, Time Warner, and Charter Spectrum to continue doing the same thing. On balance, neither choice really protects small operators like me.

G.K. Chesterton thinks your neutrality
rules are sweet and naive
I’ve chosen, at present, not to pay either Facebook or Google for increased visibility on this blog. This means I mainly get seen by people clicking through on my personal Facebook and Twitter accounts, and my daily hits seldom exceed 100 to 150. (Oh, who owns Twitter? It’s also a corporate umbrella; it just hasn’t grown nearly as fast.) I’ve chosen not to kiss corporate ass, and the price I’ve paid is vanishingly small readership. Sad trombone.

Both “isms” depend on the premise that, if we create the appropriately utopian regulatory system (too big? Too small? Any at all?), information will flow freely. Except we need only open our eyes to realize information isn’t flowing freely. The commercially accessible Internet, as it currently exists, is essentially a joint-stock partnership between Sergey Brin and Mark Zuckerberg. There’s no sweet spot of appropriate regulation on the Internet. Because there’s no freedom for information to move on a platform that isn’t already free.

Tuesday, August 1, 2017

Business, Ethics, and the Risk of (Self-)Righteousness

Scott Sonenshein, Stretch: Unlock the Power of Less—and Achieve More Than You Even Imagined

Professor Scott Sonenshein divides the business world into two categories: chasers, who pursue more and better resources to achieve their objectives, and stretchers, who make what they already have perform double duty and produce maximum return. Sonenshein, who teaches management at Rice University, uses language from business and behavioral economics to convey his message. I was shocked, however, to notice how he made a fundamentally moral point.

A popular mindset persists, Sonenshein writes, particularly among business professionals born into the wealthy class, or among people with very narrow, specialized educations. If we had more money, this mindset asserts, or better tools, or more people, or something, we’d crack the success code and become unbelievably successful. If only we acquire something new, we’ll overcome whatever impediment stops us achieving the success waiting below our superficial setbacks.

By contrast, successful businesses like Yuengling beer, Fastenal hardware, and others, practice thrift in resource management, utilizing existing resources in innovative ways, maximizing worker control over business decisions, eschewing frippery, and making the most of everything they own. Sonenshein calls this “frugality,” a word he admits has mixed connotations. But he’s clearly demonstrating familiarity with once-common ethical standards, what economists still call the Protestant work ethic.

Sonenshein doesn’t once cite religion or morality, either implied or explicit. However, when he breaks successful businesses down into bullet point lists of best practices, like “psychological ownership,” “embracing constraints,” “penny pinching,” and “treasure hunting” (to cite the takeaways from just chapter three), the ethical correspondences become rather transparent. Take responsibility for your choices, little Timmy! Work with what you have! Save now for bigger rewards later! Et cetera.

Scott Sonenshein
From the beginning, Sonenshein structures this book much like religious sermons. His points are self-contained, backed with pithy illustrations showing real-world applications. He asserts his point, cites his text, backs it with anecdotes, then reasserts his point. The structure appears altogether familiar to anybody versed in homiletics. It persists in religion, and translates into business books like this one, because it holds distractible audiences’ attention long enough to clinch brief points.

But again, Sonenshein never cites religion. He frequently quotes research from psychology and behavioral economics to demonstrate how scrutiny supports his principles. But if he’s proffering a business gospel, it’s a purely secularized one. Though Sonenshein comes from the same mold as religious capitalists like Norman Vincent Peale, Zig Ziglar, and Og Mandino, he never relies upon revealed religion. Earthly evidence, not God’s law, demonstrates this gospel’s moral truth.

Oops, did I mention Norman Vincent Peale? See, there’s where doubts creep in. I had mostly positive reactions to Sonenshein’s points until I remembered Peale. There’s a direct line between Peale’s forcibly optimistic theology, and Joel Osteen’s self-serving moralism. We could achieve earthly success by aligning our vision with God’s… but often, already successful capitalists have recourse to God to justify their own goodness. I’m rich because I deserve it!

This often leads to retrospective reasoning—what Duncan J. Watts calls “creeping determinism.” In finding already successful operations, then applying his learning heuristic to them, Sonenshein risks missing invisible factors steering his anecdotes. I cannot help recalling Jim Collins, who praised Fannie Mae and Circuit City scant years before they collapsed. In reading Sonenshein’s anecdotes, like hearing Christian sermons, it’s necessary to listen for the sermonizer’s unstated presumptions.

Please don’t mistake me. I generally support Sonenshein’s principles. I’ve reviewed previous business books and found them cheerfully self-abnegating, urging middle managers to sublimate themselves to bureaucratic hierarchies and treat themselves basely. Sonenshein encourages workers to stand upright, own their jobs, and always seek improvement… and he encourages employers to treat workers like free, autonomous partners. Though Sonenshein never embraces an “ism,” he corresponds with my personal Distributism.

I wanted to like Sonenshein’s principles because I generally support his ethics. His belief in thrift, in embracing a mindset of ethical management, and of getting outside oneself, is one I generally applaud, and strive to apply to myself (not always successfully). Though I don’t desire to control a multinational corporation, I wouldn’t mind leveraging my skills into local business success and financial independence. And I’d rather do it ethically.

But like any ethicist, Sonenshein requires readers to police themselves carefully. They must apply these principles moving forward, not looking backward. Financial success often inspires implicit self-righteousness, which business ethics can inadvertently foster. I’ll keep and reread Sonenshein, because I believe he’s well-founded. But I’ll read him with caution, because his framework conceals minefields I’m not sure even he realizes are there.

Wednesday, July 26, 2017

There Is No Opioid Epidemic

Rachel Maddow has done solemn, suit-clad journalism
on the awfulness of opioid addiction
I'm sick of hearing about the "opioid epidemic." Repeating this claim has become a sure-fire way for media professionals to prove that they're serious about fixing the problems in American society today. Mainstream journalists like Dan Rather and Rachel Maddow have done somber think pieces about opioids, while humorists like John Oliver and Adam Conover approach the problem from a satirical, but still po-faced, angle.

There's just one problem. It's a bunch of crap.

Pause a moment and think about this: what do we call somebody who can't stop taking pain medication? Somebody who continues popping pills long after it's become measurably clear that their appetite for the medication drives a wedge between themselves and their family? Somebody who would rather pop pain medications than hold down a job, have a place in the community, or pursue constructive hobbies?

Modern medicine calls such people addicts. But I suggest another name altogether: patients. A person consuming pain medications with such dogged desperation probably needs the pain to go away. Calling this person a moralistic name like "addict" implies the person has suffered a failure of ethical character, and we need to punish that person—an attitude which uses medicine and law to enforce moral values. Calling this person a "patient" lets us seek the root underlying pain.

We do need to distinguish here between addicts and recreational users. Some people take pain medication, mind-altering drugs, or prescription stimulants and steroids to enjoy the altered state of mind. Such people creep me out, but they aren't addicts; they just enjoy seeing the world from another perspective for a few hours, usually in the company of friends. As long as they harm nobody but themselves, stopping them feels pretty high-handed.

John Oliver plays the story for laughs, but
clearly has the same self-serious intent
But addicts have a completely different situation. They cannot stop taking whatever substance they're addicted to, because without it, they feel incomplete, pained... inhuman. Addicts don't want to feel good; they want to feel normal.

Consider what people become addicted to. Besides opioids, the most common addictive substances include alcohol, heroin, cocaine, and codeine. All of these are, or were at one time, prescription painkillers. Other addictive substances include nicotine, marijuana, and Valium, which all have anti-anxiety properties. Even lowly methamphetamine, the one drug I'd like to see permanently forbidden, began life as a prescription anti-anxiety medication, Benzedrine.

Besides substances, people can become addicted to behaviors. Psychologists have observed and treated people demonstrating addictions to gambling, sex, eating, work, and more. Each of these behaviors creates an internal reward system: the giddy rush of waiting for your pony to win, the sense of accomplishment from a job well done. They also give addicts something to think about other than the mundane, and possibly painful, circumstances of their day-to-day lives.

Opioid addiction, when considered in news or late-night comedy, seems inextricably entwined with two circumstances: medical trauma, or chronic unemployment. I've noticed a persistent trend of connecting opioids with West Virginia coal country, where declining revenues have forced many former miners, many suffering black lung and other work-related injuries, out of the workforce. The physical pain of illness, and the psychological pain of uselessness.

Journalist Nick Reding, author of Methland, correlates the most widespread consumption of methamphetamine with America's rural communities, where dwindling economic opportunities and declining wages make grasping at straws a necessity to get by. Addiction specialist Gabor Maté, based in Vancouver, British Columbia, notes how many of his destitute patients come from physically or sexually abusive backgrounds. Pain and addiction go hand-in-hand.

So God forbid we actually address the real problem.

The phrases "opioid addiction" and "coal country" keep coming up in the same news stories,
usually without any comment on the connection between addiction and despair


America's policy for dealing with addiction, largely unchanged since the days of Harry J. Anslinger, remains to jail offenders, regardless of their suffering. The isolation of prison, where prisoners lack social networks, meaningful work, or even sunlight, will only compound whatever pain got them addicted to begin with. Rather than helping addicts deal with their problems, our criminal justice outlook only doubles down on the underlying causes of addiction and other dysfunction.

Solemn, tedious think pieces about opioid addiction allow journalists to look engaged with America's diffuse suffering. But they exonerate a socioeconomic system that devalues humans and profits off their pain. Prescription pain meds cannot advertise on broadcast media, so blaming them doesn't hurt revenue. Media professionals thus excuse their own complicity in an economy that encourages resource hoarding, devalues labor, and treats humans as interchangeable parts.

And buying that crap lets us, the audience, off the hook for profiting from that system. Stripped of glitz, the problem is us.

Monday, July 24, 2017

The Doctor Is Still In

Peter Capaldi as the Twelfth (current) Doctor
I have seen more people shaming the Doctor Who haters than I’ve actually seen Doctor Who haters. With the recent announcement of Jodie Whittaker’s nod to play the thirteenth (or fourteenth, or fifteenth) version of the title character in the BBC’s Doctor Who, social media has been alight with people mocking and belittling those tender souls who cannot stomach a woman in the role. Actual tender souls, however, have been rare and hard to find.

Perhaps that’s because I haven’t been looking for them. The Internet allows such fungal undergrowth to flourish beneath various rocks on Reddit and 4Chan, but unless one of their number gets elected President, few actually venture into the sunlight, knowing the general cultural trend has moved away from them. Their sexist, homophobic, bigoted attitudes belong to a long-gone era, and they know it. They have the common decency to stay away from us normal people.

Yet rather than let such attitudes fester and die quietly, members of the progressive call-out culture feel obliged to share, retweet, and otherwise publicly announce the existence of such attitudes. They qualify the shares with snarky comments or condemnations of regressive attitudes, which maybe gives such shares the vestments of sanctimony. But as working journalists discovered last year in sharing Donald Trump’s message, vocally refuting regressive attitudes doesn’t change the value of free media attention.

Every time some white, male yob claims that a female Doctor, female Ghostbusters, or female Jedi “ruined my childhood,” they get free publicity from those who purportedly oppose that position. I’d think a childhood so brittle that you’re still hanging “No Girls Allowed” signs on your door would be ruined without outside intervention. But by giving these self-proclaimed martyrs sunlight, my fellow progressives allow these attitudes to flourish and propagate, past their sell-by date.

Paul McGann as the Eighth Doctor
Weirdly enough, while rewarding such conservative crybabies, progressives nourish their own crybaby culture simultaneously. I’ve witnessed liberals whining that the female Doctor doesn’t count because she’s too late, or because she’s white. In other words, progress doesn’t really matter unless it happens on my timetable. Besides being just plain arrogant, this attitude overlooks the commercial demands of media. Even a publicly owned resource, like the British Broadcasting Corporation, is still a corporation, beneath the surface.

Richard Walter, professor and chair of screenwriting at UCLA, writes that “American television seems to love last year’s— or last decade’s— controversy.” In other words, media won’t embrace a position until it’s safe enough to handle without getting burned. Hollywood refused to touch the anti-racist message of Broadway plays like Finian’s Rainbow until history had made the message essentially harmless. The BBC is British, yes, but it suffers under the same constraints as American television.

So yes, a character created in 1963 will carry the whiff of the white, patriarchal influences under which he (she?) was created. Though original showrunner Verity Lambert was a twenty-something woman, the character she oversaw was old, male, and white, reflecting the British power structure in her era. The Doctor didn’t get a black traveling companion until Mickey Smith in 2006, or an openly gay one until Bill Potts in 2017. Hispanic? Sorry, not yet.

But the show planted seeds for the Doctor’s more inclusive regenerations years ago. We saw our first on-screen Black Time Lord in 2008, two years after Black actor Paterson Joseph was briefly considered to play the Doctor. We saw a male Time Lord regenerate into a woman in 2016, demonstrating that such transitions were possible. (Don’t bring me Steven Moffat’s “The Curse of Fatal Death,” where Joanna Lumley plays the Doctor. Gag episodes don’t count.)

Tom Baker as the Fourth Doctor
So. Progressives who’ve already seen that the Doctor will become more diverse in the future, complain that such diversity isn’t happening right now. Simultaneously, crusty back-numbers who want Doctor Who to remain a museum piece of their supposedly bucolic, girl-free childhoods, complain that diversity happens at all. The BBC must satisfy the largest number of viewers, so change has to happen, but it cannot alienate large audiences, so change happens slowly. That’s what corporations do.

Which is why I cannot take either side seriously. Calls to make change happen instantaneously, even when such change is legitimate and overdue, are as unreasonable as demands to forbid change and keep things static forever. The Doctor will become a non-white woman someday soon. Possibly even a gay woman. (I favored Sue Perkins for the role, after all.) Conservatives cannot stop it, but progressives cannot rush it. Corporations are slow, but change is inevitable.

Friday, July 21, 2017

Stephen King, Imperialist Ignoramus

Stephen King in 1977, the year he published
"Children of the Corn" and The Shining

Stephen King’s short story “Children of the Corn” pisses me off. There, I said it. King admits, in his collection Night Shift, that the story found him while driving through rural Nebraska: he found the miles and miles of corn, planted in ruler-straight rows, highly intimidating, largely because this clearly human landscape was seldom interrupted by anything as humane as houses. Note, he drove through Nebraska; he never mentions bothering to speak to actual Nebraskans.

I always dismissed this shortcoming as a fluke, though, an oversight by a busy, highly popular novelist who didn’t notice his own blinders. That is, until someone recently pointed out to me that, in King’s novel The Shining, the all-important Overlook Hotel closes for the winter. In Colorado. Let me repeat that: a hotel closes for the winter, during the peak of Colorado’s lucrative ski season. What the hell? That’s two states King gets wrong!

Thinking about this, I realized: in both cases, King projects Greater Northeast values onto the American West. In New England, where King lives, white colonial communities were deliberately planted within walking distance, because they were planted in the Seventeenth Century, when twenty miles was a time-consuming overland slog. Farmers lived in towns, and walked brief distances to their fields; human habitation is the norm, not the exception. (It gets more complicated inland, but not much.)

And, in the Northeast, resort hotels do close during winter. Partly, this reflects brutal Northeastern winters, where soil sometimes freezes solid, while blizzards isolate neighboring communities. But it also reflects the Northeastern tourist business, driven primarily by New Yorkers and Bostonians fleeing sweltering big-city summers. The Borscht Belt mainly offered tourists mild countryside temperatures. Even if people could visit tourist resorts during winter, they mostly don’t want to. Winter, in Northeastern cities, is almost pleasant.

Jack Nicholson in Stanley Kubrick's
adaptation of The Shining
King failed to understand that Colorado tourism derives from completely different drives. People band together through shared love of braving the cold to test their athletic endurance. In 1977, when King published both The Shining and “Children of the Corn,” inexpensive transportation was moving Colorado skiing from a regional pastime to a national destination. Formerly an agricultural state, Colorado was slowly transitioning to a tourist economy. Stephen King showed up right during that intermediary stage.

According to legend, which he propagated himself, King chose Boulder, Colorado, as the setting for his next novel by selecting a random location from an atlas. King and his wife spent one October night as the sole guests of the otherwise vacant Stanley Hotel, which inspired The Shining’s massive, sprawling, vacant halls. October is prime foliage-gazing tourism season in New England, where King lives; one wonders whether he even realized he’d chosen Colorado’s off season.

Thing is, if you understand Northeastern history, King’s Overlook Hotel makes perfect sense. He describes a run-down hotel of the 1970s that peaked during the Jazz Age. The building, basically a preserved museum of its heyday, is haunted by the ghosts of big-band musicians and bartenders passing liquor off the books. Jack Torrance stumbles into a tuxedo party and gets handed rotgut booze. The hotel King describes belongs not in Colorado, but the Catskill Mountains.

My memories of the Catskills are dominated by Cub Scout adventures, which is about all that happens there anymore. Ruins of grand hotels built between the world wars remain, but few still do business. Neither particularly tall nor unusually beautiful, the Catskills benefitted from being within driving distance of Manhattan and Boston; when cheap fuel prices before the Arab Oil Embargo made Colorado and Utah viable tourist destinations, the Catskills’ storied hotels fell into disrepair.

Steven Weber in Mick Garris's adaptation of
The Shining, with a script by Stephen King
Follow my reasoning: King can’t handle Nebraska because, in his mind, human habitation means towns; Nebraska’s spread-out human geography looks demonic to him. He can’t imagine tourist business that isn’t circumscribed by warm weather, or haunted by Jazz Age glory. In both cases, he projects regional values onto regions he doesn’t understand. Regions he can’t understand, because he doesn’t bother speaking with locals. If he did this to non-white people, we’d call him an imperialist.

I moved from California to Nebraska at age 18, so I understand the shock of transitioning to spaces sculpted by human activity, but vacant of people. It’s scary… for a while. But I didn’t have to live in Nebraska, near the Colorado border, for very long, to realize the circumstances King describes aren’t horrific. If he’d bothered meeting any Nebraskans or Coloradans, the horror would’ve worn off quickly. Instead, King looks like a world-class dick.

Wednesday, July 19, 2017

First of the Texas Troubadours

1001 Albums To Hear Before Your iPod Battery Dies, Part Six
Townes Van Zandt, Live at the Old Quarter, Houston, Texas

Townes Van Zandt was a notoriously unreliable performer. Before his death in 1997, audiences reported he had only two modes: either unbelievable clarity and rapport with the audience, coupled with sublime lyrical sensitivity, or an absolute drunken mess. Throughout his career, his attempts to numb his bipolar disorder with substances always undermined his performance persona. But for four nights at a Houston dive bar in 1973, he produced something unmatched in the history of Texas music.

Country music fans may remember Townes Van Zandt as author of such classics as “Pancho and Lefty,” “If I Needed You,” and “White Freight Liner Blues.” He produced six albums in five years between 1968 and 1973, often working at a feverish pace made possible only by mental illness and cocaine. His songs bespoke a bleak pessimism that, paradoxically, gave his cultish fans great hope that they weren’t suffering alone. He became an underground legend.

As this prolific period wound down, Van Zandt played four shows at the Old Quarter, a venue co-owned by his friend and bandmate Rex Bell. Producer Earl Willis recorded these shows on a portable four-track system, apparently for posterity, never intending to release them. However, Van Zandt descended into a creative dry spell, corresponding, probably not coincidentally, with a healthy new relationship. Without his malaise, and the drugs, he lost much of his creative motivation.

The recordings feature Van Zandt, alone on stage with just his guitar, uncharacteristically sober and receptive to his audience. The album, assembled from Willis’ crude tapes, retains such concert business as announcements about where to find the cigarette machines, and apologies for the broken air conditioning. But it also includes some of the most tender and insightful recordings a damaged genius ever put forward. The austere arrangements and clinking barroom background emphasize Van Zandt’s lyrical complexity.

Townes Van Zandt
He opens with “Pancho and Lefty,” his then-current single, about a Mexican bandit’s betrayal. Both Emmylou Harris and Willie Nelson had hits with this track, but both embroidered it with radio-friendly flourishes highlighting their own celebrity. Van Zandt, playing solo, keeps focus on his lyrics, a mournful exploration of the forces which make a criminal into a folk hero. The idea that we need heroes only after they’re dead is poignant, coming from Van Zandt.

This track list reads like Van Zandt’s Greatest Hits. He plays several of his classics, like “Rex’s Blues” or “Tecumseh Valley,” with the unornamented authenticity of somebody performing work he loves. Between his own songs, Van Zandt also includes tracks by artists who influenced him, like Merle Travis, Bo Diddley, and Lightnin’ Hopkins. This remarkably inclusive set of influences goes a long way toward explaining Van Zandt’s unusual style, quirky songwriting, and widespread fandom outside genre circles.

Part-time fans and corporate executives long classed Townes Van Zandt as country music. This isn’t entirely unfair, considering how many of his songs have been covered by artists like Don Williams and Steve Earle. But like many of the artists most immediately influenced by his music, including Lyle Lovett and Gillian Welch, he fit poorly into country’s mold. His distinctive mix of country, folk, blues, and other roots genres has led some fans to speak of just-plain “Texas Music.”

Because of its essential austerity, this album provides insights into not only Van Zandt’s influences, but also his artistic vision. His classic studio albums were larded with Nashville sidemen and sophisticated production, presumably to create chart hits. But this overhandling not only produced no radio-friendly singles, it frequently pointed up Van Zandt’s weaknesses as a vocalist. Like Bob Dylan, Van Zandt had a vision, but wasn’t a pretty singer. What he had was insight.

Van Zandt’s production team released this album in 1977, four years after it was recorded, five years after his last studio album. Basically, he needed the money. Though he’d pulled some songwriter’s royalties from cover versions, his period of happy inactivity left him functionally penniless. Van Zandt released only one studio album between 1972 and 1987. Unfortunately, his eventual return to the studio would result in more overproduced pablum, presumably to capitalize on increasing name recognition.

Standing between his career bookends, this album, the most stripped-down and honest he’d ever record, highlights not only Van Zandt’s artistry, but also the way he became a pinch point for Texas music. Though others blended the state’s country, blues, and folk traditions before him, Van Zandt coupled that blend with raw poetry few peers ever approached. Without the studio crutch, this album makes plain his stylistic vision, and keeps his music alive for coming generations.

Monday, July 17, 2017

A Classic Comic Resurrection Beneath the Earth

Jon Rivera and Gerard Way, writers; Michael Avon Oeming, artist, Cave Carson Has a Cybernetic Eye, Vol. 1: Going Underground

Former spelunker and part-time action hero Calvin “Cave” Carson hung up his spurs and became a family man several years ago. But the excavation company that now employs him has ulterior motives for keeping Carson on a short leash. When a ghost from his past appears on his doorstep, Carson realizes his adventuring days aren’t through. But his employers won’t let Carson go so easily… and they won’t let go of his daughter, either.

DC Comics introduced Cave Carson in 1960, alongside other adventure-oriented titles featuring heroes without superpowers, like Challengers of the Unknown and the Sea Devils. But Carson never got sufficient traction to become his own franchise; he fought alongside Superman, but always as a sidekick. Lead writer Gerard Way admits he needed to consult a concordance of obscure classic characters to find someone worthy of a reboot for his Young Animal imprint.

Newly widowed at the start of this story, Cave Carson struggles to maintain connections with his college-age daughter. He goes through the motions of workplace diligence, but they mostly keep him around for nostalgia: he taught his followers everything they know about underground adventuring, before they eventually outgrew him. Now Carson has the kind of slow, melancholy conversations we recognize from action movies, right before everything hits the fan.

And fan-hitting does occur. One night, tired, frustrated, and alone in his formerly full house, Carson hears a knock. A loincloth-wearing emissary appears at his door. Seems the Muldroog, a lost civilization of mole people, are under attack, and only Carson’s late wife, with her panoply of ancient secrets, can save the underground. But with her gone, apparently a blood quantum is sufficient, because they’ll accept Carson’s daughter instead.

Original promo art
It’s difficult to read this graphic novel without recognizing the debts it owes older stories. Besides reviving an almost forgotten character from the Eisenhower era, and connecting him to characters borrowed from Edgar Rice Burroughs, the art suggests a combination of Peter Max and Astro-Boy. The story has hints of old EC horror comics, a tendency emphasized by sudden jarring images of amorphous fungus people savaging the peaceful natives.

Yet this obsessive borrowing doesn’t undercut the story. Like many serial science fiction franchises that don’t bother concealing their roots, like Star Wars and Doctor Who, this story’s connection to older pulp traditions gives it a sense of continuity. We aren’t just reading something generated last weekend, like the transient comics of the 1990s that are largely unreadable today. This story connects science fiction’s past to its evolving present.

The emissary at Carson’s doorstep warns him that his employers, EBX, committed the attack on his subterranean nation. So Carson doesn’t even bother bringing his bosses into the discussion. He calls his oldest ally, Wild Dog, an Uzi-wielding maniac who plainly copied his image from the first Quiet Riot album, and goes rogue. Getting off the grid proves easy for a scientist accustomed to caves. Bringing his daughter along proves harder.

Deep underground, the Muldroog have buried a secret for generations. Why else would a nation, apparently blessed by technology but attuned to natural rhythms, continue living in caves? Seems the Muldroog civilization is based upon a lie its people tell outsiders, a curse that keeps giving, provided nobody ever finds out. But what the Muldroog have spent centuries keeping locked up, EBX wants to make into a profit engine.

For all its sci-fi-adventure trappings, this story isn’t essentially about them. Cave Carson’s cybernetic eye, which sometimes goes unmentioned for several chapters, isn’t a driving force behind the story; it’s a metaphor for a man who’s seen things he cannot forget. Carson and his wife told their daughter lies to protect her from hostile reality. Now that Eileen’s gone, Cave must bear punishment for those lies alone when the truth rushes forth.

This book carries a “Suggested For Mature Readers” label. Please take this seriously. Besides violence, language, and very brief nudity, the themes of long-simmering family tensions shouldn’t be taken lightly. This story introduces themes that most grown-ups will recognize from their own families. Though we perhaps won’t discover our connection to forgotten mole-people civilizations, we all struggle to accept and understand our roots.

Cave Carson is only one among several classic DC characters getting reboot treatments from Gerard Way’s Young Animal imprint. Formerly the lead singer of My Chemical Romance, Way has recently reinvented himself as a genre writer, making visible several themes always implicit in his music. He admits his comics deal preponderantly with strained parent-child relationships. Well, this story ends in motion; it’ll be interesting to see where he takes these themes next.

Monday, July 10, 2017

Is Wonder Woman a Pro-War Superhero?

The group photo of Wonder Woman and friends that begins the movie

This essay contains spoilers.
Patty Jenkins’ Wonder Woman continues to enjoy mainly positive reviews, the first DC Extended Universe movie to receive a warm reception. But not everyone agrees. Self-described Leninist author Jonathan Cook calls Wonder Woman “a hero only the military-industrial complex could create.” He backs this with something I thought common knowledge, that America’s security apparatus has leaned on Hollywood films before. But even by Cook’s own criteria, Wonder Woman would make very poor pro-military propaganda.

The term “military-industrial complex” comes from President Eisenhower, who warned Americans that a small cadre who got rich off violence would foment new wars as tickets to personal enrichment. Eisenhower primarily meant manufacturers of war materiel: the Lockheed Martins and Northrop Grummans of the world. But Hollywood has occasionally participated in the drum-beating industry. Illustrious directors like Frank Capra and David Lean have lent their talents to frequently vile wartime propaganda. So have comic book publishers.

Yet this movie doesn’t support such interpretations. Consider a key sequence: leaving England, Diana, our heroine, watches soldiers returning to London with missing limbs and disfiguring scars. Nearing the front, she wants to rescue a wailing orphan, until she sees the refugee caravan. She feels for the refugees, until she discovers an entire occupied village. She liberates the village, but sees it destroyed by chemical weapons. She destroys the general who ordered the attack, and… nothing improves.

This escalating pattern defines the entire movie. Apparently Cook believes that, because Jenkins depicts war in her movie, she perforce endorses it. But Diana, trained in single combat, thinks war morally vacuous and diabolical, hoping to uproot it altogether. Ultimately, Diana’s journey isn’t to defeat war; it’s a journey to discover war’s systemic nature. No one nation, army, or general holds ultimate culpability for war. Rather, human power structures keep everyone fighting over diminishing scraps.

The real Erich Ludendorff, left, and Danny Huston as Ludendorff in Wonder Woman
Cook claims Wonder Woman positions the Allied leadership as virtuous and peace-seeking, compared to the murderously war-mongering Germans. Yet General Erich Ludendorff murders most of the German high command when he wants the war to continue after they’ve lost hope. Meanwhile, when Steve Trevor warns that ignoring Ludendorff could cause thousands of soldiers to die needlessly, a British general, accustomed to leading from the rear, sneers: “That’s what soldiers do.” Hardly the Manichaean dualism Cook alleges.

Most important, the British and German leaders aren't negotiating for peace; they’re negotiating an armistice. A cessation of active hostilities, which would prove toxic. In reality, Ludendorff survived the war and proselytized the “Stab-in-the-Back Theory,” a leading intellectual justification for rising Nazism; see Christian Ingrao. The armistice didn't solve the underlying problem, it just punted everything onto the next generation. In the final reveal, the armistice serves supernatural war efforts; peace was never on offer.

Throughout the movie, Diana pursues Ares, the war god, whom she accuses of fomenting this violence. Cook takes this accusation so literally, I wonder if he actually saw the entire film. The climactic confrontation reveals that, while Ares put ideas in human heads, humans ultimately didn’t need supernatural incitement to war. Humans, Diana discovers, are a mixture of violence and kindness, of destructive and constructive capabilities. She could defeat war, but only by obliterating humanity.

In fairness, one of Cook’s criticisms holds water. Israeli model-turned-actress Gal Gadot has, as Cook writes, served in the Israel Defense Forces, and the IDF has a dismal human rights record. Working from known dates, Gadot probably participated in Israel’s illegal occupation of southern Lebanon. But Israel has compulsory military service laws; all Israelis, except ultra-Orthodox groups, must render two years’ service. Gal Gadot served in the IDF; so did Dr. Ruth Westheimer. So what?

This picture doesn't serve my theme; I just really like that it exists

In my youth, I attempted (unsuccessfully) to join the Marine Corps. Later, after my views shifted, I waved placards in anti-war demonstrations. I’ve observed the military-industrial complex from both the pro- and anti-war camps, and friends, if Wonder Woman glorified war, violence, or nationalism, I’d say something. The events onscreen simply don’t justify any such interpretation. Throughout, we witness Diana’s dawning realization that violence, while sometimes necessary, never fixes humanity’s underlying problems. War is not praiseworthy.

Many movies have glamorized war, often at the expense of reality. From classics like The Longest Day to recent films like American Sniper, Hollywood has often bought into myths of wartime glory, and peddled Security State drivel to unsuspecting young audiences. But Jonathan Cook’s accusation that Wonder Woman joins these ranks is so unsupported by onscreen evidence, one almost suspects Cook didn’t watch the movie and instead reviewed other people’s rumors. Wonder Woman isn’t a pro-war movie.

Friday, July 7, 2017

The Samurai, the Chef, and the Sunset Strip

1001 Movies To Watch Before Your Netflix Subscription Dies, Part 20
Juzo Itami (writer/director), Tampopo

A pair of rough-hewn truckers in Stetson hats pull up to a local business and defend the proprietress’ honor. Pretty standard Spaghetti Western fare. Except these truckers are Japanese, the proprietress operates a traditional roadside ramen shop, and the biggest trucker gets his ass kicked. We know we’ve ventured off the high-minded track of art-house cinema. But don’t get comfortable: you’ve entered a roller coaster of upended movie conventions.

Writer-director Juzo Itami combines boilerplate from cowboy action movies and samurai films with Monty Python-esque slapstick, to create a rapid-fire comedy that has all the heft we associate with the words “foreign film,” and none of the tedium. His combination of Westerns and samurai traditions isn’t necessarily unusual; directors have frequently remade films from one tradition into the other. Itami breaks the mold, though, with humor pitched distinctly at Western audiences.

That local business is a roadside ramen shop, which sophisticated Japanese cities have the way American cities have coffee shops. They provide loci of community, places where people get together, share stories, make friendships, and scheme revolutions. Japanese communities lacking ramen shops try to lure new ones in. But this shop is failing, and its proprietress doesn’t know why. She needs a savior, and the truckers need a mission. Hmmm...

The lead trucker, Goro (“bull” in Japanese, played by Tsutomu Yamazaki, an Akira Kurosawa veteran), has drifted through Japan inside his truck, alone and bored. Like the best cowboys or samurai, Goro thinks he needs only himself, until circumstances trap him in one location. Rough-hewn and westernized, Goro doesn’t fit into the ramen shop’s clean white surroundings. Yet he feels a call to rescue the pretty young owner.

That owner, Tampopo (Japanese for “dandelion,” played by Nobuko Miyamoto, wife of director Itami), has fallen on hard times. Recently widowed, she struggles to keep her family business active while raising a precocious son unaided. But the pressures overwhelm her; her kitchen is a chaos of comically misplaced food, which she navigates in a flurry of flying elbows and panicked screams. She seems lost behind the counter, until the badly beaten Goro reawakens her passions.

Described in prose, the plot seems formulaic. This isn’t coincidence. Itami deliberately satirizes cowboy and samurai movies, the ways their conventions still dictate modern Japanese gender identities, and how trapped modern people are in traditional roles. Goro and Tampopo glom onto each other because they must, because they perceive their reciprocal relationship entirely in terms inherited from glamorous movies. Toho Pictures and Hollywood Boulevard circumscribe our life choices.

Nobuko Miyamoto (left) and Tsutomu Yamazaki in Tampopo

But Itami also introduces sly elements ridiculing those who’d break from convention. Between long blocks of his main plot, Itami offers quick sketches, timed with Monty Python briskness, of people trying, and failing, to change. A schoolmistress teaches girls to eat noodles quietly, like Westerners, but gets undermined by a noisy, slurping American. A junior businessman incurs his employers’ wrath by simply trying to order his own dinner.

The longest recurring subplot involves a flashy Yakuza (organized crime boss) and his mistress. In a string of scenes, many wordless, these two find ways to make sharing food an intensely erotic experience. Japanese culture has deeply conflicted traditions surrounding expression of sexuality, so these scenes feel transgressive. We feel weird watching their food-based intimacy, but they’re so genuinely inventive, and so altogether earnest, we’re also comedically hypnotized.

Goro and his boyish sidekick Gun (played by Ken Watanabe before his career got Americanized) gradually rehabilitate Tampopo’s restaurant, turning it into a gleaming citadel of home-cooked goodness. But the effort of rebuilding the business domesticates Goro; he becomes accustomed to sleeping in the same bed and having hot meals nightly. Like cowboys and samurai throughout film history, he both loves and fears the changes domesticity forces.

We spend the entire movie wondering whether Goro and Tampopo will ever escape the prefab roles they’ve fallen into. One moment, we think they’ll assert some individuality; the next, they grab onto the familiar like a life preserver. Much of this movie’s comedy comes from these characters’ complete inability to know themselves. Life dangles happiness before their eyes, but they can’t imagine their lives as anything different from what came before.

With its mix of Western and Japanese elements, and its casual, conversational dialog, this movie has become a staple of college-level Japanese language courses. Though dubbed versions exist, please watch it in Japanese. The actors’ remarkably understated performances and deadpan delivery make this a comedy classic transcending language barriers. Its metaphors of food and relationship resonate, whatever language you speak.