Monday, February 27, 2017

Folk Circuit Foot Soldiers

1001 Albums To Listen To Before Your iPod Battery Dies, Part Five
Wolfstone, Year of the Dog


Now that the Celtic folk music revival of the 1990s is barely remembered, it’s hard to comprehend that people ever argued whether the Pogues or the Chieftains, the Waterboys or Loreena McKennitt, was the superior artist. Contemporary or traditional? Electric or acoustic? Into this verbal jousting came Wolfstone, a band from Inverness whose undiluted mixture of traditional and rock elements was similar to, but quite unlike, anything that came before. Or, unfortunately, since.

This was the band’s fifth album—though they disavowed their first two releases, making this their actual third. With this recording, they finally made the complete jump from a dancehall band to an actual mass-media presence. By making their drummer a full member and abandoning sequenced loops, they shed their former recordings’ late-Eighties sound, fully incorporating the crashing, organic sound of wood striking goatskin. With their embrace of a full band, but rejection of Riverdance niceness, they peaked.

Wolfstone had a continually rotating membership; they exhausted more drummers than Spinal Tap. However, their best work centered on a quartet of guitarist Ivan Drever, fiddler Duncan Chisholm, guitarist Stuart Eaglesham, and his brother, keyboardist Struan Eaglesham. Drever also wrote most of their best songs, some with Chisholm. Their mix of acoustic and electric instruments, melding folk and rock styles, creates an integrated sound that, rather than blending two styles, forges one wholly their own.

Saying they have “rock stylings” could be misleading. Some of their tracks show influences of late-Eighties commercial hard rock, driven by percussive chords and a synthesizer foundation. Others have a quieter, more atmospheric edge reminiscent of acts like the Cocteau Twins or Dream Academy. Either way, their sound was already displaced, a heritage structure that belonged to a prior time. Like the folk music they liberally sampled, their rock was a holdover from another era.

The band’s fondness for circular rhythms and percussive backbeats reflects its heritage of paying dues on the “Highlands and Islands” dance-hall circuit. Though their lyrical content ranges from Scottish history to America’s present to Norse mythology, they remain rooted in a sound designed for dancing. Their clear, synth-driven chord progressions and sing-along choruses cut audibly through surrounding noise; even this decades-old recording sounds piercing enough for house parties and driving around with the windows down.

Fiddler Duncan Chisholm and guitarist Ivan Drever
playing live at Wolfstone's mid-1990s peak

Beyond a doubt, this is a political album. It opens with a track, “Holy Ground,” about the suffering caused by violence in Northern Ireland. They also include “Brave Foot Soldiers,” about the struggle for Scottish independence, and “Braes of Sutherland,” a lament by a Scotsman evicted from his homeland during the Clearances. Only one song, “The Sea King,” doesn’t have political themes. Loosely adapted from an Orkney folk poem, it eulogizes a fallen conqueror of the islands.

American audiences will particularly appreciate the song “White Gown,” about one’s refusal to bow before racism and bigotry. Purportedly, Wolfstone, who enjoyed a semi-permanent status on America’s folk festival circuit in the early 1990s, discovered they had played a concert within driving distance of a Klan rally. Horrified, they wrote a song about standing straight and unbroken, even unto death. With the newly visible resurgence of organized bigotry in America, this couldn’t be more timely.

Besides the five songs, this album also includes four instrumental medleys. All four mix traditional folk melodies with original compositions, though the original tunes, composed by the band’s instrumentalists, have distinctly traditional roots. These are essentially dance-hall folk tunes bolstered with an electronic backbeat, permitting either Riverdance-ish traditional jigs or more contemporary, unplanned dancing. The closing track, “Dinners Set,” particularly resembles the theme for some revisionist fantasy epic, ending the album on a high note.

Don’t mistake this album for something new. Recorded when the Internet barely existed, in a studio that maxed out at sixteen tracks, and sold primarily by mail-order and merch table, this album belongs securely in its time. In the hangover from the Reagan-Thatcher generation, Wolfstone, like other bands of their times, had a squeaky-clean sound and earnest lyrical thrust that squarely reflect their era. Despite politically charged hard edges, this album remains aggressively, soberly nice.

Sadly, this album and its follow-up, The Half Tail, about northern Scotland’s continuing economic decline, saw the band at its peak. After this, Wolfstone struggled to clear its contractual obligations, while two members of its central quartet left to pursue side projects. They cranked out two sub-par albums and a best-of collection before going into hiatus. Currently, a ghost of Wolfstone tours Scotland, occasionally self-releasing albums. Like the Beatles, Wolfstone made a sound that belongs to its time.

Wednesday, February 22, 2017

The Married Woman's Guide to a Happier Afterlife

Abby Fabiaschi, I Liked My Life: a Novel

Maddy Starling doesn’t let mere physical death stop her mothering her family. She keeps monitoring her husband and daughter, trying to comfort young Eve through her grief, and helping Brady audition for his second wife. But Maddy’s family can’t move on. They haven’t yet come to terms with why anyone, much less their high-spirited and involved wife and mother, would commit suicide just hours after buying ingredients for next Sunday’s dinner.

Abby Fabiaschi’s debut novel is probably directed at a primarily female audience. I say this not disparagingly, but in recognition that women have always been America’s chief producers and consumers of fiction. I also recognize that Fabiaschi’s story doesn’t follow the guy-friendly pattern of rising action, climactic conflict, and resolution. Instead it’s made of moments, a cascading pattern of highs and lows, much like a relationship. Or like life.

Though disembodied, Maddy remains a presence in her family’s life. Not just in their memory, either; her new semi-omniscience couples with an ability to subconsciously nudge loved ones, letting her continue providing the guidance they’ve grown accustomed to. Brady and Eve are both driven Type-A personalities, who needed Maddy’s influence to soften their interactions. Now father and daughter must learn to get along without Mommy to grease the gears.

Brady climbed the corporate ladder to become CFO of a ranking Massachusetts corporation (I missed what they do). He justified his marathon hours as necessity: if I stop working, my family will lose its creature comforts. But with Maddy gone, Brady realizes he doesn’t really know the family he’s supported. Worse, he doesn’t know himself, because he’s buried his personality behind responsibilities. And he can’t heal if he can’t learn.

Sixteen-year-old Eve’s youth of privilege ends abruptly when she discovers that not money, but Maddy, protected her from life’s sharp edges. Suddenly the endless round of high-school athletics, gossip, and boyfriends seems hollow. She begins a quest most girls don’t begin until after college, the quest for meaning intrinsic to herself. As she discovers a family history of mental illness and self-destruction, that meaning seems increasingly elusive.

Abby Fabiaschi
Lesser novelists might have written this novel on maudlin themes of loss and heartbreak, of learning to build a “new normal” without the foundations we previously took for granted. Fabiaschi front-loads those themes into early chapters, then dispenses with them quickly. For her, the gap between reality and expectation, between the reality of purpose and the illusion of control, matters more. These characters learn that having purpose means sacrificing treasured illusions.

Admittedly, early chapters could try readers’ patience. Eve and Brady begin the novel newly bereaved, casting about impotently for blame and explanations, that is, for illusions of control. It’s necessary to face these early chapters to have the payoff that comes later, but audiences weaned on rising action may find such introspection maudlin. Don’t be fooled. These characters invite us on their journey, and it starts in their bleakest moments.

Rising action is, arguably, readers’ illusion of control. The well-framed, writerly structure comforts us in believing the work was constructed, as Roland Barthes put it, “theologically,” with a purpose inherent to the author. When reality has us casting about helplessly, reducing us all to agnostics, at least everything we read has conscious control. At times throughout this book, we wonder if that’s true. Fabiaschi, to her credit, lets us wonder.

Throughout this story, told by rotating first-person narrators, the question of Maddy’s death remains persistent. Not only does her suicide seem unmotivated, her means—jumping off Wellesley College’s library roof—seems more symbolic than practical. Though Maddy tells one-third of the novel herself, she remains remarkably cagey about the exact circumstances. Experienced readers know that, the longer a secret is withheld, the bigger it had better be.

In a rare surprise for literary fiction, the twist reveal proves sufficient to reward readers’ interest. No, I won’t reveal what the characters have to earn. But suffice to say, it brings what they’ve discovered about Maddy’s life to a culmination. It emphasizes that the characters’ faith in justice and purpose has merit. Though committedly unreligious, the Starlings trust the universe has an arc of justice, which will reward their unknowing.

The epilogue makes me suspect this novel has autobiographical elements. But not many. Fabiaschi creates a literary, magic-realist fantasy of incisively realistic proportions. She invites us to confront grief and loss, but also life’s annoying tendency to persist. And she reminds us that the times which most incline us toward despair are the moments that most infuse life with meaning.

Tuesday, February 21, 2017

The First Church of Jesus H. Trump

If you follow Blue Facebook, the consensus is clear: President Trump’s first solo press conference last week was catastrophic. His attacks on the assembled press, attempts to silence dissent, rambling responses, and lack of coherent message basically invited mockery from the coterie of hack comedy writers and amateur psychologists cluttering one side of politics. So you may have felt surprised when conservative bomb-thrower Ann Coulter posted this gem on Twitter:
Yes, Ann Coulter, a woman whose views are so extreme that the National Review fired her, had a transporting moment of religious ecstasy while watching the President’s performance. Though Trump’s first solo outing was so bad that even Fox News’ Chris Wallace turned on the President, Ann Coulter is practically shouting “Habemus Papam!” on today’s most-quoted social media platform. So I realized: of course she did. He’s clearly a religious figure.

More important, the performance Trump’s opponents derided wasn’t word salad. Repetitions of words like “tremendous” and “great,” present progressive verbs like “are becoming,” and simple explosions of phrases like “not good” when pressed on Russian spy ships inside American territorial waters, suggest a desire, not to inform Americans, but to get us shouting in unison. If Donald Trump is a religious leader, he’s using familiarly liturgical language.

Liturgy is the shared repetition, usually in unison, of religious language amid large groups and congregations. Frequently, even powerful religions like Christianity and Islam are riven by dissension about theological concerns; but they’re unified by the experience of repeating community language like the Lord’s Prayer or the daily prayers of Salah. In many ways, liturgy, far more than faith, is what keeps religions and their believers unified across time.

This isn’t incidental to the religious experience. Sociologist Émile Durkheim notes that religions which maintain strict liturgy have much lower incidence of apostasy than religions soft on liturgy. Many evangelical Christian groups in America and Europe have eschewed liturgy as mere mindless repetition, and are now seeing membership flake off at unprecedented rates. Strictly liturgical traditions like Catholicism or Lutheranism, by contrast, are stable or even slightly growing.

Donald Trump’s very simple repetition of emotive adjectives like “tremendous,” “great,” and “really” has the practical purpose of keeping his own famously brief attention span anchored on admittedly tedious activities like press conferences. But when he repeats key phrases like “Believe me,” “We’re going to win,” and “Political Correctness,” these terms aren’t for him. He, or whoever writes his Teleprompter speeches, wants audiences at home to repeat these terms.

In case anyone forgets these two have a history: Ann Coulter is a longtime Trump evangelist

Just as Christian congregants recite “We Believe” while speaking the Apostles’ Creed in worship, Trump’s assertions of himself as believable join True Believers together in a shared experience. The verbal reminder creates mental pathways believers can follow when asked to explain their beliefs to outsiders. Christians have a clear creed of the Father, Son, and Spirit they can recite in under two minutes. Trump supporters have their leader’s pithy, often funny catch phrases.

This extends to creating a group identity. When Christians repeat “We Believe,” they implicitly state they don’t believe the opposite. What the Apostles’ Creed says implicitly, Donald Trump says explicitly. His use of paired descriptors advances this claim, as when he contrasts “winning” with “losers,” how “smart” he is with whomever he dubs “moron,” declaring himself “strong” while calling his opponents “lightweight,” and most blatantly, his pairing “we” with “they.”

Creating this shared experience gives believers a group identity, which they can deploy against challenges. Assailed by pluralism and secularism, Christians retain their religious identity because they have confidence, not only in God, but in other Christians. Likewise, Trump supporters, feeling embattled by America’s changing economic and demographic constituency, are nevertheless emboldened in their beliefs because they aren’t going forward alone.

Beyond doubt, this can create problems. Anti-science Christians, abortion clinic bombers, and the Westboro Baptist Church are obvious examples of people so emboldened by identity that they become destructive. Perhaps this arises from a purely oppositional religious position. But purely oppositional politics is no better, if preserving “white working class” identity means keeping minorities, homosexuals, and immigrants down. Any identity can become toxic under stress.

Bruno Latour writes that religious language (almost) never attempts to convey information. “What [divine injunctions] transfer is not information content, but a new container,” he writes, meaning religious language doesn’t inform the hearer, but transforms them. While Trump’s opponents “fact-check” his claims, believing they primarily contain factual propositions, True Believers internalize the form, becoming new together. Facts don’t matter when confronting liturgy. Ultimately, only what we’re becoming, now, together, matters.

Friday, February 17, 2017

The Voices of Catastrophe Capitalism

1001 Books To Read Before Your Kindle Battery Dies, Part 79
Sheldon Rampton & John Stauber, Toxic Sludge Is Good For You: Lies, Damned Lies, and the Public Relations Industry, and Trust Us, We're Experts: How Industry Manipulates Science and Gambles With Your Future

Reality exists. This has become a controversial proposition in today’s “culture wars” milieu, where my vehement assertion is equal to your demonstrated fact, but it shouldn’t be. Reality isn’t always obvious, as when talking about pollution that doesn’t leave air crunchy or water flammable, but even when we can’t clearly detect reality, it still exists. And if anybody tells you it doesn’t, you should immediately investigate who’s bankrolling their message.

Journalists Sheldon Rampton and John Stauber have won the prestigious Orwell Award for unpacking the money trails behind many widely held opinions in contemporary culture. Turns out, our opinions have a detectable family tree, one often openly for sale. The wealthy and powerful plow remarkable funds into the public relations and spin industry, to ensure our opinions accord with their financial interests. Often at the expense of reality.

Rampton and Stauber’s first book, Toxic Sludge Is Good For You, took its name from a cute Tom Tomorrow cartoon. But, they reveal in a later chapter, a PR representative contacted them late in the production process, asking them to change their title, because the industry literally had a campaign underway to convince Americans that sewage sludge, which includes industrial runoff and is frequently radioactive, would make good agricultural fertilizer. Satire writes itself.

They describe an industry which actively pitches a malleable reality. A network of advertisers, paid experts, and Solomonic eminences spreads viewpoints that (coincidentally, I’m sure) accord with whoever signs their paycheck. These agencies interrupt the news you depend upon to inject opinion, often at odds with demonstrable fact. But it creates the appearance of debate and controversy in public spheres. And fights that should end, instead continue indefinitely.

The most interesting chapter deals with President Clinton’s attempt to create national health care. When Clinton ran in 1992, health care wasn’t a side issue; he won the plurality in no small part behind pledges to standardize medicine access. Even Senate Minority Leader Bob Dole warned Republicans not to run against health care in 1994. But by 1995, an industry-funded PR campaign had rebranded the effort as Hillarycare, made it poison, and killed it in Congress.

After a brief detour into farming practices, Rampton and Stauber’s third book, Trust Us, We’re Experts, describes how PR professionals apply their techniques directly to science. It goes into further depth on something they previously touched on: the use of paid third-party representatives and front groups. Organizations with seemingly scientific names and superficially neutral mission goals are actually in the pockets of industries profiting from the status quo.

You’d have to dig deep to find independent or university-affiliated scientists who dispute certain claims: that humans are driving global warming, or secondhand smoke causes cancer, or organochlorides endanger wildlife. Yet somehow, these stories remain hotly debated in news media. Keeping the mass population confused keeps debate alive at the legislative level, and culture-wide efforts to redress these problems are consistently DOA.

Tom Tomorrow's original toxic sludge cartoon.

Mass media debates, subsidized by “think tanks” with opaque money trails, create the illusion of 50/50 splits. But on issues of immense moment, our authors demonstrate, debates are kept alive by as few as three experts who make their living doing the Sunday talk-show rounds, mostly supported by non-scientific backers. The illusion of controversy prevents meaningful change from beginning, even on issues wholly uncontroversial inside scientific communities.

Both books emphasize the role a highly credulous, and often cash-strapped, news media plays in perpetuating these debates. TV news especially, a source of network prestige during the Cold War, is a neglected stepchild today, desperate for low-cost content, and often reruns PR-produced content strictly “as-is.” Reader’s Digest, which cultivates an appearance of generalist dispassion, often works hand-in-glove with PR agencies and their backers.

It may seem counterintuitive to suggest anyone profits when debates continue interminably. But as our authors demonstrate, people who profit off the status quo needn’t win debates to really win. If debates on, say, global warming drag on, we’ll continue burning hydrocarbons and strip-mining mountains because, hey, why not. Bankrolling paid experts who don’t win debates, but merely cast doubt on material evidence, bolsters corporate profits exponentially.

Consider the debates that seem perpetual. Health care and global warming are merely the most visible examples. Why would anybody believe, against the evidence of short winters and prolonged droughts, that Earth’s climate isn’t becoming less hospitable? Because the illusion of controversy keeps stories alive past their sell-by date. When stories that seem ironclad remain debated, it’s often best to consider who profits. Some people get rich watching our world burn.

Wednesday, February 15, 2017

Hugh Jackman and the Myth of Wolverine

Promo photo via 20th Century Fox
Australian actor Hugh Jackman was thirty-one years old when he played Wolverine, the role that made him a bankable global star in Bryan Singer’s 2000 film X-Men. Now forty-eight, he has completed filming his ninth (and, according to his Twitter, final) appearance as Wolverine for this year’s Logan. His age range across the movies, playing a character substantially (physically) unchanged since 1974, makes me wonder: how do mortal actors play immortal characters?

In 1972, before Wolverine existed, Italian semiotician Umberto Eco wrote a much-read essay, “The Myth of Superman.” The title is more than slightly misleading, since it’s about Superman’s audience, not Superman himself. But touching on how readers enjoyed serial stories like Superman comics, in the days before narrative continuity and “realism” became storytelling priorities, Eco notes how each story, though nominally independent, transforms the reader.

Eco’s central theme is “consumption.” Superman, who had by that time existed for about thirty-four years, isn’t constrained by time. He remains youthful, square-jawed, idealistically American. But his audience doesn’t, and as they continue reading, their ability to consume comics evolves. Every time Superman punches Lex Luthor, he nominally ages, nominally “consumes” himself—but not really. He actually consumes his relationship with his audience, and must constantly transform to remain relevant for changing readers.

By contrast, Wolverine doesn’t even nominally age. His “mutant healing factor” doesn’t merely stave off physical injury; the ravages of time don’t even influence him. His skin doesn’t leatherize, his joints don’t creak, his hairline doesn’t recede. Since I debuted the same year as Wolverine, I appreciate this mutation. But it also serves comics’ underlying conceit, that events published years, decades earlier, are somehow still recent, and events are always happening “now.”

Promo photo via 20th Century Fox
Though Wolverine’s superpowers keep him eternally thirtyish, Jackman lacks that blessing: besides deepening lines in his face, concealed behind makeup and copious facial hair, Jackman recently had his sixth—sixth!—skin cancer removed. Jackman cannot play the role forever, simply because he cannot resist time. Eventually the franchise must either end, or the role must be recast. Either option has dire implications for Wolverine going forward.

Critic Djoymi Baker writes that actors from influential franchises inevitably carry their characters into other appearances. George Takei always trails Hikaru Sulu’s clouds of glory behind him, something he used to his advantage in the series Heroes. We could continue: no matter how many roles David Tennant plays, the whiff of The Doctor always follows him. American audiences cannot see Patrick Stewart as Professor X without Captain Picard coloring our perceptions.

But the reverse also remains true. A role always has the imprint of the actor most associated with it: attempts to recast Michael Keaton’s restrained, whispering Batman became progressively worse as different actors attempted to play the same Bruce Wayne. Chris Pine as Captain Kirk remains disappointing as he cannot shake Bill Shatner’s shadow. And Brandon Routh as Superman… well. The only lead character ever successfully recast is James Bond.

That’s why, when existing superhero franchises like Batman or Spider-Man recast their leads, they also reboot altogether. As Hugh Jackman ages out of the Wolverine role, the entire franchise is jeopardized. (Notice that, unlike in comics, Cyclops and Jean Grey remain dead. [Edit: oops. I was behind in my watching.]) In comics, characters like Wolverine or the original X-Men retain their youthful vigor because they’re a collaborative invention of the artists’ and readers’ imaginations. In movies, actors get old.

Actors sometimes don’t take this well. Jackman has followed the Harrison Ford model of building a diversified career so he’s never yoked to a single role. But consider how, aging out of Superman, George Reeves committed suicide, while Christopher Reeve essentially retired to yeomanry. The popularity of Birdman notwithstanding, Michael Keaton’s leading-man career essentially ended after Batman. God help Henry Cavill if he ever works for scale again.

Actor Hugh Jackman out of costume
Superheroes’ intimate relationship with their respective actors really consumes both. Since 2000, Hugh Jackman has inextricably been Wolverine, a portrayal that spills into the comics. Likewise, the reciprocal relationship between Nick Fury and Samuel L. Jackson resembles nothing seen since DC first retooled their Batman comics to portray Adam West. As Hugh Jackman uses up his ability to play Wolverine, he uses up Wolverine himself.

Grant Morrison has observed that superheroes reflect their audiences’ needs, which change with culture’s constant evolution. But depicting superheroes with live actors fixes them in time. To survive, Wolverine must completely abandon Hugh Jackman, an unlikely proposition for now. As actors and audiences age, characters become subject to time. And time is always unforgiving.

Monday, February 13, 2017

Because of the Wonderful Things He Does

Dorothy (Adria Arjona, center) and Toto arrive in Munchkinland
Everyone from the Atlantic to Rotten Tomatoes feels compelled to notice how NBC’s event series Emerald City looks like producers ran The Wizard of Oz through Game of Thrones like a coffee filter. The implication, not universal but widespread, holds that this is a bad development. But arguably, America needs such a twist today. Oz remains a primordial American myth, and in today’s pessimistic environment, a darker reading may be exactly what audiences require.

It’s hard to imagine a more literal manifestation of Joseph Campbell’s “Hero’s Journey” than L. Frank Baum’s The Wonderful Wizard of Oz. With its literal road through the signposts of Dorothy’s transitions, it clearly signifies America—in its own figuration, an eternally youthful country—developing from the horse-drawn, primarily oral Nineteenth Century to the more solidly factual, petroleum-burning Twentieth. It remains a compelling story, because it’s a story about us. Emerald City reverses that trend.

It begins with an adult Dorothy (Adria Arjona) manifestly unhappy in small-town Kansas. Expository dialog and brief scenes reveal Dorothy, a nurse, has commitment-free sex with doctors inside her chain of command and steals medication from patients for her ailing adoptive parents. She’s also visibly brown, which, if you know anything about Kansas outside Douglas and Johnson Counties, makes her an unwanted but economically necessary outsider, somebody targeted for harassment in today’s divided red-state America.

Swept into Oz by a brutally realistic whirlwind that lacks Judy Garland’s semi-Freudian whimsy, she crash-lands in a fairyland that leaves her bloody, and, of course, kills the witch. She immediately encounters “Munchkins” who are neither diminutive nor cute. Literary critic Evan Schwartz writes that Baum invented the Munchkins to manifest his guilt over having advocated, as a young South Dakota newspaperman, the eradication of American Indians. These Munchkins represent America’s inherited sense of guilt.

The Wizard (Vincent D'Onofrio, center), surrounded by the witches of Oz,
in an advance promotional poster from NBC

One could list the ways Baum’s world represents various transitions. The sentient apple trees as a fear of malevolent nature, our own or that of the outside world, which we must control to achieve maturity. The poppy field inducing people to fall asleep where they are, and avoid contacting the adult world. The city itself, where the lack of magic, and the ultimate disappointment of technology, ultimately throws Dorothy onto her own devices.

This retelling reverses many such symbolic moments. The narcotic poppies don’t want to stop Dorothy’s adulthood, they want to drop her in a prison of existential malaise. The Wizard (Vincent D’Onofrio) remains impotent and powerless, almost a tragicomic figure when standing alone, but maintains the illusion of authority by being capricious, vengeful, and destructive. The Scarecrow (Oliver Jackson-Cohen) represents the one moment of Dorothy’s maturation, when she accepts responsibility for something larger than the moment.

Comparisons to Game of Thrones aren’t untoward. This version is both violent and sexually forthright, unafraid both to establish well-liked characters only to kill them, and to engage in titillation. By episode seven, Dorothy and the Scarecrow fall into bed, while episode five has young Tip, a rendition of Baum’s lesser-known character Ozma, in a Sapphic bathtub scene with another very young woman. For a network production, Emerald City isn’t afraid to skirt FCC boundaries.

Speaking of Ozma, this series’ reach is remarkably catholic. Though The Wonderful Wizard of Oz was a primal American myth, it also became an unquestioned cash cow, and Baum ultimately published thirteen sequels, many written in haste, which contradicted one another. This series incorporates Ozma, Nick Chopper, Queen Ev, and other characters Baum scattered widely. This results in a decentralized story, but it also reflects the multiple, divergent storylines necessary for any modern American myth.

The Scarecrow (Oliver Jackson-Cohen, left) tries to persuade Dorothy
that they should continue as a team

If Baum’s original Oz symbolized America’s transition from agrarianism to high technology, this Oz symbolizes America’s inability to agree upon whether reality even exists. Episode six reveals that Dorothy was born in this mythic land, not in “reality” as she knows it. D’Onofrio’s Wizard goes from confused, sympathetic uncle to vindictive fascist with, ahem, whirlwind speed. As in America dominated by MSNBC and Fox News, characters’ individual preconceptions shape their apprehended reality, not vice versa.

One prior attempt to update Oz also reflected its time. Syfy’s three-part Tin Man reset Baum’s characters in a similar morally ambiguous steampunk universe. But, released in the waning days of George W’s fervent nationalism, Tin Man showed Oz possessed by an evil spirit it must shake off. Emerald City shows Oz occupied… but by whom? Both the Wizard, and the Witches he controls, lie with casual impunity. Oz, like America, is dominated by confusion.

Friday, February 10, 2017

Brass Canary in the Coal Mine

1001 Movies To Watch Before Your Netflix Subscription Dies, Part 17
Mark Herman (Writer/Director), Brassed Off

Aging Danny Ormondroyd (Pete Postlethwaite) works digging coal in a distant Yorkshire village, but considers his real career directing the local brass band. When pretty, young, posh-talking Gloria Mullins (Tara Fitzgerald) asks to sit in with Danny’s previously all-male band, noting she was born local—and switching her accent quickly—Danny swallows his objections because she can play. But then his boys, who also dig coal, realize Gloria’s dark secret: she’s management.

Yankee Doodle film distributors pitched this movie as a romantic comedy between Fitzgerald’s character and angry, disillusioned trumpeter Andy Barrow (Ewan McGregor, back when he still did mainly indie films). In fairness, that’s a prominent subplot. But in reality, this movie centers on the three-way struggle between Yorkshire coal miners, the state-owned National Coal Board monopoly, and the labor union workers no longer trust. It’s like The Full Monty, with horns.

Danny directs his band with quiet restraint, standing at his podium with a dignified bearing beyond his working-class roots. But his energy has been depleted recently, which he attributes to age. Danny’s son Phil, shackled by massive debt, has just watched helplessly as his wife and children left. Phil takes a second job, performing as a clown at children’s parties. Danny considers this beneath his talented son, but Phil lacks his father’s ideals. You can’t eat music.

Andy Barrow greets Gloria’s arrival with cold disdain. She wonders why, until he reminds her: they had their first fumbling attempts at teenage romance together fifteen years earlier. She left Grimley, adopted a London accent, and became a public servant; he stayed and dug coal. Now she’s back, and he can’t help wondering why. Still, old feelings die hard, and before long, they fall in love again. Until he learns she’s there to evaluate whether the local mines have any economic future.

These two narratives, and a few others besides, drive the story. And the symbolism isn’t subtle. Danny’s inability to communicate with his struggling son represents the relationship between the nationwide union and the hurting workers, a representation made more poignant when Danny is diagnosed with black lung disease and given months to live. Andy and Gloria, meanwhile, represent the relationship between management and labor: they need, but cannot trust, one another.

Members of the Grimley Colliery Band, based on the band, village, and mine in Grimethorpe, Yorkshire, are the epitome of the British working poor. Still suffering from debt a decade after the anti-Thatcher miners’ strike of 1984, which lasted a full year, the miners consider their union as bad as management. They don’t realize the National Coal Board is in the process of being privatized, and that any ethical underpinnings or public good will soon no longer matter in business decisions.

Ewan McGregor, Tara Fitzgerald, and Pete Postlethwaite (front row L-R) in Brassed Off

Workplace bands matter in British tradition, a way of banding together and enforcing shared identities that aren’t circumscribed by work. Americans do something similar with softball teams. Mines, factories, retailers, government agencies: many organize marching bands in their off hours, to entertain the community and compete regionally. Following Britain’s massive economic contraction of the 1980s, the bands often outlived the workplaces they represented.

This movie continues a respected British cinematic tradition, of losers fighting a losing battle simply because it’s right. From David Lean’s war movies to the classic Ealing Studios comedies, British filmmakers have long celebrated glorious losers. They can’t fight the tide, these movies imply, but simply refusing to concede makes them noble. But as Danny’s hacking cough starts bleeding, this movie also asks whether being noble matters enough.

The Thatcherite government shuttered Grimethorpe’s mines in 1993, forcing nearly half the village into unemployment lines. By 1994, the EU declared Grimethorpe Britain’s poorest community, and entire neighborhoods were demolished because nobody could afford the taxes and maintenance. Though the community has seen some revival in recent years, Grimethorpe remains a byword for economic stagnation. And the band remains synonymous with stiff-upper-lip perseverance.

In some ways, this movie eulogizes a way of blue-collar life already dying in Britain. In 1983, Britain had 176 active coal pits; by 2009, it had six. And the movie isn’t shy about its political motivations; for the American release, distributors bookended the film with title cards about Britain’s working class decline so less familiar audiences could understand. But this movie isn’t naĂ¯ve. The Thatcherites sealed miners’ coffins, but economically, they were already dead.

These miners were, potentially, ripe for militant nationalism and economic protectionism. But the final scenes imply they refuse to let fear dominate them, because they have each other. As the band circles Westminster, loudly playing “Land of Hope and Glory” from the top of the double-decker bus, the camera dollies in on Danny, who knows he’s walking dead. He watches his band, beaten but not broken. Tears fill his eyes. And, if we’re honest, ours too.

Wednesday, February 8, 2017

Is Economic Bridge-Building Still Possible?

Chuck Collins, Born on Third Base: a One Percenter Makes the Case For Tackling Inequality, Bringing Wealth Home, and Committing to the Common Good

Activist Chuck Collins was born rich. Like, really rich. Probably richer than you’re thinking right now. A fourth-generation heir of the Oscar Mayer meatpacking fortune, Collins enjoyed the accoutrements of inherited wealth, like education, social networks, and job opportunities. But a brief sojourn volunteering among America’s poor opened his eyes. He committed his wealth to charity, and has spent decades working to build bridges across class divides.

This, the latest book in Collins’ remarkably prolific career, comes across as half autobiography, half call to action. (Note, not “call to arms”—Collins adamantly opposes war, even as metaphor.) He describes meetings with Americans from throughout the economic spectrum, organizing the working class, well-off retirees, corporate boards, and others. His message rings true to those familiar with history’s great successful reform movements: only a joint effort of rich and poor can truly reverse America’s widening inequality.

To Collins, inequality isn’t an issue exclusively of the have-nots. The upward concentration of wealth has resulted in a remarkable economic narrowing that stifles entrepreneurship and weakens demand. Collins describes this in both statistics and storytelling, a masterful application of appropriate anecdote to further a less obvious general claim. He makes the stories of inequality, and their eventual remedy, seem human. (I should note, this book launched in September 2016, so his tone already looks somewhat dated.)

Union across class divides, not an attempt to demonize the rich, makes the most sense to Collins. Well, it would, but he makes a strong case. Early on, Collins presciently identifies why, knowing what we do of American history, a politics of stoking hatred makes bad policy: “In the United States, we’re more likely to get Donald Trump regressive populism than Bernie Sanders’s progressive populism.” Remember, he wrote this before the election, when Trump seemed a longshot protest candidate.

I admit having serious reservations regarding this position. “Instead of a class war of shame,” Collins writes, “I advocate an appeal to common humanity and empathy.” Which is fine, with wealthy people capable of either shame or empathy, neither of which I’m persuaded President Trump possesses. But billionaires like Mark Cuban and Nick Hanauer strengthen my confidence. And to their credit, the Koch Brothers sat out the 2016 general election, unwilling to sully their money.

Zuccotti Park, 2011

Even if some billionaires do wrong, Collins makes the case that “if people feel attacked, they respond from fear. If they are shamed, they respond from anger. If ridiculed, they withdraw. But if they are respectfully engaged, people show up.” Or enough do to change the game. Once upon a time, George Soros, “The Man Who Broke the Bank of England,” was the Left’s only wealthy representative. Now a small but growing number recognize that widespread poverty hurts their bottom lines, too.

America’s post-WWII wealth didn’t just happen, Collins reminds us. Public subsidies like the GI Bill, the FHA, and the Small Business Administration put education, homeownership, and entrepreneurship in ordinary Americans’ reach. Subsidies to science, arts, education, and public works weren’t just generous donations; leaders saw such spending as investments toward winning the Cold War. Yet we’ve grown averse to the very public goods which made American exceptionalism, and a durable middle class, possible.

Collins doesn’t just limit himself to the plight of a generalized, and therefore whitewashed, poor. He dedicates entire chapters to the unique economic plights of women, Black and Hispanic Americans, and other historically marginal groups. “As in real life, there are well-publicized stories of exceptional runners starting far back in the pack and breaking to the front of the field…. But the overall picture is one of steadily growing class-based inequality.”

Much as I like this book, I don’t recall Collins addressing one major element: Americans’ tendency, as Gar Alperovitz writes, to kick down the ladder we’ve just ascended, before anyone can climb after us. We hate the very public-private partnership mechanisms that made our own success possible, and actively attempt to prevent anyone following suit. The billionaires Collins describes inordinately benefited from such arrangements, which some now inveigh against. This needs to be addressed to build any lasting remedy for today’s social breakdown.

In brief, Collins provides both tools and justifications to combat the collapse of America’s belief in the common good. Using his tools, and his storytelling acumen, could reverse our two-generation trend toward increasing inequality. But he overlooks some aspects of the cultural milieu—a milieu that has soured radically in under six months since this book launched. The protest movement suggests hope exists. If we can channel that movement toward Collins’ approach, and away from anger, that hope may yet be realized.

Monday, February 6, 2017

The Mind Unlocked, In Two Evenings Or Less

Lloyd I. Sederer, M.D., Improving Mental Health: Four Secrets in Plain Sight

Medically grounded mental health treatment has a history of being very fashion-driven. The lengthy inpatient committals at spa hospitals, made famous during the 1980s, were curtailed in favor of heavy medications when HMOs demanded quick, low-cost fixes in the 1990s. Dr. Lloyd Sederer, who has contributed to mental health treatment as both a researcher and a clinician, attempts to eschew trends and focus on what actually works. The product is readable and frequently eye-opening.

According to his introduction, Dr. Sederer writes for two distinct audiences: psychiatric clinicians dealing with patients suffering significant disorders, as well as students; and the families and friends of such patients, who must monitor their loved ones and provide constant palliative care. As such, Sederer’s prose is frequently dense with scientific concepts, but he never introduces terminology without providing definitions. His mix of official medical language with case histories makes this a very humane exposition.

As the title unambiguously declares, Dr. Sederer distills mental health treatment into four broad “secrets,” or functional approaches. The first is, Behavior Serves a Purpose. All human behavior, even counterproductive, harmful, and seemingly “insane” behavior, means something. Substantial treatment begins, not when we get patients to stop hitting themselves, but when we identify what actual meaning their actions serve. This isn’t always easy, much less straightforward. But it’s more productive than just condemning actions we don’t understand.

Second, Sederer emphasizes The Power of Attachment. Humans are linked creatures, and loneliness can transform our mental functions, especially at early ages. People will remain in dangerous relationships rather than confront loneliness (which Sederer clearly distinguishes from solitude). And our need for relationship influences our ability to heal from illness. Sederer describes the “therapeutic alliance,” the relationship by which therapy actually makes any progress. It isn’t just that therapists help, but how they help, too.

Throughout this book, but especially here, Sederer overlaps significantly with the reading on addiction theory I pursued a few years ago. He talks about Bruce Alexander’s Rat Park experiments: laboratory rats in environments designed to resemble their natural habitats wax prosperous, avoid harmful behaviors, and live long, happy lives. Rats raised in cages will gorge themselves on drugs until they overdose and die. Here and elsewhere, Sederer demonstrates that all psychology is linked.

Lloyd I. Sederer, M.D.
Third, Sederer writes, As a Rule, Less Is More. Remember the spa hospitals and heavy medications I mentioned earlier? Though tilted toward opposite extremes, both options represent a do-too-much attitude of massive interventions designed to overwhelm whatever preëxisting conditions produce undesirable behaviors. Rather, Sederer writes, the therapeutic goal should be to reëstablish optimum natural balances, and often, the least intrusive approach works best. Care providers, including families, should avoid the temptation to overtreat routine conditions.

Finally, Sederer hits the one I find most familiar: Chronic Stress Is the Enemy. This takes different forms in different patients, at different stages of life. Children exposed to chronic abuse or neglect develop defense systems that, as adults, turn maladaptive. Adults subjected to these same conditions develop inflammatory diseases that shatter our defenses and literally shorten our lives. These can manifest in myriad ways. What matters isn’t the particulars, but that stress undermines our bodies and brains.

In describing these operant conditions, Sederer also gives constant indications how to counter them. Some responses are within the patients’ control, while others require physicians, families, and other caregivers to take first initiative. In a few cases, Sederer makes recommendations of medications known to have beneficial effects, but in keeping with his Less Is More philosophy, he dispenses these suggestions only sparingly. It isn’t what goes into our bodies, but how we treat them, that transforms us.

Readers familiar with developments in recent psychology, even as filtered for a generalist audience, will recognize much here they’ve read before. The effects of isolation and company on our short-term mental health, and the ways epigenetic influences reshape our brains over the long haul, appear in other writers’ work. Johann Hari, Stephen Ilardi, and Gabor Maté cast long shadows over Dr. Sederer’s writing. For well-read audiences, Sederer brings these disparate influences together under one tent.

Excluding Sederer’s works cited lists and liberal illustrations, this book runs barely eighty pages, basically a long article. Ambitious readers undeterred by technical prose will savvy this book in one or two evenings. Yet it never feels underwritten or like it’s forgotten anything. It just stays concise, clearly focused on its topic. If anybody you love is undergoing mental health treatment, consider reading this book. It may open your eyes.

Friday, February 3, 2017

The Good Place: Secular Salvation on the Boob Toob

Kristen Bell and Ted Danson in The Good Place
The first episode of Michael Schur’s sitcom The Good Place commences with Eleanor Shellstrop (Kristen Bell) waking up in a waiting room. There is soft Muzak playing, the walls are very beige, and the surroundings are altogether nondescript. Before the first commercial, Michael (Ted Danson), who calls himself “the architect,” informs Eleanor that she has died and is in… well, not Heaven. As the title says, it’s just The Good Place. Eleanor blandly responds: “Cool.”

This sets the entire season’s tone. Michael introduces Eleanor to what he calls “the neighborhood,” a very Southern California development of stucco walls, landscaped parklands, and interminable frozen yogurt shops. Though the neighborhood population is aggressively international and racially integrated, this afterlife reflects an exceedingly Caucasian vision of safety and inoffensiveness. It’s the kind of place where people who can afford the buy-in feel virtuous because they don’t have to make choices or do anything.

So okay, the story unfolds from there, but pause, because the background matters. Michael informs Eleanor that she earned salvation by saving innocent Death Row convicts and undertaking humanitarian missions to war-ravaged lands, and her reward, apparently, is a brightly painted dingbat cottage in Orange County. No presence of God, no reunion with Brahman, no answers to eternal questions; in this afterlife, those who bettered the world live like those who fled it post-WWII.

There’s a small step between suggesting we’re saved into a life of eternal blandness, and saying we’re saved for living a life of eternal blandness. The widespread adoption of religious language and religious trappings in suburban America correlates strongly with increasing economic segregation. Having grown up in a succession of blandly Caucasian suburbs, including some in Southern California, I’ve noticed that these communities, heavily planned and often erected hastily, always reserve land for some churches.

Jameela Jamil and Manny Jacinto in The Good Place
The Good Place aired its first nine episodes during the 2016 general election campaign, when the political parties forced Americans to choose between Hillary Clinton and Donald Trump, two different manifestations of the white establishment: conventional politics versus conventional money. They represented the sort of white communities that dwell within the suburbs constructed for this show. Despite the much-reported urban/rural divide that fired the electorate, both were essentially suburban candidates offering a secular salvation.

Importantly, both represented an appeal to the past: Clinton to the squeaky-clean 1990s, when her husband was President, and Trump to whenever America was great before. They spoke words about the present, Clinton about inequality and Trump about… um, burning cities or whatever (his present is mired in 1977). But both offered Americans an all-expenses-paid trip to the beatified past that probably happened, in someone’s imagination, just before their respective voting bases needed full-time jobs.

Back in 1948, French philosopher and French Resistance veteran Jacques Ellul wrote that many people seek to impose order through drafting moral laws, not because they want everybody else to be good, but because they want to make themselves good. If laws compel moral goodness, we don’t have to make choices. We see this, arguably, with religious conservatives who demand sweeping morality laws requiring people to be good. These laws spare anyone from making choices.

The Good Place, however, deals entirely with characters who need to make choices: the sinner who needs to atone posthumously, the celebutante whose motivations don’t match her actions, the moral philosopher whose entire mortal life was characterized by indecision. They’ve always deferred choices, in their own ways. In the season’s final scenes, they’re cornered into making some choice, any choice, and still cannot. Which is the problem with any secular salvation: we’re stuck with ourselves.

Kristen Bell and William Jackson Harper in The Good Place
Spoiler alert. Watching the first episode of The Good Place, I watched Eleanor, the doomed sinner who achieves Heaven by bureaucratic error, and thought: she’s in Hell. Much like when I’m dragged to a party and want to leave, but somebody else drove, Eleanor is in a place she doesn’t belong, and can’t escape. In the final episode, we discover that’s literally true. This vision of secular salvations is, secretly, Eleanor’s version of eternal damnation.

And ours too. As we discover the central characters’ histories through flashback, we realize these characters, in various ways, tried to save themselves: through reason, through action, through force, or through numbing life away. From the first episode, this series avoids specific religious statements, but in the resolution, we have one clear moral: we cannot save ourselves. These characters tried, failed, and kept trying. Our votes, moral outrages, and inoffensive lifestyles will ultimately fail likewise.

Wednesday, February 1, 2017

The Church of the Eternal Conspiracy Theory

Minutes before the most famous 20th Century conspiracy theory erupted

A sudden idea gained massive popularity on social media this week. It began when a previously unknown physicist, engineer, and professional blogger named Yonatan Zunger published a lengthy op-ed entitled Trial Balloon for a Coup? Zunger speculates that President Trump’s massive (but not unprecedented) reshuffling of authority within the Executive Branch represents a theft of power, preparatory to his declaring himself the final authority on everything in America.

That’s a pretty nonstandard definition of a “coup,” but whatever. It’s plausible, given Trump’s demonstrated unfamiliarity with the separation of powers and his authoritarian stamp. On first reading, I shared the link, on a “forewarned is forearmed” basis. On second reading, I realized it was pure speculative gibberish, free-wheeling enough to make Alex Jones look sober and introverted. Of course, that didn’t stop Michael Moore from spouting the same moon juice, or Jake Fuentes from paraphrasing it in less extreme language.

For the last eight years, progressives have mocked the political Right’s love of conspiracy theories. From old classics like “Obama is coming to take our guns,” to surprise hits like Jade Helm 15, conservatives have mustered all manner of bizarre theories to explain why society’s immense complexity is actually simple and comprehensible. From mainly harmless Kennedy Assassination weirdos to 9/11 Truthers, some actually harmful, theories have abounded.

But suddenly, similar conspiracies are emerging on the Left. The complaints about suspended civil liberties, unconstitutional overreach, and incipient dictatorship are remarkably similar, sometimes almost identical, only the proper nouns changed. Apparently people, regardless of political alignment, want to believe our entire society, with its population of hundreds of millions, behaves in manners essentially similar to small crowds—and in ways we mere mortals can comprehend.

Primitive man lived surrounded by forces incomprehensible, and then as now, sought explanations. Lightning, animal attacks, and illness happened with stunning regularity, and people could not accept that it was random. But they realized that they were finite and small against the vast forces of nature, so they concluded, not unreasonably, that a force similar to their own minds dominated all reality. Something much like themselves, but vast enough to encompass all creation.

From this explanation, we get the earliest forms of faith. The consistent cross-cultural similarity of pre-literate spiritualities, which attribute human-like reason to rivers and windstorms, suggests an innate human desire to project ourselves outward onto natural phenomena. Following that came the attempt to placate natural phenomena through language and offerings, the first simple religions. Supernaturalism represents the human effort to comprehend, and thus control, nature.

Comet Ping Pong: site of the most famous conspiracy theory of the last year

We see something similar happening with conspiracy theories. This may include everything from Shakespeare authorship to Pizzagate: if I can comprehend something so vast as how somebody created Shakespeare’s works, or who’s doing what at Comet Ping Pong, I can control it. Playwright David Mamet suggests that discovering who wrote Shakespeare makes one (putatively) smarter than Shakespeare, but that maybe goes too far. Most conspiracy theorists want only to control.

The repetition of important keywords furthers this interest. Whether it’s the joyous adjectives attributed to the gods in the Homeric Hymns, or words and phrases like “coup” and “consolidation of power” found in the anti-Trump theories, they serve the same liturgical purpose. The repeated language adjusts all worshipers’ mindsets so everybody thinks according to the same patterns, the patterns of the worshipful mass and, hopefully, those of the gods.

Conspiracy theories, then, are a form of totemic religion. They exist to coordinate believers onto shared and enforceable values and, better still, they create a measurable definition of apostates and heretics. They delineate “us” and “them,” “true” and “false.” Thus anybody essentially ideologically aligned, but unwilling to get far enough ahead of facts to say “coup,” unambiguously positions themselves outside the fold, and can be shunned.

Strategic thinking certainly requires moving ahead of facts. Claims of purely naturalistic science notwithstanding, facts seldom explain anything. Only when positioned inside a framework do facts make sense. But the temptation, when explaining political movements or lightning bolts, is to attribute “facts” beyond measurement, to further an increasingly complex framework that, eventually, risks becoming counterfactual. This applies equally to river gods and “Ted Cruz’s dad shot Kennedy” bullshirt.

In politics or religion, a cooler head must prevail. Eventually Confucius, the Buddha, and Jesus arrived and warned people to control themselves before controlling the weather. Hopefully something similar will happen in politics. Don’t look to me for solutions, sadly; I lack insight. I can only warn True Believers that we mustn’t let beliefs get ahead of facts.