Monday, April 25, 2016

God, Man, and Battlestar Galactica

The original series unashamedly pinched imagery, and technical design, from Star Wars
Audiences old and geeky enough to remember the original Battlestar Galactica from 1978, or its frequent syndicated re-airings throughout the 1980s, must've shared my astonishment at one element of the 2003 remake. Though the core premise survived the 25-year hibernation, I admit being shocked when the title warship leapt beyond the confines of known space to discover a universe filled with… nothing. This marked reversal signifies profoundly altered expectations about humanity's place in the universe.

When the Galactica and its dirt-streaked flotilla ventured into space in 1978, it encountered worlds brimming with life. Nearly every storyline (there were seventeen stories across twenty-four episodes) featured an encounter with some new alien species. These included a Vegas planet, a medieval planet, and a Cold War planet with a population driving pickup trucks. The revived series, by contrast, features encounters with no life more advanced than trees. This re-envisioned universe is functionally unorganized.

Dr. Robert L. Strain, writing in Perlich and Whitt, attributes this reversal to an abandonment of "the American frontier myth." There's something to this: American science fiction has always had an element of Manifest Destiny embedded within it, an enduring desire to strike up trade where civilization exists and, where it doesn't, to establish colonies. Though writers like Asimov and Le Guin have vocally rejected American exceptionalism, they've tacitly redistributed that triumphalism onto humans in general.

But that doesn't really explain BSG's transformation. Both in its original incarnation and the reboot, Galactica's fleet never establishes new colonies; it merely hopes to rediscover that one distant outpost forgotten by mapmakers. Even in the original, the Galactica never particularly wanted to conquer indigenous peoples or resettle their homelands. Indeed, they minimized contact with alien races wherever possible. Dr. Strain's frontier metaphor only stretches so far. Both shows actually represent a different operant myth.

The reversal, rather, represents changing expectations about reality itself. Original showrunner Glen A. Larson conceived a universe enjoying sufficient widespread organization to create, not only life, but intelligence. Self-observant wisdom, in Larson's vision, is this universe's norm. The revived universe is chaotic and random, and human intelligence is the exception. Structure is something humans create, not something we take for granted. Larson's universe is planned; the revived universe is flukish. Only the humans themselves remain constant.

The revived series had an ultimately semi-hopeful conclusion,
but only after showing us a sterile, industrial-shaded universe first
It's tempting to attribute the difference to God, as the early revived episodes invite us to do. Larson, a devout Mormon, considered a benign God and a life-filled universe, both precepts of LDS doctrine, so self-evident that he could assume his audience shared those suppositions. At various points, space gurus lecture characters on transcendent topics blending Mormon dogma with post-Beatles "Eastern" mysticism. Larson produced his show to entertain a spiritually hungry America, which took God's pervasive existence for granted.

The 2003 relaunch couldn't assume such. Though the pilot miniseries mentions priests and religious orders, and early episodes feature religious visions, they're ambiguous at best. Is President Roslin actually having visions, or merely drug hallucinations? Is Baltar wrestling an angel like Jacob, or does he have a microchip implanted inside his skull? But considering how the relaunched series resolved these questions theistically, the debate feels contrived in retrospect. The reboot ultimately accepts God too.

The reversal, then, isn't whether God exists; it's whether God can be taken for granted. A theistic universe, in Larson's telling, makes a decent launching point for science fiction adventure, and a spiritual thread ties everything together. Twenty-five years later, God's existence, and a purpose-driven universe, become the show's ultimate conclusion. God, gods, and godlessness feud throughout the series, culminating in more questions than answers. The conclusion seems finally theistic, but actually resolves little.

Note that, at the resolution, Adama keeps narrating events to Roslin's grave. Like Richard Feynman, Adama rejects theism, but cannot accept that his beloved no longer exists. He represents contemporary America, where "No Religion" has become the second-largest religious identification. Like the "Spiritual but Not Religious" crowd, Adama cannot believe beyond his senses, but cannot abandon structure either. Stranded between belief and nihilism, he finally embraces neither, an ever-shifting chameleon, much like the audience itself.

We might observe, cynically, that neither BSG universe is truly atheistic. Writers impose "theological" decisions upon events, a reality that bothers postmodern critics like Roland Barthes. But even that speaks to our issue: the universe we observe is never random, but filtered through our perception. Theistic or non-theistic explanations never describe what really happens, but rather our ability to see it; therefore the debate cannot be resolved. Our place in the debate matters more than its resolution.

Friday, April 22, 2016

How Prince's Music Changed America

As with Michael Jackson before him, the public outpouring of emotion this week at musician Prince's passing has provoked a disjointed reaction. People too young to have experienced him the first time are posting Facebook and Twitter images of his peak in the late 1980s, when his innovative fusion sound dominated multiple charts. But they aren't celebrating him as he was when he died. They're celebrating the Platonic Ideal of Prince.

Hardly a wonder, either: though he briefly cracked the Top Twenty with 1997's "The Holy River," he hasn't produced an out-and-out chart hit in a quarter century. Unlike David Bowie, who persevered through a decade-long illness to produce two of his best albums, Prince largely avoided the limelight, even forcibly removing videos from YouTube. Recent footage reveals he remained a consummate musician, but his best work was long behind him.

Yet people respect his accomplishments, because they are undeniably accomplishments. His CV reads like inevitable hero-worship, even from lukewarm audiences: an unusually wide vocal range, accomplished multi-instrumentalism, and comfort in both the technocratic studio environment and on the human-dominated stage. Like Bowie, he had no difficulty mixing his flamboyant, sexually omnivorous public persona with his lifelong spiritual quest. As a musician and celebrity, he had something to offer everyone.

But for me, his songwriting looms largest, primarily because of its disdain for limits. In a music business heavily segmented by category, Prince's sound combined multimodal guitar playing, Dylan-esque lyrical complexity, a prominent dance backbeat, and the best mix of rock and funk ensemble playing. Like Sly Stone's, his band was integrated. And like Hendrix or Jackie Wilson, he drew large white and black audiences despite music's historically segregated demographics.

Like those prior artists, Prince recognized no restrictions in his embrace of musical influence. He played American, European, and international chord progressions with equal ease. If some technique from country music, show tunes, or advertising jingles piqued his fancy, he incorporated it. But he never felt like someone merely imitating somebody else; his skillful integration of influences made every artistic appropriation feel wholly, inevitably, like Prince.

This inclusiveness seems obvious now. We applaud politely whenever, say, Selena Gomez uses a tabla and sitar to supplant a traditional Western rhythm section on some overproduced single, calling her “eclectic” and “international.” A duet between country ensemble Florida Georgia Line and hip-hop eminence Nelly produces a crossover hit that unites audiences who often cannot stand one another. Cross-genre fertilization happens so often, we forget how revolutionary it once was.

Worse, we accept this efflorescence only in approved ways. Nelly and FGL managed to produce a chart hit by combining America’s two top-selling genres; neither ventured into truly uncharted territory. Top-40 radio has become so risk-averse that tracks cross over from the indie, dance, and “Heatseekers” charts over a year after release, because corporate program directors won’t gamble on anything that isn’t thoroughly test-marketed in advance.

And the business is deeply technocratic in ways invisible to consumers. As Charles Duhigg reports in his book The Power of Habit, the music industry is dominated by focus groups, analytic software, and substitutes for individual initiative. Even Pandora, the most common way people who dislike radio discover new hits, doesn't really introduce new artistry: Pandora's algorithm is so strict that fans discover new artists but not new sounds.
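To make the "new artists, not new sounds" point concrete, here is a toy sketch of how any trait-similarity recommender behaves. This is a hypothetical illustration, not Pandora's actual Music Genome system; the song names and trait scores below are invented. Songs get scored on a few traits, and whatever sits closest to the seed track gets recommended, so listeners keep hearing near-clones of what they already like.

import math

# Hypothetical songs described by invented trait scores (tempo, distortion, funk),
# loosely imitating how a trait-based recommender might represent them.
songs = {
    "seed_hit":       [0.80, 0.30, 0.70],
    "similar_artist": [0.78, 0.32, 0.68],  # a new artist with a nearly identical sound
    "new_sound":      [0.20, 0.90, 0.10],  # a genuinely different sound
}

def cosine(a, b):
    # Cosine similarity: 1.0 means the two trait profiles point the same way.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Rank everything against the seed track: the near-clone wins by a mile.
seed = songs["seed_hit"]
for name, traits in songs.items():
    if name != "seed_hit":
        print(name, round(cosine(seed, traits), 3))

Run as written, the near-clone scores almost 1.0 while the genuinely different track lags far behind, which is exactly the narrowness the paragraph above describes.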

Thus it’s easy to overlook the influence Prince had on the music-listening audience. We could forget the impact he had on a suburban white boy whose parents kept him proactively sheltered from the modern world. Imagine that boy, riding a schoolbus with the radio tuned every day to the Top-40 station, the most studiously inoffensive music available to listeners. Why, to that boy, everyday radio must’ve sounded positively abject.

Now imagine that boy, in 1987, a year dominated by slick, bland, over-loud artists like Starship or Whitesnake. A year when black and white students sat on opposite sides of the schoolbus, not because laws required it, but because they just did. A year when raucous beats, unsupported by lyrics or melody, encouraged boorish behavior among largely unsupervised schoolkids. Suddenly, through all that crap, he hears “Sign ’o’ the Times.”

Prince gave me hope that popular music didn’t have to be a tool of exclusion. He let me believe that one song could reach across racial, economic, and social lines, to make everybody sing along. Because Prince wasn’t ruled by categories, I realized, I didn’t have to be, either. That sounds easy to say now. But Prince proved we could live it, too.

Monday, April 18, 2016

Death of the British Invasion

1001 Albums To Hear Before Your iPod Battery Dies, Part 3
The Zombies, Odessey and Oracle

Official history records that the Zombies released two studio albums in the 1960s, but that's deceptive. Their first LP, Begin Here (released in America as The Zombies), was a collection of singles, B-sides, demo recordings, and other sundries, compiled largely without the band's aid. Slovenly management and no label support hobbled their art. By 1968, with nothing to their name beyond 45s and that patchwork LP, the Zombies were on the edge of dissolution.

The Zombies' principal songwriters, keyboardist Rod Argent and bassist Chris White, composed their band's first dedicated LP, Odessey and Oracle, secure in the knowledge that the band would shatter before the album's release. The title's spelling supposedly reflects a cover artist who didn't check the paperwork before painting. The album debuted to indifferent reviews. Then the third and final single, "Time Of the Season," became a world-dominating phenomenon.

Recorded at legendary Abbey Road Studios, just after The Beatles recorded Sgt. Pepper, this album definitely shows the Fab Four's influence. But from the opening strains of "Care Of Cell 44," a letter from a man awaiting his lover's anticipated release from prison, it's clear the Zombies aren't merely beholden to the Beatles. They're creating something unique to themselves, possibly years ahead of the late British Invasion around them.

Without doubt, the Zombies were better technical musicians than the Beatles. A graduate of a cathedral music school, Argent in particular was capable of syncopated rhythms, difficult chord progressions, and offbeat time signatures that eluded Lennon and McCartney. Their baroque arrangements, now classic rock radio staples, went substantially unappreciated in their time. But they used them to maximum effect on this album.

But they weren’t just musically advanced. This album’s lyrical content took pre-Woodstock audiences completely by surprise. “Cell 44,” the album’s first single, mixes remarkably dark themes with hippie-era musical bounce, playing the irony for all it’s worth. The second single, “Butcher’s Tale (Western Front 1914),” combines images from war poets like Siegfried Sassoon and Wilfred Owen into a bleak portrait of war’s innate horrors, predating most American Vietnam-era protest songs.

The original lineup of the Zombies
Both singles went nowhere, unfortunately. Album cuts like “I Want Her She Wants Me” and “Maybe After He’s Gone,” written in darkly funereal tones, didn’t attract hippie-era attention. And while occasional critics appreciated “A Rose For Emily,” based on the Faulkner story of the same title, and “Beechwood Park,” a masterwork of nostalgia, audiences in 1968 were unprepared for anything so dark. The stark, visionary musicianship went largely unheralded.

Only when “Time Of the Season” upended everything did the Zombies begin receiving the recognition they deserved. By then, however, it was too late. Broke and riven with managerial problems, the band splintered; most members left music, at least temporarily. Though unscrupulous promoters floated several fake Zombies lineups, the original band didn’t perform these songs live until 1990—and the survivors didn’t tour America with this music until 2015.

The dark themes, allusive lyrics, and musical sophistication probably didn’t suit their time. Despite Argent’s inarguably Sixties organ work, this album frequently has a sound more akin to 1973. The 1960s were substantially divided between the Beatles’ optimism, leading to an acrimonious breakup, and the Stones’ sullen teenage posturing, which has remained lucrative for over fifty years. The Sixties weren’t prepared for the Zombies’ subtle musicianship, or their lyrical ambiguity.

No song on this album runs very long. The longest, "Cell 44," approaches four minutes, and "Time Of the Season" hits three-and-a-half, but by the album's standards, these feel almost marathon-length. Few tracks exceed three minutes. Even with surprising time signature changes on tracks like "Brief Candles," or the subtextual wrath of "Friends of Mine," the sound sometimes feels circumscribed by the limits of a vinyl 45, probably imposed by the label.

Perhaps that says something. Despite hip contemporary complaints about industrial interference in artistic integrity, recording has always been a struggle between music as art, and music as business. The Zombies shattered, not, as some have claimed, because of artistic differences, but because management was indifferent to their vision. The Zombies had profound musical potential, but didn't follow the pulse of their time very well.

Since its release, Odessey and Oracle has achieved cult status, mainly by word of mouth. Unlike most LPs of its generation, it remains available in its entirety, without being chopped into “Greatest Hits” confetti. Anyone listening, expecting happy hippie-era escapism, will find it as jarring as it must have seemed in 1968. For an oldie, this album maintains its harsh edge. Nearly fifty years later, it still bites.

Friday, April 15, 2016

Is Led Zeppelin Threatening Creativity?

Randy California (top left) and Spirit
The concept of ownership in America has officially become untenable. The case of Michael Skidmore v. Led Zeppelin, alleging the famous British blues-rock band plagiarized the iconic opening measures of “Stairway to Heaven,” demonstrates how narrowly restrictive intellectual property law has become. It transforms one of rock music’s most famous compositions into a ripoff of a frankly boring and forgettable deep-album cut. But it does much worse.

Skidmore, representing the estate of prog-rock semi-luminary Randy California, contends that Jimmy Page, who almost certainly knew California's work—Zeppelin opened for California's band Spirit on a 1968 US tour—recycled Spirit's instrumental "Taurus" into the opening bars of "Stairway." Skidmore calls "Taurus" an "ethereal yet classical guitar composition." Most music lovers who've heard the actual track would probably call it "boring."

The case, filed nearly two years ago but approaching trial now, has merit. Skidmore makes a persuasive case that Page knew Spirit's music. Anyone hearing "Taurus" without being warned might believe, for eight or ten seconds, that they were hearing "Stairway to Heaven." Skidmore's thirty-five-page complaint certainly establishes a prima facie case. The question becomes, is that very brief similarity legally actionable? And if so, does it have wider implications?

The similarities, though brief, are undeniable. Music critic Alex Ross, who understands these things better than I do, argues that the resemblance is superficial. But to the untrained ear, it's audible. The difference is that Spirit circles that one motif for two-and-a-half minutes, and simply peters out. Zeppelin, by contrast, uses that brief motif as foundation, launches from there, and never stops building. Zeppelin clearly composed the better song.

Jimmy Page (with guitar) and Robert Plant of Led Zeppelin
Intellectual property in music has become a hot commodity lately. Classic twang-rocker Tom Petty now partly owns Sam Smith's snoozy Top-40 hit "Stay With Me," because it supposedly resembles Petty's "Won't Back Down." Millions of dollars ride on whether Robin Thicke's track "Blurred Lines" ripped off Marvin Gaye, or merely paid him homage. In both cases, similarities to the original tracks are less distinct than in the Zeppelin case.

Both cases seem really weird to interested amateurs. "Stay With Me" has a similar backbeat to "Won't Back Down," but the two songs resemble one another less than Red Hot Chili Peppers' "Dani California" resembles Petty's "Mary Jane's Last Dance"—a comparison Petty has reputedly laughed off. The Peppers even unabashedly pinched Hendrix's "Purple Haze" chord progression for "Dani California's" final instrumental breakdown. But the Peppers aren't in court.

And a similar backbeat isn't enough to sustain a plagiarism case. That isn't me saying it; John Fogerty, in 1988, took his guitar into the witness box to prove similar backbeats don't mean he plagiarized his composition "Run Through the Jungle," which he doesn't own, in writing "The Old Man Down the Road," which he does. The courts sided with Fogerty. The Fogerty precedent should apply to Sam Smith, and to musicians generally.

Basically, under current US copyright law and the Berne Convention, the similarity between “Taurus” and “Stairway” is too brief for sustainable legal action. But even worse, this case, even more than the “Stay With Me” case, has chilling implications for creativity. If even small, transitory similarities constitute theft, if pinching three bars from another composer equals plagiarism, most artists, not just musicians, are culpable for at least occasional theft.

John Fogerty, who was accused
of plagiarizing himself
Bob Dylan's classic album Highway 61 Revisited includes long passages deliberately reminiscent of Allen Ginsberg. Dylan couldn't claim ignorance; he'd appeared onscreen with the poet in his "Subterranean Homesick Blues" video. Andy Warhol regularly reproduced, not just mimicked, classic artists and graphic designers, some still living then. My first published poem, "I Do Not Love You," was written in imitation of June Jordan—a debt never advertised as such, but present nonetheless.

Clearly plagiarism does exist. Michael Bolton lightly tinkered with the Isley Brothers track "Love Is a Wonderful Thing," slapped his own name on it, and made millions before getting caught. But even that case raises issues: the Bolton suit began within one year of the single's release. Zeppelin is defending itself forty-five years after "Stairway's" release, and nearly twenty years after Randy California's death. The long delay makes the claim seem specious at best.

If this case succeeds, it will have chilling effects on artistic creativity. If one composer can claim absolute ownership over three measures of music, ideas cannot flow. Artists depend on exchange to create new works; Randy California's composition borrows liberally from Palestrina and Bach. Copyright law was written to protect intact or significantly-intact works, not fragments. Cases like this could kill art.


See Also: Copy This Copyright Rant

Wednesday, April 13, 2016

Fear of a Bionic Planet

Boston Dynamics' first high-profile walking robot shuffled like a newborn colt

Whenever Boston Dynamics (a former Google subsidiary) releases videos of its newest robotic accomplishments, most internet fans feel compelled to comment on how humans treat the robots. Every video demonstrates the robots’ ability to compensate for adverse conditions by having employees trip, kick, or bludgeon the robots with instruments. The robots always obligingly stay upright… and cue the predictable “robot uprising” jokes.

Not my friend Jay. Among America's last die-hard pinkos, Jay sees these robots as the apotheosis of his longed-for workers' utopia. "Everyone will be unemployed in 20 years," he boasts. "Even professions such as surgeon and medical doctors are going to be automated." Where this ex-academic considers this a dangerous precedent, Jay applauds the possibilities: "With this technology we can build a luxurious socialist economy where no one has to work."

Though I don't disagree with Jay's reasoning, I question whether his conclusion is desirable, or even possible. I base my doubts on two basic questions: How have such prophecies unfolded in the past? And, would most people consider this development desirable? I have simple answers for both: "poorly," and "no." Given recent advances in technology, and general resentment of work, these opinions may seem counterintuitive. So let me defend my positions.

First, the prophecies. Clear back in 1997, Utne Reader featured a series of articles, dominating an entire issue, on how society’s ever-accelerating pace was detrimentally affecting Americans broadly. In one sidebar (apparently not archived online), the editors included several now-obsolete predictions about how speeding technology would transform the future. With both computers in the 1960s and automation in the 1930s, futurists proclaimed the imminent demise of work.

By their second high-profile model, Boston Dynamics made their robots trot like harness ponies

They didn't mean just a little bit, either. These futurists predicted that humans broadly would struggle to find anything with which to occupy their newly abundant time. Around the same time, without apparent irony, David Brin wrote something similar about near-future prospects in Popular Science. These predictions make having time to burn feel, as Jay says, "luxurious." This overlooks that we have names for unemployed people trying to burn off excess time: addicts.

So the predicted age of leisure keeps failing to arrive; big deal. We need only succeed once and we're all golden, right? Maybe, but I'd dispute that. While having abundant free time sounds wonderful, anybody who's taken a long weekend to run naked through sun-dappled fields of daffodils knows how quickly the bliss peters out. Before long, you find yourself melancholy, studying anthills, wondering whether it's time to return to work yet.

Catholic economist John Médaille notes that most people, confronted with self-directed time, spend it doing stuff that, if they received pay for it, we’d call “work.” Woodworking, knitting, community theatre, writing a blog… some people consider these paying careers. Others do them for fun. The only leisure-time activities that don’t resemble work are those which involve dissipating your time away, like watching television or getting drunk.

Socialist writer Barbara Garson admits she began her study of hierarchical employment, All the Livelong Day, assuming, as socialists do, that capitalism forces work upon unwilling workers. (Writing, in her mind, wasn't really "work.") However, she discovered that people want work, they seek it, and when profit-first employers strip work of meaning, workers try to infuse meaning back in. Work, she realized, isn't a burden, but a shared human desire.

Boston Dynamics' latest piece of self-promotion walks bipedally, handles rough terrain,
and picks itself up when this bearded turkey shoves it with a log

Many sociologists define humans as distinct from other animals by three traits: forming relationships, cooking food, and working. As Jesus said, the birds of the air and the lilies of the field neither labor nor toil. Christian scholars often take work as an emblem of Original Sin, following the Genesis account; but other faiths share the belief. Consider the Buddhist maxim: "Before enlightenment, chop wood, carry water. After enlightenment, chop wood, carry water."

Even scientists and non-religious persons recognize the centrality of work. Psychologists speak of the existential malaise humans suffer when lacking meaningful work, and the documented psychological consequences of long-term unemployment strikingly resemble PTSD suffered by ex-soldiers and rape survivors. And Jay would redistribute this meaning-making activity onto robots. But why? Work isn’t something to offload onto others. Work gives our lives meaning.

Turning our responsibilities over to robots sounds empowering. Isaac Asimov, wounded by the irrationality of World War II, proposed the same over sixty years ago. But as advances in technology prove overwhelmingly banal, and advances in psychology demonstrate humans crave work, that attitude seems naïve now. Robots may redefine what work means for coming generations. But work should not, arguably must not, ever completely go away.

Monday, April 11, 2016

Trapped Inside an Altered Body

1001 Books To Read Before Your Kindle Battery Dies, Part 69
Elisabeth Kuhn, Average C-Cup: Poems

It's a shame nobody reads poetry anymore, because when readers dismiss poetry as something inscrutable out of the past, they lose the chance to introduce themselves to forward-thinking word crafters like Elisabeth Kuhn.

German-born and Berkeley-educated, Kuhn brings the same world-wise, travelling mentality to her verse that she brings to her life. She is also wise enough to recognize something that many academic poets these days have forgotten: that formal verse exists because people like to read it. Kuhn crafts verse in accessible forms like villanelles and sonnets, forms that a poetry audience will read for pleasure, and uses these forms to address difficult issues.

The issues Kuhn wants to address emerge from her own life. Foremost is her battle with breast cancer, culminating in a partial mastectomy which leaves her with two very different breasts—thus the title. In a world that values women according to their appearance, she struggles to decide where that puts her. Different poems show her in different places, but she remains generally optimistic, strong enough not to be broken by anybody looking at her body strangely.

Some poets are primarily storytellers, like T.S. Eliot, and some bare their sins in the Sylvia Plath style. Kuhn approaches poetry with the aplomb of a creative memoirist. The most important element in her poetry is herself, but she is not just flatly telling her story; she is telling us why her story should matter to us. And for the most part, she is telling us her story well.

I admit flinching when I began reading. The very first poem, "Palpitations," is a sestina, a form where key words repeat according to a geometric schedule across thirty-nine lines. It's a very difficult form, and Kuhn judders markedly here. Occasional pieces like this, which suggest they were written for an MFA poetry class, don't land well. When one such piece opened the collection I got a little queasy and thought I was in for a bad ride.
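For the curious, that "geometric schedule" is the sestina's traditional end-word rotation: each new stanza reuses the previous stanza's six end-words in the order 6-1-5-2-4-3, and after six stanzas the cycle closes, giving six stanzas of six lines plus a three-line envoi, hence thirty-nine lines. A minimal sketch of the rotation, using invented placeholder words rather than anything from Kuhn's "Palpitations":

# Traditional sestina rotation: stanza n+1 takes stanza n's end-words in the
# order 6, 1, 5, 2, 4, 3 (zero-indexed below). Applying it six times brings
# the words back to their starting order, which is why the form stops there.
def next_stanza(end_words):
    order = [5, 0, 4, 1, 3, 2]
    return [end_words[i] for i in order]

# Hypothetical end-words, purely for illustration.
words = ["night", "sea", "glass", "mother", "thread", "blood"]
for stanza in range(1, 7):
    print(stanza, words)
    words = next_stanza(words)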

But I'm glad I stuck with the book. There are real treasures in this book, insights into being human as well as insights into being Elisabeth Kuhn. Consider these lines, from "The Pleasure Is Mine," discussing breast reconstruction:
...I'd feel as if he fondled
molded jello, glued
to my chest.
Elisabeth Kuhn
This deceptively simple analogy contains within it such nuance about human relationships, sexuality, sensuality, individual identity, and the human body. The author feels partly adrift inside her altered body, but she refuses to be prettified just to satisfy an abstract individual she might never meet. These lines are very direct, yet freighted with all the depth and complexity that human language can bring to bear upon them. And these are just three lines in a much longer book.

Kuhn's struggles dwell in the present, but also unfold into the past. Her conservative German Catholic upbringing, one she addresses directly rather than hiply walking away from, presages her adult difficulties with relationships and identity. Church and family should be the first place young girls learn trust, yet that frequently isn't what she learns, as in "Sin":
My sister had been good.
She was allowed
to pick a woolen thread
for Jesus’ crib so he would be
warm and comfortable.
I had done something to anger Mother.
I had to pick a needle.
Jesus would have to suffer
for my sins.
The poem gets only darker from there, not only from its actions, but from Kuhn's innocent incomprehension. Her base poetic voice is that of the eternal, bewildered innocent. Even being born German carries weight. As a bilingual, bi-cultural woman, Kuhn brings outsider perspectives to nearly every situation, including this from "Original Guilt":
When we were teens, we’d sometimes
flash the Heil Hitler sign
as a joke. It always upset the adults,
like questions about sex, and that forbidden
stanza of our National Anthem
we’d sometimes roar at parties, drunk:
Deutschland, Deutschland, über alles!
Some poems ring hollow in this volume. Kuhn is a journeyman poet, and even great masters don't succeed every time. But on balance, this readable book offers rewards for experienced poetry readers and casual bookworms alike. I would go out of my way to recommend a book like this to other readers.

These poems let us glimpse a woman’s heart, and together they give us a banner insight into the complexity of one person's life experience. She promises much in the future, and on the evidence of this first book, we have much to look forward to.



To hear Garrison Keillor read Elisabeth Kuhn's poem "Bathrooms," click here.

Friday, April 8, 2016

How To Have a Career In Outrage

Seriously, this is a thing.
Early Thursday morning, while checking my regular websites, I had one of those weird moments made possible only by algorithm-driven electronic communications. Several Facebook friends, mostly graduate students or recent degree holders, shared various forms of outrage about Calvin Trillin's newest poem, "Have They Run Out Of Provinces Yet?" Immediately below one such diatribe, Facebook ramrodded a video clip from its tragicomic new Moto-Dojo ads. Apparently without ironic intent.

ICYMI (as the kids say), Trillin’s poem lists the various Chinese regional cuisines that have been aggressively marketed to Americans throughout his lifetime, each putatively hipper and more authentically Chinese than the last. When I first read this poem, before the shitstorm erupted, I interpreted it as a satire on how marketing gurus have commodified China, usually denuding it of history and accuracy in the process, for dumb-tongued Western audiences.

Moto-Dojo has no such justification, even hypothetically. It shows a white guy with a Fu Manchu mustache displaying supremely bad martial arts moves, in what appears to be his mother's basement. This segues in various ways into some display of Motorola's new mobile phone features. I'm unclear whether this chintzy, broadly imperialistic campaign comes from American-owned Motorola Solutions, or from Motorola Mobility, a subsidiary of China's Lenovo Corporation.

For whatever reason, the sort of university-based umbrage-mongers who get upset at “cultural appropriation” have selected Trillin as this week’s receptacle of moralistic outrage, while politely overlooking Motorola. Critics have forced Trillin onto the defensive, making him explain what I considered obvious, that it’s “written about food snobs” and not about Chinese people. Motorola, by contrast, has been characterized with terms like “extremely comical” and “all in good fun.”

Calvin Trillin (AP photo)
This makes me wonder: who decides which villainies merit our anger? Is Trillin’s poem worse than Motorola’s ads because Trillin is an artist, supposed to have pure motives, while we accept such vulgarity from venal for-profit corporations? Our selective blindness about truly offensive situations, while seeking persecution where little exists, says something about our national psyche. Critics disparage some populations for seeking mass-market victimhood. But that’s becoming our entire society.

The Moto-Dojo advertising campaign looks like plain old yellowface, the same form of cross-cultural usurpation that made Hollywood movie studios think they could trowel “Oriental” makeup on Peter Ustinov and let him play Charlie Chan. I've complained about "cultural appropriation" accusations before, claiming the term is overused to demonize ordinary situations where art movements achieve mainstream acceptance. But this looks like a situation where it clearly applies.

Perhaps the abruptness of the reaction tells us something important. Though Trillin's poem appeared in an issue of The New Yorker that shipped last week, and had been online for nearly a week before the outrage explosion, the reaction blew up late Wednesday evening and into Thursday morning. Nothing about the poem or its presentation changed on Wednesday. Just suddenly, for no visible reason, countless people became rashly angry.

Just as the explosion happened fast, it ended almost as quickly. After I started writing this essay, I paused to, y'know, go earn a living. By Thursday evening, the furor had petered out; by bedtime, it had vanished entirely. Though some stray tubthumpers might still be posting somewhere in the twitterverse, the momentum of outrage had passed back to usual election-year targets, like Donald Trump or the Clinton legacy.

This hasty reaction, and its equally swift evaporation, says something. The social media echo chamber, fueled by what British journalist Mick Hume calls “full-time professional offense-takers,” prompts many contributors, eager to appear right-thinking, to jump on every bandwagon of moral umbrage, however specious. Because a certain fraction of outspoken Chinese-American dignitaries saw trespass in Trillin’s poem, media figures and graduate students followed suit, possibly without anticipating the ramifications.

The New Yorker issue which
contained Trillin's offending poem
Please don't mistake my intent. Serious outrages do happen; sometimes, public figures deserve our scorn. The blowback last week over the Kenyon Review posting two cack-handed "Indian" poems online reached such heights that the magazine eventually pulled the poems. (They claim it happened because the poet violated editorial guidelines, but c'mon.) Sometimes people get angry because bigwigs do something stupid, and earn the consequences they endure.

But this isn't that case. High-minded crowds turned on Trillin like a schoolyard mob, over a vanishingly insignificant offense. And as such people do, they became so enwrapped in fake umbrage that they missed legitimately offensive behavior right in front of them. The effects of such viral outrage should frighten observant people. Because like all mobs, this one could turn on you.

Wednesday, April 6, 2016

The Philosophy of the Vague

Chris Luke, Power Habits: 101 Life Lessons & Success Habits of Great Leaders, Business Icons and Inspirational Achievers

I've noticed most bookstores have two sections side-by-side: "Self-Help" and "Psychology." The former is self-explanatory and highlights authors like Jack Canfield, Tony Robbins, and Sylvia Browne. The latter includes some scholarly psychologists, like Paul Ekman or Mihaly Csikszentmihalyi, but is mainly dominated by what one New Republic reviewer called "self-help for people who would be embarrassed to be seen reading it." This book falls, far and away, into that category.

Author Chris Luke, who doesn't burden readers with anything so pedestrian as a bio or credentials, proffers a book that says nothing particularly wrong, but also says nothing particularly well. Luke offers 101 mini-essays, none over two pages, most just one plus some dangling lines. Each shares some putatively relevant life lesson, like "Challenge Yourself" or "Follow Your Curiosity," illustrated by examples from some famous, successful, or influential personality. Some examples are fictional.

My problem isn't Luke's principles. Indeed, some of them I like so much, I wish Luke spent more time unpacking them. Though Luke never mentions the word, these principles accord with classic Stoicism, a philosophy I recently rediscovered and have striven to apply in my own life. A handful of precepts Luke treats so briefly that they're arguably vulnerable to abuse if audiences read carelessly, but evidence suggests Luke's heart is in the right place.

No, my problems lie beneath the surface, with what Luke omits. For instance, despite the title, Luke never describes what habits are, nor how to engineer them. Unlike, say, Gretchen Rubin, Luke presents habits as something we do, not ways of reprogramming core mental processes. Despite advances in neuroscience, habit formation remains deeply controversial, profoundly unsettled and unsettling. Anybody who's tried muscling past bad habits knows how intractable our brains really are.

Just as our author omits structure, this book also omits process. Telling audiences to, say, "Cultivate Creativity," doesn't show the nominally well-adjusted cubicle drone how to change thinking patterns learned through years of school and employment. Precepts like "Don't Wait For Permission" could sound offensive to poor or minority readers, structurally slapped down by economic and social forces that discourage, even punish, risk-taking or innovative behavior.

Marcus Aurelius, the great
Stoic philosopher
Let me reiterate: I don't mean Luke is wrong. Quite the contrary, his principles, if handled appropriately, are universal, portable, and empowering. But the applications are not. Telling people to "Focus" isn't good enough; entire religious traditions, like Buddhist meditation or Christian monastic prayer, are dedicated to improving individuals' ability to focus. Workers plagued by bills, responsibilities, and work-life balance often want to focus, but need guidance actually doing so.

One short illustration should suffice. Luke says that, at the peak of his touring career, comedian Jerry Seinfeld "used a unique calendar system to motivate and pressure himself to write, even when he didn't feel like it." Holy schlamoly, now that's meaty! This old ex-teacher wants to say: "Speak more to that, please." Because I know the feeling of being too tired and discouraged to practice the art I love.

Too late, though: Luke has already caromed onto another topic. To Luke, these precepts aren't something smart people of earnest intention struggle to achieve; they're something we should just do, and quit dithering. Apparently he did. Luke repeatedly mentions his success building an exercise regimen from zero. I believe that's possible; I've seen it. But most beginners need an experienced coach, workout partner, or Phys-Ed teacher to start well. People who just start running usually get charley horses.

I believe Luke's 101 principles come from good solid foundations, and not just based on my experience. They're confirmed in authors from the ancient (Marcus Aurelius) to the modern (Charles Duhigg). Writers like Kelly McGonigal confirm the science, while journalists like Malcolm Gladwell, whom Luke quotes, confirm the practice. Robust evidence testifies that Luke writes from a position of strength, backed by history's best minds. At no point in this book does Luke say anything wrong.

But neither does he say anything likely to translate into action. Because his vague, gnomic essays contain no how-to steps, I fear his words will result in many solemn nods, many "mm-hmms," and copious agreement. Then readers who concur with his opinions will do nothing. Because being right doesn't mean much, if you aren't also useful.

Sometimes, in writing reviews like this, I catch grief for not having any counter-proposal. But I'm no scientist or journalist; that's not my job. This review links several authors whose writing achieves what Luke promises. Take a look. The processes exist, for the diligent student.

Monday, April 4, 2016

Fuck Picasso

1001 Movies To Watch Before Your Netflix Subscription Dies, Part Seven
Ed Harris (actor/director), Pollock


In 1949, LIFE Magazine published an inside spread about a little-known New York painter named Jackson Pollock. Though he'd snagged occasional high-dollar commissions, nobody outside Manhattan's intelligentsia knew Pollock until LIFE made him globally famous. Suddenly, canvases covered in flowed acrylic, some larger than dining tables, fetched six and seven figures at auction. But this also commenced the artist's irretrievable slide into self-destruction.

Actor Ed Harris became interested in Jackson Pollock when his father purchased a biography, for no other reason than that he believed Harris physically resembled Pollock. A self-taught expert, Harris never considered anybody but himself to direct this, his directorial debut; he floated several actors for the lead, but ultimately decided nobody but himself could enact his vision. This could have descended into an insufferable vanity project. But it turns out so much better.

Harris commences Pollock's story in 1941. A 4-F Army reject with useless art credentials and no income, he's crashing in his brother Sande's plush apartment. He considers himself an unrecognized genius; his peers consider him a crashing drunk. Enter Lee Krasner (Marcia Gay Harden), a fellow budding painter seeking professional connections. She recognizes in Pollock the same genius he sees in himself, and pauses her own career to foster his.

This poses significant problems. By sacrificing herself (her career wouldn't resume until after his death), Krasner becomes both Pollock's inspiration, and his enabler. Because Pollock really is a raging alcoholic, numbing himself to a world so chaotic that sensitive, wounded souls like him can't withstand the pressure. Krasner quickly discovers that Pollock's genius arises from his damage; the more tumultuous their life together, the more profound the art he creates.

And it is, undoubtedly, profound art. If you're unfamiliar with Jackson Pollock, he created his most influential paintings by flowing paint onto the canvas from above, using the brush like a conductor's baton. The resulting images are chaotic, frenzied, and devoid of any recognizable object; when people say "I don't understand modern art," they're probably thinking of Jackson Pollock. But his work perfectly encapsulated the reeling, frenetic post-WWII years.

As director, Harris lavishes attention onto Pollock's creative process. His style is quick and gestural. In his earliest artworks, he slathers paint onto the canvas directly from the tube—an innovation made possible only by technological advances in quick-drying acrylic paint. Later, working in a converted Long Island horse barn, he unlocks his signature drizzling style by accident, making new expressions possible that no prior artist ever considered.

Ed Harris as Jackson Pollock, possessed by the muse
Therein lies an important theme of this movie: what is art? To Pollock and Krasner, art communicates something internal to the artist; the audience is a latecomer to the process. As his paintings convey Pollock’s inner turmoil, his personal life becomes more ordered. Pollock and Krasner wed, sober up, and become contributing members of their community. He successfully moves his inner pandemonium outward, where art makes the artist more human.

In this environment, art is intensely personal. Where art conservatories teach aspiring painters to mimic the masters, Pollock strives to create something unique to himself. He succeeds, but not without cost. He becomes disdainful of other painters; once, when compared to Pablo Picasso, Pollock snipes: "Fuck Picasso." It's a beautifully understated moment expressing how, in becoming himself, he has become tragically disconnected from others. Famous, but alone.

But the clamoring public expects Pollock to repeat past successes, even as his inner struggle has moved onward. Trapped into mimicking himself, he finds his psychological struggles reasserting themselves, and he lashes out spitefully at anybody who dares approach him as a friend. As he descends back into drunkenness, alienates Lee Krasner (who supports his work but increasingly despises him), and takes a mistress, he becomes a caricature of the malignant genius.

There's a moment, in the penultimate scene, that any artist will find difficult to watch. Driving drunk, unable and unwilling to divorce his wife, bereft of creative outlet only seven years after achieving fame, he looks at his beautiful but vacuous mistress (Jennifer Connelly), and the life drains from his eyes. Though he is physically alive when he drives over an embankment, we have already witnessed his dying moment. It's painful to behold.

Renaissance artists created art to praise God—a God not all believed in, but nevertheless. Without God, art becomes something different, something personal. This movie suggests that’s why artists are frequently self-destructive, because they’ve parceled themselves out to the often unappreciative public. Harris has created an engaging story of how art consumes, in every way, the best artists. Like art, it’s beautiful and tragic.