Thursday, December 31, 2015

Was Bill Cosby's Perp Walk Necessary?

This event was probably inevitable, but it was also probably very, very unhelpful
Yesterday’s much-heralded arrest and arraignment of Bill Cosby hopefully represents the beginning of closure for the women Cosby allegedly assaulted. But it probably represents a brief intermission in America’s media circus surrounding his trials. And just as good playwrights lead into act breaks with emotional cliffhangers, this interlude begins with images guaranteed to drag audiences back. Ushered past celebrity-chasing photojournalists by a studiously mixed-race police pair, Cosby looked every one of his 78 long years.

He also looked exactly like a stereotypical Black man getting his perp walk. With cameras packed as tightly as any Oscar Night red carpet, police shuffled Cosby within grabbing distance of ambiguously adoring crowds. Importantly, police pulled Cosby through the front door of the courthouse, an indignity not forced upon George Zimmerman or Officer Darren Wilson. About the only way yesterday’s arrest circus could’ve been more stereotyped was if police had made Cosby wear a hoodie.

Journalists have penned numerous think pieces regarding Cosby’s highly public disintegration. Will Cosby be remembered, they wonder, as a sexual predator or a comedian? Will these allegations, most outside the statute of limitations, undo Cosby’s history of fighting for African-American rights and justice? These articles have struggled to remain studiously fair to both sides; but the accompanying photographs have inevitably shown a Black man, framed head-and-shoulders by the camera, looking exactly like a mug shot.

Progressives have treated Cosby pretty badly. Wake Forest professor Melissa Harris-Perry, speaking from Rachel Maddow’s MSNBC pulpit, used Cosby’s arrest to conflate virtually everything the former America’s Favorite Dad has ever done to irritate African-American activists. And that list isn’t small. The circumstances surrounding Cosby’s arrest, which dominated half the Maddow show yesterday, opened a spigot of hardly-repressed wrath, letting Harris-Perry castigate a sprawling catalog of social issues. Most were unrelated to the pending allegations.

Despite his contributions to resisting racial injustice, Cosby has long irked certain quarters with his “Politics of Respectability.” This despite the fact that the underlying principle is one professors frequently preach to students: To Get Taken Seriously, Act Like Someone Worth Taking Seriously. Cosby has preached that African-Americans, especially young men, should avoid comporting themselves according to racial stereotypes, like saggy britches and slang. Harris-Perry, and other Black activists, have taken issue with what they consider Cosby’s assimilationism.

Nevertheless, this principle informed Cosby’s quite-long career. He took pride in ensuring his I Spy character, Alexander Scott, got presented as smart, well-spoken, and fatherly. This contrasted with then-current stereotypes in Shaft, Superfly, and other dubious blaxploitation classics. (Cosby reputedly took offense at the 2002 I Spy movie, where Eddie Murphy played a semi-literate, philandering goofball.) Cosby also ensured that his characters in The Bill Cosby Show and The Cosby Show were clean-cut professionals with families.

This issue found itself on egregious display yesterday. Despite his button-down cardigan and varnished bamboo cane, Cosby, marched before the assembled multitudes, looked distinctly rumpled and disreputable. Were anybody from Central Casting seeking a stereotypical Creepy Neighborhood Pervert, yesterday’s footage would certainly find itself in the B-roll folder. Cosby looked scruffy, the shambling image of sexual indecency. Even if, like Michael Jackson, Cosby finds himself legally exonerated, yesterday’s images will cast a long shadow retrospectively.

Cosby’s long, lingering public disintegration has become, for America’s political Left, what Donald Trump has been for the political Right: an opportunity to vent its id publicly. Despite his demi-liberal politics, Cosby has always personally been essentially conservative, a reality that challenges the alliance between African-American leaders and the Democratic party. The lingering current of discontent between organized Black leadership in America and Bill Cosby is finally getting the public viewing it so desperately deserves.

Bill Cosby now joins the list of public personalities whose sexual violence has become public property after their careers are essentially over. Some, like Jimmy Savile and John Howard Yoder, didn’t get exposed until after their deaths. Few have been treated badly: Savile’s name got quietly removed from his philanthropic contributions, while Yoder still retains a sizeable army of public apologists. For comparison to Cosby’s treatment, one would have to go back to Michael Jackson.

When members of disempowered minorities get busted, their failings cast shadows on their groups. Barack Obama doesn’t get judged as President or a Democrat; he’s America’s First Black President. Now Cosby has been turned into a stereotypical Black predator for the cameras. It’s hard to separate the bad treatment he’s receiving, which even banksters and convicted politicians don’t receive, from his race. And it’s hard to separate his race from our attitudes as a nation.

Monday, December 28, 2015

The Dark Knight of the Soul

1001 Books To Read Before Your Kindle Battery Dies, Part 64
Frank Miller, Batman: The Dark Knight Returns, and
Grant Morrison, Batman: Arkham Asylum—A Serious House On Serious Earth


These days it seems difficult to believe such an iconic character was ever endangered, but during the 1970s, sales of Batman comics slumped so low that the character faced cancellation. Following DC Comics’ still-controversial 1985 reboot of their narrative continuum, the publisher began experimenting with the character’s moral and psychological implications. Two long storylines paved the way for renewed popular attention to Batman before Tim Burton’s 1989 movie: 1986’s The Dark Knight Returns and 1989’s Arkham Asylum.

Immediately following DC’s reboot, writer-artist Frank Miller, already a comics veteran of more than ten years, decided to reimagine Batman at the end of his career. He depicts Bruce Wayne, aged and retired, sharing liquor and reminiscing with Commissioner Gordon. But in Batman’s absence, Gotham’s crime has returned. Flippant muggers leave Wayne bleeding on a sidewalk, reawakening an old man’s deeply buried hunger. Batman returns to a world no longer prepared for him, a world torn between violent street gangs and an autocratic government.

Batman’s return precipitates greater consequences. Deep inside Arkham, Two-Face and the Joker have convalesced for a decade. Modern medicine restores Harvey Dent’s face, but he believes himself now completely scarred. No longer burdened with black-and-white morality, Dent plunges Gotham into blood with his newfound nihilism. But Dent has nothing on the Joker. Catatonic for ten years, he regains consciousness upon glimpsing Batman. (Miller anticipates, by two years, Alan Moore’s hypothesis that Batman and the Joker need one another.)

These events unfold against a background of extreme transnational violence. Miller presents a stark world, a reductio ad absurdum of 1980s moral absolutes. America has recruited some superheroes as government agents, and driven others underground. Superman takes orders from a President Reagan grown gnarled from power, but his thought bubbles reveal he despises collaborating in fascist warmongering. A maimed fugitive Green Arrow, meanwhile, proves Batman’s greatest ally and connection to the growing, organized anti-statist resistance.

Where Miller’s sweeping, epic-scale novel reflects his long publishing history, Grant Morrison caught audiences largely unaware. His story is intimate, circumscribed in both geography and time. To quell a prisoners’ riot in Gotham’s notorious Arkham Asylum, Batman acquiesces to an elaborate hide-and-seek game inside the building’s historic walls. His encounters with decrepit inmates provide insights into Batman’s own fractured, delusional psyche. He cannot know, however, that his circumstances mirror the Asylum’s own eerie history.

Miller’s massive political saga seems worlds removed from Morrison’s claustrophobic psychological thriller. Morrison traps Batman inside the world he’s created for his greatest enemies, letting them re-inflict the horrors he previously exacted upon them. As Batman travels deeper into the asylum (Bruce Wayne makes only fleeting appearances in this story), the inmates force him to confront his own unresolved traumas. Morrison implies that Gotham only controls its native lunatics by employing an even greater lunatic.

Dave McKean's nightmarish, profoundly influential depictions of Batman and the Joker
In parallel to Batman’s ordeal, we receive glimpses into asylum founder Amadeus Arkham’s own horrific descent. The name Amadeus means “love of God,” while Arkham comes from H.P. Lovecraft. Dr. Arkham begins as a devoted son and becomes a dedicated psychiatrist, mentored by Carl Jung himself. But his own dark history, entombed deep inside himself, returns to visit, costing him everything he loves. Without ever making the choice, Arkham becomes a vigilante killer… and inmate.

Besides their divergent story structures, these novels have wildly different looks. Frank Miller does his own rough art, and his thick, deeply-carved lines make an austere background for colorist Lynn Varley’s surprisingly lush but controlled watercolors. The smeared color over inflexible outlines reflects Miller’s late Cold War milieu. Morrison wrote his script in collaboration with artist Dave McKean, whose watercolors, photocollages, and groundbreaking digital art presage 1990s graphic storytelling, while emphasizing Morrison’s dense psychological insights.

Despite their far-reaching influence on Batman mythology, and comics generally, one mustn’t mistake these stories for twinsies. Miller expands Batman’s impact by stressing his channeled criminal impulses, the reality of Batman as champion of order, but not necessarily law. Morrison, by contrast, says Batman saves others by failing to address himself. Miller’s political novel, and Morrison’s psychological horror, explore Batman from without and within. They provide a channel through which all subsequent comic storytelling flows.

These novels are inarguably products of their time. Written as America’s crime statistics hit their all-time peak, and society threatened its own nuclear armageddon, they reflect the nihilistic philosophy underlying Reagan’s Morning in America. Like The Terminator or Red Storm Rising, they wouldn’t be written today. But the world we inhabit today wouldn’t exist without their having been written. We’ve ascended from the bleak world these novels describe, but their cynical influence lingers, waiting. Patiently.

Friday, December 25, 2015

The Politics of Christmas

In the days preceding Christmas this year, my Facebook feed has been cluttered with remarkably repetitive stories: promises to explain the political implications of “I Saw Three Ships,” “Do You Hear What I Hear,” “Good King Wenceslas,” and other Yuletide standards. Every Christmas seems characterized by some overwhelming theme. This year, apparently, it’s that Christmas traditions have political significance. Somehow, despite the weight of history, this always surprises people.

At the risk of sounding like my dad, we’ve arguably forgotten the meaning of Christmas. I don’t mean that Santa risks displacing Baby Jesus from our attention; we’ve fought that battle relentlessly already. Rather, the strident complaints about the War on Christmas have forced liturgically conservative Christians to cede all political implications of Christmas to people whose interests couldn’t coincide less with the message contained within the Gospels.

Northrop Frye writes that Jesus’ life, as we know it, remains inseparable from prophecy. True that, but it clearly also remains inseparable from politics. From its opening passages, Jesus’ biography ballyhoos its political implications. The two birth narratives, often melded into one in twee Protestant Christmas Eve services, actually have different takes on then-contemporary politics. They agree, however, on one important point: Jesus’ birth explicitly repudiates worldly power and influence.

Luke, the better-known birth narrative, begins with Joseph and Mary displaced from their home for tax purposes. It explicitly establishes Christ’s birth amid the upheaval of a conquered people, whose lives are periodically suspended to subsidize a government that rules the land, without ever working it. Re-imagine it thus: if America required every Native American to move back onto The Rez for BIA bureaucratic purposes, Modern Jesus might appear in such circumstances.

Matthew, by contrast, pits Jesus’ birth against Herod the Great. A puppet king installed by Rome, Herod presided over a reign history records as a triumph of secular splendor paid for by the impoverishment of Judea’s people. Josephus records that Herod’s Temple in Jerusalem was more spectacular than any Roman religious house. And Matthew records Herod demanding an entire generation put to sword to forestall any challenge to his worldly authority.

One could continue. Luke records pre-teen Jesus disputing the religious authorities in their temple—religious authorities whom Obery Hendricks recounts weren’t spiritual shopkeepers, but an explicitly political order established to govern a conquered people. (“Priests” are, historically, lawkeepers, not shepherds.) Matthew describes Jesus’ family exiled to Egypt, an explicit allusion to Israel’s two periods of national exile in foreign lands. Both Evangelists record differing, but explicitly political, early Messianic childhoods.

This thread continues into Jesus’ ministry. Traditional liturgy loves emphasizing how Jesus called His apostles from among fishers, farmers, and other uneducated poor. The apostles’ lack of theological training is certainly significant. But equally important, Jesus called His apostles from Galilee, from people nominally Jewish, but who, through geographical inconvenience, couldn’t participate in standard Temple ritual. Jesus’ apostles weren’t merely unschooled and penniless; they were impious, despised, and possibly apostates.

Jesus seldom spoke against Roman occupation. He demanded believers “render unto Caesar” (itself a loaded story), and even healed a Centurion’s beloved servant. However, Jesus reserved His greatest wrath for Pharisees and Sadducees. Not because He opposed religious authority, as some suppose—He eagerly engaged modest, curious inquirers like Nicodemus in Socratic give-and-take. Rather, He opposed religious leaders because they derived their power from maintaining the status quo like Quislings.

Even His death has political implications. Douglas Adams famously claimed Jesus “had been nailed to a tree for saying how great it would be to be nice to people for a change.” But be serious: empires don’t nail hairy provincial preachers to timbers and stake them up to die in the municipal landfill for telling people to pray more. Temple authorities demanded Jesus’ death, and Pilate capitulated, because He threatened their dominion.

Though theologians across time have debated exactly why Jesus needed to die, one thread remains constant: human authority found Him dangerous. He preached against wealth, dominance, and power. He forgave sinful women while condemning lustful men. He healed and redeemed Samaritans, while chasing Jews from their Temple. He literally hugged lepers. He established power relationships exactly opposite those this world favors. Jesus’ life was wholly, explicitly political.

Catholic author (and sometime conservative mouthpiece) Garry Wills writes that “Jesus did not come to bring any form of politics.” True enough, if by “politics” we mean partisan alignments. Jesus doesn’t endorse any political party, notwithstanding both American parties’ recent eagerness to enlist His membership. But Jesus clearly had strong positions on human relationship to power. Christians need to examine this closely. Human governments can’t get us into Heaven.

Wednesday, December 23, 2015

Schoolhouse Block—the Movie

1001 Movies To Watch Before Your Netflix Subscription Dies, Part Three
Mike Million (director), Tenure


Professor Charlie Thurber (Luke Wilson) loves teaching, and his students love him back, some a little more than they probably should. But he hates academia's competitive paper chase. After being passed over for tenure once too often, he decides to knuckle down and join the game at bucolic Grey College. But a screw-loose colleague, a sexy competitor, and family pressures may be more than a loyal English professor can bear.

This straight-to-DVD gem will probably never get the recognition it deserves. PR people can't compress its concept into a plug line. Its gentle, optimistic tone defies hip cinematic cynicism. And its low-key humor, based on characters and language rather than broad physical comedy, will never rake in the big bucks. Yet I can't help but love this film, possibly because I see myself and my colleagues here on screen.

As Grey College’s only non-tenured English instructor, Charlie assumes a new full professorship is his for the asking. Until the department hires Elaine Grasso (Gretchen Mol), formerly of Harvard, a well-published but awkward wunderkind, to sweeten the competition. Charlie, a gifted teacher, sports a brief CV, because “publish or perish” passed him by. But with job security and pay on the line, he becomes painfully aware of academic politics.

At the other extreme, comedian David Koechner (Anchorman, The Office) plays Charlie’s best friend Jay. A chronic loser and academic outcast himself, he apparently exists to offer appalling advice. Urging Charlie into numerous adolescent stunts and theatrical displays, he definitely increases Charlie’s visibility before the tenure committee. But Charlie quickly questions whether succeeding at the cost of his integrity really accomplishes anything.

Luke Wilson (left) and Gretchen Mol

Anybody who’s studied, or taught, university-level liberal arts recently will recognize Charlie’s fundamental struggles. For three generations, America saw investment in higher education as not merely a moral good, but a tool for fighting the Cold War. Since the middle 1990s, however, willingness to subsidize education has plummeted. Administrative patronage plums have mushroomed, while classroom budgets have cratered. Fiercer competition for fewer jobs has become SOP in academia.

But this story isn’t exclusively for scholastic types. Anyone who’s felt the frustration of today’s widening gap between work and reward will recognize this story. Charlie desperately wants, not fame nor recognition, but security enough to do his job the best way possible. Glimpses of his classroom technique and his students’ undisguised respect prove he’s proficient. Yet somehow, in today’s go-go economy, doing a good job isn’t good enough anymore.

This movie’s shoestring production permits a design edge missing from many recent Hollywood spectacles. Shot for $5 million, it uses existing locations, like historic Bryn Mawr College, to give the production an authentically bygone texture. Without expensive music or digital effects, the producers rely upon genuine performances and careful mood to hook audiences. It’s odd, and appealing, to watch a film without having our senses shocked or our emotions manipulated.

I especially respect writer-director Mike Million’s rejection of hip conventional screenwriting techniques. In a movie marketplace dominated by Blake Snyder’s Save the Cat!, we’ve become jaded by inflated stakes, cascading tragedies, and rigid three-act structures. Million’s picaresque storytelling, about a schlubby everyman who wants to do a good job well, makes an engaging change. This movie offers few focus-tested surprises, preferring to offer engaging characters in a smart situation.

And thankfully Million avoids the most obvious trap: he doesn’t force Charlie and Elaine into bed. Throughout the movie, they develop mutual respect, even friendship, that complicates Charlie’s desire to subvert her career. Toward the end, they hint at a possible future courtship. But essentially, their relationship is a realistic depiction of professional competition between two smart people who happen to be of opposite sexes.

Wilson plays Charlie so he has our sympathy, but doesn't need our pity. He excels at what he does, and students seek his help because he's a good teacher. But being good isn't good enough anymore. Anybody who's ever postponed grading, or given students just enough to get by while hammering away at scholarship to show the department we deserve to exist, will recognize Charlie as one of our own.

And this movie doesn't jump out waving jazz hands to convince us we ought to laugh. Despite a few exaggerated moments, it mainly displays an understated quality that shows its audience a level of respect we've grown unaccustomed to recently. If more movies like this emerged from mainstream Hollywood dream factories, Sunset Strip might have fewer zillionaires, but the movies would still be something to look forward to.

Monday, December 21, 2015

Is It Okay to Hate the Rich Yet?

Serial CEO Martin Shkreli would like to be known as a master
strategist. In fairness, lying and theft really are strategies.
Last spring, self-described “media manipulator” Ryan Holiday released his second book, The Obstacle Is the Way. This attempted updating of classical Stoic philosophy annoyed me from chapter one, when Holiday appropriated robber baron John D. Rockefeller to exemplify applied Stoicism. Rockefeller used many unethical or illegal practices, including predatory pricing to submarine competitors, union-busting to keep desperate workers underpaid, and dumping sludge into rivers.

Nevertheless, Holiday praises Rockefeller’s purported Stoic qualities. In my favorite example, Holiday extols Rockefeller’s immunity to outrage: “He could not be rattled… not even by federal prosecutors (for whom he was a notoriously difficult witness to cross-examine…).” Wait, Rockefeller got prosecuted? For crimes history now considers established fact? Holiday never blinks at this. Rockefeller’s inability to get upset sounds less like Stoicism than diagnosable psychopathy.

Bill Gates
I noted this inconsistency in my review. History believes John D. Rockefeller’s foremost philosophical standpoint was John D. Rockefeller. Nevertheless, Holiday’s encomium to wealthy privilege received multiple defenders, including this gem, dated September 2015:

For all your moaning, Rockefeller was not perfect, but in the end he did more good than bad. He sure wasn't a criminal, he gave to charity by the time he was 16, and he helped medical science very concretely, you half-baked sod
I recalled this naked bullshittery last week, when two news items crossed my desk simultaneously. First, notorious “Pharma Bro” Martin Shkreli, who achieved notoriety when he jacked life-saving drug prices over five thousand percent, got arrested for playing three-card monte to conceal massive corporate losses. Then uncovered papers revealed how Google, among Earth’s largest corporations, spread money like margarine to make European antitrust cases go away.

Separately, these stories make chilling but ultimately disconnected anecdotes. But patterns accumulate: Microsoft’s late-Nineties antitrust case mysteriously vanished when George W. Bush, Bill Gates’ largest campaign contribution recipient, became President. Many Bush Administration appointees resented having to divest lucrative Enron shares, until Enron’s numbers proved founded on lies (see Greg Palast). JPMorgan CEO Jamie Dimon lauded his personal virtue in avoiding the 2007 banking collapse, before regulators zapped him for multiple market violations.

The chilling circumstantial evidence becomes difficult to avoid: one cannot become exceedingly rich in America without blithe disregard for ethics, rule of law, or the value of human life.

Please don’t misunderstand. I’m not discussing what I consider “the ordinary rich.” I’ve known people who become rich by local standards because they provide valuable products or services: the software developer, the craft brewmaster, the restaurateur. I respect these people because their wealth derives from, and ultimately returns to, the community that built them. No, I’m discussing financial superstars who become so massive that, like literal stars, their mere presence bends light, space, and the value of a buck.

Martin Shkreli, son of immigrants, successfully flipped money to become richer at twenty than most people who offer services or build things become throughout their lifetimes. Fine, he has valuable skills. He arguably generated capital for young businesses and striving entrepreneurs. But evidence shows he started multiple companies partly to shuffle money and create the illusion of more value than actually existed.

Marcus Aurelius
Google has flourished partly by purchasing every possible service available. They own YouTube, Boston Dynamics, Zagat, and other services you use daily. (They also own my blog platform; I’m arguably biting the hand that feeds.) But they’ve gained their highest revenues not from services they provide ordinary customers; their largest revenue engine entails packaging our usage data and reselling it to corporations. It’s anybody’s guess who owns your last porn search.

And yes, sometimes the exceedingly rich give to charity. John Rockefeller gave liberally, as does Bill Gates, a century later. So do I. But nobody praises me for giving from my limited store (Mark 12:44). Rockefeller and Gates became very, very rich, at least partly through theft, then returned some fraction of their proceeds to the common good. If bank robbers put change in the Salvation Army kettle, is anybody particularly impressed?

As my pseudonymous critic notes, nobody’s perfect. Even Marcus Aurelius, probably the most famous voice of classic Stoicism, needed to compartmentalize his life. His Meditations, among my favorite books, notably never mentions his military campaigns to suppress insurgency among conquered peoples, nor the Donald Trump-like zeal with which he defended the empire’s borders. We’re all only human.

It’s human nature to admire the wealthy, the glamorous, and the pretty. But criminals are not to be admired. And, capitalist myths notwithstanding, let’s acknowledge: it’s hard to get very, very rich in America without callously disregarding the law.

Wednesday, December 16, 2015

One Simple Step to Avoiding Regret

Stanley Milgram
The holiday season is upon us, and with it, the annual unceasing pressure to do things we really don’t want to do. From attending parties with people we secretly dislike, to continuing drinking past our limits, to waking up beside somebody we’ll instantly regret, the holidays, for adults, seemingly combine joy with heaping helpings of shame. Anymore, the Christmas season seems less about Jesus, more about facepalms and instant remorse.

Thankfully, one of America’s leading public psychologists has a solution—which has lain hidden inside a little-read book for nearly two generations. Stanley Milgram, whose notorious Obedience to Authority experiments demonstrated how power causes good people to do awful things, dedicated his career to researching how private individuals handle public pressures. Milgram’s lesser-known, but powerful, book The Individual in a Social World explains his most interesting, and disturbing, findings.

In one experiment, Milgram assigned graduate assistants to do something most New Yorkers would consider wholly unimaginable: walk onto a city subway and ask strangers to yield up their seats. Dedicated Gothamites know that, unless you’re disabled, pregnant, or elderly, this violates unspoken social standards. True subway users spread evenly and don’t talk, and seats are strictly first-come, first-served. This experiment violated two norms: studious silence and seat respect.

Milgram’s results are surprising. For instance, male riders agreeably yield their seats to women twice as often as to men… but women are fifty percent more likely to yield their seats to women. Askers who give no reason for asking are twice as likely to actually get the seat as askers who give a trivial reason. And, weirdly, violating the subway’s unspoken rules apparently makes many askers physically ill.

One discovery seems surprising until you consider it coldly: giving advance warning, several seconds before actually asking for somebody’s seat, cuts compliance rates by more than half. If askers give no advance warning, 56% of riders will yield their seats; a mere verbal warning followed by a brief pause reduces yielding to just over 26%.

Anyone who understands science knows that this data is insufficient to draw ironclad conclusions. Knowing people do something is a far cry from understanding why they did so. However, it does justify some reasonable speculation. The parallel between brief verbal warning and trivial reasoning lets Milgram suggest a short but persuasive explanation, oddly buried in an endnote. Citing Erving Goffman:
[R]equests demand either compliance or an “accounted denial.” That is, one does not merely say “No” to a polite request, one gives a justification for saying “No.” It takes time to realize that a justification is not required in this case or to construct one. Many subjects may have given up their seats simply because they didn’t know how not to.
Erving Goffman
Readers of my generation, who grew up with Nancy Reagan’s “Just Say No” ideology, understand that such steadfastness makes better ideology than practice. Crossing your arms, planting your feet, and refusing anything outright can actually make you feel queasy. But an asker providing trivial justification makes dismissal easier: I’m not refusing you, I’m refusing your reasons. And verbal warning lets me plan my answer before you even ask.

People standing outside given situations may believe refusing an unreasonable request seems straightforward. Tell the asshole no, and forget them. But many people who believe that about others have nevertheless been unable to refuse one more drink, another unwanted party, or frankly unwanted sex, because saying no is prohibitively difficult. This especially affects women, who learn from early girlhood that not being agreeable equals unladylike behavior and “bitchiness.”

Fortunately, all is not lost. This problem contains its own solution. If we cannot directly refuse yes-or-no requests, we can nevertheless indirectly refuse them. If we have justifications, we can say no. And if we know what requests we’ll receive in advance—if every year involves another drunken Christmas party, say—then we can plan our justifications before we need them. Having refusals prepared makes giving them easier.

We’ve all known the helplessness of getting cornered with some request we’d rather avoid. And we’ve known the guilt that traps us into commitments we find burdensome, unpleasant, or intrusive. Simply planning ahead, having our refusals pre-scripted, takes that burden off our shoulders. Certainly we cannot anticipate every disruptive or irksome request others will make. But by having our rebuffs prepared, we can escape the pressures, and resulting guilt, we all experience this time of year.

Monday, December 14, 2015

Trapped Inside the Iron Flesh

1001 Movies To Watch Before Your Netflix Subscription Dies, Part Two
Julian Schnabel (director), The Diving Bell and The Butterfly (2007)

In late 1995, French magazine editor, bon vivant, and inveterate womanizer Jean-Dominique Bauby suffered a stroke. When he awoke, the essential functions of his brain had become completely disconnected from his body. He would never recover the ability to use any part of his body except his eyes, one of which was so damaged it had to be sewn shut to prevent infection. He would communicate, for the rest of his life, entirely by blinking his left eye.

This movie is based on the memoir Bauby managed to dictate to a special assistant using blink responses. The title metaphor declares that he has become divided. His body has become a massive steel diving bell, submerged deep beneath an ocean he has no hope of escaping, silent and alone. But his mind remains free as a butterfly, traversing the world, reliving his greatest adventures, and constantly making new discoveries. His prison has become his freedom.

Bauby struggles, slowly, to overcome the limitations which his newly limp, leaden body places upon him. He must reconcile with the family whom he alienated with his wild living and his unconcern for their feelings. Through flashbacks, we discover the life he once lived, all glamour and flashbulbs and selfish consumption, a train wreck of hedonism playing out across years. He thought he needed nobody else, and lived like the center of his own universe.

Then the realities of biology collided with him. The movie plays it out like karmic retribution, as though he suffered a stroke in recompense for his heedless ways. This is magical thinking, of course, a retrospective explanation Bauby (or his film adapters) invented for narrative purposes. But if it’s fiction, it’s useful fiction, and the contrast between his past arrogance and the humility he must now learn proves striking.

Mathieu Amalric (in the bed) as Jean-Dominique Bauby, composing his book,
with the assistance of a transcriptionist (Anne Cosigny)

The difficulties of “Locked-In Syndrome,” in which severe damage to the brain stem creates a permanent gulf between the rational mind and the body, have been explored before. Philosophers have pondered what this syndrome says about the supposed dualism between soul and flesh. Filmmakers have exploited the helplessness and implicit mortality for horror value. However, by telling this story through Bauby’s lived experience, the film offers a completely different, unexpected viewpoint.

(To its credit, the movie avoids medical jargon and scientific-ese. Though highly specialized in its insights, it doesn’t require technical expertise to understand unfolding events.)

The movie’s title, even more powerful in the original French, Le Scaphandre et le Papillon, reflects Bauby’s new two-pronged life. Bauby experiences his body as an old-fashioned steel diving suit, a massive piece of metal trapping him in a strange world with no external contact and limited oxygen. Life becomes a struggle of complete isolation as lovers abandon him, friends address him like an object, and strangers ignore him.

But Bauby quickly separates himself from his unreliable senses. He retains the trait that makes him unique, his mind; and inside his isolated brain, he travels extensively, revisits the triumphs of his youth, and becomes a creature of pure experience. As he achieves this near-Buddhist state of mindful simplicity, he overcomes what other writers (like me) might consider limitations, and composes his memoirs, an eventual bestseller, entirely in his head.

These conflicting impulses, the iron cage of his flesh and the flittering butterfly of his mind, play out across Bauby’s story. Reality is told in fleeting glimpses, all washed-out colors and smeared images, reflecting Bauby’s inability to perceive reality clearly. But when venturing inside his mind, Bauby’s perceptions become hyperreal, saturating the senses and almost overwhelming the audience. Bauby’s two-track life, desperate helplessness and complete freedom, plays out before us.

Not everyone will like this film. Like most French cinema, indeed like the French language itself, it unfolds with soporific grace. This film doesn’t adhere to the three-act structure English-speaking audiences have come to expect from their cinema. Viewers unaccustomed to international films may have difficulty adjusting their viewing habits to this film’s hypnotic, wave-like cadences. This isn’t a film for world cinema newbies.

However, art-house movie fans and students of cerebrovascular medicine will both find plenty to like in this film. Audiences willing to adjust themselves to this film’s flow will enjoy, even learn from, Bauby’s journey of self-discovery. Learning to master his newly limp body, and to live entirely within his mind, is hard for him. Watching that discovery is hard for us. Yet in the end, we feel we’ve undertaken that journey with him. And like him, we feel renewed.

Friday, December 11, 2015

Don't Go To College... Yet



If you have any kind of a social media account, you probably witnessed this week’s viral video, “Go To College,” exhorting youth to pursue higher education rather than just hang out. Produced by website College Humor, which has collaborated with the Obama administration previously, and co-starring SNL actor Jay Pharoah, it makes a catchy tune. And it showcases the success the Obama administration, and the First Lady especially, enjoy using social media for social good.

I find myself torn. Anyone familiar with my background in college education will understand why I think getting your higher degree matters. A good liberal education makes people free, a truth understood since Greco-Roman times. But as one among millions of Americans finding his life options severely circumscribed by inability to pay college debt, I have severe qualms about pressing students into schooling for which they’re often unprepared. There must be a middle ground somewhere.

Ever since the GI Bill created an entire new generation of college-educated middle-class workers following World War II, higher education has undoubtedly been key to entering America’s comfy home-owning central echelons. Because of this, students, especially academically astute students who take standardized tests well, face monolithic pressure to attend college. Being inexperienced, youth often remain unaware of other options available. Thus, except among the poorest Americans, college becomes the supposed funnel to adult economic stability.

This is further compounded by the frequent lack of career guidance colleges provide outside vocational programs. All through high school, the top advice I received was: go to college. In college, my professors urged me into graduate school. In graduate school, my professors urged me into a Ph.D. program and eventual professoriate—during years when spending on college professors was bottoming out. I graduated with a degree, an outdated résumé-writing guide, and $30,000 of debt.

I cornered one professor and demanded guidance on how to pursue a career in my field. He admitted he didn’t know; he, and his colleagues, had been outside the non-academic job market so long, any job-seeking skills they’d once had were irretrievably outdated. Though my major programs (I doubled) both offered career planning classes, they taught only broad, sweeping maxims, from professors who’d spent years, sometimes decades, off the market. I quickly came to despair.


However. During my teaching years, I noticed something distinct and consistent. My best students, almost without exception, hadn’t started college directly from high school. They’d taken time off to pursue something fulfilling, meaningful, or remunerative. This may have involved travelling, getting a job, or starting a family. One particularly successful student had served two tours in Iraq and wrote movingly about PTSD. One spent a year studying evangelism in Scotland. One had been to prison.

Students who worked, traveled, or lived before college entered with important skills. They had better ideas what they wanted from higher education, giving them laser-keen focus on their final goals. And they were more self-directed, which made them better able to handle college learning. Freshly minted high school graduates were more accustomed to a teacher-centric, classroom-oriented learning environment, and unprepared for the hours of autonomous, private study college demands. Many hit a wall.

When America introduced compulsory state-based schooling back in the Nineteenth Century, early backers like Catharine Beecher and Horace Mann needed ways to compel reluctant students into the classroom. One way they accomplished this was to create an undercaste of social rejects and malcontents, whom they nicknamed “dropouts.” We see these instruments of social control perpetuated today whenever anybody says the people who cook food, build roads, and stock shelves deserve “menial” pay for menial work.

Now our highly respected FLOTUS, backed by America’s well-funded media machine, insists online that every job besides literally watching paint dry and grass grow deserves, even requires, post-secondary schooling. But my classrooms were already flooded with students who didn’t want, and were unprepared for, higher ed. They simply didn’t see any other options, a fact which came across in their measurable outcomes. College literally isn’t for everyone. Creating even more pressure forecloses students’ available options.

Nearly a quarter-century ago, John Taylor Gatto wrote something that’s really stuck with me: that life without education is life only half-lived, but we mustn’t mistake education for schooling. Ramrodding students into academic environments for which they’re unsuited does them severe injustice. Why is college universally better than apprenticeship, on-the-job training, or national service? Why can’t youth postpone college until they’re ready? College shouldn’t be a jobs factory. Kids deserve better than more unwanted pressure.

Monday, December 7, 2015

Our Thoughts And Prayers Are With—

1001 Books To Read Before Your Kindle Battery Dies, Part 63
Shane Claiborne and Jonathan Wilson-Hartgrove, Becoming the Answer to Our Prayers: Prayer for Ordinary Radicals


In one of my favorite quotes, Danish existentialist Søren Kierkegaard writes: “A man prayed, and at first he thought that prayer was talking. But he became more and more quiet until in the end he realized that prayer is listening.” In our era of highly public violence, politicians have increasingly fallen lazily onto the cliché that “our thoughts and prayers are with the victims of this senseless tragedy.” This makes me wonder: what, then, are they listening to? And what is that telling them?

Shane Claiborne and Jonathan Wilson-Hartgrove had highly personal encounters with Christian faith during their college years. These experiences led both men to abandon comfortable white middle-class upbringing, forming communities in the poorest neighborhoods of Philadelphia and Durham, North Carolina, respectively. These communities became central points of a spiritual movement, modeled on Benedictine quietism, entitled the New Monasticism.

American public theology once centered on engagement with, and often battles against, worldly authority. Theologians like Dr. King and Reinhold Niebuhr challenged the powerful and the rich to reconcile the lives they lived with the beliefs they claimed. Since around 1979, America’s public Christianity has veered into overt partisanship and defending the old order (see Michael Sean Winters). But New Monasticism, though non-partisan, staunchly challenges the powerful on their thrones.

Like most human activism, however, Christian outreach risks individualistic self-righteousness, or its close cousin, burnout. We’ve all known people, fired by divine fervor, who either consume themselves or everyone around them. Early on, our authors write: “faithfulness requires something we just don’t have on our own.” Prayer provides sustenance when we realize we, ourselves, aren’t sufficient to address reality’s vast demands. But this raises more questions than it solves.

Shane Claiborne (left) and Jonathan Wilson-Hartgrove

Claiborne and Wilson-Hartgrove analyze three scriptural prayers and find not a call to wait on God, but a call to act on Christ's mission. In the Lord's Prayer from Matthew, they find hope for a community of believers on Earth, a model of Kingdom economics, and a request for strength when the world beleaguers us. In here, they say, Christ calls us to look at ourselves, listen to God, and act.

Christ's prayer in John 17 makes two requests of God. First He asks God to keep us unified in the world, a bastion of Godly mission against worldly distraction. Then He asks that God keep us from the world's greedy demands. We are not called to form colonies in isolation, or to be saved, but to live amongst the world. Our church exists to save God's world that He so loves.

Paul's prayer in Ephesians 1 calls us to greater openness. God created humankind for a mission, and He created the church to save humankind. To fulfill our Christian mission and receive our Godly inheritance, we must open ourselves to God's quiet requests, and having heard those requests, venture out to live the Gospel. Christianity does not end in our salvation, but begins, so we may fulfill His commandments.

In all cases, early Christian prayer bespeaks very different values from the “our thoughts and prayers” ethic dominating today’s political discourse. Current radical individualism, personal salvation, and getting to Heaven when we die don’t enter these prayers. All three, the longest and most prominent prayers in Christian scripture, announce membership in something larger, a Kingdom with no army, a nation without borders. That’s difficult—but also painfully, radically necessary.

These distinctions aren’t small. Kingdoms act; nations build. When we use prayer to defer our actions, when we substitute pious platitudes for human community, we aren’t engaging scriptural prayer. Because sometimes, when we use prayer to listen, we hear God saying we cannot cast problems exclusively upon Him. God created the Church to live.

I remember once telling my students, when they backed some far-right opinion with an out-of-context Scripture citation: “Empires don’t crucify hairy provincial preachers for telling people to pray more.” This book challenges both my unspoken assumptions, and my students’. When Christians enlist God in partisan debates, we undermine the uniqueness of Christian community. But prayer provides that community’s lifeline. We cannot be the People without talking with our King.

Wilson-Hartgrove and Claiborne do not deny traditions of prayer in conventional Protestant churches, but they call us, having voiced our prayers, to strive higher. They call us to truly live out not just our own prayers, but Christ's prayers for us. Prayer, they say, is not passively laying our requests at God's feet. We pray, and we answer prayer when we strive to achieve the fullness of the Gospel.

Friday, December 4, 2015

Selling Armageddon By the Yard

Rebecca Belliston, Life (Citizens of Logan Pond, Volume 1)

After the banks collapsed and President Rigsby seized absolute authority, the families around Logan Pond banded together. Five years on, Carrie Ashworth, only twenty-three, is raising her orphaned siblings, growing her people’s garden, and running interference with patrolman (read: stormtrooper) Oliver Simmons. But two new members, in a clan that’s been steady for five years, cause friction, made only worse when a government raid jeopardizes the community.

Rebecca Belliston’s third thriller starts well. (The second volume, already published, is entitled Liberty. Guess where we’re headed.) Belliston avoids stock Tea Party clichés about government overreach, preferring to present engaging characters enacting a complex story. But problems arise as the story progresses. Important story elements begin feeling uncomfortably familiar, and the narrative becomes distinctly preachy. Then, in a sudden Blaze, I realized where my concerns originated.

Carrie and the Logan Pond clan have established their outlaw clan around a suburban cul-de-sac outside Auburn, Illinois. This premise uncomfortably resembles NBC’s ill-fated series Revolution, but Belliston avoids showrunner Eric Kripke’s swinging machismo. Carrie’s extended clan has established an equilibrium of domestic tranquility, growing crops and stealthily evading government involvement. They enjoy a pleasant semi-libertarian balance despite America’s suspended Constitution.

One rainy evening, Mariah Pierce and her son Greg suddenly appear, disrupting the balance. Mariah wants to live peaceably with this extended clan, but handsome, bitter Greg has powerful anti-establishment desires. Notwithstanding his relentlessly bad attitude, Logan Pond evidently wants to decorate the chapel for Greg and Carrie, since they haven’t seen a marriage-aged bachelor in years. Sadly, the young adults get along like gasoline and matches.

Rebecca Belliston
I enjoyed the first hundred pages or so. Despite the enforced Edward-and-Bella dynamic between the leads, I found the premise and characters engaging, and wanted to like them. Okay, characters occasionally start unnecessary debates about whether to receive society’s collapse with stoicism or radical rebellion, and the government Belliston proposes seems profoundly impractical. But her engaging characters and sophisticated story make forgiving such lapses comparatively easy.

Then, somewhere after page 100, things change. Local patrolmen raid Logan Pond, seeking undocumented residents for the work camps. Yes, work camps; Belliston’s World War II metaphors aren’t subtle. Though everyone escapes, they lose property, crops, and sense of security. Suddenly, the once-peaceful community starts talking Greg’s radical language, while shy, moonstruck Patrolman Oliver desperately scrambles to repair his fragile relationship with Carrie. The novel’s tone abruptly shifts.

Besides becoming more confrontational, Belliston’s characters begin a sudden tendency toward monologues. Both spoken and internal, these monologues turn on how Logan Pond should confront an intrusive government that disregards rule of law. These diatribes and debates seemed remarkably familiar. Desperate to understand where I’d heard these arguments before, I Googled Rebecca Belliston. Turns out she’s a prominent Mormon with a parallel career composing LDS worship music.

And I realized: the White Horse Prophecy.

Current LDS leadership considers the White Horse Prophecy apocryphal and doesn’t accord it Scriptural legitimacy. However, it remains influential in Mormon political circles. Supposedly, Joseph Smith received a prophecy correlating his nascent church with the White Horse of the Revelation, and foretelling that, one day, the US Constitution would hang “as by a thread.” (Mormons consider the Constitution divinely inspired.) Then the White Horse will ride in, heroically restoring order.

Several public Mormons, Glenn Beck chief among them, embrace the White Horse Prophecy as instrumental to their political and social mores. They see change as corruption, activism as Godlessness, and the Democratic Party as forerunner to biblical Armageddon. This dynamic lionizes outsidership as proof of integrity, and positions religious conservatives as bulwarks against overwhelming secularized rot. It’s a self-fulfilling prophecy: the more marginalized you are, the more right you are.

Like Stephenie Meyer, Rebecca Belliston uses little or no religious language in her writing. Yet her spiritual heritage shines through her prose, from the chaste romance and studiously mild language, to the sanctification of outsider status, to the belief in the despised community restoring Nineteenth Century values. Belliston starts well, but quickly becomes predictable. What starts as an engaging character-driven dystopian story quickly unwinds into a semi-religious Libertarian political tract.

I suppose, in writing socially engaged fiction, the author’s beliefs will unconsciously inform the story. Should I write a post-apocalyptic dystopia (hey agents, I have a manuscript), it would reflect my Distributist values. I just wish Belliston had permitted her characters to drive her story, rather than yoking them to her beliefs. Because I really enjoyed this story, right up until the moment the author got in her own way.

Wednesday, December 2, 2015

Living In Post-Reality America

I’ve recently been enjoying noodling on Whisper, the smartphone-based social media app where every status update is anonymous, and everyone's in everyone else’s network. It’s mostly been playful and uplifting, and I’ve had a few really heart-warming conversations. But occasionally I get some class-A weirdos, as demonstrated in this exchange from earlier this week:
MEPHISTOPHELES: I believe the younger generations widespread introverts, social anxiety, and crippling shy people are a direct result of technology. When you don’t have to socialize, you don’t learn the skills.

ME: Or, alternatively, technology provides a means of communication for the introverts, shy people, and thinkers who were already there, but you previously got bullied by self-righteous jerks like you.

ROWAN: Anyone would become an introvert by picking technology over social situations. It’s common sense

ME: Duncan J. Watts proves “common sense” is often wrong. Susan Cain proves introversion is caused by having a large amygdala, not social choices. You prove being loud doesn’t make you right.

ROWAN: And because you’re quoting one man doesn’t make you right lol. This generation is pitiful
I stopped the conversation here because I saw no purpose in going any further. “Rowan” dismissed evidence altogether. Since Rowan had already decided reality works according to “common sense,” a remarkably slippery criterion, nothing I could say could possibly matter. Citing sources was a cause for laughter, because sources aren’t real evidence; truth comes from inside, not from accord with facts.

Welcome to post-reality America.

On Tuesday this week, New Jersey governor and Republican presidential aspirant Chris Christie declared, on MSNBC’s Morning Joe, that global warming, euphemistically called “climate change,” isn’t an important issue today. This despite his state’s economic dependence on sea-lane shipping, which may be imperiled by warming, acidified seas. This despite near-unanimous scientific consensus. When questioned on his evidence, Christie replied: “That’s my feeling.”

We’ve reached a point in American public discourse where common sense and feelings are considered more valid than evidence, including evidence that common sense is unreliable. People considered legitimate candidates for nationwide office can nakedly abandon facts, dismiss expert testimony out of hand, and mock anyone who disagrees. And ordinary citizens, seeing this behavior greeted without consequence in public, mimic it in private.

Common sense is, in essence, a projection of individual experience outward. Global warming hasn’t inconvenienced me personally, Governor Christie thinks; therefore, even if it’s real, it’s small beer. I see introverts poring over their smartphones, and many Whisper users bonding over social anxiety problems, Rowan thinks; therefore, “technology” causes introversion. No further argument matters. Your evidence, ipso facto, is ridiculous.

Donald Trump, who should probably legally change his name to “Republican Frontrunner Donald Trump,” has become low-hanging fruit in this regard. His tales have become legendary: I made myself rich (he was born rich, and Bloomberg places his wealth at a third of his claims). Mexican border-crossers are rapists (first-generation immigrants have lower crime rates than the general population). President Obama wants to welcome a quarter-million Syrian refugees (twenty-five times the real number).

Politifact has rated Trump’s stump speech claims “Pants On Fire” a whopping sixteen times. By contrast, after nearly seven years of presidency, it’s given Barack Obama that rating only nine times. Trump lies so often, and so outrageously, that it’s hard to determine whether he even knows what truth and reality are. Most importantly, the more outrageously Trump lies, the higher his poll ratings climb. He has no incentive to make friends with reality.

Sadly, whether from social network users or presidential candidates, common sense is deceptive. The fact that introverts use digital information technology doesn’t mean tech causes introversion; it’s equally or more likely that introverts find tech’s relative quiet and asynchronous communication appealing. And high crime in immigrant neighborhoods doesn’t make immigrants criminals; if we force immigrants into impoverished areas where criminals prey on the populace, some get caught in the crossfire.

Then, when caught with proof that their claims don’t reflect reality, people belittle reality. My ability to cite evidence proves “this generation” is pitiful. If journalists cannot find evidence of Muslims celebrating 9/11 in Jersey City, then journalism is a massive leftist conspiracy. Private behavior reflects public examples. Reality has become an optional appendage to truth, which we comprehend only internally.

Maybe my status as former academic colors my opinions. Maybe my belief that claims require evidence reflects my experience grading papers. But it shouldn’t. When reality places second to feelings, it permits powerful people to ignore issues, peoples, and problems. We all suffer from optional reality.

Addendum: only as I was preparing this essay for publication did I notice something about that final Whisper. In mustering evidence, I quoted two sources, one of them a woman; Rowan mocked me for “quoting one man.” The misogyny in that statement is glaring enough to deserve its own response. Later.

Monday, November 30, 2015

The Brothers Grimm Visit Deepest America

1001 Books To Read Before Your Kindle Battery Dies, Part 62
Diane Wolkstein, The Magic Orange Tree and Other Haitian Folktales


Modern Euro-American scholars like Walter Ong and Marshall McLuhan unabashedly regard written and visual communications as normal in modern society, and oral communications as “vestigial,” in Ong’s terminology. Storytelling, once the chief means of conveying history and public morals (see Maria Tatar), is regarded as a lingering remnant, like your tailbone or your appendix. But what about societies where literacy remains rare? Do such societies not count?

Acclaimed folklorist, storyteller, and one-woman Broadway performer Diane Wolkstein took her tape recorder to rural Haiti in the late 1970s, hoping to catch the sounds of the countryside’s legendary storytellers. Even back then these modern bards were endangered, squeezed by American television and radio. But while electricity remained (and remains) a scarce resource in upland Haiti, these oral storytellers remain an integral part of Haitian community life.

Wolkstein recounts not just the stories themselves, but how she came to collect them. Usually the only white face among the ebon-hued Haitian crowd, she witnessed not only the energetic, theatrical raconteurs themselves, but the ecstatic audience that surrounded them. Brought together by markets, pot-luck dinners, and street dances, the crowds shared a true communal experience. Here the old pre-Gutenberg community ethic doesn’t just survive, it thrives.

Some stories she collected, Wolkstein writes, have clear precedents in printed literature. She notes how some stories are clearly retold from the Brothers Grimm, African folktales, and elsewhere. However, other stories are clearly original to Haiti’s impoverished, war-torn, pre-literate social structure. Our society has grown accustomed to fairy tales as either ancient artifacts, or products of single authors; Wolkstein presents new-to-us stories written by an entire culture.

Diane Wolkstein
The twenty-seven stories Wolkstein collects here reflect a uniquely Haitian perspective. “Papa God and General Death” describes a peasant meeting the two most important forces in Haitian life, religion and mortality. (Wolkstein visited during Baby-Doc Duvalier’s reign.) “I’m Tipingee” features a young heroine proving her resilience in a culture where children, until they’re old enough to work, are mere baggage. An American wouldn’t write such harsh but insightful stories.

And Americans probably couldn’t write stories as transcendent as “Bye-Bye.” An allegory of emigration, it reflects a society whose highest aspiration is to leave everything behind and start over. Yet in many ways, this story feels remarkably familiar to modern Americans. Like apocalyptic End Times superstitions, it contrasts the virtuous few able to leave Earth and fly to Heaven (depicted here as New York), with the struggling many Left Behind.

These stories have definite religious components. “Papa God” is a recurrent, and humanly fallible, character. Spirits, ancestors, and magic permeate these tales. But beyond personal faith, the religion arises from the stories themselves. By bringing people together in mass gatherings, speaking aloud their moral values, and bonding them together against oppressive regimes, these stories embody the bond-building goals Émile Durkheim identified as rudimentary to the human religious experience.

By the author's own admission, these stories weren't necessarily the best-told she encountered while researching folk tales in Haiti. Haitian storytelling relies on voice, gesture, stage presence. The flat page lacks the beauty of the oral tale, and some of these stories may have been a little weak in the telling; but on the page they reveal a great deal about Haiti, and are a fascinating read besides.

Folk tales reveal a great deal about a culture: what it values, how members of the society relate, what their beliefs are. These tales do exactly that. While they aren't as clear-cut, with a defined beginning, middle, and end, as American readers have become accustomed to, they do give away a great deal about Haiti. Life is unfinished; hardship is to be embraced and studied; the spirit world is right here at hand, not a million miles away above the clouds.

I had the privilege of corresponding briefly with Diane Wolkstein before her sudden passing in 2013. Though an inveterate world traveler and seasoned folklorist, Wolkstein admitted Haiti and its stories had remarkable staying power with her. Stories like “The Magic Orange Tree” and “Mother of the Waters” remained staples of her live performance for thirty years. This book remains her best-selling work, for reasons eminently clear in her text.

Even on their own, these stories stand as a monument to the creative act and the power of the human intellect. These stories will infect your head like a virus, spreading and replicating, until you have to pass them on. Read them casually, and you will be enlightened. Study them seriously, and you may be transformed.

Monday, November 23, 2015

Who Is This Wendy Corsi Staub Person Anyway?

Wendy Corsi Staub, The Perfect Stranger and Nine Lives: A Lily Dale Mystery

Months ago, I reviewed Wendy Corsi Staub’s thriller Blood Red, an interesting premise that descended into wordy, over-written excess. I considered Staub an interesting writer who desperately needed an editor, and forgot her. Recently, however, I’ve discovered she’s actually a bestseller who writes with James Patterson-like bounty. So I agreed to reconsider Staub, and accepted two review novels. Now I’m even more confused. Let’s start with The Perfect Stranger.

I knew I was probably facing a difficult slog with this novel when, in Chapter One, one character took three pages to descend a staircase. Not because she was slow-moving, or descending from the highest tower, but because author Staub kept intruding long expository recollections between steps. That set the tone for this entire novel: it’s difficult to make even incremental progress, because even minor actions trigger rambling recollections.

In today’s networked age, local actions have potentially global consequences. Meredith Heywood is the unofficial mother hen of a blogger’s circle, comprised entirely of breast cancer patients and survivors. These women have built a transnational friendship, without bothering to meet, or in some cases to learn one another’s real names. So Meredith’s death prompts their first-ever meeting, at her funeral. Too bad Meredith was actually murdered.

Wendy Corsi Staub
Staub crafts a massive ensemble for a modernized, tech-friendly rendition of a classic Agatha Christie locked-room mystery. At its heart, the women bloggers have deep connections which they’ve shared only online. They believe they know one another intensely, but as scenes pass, it transpires that each keeps deep secrets she doesn’t divulge digitally. One of these friends has mysterious motives. And now they’re all in danger.

But upon this intriguing premise, Staub has layered countless, disjointed internal monologues. Every character has a backstory, expounded interminably, whether they advance the story or not. The cast of thousands each get their own moments, to the detriment of pacing. Single conversations challenge readers’ patience, because between successive exchanges, Staub inserts Proustian recollections, sometimes pages long. The promised mystery never quite begins, because these recollections never quite cease.

I wanted to enjoy this book, but Staub wouldn’t let me. In today’s media-saturated age, authors realistically get about thirty pages to engage readers’ attention in books this long; but well past page 100, Staub still indulges in chugging expository scene-setting. The narrow thematic focus prevents this being a Jane Austen-ish character novel, but Staub’s interminable narration doesn’t let it be a mystery. I tried, but I just got bored.

Though different in premise and character, Perfect Stranger suffered the basic limitations that burdened Blood Red: too much writer, not enough story. Somebody once said that exceptionally prolific writers basically tell the same story time and again. Consider Stephen King. I basically wrote Staub off as a niche author, and prepared to forget her. But she surprised me, and made me reëxamine my prejudices, with her most recent character mystery, Nine Lives.

Newly widowed, unemployed, and evicted, Bella Jordan packs her son Max and whatever she can carry. She intends to crash with her mother-in-law; but car trouble and a needy stray cat divert her to Lily Dale, New York, home of America’s (very real-world) largest table-tapping community and séance resort. A town whose local innkeeper was recently murdered, giving Bella and Max a job, a house, and a mystery to solve.

Staub, who has already written a series of young-adult mysteries set in Lily Dale, now revisits the milieu for adults. Though pitched as a mystery novel, Staub actually offers a charming, low-key character drama whose mystery elements become central only relatively late. She provides readers with her familiar viewpoint character, the youngish wife with burdens, and basically permits Bella to interact with her interesting, tormented setting.

Bella adjusts, first grudgingly, then warmly, to her new surroundings. Max bonds with his cat, makes friends, and demonstrates budding psychic tendencies. Bella becomes an ardent innkeeper, befriending Lily Dale’s eccentric supernatural community and its resulting tourists. But she also glimpses increasing evidence that the prior innkeeper, whose death everyone calls accidental, actually met foul play. (Can psychics get ambushed?) She dons her Miss Marple hat and investigates.

This hardcover original from a usually straight-to-paperback author is undoubtedly the best Staub I’ve read. It suffers her usual weakness, very long expository scenes, but never feels sluggish or overstuffed. She reveals backstory whenever it’s needed, keeps herself (relatively) concise, and simply tells an interesting story. Though this book works better as character drama than noir thriller, it’s nevertheless engaging reading. Now I understand why readers love Staub.

Friday, November 20, 2015

The Moral Failure of Basic Political Rectitude

When the U.S. House of Representatives passed a bill this week, demanding that incoming Syrian war refugees be admitted to America only upon the personal signature of the heads of three—three—security agencies, American politics crossed a line for me. A country that has historically prided itself on its “melting pot” ideology and Ellis Island heritage, has declared itself closed for the global relief business. Unless that business involves dropping bombs.

But the entire country isn’t refusing refugees. The House, which enjoys (if that’s the word) a near-supermajority of Republican control, doesn’t want refugees; President Obama, a Democrat, does. Of the thirty state governors who have refused or restricted refugee access, only one, Maggie Hassan of New Hampshire, is a Democrat. On a county-by-county basis, popular willingness to accept refugees tracks positively with party affiliation.

One party in American politics, and only one, wants to isolate America from the biggest humanitarian crisis since the Holocaust. History will not look kindly upon us for this.

Not that Democrats have proven themselves real forward thinkers here, either. President Obama wants to accept ten thousand refugees from a crisis that has displaced four million people. That’s a drop in the bucket. By contrast, Germany, a country GK Chesterton once called “the advance guard of the Servile State,” has already accepted over 200,000 refugees. While Republicans dither, and Democrats make nodding efforts, Syrians are dying to escape fates worse than death.



A former Republican myself, I quit the party, among other reasons, because I couldn’t reconcile its stated “pro-life” platform with its actual policies. Though aggressively opposed to abortion, the party has staunchly avoided even the most trivial public contributions to prenatal medical care, and has actively tried to kill AFDC, WIC, and Food Stamps for struggling parents. Current Republican presidential candidates want to kill public education. Within the last decade, we fought an unnecessary war estimated to have killed over 600,000 non-combatant civilians.

What the hell kind of “life” are they advocating?

I ask that, already knowing the answer. By torpedoing worker protections, demonizing women’s health access, opposing an increasingly popular healthcare law, and demanding military action against civilian centers without opening doors to the displaced, they don’t advocate life at all. Their anti-abortion stance is essentially a strategic bribe to purchase loyalty from religious conservatives, who would otherwise oppose their agnostic libertarian agenda.

The Republican Party has gotten good at talking the religious game. They’ve shepherded former pastors, like John Danforth and Mike Huckabee, into elected office. They’ve embraced the “War on Christmas” metaphor. They hold hands with tubthumpers who use their authority to selectively deny civil and legal rights. Their rhetoric is perforated with religious terminology. But it’s completely lacking in ethical foundation.

Many Christians today feel uncomfortable with the idea of a benevolent God who will nevertheless judge us for our actions. But it’s right there in the Bible. In the Parable of the Sheep and the Goats, Jesus explicitly separates sinners from the redeemed based on their treatment of “the least of these”: the poor, hungry, naked, widowed, and sick. We’ll be judged by what we do for humanity’s most maltreated, not for avoiding saying “fuck” on television. When religious poseurs use state power to exclude dying refugees, they fail the most basic religious test.

Admittedly, Democrats perform hardly better. Bill Clinton pledged “the end of welfare as we know it” not, as Hillary now claims, as a sop to Republicans during his presidency, but as a campaign pledge during his underdog primary run. Consequently, journalist Matt Taibbi writes, America’s most desperate citizens face more rigorous scrutiny than the banks who detonated America’s economy in 2007 and 2008. President Obama, with his airstrikes on an MSF hospital, recently became the first Nobel Peace Prize winner to bomb another Nobel Peace Prize winner.

Country singer Hank Williams, Jr., is frequently noted for his unreconstructed reactionary values. This has included naked race-baiting, advocating secession, and comparing President Obama to Adolf Hitler. But back in the 1980s, he recorded a song entitled “USA Today,” the refrain of which should remind conservatives, like himself, why we cannot shutter our borders during this massive humanitarian crisis:
It's true we've got our problems, Lord knows we make mistakes
And every time we solve one, ten others take its place
But you won't see those refugees headin' the other way
Welcome to the U.S.A. today
Here’s hoping America’s leaders remember this sentiment when it comes time to open or close our borders to Earth’s neediest peoples.

Wednesday, November 18, 2015

How I Stopped Worrying and Learned To Love Math

1001 Books To Read Before Your Kindle Battery Dies, Part 61
Paul Lockhart, A Mathematician's Lament: How School Cheats Us Out of Our Most Fascinating and Imaginative Art Form


I find myself in the awkward position of arguing against a book I admire, because Paul Lockhart sells himself short in this remarkable book. First, Lockhart thinks he's identified a problem unique to current mathematics education, when he's actually identified schooling's primary flaw. Second, he thinks we shouldn't teach math as "merely useful," when it's profoundly useful, just not in the way math textbook authors commonly think.

Lockhart believes (and I agree) that the current focus on rote memorization, "skillz drillz," and repetitive exercises causes students to falsely believe math is a heap of formulae in a vacuum. He expertly dismantles how math is taught while demonstrating the discipline's true, dynamic nature. To Lockhart, mathematics should foster inquiry and a curious, question-driven mindset that then pursues answers. Math teaches how to face challenges without road maps.

But on page 40, Lockhart asks: "What other subject shuns its primary sources—beautiful works of art by some of the most creative minds in history—in favor of third-rate textbook bastardizations?" Would Lockhart like me to start a list? I could do it alphabetically. Historian James Loewen and literature professor Gerald Graff have voiced the same complaints in their fields. Discouragement of inquiry is endemic throughout education today.

Reading Lockhart's demonstrations of popular mathematical concepts, I was struck by how seemingly complex concepts suddenly appeared both clear and welcoming. I remembered the difference between my undergraduate education, which favored memorization and regurgitation, and graduate school, which encouraged individual discovery and growth. But why must I or anyone wait for grad school before unlocking the truth for ourselves?

A Renaissance woodcut depicting Euclid,
the father of plane geometry
Some of Lockhart's critics say that math should focus on memorized formulae, because knowledge is cumulative, and few students can savvy higher math without a comprehensive foundation. But how many want or need higher math? As a student of mine said, she'll never need to factor polynomials in real life. No, we don't study math for its perceived utility. But that's not to say that math isn't useful.

Lockhart's eloquent, graceful proofs demonstrate a mind which faces questions that lack explicit answers, and then pursues those answers. This process of investigation is necessary in a world where clear-cut options are thin on the ground. Too many graduates, facing adult life, stumble when they find that work and family aren't closed cases. These habits of investigation and inquiry are useful inasmuch as life offers more questions than answers.

In the six years since I first read this book, American education reformers have addressed some of this book’s concerns by widening the selection of learning heuristics children learn. Though old fogies like me have mocked Common Core’s alternate learning patterns for being different than what we learned, I’ve come to appreciate the mindset behind the differing patterns: teaching children diverse ways to approach common problems with systematic rigor.

Sadly, Common Core functionally repeats math education’s underlying flaw, because no matter how many heuristics children learn, they still reduce learning to rote memorization. Lockhart, in this book, concedes that sometimes we must give struggling children correct answers on a silver platter. And arithmetic, often, is best learned by mimicry. But higher math, as Lockhart asserts, is more than a repetitive skill. It’s a true, and underappreciated, art form.

More than that, math is a simple joy. Edward de Bono writes that many people, confronted with new ideas that upset their preconceived notions, respond with laughter. Laughing is our human response to having our eyes opened, our minds widened. I never understood what that meant until I read this book. Lockhart incorporates several exercises from Euclidean geometry to open our thinking to math’s higher influences, and while reading, I repeatedly laughed like a madman.

Math is useful and desirable because it opens doors of thought. Too much of school appears closed to debate, but real life forces us to ask questions that don't have answers. People who aren't equipped to face such questions take on adult roles without their most valuable tools. Whether it's math or science or art or business, we must learn to face questions, weigh possibilities, and seek that "Eureka" moment.

Often, people who would dominate us seek to create the false impression that sophisticated questions can be solved as simply as textbook exercises. Appropriate education, including an introduction to math, should teach us not to plug memorized answers into dynamic, changing questions. Math teaches us that each question opens new answers, and, like all disciplines should, invites us to learn how to ask and investigate for ourselves.