Wednesday, January 29, 2014

California Dreaming

1001 Books To Read Before Your Kindle Battery Dies, Part 29
T. Coraghessan Boyle, The Tortilla Curtain


This novel begins with a very literal collision between America and Mexico. Delaney Mossbacher’s luxury import car hits Cándido Rincón, a desperate illegal immigrant crossing a tree-lined California road. A single accident between a well-heeled white liberal and a scuttling brown laborer locks the pair in an orbit of mutual destruction. Like opposing black holes, they slowly pull each other apart without ever touching.

T. Coraghessan Boyle’s strange 1995 novel, his first following his fame-making The Road to Wellville, challenges the very structure of novel writing and storytelling. Its episodic, picaresque form centers on two characters who completely misunderstand each other, never have a conversation, and act spitefully. Everything could stop if the characters simply spoke to one another, but they can’t, because they don’t share a language.

In the weeks and months following their collision, Cándido’s injuries plunge him into violent illness, compounded by the fact that he needs to work. Sharing a Topanga Canyon squatter camp with his very pregnant wife, recuperation isn’t an option; he works, or they starve. Los Estados Unidos, which Cándido first relied upon to reestablish himself as a husband and a man, quickly becomes a land of shame, emasculation, and hunger.

Meanwhile Delaney, a writer and stay-at-home father, starts seeing echoes of his guilt everywhere. He becomes newly aware of Mexican work exchanges, but also of gang activity and racial animosity. He sees his white neighbors spitting bigotry, but because he rationalized the accident away with money, he can no longer sustain counterarguments. He struggles to maintain his idealism, but quickly joins the bourgeois system he once hated.

Boyle’s authorial symbolism may seem heavy-handed at times: dry, distant Ernest Hemingway he ain’t. Giving Mexican characters names like Cándido and América Rincón, and ensconcing his white characters in the gated community of Arroyo Blanco (“White Ravine”), smacks of allegory approaching medieval morality plays. But this lampshaded symbolism lets Boyle skip past what other authors would spend thirty chapters implying, straight to his story’s beating heart.

This isn’t a novel about how Norteamericanos treat Mexicans. This isn’t a novel about race or economics or how heartless white people are to brown people. Boyle writes, instead, about two characters so wholly wrapped in themselves, they cannot comprehend anything different. Delaney envies Cándido’s unpolished authenticity, while Cándido aspires to Delaney’s settled comfort. But every time they meet, they manage to widen the gulf between themselves.

Delaney and his wife Kyra moved into Arroyo Blanco to live near nature while remaining close to Los Angeles. Their Spanish Colonial neighborhood encourages leafy trees and faux Mexican chic. Their urban romanticism, however, leaves little room for actual nature; a coyote kills their dog on their manicured lawn. The Arroyo Blanco tenants’ committee erects first a chain-link fence, then a wall, to exclude anything Spanish-speaking or authentically natural.

Cándido will take any job, however demeaning, to achieve his goal of a simple studio apartment. He wants to give his wife and unborn child the gift of four walls. But his injuries age him, and América must take dirty, dangerous jobs that jeopardize her looks and her pregnancy. As squatters, they have few rights when first white teenagers, then Mexican gangs, attack and brutalize their camp. Whenever they think they’ve hit bottom, they discover how far they still have to fall.

Throughout their shifting struggles, Cándido and Delaney remain unable to speak with anybody who doesn’t essentially resemble themselves. Thus they grow entrenched in their attitudes and small in their horizons. Delaney establishes a fortress mentality, striving to exclude anything exotic, and lapses into knee-jerk racism. Cándido, likewise, becomes paralyzed by fear, unable to defend himself or his family. Both worlds become intolerably small.

In interviews, Boyle has stated that he doesn’t know what he thinks about a subject until he writes about it. This entire novel has the atmosphere of an experiment: what tortures can we inflict on characters until they break? As Cándido and Delaney repeatedly draw lines in the sand, and Boyle repeatedly forces them to retreat, both characters insist there are things they’d never permit themselves to become. Until they do.

Boyle doesn’t purport, in this novel, to solve America’s border tensions. Indeed, situations have become vastly more tense and complicated in the two decades since this book debuted. But by simply letting two characters be themselves in an atmosphere that permits no compromise, and no communication, he forces us to hold a mirror to ourselves. This isn’t a novel about the border; it’s finally a novel about us.

Monday, January 27, 2014

In the Land of Invisible Morals

Lee Van Ham, Blinded by Progress: Breaking Out of the Illusion That Holds Us

Informed Americans know we’re using Earth’s resources faster than Nature can replenish them. We burn carbon, squander water, generate waste, and denude land at rates unprecedented in natural history. We know we’re doing it, but feel powerless to stop, and don’t know why. Philosopher Lee Van Ham suggests we’re beholden to an ethical edifice we can’t even see.

Van Ham proposes two competing philosophies: MultiEarth thinking, which consumes resources and people like we’ll never run out, and OneEarth thinking, which endeavors to live in harmony with Earth, human nature, and ecology. We cannot live right, he asserts, until we live aware of our moral shackles. We must shatter the illusions concealing our morality from ourselves.

Readers familiar with Bill McKibben, Wendell Berry, or Julia Butterfly Hill will recognize Van Ham’s themes. But they won’t recognize his reasoning, not superficially, anyway. Though much environmentalist and anti-plutocratic writing has an innate spiritual component, Van Ham applies intensive exegetical considerations to the topic, reflective of his prior ministerial career. He particularly finds, in the story of Cain and Abel, a parable of modern society.

Abel, Earth’s roaming steward, and Cain, Earth’s settled owner, could never have lived peaceably. The relationship between those who follow Earth’s ever-changing movements, and those who try to shackle Earth to their whims, will inevitably turn violent, as they forever cross purposes. Van Ham sees a latent anti-urban allegory here, much as Jacques Ellul did, and his exposition of two ultimately incompatible systems permeates his book.

This moral vision differentiates Van Ham from the numerous voices already propounding similar messages. While Wendell Berry, for instance, shares Van Ham’s faith, Berry frequently avoids current events, focusing on transcendent, almost mystical themes. McKibben, though a professed Christian, prefers scientific arguments, using spirituality sparingly. Van Ham’s moral catholicity claims the broad middle ground between these visions, the domain where most Americans live, but where environmentalists fear to tread.

Van Ham spends the largest part of his book discussing what he terms the “Five Big MultiEarth Practices Holding Us In Illusion.” These practices correspond with important issues I’ve noticed, but haven’t yet voiced as clearly, especially “Giving Primary Religious Devotion to Economics” and “Disguising Corporatocracy as Democracy.” Van Ham’s breakdown alternates between the shock of familiarity and deep, suppressed detail.

Economics’ religious structure, which most capitalists would probably deny, becomes obvious when considering the rituals attendant to, say, Monday NYSE openings. But Van Ham explores subterranean corners of modern economic practice, demonstrating how business insider liturgies and CNBC hymnody conceal a deeper moral landscape, one most Americans never see, but inevitably share. His discoveries, as current as the morning news, are frequently chilling.

Now, many writers publish many books explicating how society rationalizes damaging humans and the Earth, while mortgaging our own future, for wasteful short-term gains. Van Ham distinguishes his book by mixing objective fact with personal writing. His book isn’t just science or morality; it’s also a memoir of his own struggles with eco-unfriendly living, and of how he transitioned from short-term profligacy to mindful living.

Much as I appreciate Van Ham’s premises, and mostly support his arguments, his exegesis remains frustratingly one-sided. In discussing MultiEarth philosophy, he characterizes it entirely in his own terms, not terms his opponents would comprehend. Consider this early characterization of his MultiEarth frenemies: “Human species strives for lifestyles that use more resources than available on one planet.”

Does anybody really strive for that? Or do people enamored of earthly wealth simply believe Earth’s resources so vast that we cannot possibly deplete them? When approaching his opposite numbers, Van Ham might consider attempting what rhetorician Gerald Graff calls “the believing game”—attempting to state counterarguments in terms true believers would accept. Because right now, MultiEarth adherents could accuse him of Straw Man arguments, and dismiss him.

Therefore, I recommend this book primarily for people who essentially already support Van Ham’s central thesis. His reasoning will give us tools for debating technocratic zealots, and allows us to bolster our own beliefs with reason and facts. Once we’ve persuaded others to take our positions seriously, and only then, let’s push copies of this book into Old Order followers’ hands.

Van Ham describes this as the first of a trilogy. In this volume, he primarily establishes the moral foundation of MultiEarth and OneEarth philosophies; he promises in future volumes to address actual plans to fix the socioeconomic Frankenstein we’ve created. If he maintains his personal, moral, and dryly humorous tone, I’ll anticipate those coming volumes with giddy fanboy hope.

Friday, January 24, 2014

The Omnivert's Dilemma

One of my most confounding teenage moments happened when I made a church youth leader cry. I didn’t mean to make her cry. She’d simply chosen, as her leadership mission, the goal of making me extroverted. When I didn’t choose to participate in every moment of spontaneous group silliness, like singing madrigal versions of Top-40 songs or re-enacting scenes from Pauly Shore movies, she took it as a judgment on her leadership.

It didn’t help that this “adult” leader only had about seven years on her oldest youth charge. But her entire leadership model hinged on keeping everybody active, vocal, and engaged in everything, no matter how flippant or how far outside particular group members’ wheelhouse. The fact that I might dislike small talk, focus on actually exchanging or testing ideas, or prefer to converse in my “indoor voice,” never penetrated this woman’s leadership paradigm.

I recalled that woman when reading PR professional Brigitte Lyons’ essay 5 Myths About Extroverts That Need To Die. I remembered that woman because the “myths” Lyons cites simply didn’t exist fifteen years ago, before the read-write Web 2.0 created an environment where introverts could communicate. And I assert they don’t really exist now, outside Lyons’ head, and perhaps a few other people who feel aggrieved because extroverts no longer monopolize social etiquette.

When I finished high school, the terms “extrovert” and “introvert” chiefly belonged to mental health professionals and scholarly researchers. In ordinary discourse, we said “outgoing” and “shy,” words that contain moral judgments beyond their dictionary definitions. The schools I attended clearly considered it their mission to break shyness habits, forcing students to spend time milling in noisy, echoing common areas which, considering the large urban schools I attended, often devolved into zoo-like conditions.

(Don’t get me started on what that means for education. My opinions are already well documented.)

This attitude wasn’t exclusive to schools. As Anneli Rufus writes in her book Party of One: The Loners' Manifesto—one of the first on its topic—mass media openly depicted people who enjoyed their own company as masturbating perverts. The medical profession treated introversion as mental illness. Corporate recruiters believed introverts were stupid. And military culture treated (and, to a lesser degree, still treats) introversion as a form of cowardice.

As advancing technology made the Internet a two-way conversation in the late 1990s and into the New Millennium, however, natural introverts found a venue where they flourished. Because introverts prefer to ruminate before they speak, the Web’s asynchronous conversations let us look quick-witted and debonair. We prefer to avoid even accidentally looking ignorant, and Google lets us fact-check our assertions before voicing them, so we don’t shoot our mouths off. And the physical distance lets us speak deeply without the overstimulation of public conversation venues, like bars or city streets.

In this new milieu, people who prefer to pause before speaking suddenly had the upper hand. In offline communications, rules of etiquette have always favored extroverts. Emily Post openly opposed “air in the conversation,” requiring people to keep speaking without regard for what they had to say. And though we teach children to take turns, introverts quickly learn that, in open, unmoderated conversation, primacy goes to whoever speaks first, regardless of whether they actually have anything to contribute.

While introverts quickly found themselves running online conversations, new popular awareness of mental health was changing how we considered ourselves. The Myers-Briggs Type Indicator (MBTI), first published in 1947, remained a medical oddity until around 2003, when newly unified introverts online discovered that this approach to psychology proved they weren’t abnormal. (Yes, I actually had an authority figure, a school counselor, tell me, pre-Internet, that preference for solitude was “abnormal.”)

When Brigitte Lyons complains about feeling “hurt,” “burned,” and “at war” with introverts, she’s discussing pop concepts newer than a chihuahua's lifespan. Just in the two decades since I attended high school, I remember classmates who got good grades because they dominated discussions, even when they were flat damn wrong; co-workers who got promotions for which they were unprepared because they’d befriended the managers; and politicians whose skillful gladhanding got them elected to offices for which they proved supremely unqualified. Non-digital life still caters to extroverts.

Online, however, introverts flip the script. And not surprisingly: the Web rewards people who pause before they speak, behavior that realtime conversation punishes. I concede, a few introverts have claimed introversion as an identity issue, engaging in harmful in-group behavior. But most introverts don’t want to keep company with those people. In the main, we’ve simply found an atmosphere that honors the way introverts prefer to communicate.

The myths that Lyons purports to demolish sound weird to me. “Extroverts don’t have feelings,” she refutes. “Extroverts aren’t introspective.” “Extroverts are heartless bastards.” I never said any of those things. Importantly, the three articles Lyons links and claims to rebut don’t say those things, either. They simply assert that another way of thinking exists, one which contradicts established modes of etiquette. Considering that the established etiquette was mainly written by extroverts, the other way will naturally incline toward introversion.

We live in a society where one word, “dumb,” connotes two meanings: stupid and unable (or unwilling) to speak. Within living memory, psychologists, educators, and clergy considered introversion an illness that needed curing. That youth leader put me off church so badly that, religious as I am, I still struggle with volunteerism and activities. An acknowledged leader, backed by the administration, told me I had serious personality flaws because I’d rather listen than speak. I can’t shake that like water off a duck.

I don’t kid about these things. A therapist once put me on psychotropic medications that caused severe side effects, impeding my work life and (I contend) pausing my social and educational development for years, because my “social anxiety” was a medical illness. The DSM-IV, the standard diagnostic manual of mental health professionals, sets the bar for “Social Anxiety Disorder” so low that patients, especially children, can still be clinically diagnosed for getting the jitters in crowded rooms.

Again, I’d be disingenuous if I pretended a minority of introverts haven’t engaged in petty feuds and other nasty behavior. But no opinion, philosophy, or disposition should ever be defined by its most unpleasant outliers. Most introverts, especially those of us who grew up before the Internet, simply want to ratify that our preference for quiet, occasional solitude, and fewer, more intimate friendships is normal. Because for years extroverts told us we were maladjusted, cold, or deranged.

Introverts, like extroverts, simply want our earned place at society’s table. And if Lyons, and others like her, feel attacked, that says more about them than it does about introvert subculture.

Wednesday, January 22, 2014

Neuroscience and the New Human Nature

Stefan Klein, Survival of the Nicest: How Altruism Made Us Human and Why It Pays to Get Along

Conventional economics and biology sneer at generosity. People who give of themselves get fleeced regularly, and diminish their chances to survive and reproduce. Yet the world ordinary people occupy practically shimmers with mundane kindness; without it, we couldn’t conduct urban economies or do industrial jobs. But what is altruism, really? How did it originate, and what encourages it to flourish? After centuries of debate, science may have some answers.

Though trained as a physicist, German author Stefan Klein has made a career writing about the points where science impacts general society—a Teutonic Carl Sagan, perhaps. In this case, his interests throw sharp light on neuroscience and psychology, though his investigations overlap with the growing domain of behavioral economics. And he draws a surprising conclusion: more than language or technology, generosity and altruism make us genuinely human.

Because humans cannot survive individually, much less thrive, evolution rewarded primordial humans who engaged in quid pro quo generosity. This persists today in acts of trust, kindness, and public ethics that make modern society possible. Yet the very conditions altruism fosters discourage people from behaving generously. Neuroscience tells us that human society may soon shift, requiring altruism to ensure continued human survival.

Science and natural philosophy have long speculated on the motivations behind altruism, speculations Klein partly recapitulates here. From Aristotelian ethics to Richard Dawkins’ “Selfish Gene” theory, speculations have drawn on complex mixes of influences. But only recently has science reached the point where we can objectively analyze not only what people really do, but what mental and biological processes fuel these actions. Though highly complex, Klein explains the science eloquently.

Experimental techniques with Clint Eastwood titles, like The Prisoner’s Dilemma, The Free Rider Game, and Ultimatum, allow researchers to observe human generosity (and stinginess) in action. Though many of these games have existed for decades, new neuroimaging technologies permit glimpses inside human brains, in real time, as people make key decisions and formulate ethical precepts. The results, as Klein describes them, are anything but obvious.

During such interactions, it turns out, we have opportunities to establish norms that make future dealings possible. The way we play the Free Rider Game, for instance, allows groups to agree on ethics, not just of generosity, but of how to penalize goldbrickers. This holds true across not only groups as small as two or three, but across entire societies. Democracy and capitalism absolutely rely upon neurological habits we acquire during such simple trust exercises.
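
For readers curious how these games work mechanically, here’s a minimal sketch of one public-goods (“Free Rider”) round with optional peer punishment. The payoff numbers (endowment, multiplier, fee, fine) are my own illustrative assumptions, not figures from Klein’s book:

```python
def public_goods_round(contributions, endowment=20, multiplier=1.6):
    """Each player contributes part of an endowment to a common pot;
    the pot is multiplied and split evenly, so free riders profit."""
    pot = sum(contributions) * multiplier
    share = pot / len(contributions)
    return [endowment - c + share for c in contributions]

def punish(payoffs, punisher, target, fee=1, fine=3):
    """Peer punishment: a player pays a fee to fine a free rider."""
    payoffs = payoffs[:]
    payoffs[punisher] -= fee
    payoffs[target] -= fine
    return payoffs

# Three cooperators and one free rider: the free rider earns most...
payoffs = public_goods_round([20, 20, 20, 0])
print(payoffs)  # [24.0, 24.0, 24.0, 44.0]

# ...until peers pay, out of pocket, to sanction him.
print(punish(payoffs, punisher=0, target=3))  # [23.0, 24.0, 24.0, 41.0]
```

Free riding wins any single round; only the group’s willingness to pay for punishment makes cooperation stable, which is exactly the norm-setting Klein describes.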

Strangely, the approaches groups traditionally use to encourage altruism and punish greed don’t really work. Klein shows how verbal praise and public recognition make much better motivators for good behavior than money or possessions. Paying people to do right actually counteracts meaningful gains. And punishments that originate “on high” have far less impact on bad behavior than mass peer pressure or censure from one’s equals. Praise and scolding: who knew?

Klein’s model refutes both common liberal and conservative lines about social organization. Top-down authoritarian government produces not honesty, but resistance, while libertarian ideals encourage mass defection from the social order. Effective codes of honesty instead percolate upward from the masses, and the people do a much better job punishing infringements than either the state or the private sector. Serious readings of Klein’s conclusions will force sweeping reconsiderations of contemporary authority.

Indeed, Klein’s conclusions could jeopardize how we constitute nations, enforce borders, punish criminals, and conduct international relations. Why, he asks, do the citizens of Basel, Switzerland, pay their taxes eagerly, while citizens of Geneva cheat regularly? The answer should force public servants to reconsider how we persuade citizens of all nations to follow the law. Turns out, most people probably learned the wrong lessons from the Cold War.

Though Klein avoids judgmental terms like “totalitarianism,” the implication lurks beneath his measured statements: absolute power undermines basic social cohesion. Only when people feel free to leave social arrangements, and only when they feel they have some voice in how society governs them, will citizens wholly commit themselves to improving the general commonweal. Klein sidesteps such sweeping conclusions, but they remain implicit throughout.

Stefan Klein’s book runs short, but is packed with the kind of dense, surprising information we expect from Malcolm Gladwell or Charles Duhigg. He’s informative, helpful, lucid, and frequently funny. I defy any reader, regardless of preformed positions on human nature, to read this book without reconsidering what they think they think. Because if Klein’s sources are right, human nature will soon have a new model.

Monday, January 20, 2014

The Early Technologies of Violence

Charles Todd, Hunting Shadows: An Inspector Ian Rutledge Mystery

One humid September day in 1920, a gunman climbs the cathedral steeple in secluded Ely, England, armed with an Enfield rifle and a newfangled German telescopic sight. This rudimentary sniper technology lets him assassinate a corrupt Army captain amid a huge crowd and escape unseen. When he shoots a local barrister two weeks later, local police contact Scotland Yard’s Inspector Ian Rutledge to solve a case beyond their limited skills.

American author Charles Todd, actually a mother-and-son team, has made a career writing intense, atmospheric British mysteries with a muscular American style. This, Todd’s sixteenth Inspector Ian Rutledge novel, continues their blend of meticulous anglophilia and bold procedural mythmaking. They create a breathtaking landscape of British grandeur and stiff-upper-lip secrecy. Sadly, they don’t create character nearly as well here as they have previously.

Rutledge must investigate two murders in Britain’s Fen Country, several miles apart in a region where neighboring towns literally don’t talk to each other. The victims share no visible connection, except their mode of death. In a milieu lacking CSI technology, much forensic precedent, or even reliable telephone communication, Rutledge can only trust his wits. This becomes even more true when neighbors hate the first victim enough to protect his killer.

Charles Todd’s novels attract comparisons to Agatha Christie, specious comparisons that only make sense because, um, Britain. But this isn’t a cerebral puzzle, where the hero must unpack clues, Encyclopedia Brown-style. Todd’s storytelling more resembles Raymond Chandler and Dashiell Hammett than Agatha Christie or Arthur Conan Doyle. Inspector Rutledge doesn’t play intellectual games; he hunts murderers, who have the means to kill and the will to live.

But Todd’s novels also turn on Inspector Rutledge’s lingering struggles with postwar trauma. A Western Front veteran, Rutledge returned to a Britain where “shell shock” was regarded as a form of cowardice. He must resume his responsibilities while squelching his lingering wartime flashbacks. That would be easier if Hamish MacLeod, Rutledge’s sergeant whom he shot for mutiny, didn’t follow him everywhere as a disembodied Jiminy Cricket.

This is my third Inspector Rutledge novel, and I’ve long enjoyed this struggle. Rutledge must uncover truths about others while concealing truths about himself that could scuttle his career. That makes it more disappointing when, of the Rutledge novels I’ve read, this one invests the least in Rutledge’s ongoing backstory. Todd reduces his personal history to the occasional intrusive paragraph, while Hamish merely pipes up periodically to scold Inspector Rutledge.

Not that Todd doesn’t pitch a remarkable novel. His/their attention to detail recreates a Britain that only existed very briefly. The Victorian English resolve murdered during World War I leaves a country obsessed with appearances, maintaining postures of prewar dignity while untreated wounds fester under the surface. This is the England of Ford Madox Ford and Downton Abbey, and it would die screaming two decades later during the Blitz.

And Todd’s procedural mystery packs an aggressive punch that only velvet-gloved gentility could produce. Rutledge must ask questions that good, mannerly British people shouldn’t ask, much less answer. The Fen Country’s near-feudal culture doesn’t appreciate some London dandy prodding community secrets. While the outside world allowed the War to change it, the Fens endured. Rutledge represents the objective, scientific Twentieth Century intruding on venerable Jacobean stateliness.

No, Todd certainly creates a good, freestanding mystery novel. But many authors do that. I anticipate new Charles Todd novels because he/they usually incorporate multiple layers. The War depleted Scotland Yard’s aristocratic leadership, permitting territorial feuds as acrimonious as Highland clan wars. Rutledge’s prewar friends intrude on his postwar recovery. Rutledge must balance his fraught personal life against his lawkeeping career, often at great personal cost.

This novel just lacks that. It exists on one level, allowing occasional, teasing interruptions from Rutledge’s backstory, which are never insistent or durable enough to influence the larger novel. Instead, Rutledge and his trusty motorcar crisscross the Fens, interviewing sources and reconstructing the events precipitating the murders. Todd creates a cast of thousands; keep notes on the endpaper. This results in a long, talky novel, varying little in tone or pace.

Established Charles Todd fans may enjoy this novel within the Rutledge canon. But I can’t recommend this for casual readers or Rutledge newbs. Todd creates a pleasant but ordinary period mystery, just one among many the publishing houses mass-produce anymore. Audiences with catholic tastes will find this book sadly one-note and common. Considering his/their long, prestigious track record, this unremarkable novel just isn’t up to Charles Todd’s usual standards.

Wednesday, January 15, 2014

Chris Christie and the Manhattan Oracles

Can we please start rationing the time cable news can allot to New Jersey governor Chris Christie’s “bridge scandal”? While MSNBC pundits sputter out accusations about Christie’s evident inconsistencies, Fox News cognoscenti draw tenuous parallels to Benghazi, Operation Fast and Furious, and the Healthcare.gov rollout. We who strive to follow world events, meanwhile, wonder: what happened to, well, everything else?

For those fortunate enough to remain unaware, back in September 2013, somebody in New Jersey government closed three of the four access lanes connecting Fort Lee, NJ, to the George Washington Bridge. Many of Fort Lee’s approximately 50,000 residents work in Manhattan, and closing these lanes mired the entire city in day-long traffic jams. School buses couldn’t transport kids on the first day of school. Ambulances couldn’t reach the dying.

Broadcast nightly news attempts to keep this story in its appropriate context. America’s few remaining newspapers have treated this story in detail while also addressing other developing news. And internet newshounds can simply choose to exclude Christie stories if they wish. But the preponderance of Americans today get their news from cable, which they often drop into for bite-sized nuggets, seldom staying for full coverage.

Right now, that means extra-strength doses of Chris Christie, whose unusually long press conferences and public appearances get combed like holy writ. MSNBC’s Rachel Maddow has expressed such umbrage that, on Monday and Tuesday, she dedicated nearly her entire show to “Bridgegate,” sparing three minutes per night for the chemical spill which rendered 300,000 West Virginians’ tap water unsafe for human consumption, and now jeopardizes metro Cincinnati.

(Late edit: on Tuesday, Maddow, a longtime LGBT spokespundit, also dedicated two minutes to the court ruling vacating Oklahoma’s gay marriage ban. That’s forty-three minutes of Chris Christie, twelve minutes for commercials, and five minutes for the remainder of world events.)

Meanwhile, Fox News stalwart Steve Doocy has spent morning after morning correlating Governor Christie’s reported mismanagement with various conspiracy theories against President Obama. Doocy, and many other Fox contributors, concede the heart of Christie’s opponents’ accusations, but assert that his transgressions are less awful than President Obama’s. Essentially, they declare some malfeasance acceptable in realpolitik.

CNN, with its history of trying to avoid taking sides, remains the only American news network to keep this story in perspective. They remain the only network dedicating significant airtime to the Afghanistan war, conflicting complaints about John Kerry out of Jerusalem, and the ex-cop who shot another man in a Florida theatre. After comparing CNN’s coverage to that emerging from everywhere else, I think I’ve found the link.

CNN is based in Atlanta, Georgia. Every other network’s home news bureau lives in Manhattan.

All three broadcast networks with nightly newscasts originate their broadcasts from Manhattan. The same holds for MSNBC and Fox, plus partisan commentators like Rush Limbaugh and Jon Stewart. They probably had staffers caught in the Fort Lee traffic purgatory. While these networks have bureaus in other cities and countries, the anchors, editors, and senior production staff work in Manhattan. Many probably cross the George Washington Bridge every single workday.

Even seeming opponents work so close together, they share important geographical presumptions. MSNBC’s Lawrence O’Donnell and Fox’s Bill O’Reilly excoriate each other on air regularly, but their offices are so close together, they could lean out their windows and converse via bullhorn. Thus, seemingly opposite opinion vendors actually share more prior assumptions than their viewers in Minneapolis or Phoenix.

This has caused an unacceptable narrowing of discussion regarding world affairs. That which impacts high-rise news editors personally merits lengthy analysis on valuable airtime. The New Jersey bridge folderol, which appears important but regional to viewers in the provinces, assumes world-crushing proportions when viewed through Manhattan blinders. Because Manhattan news studios live with it directly, in ways we rustics don’t.

Not that we should jettison the issue. Governor Christie’s well-publicized Presidential ambitions made this story a legitimate national concern. Either Governor Christie personally ordered multiple acts of political pettiness, including “Bridgegate,” or he exercises so little oversight of his staff that individual appointees run their offices like feudal domains. We deserve information, because this smacks uncomfortably of Irangate-style political mismanagement.

But we deserve information distributed in ways that reflect the networks’ national audiences. Gothamite provincialism is downright legendary, and the concentration of “national” news networks’ editorial offices in one city propagates that attitude. TV analysts wonder why citizens tune out news, and remain ignorant of current affairs. Perhaps it’s because the Manhattanite gatekeepers remain ignorant of their audience.

Monday, January 13, 2014

The "Everything's Okay" Gospel

Roy Page (with Sarah Horton), A Letter to Evan: An Average Dad's Journey of Discovery and Discernment Through Divorce

Oklahoma advertising executive Roy Page’s life shattered one year: Alzheimer’s took his father, his promising athlete son required invasive surgery that sidelined him from key games, and his marriage collapsed. Page’s sixteen-year-old son Evan got lost amid the catastrophes. So Page wrote Evan a letter, trying to bolster their relationship while resolving his own foundering life. That letter grew into this sadly self-serving Christian memoir.

I wanted to like this book. Authors have penned many thousands of pages about divorce and its family impacts. But most focus on small children; nearly-grown youths get short shrift. Page could’ve closed a glaring gap in this field, if he’d opened himself to his own shortcomings. However, he squanders the opportunity, spinning a mix of personal anecdotes, capped by gnomic morals that exonerate him and trivialize what’s really happening.

Page essentially fails the Dave Test. Reverend Frederick Schmidt invented the Dave Test when his accepted seminarian bromides didn’t comfort his brother Dave through terminal brain cancer. Schmidt submits all Christian axioms to ten questions; Page fumbles, by my count, eight. These include, but aren’t limited to: “Can I avoid using stained-glass language?” “Can I give up my broken gods?” And most fundamentally: “Can I say, ‘Life sucks’?”

No, Page cannot say that. He cannot just be there with Evan, hurting. Instead, he performs appalling verbal gymnastics to justify why his divorce was inevitable, but not his fault; why prolonged physical absence doesn’t make him less present; why his time-consuming business demands were acceptable from a Christian man with a family. In Page’s telling, only vague abstractions are ever his fault. Cruddy circumstances just happen to him.

After dribbling out details for chapters and chapters, Page finally divulges the narrative of his divorce around the two-thirds mark. It’s the familiar story from his economic bracket: he ran the business, she ran the house, and their worlds scarcely intersected. Eventually, separate lives, lived at cross-purposes, ended their relationship long before courts vacated their marriage. They became two strangers, linked by their house and kids.

Except Page’s telling stays really, really abstract. In ten pages, he never says anything more specific than “The more passive I became, the more resentful she was of the burden of responsibility she carried.” Many marriages survive passivity and resentment. Why not his? People who know Page have posted counternarratives online, which aren’t mine to repeat. Briefly, people use vague, noun-free sentences to deflect blame and remorse.

This tone permeates Page’s entire memoir. Life’s blessings, like a successful business, resourceful kids, and community standing, Page treats in detail, praising God and his own ingenuity. Setbacks, like his twenty-year marriage’s collapse, remain fuzzy and accidental. Page tries to model his fathering skills on God’s Fatherhood, but in ways that don’t require him to actually be physically present for his wife and children.

Notably, while Page extols churchgoing and Christianity as family ethics, I counted only two Scriptural citations in a 200-page book. He quotes Jack Canfield, Babe Ruth, and Bill “Falafel” O’Reilly far more than God’s Word. For this white, upper-middle-class professional, Christianity is about what you get, not what you give. Page compares himself to Job, but only for the earthly goods Job lost and regained, not the pained debate.

Jack Canfield and BillO become what Schmidt calls “broken gods,” those man-made placebos that fail to sustain in dark hours. Instead of purging accumulated artificial reassurances, Page doubles down on the status quo, reassuring himself, and Evan, that everything’s okay, God’s in control, and weekends spent hunting and fishing make us family. Given the chance to make a new life with his son, Page opts to retrench his existing habits.

Instead of calling believers to new life, Page’s gospel ratifies this life, and this world’s standards. It requires no taking stock, no repentance, no change. It lets Page continue his jet-fueled former life unhindered, provided he darkens the church door often and uses Christian lingo. For Page, Christianity is a shield, an appurtenance he uses when need arises, when he, like everybody, clearly needs complete resurrection from this world’s ways.

Roy Page strikes me as a man who believes sincerely without examining deeply. He hasn’t purged worldly influences from his moral framework, and therefore justifies himself without needing more than cosmetic changes. Page and Reverend Schmidt could profitably have long, deep conversations, because Page is still a work in progress. He just doesn’t realize his own need, because his worldly privilege has shielded him from life’s harsher buffets. For now.

Friday, January 10, 2014

Remember, the Enemy's Gate is Down

Kevin S. Decker (editor), Ender's Game and Philosophy: The Logic Gate is Down

Nearly thirty years on, Orson Scott Card’s Ender’s Game remains not only the author’s most read, most influential book, but a powerful outsider cultural critique. It has achieved crossover success, and often gets read by academics, public policy makers, and general audiences that wouldn’t normally touch science fiction. It has a dedicated intergenerational readership, and organized opposition, probably exceeded only by the Bible and Harry Potter.

And no wonder: despite space opera flourishes, “the Enderverse” touches common human experiences that transcend time, genre, or audience background. Diverse readers see themselves in Card’s tale of a genius, disgusted with his own prowess, molded by a state which exploits his wrathful tendencies. But until Kevin S. Decker collected eighteen new essays from varying disciplines, all centered on Ender’s tale, I’d never realized its place in Western philosophical tradition.

The assembled authors muster a tremendous array of philosophical insight on Ender’s struggle, mixing ancient and modern philosophy, plus domains including governance, mathematics, computer science, and military ethics. Some authors draw on familiar sources, from Aristotle, Aquinas, and Sun Tzu, to Michel Foucault and Hannah Arendt. Other sources don’t ring immediate bells: Friedrich Schelling and G.E.M. Anscombe aren’t household names, but offer remarkably valuable insights.

Card’s story maintains its popularity, and avoids ordinary science fiction obsolescence, by embracing ambiguity. This is remarkable in a frequently sententious genre. Despite science fiction’s general agnosticism, authors like Asimov, Heinlein, and Octavia Butler eagerly spotlight their culminating morals. Card, though he clearly cues our sympathy for Ender, doesn’t flinch from showing how Ender’s propensity for violence and self-delusion makes him a prime Battle School candidate.

Decker’s peanut gallery utilizes this ambiguity to explore topics like war, education, competitive behaviors, and the composition and governance of modern and postmodern societies. Topics Card addresses momentarily, or which vanish into the background of his highly complex narrative, get detailed treatment by serious scholars. Though fans have long loved Card’s Enderverse for its complexity, they’ll surely share my joy at discovering how specific that complexity really is.

Some authors, like Danielle Wylie and Kenneth Wayne Sayles III, use Ender’s story to explicate important concepts in ancient or current philosophy. Others, like Jeremy Proulx and Matthew Brophy, use philosophy to shed new, deeper light on Ender and his struggles. This open, rolling dialog allows credentialed scholars in difficult disciplines to communicate plainly with general audiences. It also lets academic philosophers espouse the uncertainty frequently reserved only for artists.

Importantly, these authors don’t necessarily agree. Kody Cooper, for instance, cites Aquinas and Augustine to justify Ender’s violence. But James Cook, of the US Air Force Academy, draws the opposite conclusions from the same sources, while condemning Battle School for not teaching even rudimentary military ethics. (Some critics, primarily online, find disconcerting Hitler parallels in Ender’s story. Though some of these scholars cite these claims, none embrace them. Godwin’s Law applies in print, too.)

These disagreements make for some of the best reading in an already surprisingly great book. Legitimate scholars, mustering robust support, debate topics like how responsible we can hold Ender for his actions, or whether governments can legitimately manipulate their citizens. Some of Decker’s scholars would exonerate Ender altogether; others suggest he’s self-deluding and culpable. Though all of these authors “like” Card’s novel, they disagree vigorously on what liking a dystopia means.

Be warned: these authors, with their intellectual debates and philosophical hermeneutics, are academics. These eighteen essays, averaging around twelve pages apiece, are serious scholarship, not Ender’s True Hollywood Story. Don’t undertake this book unthinkingly, or mistake it for light beach reading. Expect authors to challenge, threaten, and overwhelm you. Expect to learn by struggling with hard concepts. Expect, frankly, a sit-down version of Battle School.

Though publisher Wiley Blackwell released this book to coincide with the new Ender’s Game movie, it shipped before the movie debuted. These scholars address only the literature, which they address in great depth. They assume you’ve already read the book and recognize flip references to Bonzo, Bean, and Eros. I admit, it’s been a while, and I needed to cross-check my paperback occasionally. Decker’s authors write for Ender fans, not newbs.

Ender’s dedicated audience won’t be surprised to discover how much intellectual intensity Card packed into his book. We’ve loved it for that reason for over a generation. But this book lets fans attach names to concepts, explore ramifications in greater depth, and situate it in our larger cultural tradition. Decker’s authors won’t make you enjoy Ender’s Game; they’ll show you what loving this timeless classic entails.

Wednesday, January 8, 2014

Edward Snowden, Pavlov's Dogs, and the New Panopticon

Willey Reveley's schematic of Bentham's proposed Panopticon.
Edward Snowden threatened on Monday to bring forward new documents embarrassing to American officials, managing to revitalize his standing in domestic headlines. If you’re anything like me, you probably hadn’t realized, until paid pundits began parsing Snowden’s threat from various angles, that his spine-chilling NSA accusations had fallen off the radar. Today’s 24-hour news cycle makes it appallingly easy to forget even urgent, powerful stories.

Amid pseudo-news, celebrity gossip, and shoddy opinion merchantry, it’s easy to avoid acknowledging our own role in forgetting important news. Yet Americans’ willingness to accept terrifying surveillance conditions reflects a Pavlovian internalization of outside stimuli. Governments, private corporations, and even nosy individuals spy on citizens daily, and we accept this. But I’ve come to suspect we face more than blithe disregard. Americans have unwittingly come to accept a prison mentality.

Public surveillance techniques surrounding Americans daily have become legion, and remarkably invisible. In 2010, newshounds felt shocked when a Pennsylvania school district admitted using webcams to spy on students. Yet most current laptops include an integrated webcam, which we can only block by putting electrical tape over the lens. Smartphones contain GPS technology which we can only disable by removing the battery—essentially opting out of modern digital networks.

English philosopher Jeremy Bentham, in 1791, proposed a new prison, the Panopticon, from the Greek for “place where everything is seen.” His design involved radial cells arranged around a central guard tower, which is glassed in so guards can see into cells, but prisoners can’t see the guards. Panopticon prisoners have no privacy; barred cells make prisoners perform every act, however humiliating, in view of each other and of unseen guards.

Prior prisons suffered virtual anarchy because guards couldn’t watch all prisoners at all times. Bentham’s design made constant surveillance unnecessary: because prisoners never know when the guards are watching, they must always behave as though the guards were right there, because sometimes they are. Though Bentham’s designs have seldom been used, and mainly in Communist countries or military dictatorships, his principles govern modern maximum security prisons.

Bentham had sweeping influence on British philosophers, especially John Stuart Mill; but Bentham’s own decidedly stilted prose is generally read today only by scholars. General audiences today more often encounter his Panopticon theory in Michel Foucault’s 1975 Discipline & Punish. Foucault situates prisons among other “mechanisms of power,” including schools and the military, noting that social control serves not to preserve justice, but to create identity.

That is, prisoners who survive the Panopticon without incurring new punishment necessarily internalize the overriding ethos of surveillance. They comport themselves according to assumptions of constant oversight and imminent punishment. They learn to police their actions constantly; the best even assume supervisory roles over other prisoners, quashing undesirable behavior before guards can notice. Successful prisoners essentially let prison ethics into their heads, assuming subordinated attitudes, and guarding themselves.

When we discuss prisons this way, we may feel revolted by such blatant social engineering, or we may concede its necessity among criminals and traitors. But the Panopticon attitude percolates outside prisons. Military dictatorships rely on “secret police” to enforce fear-based social control. Even in democracies, drug wars and anti-pornography stings rely on unmarked plainclothes informants, making every transaction a potential police bust.

Worse, NSA spying, conducted as an undifferentiated dragnet, turns the technologies we need to participate in modern society into potential instruments of control. Informed citizens know the police can use our phone records, Google searches, and other data trails against us, often without a warrant. I’ve seen people carrying laptops who nevertheless use public computers for non-routine business, because they fear leaving trails the state can use against them.

Even if we’re not criminals, we excuse high-handed government intrusions as necessary to maintain order, a goal in itself. We tell ourselves we’re okay with the administration pulling our GPS coordinates off our phones because, hell, we’re not doing anything wrong. We keep our heads down and don’t make waves, even against intrusions consistent with Cold War-era military juntas, because… well, because we’ve let the Panopticon condition us into submission.

American ethics were founded upon the idea that citizens constitute and control their government. But our blasé attitudes regarding Edward Snowden’s revelations prove that we’ve relinquished that role. We let the state constitute us. Thus we cast votes and conduct business on the state’s terms, creating a circular arrangement that preserves power, even when we know the process is manifestly unjust.

Breaking the cycle won’t be easy. But hopefully, once we acknowledge it exists, we’ll begin the long overdue process.

Monday, January 6, 2014

Prophet and Loss

Joseph James Slawek, Ingredients for Success: 10 Best Practices for Business and Life
“[T]his collection of ten best practices is not a concoction of my own thinking, but rather these are concepts that come from the Bible, God’s Word; therefore, it is beyond a man’s opinion, and more than a personal passion. It is the God of the universe speaking personally and profoundly to each and every one of us. That should send a shiver up our spines!”
—Joseph James Slawek, 10
Early in this book, Slawek disavows any role as prophet, but when he says “prophet,” he means “soothsayer” or “forecaster.” The Hebrew prophets only sporadically foretold the future, but they consistently rebuked flaws in the present. Slawek’s focus on contemporary fiscal ineptitude, self-aggrandizing business, and amoral capitalism certainly sounds prophetic to me. But his desire for absolute answers forces his prophecy into precarious balance.

Slawek, a successful entrepreneur whose company has achieved regional and national recognition in its quarter-century existence, claims Christian faith steered his success. He extols churchgoing, Reaganite ethics, and public religiosity as business values. He glorifies decisions like entombing used Bibles in his corporate headquarters’ foundations, and basing business practices on Gospel parables, but this sounds uncomfortably like televangelism and Prosperity Theology.

Unfortunately, using Biblical language doesn’t guarantee ironclad Christian business foundations. Consider Chick-fil-A, whose directors famously cite their Christianity in closing on Sundays, but give money to anti-rights activist groups, or Hobby Lobby, which until two months ago refused to carry menorahs. Anyone can talk Christian; the lingo isn’t hard. But we know the vine by its fruit. Slawek’s actions, not his words, should verify his spiritual bona fides.

For instance, in discussing wise investments, Slawek demands his company “double our revenue and results every four years”—Ponzi economics. Did Slawek not face the 2007 economic contraction everyone else endured? Worse, he demands “that everyone in the company has a responsibility to double their personal results every four years.” Am I the only one who recognizes finite humans on a finite earth cannot do this? Seriously?
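
The arithmetic deserves a moment. Doubling every four years means compounding, and compounding runs away fast; here’s a quick back-of-the-envelope check (my figures, not Slawek’s):

```python
# Implied annual growth rate for "double every four years."
annual_rate = 2 ** (1 / 4) - 1
print(f"{annual_rate:.1%}")  # 18.9%, required every single year, forever

# Where that compounding leads over a working lifetime.
for years in (4, 20, 40):
    print(f"{years} years -> {2 ** (years / 4):,.0f}x starting revenue")
# 4 years -> 2x; 20 years -> 32x; 40 years -> 1,024x
```

No finite market sustains a thousandfold expansion on demand; hence the Ponzi comparison.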

Bad as this is, early in the book Slawek calls it simple Biblical honesty to promptly fire any worker he believes cannot pull their weight. This surely supports his bottom line. But how Christian is it to sack someone who simply flunks Slawek’s cost-benefit analysis? I’d direct him to Luke 13:6-9 for my answer, which demonstrates how Biblical ethics can support differing, even contradictory, secular outcomes.

After I wrote that paragraph, a pastor of my acquaintance reminded me of the Parable of the Workers (Matthew 20:1-16). Not only does it repudiate a utilitarian reading of Scripture, it blatantly places justice above monetary outcomes, partly countering Reaganite capitalism. I could continue: Matthew 5:48 urges Christians to “Be perfect, therefore, as your heavenly Father is perfect,” which directly contradicts Slawek’s seventh precept, “Aim for excellence, not perfection.”

Slawek and I could continue this “point-counterpoint” approach indefinitely, because Scripture permits multiple variations. But that only proves my point: Scripture doesn’t permit the closed reading Slawek’s prophetic vision demands. Christians have wrestled with Slawek’s chosen passages, primarily Matthew 25, for millennia without finding any ten-step processes. Moreover, considering many early Christians died violently for refusing to worship Caesar, they certainly didn’t find checklists for earthly success and wealth.

Worse, by combining selective biblical prooftexting with moralistic autobiographical vignettes, Slawek dances dangerously close to idolatry. In ballyhooing his Christian business sense, he claims unique insights to God’s fiscal vision, reducing The Great I Am to his personal banker. “I’m successful because I’m good,” Slawek implicitly declares. We who recall Jim Bakker’s disastrous flame-out know how this transitions, tragically, to “I’m good because I’m successful.”

Evangelical capitalists might claim I’m conflating spiritual and economic consequences. Like Job, Slawek proposes earthly rewards separate from getting to heaven when we die. This divide has colored Culture War thinking at least since Jerry Falwell. But how can we die well, except that we first live well? To anyone who thinks I misrepresent either God or Joseph James Slawek, I commend to you Matthew 6:21—“For where your treasure is, there your heart will be also.”

Slawek surely means well. But he makes the same mistake I’ve seen in other recent self-help books, where the author assumes his particular miracle is transferable, based on laws as immutable as gravity. Slawek has received generous opportunities in life, and assumes they’re both God-given and universal. This assumption blinds him to any reading, of facts or Scripture alike, which contradicts him. Hopefully, readers will recognize this as merely Slawek’s unique business and spiritual journey.

Friday, January 3, 2014

The Ghosts of 1973

1001 Books To Read Before Your Kindle Battery Dies, Part 28
Andreas Killen, 1973 Nervous Breakdown: Watergate, Warhol, and the Birth of Post-Sixties America


Philosopher Michel Foucault claimed that history trembles along “fault lines,” identifiable by sudden shifts like the Industrial Revolution or the Renaissance. Manhattan historian Andreas Killen finds one such line in 1973, the year America finally abandoned its romance with 1960s idealism and began a march toward grim practicality. Though the Vietnam War, which dominated the 1960s, finally ended that year, the promised age of national enlightenment never materialized.

1973 saw many beginnings and ends. Operation Homecoming saw the mass repatriation of POWs, including two-time Presidential candidate John McCain, concluding the Vietnam conflict. Public broadcasting also launched An American Family, the first reality TV show, starring TV’s first openly gay protagonist. The Symbionese Liberation Army, which achieved infamy the next year, began in 1973. But none of these social forces happened alone; Killen’s 1973 reflects terrifying top-down entropy.

Though Killen addresses several topics—arts, sex, economics, Vietnam—Richard Nixon casts a long shadow. An intensely popular President, recently re-elected by an overwhelming majority, Nixon nevertheless spent 1973 undergoing a high-profile crack-up. He was reputedly addicted to amphetamines and sleeping pills, and once vanished from public view for eleven days, unmatched in the modern Presidency. Though Watergate began in 1972 and ended in 1974, Nixon’s biggest dramas happened in 1973.

For all his prominence, though, Nixon-hating has a certain dead horse ineluctability. Killen sees in Nixon a manifestation of postwar America’s deep death-wish. Drunk on Eisenhower-era prosperity, America gave rise to incompatible idealisms: hippie utopianism still captures popular imagination, but ultra-conservative counter-protesters always outnumbered change agents. The first Arab oil embargo punched holes in America’s fantasy of bottomless prosperity, proving both simple answers unsatisfactory.

The conflict between progress and continuity assumed new bitterness in 1973 with the Roe v. Wade decision. Where once issues appeared resolvable through sign-waving and other political theatre, a Supreme Court decision, with no hope for higher appeal, gave formerly social issues new moral inevitability. Though the conservative pushback would lack leadership until Jerry Falwell (q.v.), 1973 would mark the sides in today’s lingering, seemingly insoluble Culture War.

Killen spends an entire chapter detailing “Warholism,” a neologism that strangely elevates Andy Warhol and his assembly-line artistry to a quasi-religion. And Killen’s description justifies that apotheosis. Although Warhol’s Factory debuted well before 1973, and lasted long after, even Warhol got battered by that year. Edie Sedgwick’s death in late 1971 had already publicized the seamier implications of fame merchantry. Many of Warhol’s prominent creations collapsed that year, often in tragic or catastrophic ways.

Among the stranger overlaps Killen emphasizes, 1973 saw Warhol’s New York Dolls’ debut, complete with muscular, androgynous press photos and sexually forward-thinking attitudes. It also saw George Lucas’ American Graffiti and Terrence Malick’s Badlands, two very different paeans to rockabilly America, hit cinemas. This presents an American pop culture obsessed with its future and its past, but without a present—a sweeping emblem of post-sixties directionlessness.

This retreat from “now” would characterize much of the post-1973 generation: either glamorizing some lost past (the Classic Rock Radio ethos) or aggressively rejecting anything past (the burgeoning urban alternative and underground subcultures). The former option had better mass traction, because corporations could monetize it and politicians could play off its prejudices. Nixon’s grumpy anti-modernism submarined his administration, but achieved its apotheosis in the rabidly anti-1960s Reagan years.

Historians have their unique approaches, and Killen certainly brings his particular preoccupations to history. His pet phrase, “oedipal conflict,” recurs throughout the book, often in situations with few apparent Freudian implications, as when discussing Reverend Moon, or financial struggles within Hollywood’s dream factories. Killen’s psychoanalytic approach yields surprising, frequently jarring interpretations of historic moments we’ve probably glossed over without much thought.

Killen doesn’t pretend to offer history as inarguable fact. He spotlights important themes and maps tangled connections, showing how seemingly unrelated events represent sweeping patterns. The connections between Andy Warhol’s Chairman Mao paintings and President Nixon putting Joe Namath on his “enemies list” aren’t necessarily obvious. But the facts matter less than the themes, and only an inquisitive historian like Killen can take what is implicit and make it explicit.

This book debuted in 2006, when President Bush was becoming wildly unpopular and a majority of Americans newly agreed Operation Iraqi Freedom had lasted too long. The parallels between Nixon and Bush are obvious. But as President Obama’s NSA troubles weaken his administration and even classical liberals grow discouraged with political solutions, the conclusion looms large: 1973 is still with us. And if we don’t meet that year’s challenges, we’ll relive that year’s woes.

Thursday, January 2, 2014

The Best of 2013, Part Three

Incite a riot of new ideas.
—Jonathan Lockwood Huie
I always envisioned WordBasket as a book review resource, and still consider it that, primarily. Today’s flurry of new, daring publishing has coincided with what Andrew Keen calls The Cult of the Amateur, meaning we’re constantly awash in user-generated content, which mostly isn’t very good. To paraphrase PJ O’Rourke, the Internet gives every pissant an anthill to piss off. I wanted to create a space where thoughtful reviews could thrive.

But in today’s fraught, media-driven world, it’s hard to avoid occasionally soapboxing about world issues. Because I receive diverse books representing manifold opinions, I’ve internalized many ways of reading events, and many correlations that aren’t obvious just browsing cable news. Appallingly often, news traffickers fall short because they see events in yes-or-no terms. In 2013, I discovered, we could best analyze world events by finding the unexamined third option.



The Ignorance Merchants

The saying goes: “You’re entitled to your own opinion, but not to your own facts.” When Republicans in Congress shuttered the government in October to force concessions on Obamacare, the respective sides pitched their own narratives about how we’d reached these dire straits. The two versions were wholly incompatible because, apart from any value judgments or political loyalties, one version parted company altogether with objective reality.

But this divide from veracity didn’t just happen. American mass media needs advertiser money to remain afloat, and a greedy minority profits by keeping citizens factually ignorant. Political debates that should resolve quickly remain open, sometimes for years, because privateers sell ignorance, and too many people keep buying. When a friend said something counterfactual in my presence, I couldn’t keep silent; my response became my most-shared, most-read essay of 2013.



Guns, Women, and My Friend—God Rest Her

For years, I maintained squishy moderate views on American gun rights, because the issue seemed vague and distant. Most gun debates in legislatures and media turn on high-minded principles, as removed from everyday life as discussions about astronomy or quantum physics. I could play both sides of the debate because neither side really touched me; my life remained unimpeded by either option.

Until, that is, a college friend fell victim to gun violence. Suddenly, an airy-fairy debate assumed Brobdingnagian proportions in my life. Digging around, I discovered a truth neither side really discusses: guns and gun violence disproportionately disadvantage women. The most common pro-gun arguments proved founded on stray anecdotes and wishful thinking. Taking sides was no longer just an intellectual option; my friend’s memory made it an ethical imperative.



Sacred Vows to a Secular State

Love or hate Edward Snowden, you can’t pretend he doesn’t matter. His revelations turned Barack Obama from the first Democratic President since FDR re-elected with an outright majority into the guy nobody acknowledges in church, almost overnight. Suddenly even Democratic loyalists couldn’t deny that, in the struggle between personal power and campaign promises, President Obama is as human as anybody else.

Far more interesting than the facts, though, was the official bipartisan response, which demonstrated how drunk with power both major parties have grown. Both sides claimed Snowden had violated “sacred vows,” a term usually associated with marriage or the priesthood. If Republicans and Democrats share the view that the state is our spouse, or worse, our God, both sides need schooling on what “sacred” means, and where it stops.



The Vengeance Machine

In 2013, some of my titles came to resemble Doctor Who episodes, which is ironic, since Doctor Who came to resemble Internet fan fiction. But too much popular opinion also resembled Doctor Who villains gloating over others’ misfortunes. Such was especially the case when WalMart stores in Louisiana got swamped after a routine computer error in the government’s EBT system suddenly gave poor people unlimited taxpayer-sponsored grocery money.

The real story wasn’t what happened; I found the response to events much more interesting. People whipped themselves into high dudgeon, apparently competing to express the most dramatic outrage. Friends who wouldn’t harm a fly offline revealed dark, vengeful doppelgangers who would dispense moralistic payback to perfect strangers half a continent away. This newly uncovered Internet ruthlessness was so ugly, and so terrifying, that I couldn’t let it go unanswered.



Honorable Mention: A Brief Guide to America's Clandestine Economy and America's Clandestine Economy, Part Two

Regardless of political views or moral justifications, anytime anybody wants to forbid something, they apparently become willing to jeopardize their own foundations to preserve their ban. American history has been, partly, a competition between official limits and attempts to circumvent those limits. Though these are book reviews, not op-eds, both the essays themselves and the books they review reveal truths Americans may not like about our own leadership.



The Best of 2013, Part One
The Best of 2013, Part Two

Wednesday, January 1, 2014

The Best of 2013, Part Two

See also Part One.

July: Twilight of the Godlings

Novelist Jason Hough certainly didn’t invent the idea that humanity is in an evolutionary bottleneck. But his debut novel, The Darwin Elevator, dramatizes this concept literally: due to incomprehensible alien influences, humanity can only survive in a narrow radius around Darwin, Australia. As resources become scarce, survival hangs on bands of scavengers, while zombies besiege humanity’s last redoubt.

Hough distills several post-9/11 motifs into one trilogy. Science fantasy fans will recognize shades of Firefly, Battlestar Galactica, and The Walking Dead, among others. In so doing, he epitomizes much of America’s current plight. Clinton-era malaise, of the X-Files variety, no longer suffices; we’ve transitioned to active, blood-spitting desperation. Thankfully, in these critical times, emergent heroes refuse to embrace the easy numbness of despair.



August: Slow Death as the Son Becomes the Father

Jonathan Gillman grew up under his powerful father’s disapproving gaze. An accomplished pianist and mathematician, Gillman’s father enjoyed worldwide acclaim. He also had incipient Alzheimer’s disease. My Father, Humming, Gillman’s first collection of verse, describes the turmoil his family underwent as this former genius descended into ignominious senility. Refusing contemporary poetry’s often opaque flourishes, Gillman uses verse to strip his experience to its rawest, most heartfelt honesty, free from self-seeking airs.



September: If Life's a Stage, Then All School is Acting Class

Trained as an English teacher, Lou Volpe never anticipated a life in theatre. But when Levittown, Pennsylvania, crumpled during the 1970s, Volpe took a chance on a struggling school’s drama program. While everything else collapsed, Volpe, lacking outside help or funds, turned Truman High’s theatre into his town’s hope for the future. Children of the Rust Belt believed, sometimes for the first time, that their lives could truly matter.

Volpe’s former student Michael Sokolove returns to his alma mater in Drama High: The Incredible True Story of a Brilliant Teacher, a Struggling Town, and the Magic of Theater. Sokolove pitches a powerful counter-narrative to current jeremiads of school decline, demonstrating that dedicated teachers, bolstered by supportive communities, can give youth direction even in the poorest hometowns. And he proves that, amid today’s math-and-science mania, fine arts mean something greater.



October: Proving Personal Writing Still Matters

Michael Sokolove, above, openly mocks the impersonal, test-driven education system that obtains in America today. He’d appreciate The Best American Essays 2013, a collection of personal writing in which twenty authors struggle with today’s widespread problems and joys. This collection challenges audiences to step outside themselves, understanding that only the intensely personal is ever truly universal. I’ve used prior Best American collections in writing classes, and will surely do so again.



November: Snowpacked Mystic

Former New Hampshire Poet Laureate Patricia Fargnoli didn’t publish her first book until she was 62 years old. Since then, she’s won numerous awards, been feted by poetry’s elite, and received generous recognition, at home and abroad, as one of today’s great under-appreciated poets. No wonder: her verse reflects decades of experience, but lacks the hip cynicism burdening too many seasoned poets. This makes her prime reading for non-poetry audiences.

Fargnoli’s sixth collection, Winter: Poems, examines active life from isolation’s enforced leisure. (New England regularly gets snowed in.) Her panoply of love lyrics, agnostic prayers, and solemn laments melds across pages, and years, to create a landscape of joy and loss. I couldn’t put it better than to quote myself: “Fargnoli’s Winter is bleak, ghostly, and alone. Yet it brims with life, because humans inject themselves into the vacuum.”



December: Eternity Runs in the Blood

If American society has a new frontier, it must surely be the human genome. Our growing knowledge of biological inheritance threatens to transform medicine, childbearing, marriage, and other keys to human community. While scientists have written copious data-driven studies, the human story remains appallingly unheard. Editor Amy Boesky aims to change that with The Story Within: Personal Essays on Genetics and Identity.

Cable news and populist politicians have presented genetic research as inhumane, eugenicist, and possibly Nazi. These essays, by ordinary but eloquent genetic illness sufferers and their families, reframe the debate in ways that will hopefully move important discussions into the Twenty-First Century. Because genetic illnesses, fundamentally, aren’t about genes; they’re about how humans live in a scientific era.



2013 was a literary mixed bag. January sucked so bad that I actually apologized for the entire month. Yet highlights like these remind me that books and reading remain vitally important. Here’s wishing you a smart, literate, satisfying 2014.