Friday, January 29, 2016

To Boldly Go Where Many People Have Gone Many Times

Wow. How long must those suits have been in mothballs before this year?
I confess, I watched this week’s X-Files relaunch prepared to hate every minute of it. The original run faltered badly after about 1997, overburdened by mythology but lacking creative direction. Especially after the December 2012 invasion “deadline” passed unheralded (which the relaunch hasn’t yet addressed), I thought that series belonged alongside other Clinton-era paranoia. I had myself primed for every possible reaction… except realizing it’s actually pretty good.

From the first episode, writer-director Chris Carter essentially upended the mythology established during the show’s original run. Pinching a trick largely pioneered by Larry Niven and Harlan Ellison, he made himself into a liar in the most glorious way. The characters remain continuous with everything they were during the 1990s, and the storytelling is of a piece. Even the visual design is consistent. It’s like a TARDIS trip to 1996.

Except, isn’t that the problem? The X-Files was always analogous to the kind of mass-media paranoia which proliferates whenever a Democrat is elected President. It’s no coincidence the original series flourished during the Clinton administration, and foundered under Dubya. And we already have a track record of big-ticket paranoia failing to garner an audience. The 2010s aren’t identical to the 1990s. American culture has, I’d like to think, moved forward.

Well, the American moral and philosophical landscape certainly has. The Obergefell decision propelled society’s most basic bond-building ceremony into new and largely untested realms. After a Republican near-lock on electoral politics for two generations, Democrats have won five of the last six presidential elections, and stand poised to capture the next one too. Notwithstanding Donald Trump’s backward-looking showboating, American life looks different now than just ten years ago.

Considering how few original characters returned, Heroes Reborn is
a pretty inaccurate title. How about Premise Recycled?
But while our lives continue moving forward, commercial pop culture has moved backward. The X-Files relaunch followed hard on the heels of the (mostly failed) Heroes Reborn event series, an attempt to recapture the magic of a series that debuted well, but eventually cratered. We’re eagerly awaiting Zack Snyder’s launch of the third Batman movie franchise, and remakes of Labyrinth and The Rocky Horror Picture Show have been promised.

Some of these long backward glances have done well. Star Wars Episode VII has become the highest-grossing film ever (in unadjusted box office). But Star Trek Into Darkness crumpled so badly, current franchise overlord J.J. Abrams has promised the next entry, due this summer, will wholly ignore that movie. The newest Alien movie will write out everything added since 1986. Pop culture has gone into full retreat mode.

Admittedly, snobbish cultural hipsters like myself have always complained about the proliferation of sequels, remakes, and reboots. UCLA screenwriting professor Richard Walter writes that Hollywood, backed by big money demanding huge returns, has always been squeamish about the present, and roughly ten years behind the times. From its fondness for Depression-era superheroes to its staunch commitment to funding Adam Sandler, Hollywood has consistently proven averse to the present.

But this feels bigger. The X-Files was off the air longer than it was on. Alien is openly promising to retreat clear back into the Reagan years. Even Star Wars, launched in 1977, was a deliberate callback to pre-war Flash Gordon serials George Lucas watched on his family’s first black-and-white TV. It really feels like the people who package and sell us our culture cannot endure the present.

This must have seemed like a good idea on paper.
The problems with this become clear quickly. The failure of Star Trek Into Darkness probably has much to do with how retrogressive its future ethos really is: though a starship plowing into San Francisco definitely channels our post-9/11 fears, Khan reflects the fears of the Johnson Administration. Our science has progressed beyond him, but apparently our mass-market science fiction has not. And Paramount pays for that at the box office.

The present always seems scary, especially in times like these, when everything seems chaotic and unpredictable. Science fiction author William Gibson, in his novel Pattern Recognition, writes that the far future has become dangerous territory for serious authors: “We have no future because our present is too volatile.” He means science and technology here, but it applies to cultural trends equally well.

Yes, the present is frightening. But addressing these fears by retreating into the future or the past—or worse, past versions of the future—doesn’t make those fears go away. Sure, that’s what Hollywood dream factories sell, but that doesn’t mean we have to buy. We need to pay our culture producers to generate something to hope for, something to strive after, not just something to flee into.

Wednesday, January 27, 2016

Haunted By the Ghost of Sandy Denny

1001 Books To Read Before Your Kindle Battery Dies, Part 66
Elizabeth Hand, Wylding Hall

Stalled up, strung out, and desperate to release a second album before Christmas, British “acid-folk” band Windhollow Faire decamp to Wylding Hall, an isolated medieval manor. Sequestered from London’s late-Sixties scene, they smoke dope, jam on acoustic instruments, have sex, and produce a potentially epoch-making album. But the music exacts a cost: one gifted young guitarist meanders into Wylding Hall’s labyrinthine corridors and is never seen again.

Elizabeth Hand, a veteran of both experimental fantasy and punk rock, creates an interesting semi-epic of a time when music really did offer to transform the world. Melding rock sensibilities with folk tradition, British Electric Folk bands like Steeleye Span and Fairport Convention embodied what the Sixties meant in the UK. But when the Sixties ended without Aquarius dawning, Electric Folk retreated into a niche. It only regained force with the neo-hippie movement of the late 1990s.

Windhollow Faire’s story unfolds episodically: Hand’s frame story features a nameless music journalist interviewing witnesses to events at Wylding Hall. Surviving bandmates, devoted groupies, music business professionals, and interested locals recount events surrounding that fateful summer. Not surprisingly, their accounts don’t wholly jibe. This forces us readers to speculate who among the eight named witnesses really understands, and just what really happened.

Business manager Tom Haring sequesters his band for one reason: he needs the payday following Windhollow Faire’s acclaimed, but middling-selling, first album. His band sees it differently, though each brings their own viewpoint. Singer Lesley envisions a small proto-Woodstock. Bassist Ashton sees a vacation from London’s constant performance pressures. Drummer Jonno, closeted when homosexuality is still illegal, just needs some rest. Everyone experiences what they expect. Mostly.

Elizabeth Hand
But singer-songwriter Julian’s experiences color the story. An ardent medievalist and gifted guitarist, his skills come together at Wylding. He discovers a library nobody else can find, littered with pre-Enlightenment poetry unknown to scholars, which he immediately begins setting to music. This includes an obscure Thomas Campion madrigal that everyone finds strangely hypnotic. It may also be a summoning spell, calling a fair-skinned girl who tempts Julian to his doom.

Hand’s storytelling is a masterpiece of atmosphere and characterization. The idea of a shapeshifting house that eats hippies implies a horror novel; but Hand actually creates a “weird” narrative reminiscent of Poe and Borges. What each character notices about Wylding Hall reveals something about their individual baggage, which Wylding reflects. This makes us wonder, does Wylding Hall really change when nobody’s looking? Or can the characters just not agree?

Symbolism in this novella isn’t subtle. Wylding Hall supposedly dates to medieval times, but includes a Tudor wing, a residential area with modern plumbing, and—in later chapters—a neolithic burial chamber. There are even implications of new construction, making Wylding an outgrowth of British history. Thrust a band playing traditional music on modern instruments into that environment, and pressure will almost inevitably develop, hastening the eventual explosion.

While manager Tom pushes Windhollow to create a marketable album, groupie girlfriend and part-time mystic Nancy tries to keep the band grounded. Members meanwhile have wildly divergent ideas of what artistic integrity means: long drug-fueled jam sessions, or busking, or meticulous research. The various characters represent the forces artists confront daily, the demand for money and family and honesty, which few, even the best, ever successfully reconcile.

Hand makes no secret, from page one, that Julian enters Wylding Hall and never returns; the only mystery is why he stays. Witnessing the forces feuding to control his band, and knowing he’s accomplished his musical best, his decision to vanish and remain forever young makes perfect sense. Especially for us, with perfect hindsight, knowing his musical milieu has peaked, going forward and aging offers little promise.

Thus, like Jimi Hendrix or Gram Parsons, Julian remains forever young, linked with his time, free never to see himself become redundant or antique. He transcends mortality, becomes uniquely himself, and remains pure to his art. Okay, this “accomplishment” means never creating new art, never acquiring new audiences, so it’s a Pyrrhic victory. But that’s arguably Hand’s (admittedly unsubtle) point: art, like all human endeavors, is an ultimately doomed pursuit.

Hand embroiders her narrative with little nods us old folkies will appreciate. Besides direct acknowledgement of acts from Lindisfarne to Devendra Banhart, Hand gives winks to Sweeney’s Men, Vashti Bunyan, and others which whisk by so fast, only fans will recognize them. This contributes to the experience: it’s full immersion storytelling, dropping us into the Electric Folk world feet first. Like Julian, many readers may prefer to stay.

Monday, January 25, 2016

The Tragedy of Argentina's Small Things

Íngrid Betancourt, The Blue Line: A Novel

Julia reads the future like scattered pages from a torn book. That is, she reads future images clearly, but without sequence or context, making her messages often opaque. Sure, clairvoyance saved her sister’s life when storms attacked their boat; but she couldn’t save friends from the Peronist massacres in 1973 Buenos Aires. And thirty years later, trapped in American suburban doldrums, her strange future glimpses probably won’t save her marriage.

Franco-Colombian politician and former political prisoner Íngrid Betancourt, who herself draws opinions as divided as Perón, has written several nonfiction volumes about her life, Colombia’s political struggles, and her captivity. This is her first novel. Throughout, I couldn’t help recalling Arundhati Roy’s The God of Small Things, another lone novel from a political activist. They address similar themes of nationalism, identity, and the collision between past and present.

Like countless mid-20th-Century country dwellers, Julia’s family emigrated to Buenos Aires seeking work. Living amid the city’s bustling ethnic Italians, teenaged Julia meets Theo, a social chameleon with electric political views. Theo leads Julia into the Montoneros, a radical militia that somehow simultaneously lionizes Marx and Perón. Think “Weather Underground.” But when circumstances return Perón from political exile, the militia that restored him becomes his target.

Betancourt’s narrative, like Roy’s, shifts time, sometimes abruptly. One chapter ends in 1973, the next commences in 2006. Here Julia and Theo, now nearly thirty years married and living in Connecticut, have fallen into comfortable Yanqui boredom. Julia’s visions warn her someone, somewhere, will soon commit irreparable infidelity, or worse, but she cannot know who. Suddenly motivated, she races to rescue her marriage from the scourge of Wonderbread American life.

Ingrid Betancourt
Like Arundhati Roy, Íngrid Betancourt’s storytelling trends very slow-moving and cerebral. Characters are generally manifestations of themes, and she doesn’t craft plot turns so much as metaphorical reversals. Readers accustomed to active momentum and character-driven narrative may find Betancourt’s highly constructed approach daunting. It certainly requires willpower to pursue its slow, meandering development. Betancourt doesn’t write for casual airport or bedtime reading.

But unlike Roy, who apprenticed in Bollywood before publishing her novel, Betancourt has visible authorial fingerprints across her writing. Her symbolism is often unsubtle and heavy-handed, while her story runs more on social conscience than character or action. Long passages are driven by political debate which, since the principal characters are university students and a priest, turns understandably intellectual. Imagine entire chapters where the characters remain seated. This book demands patience.

Understanding this novel does require some willingness to step outside ourselves. Not only must we accommodate ourselves to the South American setting (by Norteamericano standards, Julia and Theo’s early courtship appears age-inappropriate; the politics seem often strange and contradictory), but the Magic Realist style takes some getting used to. When something so extraordinary as clairvoyance gets treated as banal, rationalist Western reading habits simply no longer apply.

However, readers willing to make the investment Betancourt demands will find a smart, humane portrait of two forces shaping a generation of South Americans. (Despite the Peronist setting, Betancourt is clearly writing about herself and Colombia.) As Cold War-era juntas surrender to middle-class stability, as a generation raised seeking the next rebellious cause descends into making a living, we realize that oppression doesn’t require guns to rob life of meaning.

When praise accrues to this novel, and it probably will, it’ll come from audiences who read for theme. Future generations may encounter this novel in graduate school. In a reading environment clogged by action thrillers and character dramas, Betancourt briefly acknowledges both, but doesn’t linger, courting audiences who’d rather feel engaged than hypnotized by books. This novel is easy to think about, but hard to get lost in.

I’ve drawn several parallels between Betancourt and Arundhati Roy here, so let’s clarify that these parallels are very imprecise. First, The God of Small Things is undoubtedly the superior novel; Betancourt seems more comfortable with expository speech than storytelling. But also, Betancourt’s vision is far less bleak than Roy’s, suggesting people’s ability to triumph over seemingly overwhelming circumstances. (Also, despite what I’ve said, Roy did eventually publish a second novel.)

One reads this story not to journey with the characters, since Wikipedia shows where they’re ultimately headed. Rather, we journey with the peoples symbolized by these individual characters, the working masses of Buenos Aires, “los desaparecidos,” and immigrants trapped in American malaise. I certainly wouldn’t mistake this book for fun. But behind its authorial hiccups, it’s an interesting snapshot of two violently conflicted times and places.

Friday, January 22, 2016

Politics Is Our Nation's Large Intestine

1001 Movies To Watch Before Your Netflix Subscription Dies, Part Four
Armando Iannucci (director), In The Loop

A low-ranking British Cabinet Minister (Tom Hollander) does the unthinkable one day on live national radio: he tells the truth. He says what he thinks without bothering to consult the official party script. He suddenly finds himself under fire from the Prime Minister’s personal attack dog, Malcolm Tucker (Peter Capaldi), whose capacity for piquant vulgarity and psychological warfare approaches the legendary. In politics, careers have crumbled over less.

This pseudo-documentary, with a large ensemble cast and trans-Atlantic scope, serves as an indirect prequel to the BBC satire series The Thick Of It, but is functionally freestanding. It depicts British and American political maneuvering as blatantly Machiavellian, focused on winning without regard for such trivia as facts or collateral damage. It positions power politics as an essentially glamorless enterprise, occupied by small people operating for insignificant gains.

Long before Doctor Who, Peter Capaldi salvaged a foundering acting career by playing The Most Vulgar Man In Britain. By his own admission, he’d gotten trapped in a black hole of repetitive parts, playing politicians with repressed sexual secrets. He attended auditions for The Thick Of It feeling bored, combative, and despondent. Turns out, these were the three qualities show creator Armando Iannucci wanted, and Capaldi became an overnight celebrity.

Buoyed by Capaldi’s performance, this movie thrives on language. It focuses on how people use language to inform or deceive, to bond together or tear apart, to open or close the nation’s systems to its people. Hollander’s Minister is simple, honest, and winds up road kill. Capaldi’s Tucker lies, bullies, plays double-talk, and wins. Capaldi’s verbal duel with General George Miller (James Gandolfini) is one of the best pieces of legerdemain ever captured on film.

Simon Foster (Hollander), Minister for Overseas Development, has no business getting involved in war planning. The British PM and the American President both want to invade “the Middle East,” but Foster calls war “unforeseeable.” Because his political domain trades in global altruism, he literally cannot foresee something so uncharitable as war. Both sides in the heated international debate begin trying to recruit Foster, and his epic naivete, to their purposes.

Foster’s office includes Judy Molloy (Gina McKee), his Director of Communications, and Toby Wright (Chris Addison), his “Special Advisor.” They represent different pushes on British politics: Judy is scrupulous, knowledgeable, and ethical. She wants to help Foster do his job to his utmost. Toby is theatrical, savvy, and smug. He pushes Foster into high-profile media histrionics, apparently because he thinks Foster’s profile boosts Toby’s own by extension.

On the American side, a deputy Secretary of State makes an alliance with a ranking Pentagon general, both agreeing they lack manpower for a successful war. They have no humanitarian illusions; they don’t want war because they can’t win. But Linton Barwick (David Rasche), the President’s point man, performs remarkable end runs to keep naysayers outside the decision-making circles, and if that means freezing out the military, so be it.

This movie’s Oliver Stone-ish handheld camera work and semi-improvisational dialog give it a look like unfolding news. The fly-on-the-wall tone sometimes leaves us feeling creepy, as when Malcolm Tucker or Linton Barwick uses his bizarre manipulative techniques to eviscerate anyone who doesn’t toe the party line. (Search Malcolm Tucker on YouTube. His ability to turn ordinary vulgarity into squirm-inducing sagas is both hilarious and terrifying.)

Peter Capaldi as Malcolm Tucker, the most vulgar man in Britain

This movie manages to strip politics of all romance. It does this, in part, by pushing elected officials out of the story: in the entire ensemble, only Simon Foster was elected to anything. Instead, it focuses on appointed functionaries, unanswerable to the people, performing elaborate maneuvers to turn unpopular propositions into inevitable actions. It presents the United Nations as the large intestine of world politics.

Watching lies accumulate, expert liars speak from both sides of their mouths, and promises hang on dog-whistle language, this movie achieves a level of complexity I can only call “poetry.” The dialog bypasses the brain, stabs straight into the gut, and leaves a scar. The comedy arises because we know, instinctively, that the unelected apparatchiks who govern our lives really are this low to the ground.

Co-writer/director Iannucci avoids a heroic or viewer-friendly movie. He intends his audience to feel distressed, even terrified, that our leaders might maneuver thus. His take on the Operation Iraqi Freedom preparations looks familiar to anyone who follows news, particularly anyone who follows recent revelations kept secret at the time. This is our politics, folks. Iannucci makes it blackly funny, but that doesn’t mean he’s wrong.

Wednesday, January 20, 2016

Once Upon a Time in Olde Boston Towne

Bishop O'Connell, The Stolen: An American Faerie Tale

Brendan Kavanagh is an exile from the Fian, an ancient order of Irish avengers. Ageless and undying, he's wandered New England since gaslight times, plagued by guilt. Caitlyn Brady is a nurse and single mother just getting by. When granite-eyed dark fae, the oíche-sidhe, steal Caitlyn's red-haired Celtic daughter from her bed, Brendan and Caitlyn become necessary allies in Fiona's rescue. They cannot know that Fiona is actually a pawn in faeryland's brewing civil war.

There's a certain faux-Celtic mishmash I call "Magically Delicious," honoring the prancing twit from the Lucky Charms commercials. It's something I usually mock and deride. That isn't helped by this book's tendency to brew a Mulligan stew of urban fantasy tropes: the fae court concealed in a techno club, evil beings who have guns but politely attack one-by-one with blades, faery battles disguised as street warfare. Jim Butcher readers may find this book excessively familiar.

Yet out of loyalty to the author (with whom, full disclosure, I attended high school) I persevered, and I'm glad I did. Turns out, the over-comfy genre name-dropping is only set dressing. Bishop O'Connell has actually created a smart, layer-cake narrative which uses potboiler stereotypes to tell a more sophisticated story. Like Mary Poppins' proverbial spoonful of sugar, O'Connell's tale makes palatable what we secretly need by hiding it behind what we think we want.

Caitlyn stumbles into a common city-dwellers' nightmare, the midnight mugging. Okay, the muggers bleed black sludge and heal supernaturally, but this isn't escapism. Anyone who's had anything stolen knows that feeling of violation, knowing your life's been compromised. Except Caitlyn's literally has, since her attackers use her purse to magically invade her home, kidnap her daughter, and ensorcell her household. Caitlyn faces every city mommy's worst nightmare, except to awaken, she must first go deeper.

Bishop O'Connell's male protagonist, Brendan, is
a big kilt-wearing Celt. This is Bishop O'Connell.
I'm sure it's just a coincidence.
Brendan has a beast inside. Literally. If he ever gives in to his rage, which has festered for centuries, he shapeshifts into a killing machine. This beast once killed his one true love, and he fears letting anyone close. Neither human nor fae, he belongs nowhere. But his ironclad sense of justice perks up when oíche steal Fiona, pulling him back into the supernatural world he once fled. He may now need the monster inside.

Besides these leading Jungian archetypes, their supporting ensemble includes some familiar faces from fantasy. The apprentice wizard who must uncover deeper resolve to defend those he loves. The elven nobleman who pretends to be jaded with power and age, until threats become too vast to ignore. And the ultimate villain, whose secret bond to our protagonists could upend everything they think they know about themselves. Though familiar, O'Connell invests these ensemble characters with full psychological depth.

Okay, the story isn't entirely perfect. Where O'Connell gives his characters depth, he does so with great metaphorical weight. But to create his fae civil war, O'Connell also creates many characters, some named, some not, whose only purpose is to die. Some climactic scenes have a "Redshirts versus Stormtroopers" feel, because characters die without first earning our sympathy. I kept waiting for these scenes to end so the story could get back to the stuff I really liked.

That said, even those scenes have potential. When the beautiful, dutiful elves battle harpies, púca, and other creatures of pure id, we can see the profound depths beneath the common fantasy trappings. If O'Connell doesn't push that potential as far as an egghead like me would, if he sometimes foregrounds the paperback adventure we've grown comfy with, that doesn't mean the deeper stuff isn't there. Sometimes you have to tell the story your audience wants.

Like Star Wars or High Noon, this novel follows the common mythic patterns of human experience. This doesn't mean the story is predictable or simplistic. It frequently surprised me, though what happens always seems natural afterward. Rather, like Grimm's fairy tales, O'Connell's story is ultimately about its readers, about our own journey through life's unbeautiful turmoil. Fairyland, Tir na nÓg, whatever you call it, finally, it isn't a place, it's a metaphor for our own psychological landscape.

Did O'Connell mean everything I've stated? Maybe. Though I know him, we're not close. Maybe everything I've said says something about me. These subtexts certainly aren't blatant; I only finished the book on my second try. But I'm glad I did. This book rewards casual readers seeking an adventure to fall asleep under, but it also challenges those of us who seek something meatier. Meeting divergent readers where they live: what better can any good book do?

Monday, January 18, 2016

Notes Toward a New Liberated Woman

Susan L. Edelman, MD, Be Your Own Brand of Sexy: A New Sexual Revolution for Women

Second-wave feminism has been a mixed bag, we can probably all agree. Sure, it freed women from stultifying Eisenhower-era housewife roles, mandatory motherhood, and terminal niceness. But many women I know feel trapped by relationship and sexual roles as restrictive as what they escaped. Dr. Susan Edelman pitches her ideas for a third-wave remedy to these conditions. I like her premise, but I have distinctly conflicted feelings about her execution.

Edelman, a psychiatrist in private practice, has counseled successful career women as they’ve struggled to reconcile their professional accomplishments with relationship disasters. She’s seen identical problems crop up often enough to recognize patterns. Too many women think that to be truly “liberated,” they must acquiesce to casual sex; or they commit too early; or they accept ill treatment from men. Otherwise successful women just don’t assert their own needs.

Though Edelman notes that patterns women create in work environments often repeat themselves elsewhere in life, her proposed “new sexual revolution” focuses preponderantly on romantic and sexual relationships. “We can redefine ‘sexy’,” she writes, “so that it's not just about the outside. It's about inner strength, self-knowledge, and self-confidence.” Okay, this bromide has the comforting familiarity of an After School Special, but that doesn’t make it less true.

So Edelman’s principles intrigue me. She mixes lightly fictionalized stories from her practice with solid research, psychological insight, and hard-won experience. Some women can safely have casual flings; others can’t. Some women can await the correct man; others need to take a more assertive posture, just to be themselves. In short, Edelman writes, “Real power is knowing what works for you and having the courage to stand up for yourself.”

Dr. Susan L. Edelman
Okay, in theory, most people would agree with Edelman’s maxims. But “knowing what works for you” also means knowing which of Edelman’s applied recommendations actually speak to you. Some of Edelman’s suggestions sound reasonable, like “Turn down what you don’t want with grace and dignity.” But others worry me. Precepts like “Play the field: after you’ve weeded it!” are so broad that, without professional guidance, you’re priming yourself for blowback.

And some suggestions frankly make my skin crawl. At one point, if a guy’s too cheap to pay, Edelman counsels women to “enjoy your conversation, letting the check sit there until he pays it, never offering. You want to give him a chance to impress you.” That sounds less like empowerment, more like dueling passive aggressions. If you don’t communicate clearly from the first date, you probably never will.

So. We’re faced with the common problem with professional psychological treatment, that broad guidelines make perfect sense, but the more specific those guidelines become, the more likely they’ll conflict with our lives. What sounds good in the abstract requires profound self-awareness before we can apply it to ourselves. All self-help is predicated on self-knowledge, which, Edelman repeatedly concedes throughout this book, is pretty rare. How to reconcile this gap?

In one chapter, Edelman resorts to that self-help writer’s favorite Hail Mary, the self-scored quiz. Here we hit something frustrating. As Benedict Carey demonstrates, tests have definite learning value. But, as Dan Ariely proves, mental and emotional states matter when we anticipate our own likely reactions to future situations. Our answers while coolly reading at home almost definitely differ from how we’ll respond in the heat of passion, bliss, or anger.

How about taking Edelman’s quiz multiple times, then? Take it calmly, then after something that makes you angry, like watching the opposite party’s primary debates? And again after reading a steamy novel? Changing your psychological state will change your answers. And comparing your different answers once you return to a calm state will tell you what areas of self-knowledge and self-control most require your continuing attention.

Edelman’s final chapter, on planning for what comes next, comes the closest to pushing a message of global self-knowledge. It’s here that Edelman most explicitly demands women better know themselves, not just as women, or sexual beings, or half a relationship, but as human individuals. Here Edelman most directly calls women to complete self-awareness. I only wish this were better distributed throughout the book.

I actually do recommend this book, if you possess self-knowledge enough to evaluate Edelman’s applications. Both the arch-conservative housewife model and the hedonistic “liberated woman” are conformist behavior patterns which women must escape. And men can benefit from reading to better understand the conflicting pressures women face in today’s relationship minefield. I only insist that you take the opportunity to better know your whole self.

Friday, January 15, 2016

Why Our Economy Requires—And Undermines—Trust

I made a rookie mistake at work this week. I hung wood blocking around thirteen windows, a job that took one-and-a-half shifts, and was made even longer because partway through the process, my supervisor took my pneumatic lift away and gave it to a subcontractor. Therefore I did the job with minimal power tools, and no electricity, standing on a ladder wider than the foundation footing it balanced on, constantly fearing I’d fall and die.

But dammit, I did it! With fifteen minutes to spare, I finished the job. I dragged my tools to the Jobox, signed my time card, and barely pulled myself home for a tiny dinner before collapsing, sleeping for ten hours because I’d exhausted myself doing too much work, often defying gravity, with inferior tools. But even at great personal cost, I finished my job promptly. I felt frankly proud of my accomplishments under adverse circumstances.


Except after all that, I forgot to gather and return the leftover cut scraps. Considering that I performed the impossible, ahead of schedule, I thought this a minor error. But four superiors took the opportunity to stand around the office table, mocking me round-robin style. During my break. They took obvious pride in humiliating me for my inexperience, and the mockery just kept coming. I haven’t hated anybody so much in a long time.

I read Maria Konnikova’s The Confidence Game and Dan Ariely’s Predictably Irrational over two months apart, with plenty of other reading between. But something really stood out between the two. Konnikova, a psychologist turned journalist, writes that trust is essential to individual success. People capable of trusting others are more able to take risks, delegate authority, embrace new opportunities, and other actions necessary for successful entrepreneurship and career advancement. Trust is a marketable career trait.

Ariely, a pioneering behavioral economist, similarly notes measurable economic consequences of trust. Large-scale growth and long-term planning ultimately rely on the expectation that people will, in the main, behave honorably and look to everyone’s mutual benefit. Economies where people behave with baseline standards of trust and honor, like America’s, flourish. Economies where these traits are lacking, like Iran’s, languish. Ayn Rand notwithstanding, we absolutely need other people. Trust isn’t a luxury, it’s an economic necessity.

American society has witnessed the undermining of public trust in recent years. The refusal of Presidents from both major parties to prosecute the bankers who imploded the economy in 2007-2008 has undercut many citizens’ willingness to trust both politics and economics. No wonder outsider insurgents like Donald Trump and Bernie Sanders recently took leads in primary polls. Except many Americans distrust polls so much, polling organizations have reported non-response rates as high as 90% lately.

People alive right now have witnessed the full-scale rollback of Progressive Era gains in workers’ rights, social safeguards, and the shared belief that we should have a base minimum below which nobody should fall. And it’s happened with public connivance. The Republican party, whose voting base is overwhelmingly rural and working-class, floated Mitt Romney for President, and promises to float Donald Trump, on the basic platform that “I’m so rich, obviously I’m competent at everything.”

We face a vicious cycle. We absolutely require trust, because without it, all plans remain small, all opportunities circumscribed by fear. But Ariely demonstrates that, when trust is undermined, it’s gone. Not that lost trust cannot be regained, but especially in large gatherings, where individual contributions remain substantially anonymous—from workplaces to economies—one sufficiently bad actor can literally change the entire landscape. Ten thousand honorable people can spend years putting one person’s damages right.

My experience at work this week probably seems small compared to Lehman Brothers submarining millions of pension funds. But my anecdote captures, in small, the problems facing many working-class Americans. We cannot advance ourselves from abjection without banding together, but we cannot band together without trust. And it only takes one vulgar power play to squander all trust. It took my superiors ten minutes to permanently piss away the solidarity we’d spent ten months building.

One major American political party today openly panders to widespread distrust. And working-class people buy that shit because we’ve had our trust systemically undermined for generations. America’s “win-at-any-cost” economic culture has won adherents from our economy’s biggest losers. This seems paradoxical, until you realize how dependent success is on trust. Radical individualism is the opposite of trust. And we won’t improve, as a society, until we discover how to rebuild the trust we’ve long squandered.

Wednesday, January 13, 2016

Why Being Irrational Is Your Sanest Choice

1001 Books To Read Before Your Kindle Battery Dies, Part 65
Dan Ariely, Predictably Irrational: The Hidden Forces That Shape Our Decisions

Let’s start with this simple premise: human beings don’t behave according to conventional economic models. Whether measured against the capitalist supply-demand arc or the Marxist labor dialectic, human behavior steadfastly resists slotting into categories experts consider “rational.” We often ignore our best interests, make choices which render us poorer and less happy, and cannot value our options reliably. Yet we do this in completely consistent, reliable ways. Why?

Israeli-American professor Dan Ariely holds doctorates in psychology and business. Through appointments at MIT, Stanford, and Duke, he has helped mold the still-young discipline of Behavioral Economics, a research field which examines human choices without preconceived models. His discoveries shed light on human reasoning, and why what seems unreasonable actually serves very consistent purposes. This book provides an interesting thumbnail introduction to a discipline still new and somewhat dangerous.

Consider how people will treasure something that costs more, assuming it’s more effective, better quality, or whatever. Anyone who’s ever slipped box wine into a posh bottle already knows this. Higher prices, subconsciously, translate into better products. Yet the arc shifts when the price hits zero. We’ll choose a measurably inferior but free product rather than pay one thin dime. This seems counter-rational, but actually has well-rooted human behavioral causes.

Late in the book, Ariely writes: “Behavioral economists… believe that people are susceptible to irrelevant influences from their immediate environment…, irrelevant emotions, shortsightedness, and other forms of irrationality.” Though this makes a decent definition of his field, it sounds harsh and judgmental, which this book mostly isn’t. Unlike standard economics, which literally assumes humans are money-driven reason machines, Ariely sees human behavior as driven by forces we cannot see.

Dan Ariely
However, these invisible forces, unlike the fabled “invisible hand,” actually have measurable traits. For instance, consider volunteer activities. If Habitat For Humanity needed to pay its workers the going rate for construction labor, it couldn’t afford to build homes for the destitute. Why will people freely donate time and labor they’d charge dearly for if money became involved? Because society is bound together by forces economics cannot value monetarily.

Likewise, given complete impunity, most people will cheat slightly to pursue their own advantage. But most people will cheat far less than they possibly could. What forces keep chiseling to a reasonable minimum? Apparently we have complex neurological structures that let us monitor ourselves, asking whether we’re upholding our own basic judgments. Our brains literally stop healthy people from swindling others. Honesty isn’t merely a morality, it’s a neural imperative.

Ariely demonstrates we could prevent even limited cheating with simple steps. Honesty oaths and honor pledges work effectively, as do simple reminders of moral codes like the Ten Commandments. We can also minimize cheating by keeping consequences close by and visible, as by transacting business in cash, not credit. The bankers who imploded the economy shortly after this book appeared probably wreaked havoc because, psychologically, they were handling Monopoly money.

Through-lines develop across Ariely’s chapters, even where he doesn’t note them explicitly. We don’t cheat profligately, and moral nudges halt even minimal cheating, because social standards are based upon trust. Economies where trust runs high, like America, thrive because we’ll keep money flowing, trusting few people will deceive. Economies with limited trust, like Iran, struggle with minimal growth. But it takes remarkably little to undermine trust, and when it’s gone, it’s gone.

These examples aren’t thought experiments or mathematical models. Ariely draws conclusions based on experiments in the laboratory and the field. Some are only possible now because of advancing technology—in one example, a colleague replicates the famous Pepsi Challenge with subjects strapped into an fMRI machine. Others involve simple field tests anyone could replicate by offering free chocolate or beer. But they’re all based on real-life trials and empirical data.

In consequence, these discoveries challenge both the traditional Left and Right in American politics. Because people aren’t theory machines, prefab theories cannot encompass our choices: we’re driven neither by profit, as neoclassical economics insists, nor by the class-based identity Marxism demands. Our motives defy standard categorization, threaten anyone who’d predict or control our choices, and expand regular citizens’ views of themselves and their choices.

Ariely’s experiments make humans more, not less, complex. They demonstrate we’re more powerful and sophisticated than even we ourselves realize. But our complexity also makes us vulnerable to forces we cannot see externally. Behavioral Economics makes humans both more and less reasonable, both more and less predictable. And in so doing, arguably, it makes us more human, overthrowing the theories which bind us.

Monday, January 11, 2016

Why Being Vulnerable to Fraud Isn't Bad

Maria Konnikova, The Confidence Game: Why We Fall for It . . . Every Time

Our world positively teems with swindlers, ripoff artists, and con-men. From ordinary curbside Three-Card Monte to charming, narcissistic domestic abusers, to Ponzi schemers and Wall Street market riggers, the confidence game exudes from society’s very pores. Psychologist turned journalist Maria Konnikova wants to unpack what makes us susceptible to con artists, a journey that leads through all human psychology, sometimes vulnerable to diversions and cow paths.

Konnikova’s first book, Mastermind: How to Think Like Sherlock Holmes, dealt with how crime fighters organize thoughts, observe reality, and undermine criminal mentality. This book essentially addresses the same issues from the opposite angle: how criminals create situations that need busting. Konnikova’s conclusions may seem surprising, until we consider them further. Vulnerability to confidence artists and other professional chiselers actually means our psyches are healthy.

Confidence artists work with an encyclopedic understanding of human psychology with which research scientists are only now catching up. They recognize common traits, like our tendency to see others as similar to ourselves, our illusion of control, and our unwillingness to think badly about ourselves. These traits aren’t weaknesses; without them, we’d be functionally paralyzed. Effective swindlers work by turning our best characteristics and human capabilities against us.

We must recognize, therefore, that making ourselves insusceptible to cons isn’t actually desirable. Fraudsters prey on traits that open us to community, family, and fiscal reward. As Konnikova writes: “The same thing that can underlie success can also make you all the more vulnerable to the grifter’s wares. We are predisposed to trust.” With swindles, as with propaganda, those who think themselves most immune are, actually, most vulnerable.

Maria Konnikova
The answer lies in understanding ourselves and the swindlers better. They don’t see us like we see ourselves. They don’t want to. We must cultivate complex understanding of different human thought patterns, and a stronger sense of ourselves. Konnikova again: “It's not that the confidence artist is inherently psychopathic, caring nothing about the fates of others. It's that, to him, we aren't worthy of consideration as human beings; we are targets, not unique people.”

All isn’t bleak. Throughout most of this book, Konnikova suggests it’s difficult to prevent con-games without isolating ourselves and descending into cynicism. In the later chapters, though, she reverses the trend, showing how skilled, self-aware people can resist flim-flam artists’ techniques. Not hypothetically, either: she shows how real people, cult busters and cultural anthropologists and police, have maintained their sanity when confronted by seemingly insurmountable double-dealing. Resistance is possible.

As Konnikova writes, psychologists once believed humans achieved maturity when they were able to see themselves objectively. We now know that isn’t true. Humans naturally see the world through rose-colored glasses; if we didn’t think the best of ourselves, other people, and the future, we’d be unable to move. Even the bleakest pessimists think more positively than they probably realize. We must sustain such positivity to sustain forward movement.

The solution, then, is to test everything. Know who you are, what you want, and others will find it difficult to warp your identity. Understand the ways swindlers misuse your best tendencies, and you’ll have better luck distinguishing friendliness from abuse. No technique, Konnikova admits, is foolproof; even professional fraud busters sometimes get in over their heads. But you have the capacity to withstand grifters.

This book’s release at the beginning of an election season certainly isn’t coincidental. The techniques Konnikova describes—make yourself look friendly and approachable, tell audiences an engaging story, start targets on tiny commitments so they’ll double down later—exactly reflect the techniques long used in demagoguery and electioneering. Journalist Sasha Issenberg has described previously how election managers use exacting scientific advances to improve upon what candidates have long done intuitively.

As Konnikova explains confidence artists’ psychological techniques, her focus expands to include much about recent discoveries in psychology and behavioral economics. She wants readers to emerge with as thorough an understanding of human minds as the fraud merchants enjoy. This sometimes makes her technique sprawling (this book runs over 300 pages plus back matter, unusually long for its genre). Reading Konnikova sometimes requires especial concentration and focus.

She richly rewards those who stick with her narrative, though. I’ve recently seen one friend lose rafts of money to shady investments and two others get burned by charming, narcissistic romantic partners. Even if we never vote for crooks, invest with Bernie Madoff, or buy salvation sellers’ wares, the potential for confidence games still surrounds us. Konnikova provides needed tools for self-awareness, clear boundaries, and bold self-defense. Swindles are inevitable; victimhood isn’t.

Friday, January 8, 2016

The Parts of the Building That You Never See

Painted masonry chiseled out above the line of the future suspended ceiling,
to accommodate future electrical conduit for fluorescent lighting
Late in his book Little Rice, NYU professor Clay Shirky describes an apparently common sight for Western visitors in China: even in foreign-owned luxury hotels, like Sheraton, lightswitch plates are often attached cockeyed. Wallpaper is hung slightly out of alignment. Drywall has visible unpatched seams. Shirky presents these design quirks as proof that the Chinese aren’t willing to go that extra mile to ensure visible quality. Not like us Westerners.

I read Shirky’s book while working construction, building a public works project, a new municipal high school. Writing my review, I ignored his implicit dig, knowing most readers won’t understand the significance. For many people, buildings emerge whole from the ground, like corn. Indeed, many people, mistaking the machines tipping precast concrete wall panels into place for the entire construction process, blithely ask: Why isn’t that building done yet? Then they wonder why I blocked them on Facebook.

But as we enter the home stretch on this school, I’m noticing some traits around the building that deserve some greater level of comment. I’ve noticed cinder-block walls, already painstakingly erected, painted, and enclosed within the building, being broken back out with chisels to run heating ducts and electrical conduit. I’ve seen parking lots partially paved, then abandoned literally for months. I’ve seen how sloppy Western construction really is, below the surface.

Trash wedged between two pieces of completed cinder block wall, probably by the masons.
Right after this picture was taken, we placed straight wood trim along the wall edge.
Roofers will cover this trim with insulation and rubber, and the trash will probably remain
hidden inside the walls for the life of the building.

Remember that scene in Witness, where Harrison Ford joins the Amish community in raising a neighbor’s barn? Audiences largely agree that’s the movie’s most touching moment, when Ford has learned to share and cooperate. It shows him becoming open to community in ways city life openly discourages. Alongside dancing with Kelly McGillis in the garage, that scene probably did more effective outreach than the Amish have attempted in two centuries.

Unfortunately, that doesn’t represent how construction gets done nowadays. Getting the neighbors together to raise a timber-frame horse barn makes perfect sense, but most Americans are unwilling to live Amish-style lives. We demand electricity, central heat, cable TV and Internet access, and more. And public buildings like schools go further, requiring science labs, gymnasiums, and other specialized facilities. Do you know how to wire a computer lab?

Me neither. And neither do most people. Construction today is a highly specialized process, subdivided among numerous subcontractors. As a result, this workplace is undoubtedly the most segregated job I’ve ever had. Not only are tasks strictly allocated among companies (and turf wars as hotly contested as drug territory), but there’s often severe racial division among task categories: the brickmasons are overwhelmingly black. Concrete workers are mainly Hispanic. Welders are largely white.

And each group, in pursuit of their separate tasks, largely ignores how their tasks step on other workers’ toes. I work directly for the general contractor, and my job is largely to ensure the subcontractors have whatever they need to do their jobs. This means I often spend days, even entire weeks, doing tasks subcontractors wouldn’t notice unless they weren’t done, like building safety rails or sweeping floors. I’m basically the jobsite’s invisible man. So I’ve seen things other workers might rather ignore.

I’ve seen, for instance, how many workers regard the entire jobsite as their general ashtray, garbage receptacle, spittoon, and in extremis, urinal. I’ve swept up strangers’ tobacco loogies, removed their piss bottles, and gathered empty McDonald’s cups they just tossed aside. For me, that’s become an emblem for just how inexact the construction process really is. That’s what Clay Shirky misses: those cattywampus lightswitch plates only make visible what Western construction workers try to hide.

Once you have the opportunity to witness a building’s creation from within, you realize just how intricate the process really is. Especially when deadlines loom, workers prorate what levels of precision really merit their time. Laying cinder blocks involves extremely precise measurements, so masons accept it if the blocks sometimes look uneven. Moving electrical conduit involves moving PVC, wire, and connectors very long distances through blind avenues to remarkably exact destinations, so if doing so involves walloping off chunks of masonry, who cares.

When we view construction projects like this from outside, it’s easy to complain that the process isn’t already done. It’s easy to poke fun if we notice workers apparently lounging, and grumble about “our tax dollars at work.” But there’s a trade-off. What appears slow from outside actually progresses lickety-split, because doing it correctly involves more steps than you realize. Your building is remarkably complex, more intricate than it appears. So if your switch plates are lopsided, how dare you complain?

Vinyl heating ducts run from a rented furnace through temporary partition walls into a
half-completed gymnasium. I was the only one who saw this. That is, several people looked
right at it, but I'm pretty sure I was the only one who actually saw it. And now I can't unsee it.

Wednesday, January 6, 2016

The Human Hole in Our Economy

Back in my teaching days, I made one-on-one meetings with my students mandatory. I had as many as six such sessions per semester, a very time-consuming process, especially for an adjunct, nominally a part-time worker. However, I believe they helped both me and my students. The one semester where I focused exclusively on lecturing, I didn’t enjoy the process, and my students’ writing suggested they didn’t learn much either. We both benefitted from just talking.

I remembered this experience when two essays crossed my desk almost simultaneously this week. The first, from the Chronicle of Higher Education, exhorts American universities (and, implicitly, elementary and high schools) to reduce hiring of educational consultants, lay off non-classroom administrators, and use the savings to hire new classroom teachers. The demand has grown familiar recently. University tuition has ballooned while teacher pay has declined, largely to pay expensive administrators to criticize from on high.

The second essay, from Salon, discussed the author’s experience clerking a bookstore counter during the Christmas rush. The author describes customers thanking her profusely for her nearly instantaneous ability to find books on flimsy descriptions. She also describes customers turning abusive when made to stand in line, threatening to take their business to Amazon. Essentially these customers attempt to hold the author’s job hostage unless clerks behave as compliantly as one of Earth’s biggest websites.

These two essays, considered independently, simply rehash complaints we’ve heard endlessly. Teachers want to teach, and resent non-classroom pundits prescribing their curricular methods. Physical bookstores have that human touch, but customers have lost their appreciation. Ho hum. But together, a theme develops: whether in humane endeavors like education, or for-profit commerce, the best results are outgrowths of human relationships. And relationships are something we, abetted by new technology, are writing out of our daily lives.

My belief that bookstores and paper books still matter is already well documented. My unscientific opinions have been validated recently by declines in ebook sales relative to print volumes. But we’re seeing forces beyond one commodity being pushed into new territory. We have more stuff available, but opportunities for discovery are significantly narrowed, with costs we’re only beginning to recognize. The consequences of moving American commerce online have been severely misstated, to put it mildly.

Just one example: as more people book airline flights and hotel rooms directly, America’s travel agencies flake off our economy like dandruff. But the supposed savings from “cutting the middleman” scarcely exist. UCLA tech guru Shlomo Benartzi writes that any savings is more than absorbed by the increased costs vendors pay for simple visibility on aggregators like Kayak and Expedia. More than half your hotel bill today goes into advertising, triple the share of about fifteen years ago.

Please don’t misunderstand me. I’d never encourage readers to abandon ecommerce altogether. Where I live, savvy book shopping requires Amazon and other retailers. My town has two mall bookstores; visiting quirky indie bookshops and record stores would require hours-long drives to Lincoln, Denver, or Kansas City, at great economic and environmental costs. Online retailers (and the much-maligned mall bookstores) make books, and other specialty commodities, available to residents in outlying areas underserved by big-city capitalism.

However, this doesn’t change the greatest drawback to Amazon, Netflix, iTunes, and other online retailers, that they’re ultimately passive. Though one could make proactive choices to seek boundary-busting content from these providers, most people don’t. We (including me) drift along the current of what’s available, selecting what we know we already like. Having done this to media consumption, some would do the same to education. Passivity isn’t a virus; it’s being propagated like a seed.

The irony that the two essays which inspired this response reached me through the Internet, possibly the most passive mass medium currently available, isn’t lost on me. But in receiving these sources, I chose to absorb their message. I chose to have a relationship with them, and by extension with their authors, a choice available with all media, however passive. That choice exists for everyone. We need only to consciously make that choice, every day.

We don’t send youth into school to convey information into their heads; we send youth into schools to be changed. We also read books for similar reasons. We can, if we choose, be changed by movies, music, and arguably by television. But we do that only by having a relationship with others. Maybe it’s face-to-face in teacher conferences. Maybe it’s vicariously, through media others create. But relationships matter. We mustn’t let electronics replace these relationships.

Monday, January 4, 2016

Star War-To-End-All-Wars

Kylo Ren and his infantry, immediately after destroying a civilian village

This Essay Contains Spoilers!

In the opening scenes of the newest Star Wars movie, The Force Awakens, a stormtrooper division commanded by black-clad villain Kylo Ren fails to find their desired target in an isolated desert village. Frustrated, Ren orders his stormtroopers to destroy the village and all the villagers. Watching one stormtrooper unload his flamethrower into a ramshackle hut, I realized where I’ve seen footage like this before.

It looks disturbingly like the after pictures at My Lai

Star Wars has long utilized World War II imagery to propel the Empire’s unquestioned evil, and the Rebellion’s manifest virtue. Because the saga arose after America’s ignominious collapse into mission drift during Vietnam, George Lucas could only recapture the ethos of the Flash Gordon serials he loved by ignoring much American postwar history. Moral ambiguity might feel more realistic, but Lucas wasn’t seeking reality; he sought mythology.

But since the original Star Wars trilogy, America’s subsequent experiences with strategically questionable wars and rudderless missions have colored our mythology of warfare. Right-wing critics charged Lucas’ prequel trilogy with fomenting anti-Bush Administration sentiment, an opinion not worth disputing now that even most Republicans agree Operation Iraqi Freedom was poor judgment at best. Vietnam isn’t a blip Americans can overlook anymore.

Serious scholars have even called World War II into question. Historian Howard Zinn, who attended Columbia University on the GI Bill, recounts having flown bombing missions over occupied France. He describes having strafed a village of 1,400 French citizens, occupied citizens of an Allied country, to roust a nest of forty Nazis. One recalls the famous AP quote about Bến Tre, that “It became necessary to destroy the town to save it.”

And, oops, we’re back in Vietnam again. Or did we ever really leave?

Rey and Finn fleeing the Luftwaffe—erm, I mean TIE fighters—strafing their squatter camp

Thus the renegade Stormtrooper Finn’s moral struggle during that opening scene, his refusal to open fire into a mass of unarmed civilians, isn’t enough. Layperson moralists have long insisted that if just one American soldier had refused Lieutenant Calley’s order to fire, righteousness would’ve spread like contagion. But would it? The problem wasn’t My Lai in isolation. America has suffered severe mission drift beyond one military conflict.

America’s reliance on international military force in furtherance of domestic agendas has undoubtedly enhanced American political stability. Despite rumors of Daesh incursions, nobody legitimately threatens America at home anymore. Anybody who would seriously threaten America today had better have multiple tactical thermonuclear weapons. Anything less only risks angering the giant.

This unquestioned might requires a trade-off, however. America’s military prowess means every challenge has military solutions. Though our military took time off following V-J Day, we waded right back in; we haven’t gone twelve straight months without a shooting conflict with someone, somewhere, since 1948. Though we disclaim international territorial ambition, the fact is, America’s military machine works so well, it’s become our first diplomatic tool.

Thus, even though enough time has passed for Han and Leia’s son to reach adulthood, our classic protagonists remain trapped in a neverending war cycle. They could’ve passed comfortably into government positions within the Republic (as they did in the now non-canon novels), but instead, remain leaders of the Resistance, dwelling illegally deep behind enemy lines. As the movie unfolds, we discover many Rebel leaders retain essentially Rebellious mindsets.

Can a galaxy ever know peace, truly lasting peace, if its leaders and icons know only warfare? Well, that’s a loaded question: what does knowing warfare mean? Our classic protagonists cut their teeth on epoch-making war. Somehow, those who grew up knowing peace have never ascended to leadership. When General Organa and Admiral Ackbar remain leaders, never relinquishing power to coming generations, war must almost inevitably remain the standard option.

Passing the baton of conflict onto the next generation

But extending our American metaphor, bequeathing power to new generations won’t necessarily solve much. 2016’s last serious Presidential candidate with military experience, Senator Lindsey Graham, has left the race. But Donald Trump, Ted Cruz, Hillary Clinton, and others verbally jockey to prove they’re the candidate most serious about crushing America’s opponents. As British scholar Christopher Coker insists, war remains a cheap way to bind peoples together into a nation.

Science fiction always represents the era which created it. From Flash Gordon’s Aryan hero standing strong against a slant-eyed Fascist, to the Terminator’s fear that humans have created our own destruction, effective sci-fi lets us witness ourselves through a mythological lens. And in America today, our greatest fear is that we’ve created war without planning for peace. Like the Resistance, we fear “victory” is only retooling for the next battle.

Friday, January 1, 2016

The Luddite Buys a Do-Funny

To say I have mixed feelings about Jeff Bezos,
the genius behind my new Kindle Fire, is an understatement.
I'm composing this essay entirely on my new tablet, as a test of personal endurance. I never much asked for a tablet, because I never particularly took them seriously. I already have two laptops and a desktop, so what profit did a tablet offer? Besides, I'm a clumsy two-thumb typist; simple text messages can take me hours to compose.

Yet I was given this tablet for Christmas, and it seems rude to not at least try. It behooves me to remember that I haven't always been the most receptive technology audience. The household I grew up in refused to embrace new tech until we believed the makers had run all the bugs out—a process often so slow that my parents never purchased dial-up Internet until it was already obsolete. My sister describes herself, with evident pride, as "a Luddite," even though she bought her tablet a year before I received mine.

Back in my teaching days, I was the first member of my English department to get a Kindle. The technology was still new and strange, and many colleagues cooed over it like I'd introduced my baby. But I quickly got frustrated with the slow-turning pages, the sluggish download speeds, the easily damaged keys. I suppose I still have that Kindle, somewhere. If I look for it.

Besides, I never particularly enjoyed reading from it. Reading, for me, has always been a multi-sensory experience. Books have differed not only by their content, but by their weight, color, aroma (often associated with their age), even their size. Simple mass distinguishes hardcovers from paperbacks. You don't read books interchangeably, because you don't hold them identically.

My early Kindle sanded all such distinctions away. Everything I downloaded had the same page size, line spacing, font, smell. Only the content separated one book from another. Before long, I found myself using my Kindle to download books where information mattered but aesthetics didn't, or where timeliness held a premium. Where I hoped to enjoy a book, I bought (and still prefer to buy) the paper book.

But it's difficult to resist tech's forward march. I bought a smartphone, not for calling or texting purposes, but for the streaming media players, so I could enjoy music beyond the uniform corporate blandness broadcast over small-town airwaves. Then I connected Amazon streaming video to my phone, because why not. Then I saw a new novella advertised by a favorite author, for some ridiculous price in hardback, or three bucks on Kindle.

And kaboom, I downloaded the free reader app.

Because sometimes the content really does matter. Because sometimes getting the words before your eyes, without counting the price of dead trees or burning hydrocarbons, makes a difference. Because my desire to open myself to other people's experience through the medium of words trumps my desire for ideological purity.

I still prefer paper books, just as I still prefer typing on an external keyboard (autocorrect is driving me nuts). And I'm not alone. Shlomo Benartzi cites studies going back to the 1980s showing that readers who read off screens understand less, recall less, or even finish less than when reading off paper. Back during those Apple IIe days, researchers assumed low-quality 8-bit graphics impeded comprehension; but the deficit persists into the present, when many smartphone and tablet screens have better graphic resolution than most paper books.

Benartzi attributes this continued retention deficit to modern screens being too easily readable. When our eyes move fluidly over the content, we invest too little effort to trigger memory retention. I disagree. As Marshall McLuhan wrote, the medium is the message; books and screens are just different. When I turn paper pages, the content remains, but turning a digital page returns the content to the primordial byte soup from which it emerged. I pre-consciously believe this, even if my rational brain knows otherwise.

That's why my new tablet will never replace books: they do different things. They create distinct experiences in distinct ways, with divergent effects on my life. My laptop's word processor, my old IBM Selectric, and my mom's manual Remington all do the same thing in different ways. My tablet and my paper books enter my brain differently, to different consequences.

Nevertheless, I've come to appreciate the virtues of networked access. My tablet isn't a substitute for books, but a parallel. Networked access may lack books' aesthetic pleasures, but sometimes the trade-off is acceptable. Advancing technology needn't mean leaving tradition behind, just expanding the choices available moving forward.