Friday, December 29, 2017

Chaos Theatre and the Great American Comedy Renaissance

Sam Wasson, Improv Nation: How We Made a Great American Art

Improvisational theatre began in Viola Spolin’s workshops, grounded in her theory that humans have their most authentic, open interactions during play. Spolin moved to California, turned her theatre games into an actor training program, and produced several storied actors. But the real magic happened when Spolin’s son, Paul Sills, took her theatre games to the University of Chicago. There a strange maelstrom of talent created a new form of theatre.

Sam Wasson, a sometime performer himself, has written four previous books about American performing arts. Until now, he’s focused on single personalities, like Bob Fosse or Audrey Hepburn. Here, he turns his mixed artistic and journalistic background onto an artform, improv theatre, which emerged, phoenix-like, from the moldering corpse of post-WWII theatre. American-made performance was dying, but improv breathed new life into it.

In Chicago, Paul Sills met several personalities longing for something new, something revolutionary. These included several still-famous performers, like Mike Nichols, Elaine May, and Severn Darden. Others, including Del Close and David Shepherd, are known mostly to other theatre professionals. And that revolutionary zeal wasn’t metaphorical; many early improvisors believed they could overthrow the capitalist patriarchy and rebuild society by simply being authentic.

Sadly (or not), they discovered, as revolutionaries do, that capitalism is remarkably elastic. Several offspring of Sills’ original vision, including the Compass Players and Second City, became money-making enterprises when they discovered an untapped market for genuine, unplanned laughs. Soon, performers who paid their dues doing improv became stars of the scripted circuit, like Alan Arkin and Barbara Harris. “Legitimate” theatre began adapting to improv timing.

Sam Wasson
Starched-shirt theatre historians claim improv has roots going back to Italian commedia dell’arte. Wasson blows his nose on this. Italian commedia actors worked without a script, but they played single stock characters, repeating stock speeches, in scenarios audiences knew by heart. American improv was something new, something dangerous: every performance creates itself from nothing. (Sometimes literally dangerous: in countries with speech laws, letting actors extemporize their lines courted prosecution.)

But Wasson also notices patterns developing. Improv began as anti-capitalist theatre, but became so in-demand that prices skyrocketed. The satirical edge became so beloved that public figures relished getting skewered, rather than fearing it. Improv has long struggled to maintain a legitimate edge, and whenever it risks becoming safe, it requires innovators to blast the artform from its comfy confines. It appears to need this kind of rescuing a lot.

And the rescuers often aren’t stable people themselves. Improv innovators like Del Close, Bill Murray, and Chris Farley have repeatedly breathed new life into unscripted performance, sometimes through sheer force of personality. But these personalities are also frequently self-destructive, craving new experience at any cost. The qualities that make improvisors fascinating performers often make them dangerous human beings. No wonder so many have a tendency to die young.

The reciprocal relationship between improv and “straight” performance apparently fascinates Wasson. In the 1960s, many famous improvisers became more conventional, scripted stars: Mike Nichols turned Second City alum Dustin Hoffman into a star with The Graduate. Since the 1970s, improv has funneled its best performers into TV shows like SCTV and Saturday Night Live. It’s almost like “straight” performance needs the vitality that only improv provides. And improvisers need “straight” paychecks.

Wasson doesn’t write a how-to for improv comedy. Such books already exist, in numbers too massive to sift. Instead, he writes a love song for an artform that strives to keep American theatrical performance from drifting into comfy passivity. In Wasson’s telling, improv prevents other performance forms from settling into safe, commercial ruts. But now, improv itself is a commercial enterprise. As so often in the past, the artform’s future is up for grabs.

Early on, describing the love-hate relationship that drove Nichols and May, Wasson writes one of the truest things I’ve ever read about performance and theatre: “Say you meet someone. You like something about them and they like something about you. Your mutual interest begets mutual play. Play begets cooperation and mutual understanding, which, trampolined by fun, becomes love. Love is the highest form of play.”

As a sometime actor myself, I appreciate this thought. We who perform spend tremendous efforts trying to help our audiences have genuine experiences. Maybe we don’t destroy ourselves like Del Close, or burn out like Elaine May, but we know the value of sacrifice. And we do it because we love our audiences, our fellow performers, our art. Improv gives performers the liberty to simply exist. And that is beauty indeed.

Wednesday, December 27, 2017

Crime in the Dark Heart of Space

Chris Brookmyre, Places in the Darkness

Ciudad de Cielo, humankind’s first permanent orbital civilian settlement, is a picture of responsible productivity. Industrious, thriving, with a robust native economy and (officially) low crime rate, it has qualities humanity’s sprawling, overpopulated Earthside cities desire. Its new security head, Alice Blake, thinks she’s taking a rubber-stamp job, until her first day, when someone takes a potshot at her. But even that pales when, almost simultaneously, CdC gets its first official murder.

Probably no genres are more innately tied to the times in which they’re written than crime and science fiction. That makes reviewing them particularly difficult when an author clearly hasn’t kept up with current developments in either genre. British crime writer Chris Brookmyre has clearly been reading vintage Isaac Asimov and Arthur C. Clarke, because his science fiction looks around thirty-five years out of date. His buddy-cop narrative isn’t much newer.

Not yet formally installed in her position, Alice Blake goes undercover to investigate this murder alongside Nikki Fixx. Ex-LAPD, Nikki is manifestly corrupt, running multiple shakedown operations which keep her well-paid while keeping CdC’s official crime rate low. She’s also the city’s only security officer with experience running a murder investigation. Which makes it puzzling when she sleepwalks through the most desultory investigation since Raymond Chandler started phoning them in.

This isn’t accidental. Nikki has a “round up the usual suspects” attitude toward crimefighting, colored by the fact that she’s in neck-deep with the criminals she supposedly busts. Alice, in her undercover persona, repeatedly leans on Nikki about the importance of law. Nikki, meanwhile, uses object lessons to show Alice how only someone who thinks like criminals can successfully enforce the law. Veteran noir audiences know how this debate ends.

Chris Brookmyre
Ciudad de Cielo, CdC in official parlance, “Seedee” on the streets, is a careful balance of rational ingenuity and moral compromise. We know this because Brookmyre repeatedly stops the narrative to tell us. Brookmyre is a master explainer, pausing his story to explain how subsurface maglevs work, how the space elevator brings everything to Heinlein Station (!), and how food and other necessities get processed in deep orbit.

Unfortunately, Brookmyre’s science and technology are as pointedly dated as his noir. I keep noticing pieces I recognize from reading authors like Robert Silverberg, Orson Scott Card, and Pat Cadigan back in the 1980s. His “dirty streets” pop psychology is equally dated. He fronts multiple, long-winded explanations why, even in space, people skirt the law and indulge their Freudian impulses. “Feed the beast” looms large.

Nevertheless, I could accept Brookmyre’s dated sci-fi and his sketchy postwar ruminations on human nature, if he’d put primary emphasis on his story. When he permits events to happen, they generally happen with a reasonable degree of punch. We get glimpses into the vulgar undercarriage of space exploration, the Mos Eisley of Low-Earth Orbit. Repeatedly, I feel something starting to happen, and get excited for genuine genre-bending crunch.

Then, whoomp, Brookmyre interrupts again. If he isn’t lecturing audiences directly through his third-person-limited voice, his characters lecture one another. At one point this happens literally: an important supporting character, a university professor, delivers an actual TED talk about the nature of consciousness, an important theme in a society where neural implants are a career necessity. The lecture pinches, almost verbatim, from Sam Harris and Daniel C. Dennett.

While Brookmyre continues frontloading such information, I spend time trying to identify which sources he plunders for influence. I’ve name-dropped several already; you may spot others. The mere fact that he sets important events on a platform named Heinlein Station bespeaks just how dated his sources are. The events on that platform, incidentally, include members of a crowd going around a room introducing themselves. Talk about low tension.

I don’t mind Brookmyre’s familiarity with genre history. I myself still enjoy vintage Asimov and Silverberg. But dated references come so close upon one another, it becomes questionable whether Brookmyre has read any sci-fi more recent than Neuromancer. Science fiction and crime fiction, both dependent on current technology and social mores, usually evolve very quickly. If your cultural references are outdated, your audience, familiar with the tropes, can tell.

Shame, really. Because Brookmyre’s reputation as a “Tartan Noir” author precedes him, I expected something more earth-shattering. I expected something that would propel both genres he blends into new, adventurous territory. Instead, this pop-culture porridge will only interest readers who aren’t particularly familiar with either genre. Veteran audiences will get distracted spotting Brookmyre’s sources. There’s little else to hold our attention.

Monday, December 25, 2017

The Economics of Frank Capra's “It's a Wonderful Life”

James Stewart and Donna Reed in It's a Wonderful Life

The sentiment and spirituality of Frank Capra’s 1946 classic It’s a Wonderful Life have made it a Christmas season staple. Audiences love seeing James Stewart’s transition from bleak, snowbound despair to joyous Christmas exuberance, because most of us have endured such extremes in our own lives. Less widely considered, though, is the economic philosophy supporting the story. And it provides a hopeful blueprint for America’s future.

George Bailey (James Stewart) has many conflicts in this movie. Urban dreams versus small-town realities; goals for the future versus obligations to the past; individual autonomy versus marriage and family. But for our purposes, the most important conflict pits the Bailey Building and Loan against Henry Potter (Lionel Barrymore), who personally controls the bank, apparently making major decisions from his oaken desk.

The difference between Bailey and Potter couldn’t be starker. The Building and Loan gives applicants money enough to build their own homes, which they can own freely and build equity on. As Peruvian economist Hernando de Soto observes, in the United States the most common source of funds to launch a new business is a mortgage on the owner’s home. So a house admits its owner into the ranks of economic mobility.

Mr. Potter, by contrast, keeps a tight hand on money. He gives out loans sparingly, lecturing young George Bailey on the moralistic importance of cultivating a strong work ethic. If we make access to credit as difficult and laborious as possible, the reasoning goes, people will work harder and with greater diligence for everything they have. If, while they’re building credit, they have to live in Potter’s dilapidated rental properties, that’s surely an ancillary issue.

Lionel Barrymore in It's a Wonderful Life

If we see this entirely in Cold War terms, it’s easy to mistake Capra’s economic ideals. Someone, whose name is blocked out in released documents, used It’s a Wonderful Life to prove that Frank Capra was a secret Communist. If Capra pitches Potter, the primal libertarian capitalist, as the film’s villain, then clearly he must be a dirty pinko. Since the world clearly only exists in black and white.

Except that in mocking the villainous capitalist, Capra doesn't reassign authority to the state. The Communist takes power from citizens, and concentrates it in the hands of the state, or an organization that operates with state-like power. G.K. Chesterton noted, nearly a century ago, that for most people, the difference between capitalist and communist society is vanishingly small. The choice between living under state-controlled or privately managed bureaucracy is no choice whatsoever.

Given the Cold War dichotomy between The Market and The State, it’s easy to mistake It’s a Wonderful Life for Communist propaganda. Except George Bailey doesn’t arrogate power to the state; he distributes power into the people’s hands. Not that famous socialist canard, The People, but ordinary people, who have jobs and will use house-building money to improve their situation and join the upwardly mobile ranks.

“I feel that in a small way we are doing something important,” says Pa Bailey in the movie. (James W. Rodgers’ stage adaptation, a community theatre staple in which your humble commentator recently performed, reassigns this speech to Uncle Billy.) “Satisfying a fundamental urge. It's deep in the race for a man to want his own roof and walls and fireplace, and we're helping him get those things in our shabby little office.”

In other words, owning one’s own house, knowing one has freedom to improve or trash or mark up and sell one’s nest, gives one power. Mr. Potter, in attempting to own everything in town (he’s specified as owning the bank, department stores, and bus line), wants to concentrate power over the entire distribution chain in his own hands. The Bailey Building and Loan distributes that power onto those most immediately affected by relevant decisions.

Jeff Ensz, left, as George Bailey, and your author, Kevin L Nenstiel, right, as Clarence
Odbody, in the 2017 Kearney Community theatre production of It's a Wonderful Life
This system isn’t perfect. Recent economic changes have seen populations engage in flocking behavior, making basic resources like housing harder to come by. The house Giuseppi Martini purchases in the film, an austere cottage in Flintridge, California, last sold in 2003 for $745,000; Zillow estimates it’s now worth twice that. Even George Bailey’s distributed system needs offsetting influences.

But a decentralized economy provides that. The system we have now, which concentrates wealth into fewer and fewer hands, while crippling citizens’ ability to organize countervailing influence, rewards Potter-like behavior. Claiming Potter is okay because he isn’t the state is fatuous. Concentrated wealth, public or private, almost always precipitates abuse. George Bailey simply provides citizens the ability to join the system.

Friday, December 22, 2017

A Short Handbook for Confronting Liars

Bruce Bartlett, The Truth Matters: A Citizen's Guide to Separating Facts From Lies and Stopping Fake News In Its Tracks

Reality exists. This wasn’t always a controversial thesis; people of good conscience could disagree about how to interpret facts, or how to situate facts in a larger narrative, but debates about what constituted an actual fact stayed in Platonic discussions, where they belonged. But the increasing decentralization of news-gathering has separated citizens from objective reality, and we’ve “elected” a president who openly disdains facts. How can engaged citizens reclaim the factual landscape?

Bruce Bartlett started as an economic advisor to the Reagan Administration, and later moved into a lucrative conservative think-tank career. But he quit movement conservatism during the George W. Bush years, when, he claims, his former party became unmoored from reality. The current climate of buffet-style news sourcing, coupled with polarized voters seeking confirmation of their prejudices, raises his scholarly hackles. He wants you, the voter, to care about sources as keenly as he does.

This book offers a pocket-sized primer on source checking and factual verification in news. Some of it is very basic: does more than one source agree that something happened? Does an online source use links to verify claims of fact? Other pointers get more technical. Readers need facts in context rather than fact dumps, for instance. Journalists who use orphaned statistics and unsourced quotes are probably actively confusing the issues, rather than clarifying them.

Journalism is a profession, with its own practices and conventions. Though chin-pullers like me have lamented the professionalization of journalism, this has nevertheless helped working reporters build practices of internal verification and fact-checking into their business. But the conventions also obscure core practices in ways we can’t always see, a fact fake journalists use, judo-style, against us. A 101-level familiarity with journalistic practices among news audiences could open our eyes to objective truth versus baloney.

Bruce Bartlett
Technology has complicated the news distribution business. As 2016 proved, inflammatory lies travel quickly, and polarize the electorate. But the same technologies that spread lies, also make citizen fact-checking more possible. Much that turned the 2016 election into a national shitshow, like Pizzagate or false quotes attributed to either candidate, could be debunked through the Internet. Though straight Google searches often return widespread lies, more specialized tools, like Google Scholar or ProQuest, provide verifiable facts.

And some of Bartlett’s pointers involve readers acting in reflective, informed ways. One major source of fake news has been ordinary people resharing specious, unsourced claims because they confirm pre-existing biases, or because a clickbait title inflames passions. Bartlett asks us to ensure the “facts” we use, and distribute, aren’t wrong. But that requires us to periodically check and make sure we aren’t wrong. A good debater first assumes: “I should double-check myself, too.”

Bartlett provides here a brief introduction to research techniques used in social sciences. Though he doesn’t get into the premises of original research (contact your local graduate school about that), this volume, sized to fit most purses and blazer pockets, provides up-to-date guidance in checking secondary sources and existing fact databases. Instructors could incorporate this book into high-school and college courses in journalism, political science, history, economics, and even behavioral psychology. One could hope, anyway.

This entire enterprise, admittedly, assumes readers want to screen reports to determine facts and their context. In other words, Bartlett’s fact-checking process requires people engaged enough to spend time researching instead of passively watching TV. This isn’t a foregone conclusion; some people enjoy shouting others down, savoring the schadenfreude of hurting others and making them feel stupid. I wish Bartlett had included a chapter on knowing when it’s time to cut provocateurs loose.

Not that Bartlett doesn’t have occasional partisan lapses. Early on, he describes Fox News as a centrist alternative to left-leaning mainstream news. This description would surprise Rupert Murdoch, who, in an interview given about the time this book debuted, described his baby as “conservative.” Bartlett also lumps satirists like The Daily Show and other late-nighters, who broadly lean left, into “fake news,” which is specious, since satirists admit their principles. Liars and jokers aren’t interchangeable.

This only proves that even scholars and fact-checkers need a skeptical eye occasionally. Informed readers should never be passive, because we’re already part of that reality which, as I noted, exists. Reading the news should change us, but we should also challenge the news. The relationship is reciprocal, and requires constant testing and refinement. That’s what’s missing in the “post-truth era.” The techniques Bartlett provides are slow and unglamorous; they’re also real, which is what we need.



On a related theme, see also:
A Short Handbook for Confronting Dictators

Tuesday, December 19, 2017

A New Form of Hollywood Suicide

Christopher Meloni in a promo
poster for Syfy's Happy!
Syfy’s new comedy-thriller Happy! commences with a shot of somebody vomiting blood into the ice chips in a rust-stained urinal. We naturally respond with revulsion but, like rubberneckers at a traffic accident, we cannot help watching. The camera pulls back to reveal who just upchucked, and zOMG it’s Christopher Meloni! Stabler from SVU! Wait, how did clean-cut, down-at-heels Detective Stabler turn into a burned-out contract killer systematically frying his liver in lieu of suicide?

When movies first became an industry in the late 19th Century, production houses (which were mainly technology laboratories, like those of Edison and the Lumières) resisted the star system which dominated theatre. They wanted individual films, which were frequently under five minutes long, to exist as freestanding enterprises, without relying on personalities, who could quickly overshadow the production. It’s easy to understand their fear now: top-grossing actors like Tom Hanks and Scarlett Johansson become lucrative genres themselves.

Television exaggerates this trend. Film actors like Julia Roberts or Christopher Walken cultivate a marketable type, but TV actors invest in a character, sometimes for years. Then they shift to another network, another genre, or another franchise, bringing that character’s accumulated baggage with them. Actors closely associated with famous roles have difficulty shaking them; will William Shatner ever have another role that doesn’t, at least implicitly, carry shades of Captain Smirk? Of course he won’t.

Actors have traditionally had two options after leaving iconic characters. Some resist being typecast, though this usually ends in disaster. Leonard Nimoy’s I Am Not Spock phase, when he tried to recapture his heady youth in Westerns and crime dramas, came to an abrupt end when he realized he couldn’t shake the character without compromising his market viability. Others gleefully embrace their typecasting. Bill Cosby milked the Cliff Huxtable market for years before submarining himself.

Christopher Eccleston in Heroes
But recently, some actors have identified a third option. They can actively kill their personas while winking at the audience. I first noticed this with Christopher Eccleston. Though he’d played quirky anti-establishment characters for years, he probably secured his stature in the nerd pantheon as the first actor to inhabit the title role in the reëstablished Doctor Who, in 2005. However, the series proved an awkward fit, and Eccleston left after only one year.

After spending 2006 largely in private, Eccleston reappeared in 2007 on NBC’s breakout hit-turned-dumpster fire, Heroes. This series included several iconic genre veterans, especially George Takei (Sulu) and Nichelle Nichols (Uhura) from Star Trek. But where those two genre demigods played jokey, winking versions of themselves (Takei’s license plate read NCC-1701, the Starship Enterprise’s designation), Eccleston played a drunken, washed-out ex-hero who’s given up on humanity. He played the Doctor, after the Doctor stopped caring.

Like Eccleston’s washed-out hero, Christopher Meloni’s character, Nick Sax, has become an alcoholic, stopped shaving, and abandoned his ideals. Eccleston’s character sleeps in a cardboard box; Sax is implicitly homeless, literally and metaphorically adrift in Manhattan’s unscrubbed underbelly. He accepts contract assassination jobs, not because they pay well or because he has any loyalty to organized crime, but because it’s the only thing left that makes him feel alive. Nick Sax is the polar opposite of Elliot Stabler.

Nor is this comparison accidental. The show takes remarkable efforts to assure us Nick Sax used to be an exemplary cop. We see a New York Daily News cover identifying Sax as “Supercop.” His former disciple, Detective Meredith McCarthy, greets him by saying “This piece of shit used to be the best cop on the force.” A prostitute he once busted remembers him as gentlemanly, even fatherly, coaching her to get her life on track.

George Takei in Star Trek (left) and Heroes (right)

These descriptors all apply to Elliot Stabler. Detective Stabler lasted fourteen years in the sex crimes division, a hitch most cops rotate out of every two years, because his compassion for others exceeded his fears for himself. Though sometimes ruthless, with a make-do commitment to professional ethics, Stabler’s clearance rate would make most supercops look lazy. As characters heap praise on Nick Sax’s former career, we realize, Nick Sax is Elliot Stabler’s shambling, undead corpse.

Meloni takes a bold risk here. If Happy! lasts into January, he’ll effectively kill Elliot Stabler, and any chance of returning to the Golden Ticket that made him famous. But it also provides an opportunity many once-notorious TV actors never receive: an opportunity to put his star-making role behind him forever. Only time will tell whether this gamble succeeds. Until then, let’s marvel at a successful actor metaphorically stabbing his hard-earned stardom in the face.

Friday, December 15, 2017

A Short Handbook for Confronting Dictators

Timothy Snyder, On Tyranny: Twenty Lessons from the Twentieth Century

The American President came second in the popular vote. The current British government garnered the support of barely a third of the electorate in the last general election. And Russia hasn’t had a truly free election since Boris Yeltsin. Couple that with appallingly low readership for legitimate newspapers, record highs for passive venues like aggregator websites, and stunning disinterest in voting among the young, and you have a recipe for government by tyrants.

Timothy Snyder, a Yale historian specializing in Europe around and between the World Wars, has made a career of identifying the social forces which made Fascism and Stalinism possible. He isn’t the first to find correlations between that era and the current international scene. However, he uniquely distilled that era’s lessons into a brief handbook which audiences have enthusiastically embraced. And it’s easy to see why.

Some of Snyder’s pointers seem obvious when confronting illegitimate or autocratic governments. “Beware the one-party state.” “Contribute to good causes.” “Be wary of paramilitaries.” These all reflect very true conditions in a state where the ruling party actively strives to make the opposition seem criminal, strips funding from good works, and keeps an off-the-books security force. These would be truisms, if they weren’t so frequently ignored.

Other points seem less obvious. “Make eye contact and small talk”? Don’t we have better things to do than have conversations when world history is at stake? Not necessarily. If we fight for global causes, but don’t actually know our neighbors, it becomes easy to make immoral compromises that sacrifice the lives of apparent strangers. Similar reasoning applies to “Establish a private life” and “Remember professional ethics.”

Timothy Snyder
And a few pointers seem downright counterintuitive. “Stand out”? Doesn’t the tallest weed get cut down first? Maybe. But while tyranny originates from above, deriving its power from unelected or semi-legal means, Snyder insists tyranny perpetuates itself mainly through citizens who conform, who go along to get along. Resisting tyrannical governments, protecting the institutions that make democracy possible, requires people who think freely, and act on those thoughts.

This is a recurring theme in Snyder’s analysis. Mindless conformity enabled leaders like Stalin and Hitler to consolidate their control, as citizens followed the path of least resistance so they could continue making a living. Then, when leaders used their consolidated control to annex Austria or collectivize Ukrainian farming, nobody could stand against them; all avenues of resistance had been swallowed by the desire to not make waves.

Snyder contrasts this conformist thinking with powerful non-conformists. Winston Churchill dismissed calls from both sides to make peace. Teresa Prekerowa saved families from crackdowns in the Polish ghettos. Václav Havel distributed samizdat literature that fired anti-communist resistance when centralized governments became too powerful. Non-conformists made resistance possible at times when standing up to autocrats seemed pointless and self-defeating.

Tyranny, after all, isn’t inexplicable. Snyder notes in his introduction that “Both fascism and communism were responses to globalization: to the real and perceived inequalities it created, and the apparent helplessness of the democracies in addressing them.” This similarity should chill anyone following current politics. The election of demagogues like Donald Trump and Theresa May reflects social conditions much like those from which Twentieth Century tyrants profited after the Gilded Age.

The ultimate resolution may be similar.

Most important, the theme permeating this book involves holding to principles of truth and reciprocity. Tyrants tend to govern by force of personality, adhering to principles of self-advancement and “nature red in tooth and claw.” Maintaining the structures of civil society requires organized dissidents linked by a moral foundation and a belief that human beings, individually and collectively, are worth fighting for. Which, thankfully, we have seen on the ground.

If I have one point of disagreement with Snyder, it’s his lack of source notes. Throughout this book, he cites from history, name-drops theorists like Hannah Arendt and Victor Klemperer, and generally quotes a grab bag of luminaries who lived through or commented upon modern technocratic tyranny. The intellectual-minded among us might enjoy delving into his sources. If things get worse before they get better, which seems likely, we need prepared responses.

Throughout, Snyder makes repeated references to “the president” and to current leaders. However, he carefully avoids proper nouns when calling out personalities. He clearly refers to Donald Trump, especially when suggesting that midterm elections might get suspended (a common scare tactic among whichever party lacks power). However, it bears noting that, in 2016, both major parties floated top-of-the-ticket candidates with noted authoritarian tendencies. Tyranny should be a non-partisan issue.

Tuesday, December 12, 2017

The Liberal Arts Are More Important Now, Not Less

A famous portrait assumed
to be Christopher Marlowe
Christopher Marlowe’s play Doctor Faustus opens with an illustrative scene. Newly minted in his doctorate, Faustus chooses several books from his shelves, takes a seat in his study, and… picks his academic specialty. He decides on sorcery only after examining, and discarding, his era’s three legitimate academic fields: Law, Medicine, and Divinity. Remember, he does this after already achieving his doctoral degree.

I recalled this scene while reading Nina Handler’s lament, “Facing My Own Extinction,” in the Chronicle of Higher Education online. Handler, coordinator of English at Holy Names University in Oakland, California, looks both forward and backward as she makes peace with her school’s abolition of the English major. Her school, she laments, has become a preprofessional training seminar, where any class without a career payout gets dismissed as unnecessary.

It’s tempting to defend English on transcendental ideals: that people who read develop greater empathy, or that literature potentially immunizes democracies against tyranny. But these arguments basically persuade the already persuaded. The educational reformers aggressively pushing STEM-focused curricula, subsidized by captains of industry and legislators desperate to not look backward, won’t hear such arguments, because they already disbelieve or dismiss them.

Instead, let’s contemplate the very material rewards that come from a diverse liberal arts curriculum. Two authors I’ve recently reviewed, Christian Madsbjerg and Scott Sonenshein, both write that business success and diverse education go hand in hand. Both document something I’ve noticed myself, though they have better source notes: specialists know how to do one thing well. But they’re incapable of adapting to changing economic, business, or professional conditions.

Christian Madsbjerg
The simple ability to read and understand multiple genres provides one recourse against such inflexibility. By simply stepping outside ourselves, our narrow range of experiences and specialized training, we learn ways of thinking that keep our minds active and developing. If we can wrap our heads around Gilgamesh, Hamlet, Elizabeth Bennet, and Bigger Thomas, we can also handle economic downturns, changes in job-related technology, and evolving moral values.

This isn’t only about English. In History, for instance, certain facts are objectively correct, and therefore testable on a Scantron sheet. The last successful invasion of England happened, objectively, in 1066. But why? What made William the Conqueror able to sack England and take the throne, while Hitler’s Operation Sea Lion, much more thoroughly planned and backed with more advanced technology, got abandoned once the Luftwaffe failed to win air superiority?

Don’t mistake this for abstruse woolgathering. America has a president who thinks Andrew Jackson could've posthumously prevented the Civil War, and an administration that thinks “the lack of an ability to compromise led to the Civil War.” These people, with the ability to guide the economy or take America to war, demonstrate a palpable lack of awareness about the weighty economic, social, historical, and military factors that shaped history, and continue shaping the present.

Perceptive readers will notice I haven’t really made a concrete argument for the English major specifically. Despite name-checking literary characters and historical events, I’ve made a broader argument for a diverse liberal arts education. You’re right. Locking youths into an academic program, and its resulting career path, at age 18 seems ridiculous to me. Like Faustus, they should choose their specialization only after gaining a diverse general foundation.

A handful of colleges and universities have followed this path. St. John’s College of Annapolis and Santa Fe comes to mind, as do Thomas Aquinas College and Reed College. These schools have softened or abolished undergraduate major departments, focusing on a diverse grounding in humanities and sciences. By contrast, Holy Names and other schools not only lock students into career tracks, they’re narrowing the number of tracks available.

Scott Sonenshein
Business, citizenship, and just plain humanity require a diverse grounding in the humanities. This includes language arts and social sciences, but also mathematics and physical science. Our society suffers a plague of specialists today. It’s easy to point fingers at lawyers who only know law, or businessmen who only know business. But I’d also include journalists who only attended journalism school, and bureaucrats who have spent decades only in government.

Remember Doctor Faustus. Having chosen his discipline for its earthly rewards, he gets the knowledge he seeks quickly. But he almost immediately declines into a shadow of himself, conjuring famous shades for kings like a carny barker, and playing horse pranks on foolish yokels. At his death, even God won’t take him back. Because knowledge isn’t supposed to garner worldly payouts; it’s supposed to create a full and rounded soul.

Wednesday, December 6, 2017

When Did Americans Become Afraid of Due Process?

John Keats
English poet John Keats wrote about a phenomenon he called “negative capability.” Some people are such advanced thinkers, Keats supposed, that they can hold two contradictory ideas in their heads simultaneously, without reaching for a facile resolution. Simple minds need everything to equal out, but according to Keats, complete contradictions don’t faze superior minds. Which makes sense hypothetically, but as we’re seeing, it makes bad policy in practice.

The recent rash of public excoriations for sexual harassment in high places casts important light on how contemporary society perceives women as subordinate to men’s desires. While the shakeout at the peaks of entertainment, journalism, and politics has garnered the most attention, the popular #MeToo uprising demonstrated that harassment permeates all levels of society. Though powerful people probably harass disproportionately (as The Atlantic writes, Power Causes Brain Damage), this is a women’s issue, not merely a power issue.

As a person who prefers to side with the powerless in social issue debates, I’m glad to see powerful people getting their comeuppance. Men like Harvey Weinstein and Kevin Spacey have had make-or-break power over aspiring young artists’ careers for so long, they’ve forgotten what it means to be hungry, or what we’ll do to assuage that hunger. The mere fact of a decades-long, well-loved career, like Matt Lauer’s or Garrison Keillor’s, shouldn’t shield anyone from criticism.

(I say “men” because the accused have mostly been men. At this writing, singer-songwriter Melanie Martinez has become the highest profile woman thus accused, though less than forty-eight hours in, it’s impossible to say how her story will shake out. And the occasional woman doesn’t counteract the statistical preponderance of men in this meltdown.)

So yes, I’m happy to see powerful abusers brought low. Except…

Matt Lauer
I’m not the only one to notice this meltdown has largely jettisoned due process. People who have dedicated years, sometimes decades, to difficult careers are seeing them torpedoed by accusations, sometimes only one accusation. Dozens of women came forward criticizing Harvey Weinstein. Between NBC and Variety, we have four corroborated accusations against Matt Lauer. Garrison Keillor has only one accuser—and details have been frustratingly sparse.

Don’t mistake me. Most of these accused carry a palpable whiff of guilt around them. Besides Louis CK’s frank confession, the mumbling non-denials we’ve heard from Spacey and Weinstein indicate they know they did wrong, and cannot honestly deny it. But even American courts will tell you, a confession isn’t binding without corroborating evidence. Even when the accused confesses, due process must happen for a conviction.

The Weinstein Company board fired their namesake after days of deliberation. But if NBC’s official statement is credible, less than thirty-six hours passed between the first corroborated accusation and Matt Lauer’s firing statement being read aloud on-air. (That statement, incidentally, is not credible; Lauer's voracious sexual appetite was known for years.) The pace of firings appears increasingly hasty; Minnesota Public Radio still hasn’t commented, beyond vague generalities, about Garrison Keillor.

It’s difficult to avoid comparisons with McCarthyism. Though popular Hollywood personalities like Dalton Trumbo and Ring Lardner, Jr., were never convicted of or confessed to Communist sympathies, the studio system successfully spiked their careers for decades. Some never had their reputations restored. The mere fact that Trumbo was, indeed, a Communist, and Lardner showed strong leftist sympathies, doesn’t excuse the way they were railroaded.

The enduring popularity of courtroom dramas like Perry Mason might explain why we’re reluctant to permit harassers due process. These shows create the impression that trials exist to exonerate the falsely accused, since the police consistently arrest the wrong person first. It’s clear these accused aren’t the wrong people, especially as Matt Lauer, like Louis CK, has confessed and sought forgiveness. Even he believes he’s guilty.

Harvey Weinstein
But we have trials to protect the rights of the guilty, too. The slow, deliberative process ensures we don’t respond from panic, rage, or fear. America has a history of using false, or specious, accusations to railroad the accused, often with tragic consequences. That’s why we entrust law enforcement to state agents, not private vigilantes. Especially when accusations appear quickly and vigorously, like right now, we must slow the conviction process.

It’s possible to acknowledge, like Keats, that these accused sure look guilty, but also that they deserve due process. Indeed, arguably, the guilty deserve a full hearing more than the innocent. Both the accused and the accuser deserve to be heard. And that’s not happening right now. This cuts to the heart of our belief in ourselves as a just people.

Monday, November 27, 2017

Growing Up in a Land of Priests and Martyrs

1001 Books To Read Before Your Kindle Battery Dies, Part 85
Marjane Satrapi, Persepolis: the Story of a Childhood, and Persepolis 2: the Story of a Return


Marjane Satrapi was ten years old when the ayatollahs took over Iran. Formerly educated in a French-language school with classrooms integrated by gender, she found her life uprooted and turned sideways. Teachers who, one year, told students the Shah was chosen by God, the next year told students to rip the Shah’s photos from their textbooks. Who can blame her for developing a drifting, nihilistic view of life?

Satrapi’s memoirs, written in graphic novel form, first appeared in French in four volumes at the height of the 1990s “comix” craze; they were later reprinted in English in two volumes. (A one-volume edition exists.) Like most comix artists, Satrapi embraces an auteur mindset, with a single writer-artist and minimal editorial influence. This permits an introspective, deeply personal approach to telling her own story.

Perhaps the most important theme in Satrapi’s memoirs is the contrast between her family’s secular, Westernized upbringing, and the increasingly repressive, theocratic regime. Before the Islamic revolution of 1979, Satrapi’s parents participated in anti-Shah marches, believing the eventual revolution would be primarily Marxist in nature. Imagine their shock when the ayatollahs became the revolution’s driving force, and eventual ideological captains. Like many, their sense of betrayal was palpable.

Not that their Western ideals exclude Islam. As a child, Satrapi believes herself a prophet, in a lineage with Zoroaster, Jesus, and Mohammed. She has intimate conversations with God to understand her confusion. Later, as an adult, she quotes the Koran fluently when religious police attempt to squelch her voice. But the Satrapis’ religion doesn’t yoke them to the past. History, for them, is a march toward secular democracy.

This battle between secular and theocratic mostly happens behind closed doors. Satrapi’s parents attend parties where everyone drinks homemade wine, wears neckties, and dances to American rock, emblems of Western excess. On those rare occasions where Iran opens its borders, they smuggle in posters and cassettes of Marjane’s favorite American heavy metal artists. But they also hang blackout curtains and bribe cops, because the state encourages snitching on one’s neighbors.

Marjane Satrapi
Later, as an adult, Satrapi studies art in Iran’s state-run universities. But Iran’s draconian modesty codes mean that, in life drawing class, the models must wear massive, billowing gowns. Satrapi organizes illicit after-dark classes where peers get to draw tasteful human nudity, in the Renaissance style. Her demands evolve commensurately: where once she bought bootleg American music, she graduates to bootleg American contraception.

Satrapi’s two-dimensional, black-and-white art, consistent with the comix movement, permits readers to see Iran through her eyes. We can see clashing crowds of protesters and counter-protesters without her having to write long-winded descriptions, and her flat, cartoonish art reveals how screaming ideology strips everyone of individuality. Later, she uses cutaway reveals to expose, say, how women dress beneath the veil, how they express individuality in a state that demands conformity.

It’s possible to read Satrapi’s memoirs as moments in Iranian history; that’s how they’re often marketed. A nation’s struggle to overcome its past requires it to decide what future it wants to embrace. Satrapi’s liberal, educated family embraces the homo economicus model, believing an Islamic version of rational humanism will inevitably overtake the country. They simply don’t anticipate the confidence that religious conservatism promises people who feel dispossessed.

But like the best classic literature, Satrapi’s memoir is fundamentally about its audience. As Marjane first witnesses her parents’ collisions with religious authority, and later embraces such conflicts herself, it’s impossible to avoid noticing that both sides wear ideological blinders. Satrapi uses absolutist thinking to confront absolutist religion. How often, we wonder, do we ignore our own absolutism? What sacred cows do we refuse to sacrifice, and not even notice?

Satrapi sees the world in black-and-white because, essentially, she’s a child. As the story progresses, her art becomes more sophisticated and fully dimensional, because she herself becomes a more sophisticated soul. She loves her people, and when her parents ship her to Europe as a child for safety, she returns as an adult. But eventually, even that collapses under state pressure. To remain human and sane, she has to leave.

In some ways, Satrapi retains her childhood aspirations to prophethood. As Jesus said, a prophet lacks honor in his homeland. By exposing us to the corrupting influences of absolutism, Satrapi encourages us to understand the complexity of fellow humans. We cannot manage change without loving one another; and we cannot love without knowing one another. But Satrapi’s prophecy rejects dogmatism. Truth is messy, because it’s finally made of human beings.

Wednesday, November 22, 2017

Al Franken and the Abuse of Insider Power


Show of hands: whose buddy ever drew a dick on your face while you were sleeping?

When I was nineteen, I traveled to South Dakota with a church group. Sharing a motel room with several rowdy teenagers, I fell asleep around ten PM, my usual time. I awoke fifteen minutes later to raucous laughter: my roommates had drawn a dick on my cheek with shaving cream, and I’d slapped myself in the face trying to shoo the irritant away. The only adult in the room had suggested and encouraged this behavior.

I’ve always known that you’re totally powerless when you sleep. That’s probably why children hate being ordered to bed, and certainly why, when they’re old enough to understand some of the world’s dangers, they frequently refuse to fall asleep without a trusted adult around. But that incident really solidified for me what an act of trust falling asleep in public really is. You have to believe others don’t have malicious intent.

That’s why, the longer the Al Franken controversy continues, the more it bothers me. The ongoing revelation of sexual predation in American politics, journalism, and entertainment continues growing, but it’s usually something pretty straightforward: Roy Moore targeted minors, forced himself physically upon them, and used his public authority to demand their complicity. Even his supporters understand why that’s completely awful, though they justify it.

Franken, by contrast, in his pre-senatorial days, did something more insidious. He didn’t out-and-out sexually assault journalist Leeann Tweeden; judging by the photograph that’s surfaced, he possibly never even touched her. (The alleged forced kiss happened off-camera.) But Tweeden trusted her fellow travelers on the USO plane enough to fall asleep in their presence. And Franken drew a metaphorical dick on her face.

Whether drawing literal vulgar images on somebody’s face, or hovering hands over her boobs, as Franken got photographed doing, the point remains the same: to humiliate a person for being powerless. Somebody who is awake has remarkable power over somebody who is asleep. The waker could draw vulgarities, take embarrassing photos, take sexual advantage, even stab the sleeper. And the sleeper can’t stop it.

And be honest, Franken wouldn’t have photographed the prank if he hadn’t intended to show somebody else. His purpose was to demonstrate his power over another human being, because she made the mistake of considering him trustworthy. Physical pain or psychological distress probably mattered little. This is the ultimate insider humor, the sharing of jokes at a powerless outsider’s expense. Not unlike jocks bullying nerds.

Yeah, this crap never gets old. Unless you need to sleep. So yeah, unless you're human.

All this happened after Franken postulated, in his book The Truth (With Jokes), that he could run for Senate. He’d at least contemplated a life in public service, a role that, depending on the attitude you bring, either involves subjugating yourself to the greater common good, or ruling over others. Since eschewing comedy writing, Franken has used common-good rhetoric in public. But this photo demonstrates a self-superior, ruling-class mentality.

I know, from experience, how such humiliations undermine one’s ability to trust others. Since I was nineteen, I can count on my fingers the number of people I’ve shared sleeping quarters with, who weren’t related to me by blood. It’s very difficult for me to relinquish control that way. I can’t possibly be alone: chronic sleep deprivation, and its related behavior, carb-loading, are among America’s leading causes of obesity, heart disease, Type-II diabetes, and other ailments.

By definition, America’s representative government requires citizens to relinquish control, voluntarily, to others. They’re nominally people we choose, but in today’s party-driven system, where we often choose between elephants and jackasses rather than actual human beings, the selection is often a lesser-of-two-evils choice. By seeking public power, Senator Franken asks Minnesota’s voters to entrust public control into his hands.

That photograph, sadly, demonstrates that we cannot trust him with such authority. In two different public statements, Franken disclaimed that photograph as a joke gone awry. Even if that’s true, remember what he considers funny: this person’s body needs rest, so she trusted me enough to fall asleep, haw-haw. That isn’t a funny group laugh, it’s pointed mockery at another’s expense. It’s taunting the powerless for lacking power.

That’s why I must cut Senator Franken loose. Sure, he’s not my senator; I’ve never had any influence in Franken’s career. But as a voter, I have input. And Franken cannot effectively represent anybody, constituent or citizen, when he arrogates power that way. He’s demonstrated rot at his philosophical core. When given power over others, he uses it for his own aggrandizement. That’s why Senator Franken has to go.

Monday, November 20, 2017

North America's Other Bloody Border War

Hispaniola, in a map from the Encyclopedia Britannica

1001 Books To Read Before Your Kindle Battery Dies, Part 84
Michele Wucker, Why the Cocks Fight: Dominicans, Haitians, and the Struggle for Hispaniola


Though it’s impossible to say with precision, many historians believe Christopher Columbus planted his first New World settlement in what is now Haiti. Archeologists have found remains of a rudimentary wooden village with European structures near modern Cap-Haïtien. This means that, though Native Americans had a rich, complex pre-Columbian civilization, modern written history in North America begins on the island of Hispaniola.

So it’s particularly puzzling why Haiti and the only country with which it shares a land border, the Dominican Republic, remain essentially terra incognita for English-speaking Americans. Ivy League-educated journalist Michele Wucker, who specializes in crisis points and why nobody prevents preventable explosions, turns to this issue in her first book. The answers she uncovers aren’t pretty, because they indict us first-worlders in the ongoing cataclysm that is Latin America.

Inspired by the American Revolution, Haiti became the second Western Hemisphere colony to overthrow its European masters. Though that war dragged on for thirteen years, and cost far more lives, it produced North America’s second independent nation, and the first dominated by an entirely black, formerly enslaved population. Sadly, Haiti didn’t establish freedom right away; strings of coups, royalist pretenders, and overseas occupations meant Haiti’s first free election didn’t happen until 1990.

Haiti controls approximately one-third of Hispaniola; the Dominican Republic controls two-thirds. Though the French controlled Haiti, the former colony of Santo Domingo bowed to Spain, which didn’t treat its colony so cruelly, but also didn’t extract as much wealth. Santo Domingo didn’t rebel against Spain for decades, and when it did, it petitioned the United States for statehood, and was declined. So naturally, Haiti invaded.

Michele Wucker
The Dominican Republic celebrates its Independence Day to mark, not the overthrow of Spanish colonialism, but the ouster of Haiti in 1844. From this moment, we get an ongoing struggle of economics, power, and race that makes American civil rights struggles look placid and well-behaved. Violence between the two nations, particularly by relatively wealthier Dominicans against chronically impoverished, and black, Haitians, marks the identity of Hispaniola to this day.

Wucker uses cockfighting as a metaphor to understand this battle. White Americans perceive cockfighting as a barbaric activity, one which we prosecute as inhumane and backward. The nations of Hispaniola, however, see cockfighting much like we see horseracing, as a perfect union of human trainer and animal capability. The Dominican capital, also called Santo Domingo, has a glamorous, state-of-the-art cockfighting arena, much as American cities have football stadiums.

Haiti’s border with the Dominican Republic wasn’t codified and made enforceable until the United States occupation of both nations, during and after World War I. This became important later. Haitians sneak into the Dominican Republic much like Mexicans sneak into Texas, crossing a border into a country where they’re not wanted in pursuit of work. An ironclad border made sneaking in a more definite activity, which introduced new risks and rewards on both sides.

During the Dominican Republic’s Trujillo dictatorship, from 1930 to 1961, the government cracked down on Haitian immigrants living there illegally. This crackdown started as an ordinary law-enforcement issue, but as unelected government leaders whipped up nationalist sentiment, the crackdown escalated to violence, some of it extreme. The Dominican massacre of Haitian immigrants, which partly overlapped the German Holocaust, ranks among the Twentieth Century’s bloodiest genocides. Consequences echo down Hispaniola’s history to the present.

And it’s impossible to deny the racial implications of this genocide. Upper-class Dominicans consider themselves “white,” and have nine gradations of racial identity, down to “black,” which is Haitian. (Most Haitians have unmixed African ancestry.) It’s a shock to many “white” Dominicans when they emigrate to the United States seeking work, and discover that, in America, they’re considered Black. The implications for the United States are glaring, and painful.

Throughout, Wucker keeps her emphasis mainly on the Dominican Republic, the larger, wealthier, paler-skinned neighbor in this arrangement. She strenuously avoids commenting upon larger world affairs, except where the global context contributes to Hispaniola’s conditions. (For instance, the way Haitians and Dominicans both travel to America for economic opportunity.) However, it’s tough to avoid noticing parallels with Mexicans in America, Muslims in the EU, and other perilous border crossings.

However, like the best classic literature, Wucker’s journalism isn’t really about its subject. Ultimately, it’s about the audience, challenging us to understand our position in the world arrangement. We’ll maybe never personally visit Hispaniola; we’ll never engage in bloody race-baiting or cockfighting. But we, individually and collectively, are part of the system that makes this possible. Where, Wucker asks, do we stand? And what will we finally do?

Friday, November 17, 2017

Order In the Court of Public Outrage

Senate candidate Roy Moore (R-Alabama)
I mostly applaud the recent shakeout of sexual harassers in public life. Since Hollywood’s casting-couch horror stories and Washington’s power-mad gropers have long been open secrets, it pleases me to see somebody finally being held to account. But I fear, in the last week or so, we’re seeing something I never anticipated from this scandal: the court of public opinion is massively overruling due process.

Okay, some powerful people probably deserve to have their careers torpedoed by what’s happening. With his frank confession to past sexual misbehavior, Louis CK has brought his recent ostracism on himself, as have Kevin Spacey and Harvey Weinstein with their mumbling non-denials. Frankly, Louis CK deserves credit for having balls enough to admit his past crimes, even as he deserves the ass-beating he’s now getting in the media.

But a different dynamic plays out in the cases of Roy Moore and Al Franken. These politicians both deny, to greater or lesser degrees, the accusations against them, but they’re both being treated as guilty. Calls for Moore to quit his Senate race, and for Franken to quit the Senate, have fallen out, not necessarily along partisan lines, but certainly according to group loyalties. Still, neither has confessed to, nor been convicted of, anything. Yet.

Maybe they should be. Moore has five accusers (that I know of), and they tell remarkably similar stories: targeting minors less than half his age, touching them without consent, then using his elected authority to demand their silence. Franken has one accuser, whose charges date back to when he was a comedian, and unlike Moore’s accusers, she has photographs. If he got caught on camera once, it probably happened other times, when cameras weren’t present.

Both deny the charges against them. Moore does so categorically, saying all his accusers are lying. Franken dismisses his accusations as a joke which fell flat. (I never liked him on SNL anyway; his Stuart Smalley character, in particular, was an unfunny slam on homosexuals and treatment seekers alike.) Moore has demanded proper legal charges be filed, knowing none can be at this late date. Franken has submitted to an ethics committee investigation, as if he had any choice.

Senator Al Franken (D-Minnesota)
The fallout has been downright predictable. Moore’s evangelical Christian supporters invent stupid excuses for his behavior, though he’s slipped into second place in the polls. Senate Majority Leader Mitch McConnell, and his colleagues, say they believe Moore’s accusers, though they continue mush-mouthing their responses to President Trump’s more numerous accusers. Senator Cory Gardner has suggested the Senate won't let Moore get seated, which would be unprecedented.

Congress has literally never preemptively prevented a legally elected member from being seated. It has very rarely even expelled a sitting member: aside from the seventeen members expelled when their states joined the Confederacy, expulsion has come only on very serious charges. Congress expelled William Blount in 1797 for treason, and Michael Myers in 1980 and James Traficant in 2002, both for bribery and other financial malfeasance.

Meanwhile, the breakdown among progressives on Senator Franken is stark. Democratic Party loyalists praise his “leadership” for not impugning his accuser, and for submitting to ethics inquiries. But people more loyal to principles than party began tweeting demands for his resignation within hours of the accusation. Unfortunately, this opens other terrible doors: if a single accusation compels resignation, what stops anybody from accusing anyone of anything?

In both cases, well-meaning but misguided advocates clearly haven’t thought the implications through. If Republicans prevent Moore from taking his seat, they’ll have to explain why they won’t hold Trump equally accountable; more likely, they’ll crumble as they did with Trump. And if Franken gets pressured into resigning without a hearing, it sets a precedent existing Senators won’t like (his accuser, remember, is a sometime paid Hannity commentator).

The Senate, and America, must hold both men accountable for their actions. Both should receive hearings, or face consequences if they refuse. But the Senate is a deliberative body, which exists to test ideas through open debate before foisting them onto the public. Sure, letting a seriously suspected sexual predator decide our laws has harrowing implications. But the Senate gave Bob Packwood a fair hearing, and it had his confession on tape.

These charges are serious. Our lawmakers need to hear the accusers, and hold their peers accountable. But I fear we’re starting to see the ramifications of public outrage: moral panic, where accusations matter more than evidence. From the Salem witch trials to the McMartin Preschool trial, that never ends well. In times like these, we need due process more than ever.

Monday, November 13, 2017

The London Victorian Ladies' Social and Necromantic Circle

Molly Tanzer, Creatures of Will & Temper: A Novel

Lady Henrietta “Harry” Wotton’s glamorous salon has become somewhat bereft since her brother Oliver died. Oliver’s love, and Harry’s best friend, the portraitist Basil Hallward, has retreated into seclusion with Oliver’s painting. But, rootless in his grief, Basil has permitted his nieces to reside with him, temporarily, in London. Harry finally revives when she meets the fencer Evadne Gray and her sister, Dorina. Yes, Dorina Gray.

On her blog, Molly Tanzer describes her latest as “a feminist retelling of The Picture of Dorian Gray with sword fighting and demons,” which is partly accurate. Though Tanzer uses Oscar Wilde’s only novel as her launching point, she doesn’t simply retell it, and the swashbuckling supernatural elements, though good, arrive remarkably late. She basically offers two overlapping books: a subtle historical relationship drama, and a heroic quest narrative.

Tanzer’s prologue mimics Wilde’s first chapter, note for note, though it eschews Wilde’s self-conscious authorial repartee, which is historically witty but fake-sounding to modern ears. Instead, Tanzer focuses on relationships. Wilde’s characters use language to play status games; Tanzer’s characters use language to build, and tear down, bridges. The result is familiar to period literature buffs, but cuttingly contemporary for paperback genre readers.

Evadne and Dorina Gray are country girls, though that means something different for each. Evadne is adult, recently jilted, suddenly at loose ends: she values only strength, which she measures by her fencing ability, and rectitude, which was recently shattered when the student vicar she loves became betrothed to another. Dorina, meanwhile, is seventeen, secular, hedonistic, and… well, things explode when Evadne happens upon Dorina with another girl.

Molly Tanzer
There, immediately, we see Tanzer deviating from Wilde. What Wilde had to describe in coded language, Tanzer spells out immediately—probably has to, since modern readers aren’t accustomed to parsing hints. This theme continues. Once in London, the barely-legal Dorina begins aggressively courting Lady Henrietta, who attempts to maintain her distance. The demon living in Lady Henry’s brain, though, desperately desires the beautiful, sensual Dorina.

Oscar Wilde claimed that his trinity of characters, Dorian, Lord Henry, and Basil Hallward, represented three aspects of himself. It’s tempting to assign such meaning to Tanzer’s characters. She marginalizes Hallward, but foregrounds Evadne, keeping the three-legged stool standing. A Jungian psychologist would call the sisters one another’s shadows: adult versus adolescent, pious versus hedonistic, disciplined versus sensual. And so on.

The Gray sisters’ lifelong stalemate gets interrupted by Lady Henry, who resembles both, and neither. She has wealth and dignity, but disdains gender conventions. She encourages Dorina’s interest in art and sensual pleasure, but maintains polite remove. Lady Henry attempts to cultivate both sisters’ greatest strengths, but somehow invigorates Dorina’s hedonistic side, while alienating polite, self-contained Evadne. And most importantly, she channels demons.

See, that’s where problems erupt. Tanzer dangles that demonic influence episodically, but for most of the book, it’s oblique and distant. For chapter upon chapter, we have a historical novel about two sisters attempting to reach adulthood, but failing, because they cannot accept each other. Then suddenly, around the two-thirds mark, the demonic influence becomes central. Sword battles erupt on London rooftops. The book’s entire tenor changes.

As well as Tanzer writes, I cannot escape the fact that she’s created two parallel narratives, one of which idles for hundreds of pages before exploding violently, while the other proceeds carefully and thoughtfully, then gets sidelined by the action. One starts without really finishing; the other finishes without starting. I’d pay cash money to read either of these books, or one that blends them together without a hiccup.

Je suis frustrated.

We have, essentially, two books, fully realized, beautifully written, superimposed one upon the other. Tanzer’s first two acts channel the kind of neo-Victorian writing created by A.S. Byatt and Sarah Waters. Then, in the final act, Tanzer offers urban fantasy from the mold that produced Jim Butcher and Seanan McGuire. Don’t mistake me: they’re both very good books, and I enjoyed reading both. But there’s a scar where Tanzer stitched them together.

I legitimately enjoyed this book, and recommend it for audiences like me, who read both classics and contemporary genre fiction recreationally. Tanzer views Wilde’s classic characters from a contemporary perspective, one which rewards readings based not only on sexuality, but also gender, class, and religion. This review has barely scratched the surface; one could follow Tanzer’s themes so much deeper.

Just be aware, going in, that this journeyman author takes unconventional risks. Many of them pay off. But when they don’t, they leave a visible mark.

Friday, November 10, 2017

Gunfire Nation, Part II

The following essay is a continuation of my previous essay, Let's Just Accept It: We're Gunfire Nation
The church in Sutherland Springs, Texas, that brought guns back into the mainstream discussion

Earlier this week, I wrote that we Americans need to stop fooling ourselves. Years of mawkish posturing can’t negate the fact that we’re apparently okay with some avoidable gunfire deaths, so long as it means taking no measures whatsoever to reduce access to guns among those most likely to use them recklessly. I still believe that; but I received some pushback from strangers who disagreed with my reasoning. I’d like to address three different objections, in order.

First, one respondent claimed that Americans “still have a higher chance to die by a fall.” No further clarification. Since “fall” is a vague category, and workplace falls get classed separately from “accidental falls” by stats collectors, the exact number is somewhat murky, and I can’t quite confirm or deny this claim. But a second respondent claimed “stats from 2013 give… 556,000 deaths from falls.”

I can’t find any such statistic anywhere. According to the CDC, there were 31,959 “unintentional fall deaths” in 2014, or ten per 100,000 people. Also according to the CDC, there were 33,594 firearms deaths in America in 2014, or 10.5 per 100,000. So if we believe the CDC numbers, you aren’t more likely to die of a fall than of gunfire. That’s just not true.
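For anyone who wants to check the arithmetic themselves, here’s a minimal sketch in Python. The only assumption I’ve added is the Census Bureau’s 2014 population estimate of roughly 318.9 million, which the CDC’s published per-100,000 rates imply:

# Sanity-check the CDC's deaths-per-100,000 rates from the raw counts.
# Assumption: 2014 US population of roughly 318.9 million (Census estimate).
US_POPULATION_2014 = 318_900_000

def rate_per_100k(deaths, population=US_POPULATION_2014):
    """Convert a raw annual death count into a per-100,000 rate."""
    return deaths / population * 100_000

print(f"Falls:    {rate_per_100k(31_959):.1f} per 100,000")   # ~10.0
print(f"Firearms: {rate_per_100k(33_594):.1f} per 100,000")   # ~10.5

Both rates match the CDC figures quoted above, and nothing close to 556,000 falls out of any plausible American denominator.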

Moreover, the comparison between falls and gunfire falls down on one important distinction: we do something about falls. We require stairs to be built with handrails, elevators to have Otis safety clamps, excavations to be fenced off, and even natural cliffs and crevasses to have safety rails if they’re accessible to the public. We consider fall protection a priority, even if it runs up production costs.

This OSHA graphic demonstrates that we take falls very seriously, at least on paper

In my industry, construction, falls are a constant hazard. In any situation where workers could fall more than six feet, OSHA requires some combination of written warnings, orange cones, wooden safety rails, elastic fall harnesses, and other anti-fall protections. My subspecialty at work is specifically constructing handrails and other safety protections. Though we’ll never eliminate all fall deaths, we labor to eliminate all avoidable fall deaths. We consider the effort a moral imperative.

A second respondent mused vaguely that “Gun regulations in the past have not had much of an effect and new ones proposed, such as liability insurance would have about 0% efficacy in preventing gun deaths.” There may be something to that. The 1994 Assault Weapons Ban led to fewer mass shootings with semi-automatic weapons, but other weapons categories picked up the slack. And the liability insurance proposal is, at best, untested.

However, I never suggested any form of further regulation. In my essay, I specifically mentioned intelligence-gathering, and I stand by that suggestion. We now know that the shooter in Sutherland Springs, Texas, had been convicted of beating his wife, been involuntarily committed to a mental health facility, and been dishonorably discharged from the Air Force, according to the Washington Post. All these should have disqualified him from legal weapons purchases.

Yet according to Time magazine, the shooter legally purchased a rifle and two handguns, which police recovered at the scene. Apparently the Air Force never bothered telling relevant authorities about the shooter’s past, so the shooter faced no impediments to building a small personal arsenal. This failure to share important information painfully resembles the intelligence failures we discovered following 9/11. Apparently, we’re really bad at learning.

So that respondent is right; new regulations probably won’t work. We need, instead, to use information we already have, proactively, to identify people who give warning signs. We already use domestic intelligence to track chemical fertilizer buyers, so nobody can build a Timothy McVeigh-style truck bomb. Nothing but lack of will stops us from doing the same with guns.

A David Horsey cartoon for the Los Angeles Times, dated 2013

My third respondent claimed that “The insurance proposal is simply a way to make sure poor people can't have guns.” There I must call complete bullshit. We require car owners to purchase liability insurance, even though, statistically speaking, the poorer an American is, the farther he or she lives from both work and public transit: poor people need cars. We have no problem offloading expenses onto those who can least afford them. Until guns enter the picture.

Honestly, each of these three respondents offers a grain of truth. We face many risks, some more imminent than guns. Regulations have a spotty record at best. And we shouldn’t perpetuate the idea that poor people can’t have nice things. But none of these statements really negates my point: that as a nation, we lack the will to do anything about misused guns. I believe my message still stands.