Friday, December 29, 2017

Chaos Theatre and the Great American Comedy Renaissance

Sam Wasson, Improv Nation: How We Made a Great American Art

Improvisational theatre began in Viola Spolin’s workshops, which grew from her theory that humans have their most authentic, open interactions when given opportunities to play. Spolin moved to California, turned her theatre games into an actor training program, and produced several storied actors. But the real magic happened when Spolin’s son, Paul Sills, took her theatre games to the University of Chicago. There a strange maelstrom of talent created a new form of theatre.

Sam Wasson, a sometime performer himself, has written four previous books about American performing arts. Until now, he’s focused on single personalities, like Bob Fosse or Audrey Hepburn. Here, he turns his mixed artistic and journalistic background onto an artform, improv theatre, which would emerge, phoenix-like, from the moldering corpse of post-WWII theatre. American-made performance was dying, but improv breathed new life into it.

In Chicago, Paul Sills met several personalities longing for something new, something revolutionary. These included several still-famous performers, like Mike Nichols, Elaine May, and Severn Darden. Others are mostly known only to other theatre professionals, including Del Close and David Shepherd. And that revolutionary zeal wasn’t metaphorical; many early improvisors believed they could overthrow the capitalist patriarchy and rebuild society by simply being authentic.

Sadly (or not), they discovered, as revolutionaries do, that capitalism is remarkably elastic. Several offspring of Sills’ original vision, including the Compass Players and Second City, became money-making enterprises when they discovered an untapped market for genuine, unplanned laughs. Soon, performers who paid their dues doing improv became stars of the scripted circuit, like Alan Arkin and Barbara Harris. “Legitimate” theatre began adapting to improv timing.

Sam Wasson
Starched-shirt theatre historians claim improv has roots going back to Italian commedia dell’arte. Wasson blows his nose on this. Italian commedia actors worked without a script, but they played single characters, repeating stock speeches, in scenarios audiences had committed to memory. American improv was something genuinely new, and dangerous: every performance is invented on the spot. (Sometimes literally dangerous: letting actors extemporize their lines courted trouble in countries with speech laws.)

But Wasson also notices patterns developing. Improv began as anti-capitalist theatre, but became so in-demand that prices skyrocketed. The satirical edge became so beloved that public figures relished getting skewered, rather than fearing it. Improv has long struggled to maintain a legitimate edge, and whenever it risks becoming safe, it requires innovators to blast the artform from its comfy confines. It appears to need this kind of rescuing a lot.

And the rescuers often aren’t stable people themselves. Improv innovators like Del Close, Bill Murray, and Chris Farley have repeatedly breathed new life into unscripted performance, sometimes through sheer force of personality. But these personalities are also frequently self-destructive, craving new experience at any cost. The qualities that make improvisors fascinating performers often make them dangerous human beings. No wonder so many have a tendency to die young.

The reciprocal relationship between improv and “straight” performance apparently fascinates Wasson. In the 1960s, many famous improvisers became more conventional, scripted stars: Mike Nichols turned Second City alum Dustin Hoffman into a star with The Graduate. Since the 1970s, improv has funneled its best performers into TV shows like SCTV and Saturday Night Live. It’s almost like “straight” performance needs the vitality that only improv provides. And improvisers need “straight” paychecks.

Wasson doesn’t write a how-to for improv comedy. Such books already exist, in numbers too massive to sift. Instead, he writes a love song for an artform that strives to keep American theatrical performance from drifting into comfy passivity. In Wasson’s telling, improv keeps other performance forms out of safe, commercial ruts. But now, improv itself is a commercial enterprise. As so often in the past, the artform’s future is up for grabs.

Early on, describing the love-hate relationship that drove Nichols and May, Wasson writes one of the truest things I’ve ever read about performance and theatre: “Say you meet someone. You like something about them and they like something about you. Your mutual interest begets mutual play. Play begets cooperation and mutual understanding, which, trampolined by fun, becomes love. Love is the highest form of play.”

As a sometime actor myself, I appreciate this thought. We who perform spend tremendous efforts trying to help our audiences have genuine experiences. Maybe we don’t destroy ourselves like Del Close, or burn out like Elaine May, but we know the value of sacrifice. And we do it because we love our audiences, our fellow performers, our art. Improv gives performers the liberty to simply exist. And that is beauty indeed.

Wednesday, December 27, 2017

Crime in the Dark Heart of Space

Chris Brookmyre, Places in the Darkness

Ciudad de Cielo, humankind’s first permanent orbital civilian settlement, is a picture of responsible productivity. Industrious, thriving, with a robust native economy and (officially) low crime rate, it has qualities humanity’s sprawling, overpopulated Earthside cities desire. Its new security head thinks she’s taking a rubber-stamp job, until her first day, when someone takes a potshot at her. But even that pales when, almost simultaneously, CdC gets its first official murder.

Probably no genres are more innately tied to the times in which they’re written than crime and science fiction. That makes reviewing them particularly difficult when an author hasn’t kept up with current developments in the genres. British crime writer Chris Brookmyre has clearly been reading vintage Isaac Asimov and Arthur C. Clarke, because his science fiction looks around thirty-five years out of date. His buddy cop narrative isn’t much newer.

Not yet formally inducted into her position, Alice Blake goes undercover to investigate this murder alongside Nikki Fixx. Ex-LAPD, Nikki is manifestly corrupt and runs multiple shakedown operations which keep her well-paid, while keeping CdC’s official crime rate low. She’s also the city’s only security officer with experience running a murder investigation. Which makes it puzzling when she sleepwalks through the most desultory investigation since Raymond Chandler started phoning them in.

This isn’t accidental. Nikki has a “round up the usual suspects” attitude toward crimefighting, colored by the fact that she’s neck-deep with the criminals she supposedly busts. Alice, in her undercover persona, repeatedly leans on Nikki about the importance of law. Nikki, meanwhile, uses object lessons to show Alice how only someone who thinks like a criminal can successfully enforce the law. Veteran noir audiences know how this debate ends.

Chris Brookmyre
Ciudad de Cielo, CdC in official parlance, “Seedee” on the streets, is a careful balance of rational ingenuity and moral compromise. We know this because Brookmyre repeatedly stops the narrative to tell us. Brookmyre is a master explainer, pausing his story to explain how subsurface maglevs work, how the space elevator brings everything to Heinlein Station (!), and how food and other necessities get processed in deep orbit.

Unfortunately, Brookmyre’s science and technology are as pointedly dated as his noir. I keep noticing pieces I recognize from reading authors like Robert Silverberg, Orson Scott Card, and Pat Cadigan back in the 1980s. His “dirty streets” pop psychology is equally dated. He fronts multiple, long-winded explanations why, even in space, people skirt the law and indulge their Freudian impulses. “Feed the beast” looms large.

Nevertheless, I could accept Brookmyre’s dated sci-fi and his sketchy postwar ruminations on human nature, if he’d put primary emphasis on his story. When he permits events to happen, they generally happen with a reasonable degree of punch. We get glimpses into the vulgar undercarriage of space exploration, the Mos Eisley of Low-Earth Orbit. Repeatedly, I feel something starting to happen, and get excited for genuine genre-bending crunch.

Then, whoomp, Brookmyre interrupts again. If he isn’t lecturing audiences directly through his third-person-limited voice, his characters lecture one another. At one point this happens literally, when an important supporting character, a university professor, delivers an actual TED talk about the nature of consciousness, an important theme in a society where neural implants are a career necessity. This lecture pinches, almost verbatim, from Sam Harris and Daniel C. Dennett.

While Brookmyre continues frontloading such information, I spend time trying to identify what sources he plunders for influence. I’ve name-dropped several already; you may spot others. The mere fact that he sets important events on a platform named Heinlein Station bespeaks just how dated his sources are. The events on that platform, incidentally, include members of a crowd going around a room introducing themselves. Talk about low tension.

I don’t mind Brookmyre’s familiarity with genre history. I myself still enjoy vintage Asimov and Silverberg. But dated references come so close upon one another, it becomes questionable whether Brookmyre has read any sci-fi more recent than Neuromancer. Science fiction and crime fiction, both dependent on current technology and social mores, usually evolve very quickly. If your cultural references are outdated, your audience, familiar with the tropes, can tell.

Shame, really. Because Brookmyre’s reputation as a “Tartan Noir” author precedes him, I expected something more earth-shattering. I expected something that would propel both genres he blends into new, adventurous territory. Instead, this pop-culture porridge will only interest readers who aren’t particularly familiar with either genre. Veteran audiences will get distracted spotting Brookmyre’s sources. There’s little else to hold our attention.

Monday, December 25, 2017

The Economics of Frank Capra's “It's a Wonderful Life”

James Stewart and Donna Reed in It's a Wonderful Life

The sentiment and spirituality of Frank Capra’s 1946 film It’s a Wonderful Life have made it a Christmas season classic. Audiences love seeing James Stewart's transition from bleak, snowbound despair, to joyous Christmas exuberance, because most of us have endured such extremes in our lives, too. Less widely considered, though, is the economic philosophy supporting the story. And it provides a hopeful blueprint for America’s future.

George Bailey (James Stewart) has many conflicts in this movie. Urban dreams versus small-town realities; goals for the future versus obligations to the past; individual autonomy versus marriage and family. But for our purposes, the most important conflict pits the Bailey Building and Loan against Henry Potter (Lionel Barrymore), who personally controls the bank, apparently making major decisions from his oaken desk.

The difference between Bailey and Potter couldn't be starker. The Building and Loan lends applicants enough money to build their own homes, which they can own outright and build equity in. As Peruvian economist Hernando de Soto observes, in the United States the most common source of funds for launching a new business is a mortgage on the owner’s home. So a house admits its owner into the ranks of the economically mobile.

Mr. Potter, by contrast, keeps a tight hand on money. He gives out loans sparingly, lecturing young George Bailey on the moralistic importance of cultivating a strong work ethic. If we make access to credit as difficult and laborious as possible, people will, it follows, work harder and with greater diligence for everything they have. If, while they're building credit, they have to live in Potter’s dilapidated rental properties, that’s surely an ancillary issue.

Lionel Barrymore in It's a Wonderful Life

If we see this entirely in Cold War terms, it’s easy to mistake Capra's economic ideals. Someone, their name blacked out in released documents, used It’s a Wonderful Life to prove that Frank Capra was a secret Communist. If he pitches Potter, the primal libertarian capitalist, as the film’s villain, then clearly Capra must be a dirty pinko. As if the world only existed in black and white.

Except that in mocking the villainous capitalist, Capra doesn't reassign authority to the state. Communism takes power from citizens and concentrates it in the hands of the state, or of an organization that operates with state-like power. G.K. Chesterton noted, nearly a century ago, that for most people, the difference between capitalist and communist society is vanishingly small. The choice between living under state-controlled or privately managed bureaucracy is no choice whatsoever.

Given the Cold War dichotomy between The Market and The State, it’s easy to mistake It’s a Wonderful Life for Communist propaganda. Except George Bailey doesn’t arrogate power into the state; he distributes power into the people’s hands. Not that famous socialist canard, The People, but the ordinary people, who have jobs and will use house-building money to improve their situation and join the upwardly mobile ranks.

“I feel that in a small way we are doing something important,” says Pa Bailey in the movie. (James W. Rodgers’ stage adaptation, a community theatre staple in which your humble commentator recently performed, reassigns this speech to Uncle Billy.) “Satisfying a fundamental urge. It's deep in the race for a man to want his own roof and walls and fireplace, and we're helping him get those things in our shabby little office.”

In other words, owning one’s own house, knowing one has freedom to improve or trash or mark up and sell one’s nest, gives one power. Mr. Potter, in attempting to own everything in town (he’s specified as owning the bank, department stores, and bus line), wants to concentrate power over the entire distribution chain in his own hands. The Bailey Building and Loan distributes that power onto those most immediately affected by relevant decisions.

Jeff Ensz, left, as George Bailey, and your author, Kevin L Nenstiel, right, as Clarence
Odbody, in the 2017 Kearney Community theatre production of It's a Wonderful Life
This system isn’t perfect. Recent economic changes have seen populations engage in flocking behavior, making simple resources like houses harder to come by. The house Giuseppi Martini purchases in the film, an austere cottage in Flintridge, California, last sold in 2003 for $745,000; Zillow estimates it’s now worth twice that. Even George Bailey’s distributed system needs offsetting influences.

But a decentralized economy provides that. The system we have now, which concentrates wealth into fewer and fewer hands, while crippling citizens’ ability to organize countervailing influence, rewards Potter-like behavior. Claiming Potter is okay because he isn’t the state is fatuous. Concentrated wealth, public or private, almost always precipitates abuse. George Bailey simply provides citizens the ability to join the system.

Friday, December 22, 2017

A Short Handbook for Confronting Liars

Bruce Bartlett, The Truth Matters: A Citizen's Guide to Separating Facts From Lies and Stopping Fake News In Its Tracks

Reality exists. This wasn’t always a controversial thesis; people of good conscience could disagree about how to interpret facts, or how to situate facts in a larger narrative, but debates about what constituted an actual fact stayed in Platonic discussions, where they belonged. But the increasing decentralization of news-gathering has separated citizens from objective reality, and we’ve “elected” a president who openly disdains facts. How can engaged citizens regain the factual landscape?

Bruce Bartlett started as an economic advisor to the Reagan Administration, and later retired to a lucrative conservative think-tank career. But he quit movement conservatism during the George W. Bush years, when he claims his former party became unmoored from reality. The current climate of buffet-style news sourcing, coupled with polarized voters seeking confirmation of their prejudices, raises his scholarly hackles. He wants you, the voter, to care about sources as keenly as he does.

This book offers a pocket-sized primer to source checking and factual verification in news. Some of it is very basic: does more than one source agree that something happened? Does an online source use links to verify claims of fact? Other pointers get more technical. Readers need to know context for facts, rather than fact dumps, for instance. Journalists who use orphaned statistics and unsourced quotes are probably actively confusing the issues, rather than clarifying.

Journalism is a profession, with its own practices and conventions. Though chin-pullers like me have lamented the professionalization of journalism, this has nevertheless helped working reporters build practices of internal verification and fact-checking into their business. But the conventions also obscure core practices in ways we can’t always see, a fact fake journalists use, judo-style, against us. A 101-level familiarity with journalistic practices among news audiences could open our eyes to objective truth versus baloney.

Bruce Bartlett
Technology has complicated the news distribution business. As 2016 proved, inflammatory lies travel quickly, and polarize the electorate. But the same technologies that spread lies, also make citizen fact-checking more possible. Much that turned the 2016 election into a national shitshow, like Pizzagate or false quotes attributed to either candidate, could be debunked through the Internet. Though straight Google searches often return widespread lies, more specialized tools, like Google Scholar or ProQuest, provide verifiable facts.

And some of Bartlett’s pointers involve readers acting in reflective, informed manners. One major source of fake news has been ordinary people resharing specious, unsourced claims because those claims confirm their pre-existing biases, or because a clickbait title inflames passions. Bartlett asks us to ensure the “facts” we use, and distribute, aren’t wrong. But that requires us to periodically check and make sure we aren’t wrong. A good debater first assumes: “I should double-check myself, too.”

Bartlett provides here a brief introduction to research techniques used in social sciences. Though he doesn’t get into the premises of original research (contact your local graduate school about that), this volume, sized to fit most purses and blazer pockets, provides up-to-date guidance in checking secondary sources and existing fact databases. Instructors could incorporate this book into high-school and college courses in journalism, political science, history, economics, and even behavioral psychology. One could hope, anyway.

This entire enterprise, admittedly, assumes readers want to screen reports to determine facts, and their context. In other words, Bartlett’s fact-checking process requires people engaged enough to spend time researching instead of passively watching TV. This isn’t a foregone conclusion; some people simply enjoy shouting others down, relishing the schadenfreude of hurting others and making them feel stupid. I wish Bartlett had included a chapter on knowing when it’s time to cut provocateurs loose.

Not that Bartlett doesn’t have occasional partisan lapses. Early on, he describes Fox News as a centrist alternative to left-leaning mainstream news. This description would surprise Rupert Murdoch, who, in an interview given about the time this book debuted, described his baby as “conservative.” Bartlett also lumps satirists like The Daily Show and other late-nighters, who broadly lean left, into “fake news,” which is specious since satirists admit their principles. Liars and jokers aren’t interchangeable.

This only proves that even scholars and fact-checkers need a skeptical eye occasionally. Informed readers should never be passive, because we’re already part of that reality which, I already noted, exists. Reading the news should change us, but we should challenge the news. The relationship is reciprocal, and requires constant testing and refinement. That’s missing in the “post-truth era.” The techniques Bartlett provides are slow and unglamorous; they’re also real, which is what we need.



On a related theme, see also:
A Short Handbook for Confronting Dictators

Tuesday, December 19, 2017

A New Form of Hollywood Suicide

Christopher Meloni in a promo
poster for Syfy's Happy!
Syfy’s new comedy-thriller Happy! commences with a shot of somebody vomiting blood into the ice chips in a rust-stained urinal. We naturally respond with revulsion but, like rubberneckers at a traffic accident, we cannot help watching. The camera pulls back to reveal who just upchucked, and zOMG it’s Christopher Meloni! Stabler from SVU! Wait, how did clean-cut, down-at-heels Detective Stabler turn into a burned out contract killer systematically frying his liver in lieu of suicide?

When movies first became an industry in the late 19th Century, production houses (which were mainly technology laboratories, like those of Edison and Marconi) resisted the star system which dominated theatre. They wanted individual films, which were frequently under five minutes long, to exist as freestanding enterprises, without relying on personalities, which could quickly overshadow the production. It’s easy to understand this now: top-grossing actors like Tom Hanks and Scarlett Johansson become lucrative genres themselves.

Television exaggerates this trend. Film actors like Julia Roberts or Christopher Walken cultivate a marketable type, but TV actors invest in a character, sometimes for years. Then they shift to another network, another genre, or another franchise, bringing that character’s accumulated baggage with them. Actors closely associated with famous roles have difficulty shaking them; will William Shatner ever have another role that doesn’t, at least implicitly, carry shades of Captain Smirk? Of course he won’t.

Actors have traditionally had two options after leaving iconic characters. Some resist being typecast, though this usually ends in disaster. Leonard Nimoy’s I Am Not Spock phase, when he tried to recapture his heady youth in Westerns and crime dramas, came to an abrupt end when he realized he couldn’t shake the character without compromising his market viability. Others gleefully embrace their typecasting. Bill Cosby milked the Cliff Huxtable market for years before submarining himself.

Christopher Eccleston in Heroes
But recently, some actors have identified a third option. They can actively kill their personas while winking at the audience. I first noticed this with Christopher Eccleston. Though already established for playing quirky anti-establishment characters for years, he probably secured his stature in the nerd pantheon as the first actor inhabiting the title role in the reëstablished Doctor Who, in 2005. However, the series proved an awkward fit, and Eccleston left after only one year.

After spending 2006 largely in private, Eccleston reappeared in 2007 on NBC’s breakout hit-turned-dumpster fire, Heroes. This series included several iconic genre veterans, especially George Takei (Sulu) and Nichelle Nichols (Uhura) from Star Trek. But where those two genre demigods played jokey, winking versions of themselves (Takei’s license plate read NCC-1701, the Starship Enterprise’s designation), Eccleston played a drunken, washed-out ex-hero who’s given up on humanity. He played the Doctor, after the Doctor stopped caring.

Like Eccleston, Christopher Meloni’s character, Nick Sax, has become alcoholic, stopped shaving, and abandoned his ideals. Eccleston’s character sleeps in a cardboard box; Sax is implicitly homeless, literally and metaphorically adrift in Manhattan’s unscrubbed underbelly. He accepts contract assassination jobs, not because they pay well or because he has any loyalty to organized crime, but because it’s the only thing left that makes him feel alive. Nick Sax is the polar opposite of Elliot Stabler.

Nor is this comparison accidental. The show takes remarkable efforts to assure us Nick Sax used to be an exemplary cop. We see a New York Daily News cover identifying Sax as “Supercop.” His former disciple, Detective Meredith McCarthy, greets him by saying “This piece of shit used to be the best cop on the force.” A prostitute he once busted remembers him as gentlemanly, even fatherly, coaching her to get her life on track.

George Takei in Star Trek (left) and Heroes (right)

These descriptors all apply to Elliot Stabler. Detective Stabler lasted fourteen years in the sex crimes division, a hitch most cops rotate out of every two years, because his compassion for others exceeded his fears for himself. Though sometimes ruthless, with a make-do commitment to professional ethics, Stabler’s clearance rate would make most supercops look lazy. As characters heap praise on Nick Sax’s former career, we realize, Nick Sax is Elliot Stabler’s shambling, undead corpse.

Meloni takes a bold risk here. If Happy! lasts into January, he’ll effectively kill Elliot Stabler, and any chance of returning to the Golden Ticket that made him famous. But it also provides an opportunity many once-notorious TV actors never receive: an opportunity to put his star-making role behind him forever. Only time will tell whether this gamble succeeds. Until then, let’s marvel at a successful actor metaphorically stabbing his hard-earned stardom in the face.

Friday, December 15, 2017

A Short Handbook for Confronting Dictators

Timothy Snyder, On Tyranny: Twenty Lessons from the Twentieth Century

The American President came second in his race. The current British government garnered barely a third of the vote in the last general election. And Russia hasn’t had a truly free election since Boris Yeltsin. Couple that with appallingly low numbers for legitimate newspapers, record highs for passive venues like aggregator websites, and stunning disinterest in voting among the young, and you have a recipe for government by tyrants.

Timothy Snyder, a Yale historian specializing in Europe around and between the World Wars, has made a career of identifying the social forces which made Fascism and Stalinism possible. He isn’t the first to find correlations between that era and the current international scene. However, he uniquely distilled that era’s lessons into a brief handbook which audiences have enthusiastically embraced. And it’s easy to see why.

Some of Snyder’s pointers seem obvious when confronting illegitimate or autocratic governments. “Beware the one-party state.” “Contribute to good causes.” “Be wary of paramilitaries.” These all reflect real conditions in a state where the ruling party actively strives to make the opposition seem criminal, strips funding from good works, and keeps an off-the-books security force. These would be truisms, if they weren’t so frequently ignored.

Other points seem less obvious. “Make eye contact and small talk”? Don’t we have better things to do than have conversations when world history is at stake? Not necessarily. If we fight for global causes, but don’t actually know our neighbors, it becomes easy to make immoral compromises that sacrifice the lives of apparent strangers. Similar reasoning applies to “Establish a private life” and “Remember professional ethics.”

Timothy Snyder
And a few pointers seem downright counterintuitive. “Stand out”? Doesn’t the tallest weed get cut down first? Maybe. But while tyranny originates from above, deriving its power from unelected or semi-legal means, Snyder insists tyranny perpetuates itself mainly through citizens who conform, who go along to get along. Resisting tyrannical governments, protecting the institutions that make democracy possible, requires people who think freely, and act on those thoughts.

This is a recurring theme in Snyder’s analysis. Mindless conformity enabled leaders like Stalin and Hitler to consolidate their control, as citizens followed the path of least resistance so they could continue making a living. Then, when leaders used their consolidated control to annex Austria or collectivize Ukrainian farming, nobody could stand against them; all avenues of resistance had been swallowed by the desire to not make waves.

Snyder contrasts this conformist thinking to powerful non-conformists. Winston Churchill dismissed calls from both sides to make peace. Teresa Prekerowa saved families from crackdowns in the Polish ghettos. Václav Havel distributed samizdat literature that fired anti-communist resistance when centralized governments became too powerful. Non-conformists made resistance possible at times when standing up to autocrats seemed pointless and self-defeating.

Tyranny, after all, isn’t inexplicable. Snyder notes in his introduction that “Both fascism and communism were responses to globalization: to the real and perceived inequalities it created, and the apparent helplessness of the democracies in addressing them.” This similarity should chill anyone following current politics. The rise of demagogues like Donald Trump and Theresa May reflects social conditions almost exactly like those from which Twentieth Century tyrants profited in the wake of the Gilded Age.

The ultimate resolution may be similar.

Most important, the theme permeating this book involves holding to principles of truth and reciprocity. Tyrants tend to govern by force of personality, adhering to principles of self-advancement and “nature red in tooth and claw.” Maintaining the structures of civil society requires organized dissidents linked by a moral foundation and a belief that human beings, individually and collectively, are worth fighting for. Which, thankfully, we have seen on the ground.

If I have one point of disagreement with Snyder, it’s his lack of source notes. Throughout this book, he cites from history, name-drops theorists like Hannah Arendt and Victor Klemperer, and generally quotes a grab bag of luminaries who lived through or commented upon modern technocratic tyranny. The intellectual-minded among us might enjoy delving into his sources. If things get worse before they get better, which seems likely, we need prepared responses.

Throughout, Snyder makes repeated references to “the president” and to current leaders. However, he carefully avoids proper nouns when calling out personalities. He clearly refers to Donald Trump, especially when suggesting that midterm elections might get suspended (a common scare tactic among whichever party lacks power). However, it bears noting that, in 2016, both major parties floated top-of-the-ticket candidates with noted authoritarian tendencies. Tyranny should be a non-partisan issue.

Tuesday, December 12, 2017

The Liberal Arts Are More Important Now, Not Less

A famous portrait assumed
to be Christopher Marlowe
Christopher Marlowe’s play Doctor Faustus opens with an illustrative scene. His doctorate newly minted, Faustus chooses several books from his shelves, takes a seat in his study, and… picks his academic specialty. He decides on sorcery only after examining, and discarding, his era’s three legitimate academic fields: Law, Medicine, and Divinity. Remember, he does this after already achieving his doctoral degree.

I recalled this scene while reading Nina Handler’s lament, “Facing My Own Extinction,” in the Chronicle of Higher Education online. Handler, coordinator of English at Holy Names University in Oakland, California, looks both forward and backward as she makes peace with her school’s abolition of the English major. Her school, she laments, has become a preprofessional training seminar, where any class without a career payout gets dismissed as unnecessary.

It’s tempting to defend English for transcendental ideals: that people who read develop greater empathy, or that literature potentially immunizes democracies against tyranny. But these arguments mostly persuade the already persuaded. The educational reformers aggressively pushing STEM-focused curricula, subsidized by captains of industry and legislators desperate not to look backward, won’t hear such arguments, because they already disbelieve or dismiss them.

Instead, let’s contemplate the very material rewards that come from a diverse liberal arts curriculum. Two authors I’ve recently reviewed, Christian Madsbjerg and Scott Sonenshein, both write that business success and diverse education go hand in hand. Both authors observe something I’ve noticed too, though they have better source notes: specialists know how to do one thing well, but they’re incapable of adapting to changing economic, business, or professional conditions.

[Image: Christian Madsbjerg]
The simple ability to read and understand multiple genres provides one recourse against such inflexibility. By simply stepping outside ourselves, our narrow range of experiences and specialized training, we learn ways of thinking that keep our minds active and developing. If we can wrap our heads around Gilgamesh, Hamlet, Elizabeth Bennet, and Bigger Thomas, we can also handle economic downturns, changes in job-related technology, and evolving moral values.

This isn’t only about English. In History, for instance, certain facts are objectively correct, and therefore testable on a Scantron sheet. The last successful invasion of England took place in 1066. But why? What made William the Conqueror able to sack England and take the throne, while Hitler’s Operation Sea Lion, much more thoroughly planned and backed with more advanced technology, got abandoned once the Luftwaffe failed to win air superiority?

Don’t mistake this for abstruse woolgathering. America has a president who thinks Andrew Jackson could've posthumously prevented the Civil War, and an administration that thinks “the lack of an ability to compromise led to the Civil War.” These people, with the ability to guide the economy or take America to war, demonstrate a palpable lack of awareness about the weighty economic, social, historical, and military factors that shaped history, and continue shaping the present.

Perceptive readers will notice I haven’t really made a concrete argument for the English major specifically. Despite name-checking literary characters and historical events, I’ve made a broader argument for a diverse liberal arts education. You’re right. Locking youths into an academic program, and the resulting career path, at age 18, seems ridiculous to me. Like Faustus, they should choose their specialization only after gaining a diverse general foundation.

A handful of colleges and universities have followed this path. St. John’s College of Annapolis and Santa Fe comes to mind, as do Thomas Aquinas College and Reed College. These schools have softened or abolished undergraduate major departments, focusing on a diverse grounding in humanities and sciences. By contrast, Holy Names and other schools not only lock students into career tracks, they’re narrowing the number of tracks available.

[Image: Scott Sonenshein]
Business, citizenship, and just plain humanity require a diverse grounding in the humanities. This includes not only language arts and social sciences, but also mathematics and physical science. Our society suffers a plague of specialists today. It’s easy to point fingers at lawyers who only know law, or businessmen who only know business. But I’d also include journalists who only attended journalism school, and bureaucrats who have spent decades only in government.

Remember Doctor Faustus. Having chosen his discipline based upon earthly rewards, he gets the knowledge he seeks quickly. But he almost immediately declines into a shadow of himself, conjuring famous shades for kings like a carny barker, and playing horse pranks on foolish yokels. At his death, even God won’t take him back. Because knowledge isn’t supposed to garner worldly payouts; it’s supposed to create a full and rounded soul.

Wednesday, December 6, 2017

When Did Americans Become Afraid of Due Process?

[Image: John Keats]
English poet John Keats wrote about a phenomenon he called “negative capability.” Some people are such advanced thinkers, Keats supposed, that they could hold two contradictory ideas in their heads simultaneously, without reaching for a facile resolution. Simple minds need everything to equal out, but according to Keats, complete contradictions don’t faze superior minds. Which makes sense hypothetically, but in practice, we’re seeing it makes bad policy.

The recent rash of public excoriations for sexual harassment in high places casts important light on how contemporary society perceives women as subordinate to men’s desires. While the shakeout at the peaks of entertainment, journalism, and politics has garnered the most attention, the popular #MeToo uprising demonstrated it permeates all levels of society. Though powerful people probably harass disproportionately, since, as The Atlantic writes, “Power Causes Brain Damage,” this is a women’s issue, not a power issue.

As a person who prefers to side with the powerless in social issue debates, I’m glad to see powerful people getting their comeuppance. Men like Harvey Weinstein and Kevin Spacey have had make-or-break power over aspiring young artists’ careers for so long, they’ve forgotten what it means to be hungry, or what we’ll do to assuage that hunger. The mere fact of a decades-long, well-loved career, like Matt Lauer’s or Garrison Keillor’s, shouldn’t shield anyone from criticism.

(I say “men” because the accused have mostly been men. At this writing, singer-songwriter Melanie Martinez has become the highest-profile woman thus accused, though less than forty-eight hours in, it’s impossible to say how her story will shake out. And the occasional woman doesn’t counteract the statistical preponderance of men in this meltdown.)

So yes, I’m happy to see powerful abusers brought low. Except…

[Image: Matt Lauer]
I’m not the only one to notice this meltdown has largely jettisoned due process. People who have dedicated years, sometimes decades, to difficult careers are seeing them torpedoed by accusations, sometimes only one accusation. Dozens of women came forward criticizing Harvey Weinstein. Between NBC and Variety, we have four corroborated accusations against Matt Lauer. Garrison Keillor has only one accuser, and details have been frustratingly sparse.

Don’t mistake me. Most of these accused carry a palpable whiff of guilt around them. Besides Louis CK’s frank confession, the mumbling non-denials we’ve heard from Spacey and Weinstein indicate they know they did wrong, and cannot honestly deny it. But even American courts will tell you, a confession isn’t binding without corroborating evidence. Even when the accused confesses, due process must happen for a conviction.

The Weinstein Company board fired their namesake after days of deliberation. But if NBC’s official statement is credible, less than thirty-six hours passed between the first corroborated accusation and Matt Lauer’s firing statement being read aloud on-air. (That statement, incidentally, is not credible; Lauer's voracious sexual appetite was known for years.) The pace of firings appears increasingly hasty; Minnesota Public Radio still hasn’t commented, beyond vague generalities, about Garrison Keillor.

It’s difficult to avoid comparisons with McCarthyism. Though popular Hollywood personalities like Dalton Trumbo and Ring Lardner, Jr., were never convicted of or confessed to Communist sympathies, the studio system successfully spiked their careers for decades. Some never had their reputations restored. The mere fact that Trumbo was, indeed, a Communist, and Lardner showed strong leftist sympathies, doesn’t excuse the way they were railroaded.

The enduring popularity of courtroom dramas like Perry Mason might explain why we’re reluctant to permit harassers due process. These shows create the impression that trials exist to exonerate the falsely accused, since the police consistently arrest the wrong person first. It’s clear these accused aren’t the wrong people, especially as Matt Lauer, like Louis CK, has confessed and sought forgiveness. Even he believes he’s guilty.

[Image: Harvey Weinstein]
But we have trials to protect the rights of the guilty, too. The slow, deliberative process ensures we don’t respond from panic, rage, or fear. America has a history of using false, or specious, accusations to railroad the accused, often with tragic consequences. That’s why we entrust law enforcement to state agents, not private vigilantes. Especially when accusations appear quickly and vigorously, like right now, we must slow the conviction process.

It’s possible to acknowledge, like Keats, that these accused sure look guilty, but also that they deserve due process. Indeed, arguably, the guilty deserve a full hearing more than the innocent. Both the accused and the accuser deserve to be heard. And that’s not happening right now. This cuts to the heart of our belief in ourselves as a just people.