Monday, January 30, 2017

The Least Passive People In America



I just spent the most remarkable evening, among the least passive people I've seen. Deep in the wild heart of Lawrence, Kansas, I witnessed two groups of people who, not content to receive pre-packaged culture, made their own. In some ways, it ratified ideas that have percolated, unvoiced, inside my head for years. In other ways, it upended my expectations, and I may never be the same again.

In the first part of my evening, I participated in a painting session. Some thirty-two people gathered in a storefront studio with photographs of their pets, and turned those images into acrylic-on-canvas memorials to how much humans can love small animals. For only about $25 per head, the participants, mostly but not exclusively white women, received paints, prepared canvases, and guidance from experienced artists to recreate their fuzzy or feathered buddies for posterity.

Then, still flush from the creative moment, my companion and I hurried across town to a downtown watering hole where, every Sunday afternoon, nearly two dozen musicians, a group as racially and sexually diverse as their famously polyglot town, gather for a three-hour Celtic music jam. At its peak, the jam included eight fiddlers, four guitarists, two banjos, three percussionists, two flutists, and a concertina player.

The two were very different experiences. The former was scheduled, with a fixed start and end time, a buy-in cost, and designated professionals. The latter was more free-form, with skilled and unskilled players drifting in and out throughout the night, and no definite leader. Some of the musicians, like some of the painters, were appalling compared to others. And both events created venues for undiscovered and unpolished, but gifted, artists awaiting guidance toward greater accomplishments.

Yet they had more in common than their differences. Both allowed people who almost certainly did something else for a living, people who, I'm sure, don't identify themselves as "painters" or "musicians," to create something worthwhile together. Simply sharing a space and a goal gave their lives shape and definition, even if only for three hours. For that time, they weren't consumers, mere statistics in someone's supply-side graph. For that time, they were creators.

I did it myself. Acrylic on canvas.

Watching these events unfold (and, in the former case, participating), I remembered how, back in college, my friend Roger and I ran our small prairie town's local pub quiz. For nearly five years, we led what I believe to have been Nebraska's first and, for many years, only British-style pub quiz. At our peak, we filled a good-sized bar, in a moderately prosperous university town, from wall to wall, two Tuesdays per month.

Yet we never achieved my goal, to cultivate a maker culture in our town. Because Roger and I stood before the assembled crowd, delivering (admittedly good) banter into the microphone, while the crowd watched and listened, our making remained one-sided and hierarchical. On the few occasions we had balls enough to relinquish the mic to crowd members, a whole different vibe developed. Yet I lacked perspective enough to recognize what had happened.

Until now.

Roger, who is from Scotland, used to mock Celtic music, claiming he moved to the American prairie to escape what he called "deedly-deedly-dee" music. Yet hearing that music created before me, I realized why Celtic music tends toward repetition: because, unlike Anglo-American pop music, it's not meant merely to be heard. It's meant to be created and, if necessary, constantly re-created. No wonder the earliest Irish albums were just live recordings of pub-house craics.

I love attending concerts, and travel to the Kansas City area to attend big-ticket live concerts two or three times per year. But watching the Celtic music crowd, I realized I wasn't seeing a concert. In a concert, the musicians face the audience, who face the musicians, reinforcing a strict hierarchy of makers and consumers. The effect may be entertaining, but it's ultimately dehumanizing. At the Celtic event, the musicians crowded the room and faced one another, creating a truly small-d democratic experience.

Besides concerts, I've long enjoyed museums, poetry slams, and community theater. All reward a maker culture, and encourage people to participate in culture creation... to a point. Unfortunately, all three also enforce a division between artist and audience. Having witnessed two truly participatory creative experiences, I can't shake off the gap between them and what I've seen elsewhere. Sadly, I have no easy solutions. I just have a vision for a society that demolishes the wall between active creator and audience.

If you need me, I'll be studying chiaroscuro and practicing my mandolin.

Friday, January 27, 2017

One Man's War on the Homefront

Arnold L. Punaro with David Poyer, On War and Politics: The Battlefield Inside Washington's Beltway

Second Lieutenant Arnold Punaro took a bullet in a distant Vietnamese valley, and woke up with a Purple Heart pinned to his hospital pillow. Having joined the Marines fresh from college to avoid getting drafted into the Army, he always considered himself a short-timer, but now he had a permanent reminder of his tour. After snagging a graduate degree on the GI Bill, he accepted a job with Georgia Senator Sam Nunn, and his adventure continued.

With a subtitle like “The Battlefield Inside Washington’s Beltway,” I anticipated a scholarly analysis of the intersections, and disconnections, between the American political process and military action. Unfortunately, no. I might’ve enjoyed such a book, but this is Arnold Punaro’s memoir of stumbling into the halls of power, almost by accident, if his stories hold water. If you can clear away expectations generated by the somewhat misleading title, this is a fairly interesting read.

Much like his joining the Marines, which he did simply to avoid something worse, Punaro’s entry into America’s political establishment was more fluke than design. He answered a flyer for senatorial interns hanging at the University of Georgia, because the woman he loved was working in Washington. Senator Nunn, a young upstart then, needed office staff from his home state. And Punaro’s journalism degree made him valuable, back before social media set America’s political tone.

Eventually, though, this accidental job grew into an important calling. Senator Nunn, a famously convivial speaker and ambitious personality, became the ranking Democrat on the Senate Armed Services Committee, and having a Purple Heart on his staff became a priceless asset. Punaro quickly became the Senator’s right-hand man on all matters military. His life’s circumstances, which seemed like chance at the time, conspired to make him insanely powerful without his ever running for office.

Punaro, with ghostwriter David Poyer, writes in a linear, straightforward manner, like somebody telling stories over drinks. He occasionally tosses out flashbacks or peeks ahead, but his story unfolds mostly in sequence. Brief interruptions from his wife Jan, who exerted powerful but not always intentional influence over his life, have a sort of “Now honey, tell it right” quality which increases the personal texture. This isn’t a polemic; it’s an engaging story between trusted friends.

Arnold L. Punaro
Punaro’s career has caromed between categories of public service. Despite leaving the Corps, he remained a reservist, and was mobilized during Desert Storm, making him an important living link between two eras of American military history. He eventually achieved the rank of Major General and, after leaving Senator Nunn’s office, took a prominent billet at Quantico. I don’t fear revealing this, since Punaro’s service is public record; but his storytelling gives events life.

Not surprisingly, Punaro’s encounters with power include some pretty notable names of his generation. From Richard Nixon to Dan Quayle, from Scoop Jackson to Dick Cheney, Punaro name-drops encounters with men (mostly, indeed, men) whose names became synonymous with power politics and American history. Sometimes he underplays them for humor: he subtly pretends there’s nothing unusual about rubbing shoulders at Quantico with a lowly Marine captain named Oliver North. We wait for the other shoe to drop.

It may surprise nobody who’s read memoirs like this before, that Punaro’s life trajectory culminates in private enterprise. Now CEO of a public policy consultancy, Punaro has cultivated a media presence, of which this book is only the latest component. Surprisingly, his business ventures have dealt more with budget and technology than defense, a fact one has to Google, since Punaro’s final chapters compress the last twenty years into eighteen pages. Weirdly enough, self-promotion requires unexpected modesty.

Although Punaro essentially markets himself as a product throughout this book, his prose doesn’t have the huckstering feel of other books by professional consultants I’ve read recently. He shares lessons he’s learned, lessons that have made him practical and non-ideological in an increasingly bifurcated political scene; but I never get the “Gimme a buck for the white paper” vibe. He just comes across like a guy whose seemingly uncoordinated life choices paid off in understated success.

Punaro tells his story to make himself sound remarkably like the subject of a Malcolm Gladwell New Yorker profile, the fortunate brother whose combination of preparation and luck led him to fortune. Punaro’s behind-the-scenes influence spans decades, and he himself is an example of America’s possibilities and perils. Though I admit, I might’ve liked more insight into the intersection between military and public policy, Punaro’s own experience within the field is both engaging and eye-opening.

Wednesday, January 25, 2017

If You've Become Uncomfortable In Your Comfort Zone...

Andy Molinsky, Ph.D., Reach: A New Strategy to Help You Step Outside Your Comfort Zone, Rise to the Challenge, and Build Confidence

Back in the Paleolithic Era, having a fairly small comfort zone probably made good sense. Fearless people likely got eaten by lions. But today's relatively lion-free environment makes hugging our comfort zones perilous and constricting. Our reasons for avoiding uncomfortable situations are as numerous as our tactics. But the short-term stress relief robs us of long-term opportunities. Sadly, just telling ourselves to get out of our comfort zones doesn't work.

Brandeis organizational psych professor Andy Molinsky has dedicated years to studying this conundrum, and has published dense research papers on the topic. Thankfully, this book avoids the scholarly jargon of such papers. Writing with business professionals, job interviewees, and students in mind, Molinsky crafts a nuts-and-bolts explanation of why we have comfort zones, why we should step outside them, and how to achieve that seemingly impossible goal.

Permit me to paraphrase Molinsky, without giving anything away, since he says everything better, and in more detail, than I could. We have five basic mental justifications for our comfort zones: whatever makes us uncomfortable feels inauthentic or immoral, might make others perceive us as incompetent or unlikeable, or really shouldn’t be our responsibility anyway. Molinsky identifies these five causes from both academic research and in-person interviews.

Once we recognize which principle, or combination of principles, holds us back, we have three tools available to re-stack the deck in our favor. Molinsky calls these tools Conviction, Customization, and Clarity. This means we believe whatever makes us uncomfortable still really needs doing; we can organize how we handle our circumstances to minimize discomfort; and we have enough self-awareness to face our challenges without them breaking us.

It really is that simple, though like most simple explanations, it’s really not that simple. Molinsky wouldn’t have written a 250-page book if he could’ve written a fortune cookie. His book combines psychological research with field interviews, demonstrating ways people have allowed their comfort zones to constrain them, and how applying these simple principles has transformed their lives. If Molinsky’s interview transcripts are credible, “transformed” isn’t an exaggeration.

Andy Molinsky, Ph.D.
Molinsky puts his most important points in this book’s first half. Though he’s guilty of a little throat-clearing in the earliest pages, he mostly gets down to brass tacks, combining his measured principles with stories that serve as object lessons. His stories are often very personal, yet concise in structure, without excessive rumination. They demonstrate how people got out of their own way and stopped being their own worst obstruction.

This isn’t a book of straightforward exercises. I admit being disappointed by that, as I’d expected something like Kelly McGonigal’s The Willpower Instinct, which combines high-minded theories with road-tested exercises for beginners. Molinsky compares this book to a free-form recipe, where rather than fixed measurements, we have a broad outline of what the finished product could look like. We simply have to interpret freely, putting our stamp on the recipe.

Not that Molinsky prescribes no concrete actions. As a sometime writing teacher, I especially like the one that involves describing your own situation, in writing, in the third person. Crystallizing ideas into words, Molinsky demonstrates, frees people to act, besides having quantifiable physical health benefits, according to one of his sources. So we have concrete actions, just no step-by-step journaling procedures. That probably works better overall, but it requires effort on your part to get started.

Reading this book, I see myself, and people I love besides. As we get older, as we own more stuff and carry more responsibilities to family and community, our comfort zones contract. Mine has become strangling. But Molinsky demonstrates we needn’t risk everything we treasure to venture outside ourselves. By understanding why we’ve become so risk-averse, and applying Molinsky’s Triple-C approach, we’ll find doors opening immediately.

This book probably belongs on the same nonfiction-as-self-help shelf as Malcolm Gladwell and Charles Duhigg. It has the same purpose of taking known, well-studied science, and turning it into actionable advice. But unlike those other writers, who attempt to disguise their self-help as journalism, Molinsky, a working research scientist, doesn’t pretend his advice is anything but advice. Like Daniel Kahneman, he comes right out and says: Do This.

I’m still working to incorporate this book’s advice into my thought processes. Because this book really is about me, and people like me, it presents an opportunity to stop my decline into stultifying comfort. If you’re like me, and you probably are, this book is for and about you. Please grab this opportunity to turn your life into something greater. I already feel energized, knowing this exists.

Monday, January 23, 2017

Two Stories to Every Side

An ancient sculpture believed to represent Protagoras the Sophist
The claim that “there are two sides to every story” dates back over two millennia, to ancient Greece. It is generally attributed to Protagoras the Sophist, who taught legal argument in ancient Athens. We cannot confirm this, however, since little of Protagoras’ own work has survived; he’s best known for the Platonic dialog in which Socrates mocks him for that claim. Socrates asserts this claim means Protagoras thinks truth doesn’t exist, and philosophy is futile.

That isn’t what Protagoras means, however. Like all Sophists, Protagoras was chiefly interested not in knowing objective reality, but in staging arguments in court. Ancient Athenian democracy was far more litigious than today’s supposedly sue-happy society. Sophists taught citizens, who were expected to represent themselves, how to take facts which actually exist, and find the interpretive framework that most completely explains them. In other words, he taught citizens to argue from evidence, without making anything up.

This remains what we do today, in multiple forms of argument. In a murder trial, the defense may formulate claims that the defendant didn’t kill the victim, or that the prosecution’s case doesn’t hold water. If the defense claims the murder just didn’t happen, and the victim remains alive and productive, they’ve essentially surrendered all claims to seriousness. Serious argument starts with the position that reality exists. Without that, no other meaningful claims are possible.

Likewise, scientists may argue about, say, the universe’s background radiation, or Global Warming. These arguments start by acknowledging that satellites like COBE and Planck have observed the cosmic background radiation in manners consistent with mathematical predictions, or that globally, the ten hottest years on record have all happened since 1998. These facts exist. Anybody who crosses their arms, stomps their feet, and says reality didn’t happen that way, loses the right to participate.

This should be uncontroversial. We all have eyes and fingers through which we recognize the world. Anybody who studies psychology knows that the evidence of our senses isn’t always reliable. But when a robust consensus of skilled professionals, working within their areas of expertise, produces thoroughgoing evidence that something exists, that summers really are measurably warmer than they were when our parents were this age, or that the cosmic background radiation exists, we have to trust them.

Things aren’t always so simple, certainly. A person can only get murdered once. Prosecuting attorneys cannot argue that this murder is consistent with enough other murders to establish a repeatable pattern (except when linking multiple deaths to one serial killer, which is far less common than TV would imply). In that case, we use whatever evidence we have to create a persuasive story. And our opponents will use matching evidence to create their counternarrative.

White House press secretary Sean Spicer, during his first official press conference this past Saturday
If police have security camera footage of me entering the victim’s workplace one hour before co-workers found the body, I may have reasonable explanations for this. Perhaps I owed the victim money, and showed up to pay. Perhaps I simply intended to do business, and my proximity was coincidental. And I could completely render the evidence innocuous by saying, hey, I work there too. Of course I showed up; I had a shift that day.

I cannot, however, say the evidence doesn’t exist. If I say there’s no such photo after you’ve already seen it, you’ll know I’m lying, and everything I say becomes questionable. We’ve all had the schadenfreude, while watching TV news or documentaries, of seeing a person confronted with evidence contradicting everything they’ve just said, watching the mental hamster wheel turning as they try to invent, on the spot, reasons for us to distrust the evidence of our own senses.

If I say something doesn’t exist, and you have evidence demonstrating it does, that’s not another side to the story. That’s not merely a case of “everyone’s entitled to their opinions.” And it’s certainly not “alternative facts.” If you have something concrete to demonstrate that my facts don’t support the explanation I’ve constructed, you must say something. You cannot simply say it ain’t so; you must demonstrate it. Whoever brings evidence, simply put, gets believed.

I confess a certain antipathy toward “two sides to every story,” even though I believe it’s true. When people say this, they almost never mean they have persuasive counterevidence that dismantles my argument. They preponderantly muster this bromide to keep a controversy alive after all evidence has submarined it. And that’s arguing in bad faith. If we cannot first agree that reality exists, we have nothing more to say. That’s not counterevidence, it’s a lie.

Friday, January 20, 2017

America and the Amazing Two-Headed President

1001 Books To Read Before Your Kindle Battery Dies, Part 78
David Orentlicher, Two Presidents Are Better Than One: The Case for a Bipartisan Executive Branch

Let’s start with a statement Americans of most political stripes will find agreeable: our Presidency currently doesn’t work. Policy positions emerge from the Executive Branch fully formed, and Congress essentially rubber-stamps them. Having one key decision maker leads to bad decisions, as in Bush’s torture memos, or Obama’s unsanctioned drone strikes. Every President since at least Nixon, and arguably most since Andrew Jackson, has exceeded the office’s Constitutional remit.

Like Barack Obama, David Orentlicher is a Constitutional scholar and sometime elected official. He examines multiple suggestions to offset what he calls “the imperial Presidency.” From the Constitutional Convention of 1787 to the present, scholars and jurists have called for executive committees, Parliamentary organization of the executive, or triumvirate power. Orentlicher notes the promises and shortcomings of every suggested reform, and puts forth his own: America needs a two-person Presidency.

Like most scholars, Orentlicher makes his point early, giving us his core outline within the first two chapters. After that, he spends the rest of the book deepening his claims, dismantling anticipated counter-arguments, and placing his position in a historical perspective. Admittedly, this doesn’t make for gripping beach reading. However, it postulates an interesting, and possibly workable, solution to the impasse dogging American politics, an impasse likely to get worse.

By vesting executive authority in the top two vote-getters, requiring two different political parties, Orentlicher suggests we’ll offset our government’s current aversion to consensus. If legislation requires two signatures for ratification, if moving troops requires two commanders’ authority, if executive orders require agreement of two, politically unaligned, Presidents, hasty actions become untenable. Presidential self-aggrandizement, currently widespread and worsening, is minimized by separating the person from the office.

Whichever party loses the White House is reduced to breaking everything in Congress. Republicans obstructed everything Obama attempted, and at this writing, Democrats promise the same treatment to Donald Trump. This reverses the Founders’ expectations, as expressed in Federalist 70. Having experience with powerful, capricious state legislatures, Hamilton and the Founders wanted a one-person Presidency to offset a dictatorial Congress. Unfortunately, they shifted the nexus of authoritarianism without fixing it.


Other possible solutions exist, Orentlicher admits. Switzerland has a seven-member executive council. Britain’s Parliamentary system diffuses authority across the Cabinet. And many U.S. states have unbundled executive authority. However, on the federal level for a global superpower, Orentlicher demonstrates, such solutions create new problems. By keeping Presidential authority unified, but bifurcating the office itself, we retain the Presidency’s nimble nature, while dismantling the link between individual and office.

Orentlicher admits multiple serious reasons a small Executive Branch remains necessary. What psychologists now know about human decision-making tendencies recommends a small group. A large Congress makes decisions too slowly, while a one-person Presidency permits precipitousness and unconscious bias: every President since Nixon, for instance, has violated the War Powers Act. Two people can, hopefully, make decisions and reach consensus quickly, while minimizing the tendency toward haste.

I admit initially fearing a bipartite executive would basically enshrine the two major parties in Constitutional law. Orentlicher offsets that fear quickly by noting that, if America elects both Presidents simultaneously, voters might stake their conscience on a longshot third-party candidate, hoping not for a win, but a robust second-place finish. This might dismantle the current winner-take-all system that has kept two fossilizing parties in power since the Civil War.

Also, I have concerns about requiring the Presidents to be of different political parties: this would write parties into the Constitution for the first time. In his farewell address, George Washington warned against the overweening authority that parties, which he called “factions,” could accumulate. In Congress now, we see legislators clearly more loyal to party than to their constituencies. Orentlicher doesn’t resolve my doubts; this may remain an important debate point.

Prospects for enacting this policy solution are currently distant. Orentlicher admits this, noting that Americans historically like strong, unifying leaders. (Consider the market’s worship of hero CEOs.) However, if Americans can overcome their reluctance to amend the Constitution, this proposal has potential to offset the accumulation of authority into one office that we’ve seen over several decades—accumulation currently paying off by handing that authority to Donald Trump.

Orentlicher doesn’t promise his proposal will solve every American governmental problem. We’ll still face demagogues, bomb-throwers, and ideological impasses. But it potentially remedies a problem which the Founders largely failed to anticipate. It makes America’s Executive more reflective of America’s plural character. And it assures us that, whichever President gets inaugurated, we’re never yoked to one individual’s vision.

Wednesday, January 18, 2017

Blood in the Children's Ward

1001 Movies To Watch Before Your Netflix Subscription Dies, Part 16
Jordan Melamed (director), Manic


Young Lyle Jensen (Joseph Gordon-Levitt) starts this movie already bloody. We never see what happened; we only see him, bandaged in a hospital gown, awaiting diagnosis for why he abruptly attacked another teenager. The doctor pronounces him prone to violent manic episodes, and with his mother’s assistance, has him involuntarily committed to a juvenile psychiatric ward. Among his fellow underage “crazies,” Lyle will either improve, or become trapped in his illness.

This strange, low-key movie was shot on a shoestring budget with mostly novice actors. Central actor Gordon-Levitt, a former child star, appeared in this film partly to kick-start an adult career. Don Cheadle, as Dr. Monroe, had nearly two decades of acting under his belt, but was still three years from his star-making roles in Crash and Hotel Rwanda. And this was only Zooey Deschanel’s third billed on-screen role. Most actors returned to anonymity when the production wrapped.

Despite the focus on Gordon-Levitt, Cheadle, and Deschanel, this movie is mostly an ensemble drama. Peopled with young characters desperate to achieve adulthood on their own terms, and very few grown-ups, it structurally interrogates how we define maturity, to say nothing of sanity. As the story develops, some characters clearly have damaged faculties and need formal treatment. But which characters actually need help, and what constitutes help, remains forever in question.

“Behind the wall,” Lyle gets partnered with pathologically withdrawn adolescent Kenny. The doctors hope that Lyle’s hyperkinetic extroversion will draw Kenny from his paralyzing shyness, and vice versa. The two slowly develop a grudging, fraternal relationship, but not without some struggle. Meanwhile, Lyle must come to grips with his fellow inpatients, including an agoraphobic, a nymphomaniac, a pathological fantasist, and beautiful but tightly constrained Tracy (Deschanel).

Cheadle, as resident psychologist Dr. Monroe, has his own unresolved issues, comically lampshaded in one scene. He feels trapped and underappreciated, desperately trying to convince pained youths to address questions many adults would flee. His charges are diagnosed mentally ill, but we gradually learn that many have deep-seated wounds rooted in their histories, not anything inherent to their heads. What looks like violent, maladjusted behavior might be completely appropriate, given where these kids came from.


All this plays against a dangerously conflicting background: these characters must either get better and get released, or turn 18 and get thrown out. Lyle struggles to build trust with other characters, only to see them leave. A handful see legal adulthood looming, without having addressed their underlying struggles. Adulthood, for these characters, may mean release, or imprisonment in a mental hell from which they can expect no release. No wonder several turn self-destructive.

Shot on low-definition, broadcast-quality video, the standard for TV around the time of release, this movie has a documentary quality, accentuated by the sometimes jerky, handheld camera work. We feel like we’re watching Lyle’s health struggles, his psychiatric evaluations, his group therapy sessions. This isn’t always easy: though this movie is mostly low-strung and talky, it has moments of dry comedy or sudden violence, and the screen image has nausea-inducing lurches that make Oliver Stone look easygoing.

Essentially, despite its star moments and its tightly wrapped storyline, director Melamed presents a slow-moving character drama, a cerebral enterprise focused on the collision between the characters’ inner struggles and the outer world’s unforgiving demands. As Freud observed nearly a century ago, adult society requires submission to certain rules, designed to protect us from one another’s animal impulses. But damaged people struggle with these rules. If Lyle cannot acquiesce, he may leave the ward, but he’ll never become an adult.

To its credit, the movie resists easy answers. Both giving in, and standing fast, would mean changing Lyle’s, and his friends’, inner nature. The final resolution suggests Lyle has split the difference, but we wonder whether, and why, he’s really changed. One cannot undo such deep damage quickly, as even Lyle acknowledges. As he embraces the difficult journey, we watch him with deep dread, knowing he’ll re-fight the same battles many more times.

This movie never had a wide release. It played a handful of film festivals before going straight to DVD, presumably because the makers knew a slow-paced, cerebral character drama about mental illness had limited market potential. It made less box-office revenue than most big-studio films make in one theatre in one weekend. But its receipts reflect neither its critical buzz, nor its impact. This movie changes you. It makes you squirm, and in the final frames, it makes you ask why you squirmed.

Monday, January 16, 2017

The Broken Manhood Spectacle

Click to view the thread on the original website
In recent years, I’ve watched three women I know endure long, bitterly contested divorce battles. In all three cases, the court battle has turned heavily on their efforts to regain their own birth names, so they need not be legally known by their abusers’ surnames. Names have power, and when abuse survivors are still required to sign their abusers’ names on legal documents, it deepens scars that haven’t healed.

That was my thought when I wrote the post pictured at right. For those unfamiliar with Whisper, it’s a smartphone-only social media app where people post 200-character thoughts without bylines, sort of like anonymous Twitter with better graphics. It’s fun sometimes to post random thoughts and get general feedback from strangers. This particular idea, more a random musing than a thought-out manifesto, sat unread for days. Then, over the weekend, it blew up in my face.

Click the link if you want particulars, which I don’t recommend. The idea got over 600 clicks on the heart logo, meaning over 600 people liked it. But it also got nearly 250 text responses, skewing heavily negative. And by “negative,” I don’t mean people saying that’s a bad idea. Many responses are hostile and vulgar. Many impugn my sexuality. Several include threats against me, and against women who don’t adopt their husbands’ surnames.

This battery of emotionally charged, bizarrely personal responses possibly bespeaks an important cultural issue. These responses variously indicate I’m gay, I’m weak, that I’m an embarrassment. But I can’t help noticing two recurrent threads in these responses: accusations that I’m a porn addict, and that I’m sexually impotent. Both imply I’m incapable, psychologically or physically, of having normal heterosexual sex. These respondents, presumably mostly male, feel the need to impugn my sexuality and my manhood.

I’ve experienced this before. When I positively reviewed a book questioning why women are systemically marginalized in churches, I received responses indicating I was homosexual, that my receding hairline proves my moral degradation, and that I’m so weak and enfeebled that I can only convince women to have sex with me by pretending to be nice. Overthinking the responses reveals multiple contradictions, not least that gay men aren’t interested in sex with multiple women.

But in both cases, we have men who’ve never met me, mostly men who don’t post their real names, projecting fears of diminished manhood onto me for isolated opinions. They don’t know my context, like the three women I’ve watched struggle to reclaim their identities from their abusers. They simply assume, because I don’t desire to dominate and subjugate women, that my manhood is somehow diminished. The consistency of their imputations is bizarre, and noteworthy.

These men lash out at another man who doesn’t share their dominance-based model of manhood. They implicitly see masculinity as something not innate, but achieved, presumably through shows of strength. They dominate women, and when the opportunity presents itself, they dominate other men. If they cannot dominate others face-to-face, they dominate electronically. But their behavior demonstrates that dominance beats all. You aren’t a man, their words declare, until you crush somebody else beneath your boot.

One other common trait bears comment. Though the app conceals the users’ names (unless users voluntarily offer their names), other details are visible, like gender, location, and broad age range. These hostile comments overwhelmingly come from men between ages 18 and 25, the ages when men are most likely to seek partners and marriage. I cannot say how these men treat the women in their lives. But their response to a male stranger is telling.

There are legitimate arguments against my position. Several young women, responding to my statement, expressed a desire to take their husbands’ names to shake off abusive fathers. That’s a good reason. A woman may want to adopt somebody else’s name. If and when I get married (a prospect seeming increasingly unlikely at my age), my wife and I will discuss it, and make a decision then. My thoughts now aren’t binding for the rest of time.

But these responses, disproportionately, don’t involve appeals to contingency. They’re largely free of nuance. They bespeak a model of manhood that doesn’t brook women having separate identities, or men having dissenting ideas. I spoke for myself, not all men everywhere, but these men choose to threaten, insult, and belittle me. Be honest, you only attack somebody when you believe they’re dangerous. These men see independent-minded women, and men who support them, as threats to manhood.

Wednesday, January 11, 2017

We Need To Talk About Clara

The Twelfth Doctor (Peter Capaldi) and Clara Oswald (Jenna Coleman)

Steven Moffat’s run on Doctor Who has been controversial, to say the least. He started well, and had ambitions to recreate the kind of year-spanning story arcs untried since Graham Williams helmed the show in the late 1970s. But his love of long, rambling plotlines, his tendency to push stories lacking narrative tension into production, and his love of exaggerated, rococo scene design, have alienated many fans. And don’t get some fans started discussing Amy Pond.

For me, however, the culmination of my frustration came when Moffat concluded Clara Oswald’s two-story arc in 2015. Not counting her one-off appearance as a slightly different character in 2012, Clara debuted at the beginning of 2013, and left at the end of 2015. During that time, we had the first cringe-inducing moment in the episode “The Name of the Doctor,” when she said frankly, “I was born to save the Doctor.”

Two years later, in “Face the Raven,” we had the surprising sight of Clara dying to save the Doctor. Other than mayfly companions like Astrid Peth, we hadn’t seen one of the Doctor’s credited companions die on-screen since Adric in 1982, thirty-three years earlier. Longtime fans weren’t prepared to have the Doctor’s friend die violently, even in a self-sacrificing way, and were surprised when, two episodes later, even the Doctor’s work-around couldn’t save her.

So Moffat literally put a woman onscreen who was born to save the Doctor, and who died to save the Doctor. Maybe Moffat thought it was time for someone else to save the Doctor, who has saved so many others; most of the Doctor’s regenerations have been self-abnegating acts of heroism, including the Fifth, Ninth, and Tenth Doctors, who all specifically died protecting their companions’ lives. Maybe Moffat decided somebody else ought to bring salvation to this unusually messianic character.

Or maybe it was just sloppy.

Early depictions emphasized the romantic tension between Clara and the Doctor. In “Hide,” the psychic Emma Grayling specifically warns Clara not to fall in love with the Doctor, claiming there’s coldness at his heart. (This theme runs throughout the Eleventh Doctor’s run, that he’s secretly plagued with rage stemming from unchecked guilt.) The warning doesn’t take, however. By “The Time of the Doctor,” Matt Smith’s regeneration episode, Clara accidentally lets slip that she follows the Doctor because “I fancy him.”

The Eleventh Doctor (Matt Smith) and Clara Oswald (then billed as Jenna-Louise Coleman)

Even if it’s true, that’s a weak, weak reason. Rose Tyler, the Doctor’s first companion following the show’s resurrection in 2005, also professed her love and attraction for the Doctor, sure. But in “The Parting of the Ways,” the Ninth Doctor’s regeneration episode, she says she sticks with him because “He showed me a better way.” Rose’s attraction to him wasn’t only personal; the Doctor also taught her to be a better human being.

Clara shows some inclinations to self-improvement through her run. After the Doctor’s regeneration frustrates her romantic aspirations, she channels those feelings onto Danny Pink, and by learning to love and trust another human, she turns his struggles with wartime PTSD into actions that save humanity. She even learns to stand up to the Doctor, as seen in “Dark Water.” But romantic love remains her chief motivator throughout.

One never gets the feeling, watching Clara, that she’s exceeded the circumstances of her origin. She constantly exists to prod the Doctor, to keep him on the straight path and keep him moving forward when he’d rather descend into self-pity. She does so in “The Snowmen,” when it’s revealed he’s been wallowing since losing Amy Pond. Clara literally chases the Doctor onto his cloud, into his TARDIS, and encourages him to act.

Again, in “The Bells of St. John,” the first time Clara appears as Clara, her phone call rousts the Doctor from stewing in a medieval monastery. Lacking either the Fourth Doctor’s restlessness or the Tenth Doctor’s wrath, the Eleventh Doctor would rather sit around, taking up space, than do anything, unless Clara motivates him. No wonder, in “Flatline,” Clara undertakes the active role usually performed by the Doctor. Without her, he’s become moribund and immobile.

So when it transpires that she was born to save the Doctor, and dies to save the Doctor, I’m forced to wonder: why do we waste time with the Doctor? Despite his knowledge, age, and technical prowess, he’s become a passive passenger in his own series. Clara is doing things. Clara motivates the show; the Doctor is Clara’s companion now. Then Moffat kills her. And now, I have trouble watching my one-time favorite show.

Monday, January 9, 2017

The Conservative Feminist, and Other Chimeras

Melissa Ohden, You Carried Me: a Daughter's Memoir

Young Melissa Ohden always knew she was adopted. But at age 14, she learned the truth: in 1977, before the battle lines hardened, she survived a botched saline-infusion abortion. The revelation cast a formerly bubbly youth into a spiral of grief from which she recovered only after long struggles and deep prayer. As she faced adulthood, education, marriage, and motherhood, her defining question became: How do I use this knowledge?

Ohden, now a full-time speaker for groups like Feminists For Life and the Susan B. Anthony List, regards herself as a progressive on women's issues. She doesn't clarify how she squares that with opposing abortion access, which mainstream progressives consider core to feminism. Which explains why I'm of two minds about this book. She hits the required inspirational high points, but makes little attempt to communicate across the aisle.

Half memoir, half manifesto, this book guides readers through her discovery of herself, and her decision to go public with her message. She never sought celebrity, but her combination of tragic background, personal eloquence, and drive, make her an ideal spokesperson for her brand of mixed feminism. It's hard for readers not to feel strongly for Ohden and her struggles, even if, like me, we disagree with her legislative agenda.

Ohden's story really begins to move around page 70, when nearly a decade of labor bears fruit, and she receives her unexpurgated birth records. Before this, her story involves lots of throat-clearing about her loving childhood, the pain of discovering her history, and her difficult reconciliation of the two. All this feels interesting, but doesn't get treated in much detail. Basically her entire life before 2007 feels like something she includes because she thinks she must.

Two examples should suffice. First, after learning about her birth, she descends into self-abuse, including anorexia, underage drinking, and casual sex. This gets resolved in a single chapter, which gives her the Born-Again narrative Christian readers expect, then patly forgets it. Later, in college, she writes, "I learned quickly that my story was one that could not be heard, and therefore must not be told." But she never says what that means; she assumes her audience understands.

Melissa Ohden
I suspect Ohden doesn't really want to tell either of those stories, but realizes conservative Christian readers expect them. She doesn't invest in them the energy or detail she musters after page 70. That's when she begins the two-track narrative of how she meets the birth family she never knew, and how she became an in-demand speaker. Here, her story fills out with details, dialog, and scenes that would move a heart of stone.

She also shares the stories of people she meets along the way. Fellow abortion survivors; mothers (and fathers) who lived to regret their abortions; pro-life leaders who have gambled their lives fighting what they consider moral violence. Just anecdotes, sure, but anecdotes that touch the heart of today's continuing culture wars. And her birth family. A sudden death prevents her from meeting her father, but when Ohden meets her mother... wow.

Let me emphasize: I disagree with Ohden's position on the issues. Experience tells me that a leading cause of abortion is poverty, or other circumstances that make women feel they have no future; banning abortion only further circumscribes their options. Though I dislike the idea of abortion as birth control, I doubt many women would consent to such an invasive, difficult procedure if they believed they had another choice.

As a contribution to the other side, Ohden's argument is a one-plank platform. She briefly acknowledges her beliefs about women's rights and the need for progressive reform, but doesn't give any details. Like her organizations, Ohden nods toward diverse issues, but spends her actual time rehearsing standard conservative Christian talking points. How can we reverse abortion trends without first giving women something to live for?

Nevertheless, Ohden presents an emotionally complex autobiography of the pro-life position. Readers who already agree with her platform will find plenty to cheer, and maybe gain some tools for discussing this issue in the future. As a memoir, it requires pushing through the obligatory Fundamentalist building blocks, but once Ohden starts telling her own story, it becomes worth reading.

Just realize, readers who don't already hold pro-life opinions probably won't change their minds during this book. I finished reading more aware of the other side's beliefs, more conscious of where we actually disagree, less likely to believe caricatures of the pro-life position. I think we can argue better now. But deep down, I haven't changed my mind.

Friday, January 6, 2017

Make America Sad Again

1001 Movies To Watch Before Your Netflix Subscription Dies, Part 15
Kelly Makin (director), Kids in the Hall: Brain Candy

Deep within a mysterious Canadian laboratory, a research team makes a discovery: a permanent chemical cure to depression. Nobody ever need feel needlessly sad again. But the corporation holding the purse strings has no patience for due diligence or boring old science: they rush Gleemonex onto the market without proper testing. It becomes a hit, and happiness becomes epidemic. Until the side effects hit.

Like Monty Python, whose television series they shamelessly copied, the Kids in the Hall sought to transform sketch comedy gold into big-screen revenues. Their single resulting effort received lukewarm reviews, lacked studio support, and lost money. It’s exceedingly difficult to find currently. Yet it remains possibly the most accurate mass-media insight into how affective mental illness works, and why making your pain vanish might be worse than the alternative.

Dr. Chris Cooper (Kevin McDonald) and his researchers have toiled ceaselessly, pursuing the goal of curing depression. They’ve become hunched, gnome-like, in their isolation, unable to have authentic experiences. Basically they’ve become depressed. But when their drug proves effective at eliminating human sadness, Cooper gets swept into rock-star-like celebrity, the adulation of millions. Happiness becomes a commodity ordinary citizens purchase over the counter.

On one level, this film uses fast-paced slapstick to tell a funny story. The Monty Python comparison isn’t gratuitous; executive producer Lorne Michaels used old Flying Circus footage to pitch Kids in the Hall to CBC, as well as selling Saturday Night Live to NBC. This movie’s episodic story structure, arcing toward ultimate disappointment, partly mirrors The Holy Grail. KitH’s humor tends gloomier, but audiences will find comforting overlaps between the styles.

But this movie also has serious themes. It unpacks the nature of depression, the fact that people mired in bleak sadness will try anything to escape their condition. This proves a complicated, nuanced issue immune to simple exposition. Patients consuming Gleemonex in this movie often don’t want happiness; they want to leave themselves behind and become someone else. These patients aren’t necessarily depressed, not clinically; they just aren’t happy.


Historically, the British Utilitarian philosophers, led by Jeremy Bentham and John Stuart Mill, insisted all human effort sought to bring happiness. Nietzsche mocked this idea, noting people often perform actions that make themselves unhappy, and claimed people actually seek to assert power. This movie splits the difference, insisting that people actually want the pain to go away. If that means power over themselves and others, well, that’s happiness.

Until it isn’t.

Spoiler alert: Gleemonex proves to have disastrous side effects. Patients supposedly made happy risk falling into coma-like trances in which they do nothing. Literally nothing, not even feed themselves. Gleemonex cures depression by unlocking patients’ happiest memories, but some become trapped in those memories, an all-encompassing cycle where they lack either past or future. Happy people become mindless, losing the ability to move forward, literally or figuratively.

The comedy in this story comes from people failing to comprehend one another. Mrs. Hurdicure, the chief test subject, doesn’t understand her family, who don’t even pretend to understand her. She clings to moments of meaning, the ability to be together even for a few seconds, regardless of the long-term consequences. She wants to exist, and existence means doing something meaningful. She’s simply not doing anything meaningful in the present.

We see glimpses of other people’s struggles. An aging housewife, bloated from futility, gets trapped in a mental discotheque, repeating one moment where she felt connected with human beings. Wally (Scott Thompson), a frustrated suburban husband who embraces his homosexuality under Gleemonex’s influence, keeps repeating moments when he didn’t feel forced to conceal himself. But he’s neither present nor whole; he keeps retreating into these moments.

This story doesn’t necessarily discount the whole experience of depression. Dr. Cooper notes late in the film that he invented his drug “for people too depressed to get off the floor.” Dr. Stephen Ilardi, at the University of Kansas, notes this is the only group that clinically benefits from taking antidepressants. The problem arises when ordinary people abuse the drug to escape ordinary disappointment. Without dissatisfaction and occasional sadness, this movie insists, we aren’t really human.

With its deep meaning layered beneath frenetic dorm-room comedy, this movie rewards multiple levels of watching. Unfortunately, it’s very hard to find. Unlike KitH’s original TV show, which has always been widely available on VHS and DVD, this movie never found its audience, and has gone out of print. If you find a copy somewhere, grab it. Intelligent, informed viewers will find reward unpacking this movie’s complicated comedy.

Tuesday, January 3, 2017

The Rise of Name-Calling Nation

Wow, way to raise the discourse!
I’m getting heartily sick of the word “fascist.” Political pundits on both sides of the aisle love throwing that word around with reckless abandon, often in disregard of the facts. Certain incoming American presidential administrations who shall remain nameless have, indeed, demonstrated structural characteristics reminiscent of classic Italian Fascism. Yet the casual attitude people use with the term reflects not political enlightenment, but fear of marked difference.

2017 is shaping up to be the Year of Nasty Name-Calling. Shouters from the Left have made “fascist” their rallying cry, while from the Right, “snowflake” and “cuck” have become ubiquitous and annoying. Both sides have managed to cheapen their debating stock by hollering the same repeated buzzwords until they’ve lost all meaning. This doesn’t merely undermine the principles of political discourse; it weakens our ability to communicate using language.

It’s tempting to blame all manner of causes for this rising acrimony. The increased importance of partisan news sources like Fox News and MSNBC means many Americans need only receive information they know they’ll already agree with. The “other side” gets heard only in contexts where their loss is foreordained. And social media provides a ready-made platform for clickbait, including the “fake news” we’ve heard ballyhooed so much recently.

But considered coldly, that doesn’t make sense. While improving Internet technology and the proliferation of high-numbered basic cable channels have created new crevices for partisan bomb-throwers to fester in, print has always given voice to one-sided pseudo-journalism in titles like The Nation and The American Spectator. And supermarket tabloids have appealed to Americans’ hindbrain fears and knee-jerk judgmentalism for generations already. Shitty news, sadly, isn’t new.

Likewise, we could blame Donald Trump’s frequently ugly language for the normalization of name-calling. His rallies stopped short of dropping N-bombs, but he did call Mexicans “rapists,” urge the exclusion of religious groups from receiving American visas, and encourage followers to pummel protesters. But that feels hollow, given historically ugly political language. Having such posturing from the presidential rostrum may be new, but sidemen have always talked that way.

Indeed, since I began following politics around 1988, there has always been ugliness. Ads produced by future Fox News head Roger Ailes, including the notorious Willie Horton spot, and the image of Governor Mike Dukakis wearing a tank commander helmet that looked like Mickey Mouse ears, propelled George HW Bush to the Presidency without addressing issues. Let’s skip Walter Mondale, who lost in 1984 despite accurately predicting Reagan’s impending tax increases.

So ugly political language isn’t new, isn’t technology-driven, and probably has no single source. However, it’s distinctly powerful: Hillary Clinton, who had majority favorable poll ratings when she announced for President, found herself fourteen points underwater by election day. Maybe that’s because facts got slung her way during the campaign, but probably not. She had actual policies that polled well, which Trump lacked, but her opponents had "Trump That Bitch" t-shirts.

Meanwhile, the insults actually slung become meaningless. “Snowflake” originally meant “special little snowflake,” a reference to the idea that individuals’ feelings are sacrosanct from insult and disparagement. It goes back to Mister Rogers-era pedagogy telling children “everyone is special, in their own way.” Kids don’t buy that buncombe: I remember thinking, if everybody is special, nobody is special. But this philosophy survives as insults flung at anyone my age and younger.

The image many pundits partly blame for killing Michael Dukakis' presidential hopes
Look at the tweet above, using the term “snowflake.” One asshole says “calm down, snowflakes,” to policy experts who observe that poked bears bite back. This isn’t about protecting anybody’s precious feelings, it’s about not provoking a confrontation with a nuclear-armed opponent with a nationalist government. The insult has come unmoored from its origins, simply because the insulter hopes to provoke high feelings. And language progressively becomes meaningless.

Losing our linguistic anchors isn’t some academic exercise. If, through the coming administration, insults and name-calling become standard rhetoric, the effect will be cumulative. Ordinary citizens already don’t talk politics across the aisles (any mention of Hillary Clinton at my workplace results in a shouted tirade from my boss about Whitewater and the Rose Law Firm). Legislators don’t live in Washington full-time or meet for drinks after work anymore.

Political rhetoric got this ugly once before. When that classic name-caller Andrew Jackson got elected president, his all-or-nothing discourse drove his opponents to form the Whig party, cementing America’s two-party electoral system. And ugly language in Congress led to Congressman Preston Brooks beating Senator Charles Sumner with a cane on the Senate floor. Not long after, the Civil War began. Let’s agree to prevent that happening again.