Tuesday, May 30, 2017

The Appalling Return of Honor-Based Politics

Greg Gianforte, who may be the first Congressman
sworn in while facing an assault charge
When Congressman-elect Greg Gianforte physically assaulted a journalist, in a move so brazen even Fox News “watched in disbelief,” the attack itself seemed secondary to me. When Gianforte handily won what observers had previously called a closely divided special election, I realized we’d witnessed something new in American politics. The next day, the Portland double stabbing confirmed it: we’ve entered a new period of honor-based politics.

I’ve known several people who complain that American culture today lacks honor, that we’ve become a systemically dishonorable people dwelling chin-deep in shame and disrepute. Most toss this off fleetingly; the two I know who most vigorously repeat this accusation are ex-military, veterans of a culture steeped in honor. I think I understand their meaning, as American mainstream culture has lost any sense of shame, from politics to prime-time TV.

Yet restoring honor is no catch-all solution. Honor, the sense that today’s actions cling to one’s name into the future, has its appeal in a society where even boasts of sexual assault can’t derail political campaigns. We need people to face consequences for their decisions; some people should have to spend time reclaiming the integrity of their names. But restoring a literally medieval honor code won’t solve the problem.

Malcolm Gladwell’s book Outliers dedicates an entire chapter to honor-based societies, to the elaborate culture of claims and counter-claims, of confrontation and defense, that constitutes such societies. Honor systems are bound up in elaborate rules for how to answer somebody’s challenges. Tellingly, he grounds his description of honor society in the southern Appalachians: Hatfield and McCoy territory. Honor societies share one common cultural manifestation: the family blood feud.

Honor culture, in Gladwell’s telling, rests on a network of rules, mostly unwritten, for how to uphold one’s name. One such rule: no challenge goes unanswered. If anyone impugns your name, or even questions you incidentally, you must respond immediately. Failure to answer leaves a stink of dishonor clinging to your person, one that won’t wash off until you provide some response to reclaim your name.

In the Fox News telling, Guardian reporter Ben Jacobs treated Greg Gianforte very rudely. He interrupted an interview already in progress, refused to switch off a voice recorder when directed to do so, and attempted to hijack the conversation. In a dignity-based culture, Gianforte could’ve sat back, let the Fox cameras capture Jacobs behaving dickishly, and won the debate without opening his mouth. It was his fight to lose.

You knew this guy was coming
into the story at some point, right?
But in an honor-based culture, Jacobs questioned Gianforte’s positions, which means he questioned Gianforte’s integrity. If Gianforte couldn’t respond quickly, and preferably with sufficient force to stop all future questions, he’d look weak. In honor societies, personal feuds can continue for years, dragging entire communities and families down, and end only when one participant is too thoroughly demolished to ever fight back. As you know if you’ve seen Hamilton.

Possibly emboldened by Gianforte’s victory after attacking somebody, Portland stabber Jeremy Joseph Christian went on a rampage the next day. It’s already well documented how his racist rantings on public transportation escalated into violence when onlookers intervened, creating two new heroes. But it’s largely the same motivator: somebody interrupted him, challenging his position. He needed to answer, quickly and overwhelmingly, to restore his name. That never ends well.

Gladwell’s description of honor culture attributes the Hatfield-McCoy dynamic to learned culture. The honor-bound behaviors of the southern Appalachians reflect the same attitudes found in the Scots-Irish homelands from which the region’s white residents first emigrated. Though Gladwell stops short of attributing the cause to genetics, he nevertheless hangs the motivation on the region’s Celtic heritage. These people feud, he implies, because they're a bunch of angry fighting Micks.

But I suggest something different happens. The Scots-Irish left, or more accurately were forced off, marginal land in impoverished, colonized countries, and wound up on marginal land in impoverished, hegemonized states. The same dynamic of marginal land, chronic poverty, and cultural subjugation obtains in Gianforte’s Montana today. And though Portland is hardly poor and marginal, Fox and Breitbart have convinced many whites they’re living in such conditions, despite the evidence.

Therefore I suggest we’ll probably face similar situations again. Voters in places like Billings, Montana, and eastern Oregon preponderantly supported a tiny-handed President who sees ordinary questions as personal affronts, and answers every challenge by attempting to destroy the challenger. I once thought we were facing expressions of America’s id. Now, I think we’re seeing white America defend its honor.

Friday, May 26, 2017

Where American History Goes To Die

1001 Books To Read Before Your Kindle Battery Dies, Part 82
James W. Loewen, Lies My Teacher Told Me: Everything Your American History Textbook Got Wrong

American attitudes toward history are deeply contradictory. On the one hand, we revere our past and make demigods of our Founding Fathers. On the other, we’re frequently altogether mistaken, even flat damn wrong, about what they actually did. Demagogues use this factually muddled reverence to manipulate us toward ends we don’t want and scarcely understand. How did we reach this point, and how can we combat it?

Harvard-educated historian James W. Loewen became a celebrity within education circles when Mississippi rejected a state history textbook he co-wrote, on the grounds that it focused too heavily on racial matters. He challenged this decision in court, and his 1980 victory became a landmark in First Amendment battles: states cannot reject textbooks simply because they dislike the content. This philosophy underlies much of Loewen’s later writing.

Who were the Native Americans? Was John Brown, the violent abolitionist who raided Harpers Ferry, insane? Who took the lead in the Civil Rights Movement? And what, really, happened in Vietnam? Your answers to these questions probably reflect how these topics were taught, or frequently avoided, in your high school American History class. They probably inform how you think and vote today. And, Loewen demonstrates, they may be wrong.

Loewen begins this, his most famous book, with twin anecdotes about how two figures, Helen Keller and Woodrow Wilson, get described in high school-level American History textbooks. Keller, a longtime labor activist who believed capitalism threatened American values, gets reduced to a child who triumphed over adversity. Wilson, a racist who invaded several countries on specious pretexts, gets elevated to a progressive icon whose economic policies delayed the Great Depression.

These anecdotes, which Loewen finds repeated across twelve textbooks widely used in American public education, represent the process by which controversy and debate vanish from American history. People who never study history beyond high school never realize that Keller and Wilson, plus Columbus, Lincoln, and other sanctified icons, had deep conflicts and remain controversial today. Textbook authors would rather elevate heroes and celebrate triumphs than acknowledge America’s fraught past.

James W. Loewen
Textbook history, Loewen finds, tends to present the past as a succession of heroes advancing American greatness, pushing us toward ever-better displays of virtue. Students get no sense of setbacks, struggles, and the difficulty we still face achieving America’s stated principles. Thus, many citizens believe the present somehow represents a decline from a storied past, and see today’s controversies as hopelessly cluttered and dangerous. Which they’re not.

This sometimes requires Loewen to debunk specific myths. The virtual erasure of both racial and economic factors from history textbooks leaves Americans believing the controversies over these topics are somehow recent. Even slavery gets divorced from race. Yet when Loewen reprints, for instance, a pre-Civil War campaign song, “Nigger Doodle Dandy,” used to split poor white voters from blacks, America’s long history of race- and class-based divisions becomes glaringly obvious.

Other times, Loewen sets specific myths aside, preferring to focus on the myth-making process holistically. How did the First Thanksgiving become a sort of American Genesis myth, one in which Native Americans are mere guests? How did poverty and want vanish from history texts? Why is the entire Twentieth Century often addressed in under fifty pages, as though textbooks fear to approach the recent past? How did so much get omitted?

Multiple explanations exist. Textbook authors write, not for students, but for textbook committees, which often don’t include trained historians. Education departments fear the wrath of powerful private interests, which would often rather have students loyal and patriotic than open-minded. Many high schools hire history teachers primarily to coach athletics, leaving classrooms led by people with only a cursory background in the discipline. And these are only some of Loewen’s diverse, scary explanations.

Partway through this book, Loewen says one thing I cannot support: he insists that history, alone among disciplines, is so badly taught in high school that college professors must spend entire semesters breaking students of oft-regurgitated myths. But that’s not so: Paul Lockhart says something almost identical about math, and Gerald Graff says the same about English. Sadly, much of higher education involves students unlearning ignorance propounded in public schools.

In a democracy, history matters to how citizens approach the present. Citizens who don’t understand that history is both contingent and ongoing can’t make informed decisions about their government. Failing to understand history’s themes leaves us approaching the present with either outrage or helplessness. If schools won’t educate Americans, we must educate ourselves. Loewen provides the tools to begin that dangerous process.

Wednesday, May 24, 2017

What Do You Call a Thriller That Doesn't Thrill?

Dan Chaon, Ill Will: a Novel

Nearly thirty years ago, someone murdered thirteen-year-old Dustin Tillman’s extended family, leaving Dustin and his older female cousins orphaned. Dustin’s testimony steered his adopted older brother, Rusty, to Nebraska’s Death Row. But DNA evidence has exonerated Rusty. Dustin, now a successful Cleveland therapist with kids and a critically ill wife, must grapple with his brother’s sudden reappearance in his life… as a fresh round of killings begins in his area.

Award-winning novelist Dan Chaon has good intentions with this novel. He takes premises from genre fiction and filters them through the techniques of high-minded literary fiction. But the result, like a sleeper couch, is a hybrid that performs its different functions with equal discomfort, in a way that will satisfy neither thriller readers nor literary cognoscenti. By populating his thoroughly ordinary story with supremely unlikable characters, he leaves audiences nowhere to hang their hats.

First, Chaon’s nonlinear storytelling confounds where it should clarify, and vice versa. Chaon strings events together in an unsequenced montage, like a hip-hop filmmaker improvising at the editing table, so events coalesce from context and inference rather than organically. Joseph Conrad did this in Nostromo, where the confusion of secondhand information was partly his point. When we have access to viewpoint characters’ thoughts, as we do here, it just looks sloppy.

Moreover, as narration tapdances without chronological coherence, experienced thriller readers will start watching for whatever the viewpoint character leaves out. We understand how unreliable narrators work. We read this shit every day. Within thirty pages, it becomes painfully clear which character has omitted which important information from the recounting. This isn’t a mystery, where our protagonist must coax reality from conflicting evidence. The characters are just lying to the audience.

Chaon’s characters, besides being willfully dishonest, are also unpleasant. Not unpleasant in the ordinary way of Sam Spade, whose impromptu ethics define his story. Dustin Tillman, who has buried childhood trauma in marriage and career, handles his wife’s early death by descending into alcoholism and parental negligence. His son Aaron becomes a quasi-goth junkie with homoerotic tendencies, presumably because “gritty realism” sells. Dustin’s cousins use promiscuity to plug the vacancies in their souls.

Dan Chaon
The characters come across, not as hard-boiled, but as merely dickish. Everybody’s morally vacuous, but not for any story-based reason. Indeed, I’m not entirely sure even Chaon understands why his characters do anything. Dustin, a therapist, revisits his childhood with Rusty (Rusty & Dustin, geddit? Jazz Hands!) in terms transcribed almost verbatim from the DSM-5. Chaon cursorily plugs proper nouns into the description and apparently considers his authorial responsibilities thus covered.

But Dustin is a deliberately unreliable narrator. What about his son Aaron, the junkie? His described descent into addiction, debauchery, and crime feels memorized from ONDCP pamphlets and Tarantino movies. I don’t believe these events for one damn moment. Dustin’s cousins behave wantonly because what self-respecting attractive 17-year-old doesn’t? Flashes of homosexuality, incest, and domestic abuse evidently happen because literary fiction authors have little else left to elicit emotional responses.

Parallel to all this, Dustin befriends a former patient, an ex-cop who deals with being benched by diving into tinfoil-hat-wearing conspiracy theories. Aqil Ozorowski, who sounds like an error at the Scrabble factory, has identified a pattern of college-aged men disappearing at regular intervals and turning up later, drowned. He claims law enforcement is ignoring the truth. His wild surmises seem harmlessly annoying, until his pattern strikes Dustin’s family directly.

I feel so cynical describing Chaon’s work this way. His well-crafted narration, which sometimes reads like Rimbaud’s epic prose poem A Season in Hell, deserves some mention. At the sentence level, Chaon writes well. But contra the advice sometimes dispensed by undergraduate writing instructors, writing is more than constructing good sentences. He’s chosen a genre with a dedicated, experienced audience, and apparently doesn’t realize his readers recognize the boilerplates.

Clearly Chaon wants to combine genre fiction’s gut-level sensory immediacy with literary fiction’s thoughtful investigations of character motivation. But he doesn’t realize his thriller aspects are recycled, or that his characters treat the reader with contempt. I cannot help comparing Chaon’s story with Belinda Bauer’s Blacklands, which accomplishes what Chaon apparently cannot. Where Bauer explores her characters, Chaon acts like an exhibitionist. Bauer is morally ambiguous; Chaon is just unpleasant.

Somewhere around the one-third mark, I lost all motivation to keep reading. I realized I didn’t care if these characters all died in a fire. I just couldn’t bring myself to persevere. That, fellow reader, may say everything you need to know about this joyless cinder block of a book.

Monday, May 22, 2017

The Struggles of 2017 (As Seen From 1968)

Alain Badiou, The True Life

Western traditions and moral foundations are withering, says Alain Badiou. Religion and politics are vestiges of an older time, while capitalism reduces us alternately to children and instruments. In this series of talks, originally directed at adolescents, Badiou questions where youth culture could head in an era when we distrust the past and cannot count upon the future. Answers aren’t much forthcoming, but in philosophy, sometimes the questions matter more.

As a sometime academic and recent convert to contemporary French philosophers, I had high expectations for this book. But even I found Badiou’s prose dense, his reasoning tangential, and his conclusions unsupported by evidence. He presents an opaque philosophy, putatively for teenagers and young adults, that even grey-haired scholars may find confusing and impractical. And it verges, at times, on messianism. I can’t imagine whom Badiou is actually writing for.

Much of Badiou’s philosophy comes straight from his foundations in the Paris of 1968. He is both agnostic (he says atheist, but fudges) and an unreconstructed Leninist. He draws on an ecumenical selection of sources: Plato and Lacan, Rimbaud and Marx. But he doesn’t feel merely beholden to his influences; he goes beyond them, comments on their thoughts, and attempts to weave his Situationist-era roots into the smartphone age.

The result is, shall we say, chaotic. Badiou caroms from the necrotizing consequences of late capitalism; through the imposed roles of young and old, who he believes should ally in rebellion against the middle-aged system; through the importance, and absence, of unifying adulthood rites in a post-religious society; to gender roles and, honestly, I’ve forgotten what all else. His underlying thesis is, apparently, that modernity is confusing. Anyone could’ve written that.

Not that I’d call Badiou wrong. He says plenty I find appealing. For instance, he writes that a secularized society without clear adulthood rites traps citizens in perpetual adolescence. “The adult,” he writes, in one of my favorite quotes, “becomes someone who’s a little better able than the young person to afford to buy big toys.” Capitalism, in Badiou’s analysis, turns functioning grown-ups into vehicles of juvenile appetite.

Alain Badiou and friend
He flinches on this later. Not people generally, but boys specifically, occupy a permanent teenaged wilderness. Capitalism stunts boys well into senescence, but turns girls into women from the cradle. So, tacitly, he accepts males as “normal” and females as “exceptional.” This becomes most apparent when he says that if you look at a woman, “really look at her,” atheism is proved. He doesn’t say how. I know female pastors who’d disagree.

So, okay, Badiou makes weird statements and assumes his readers’ preferential agreement. That doesn’t make him wrong. Indeed, he’s a veritable assembly line of meaningful quotes about modernity’s essential vacuity. “The career is the hole-plugger of meaninglessness,” he says of how men’s adulthood is purely instrumental to capitalism. Or of women’s roles: “There are some women who are laboring oxen and some who are Persian cats.”

These statements make perfect sense to anybody who’s witnessed how society values men according to their remunerative value, and how it forces women into pre-written scripts that, feminism notwithstanding, have changed little. Readers disappointed by modern capitalism, like this ex-libertarian, may find themselves pumping their fists in exultation to see a scholar learnedly attesting what we’ve already thought, in terms concise enough for a t-shirt.

Yet reading his reasoning, I keep thinking: your conclusion doesn’t follow from your evidence. In one key moment, Badiou defends lengthy arguments by citing Sigmund Freud’s Totem and Taboo, an attempted psychoanalytic explanation of rudimentary religion, which I couldn’t finish because it requires more leaps of faith than the Bible. Freud’s corpus is widely regarded as pseudoscience now anyway, so citing him doesn’t strengthen the claims.

That’s just an example, but it’s representative of Badiou’s reasoning process. One suspects he starts with certain premises: perhaps that the financial collapse of 2008 and the rise of reactionary nationalism in industrialized nations go hand-in-hand, a premise so bipartisan that Bernie Sanders and Marine Le Pen could probably agree on it. Then he ransacks his personal papers, unchanged since 1968, to craft a justifying explanation.

Basically, I expected better from someone of Badiou’s standing. I want to say: take what you need and leave the rest. But a right conclusion from wrong reasoning is still wrong. Badiou crafts just enough useful slogans that I suspect he understands the core of our common situation. Then he lards it with weird source citations and intellectual cow paths. I just can’t figure where he’s coming from.

Thursday, May 18, 2017

Why I Still Don't Want Genetically Modified Food

Much modern farming less resembles gardening than strip-mining

A friend recently shared another of those articles “proving”—to the extent that science can prove anything—that genetically modified foods are perfectly safe. Perhaps they are; I don’t know. However, the article included multiple references to “conventional” agriculture, insisting that GMO foods are perfectly equivalent to foods produced through selective breeding, which we’ve practiced for millennia, and here I definitely know something. Conventional agriculture, as currently practiced, is deeply dangerous.

That sounds controversial. Americans today enjoy the cheapest food in world history, quite literally: on your typical grocery run, you probably pay more for the packaging than for the food inside it. Massive technological investments constantly improve agriculture, raising yields and ensuring continued, affordable supply for whoever can afford it. Selective breeding has produced more fruits, vegetables, meat, dairy, and grain than ever before. Am I calling this improvement dangerous?

That’s exactly what I’m saying, and I’ll offer examples. According to a recent Atlantic article, a single bull who lived in the 1960s produced so many offspring that fourteen percent of all Holstein cattle DNA descends from this one specimen. Anyone who lives in cattle country knows prize bull semen fetches premium prices at auction. This bull’s DNA quadruples per-cow milk production, but it also increases the likelihood of spontaneous abortion in utero. Hardly an unqualified success.

Equally important, though, and something the article scarcely touches on: fourteen percent of Holstein DNA is now genetically homogeneous. This resembles the degree of crop homogeneity that preceded the Irish Potato Famine. The rise of genetically similar cultivars, some GMO, some developed through conventional selective breeding, has produced remarkable vulnerability to crop blight, resisted only through petroleum-based chemical pesticides and intrusive technological interventions.

Pigs don't live in pens anymore; this is where your pork comes from

One episode of the Showtime TV adaptation of Ira Glass’s This American Life features a visit to a contemporary Iowa hog farming operation. The selectively bred hogs raised there produce more piglets per litter, and therefore more meat overall, a seemingly desirable outcome. But the resulting pigs so completely lack native immune systems that they cannot survive outdoors. They’re raised, at massive expense, in clean-room environments more restrictive than those used in silicon microchip manufacture.

So we have livestock so homogeneous that they’re vulnerable to blight, so tender of constitution that they cannot handle the outdoors, and so expensive to raise that any output gains are offset by the extraordinary measures necessary to keep them alive. So agriculturalists are backing off these approaches, as reasonable people anywhere would, right? Of course not. A combination of government incentives and corporate marketing encourages ever-increasing output, even during times of unrestrained surplus.

Recombinant bovine growth hormone (rBGH), marketed heavily by Monsanto and Eli Lilly, promises to increase milk output. This despite known health effects, including distended udders and pus in the milk, and suspected side effects—rBGH is a possible, but frustratingly unconfirmable, human carcinogen. And this despite the fact that the U.S. government has purchased excess American dairy stocks and dumped them on the ground to prevent prices going into freefall. It has done this since the 1930s.

I use livestock as examples because images of suffering living creatures tug our heartstrings. But this pattern obtains across all farming: fear of shortfall justifies constant excess. According to agriculture journalist George Pyle, America grows twenty times as much corn as Americans could possibly eat. So most of the oversupply gets fed to cattle, making meat insanely cheap. But cattle cannot digest corn starches, turning their shit acidic, a perfect environment for toxic E. coli strains.

That’s saying nothing of the economic impact. When NAFTA became law in the 1990s, some Americans worried that manufacturing jobs would emigrate to Mexico, which somewhat happened. But when subsidized American agriculture hit Mexican markets below the cost of growing, rural poverty, especially in the agrarian south, hit record levels. Mexico’s poor sought work where work existed: in the U.S. And Americans elected a demagogue promising to build a wall keeping those impoverished workers out.

Old McDonald had an assembly line, E-I-E-I-O
Corporations sell GMO seedstock by promising increased yields. Conventional farming already produces enough food to feed 150% of the current world population, an output driven mainly by petroleum-burning equipment and by fertilizers and pesticides derived from petroleum. (The Rodale Institute estimates that farms currently produce more greenhouse gases than cars.) When food is already so oversupplied that it’s cheaper than the packages it’s sold in, increasing yields makes no sense.

Yet, as George Pyle notes, American farm policy has assumed that an imminent food shortfall justifies continual increases ever since America devised its first farm policy, during the Lincoln Administration. One friend justifies continuing this approach because, he believes, near-future environmental collapse will require genetically modified foods to save the human race. Two problems: we cannot predict environmental outcomes any better than we could predict post-nuclear-war conditions. And, Pyle writes, heirloom varietals are more adaptable anyway.

Starvation exists today, and chronic hunger exists close to home. But increasing supplies, whether through conventional or GMO means, makes little difference. People lack access to food, which usually means money. MLK noted, back in the 1950s, that fresh vegetables cost twice as much in poor neighborhoods as in rich neighborhoods. High-yield GMO seeds, often pitched to cure global famine, are expensive. People too poor to buy and plant heirloom varieties cannot trade up.

So basically, the demonstrable safety of individual GMO varietals doesn’t much matter. (Rampton and Stauber question that science anyway.) If they’re similar to selective breeding, well, breeding hasn’t been benign either. And they’re customized for an economic demand that doesn’t actually exist outside corporate PR. The drumbeat of safety, quantity, and productivity has made these demands common coin, but that just misses the point. Agriculture is hurting itself just fine right now, without gene technology’s help.

Monday, May 15, 2017

Deus Est Machina

1001 Movies To Watch Before Your Netflix Subscription Dies, Part 19
Darren Aronofsky (writer/director), π (pi)

Reclusive genius Max Cohen (Sean Gullette) has built a supercomputer in his Manhattan apartment. He hopes to compute market movements, pick stocks with machine-like accuracy, and become rich without leaving home. But his computer, nicknamed Euclid, vomits a 216-digit number and dies. Thinking he’s failed, Max discards the printout; nefarious parties retrieve it, and Max finds himself caught in a battle over the forces guiding modern life.

This first feature from Darren Aronofsky (Requiem for a Dream, Black Swan), shot on a shoestring budget, works around its physical limitations with risky camera techniques, grim understated performances, and subtle writing. Shot on black-and-white reversal film, often from unusual angles, and cut with frenetic haste, it looks as though we’re watching Max’s struggle unfold through surveillance cameras. Before long, we realize this isn’t an accidental technique.

A mathematical genius, Max impresses local children by solving complex calculations faster than their pocket calculators. But he has few adult relationships. He wants reality to share math’s simple Platonic elegance, and often preplans his conversations using theory-and-experiment methods. Only his invalid mentor, Sol (Mark Margolis), shares Max’s passion for precision; they communicate mainly by playing Go, an ancient Chinese strategy game built on strict mathematical principles.

While drinking his morning coffee, Max gets accosted by Lenny Meyer (Ben Shenkman), a gregarious Hasid who introduces Max to Gematria, a form of Orthodox Jewish numerology. Curiosity overcomes Max’s usual reticence, and he lets Lenny explain the intricacies of his Biblical code-breaking. He isn’t entranced enough, though, to accept Lenny’s invitation to participate in ongoing research sessions. Especially when Lenny says they’re seeking a 216-digit number.

Almost immediately, Max meets Marcy, an agent of Wall Street speculators who somehow know about Max’s experiments with Euclid. They think his equations could help predict market movements, benefitting whoever controls the supercomputer. They offer Max a powerful circuit chip in exchange for access to Euclid; realizing this chip could complete his experiment (and possibly unaware of how finance works), Max accepts, permitting the agents full access to his creation.

Here we see Aronofsky’s themes expressed: mathematical constancy proves reality exists, but little more. Lenny’s Jewish colleagues believe reality demonstrates God’s beneficent existence, while Marcy places her faith in market forces. Two conflicting interpretations of an imperfectly glimpsed truth each demand validation, and the conflict spirals toward violence. Meanwhile Max grasps vainly for truth unvarnished by human interpretation, yet cannot manage basic relationships with adults as equals.

Max Cohen (Sean Gullette) seated at his supercomputer, Euclid, in Darren Aronofsky's π

Sean Gullette plays Max with dark, soft-spoken urgency. He narrates his own situation aloud, as though he can only understand reality when filtered through the dispassionate lens of language. This doesn’t work out well. Gullette himself apparently wrote many of Max’s narrations, playing up Max’s difficulty understanding sensory reality. Though Max believes objective reality exists, he also has delusions about surveillance and entrapment. At least one character exists only in his head.

Sol quietly encourages Max’s quixotic pursuit of undifferentiated reality. The movie implies, without stating, that Sol is a Holocaust survivor, jaded on all ideologies, but also unable to reconcile his belief in objectivity with his imminent death. When Euclid begins repeatedly producing the same elaborate code, Sol cross-examines Max. It emerges that Sol produced the same 216-digit sequence years earlier, and his health has declined rapidly ever since.

Throughout, images of mathematical precepts appear, sometimes more directly than others. Besides his stock-picking supercomputer, Max is fascinated by the Golden Spiral, a geometric paradigm that often serves to pique students’ interest in higher math. Number theory looms large in his calculations, but as those calculations become more elaborate, chaos theory overcomes his thinking. Confronted with the dueling theisms of the Hasidim and the capitalists, Max becomes more doggedly agnostic.
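A quick aside for the mathematically curious, as my gloss rather than anything the film spells out: the Golden Spiral grows out of the golden ratio, the positive solution of x² = x + 1:

φ = (1 + √5) / 2 ≈ 1.618

The spiral itself is the logarithmic spiral that widens by a factor of φ with every quarter turn, r(θ) = a·φ^(2θ/π), and the ratios of consecutive Fibonacci numbers (1, 1, 2, 3, 5, 8, 13…) converge to φ. That’s the thread connecting Max’s number theory to the spirals he keeps finding in nature and in the markets.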

This movie also marks writer-director Aronofsky’s first collaboration with composer Clint Mansell. The atmospheric soundscape creates a psychological resonance with Max’s increasingly strident paranoia. Driven primarily by synthesizers and small-ensemble sound, Mansell’s score can career from bucolic afternoons in the park to a texture like an electric drill on your teeth with amazing speed, while never sounding out of place. It meshes so smoothly that audiences often won’t notice the score at all.

Aronofsky’s works have frequently toyed with the incompatible forces driving modern life: the tension between art and commerce in The Wrestler, between beauty and mental illness in Black Swan. Here, he insists humans need faith in something, anything, but must also realistically confront reality’s chaotic, seemingly meaningless veneer. He offers no solutions, and his resolution admits multiple interpretations. But his approach shakes viewers from their preconceived notions.

Thursday, May 11, 2017

The Ancient Art of Writing Well

1001 Books To Read Before Your Kindle Battery Dies, Part 81:
Aristotle, On Poetry and Style

Quality of language mattered to Aristotle. In classical Athens, all language was public: poetry, which mostly meant hymns and staged tragedies, was always performed for the assembled citizenry, and written prose was meant to be read aloud, particularly in courts. Therefore Aristotle, like his fellow Athenians, placed great stock in good-quality language. He was only one among many to write style manuals. Few others have had his durability, though.

This volume combines two Aristotelian classics. First comes the full text of Poetics, his consideration of high-minded public verse, which for Aristotle mostly means tragedies. University professors often emphasize Aristotle’s response to Sophocles’ Oedipus Rex, as though this book specifically criticized one play. Actually, he quotes liberally from multiple plays, sometimes providing the only remaining evidence of once-important dramas. Aristotle was clearly familiar with his genre.

Following that, this book includes twelve chapters from Aristotle’s Rhetoric, his chapters on creating a moving prose style. He strenuously emphasizes distinctions between poetic and prose forms, sometimes referring audiences back to the Poetics (including occasional now-lost chapters) to stress the importance of choosing your form. And what forms he describes. Aristotle gives a spirited introduction to the creation of warm, dynamic prose that stirs the audience, mind and soul.

Taken together, these two books represent a widely circulated primer on connecting with one’s audience for maximum impact. Concepts that remain widespread and influential in college writing courses, like the poetic foot or arguments from ethos, pathos, and logos, receive their first surviving descriptions from Aristotle. Though often dry and prolix himself, Aristotle concisely describes the decisions writers across ages have needed to make in composing their words.

As already noted, for Aristotle, all writing is public writing. The poems he analyzes are mostly plays, performed in Athens’ annual festival of Dionysus, before the assembled citizenry. His prose mostly means speeches, delivered first in the Agora, then copied for distribution afterward. These distributed copies were mostly read aloud in salons; the idea of reading alone, silently, arose largely after Gutenberg. Language, for Aristotle, happens aloud.

Plato and Aristotle, depicted by Raphael
To demonstrate these concepts, Aristotle cites extensively from poets and politicians, many still then living. He often alternates praise and criticism: he dislikes the tragedian Euripides overall, yet finds generosity enough to extol his ear for natural dialog. He credits Herodotus for realizing the value of writing history in prose, while admitting Herodotus often gets high-flown and needlessly poetic. He concedes that even the great Homer could’ve been improved by brevity.

It’s somewhat unclear how widely these works were distributed in Antiquity. Aristotle’s precepts were widely known and cited, though not always under his name; these ideas were perhaps common coin, and only familiar to us in this form because Aristotle transcribed them. His Poetics particularly seems to have been lost for some generations. And since he cites passages we no longer have, some scholars believe a second volume was lost.

This edition weighs in right at 100 pages, plus front and back matter, slim enough for a purse or jacket pocket. Yet it broaches enough topics to keep students and educators engaged for months. Translator George Grube provides liberal annotations to help readers unfamiliar with the Greek context decipher some of Aristotle’s more obscure passages. This part-time classicist appreciates a skilled guide holding his hand.

Some of Aristotle’s descriptions apply specifically to Greek-language writing. Entire chapters meander on topics like verb tenses and infixes, which don’t translate well; Grube, on multiple occasions, advises readers they can benignly skip these pages. However, a remarkable amount of Aristotle’s direction, on topics like choosing your audience, constructing active metaphors, and creating rhythmic language, remains current and practical. Many Aristotelian precepts could use more airing in our frequently ineloquent age.

Aristotle’s style often requires great endurance. Unlike Plato, whose entire body of work historians believe has survived, Aristotle left us nothing written in his own voice. Scholars conjecture that we basically have his lecture notes, like reading the handwritten sketches from which university professors extemporize. If Aristotle sometimes seems dry and vague, it’s because, like the style he advocated, he used language publicly, and we lack his voice.

Still, Grube, one of his generation’s most accomplished classicists, annotates Aristotle sufficiently that, where possible, we have the information the Great Man’s notes elided. Some passages remain obscure, details lost to history, but compared with the unannotated Poetics I read in graduate school, this is remarkably lucid. Literary criticism basically begins with these classics. And with them, we begin understanding how literary style works.

Tuesday, May 9, 2017

The High Cost of Free Speech

The Gadsden flag, designed in 1775. Exactly when it became
a conservative nationalist symbol is somewhat murky.

People who use free speech arguments to defend bigoted speech are more likely to actually be bigots. That’s the message of a new research study by Mark H. White and Christian Crandall, of the University of Kansas psychology department. As summarized in a KU press release, White and Crandall found a strong positive correlation, stronger than they expected, between using the “free speech argument” to defend racist speech and actually holding racist beliefs.

I read this news release the same weekend two pickup trucks made a theatrical show of crisscrossing my town flying some pretty distinctive flags. Each truck had one flag over each rear wheel well: an American flag, a Confederate battle flag, a Gadsden “Don’t Tread On Me” flag, and a Trump-Pence campaign flag. These trucks rode abreast down my town’s main roads, snarling traffic behind their display of… well, let’s postpone assigning intent for now.

We’ve all heard the “free speech argument” recently defending people saying truly awful things. Milo Yiannopoulos urged students to target peers, by name, and demanded free speech protection for what amounts to inciting violence. (He eventually discovered even his intellectual fellow-travelers have limits on what they consider “free” speech.) Donald Trump rode a wave of Freudian mental meanders, protected by free speech, into the Oval Office. Clearly the free speech argument isn’t hurting anybody lately.

Free speech is absolutely fundamental to American identity. But like other moral absolutes, people fall back on free speech to defend language that’s otherwise morally indefensible. Rather than a starting place, foundations become a hammock. Political bomb-throwers adopt American founding language because they lack language of their own. Do they simply lack their own moral reasoning? Or do they realize their own opinions lack foundation? Either way, it demonstrates a core unawareness of what constitutes free speech.

Perhaps the most common mistake I hear is that free speech requires everybody to provide a platform. When universities pull invitations for agents provocateurs to speak, or TV networks yank freelancers for stating odious opinions, the free speech argument comes trotting out. But when private enterprises fire somebody for expressing hateful views, this doesn’t violate free speech; the First Amendment says governments can’t foreclose citizens’ liberty to say something. Private enterprises may defend their platforms.

Country singer Hank Williams, Jr., sold this on his merch table in the 1980s
and early 1990s. In case anybody thinks the Civil War is actually over.

I cannot have employees spouting hateful language about immigrants and brown people if I own a factory staffed primarily by immigrants and brown people. And I can’t alienate America’s fastest-growing customer cohort to appease one person expressing opinions from a prior era. Private enterprises cannot stomach bigotry and also serve their mission. Companies, state schools, media outlets, and other employers have a right, even an obligation, to say: at least take it off the premises.

But more important, the free speech argument ignores two centuries of history. By refusing to offer moral justifications for what we say right now, we surrender our place in the ongoing discussion. Falling back on first principles, without acknowledging the evolving debate, means retreating to the beginning and starting again. Whenever we cite free speech, like a child taking his ball and going home, we fundamentally defend our refusal to have this discussion, now.

As in athletics, the rules aren’t the game. They simply define how we’ll play, to guarantee everyone a competitive chance. Quoting Rule #1 after every challenge is basically demanding a Mulligan whenever you mishandle the ball. Not that free speech is outdated; our debate needs a defensible starting point, and saying the state cannot preemptively silence anybody just makes practical sense. However, if we keep returning to the start, we don’t live in the present.

Thus, when I see people flying Confederate and Gadsden flags, I see someone stuck in time. Their slate of moral principles hasn’t evolved since 1865, over 150 years ago, because they don’t live in the present. Perhaps the world around them doesn’t exist right now, subjectively. Yet retreating into moral foundations means demanding the state’s defense to avoid addressing reality. Appealing to the state, with its power of arrest and trial, is structurally an appeal to violence.

Maybe people using the free speech defense have forgotten what they’re fighting for. If they believe the American promise, how dare anyone use size and volume to shout down dissenters? Yet they keep re-fighting the Revolution and Civil War because it’s all they know how to do. They vanish into a mythic American past, perhaps, because war is the last gasp of dying ethics. They just use living values to keep a dying fight alive.

Thursday, May 4, 2017

Jimmy Kimmel is Wrong, and That Hurts



People who watch more TV than I do tell me that Jimmy Kimmel weeps pretty easily. So maybe audiences weren’t surprised Monday night when he barely contained his emotions while describing his newborn son’s diagnosis of a life-threatening congenital heart defect. The video certainly gained traction online, particularly among viewers touched by his thesis, delivered near the conclusion: “No parents should have to decide if they can afford to save their child's life.”

Yet apparently that statement isn’t as uncontroversial as Jimmy and I believe. Almost simultaneously with Kimmel’s twelve-minute monologue, I argued online with an anonymous provocateur who insisted no tax money, none whatsoever, should go toward teenage girls who get pregnant out of wedlock. When I asked whether that moral equation unfairly targeted newborn children, this person literally replied: “Not my problem. Maybe if a few more people suffered they wouldn’t make the same stupid choices.”

Readers who share my moral convictions, which realistically means most people reading this essay, will feel the same shock and umbrage at this statement. I could discuss the epigenetic consequences of thrusting infants into deprived conditions, presetting their brains into lifelong patterns of pain avoidance. I could discuss the lingering moral and economic consequences of a society that kicks the weak. But let’s be honest, I’d be preaching to the converted. You probably already believe this.

The family photo with which Kimmel accompanied his monologue

Rather, I choose to focus on two elements, which are really one. First, this person, writing under the shield of anonymity, believes society has no obligation to protect children, the poor, and other disadvantaged. Second, she believes the government should cause people to suffer to serve moral ends. Contra Jimmy Kimmel and me, this person really believes some people should be too destitute to raise children, damn the consequences, if it suits her moral framework.

Combined, these are the attitudes of a sociopath. To use proper clinical language, Antisocial Personality Disorder describes a non-genetic, learned tendency to view others as categorically lesser than oneself. The sociopath doesn’t perceive others as having rights or deserving protection, so screwing the powerless is a perfectly reasonable action; only those powerful enough to fight back deserve deference. This is one possible outcome of Nietzsche’s will-to-power philosophy, and a guiding principle of Ayn Rand’s Objectivism.

Unfortunately, it’s also the attitude of a certain subset of American political discourse today. We saw this during the Republican primary debates of 2011, when an anonymous audience member answered a question about saving an uninsured man from preventable death by shouting “Let him die!” People who are poor, or who didn’t save money, or who hit unfortunate circumstances beyond their control, aren’t unfortunate, in this social arrangement. Those with less are undeserving, inherently inferior people.

I recently sat listening while a self-described libertarian fulminated about the evils of poverty relief. I’ve learned not to interrupt such people, even when they say something measurably wrong; it only energizes them. So when he said, of poverty assistance, “When did we take away the right to fail? That’s the only time people move forward,” I realized something. This schlemiel thinks chronically impoverished citizens have encountered momentary setbacks, and could recover, like dot-com entrepreneurs!



To the political sociopath, the poor are poor because they deserve punishment. Perhaps they made bad choices, like getting pregnant out of wedlock. Perhaps they bought a house before the market cratered. Perhaps they’ve hit an economic snag, and ought to reorient themselves to the majority. Whatever it is, to the sociopath, the sufferer’s plight has no relevance. Appeals to common morality, like those Jimmy Kimmel and I make, fail, because they don’t care.

I don’t say this to propose solutions. I have none available, because these people see the world so completely differently from me that I cannot bridge the gap. Confronted with the reality that it takes money to make money, or that our financial system has structurally separated work from pay, or even the reality of starving infants, these people respond, as my interlocutor did, “Not my problem.” Their only philosophy is, what’s mine is mine.

Kimmel, in his monologue, claims the idea that children shouldn’t die is bipartisan. But clearly it isn’t. Kimmel’s underlying message, that society has an obligation to help its poorest and most defenseless members, falls flat for people who think suffering is educational, or temporary, or compartmentalized. Some people in Western society literally don’t care if babies die. And that’s a judgment on us all, even those of us who don’t share that twisted, soulless morality.

Monday, May 1, 2017

The Eternal Now Generation

Bruce Cannon Gibney, A Generation of Sociopaths: How the Baby Boomers Betrayed America

If anyone doubts the lingering socioeconomic influence of the Baby Boom Generation, the most numerous age-based demographic in American history, tell a Bob Dylan fan he’s getting old. I dare you. Dropping a reference to Big Bob’s age on a then-popular website about a decade ago attracted a firestorm of criticism, and remains the only time I’ve ever been banned from a user-generated content site. For noting that his fans looked old in tie-dye.

American attorney and venture capitalist Bruce Cannon Gibney gained attention for aggressive writing, and a willingness to court controversy, long before he became a full-time writer in 2015. His first published book deliberately provokes dispute by linking America’s less-than-stellar economic performance since the middle 1970s with the Baby Boom’s accession to political and capitalistic dominance, a position Boomers have maintained even as two subsequent generations reached adulthood amid a weakened economy. Gibney assigns blame with gusto.

Unlike their parents, the Greatest Generation, who hit adulthood amid the shortages of the Great Depression and World War II, or their European and Asian contemporaries, who grew up with post-war rationing and devastated states, American Boomers grew up in relative comfort. With parents educated on the GI Bill, constant access to TV and other anodyne entertainment, and the expectation of permanent economic growth, Boomers considered comfort their entitlement, pursuing it at all costs.

Reading Gibney’s subtext, I encounter my first factual problem: his archetypal Boomer is apparently white. He even acknowledges this (in a footnote). That isn’t entirely unfair: he notes, “there were roughly as many white Boomers in 1990 as all ethnic minorities, of all generations, combined.” So treating middle-class white interests and Boomer demands as interchangeable has some foundation. Nevertheless, it typifies Gibney’s tendency to forcibly homogenize diverse groups, even those bound only by age.

Still, one cannot review this book without reversing more often than a ping-pong ball. Yes, Gibney paints with a broad brush, but a well-earned one: he demonstrates a depth of research missing from polemicists and rabble-rousers. He laces this book heavily with both footnotes (he obliquely concedes David Foster Wallace’s influence) and endnotes. His text is unusually long for its genre, over 350 pages plus front and back matter, presumably to shore up his controversial thesis.

Bruce Cannon Gibney
Discussions of Boomer politics usually begin with anti-Vietnam protests. See, the implication runs, Boomers started out progressive before somehow embracing the Reagan Revolution after becoming America’s largest voting bloc in the middle 1970s. The reversal remains inexplicable. Except Gibney collates then-current polling data to indicate Boomers disproportionately favored the war; they just wanted somebody else to fight it. Cue multiple draft deferees, like Trump and Bill Clinton, or stateside chair-sitters, like George W.

On taxes, Boomers have generally favored themselves at others’ expense. They voted for deep income tax cuts as they entered the workforce; capital gains tax cuts only after accumulating large stock portfolios; and inheritance tax cuts as their parents faced mortality. Using officially published statistics (which, y’know, lies, damned lies, and…), Gibney computes Social Security will be exhausted about when the youngest Boomers turn eighty—that is, when most of them will statistically be dead.

One could argue, as Gibney briefly does, that many leaders who crystallized Boomers’ opinions, like Ronald Reagan or Jerry Falwell, emerged from the Greatest Generation. Gibney quickly dispatches that argument by noting that leaders accomplish little without numerous, influential followers. Reagan and Falwell gained predominance as Boomers entered the voting pool, and political priorities have evolved in step with one specific age cohort’s shifting expectations. While Boomers remain numerous, Boomer issues will dominate our system.

Gibney bolsters every issue he approaches, and there are many, with generous evidence demonstrating that Boomer concerns steer the discussion. Taxes and deficits, education, criminal justice, and more: their constant evolution keeps redounding to Boomer benefit. Gibney doesn’t much address culture and art, except where they serve his argument. A financier himself, Gibney cares more about financial than cultural injustice. And he leavens his jeremiad with grim humor reminiscent of Matt Taibbi or P.J. O’Rourke.

Many readers will find this book ageist and blinkered. Others more sympathetic to its general thesis, like me, may nevertheless find fault with Gibney’s arguments, and find places that need shoring up. (Point-by-point refutation would take too long here.) But mass agreement probably isn’t Gibney’s goal. He wants instead to change the discussion by refocusing it through a generational lens, and he succeeds. Even if his thesis needs revision, he’s almost certainly already changed the debate.