Friday, June 23, 2017

More Human Than You

1001 Books To Read Before Your Kindle Battery Dies, Part 83
David Livingstone Smith, Less Than Human: Why We Demean, Enslave, and Exterminate Others


Nazis characterized Jews as rats, while Rwandan propagandists called Tutsis cockroaches. Turning humans into household vermin justified killing them; colonists, by contrast, characterized Africans and Native Americans as cattle, which justified enslaving them. Both moves were instruments of control. To take away power from other people, we must first take away their human essence. But how do we do that, and why, and how do we live with ourselves afterward?

Philosopher David Livingstone Smith, in seeking sources to answer these questions, found that little has been written about dehumanization. The term gets discussed widely, especially in contexts of racism, sexism, and wartime propaganda. But little scholarly research has really addressed the social and psychological processes that let us perceive humans as “lower” life forms. In today’s brutally sectarian times, that seems an oversight, and Smith decided to rectify it.

We must begin any consideration of dehumanization with the question: what makes us human? This seems an obvious question, easily answerable by science, but that ease is an illusion. Excessively specific definitions of humanity risk excluding groups, from racial categories to the disabled. Broader definitions risk including chimpanzees. Were australopithecines and Neanderthals human? Contemplate the question, and humanity becomes a philosophical rather than a scientific category.

It helps to understand the concept of essentialism here. Smith explains it lucidly, and keeps refining it throughout this volume, but the concept looms so large it deserves some definition up front. Humans differ visibly: in skin color, height, language, ability. Yet across these superficial divides, we generally agree, some fundamental essence preserves our core humanity. The argument then becomes: what essence truly defines humanity? And does everyone classed “human” actually have human essence?

David Livingstone Smith
Humans, it appears, are master creators of categories and groups. While chimpanzees comprehend “Us” and “Them,” and sometimes brutally slaughter Them, only humans create narrow, intricate in-groups. Only humans create shifting alliances between such groups. Only humans institute rituals designed to reinforce such groups… and only humans show conscience enough to recognize when our group-creating inclinations harm the insiders we intended to help.

This capacity for unbounded cruelty, coupled with this unique ability to reflect on our own thinking— what Smith calls second-order thought— puts humans in the unique position of being both nature’s most destructive species, and its most creative. The two tendencies often travel together. The tendency to redefine humans as livestock, vermin, or monsters which need defeated has often produced humanity’s most inventive thinking, to our eternal discredit.

Understanding our capability for dehumanization requires delving into humanity’s most shameful history. Smith unpacks various genocides, like Rwanda, the Holocaust, the Turkish slaughter of Armenians, and Darfur. He also looks into European colonial history, where peoples once regarded as equals and allies, like Africans and Native Americans, became subhuman enemies almost overnight. The patterns Smith uncovers are chilling and informative. But as you’d expect, they make for difficult reading.

This dovetails into humanity’s tendency to create races. Social scientists and philosophers have written on how races, far from being consistent or biologically mandated, are created and constantly reinvented by human societies. I was particularly struck, in Smith’s analysis, by how early in life children begin dividing humans into groups, and how little those groups resemble the racial categories adults encourage others to fear. Racism both does and doesn’t need to be taught.

As the argument progresses, solutions become murkier. You cannot preach transcendent human essence to people who believe designated groups lack it. And even when stereotypes of designated groups prove unreliable, bigotry remains remarkably intractable, immune to evidence. Smith doesn’t insult readers’ intelligence with false hopes or pat solutions. He makes us live with the indictment, because nobody is immune from the capacity to push others outside humanity.

This isn’t a scientific text. Smith doesn’t rely on brain imaging, behavioral economics, or the other sciences currently voguish in mass-market nonfiction. Not only are such sciences less reliable than often peddled, but science lacks the vocabulary to describe the complex, amorphous interactions involved in this process. Humanity, and dehumanization, aren’t scientific facts to be analyzed, like amoebae. They’re philosophical concepts, changed by the fact of being examined.

Smith doesn’t pretend he has the last word. In his final chapter, Smith lays out questions that still need examined. This book represents an intermediate step in comprehending the ways human beings steal other humans’ essence. But as an intermediate step, considered with the sobriety the topic demands, it offers us an opportunity to move forward. We could reclaim our humanity by restoring it to others.

Wednesday, June 21, 2017

Time For a New Economic Yardstick

Lorenzo Fioramonti, The World After GDP: Economics, Politics and International Relations in the Post-Growth Era

The Gross Domestic Product has proven a mediocre economic measurement at best. It totals the cash value of all economic transactions, but doesn’t measure costs and benefits commensurately; car wrecks and traffic congestion have cash value, but lovingly restoring Grandma’s classic Fairlane doesn’t, unless we sell it. I grew disgusted with GDP fifteen years ago, when national leaders presented shopping as the solution to the 9/11 attacks.
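
For reference (my gloss, not a quotation from the book), the standard expenditure identity behind the headline number counts only priced transactions: household consumption, investment, government purchases, and net exports.

\[ \text{GDP} = C + I + G + (X - M) \]

Restored Fairlanes, home cooking, and clean air appear in none of those four terms; the car wreck, through repair bills and hospital stays, shows up in several.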

Lorenzo Fioramonti, an Italian-born political economist at South Africa’s University of Pretoria, begins this dissertation with a brief history of Nauru, an island nation once touted as miraculous for its powerful economy. After independence, it parlayed massive mineral reserves into Earth’s largest per-capita GDP. But that wealth wasn’t distributed equally, and GDP didn’t include non-priced factors, like environmental decay. When the minerals were tapped out, the miracle proved illusory. Nauru is now poor, physically blighted, and without hope.

Fioramonti sees a parable of modernity here. Economic measurements aren’t value-neutral; what economists count inevitably becomes what leaders and entrepreneurs pursue. Market price tracks social value poorly, so things we struggle to measure in dollars, like the environment or human communities, get forgotten… until catastrophe strikes. But it doesn’t have to be this way. Fioramonti progresses from grim history to optimistic forecasting.

GDP arose during World War II, for specifically wartime purposes: to quantify America’s ability to manufacture military supplies. Quoting several other economists, Fioramonti compares GDP to the Manhattan Project, a wartime planning tool that somehow persisted into peacetime and remains impervious to changing conditions. (It eventually triumphed even in the Soviet Union, whose preferred Leninist measurement failed to account for the service industry.)

Lorenzo Fioramonti
But even GDP’s chief inventor turned against his creation. Contemporary critics deride GDP for its inability to incorporate environmental costs: dirty air and flammable rivers have no price, and therefore no economic weight. As early as the 1930s, though, GDP pioneer Simon Kuznets realized his invention didn’t encompass human costs. Worn-out workers, sundered families, and communities severed from their roots have consequences, but no price, so they don’t get figured into the GDP.

And this only includes what happens visibly. Early in this book, Fioramonti uses a familiar, but still impactful analogy. He writes that “food cooked at a restaurant and purchased by consumers is registered as part of a nation’s economy, but the same food cooked at home and shared with family and guests is not.” We could continue: grocery shopping counts, gardening doesn’t; replacing old socks counts, darning them doesn’t.

This leads directly into Fioramonti’s most important precept for creating an alternate economic measure: “one important step in shifting attention is to make the invisible visible. This is what ‘post-GDP’ scholars and activists are trying to achieve.” This proves more aspirational than systematic. Though Fioramonti lists several alternate economic yardsticks devised since around 1975, none encapsulates every possible contingency. We need complementary measures, Fioramonti writes, not one-size-fits-all.

Among other topics, Fioramonti spends considerable time on what officials euphemistically call the “informal economy.” This sometimes means off-the-books accounting, like the Mafia, but it also includes everything productive we do that doesn’t generate money. Volunteer work, time spent with family, and home-cooked dinners all create value, but in ways that lack price, and therefore the GDP cannot track them. Does mom’s home cooking have no economic value?

“The GDP-induced categorization of work,” Fioramonti writes, “also hides the fact that only a fraction of people’s time is spent on formal jobs.” But other systems of measurement can include those unpaid hours. If GDP’s blindness to environmental destruction doesn’t convince you it measures the economy badly, then maybe you’ll be convinced when other measurements place value on your hobbies, community, or family. The GDP considers these wasted time.

I repeat, because Fioramonti does, that economic yardsticks are never value-neutral, despite what ardent capitalists claim. The GDP rewards whatever costs money, hides whatever “externalities” get buried off the books, and encourages reckless, interest-bearing debt. Fioramonti does a remarkable job detailing this history. Committed followers of events, like me, may have some prior familiarity with Fioramonti’s descriptions of what already exists, though he collates diverse sources in new, enlightening ways.

Then, when we’re convinced the status quo cannot continue, Fioramonti provides us the alternative. These aren’t just alternate accounting systems. They’re innovative value measurements, means of rewarding productive behaviors beyond slapping price tags on everything. Our world is changing, bringing the marketplace with it. If we don’t change our economic paradigms appropriately, history will surely leave us holding the bag for costs we’re not yet prepared to pay.

Monday, June 19, 2017

The Economics of Addiction


My father has emphysema. Who wouldn’t, after fifty-six years of smoking? Though he hasn’t been formally diagnosed, his pained breathing and persistent fatigue have finally forced him to use the word “emphysema” for the first time, at age 72. He once told me he began smoking at age fourteen, though he never mentioned why; after several false starts, he finally kicked the habit about eighteen months ago.

When my family relocated from California to Nebraska in 1992, so my parents could retire near where they grew up, I immediately noticed how many people smoked. My first job, behind the convenience store counter, involved accessing the tobacco rack for customers, a position for which I now suspect I was technically underage. While it wouldn’t be accurate to say most purchases included packs of smokes, enough did to worry me.

At that time, still under the sway of neoliberal political thinking, I would never have attributed economic reasoning to personal habits. I just wondered why smoking seemed so pervasive in Nebraska culture compared to California. Though I knew smokers Out West, they mostly kept out of sight, pursuing their habits less blatantly. Work never completely stopped at smoke-break time there, as it did in Nebraska, where smokers herded, lemming-like, toward the doors.

But thinking about my father’s struggling health, I realized I did see something similar in California. Though people smoking weed and consuming other drugs needed to maintain more cover than smokers do, the same basic behavior obtained. People embraced work, school, and other mandatory responsibilities as clear-headed as circumstances allowed, then when duty ended, they raced headlong to whatever substance made them feel human again. Legal or otherwise.

Addiction specialist Gabor Maté writes that understanding substance addicts in terms of recreational users is mistaken. Some people smoke weed, inject heroin, snort cocaine, and consume other drugs because their substances make them feel good. Addicts don’t want to feel good, however; they want to feel normal. They want whatever suffering infects their sober lives to vanish under the comforting glow of their favored substance, even for an hour.

Cocaine and heroin are painkillers. Before they became illegal, snake-oil salesmen included these drugs in their patent medicines because, no matter what else their concoctions included, Peruvian marching powder made the pain go away. So when considering what turns people into coke or smack addicts, or what hooks people on other painkillers like alcohol or Vicodin, we must look not at the drugs, but at whatever pain needs killed.


Nicotine and cannabis, however, aren’t painkillers. Like Valium, another widely abused substance, they’re anti-anxiety drugs. When jitters paralyze you, having a smoke, a toke, or a tab of V really drains the tension. So if painkillers require us to find the user’s unexamined pain, logic dictates, anti-anxiety drugs require us to find the unexamined anxiety. Why would a 14-year-old from bucolic western Nebraska have anxieties that need smoked out?

Rural life is frequently precarious. The principal economic driver, farming, is constantly subject to weather, market fluctuations, and other forces individuals cannot control. Dips in commodity prices take money from farmers, but also from businesses dependent on farmers, like equipment dealers, small-town banks, and entire rural downtowns. Despite tough-talking individualist myths, rural and small-town people, like western Nebraska’s population, live constantly on the verge of a sheer cliff.

Compare big-city life. Even after the collapse of 2008, the financial services sector remains America’s largest industry, in dollar terms. People wager massive fortunes on a 24-hour cycle. As we learned during the last economic contraction, financial services operates like a casino, plying big winners with rewards to keep them at the table. In Vegas, the rewards include comped drinks. One icon of bankers’ lifestyles is the three-martini lunch.

So while small-town people live constantly with the anxiety of hoping they’ll make next month’s payments, big-city moguls swallow the risks of gambling away Grandmother’s retirement savings. People raised in rural life, like my dad, or in California’s suburban uncertainty, smoke the anxiety away. Bernie Madoff-type gamblers, meanwhile, kill the pain of knowing they’re rewarded only while they’re winning, and could lose everything at any moment.

Cocaine and heroin have little presence in western Nebraska, where I live, but at my construction job, I’m among the few men who don’t use tobacco. This isn’t coincidental. People’s favored drugs reflect their circumstances, and their circumstances have dollar measurements. Though hard drugs remain the province of cities, where difficulty and pain define daily life, rural workers will always prefer to smoke their fears away.

Wednesday, June 14, 2017

Tommy Gunn's School Daze

Laurie R. King, Lockdown: a Novel of Suspense

Career day at Guadalupe Middle School will make or break Principal Linda McDonald. After turning a failing school around, Linda has managed to corral enough community members to remind her students they have a future. For one day, they’ll forget their personal dramas, the pains festering at home, and the two criminal investigations lingering at the margins, and celebrate their potential. Too bad somebody’s coming to school with a gun.

The dust-flap synopsis on Laurie R. King’s newest standalone novel is slightly misleading. Though the story promises a violent schoolyard confrontation, the anticipated explosion doesn’t actually arrive for over 300 pages. Rather, King places the emphasis on the buildup, the suspense as a school of over seven hundred students simmers. We know something’s coming. We’re left to wonder not what, but who, and why. Because King offers multiple suspects.

Principal McDonald has shepherded her school through several powerful conflicts in her first year. A well-liked, but possibly abused, student has disappeared, leaving behind a best friend pitching conspiracy theories pinched from Doctor Who. A high-school gangland murder drags the middle school in because the only witness was one of McDonald’s students. And those are just the problems McDonald can see. She has multiple cauldrons waiting to boil over.

There’s the kid harboring a nasty grudge and carrying something in his backpack so powerful, he can’t bring himself to think about it directly. The beautiful but damaged teen desperate to escape the stultifying strictures her political refugee parents place upon her. The principal’s husband, always fleeing his personal demons. The wannabe gang-banger desperate to prove his chops. And the janitor, known only as Tío, carrying a bloody secret.

Laurie R. King
Guadalupe MS, in the (fictional) agricultural community of San Felipe, California, has dozens of conflicting forces pushing on its students. They come from a mix of economic, racial, and cultural backgrounds: poor Hispanic migrant workers and software developers send their children to the same school, alongside fugitives fleeing poorly defined threats and working families hoping to shield their children from gangs. Add heat, and watch the chaotic combination boil.

Our story unfolds, minute by minute. King offers us glimpses into different viewpoint characters’ heads, so we see the same events from multiple contexts. What one character considers a flippant comment, another perceives as an insufferable slight. Principal McDonald has at least two opportunities to deflect the looming violence, but misses them because she can’t read students’ minds. Characters live in their own brains, never knowing how narrowly they miss one another.

This, King implies, is the theme of life in public society: everyone thinks their personal dramas are unique. Especially in middle school, with the simmering pressures of looming adulthood, every character sees their own conflicts, and doesn’t realize others have the same. King only addresses this indirectly, as when her wannabe gangster thinks the beautiful girls have life easy. But it’s constantly present: everyone has problems nobody else sees.

The front cover calls this “A Novel of Suspense,” which isn’t inaccurate: we know something catastrophic will happen, changing characters’ lives forever. But this isn’t like bog-standard procedurals or action potboilers. King offers an overlapping matrix of character dramas, inviting us to tease out secrets and layers. The suspense comes from wondering which of the many private controversies will eventually spill over into public violence.

King is most famous for writing novels starring Mary Russell and an obscure supporting character named Sherlock Holmes. The publisher bills this as King’s first standalone novel in over a decade. But even that isn’t entirely accurate, since there’s a brief chapter linking this novel to King’s lesser-known series protagonist, SFPD detective Kate Martinelli. It’s a fun teaser, but one needn’t know King’s prior works to appreciate this story.

If this novel suffers one shortcoming, it’s that King introduces so many characters, each with a private plotline, that she can’t possibly defuse them all. We know, with the bloody climax coming, that King can’t resolve both the miscommunication between the clique of insecure pretty girls, and Tío’s attempt to save the gangbanger from his myths. King starts multiple interesting stories which remain unfinished. Maybe she’s saving something for the sequel.

Nevertheless, she does a remarkable job displaying not only the causes of life-changing violence, but the lives that will be changed. Middle school is a crucible, even when literal blood doesn’t spill, a dark and brooding place where everyone thinks they suffer alone. And, as we read, we realize: maybe we aren’t so different from these kids ourselves. Which is actually a liberating thought.

Monday, June 12, 2017

Götterdämmerung, the Reader's Digest Version

Daniel Kehlmann, You Should Have Left: a Novel

A successful screenwriter rents an AirBnB in the mountains to write. His studio wants a sequel, and his family needs the money. Secluded in a vacation cabin with his glamorous, bored actress wife and their daughter, he finds the words beginning to flow. Until the bad dreams begin. And right angles don’t add up to ninety degrees. And his little girl wakes up speaking prophecies of doom.

Veteran novelist Daniel Kehlmann is a household name in Germany, but remains largely unknown to English-speaking readers. This novella might change that. Mixing elements of Shirley Jackson, Stephen King, and Elizabeth Hand, Kehlmann creates the kind of creeping dread that American paperback readers love, channeled through the kind of linguistic mindset that gave us Thomas Mann and Günter Grass.

The story unfolds as our nameless narrator’s journal. In part, he jots notes for his screenplay, a John Hughes-like coming-of-age comedy where two “Besties” adjust to an adult friendship. The screenplay provides ironic commentary on the events building around him, as his vacation home apparently grows new bedrooms overnight, reveals a massive mountain nobody else can see, and insinuates itself into his marriage. Symbolism abounds.

The story immediately invites comparison to King’s The Shining and Jackson's The Haunting of Hill House. Kehlmann doesn't even pretend to deny such allusions, though he doesn't acknowledge them either. But such comparisons, while accurate, are nevertheless incomplete. Kehlmann doesn't so much present a horror novella, as a novella of what Freud calls “the Uncanny,” the subconscious made manifest in the protagonist's senses.

The house seemingly accentuates its inhabitants’ identities. When the narrator and his wife fight, they fight like feral cats cornered in the same Dumpster. When they agree, they mesh like two hemispheres of the same brain. And when their daughter begins speaking grim, powerful words beyond her ken, they realize, almost without words, that the problem isn't them, it’s the house.

Daniel Kehlmann
Much more happens, of course. But not, I fear, enough. I like this book, and don’t want anybody to ever say I said otherwise, but man, this book is short. The story itself runs under 110 pages, which makes its $18 list price awfully steep. I read the whole thing inside three hours. And though Kehlmann is easily King’s or Jackson’s equal in scene-setting, his conclusion is abrupt, without resolution, leaving only questions.

Horror, after all, is a genre of balance. Writers must reveal enough to keep readers engaged, but withhold enough to maintain suspense. No wonder I mainly write poetry.

So I vacillate on how to review this book. For most of the reading experience, I wanted to lavish praise upon it: though he reuses tropes horror readers have seen before, Kehlmann employs them in ways that create tension and make us care about his characters. Then he just stops. His characters, complex and deeply humane, deserve more explanation at the denouement than he offers. It’s like he just got bored.

Perhaps I’m being overly critical. Kehlmann clearly comes from a literary, rather than genre, background. His emphasis rests on creating nuanced characters and telling details, rather than dawning fear. In that case, rather than Stephen King, his work more closely resembles Brian Evenson or Thomas Ligotti. As with those writers, the why of the situation matters less than its imminence for the characters.

Nevertheless, Kehlmann’s audience reads works like this every day. Veteran readers recognize both the similarities, and the differences, with King’s Jack Torrance or Liz Hand’s Julian Blake. We deserve some explanation of how our nameless narrator finds himself in this situation. Without that, the final four pages appear weirdly disconnected from what came before. Like Kehlmann was writing for a predetermined end, rather than one arising from the situation.

Translator Ross Benjamin has a long history of translating German-language writers into American English. His CV reads like a Who's Who of contemporary German writers, though most will remain unfamiliar to English-speaking audiences. The one readily familiar name also clarifies his qualifications to translate this novella: Benjamin has won awards for translating Franz Kafka.

On balance, I suppose I should recommend this book. Some of my favorite authors, like Joseph Conrad and Salman Rushdie, have difficulty writing resolutions worthy of the novels they’ve crafted. And King himself extols the short form because it exonerates authors from the imperative to explain. My disappointment arises not because this book is weak, but because it's so strong that it sets itself a high bar. This is a good novella. I just wish it was great.

Thursday, June 8, 2017

Welcome to Fatland

You’ve probably seen something like this image recently; several versions circulate. The challenge runs: don’t write another article on obesity in America until you explain why fatty, unhealthful foods cost less than their healthy, nutritionally complete equivalents. And I’ve seen several answers back, like: well, if you only eat McDonalds, then yeah; or, anything is cheaper deep-fat-fried than prepared in a healthy way. I’d like to offer just one possible explanation.

You’ve probably heard lots about America’s notoriously subsidized agriculture. Because of massive monetary transfusions used to keep farmers working and food affordable, American crops are often cheaper than the dirt they grow in. That’s especially true with today’s inflated land values. When NAFTA lowered trade barriers, subsidized American-grown food hit Mexican markets below the cost of growing, causing rural poverty to hit seventy percent in Mexico.

But those subsidies don’t go just anywhere. Since the Great Depression, America has subsidized just five staple crops: corn, wheat, rice, sorghum, and cotton. These staples all have long shelf lives, which means oversupply can depress their market value for a long, long time. If farmers overplant lettuce, it’ll rot within a matter of weeks. If farmers overplant corn—and who knows what’s too much at planting time?—markets could be destabilized for a year.

So, America has decided we owe our planters of cereal grains and natural fibers the dignity of a stable income. After all, an unstable grain market owing to oversupply jeopardizes farmers, but we still need to eat. Grains provide dietary fiber we all need, and unlike fruit or salad greens, we can ship corn to wherever it’s needed. Why not, therefore, dedicate public money to ensuring the people who grow our corn aren’t rolling the dice on uncertain markets?

Except that hasn’t been the effect. By subsidizing only a few crops, we’ve created cash incentives for farmers to overproduce these staples in massive quantities. Cotton is so cheap now that we use it to make disposable shop rags. According to agricultural journalist George Pyle, American farmers currently produce twenty times as much corn as American consumers can possibly eat. All that oversupply has to go somewhere.

And that “somewhere,” overwhelmingly, is animal feed. American agricultural policy doesn’t directly subsidize livestock agriculture. However, we have Earth’s cheapest meat, because by encouraging oversupply, we indirectly subsidize cattle farming. Cattle raised on grass, like God intended, reach market weight in about two years. Cattle raised on corn, fed to them in confined feedlots, reach market weight in about fourteen months. It’s a cash boon for livestock farmers.

A typical confined animal feeding operation—in this case, a hog pen

Stay with me here. The wheat used in making buns is directly subsidized. The beef slapped between those buns is indirectly subsidized. Even the cheese used to make the burger taste less like dead flesh is subsidized, because dairy oversupply keeps threatening to crash market values; the government buys excess dairy and pours it on the ground to stabilize prices. Does the government want us to eat more burgers?

Of course not. They just don’t want farmers subject to the instabilities of market fluctuations. Readers old enough to remember the “tractorcades” of the 1980s know that farmers are more beholden to market forces than most other producers. As we learned in 2008, housing oversupply is bad for home builders; but builders can store their tools, pull in their claws, and wait. Farmers, to keep their families together, often have to sell their land.

This says nothing about the side effects of agricultural policy. Subsidizing only five crops has led to massive monocropping, which depletes the soil of certain nutrients. To keep the land producing crops, farmers saturate it with fertilizers derived from hydrocarbons. American farms today produce more greenhouse gases than cars do, not from inefficiency, but because farmers need the five magic crops to show a profit. And nutrient-depleted topsoil washes away whenever it rains.

That seems simple enough. The makings of a burger are directly or indirectly subsidized, while the makings of a salad are not. If the ways we spend our money reflect our cultural values, then apparently we place higher value on maintaining certain food crops than on encouraging Americans to eat well. That priority, however moralistic this sounds, isn’t irrational: maintaining the status quo is cost-efficient, while changing the system, even a system that causes bad health, is scary.

Designing an agricultural policy that would result in more diverse crops, better land management, and healthier foods at more modest prices, will challenge even seasoned legislators. Even in today’s environment of armchair quarterbacking, I don’t dare extend myself this way. But somebody must. Because the meme isn’t wrong: we won’t tackle American obesity until ordinary Americans can afford better-quality food.

Wednesday, June 7, 2017

Did Chain Restaurants Just Declare Capitalism Dead?


The language couldn’t be more moralistic: “Millennials are killing chains like Buffalo Wild Wings and Applebee’s,” screamed the headline. As if the murder image weren’t clear enough, the tagline continues: “Casual dining is in danger — and millennials are to blame.” I wish I was kidding; had my students written this as fiction, I’d have returned it marked too high-handed to be plausible. But this screen-capture proves my point:



This unsubtle attempt to shift blame for flagging sales off chains that haven’t adapted to changing times, onto the young customer base they’ve long taken for granted, is stacked with assumptions. By saying customers are “killing chains,” the article makes customers into murderers, and chains into victims. By unambiguously assigning “blame,” it makes the demand side of economics responsible for chains’ fortunes, rather than the chains themselves.

But most important, it implies that Millennials, possibly the most poorly defined generational cohort since popular media began naming generations, have the same choices their parents had about spending money. This is, of course, ridiculous to anybody who follows economics. Starting wages are down, housing costs—especially in major cities—are way up, and entry-level jobs frequently require graduate degrees just for consideration. Youth have no remaining money for hot wings out.

Though writer Kate Taylor acknowledges, in her article, that blaming Millennials has become “a trend to the point of cliché,” she ultimately maintains the pattern, squarely hanging responsibility for chains’ fortunes on young, supposedly childless customers. Throughout her diatribe, Taylor implies customers have a moral obligation to create demand for poor, beleaguered chains. Which spits in the eye of that beloved libertarian fetish, the supply-demand curve.

Think back to college economics. You took college economics, right? If statistics hold, you probably didn’t; your last mandatory exposure to economic theory probably happened in high school civics class, sandwiched between a unit on the Constitution and one on the War Powers Resolution of 1973. Therefore, you probably have a glimmering of the supply-demand curve, but nothing concrete. You vaguely remember that when supply equals demand, we know what something is worth.
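
For anyone who wants the refresher, here’s the textbook exercise in miniature, with stylized linear curves (the numbers are mine, purely illustrative):

\[ Q_d = 100 - 2P, \qquad Q_s = 20 + 2P \]
\[ Q_d = Q_s \implies 100 - 2P = 20 + 2P \implies P^{*} = 20, \quad Q^{*} = 60 \]

Price adjusts until quantity demanded equals quantity supplied. Note what the model doesn’t say: nothing fixes the quantity in advance, and neither curve owes the other anything.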

But Taylor’s article says that demand should rise to meet supply. So-called casual dining chains have grown since around 2000 by oversaturating markets, mostly suburban malls, and maintaining the just-in-time resupply model pioneered by big-box retailers like Walmart. This flood-marketing model has, for years, served to create demand among young families with reliable income and limited expenses. McDonald's and Pizza Hut used the same model in prior generations.

That model demands steady, basically white-collar economic growth. Its failure dominated the 2016 presidential campaign: Hillary Clinton stumped on the same suppositions, while Donald Trump channeled outrage at the disappearance of jobs in manufacturing and extraction. The jobs that haven’t been automated have largely been shipped overseas. Domestic employment has bifurcated into executive leadership and the service industry. Working Americans now sell each other, well, Buffalo Wild Wings.

Chains, facing slumps because youth would rather eat at home, or cannot afford to eat out, cry foul. Because they think demand should follow supply, not vice versa. Economists call this “induced demand.” I first encountered the concept in models of urban parking: more parking lots cause more driving, largely because putting stores and homes further apart makes walking an unsustainable cost. Gasoline is cheap; walking becomes tedious.


But food no longer follows that model. As better grocery stores increasingly offer inexpensive delivery service, and meal-kit marketers like Blue Apron make gourmet home-cooking available to pure amateurs, simply building new stores behind massive parking lots doesn’t induce demand anymore. In short, market forces no longer make dining out a desirable choice: it’s no longer cheap and convenient, especially relative to youths’ wages.

Libertarian capitalism declares that market forces are sacrosanct. Whatever people willingly pay for must perforce be good. Don’t interfere with markets. Unless, apparently, markets shift, and a previously successful business model becomes untenable. At its core, this article declares that demand exists to serve producers, not vice versa, and that customers who don’t want the product are just wrong. Sounds like chain restaurants just declared libertarian capitalism dead.

Productive American industries have cut costs, including labor, for three generations. And service providers now whine that customers won’t, or can’t, buy their product. They think they’re crying foul, as if an entire generation refused to play by the supply-side rules. Underneath, though, they secretly concede the truth: they don’t trust customers. Market providers apparently just can’t handle capitalism if it doesn’t serve the capitalists.

Monday, June 5, 2017

Wonder Woman and the True Meaning of No-Man's Land

Wonder Woman (Gal Gadot) preparing to go "over the top" into No-Man's Land

You’ve seen the trailer footage: Diana Prince, Wonder Woman, clad in Greco-Roman armor, pausing to stand tall amid a fire-blackened landscape, before charging into overlit tracer fire. The footage doesn’t make entirely clear that she’s just risen from a British trench in the Great War, crossed into No-Man’s Land, and begun to charge the German line. And, when the sheer volume of bullets inevitably pins her down, men crest the trench and follow her lead.

This doesn’t just create a good visual. After a three-movie streak of stinkers from DC studios, this moment demonstrates what makes superheroes, something Zack Snyder apparently doesn’t appreciate. Heroes represent, not the compromises we’re willing to live with, as with Snyder’s Superman, but the aspirations we pursue, the better angels we hope to become. We all hope, faced with the nihilism of the Great War, that we’d overcome bureaucratic inertia and face our enemies head-on.

In some ways, this Wonder Woman, directed by relative novice Patty Jenkins, accords with DC’s recent cinematic outings. Diana’s heroism doesn’t stoop to fighting crime, a reflection of cultural changes since the character debuted in 1941. Ordinary criminals, even organized crime, seem remarkably small beer in today’s world. Crime today is often either penny-ante, like common burglars, or too diffuse to punch, like drug cartels. Like the Snyder-helmed movies, this superhero confronts more systemic problems.

But Snyder misses the point, which Jenkins hits. Where Snyder’s superheroes battle alien invaders, like Superman, or pummel the living daylights out of each other, Wonder Woman faces humanity’s greatest weaknesses. The Great War, one of humanity’s lowest moments, represents a break from war’s previous myths of honor. Rather than marching into battle gloriously, Great War soldiers hunkered in trenches for months, soaked and gangrenous, seldom bathing, eating tinned rations out of their own helmets.

Steve Trevor (Chris Pine) and Wonder Woman (Gal Gadot) strategize their next attack

This shift manifests in two ways. First, though Diana speaks eloquently about her desire to stop Ares, the war-god she believes is masquerading as a German general, this story is driven by something more down-to-earth. General Ludendorff’s research battalion has created an unusually powerful form of mustard gas. The very real-world Ludendorff, who popularized the expression “Total War,” here successfully crafts a means to destroy soldiers and civilians alike. He represents humanity’s worst warlike sentiments.

Second, this Wonder Woman doesn’t wear a stars-and-stripes uniform. Comic book writer William Moulton Marston created Wonder Woman as an essentially female version of Superman’s American values, an expression externalized in her clothing. This theme carried over into Lynda Carter’s TV performance. But this Wonder Woman stays strictly in Europe, fights for high-minded Allied values rather than one country, and apparently retires to curatorship at the Louvre. Her values are unyoked to any specific nation.

Recall, Zack Snyder’s Superman learned from his human father to distrust humankind, and became superheroic only when threatened by Kryptonian war criminals. Diana, conversely, learned to fight for high-minded principles—which she learned through myths which, she eventually discovers, are true without being factual. Snyder’s Superman, in fighting General Zod, showed remarkable disregard for bystanders, his film’s most-repeated criticism. But Diana charges into battle specifically to liberate occupied civilians. The pointed contrast probably isn’t accidental.

Unfortunately, Diana learns, war isn’t about individual battles. She liberates a shell-pocked Belgian village, and celebrates by dancing with Steve Trevor in the streets. But General Ludendorff retaliates by testing his extra-powerful chemical weapons on that village. No matter what piteous stories she hears about displaced, starving individuals, ultimately, her enemy isn’t any particular soldier. It’s a system that rewards anyone willing to stoop lower than everyone else, kill more noncombatants, win at any cost.

This picture doesn't serve my theme; I just really like that it exists (source)

In a tradition somewhat established by the superhero genre, Diana culminates the movie with a half-fight, half-conversation with her antagonist. Ares offers Diana the opportunity to restore Earth’s pre-lapsarian paradise state by simply scourging the planet of humanity. (Though Greek in language, this movie’s mythology reflects its audience’s Judeo-Christian moral expectations.) Diana responds by… well, spoilers. Rather, let’s say she simply resolves that fighting the corrupt system is finally worthwhile, even knowing she cannot win.

Wonder Woman’s moral mythology resonates with audiences, as Superman’s doesn’t, at least in the Snyderverse, because she expresses hope. Watching Diana, we realize it’s easy to become Ludendorff, wanting to not just beat but obliterate our opponents. Yet we desire to emulate Diana, standing fast against human entropy and embodying our best virtues. Diana is a demigod, we eventually learn, and like all good messiahs, she doesn’t just rule humanity, she models humanity’s truest potential.

Tuesday, May 30, 2017

The Appalling Return of Honor-Based Politics

Greg Gianforte, who may be the first Congressman
sworn in while facing an assault charge
When Congressman-elect Greg Gianforte physically assaulted a journalist, in a move so brazen even Fox News “watched in disbelief,” the attack itself seemed secondary to me. When Gianforte handily won what observers had previously called a closely divided special election, I realized we’d witnessed something new appearing in American politics. The next day, the Portland double stabbing confirmed it for me: we’ve entered a new period of honor-based politics.

I’ve known several people who complain that American culture today lacks honor, that we’ve become a systemically dishonorable people dwelling neck-deep in shame and disrepute. Most toss this off fleetingly; the two I know who most vigorously repeat this accusation are ex-military, veterans of a culture steeped in honor. I think I understand their meaning, as American mainstream culture has lost any sense of shame, from politics to prime-time TV.

Yet restoring honor is no catch-all solution. Honor, the sense that today’s actions cling to one’s name into the future, has its appeal in a society where even sexual assault boasts can’t derail political campaigns. We need people to face consequences for their decisions; some people should need to spend time reclaiming the integrity of their names. But restoring a literally medieval honor code won’t solve the problem.

Malcolm Gladwell’s book Outliers dedicates an entire chapter to honor-based societies, to the elaborate culture of claims and counter-claims, of confrontation and defense, that constitutes such societies. Honor systems are bound up in elaborate rules for how to answer somebody’s challenges. Tellingly, Gladwell grounds his description of honor society in the southern Appalachians: Hatfield and McCoy territory. Honor societies share one common cultural manifestation, the family blood feud.

Honor culture, in Gladwell’s telling, rests on a network of rules, mostly unwritten, of how to uphold one’s name. One such rule: no challenges go unanswered. If anyone impugns your name, that requires immediate response. If anyone questions you, even incidentally, you must answer immediately. Failure to answer leaves a stink of dishonor clinging to your person, which cannot wash off until you provide some response to reclaim your name.

In the Fox News telling, Guardian reporter Ben Jacobs treated Greg Gianforte very rudely. He interrupted an interview already in process, refused to switch a voice recorder off when directed to do so, and attempted to hijack the conversation. In a dignity-based culture, Gianforte could’ve sat back, let the Fox cameras capture Jacobs behaving dickishly, and won the debate without opening his mouth. It was his fight to lose.

You knew this guy was coming
into the story at some point, right?
But in an honor-based culture, Jacobs questioned Gianforte’s positions, which means he questioned Gianforte’s integrity. If Gianforte couldn’t respond quickly, and preferably with sufficient force to stop all future questions, he’d look weak. In honor societies, personal feuds can continue for years, dragging entire communities and families down, and end only when one participant is too thoroughly demolished to ever fight back. As you know if you’ve seen Hamilton.

Possibly emboldened by Gianforte’s victory after attacking somebody, Portland stabber Jeremy Joseph Christian went on a rampage the next day. It’s already been documented how his racist rantings on public transportation escalated to violence when onlookers intervened, creating two new heroes. But it’s largely the same motivator: somebody interrupted him, challenging his position. He needed to answer, quickly and overwhelmingly, to restore his name. That never ends well.

Gladwell’s description of honor culture attributes the Hatfield-McCoy dynamic to learned culture. The honor-bound behaviors of the southern Appalachians reflect the same attitudes found in the Scots-Irish homelands from which the region’s white residents first emigrated. Though Gladwell stops short of attributing the cause to genetics, he nevertheless hangs the motivation on the region’s Celtic heritage. These people feud, he implies, because they're a bunch of angry fighting Micks.

But I suggest something different happens. The Scots-Irish left, or more accurately got forced off, marginal land in impoverished, colonized countries, and wound up on marginal land in impoverished, hegemonized states. The same dynamic of marginal land, chronic poverty, and cultural subjugation obtains in Gianforte’s Montana today. And though Portland is hardly poor and marginal, Fox and Breitbart have convinced many whites they’re living in such conditions, despite the evidence.

Therefore I suggest we’ll probably face similar situations again. Voters in places like Billings, Montana, and eastern Oregon, preponderantly supported a tiny-handed President who sees ordinary questions as personal affronts, and answers every challenge by attempting to destroy the challenger. I once thought we were facing expressions of America’s id. Now, I think we’re seeing white America defend its honor.

Friday, May 26, 2017

Where American History Goes To Die

1001 Books To Read Before Your Kindle Battery Dies, Part 82
James W. Loewen, Lies My Teacher Told Me: Everything Your American History Textbook Got Wrong

American attitudes toward history are deeply contradictory. On the one hand, we reverence our past and make demigods of our Founding Fathers. On the other, we’re frequently altogether mistaken, even flat damn wrong, about what they actually did. Demagogues use this factually muddled reverence to manipulate us toward ends we don’t want and scarcely understand. How did we reach this point, and how can we combat it?

Harvard-educated historian James W. Loewen became a celebrity within education circles when Mississippi rejected a state history textbook he co-wrote, on the grounds that it focused too heavily on racial matters. He challenged this decision in court, and his 1980 victory became a landmark in First Amendment battles: states cannot reject textbooks simply because they dislike the content. This philosophy underlies much of Loewen’s later writings.

Who were the Native Americans? Was John Brown, the violent abolitionist who attacked Harper’s Ferry, insane? Who took the lead in the Civil Rights Movement? And what, really, happened in Vietnam? Your answers to these questions probably reflect how these topics were taught, or frequently avoided, in your high school American History class. They probably inform how you think and vote today. And, Loewen demonstrates, they may be wrong.

Loewen begins this, his most famous book, with twin anecdotes about how two figures, Helen Keller and Woodrow Wilson, get described in high school-level American History textbooks. Keller, a longtime labor activist who believed capitalism threatened American values, gets reduced to a child who triumphed over adversity. Wilson, a racist who invaded several countries on specious pretexts, gets elevated to a progressive icon whose economic policies delayed the Great Depression.

James W. Loewen
These anecdotes, which Loewen finds repeated across twelve textbooks widely used in American public education, represent the process by which controversy and debate vanish from American history. People who never study history beyond high school never realize that Keller and Wilson, plus Columbus, Lincoln, and other sanctified icons, had deep conflicts and remain controversial today. Textbook authors would rather elevate heroes and celebrate triumphs than acknowledge America’s fraught past.

Textbook history, Loewen finds, tends to present the past as a succession of heroes advancing American greatness, pushing us toward ever-better displays of virtue. Students get no sense of setbacks, struggles, and the difficulty we still face achieving America’s stated principles. Thus, many citizens believe the present somehow represents a decline from a storied past, and see today’s controversies as irrevocably cluttered and dangerous. Which they’re not.

Sometimes this requires Loewen to debunk specific myths. The virtual erasure of both racial and economic factors from history textbooks leaves Americans believing the controversies over these topics are somehow recent. Even slavery gets divorced from race. Yet when Loewen reprints a pre-Civil War campaign song, “Nigger Doodle Dandy,” used to split poor white voters from blacks, America’s long history of race- and class-based divisions becomes glaringly obvious.

Other times, Loewen sets specific myths aside, preferring to focus on the myth-making process holistically. How did the First Thanksgiving become a sort of American Genesis myth, one in which Native Americans are mere guests? How did poverty and want vanish from history texts? Why is the entire Twentieth Century often addressed in under fifty pages, as though textbooks fear to approach the recent past? How did so much get omitted?

Multiple explanations exist. Textbook authors write, not for students, but for textbook committees, which often don’t include trained historians. Education departments fear the wrath of powerful private interests, which would often rather have students loyal and patriotic than open-minded. Many high schools hire history teachers primarily to coach athletics, leaving classrooms to instructors with only a cursory background in the discipline. And these are only some of Loewen’s diverse, scary explanations.

Partway through this book, Loewen says one thing I cannot support: he insists that history, alone among disciplines, is so badly taught in high school that college professors must spend entire semesters breaking students from oft-regurgitated myths. But that’s not so: Paul Lockhart says something almost identical about math, and Gerald Graff says that about English. Sadly, much higher education involves students unlearning ignorance propounded in public schools.

In a democracy, history matters to how citizens approach the present. Citizens who don’t understand that history is both contingent, and ongoing, can’t make informed decisions about their government. The failure to understand history’s themes often colors our tendency to approach the present with either outrage or helplessness. If schools won’t educate Americans, we must educate ourselves. Loewen provides the tools to begin that dangerous process.

Wednesday, May 24, 2017

What Do You Call a Thriller That Doesn't Thrill?

Dan Chaon, Ill Will: a Novel

Nearly thirty years ago, someone murdered thirteen-year-old Dustin Tillman’s extended family, leaving Dustin and his older female cousins orphaned. Dustin’s testimony steered his adopted older brother, Rusty, to Nebraska’s Death Row. But DNA evidence has exonerated Rusty. Dustin, now a successful Cleveland therapist with kids and a critically ill wife, must grapple with his brother’s sudden reappearance in his life… as a fresh round of killings begins in his area.

Award-winning novelist Dan Chaon has good intentions with this novel. He takes premises from genre fiction, filtered through the techniques of high-minded literary fiction. But like a sleeper couch, the hybrid he creates performs both functions with equal discomfort, in a way that will satisfy neither thriller readers nor literary cognoscenti. By populating his merely ordinary story with supremely unlikable characters, he leaves audiences nowhere to hang their hats.

First, Chaon’s nonlinear storytelling confounds where it should clarify, and vice versa. Stringing events together in an unsequenced montage, like a hip-hop filmmaker improvising at the editing table, Chaon lets events coalesce from context and inference rather than organically. Joseph Conrad did this in Nostromo, where the confusion of secondhand information was partly his point. When we have access to viewpoint characters’ thoughts, as we do here, it just looks sloppy.

Moreover, as narration tapdances without chronological coherence, experienced thriller readers will start watching for whatever the viewpoint character leaves out. We understand how unreliable narrators work. We read this shit every day. Within thirty pages, it becomes painfully clear which character has omitted which important information from the recounting. This isn’t a mystery, where our protagonist must coax reality from conflicting evidence. The characters are just lying to the audience.

Dan Chaon
Chaon’s characters, besides being willfully dishonest, are also unpleasant. Not ordinary unpleasant, like Sam Spade, whose impromptu ethics define his story. Dustin Tillman, who has buried childhood trauma in marriage and career, handles his wife’s early death by descending into alcoholism and parental negligence. His son Aaron becomes a quasi-goth junkie with homoerotic tendencies, presumably because “gritty realism” sells. Dustin’s cousins use promiscuity to plug the vacancies in their souls.

The characters come across, not as hard-boiled, but as merely dickish. Everybody’s morally vacuous, but not for any story-based reason. Indeed, I’m not entirely sure even Chaon understands why his characters do anything. Dustin, a therapist, revisits his childhood with Rusty (Rusty & Dustin, geddit? Jazz Hands!) in terms transcribed almost verbatim from the DSM-5. Chaon cursorily plugs proper nouns into the description and apparently considers his authorial responsibilities thus covered.

But Dustin is a deliberately unreliable narrator. What about his son Aaron, the junkie? His described descent into addiction, debauchery, and crime, feels memorized from ONDCP pamphlets and Tarantino movies. I don’t believe these events for one damn moment. Dustin’s cousins behave wantonly because what self-respecting attractive 17-year-old doesn’t? Flashes of homosexuality, incest, and domestic abuse evidently happen because literary fiction authors have little else to elicit emotional responses anymore.

Parallel to all this, Dustin befriends a former patient, an ex-cop who deals with being benched by diving into tinfoil-hat-wearing conspiracy theories. Aqil Ozorowski, who sounds like an error at the Scrabble factory, has identified a pattern of college-aged men disappearing at regular intervals and turning up later, drowned. He claims law enforcement is ignoring the truth. His wild surmises seem harmlessly annoying, until his pattern strikes Dustin’s family directly.

I feel so cynical describing Chaon’s work this way. His well-crafted narration, which sometimes reads like Rimbaud’s epic prose poem A Season in Hell, deserves some mention. At the sentence level, Chaon writes well. But contra the advice sometimes dispensed by undergraduate writing instructors, writing is more than constructing good sentences. He’s chosen a genre with a dedicated, experienced audience, and apparently doesn’t realize his readers recognize the boilerplates.

Clearly Chaon wants to combine genre fiction’s gut-level sensory immediacy, with literary fiction’s thoughtful investigations of character motivation. But he doesn’t realize his thriller aspects are recycled, or that his characters treat the reader with contempt. I cannot help comparing Chaon’s story with Belinda Bauer’s Blacklands, which accomplishes what Chaon apparently cannot. Where Bauer explores her characters, Chaon acts like an exhibitionist. Bauer is morally ambiguous; Chaon is just unpleasant.

Somewhere around the one-third mark, I lost all motivation to keep reading. I realized I didn’t care if these characters all died in a fire. I just couldn’t bring myself to persevere. That, fellow reader, may say everything you need to know about this joyless cinder block of a book.

Monday, May 22, 2017

The Struggles of 2017 (As Seen From 1968)

Alain Badiou, The True Life

Western traditions and moral foundations are withering, says Alain Badiou. Religion and politics are vestiges of an older time, while capitalism reduces us alternately to children and instruments. In this series of talks, originally directed at adolescents, Badiou questions where youth culture could head in an era when we distrust the past and cannot count upon the future. Answers aren’t much forthcoming, but in philosophy, sometimes the questions matter more.

As a sometime academic and recent convert to contemporary French philosophers, I had high expectations for this book. But even I found Badiou’s prose dense, his reasoning tangential, and his conclusions unsupported by evidence. He presents an opaque philosophy, putatively for teenagers and young adults, that even grey-haired scholars may find confusing and impractical. And it verges, at times, on messianism. I can’t imagine whom Badiou is actually writing for.

Much of Badiou’s philosophy comes straight from his foundations in the Paris of 1968. He is both agnostic (he says atheist, but fudges) and an unreconstructed Leninist. He draws on an ecumenical selection of sources: Plato and Lacan, Rimbaud and Marx. But he isn’t merely beholden to his influences; he goes beyond them, comments on their thoughts, and attempts to weave his Situationist-era roots into the smartphone age.

The result is, shall we say, chaotic. Badiou caroms from the necrotizing consequences of late capitalism; through the imposed roles of young and old, who, he believes, should ally in rebellion against the middle-aged system; through the importance, and absence, of unifying adulthood rites in a post-religious society; to gender roles and, honestly, I’ve forgotten what all else. His underlying thesis is, apparently, that modernity is confusing. Anyone could’ve written that.

Alain Badiou and friend
Not that I’d call Badiou wrong. He says plenty I find appealing. For instance, he writes that a secularized society without clear adulthood rites traps citizens in perpetual adolescence. “The adult,” he writes, in one of my favorite quotes, “becomes someone who’s a little better able than the young person to afford to buy big toys.” Capitalism, in Badiou’s analysis, turns functioning grown-ups into vehicles of juvenile appetite.

He hedges on this later. Not people, but boys specifically, occupy a permanent teenaged wilderness. Capitalism stunts boys well into senescence, but turns girls into women from the cradle. So, tacitly, he accepts males as “normal” and females as “exceptional.” This becomes most apparent when he says that if you look at a woman, “really look at her,” atheism is proven. He doesn’t say how. I know female pastors who’d disagree.

So, okay, Badiou makes weird statements and assumes his readers’ preferential agreement. That doesn’t make him wrong. Indeed, he’s a veritable assembly line of meaningful quotes about modernity’s essential vacuity. “The career is the hole-plugger of meaninglessness,” he says of how men’s adulthood is purely instrumental to capitalism. Or of women’s roles: “There are some women who are laboring oxen and some who are Persian cats.”

These statements make perfect sense to anybody who’s witnessed how society values men according to their remunerative value, and how it forces women into pre-written scripts that, feminism notwithstanding, have changed little. Readers who find modernist capitalism disappointing, like this ex-libertarian, may find themselves pumping their fists in exultation to see a scholar learnedly attesting what we’ve already thought, in terms concise enough for a t-shirt.

Yet reading his reasoning, I keep thinking: your conclusion doesn’t follow from your evidence. In one key moment, Badiou defends a lengthy argument by citing Sigmund Freud’s Totem and Taboo, an attempted psychoanalytic explanation of rudimentary religion, which I couldn’t finish because it requires more leaps of faith than the Bible. Freud’s corpus is mainly regarded as pseudoscience now anyway, so citing Freud doesn’t strengthen your claims.

That’s just an example, but it’s representative of Badiou’s reasoning process. One suspects he starts with certain premises: perhaps that the financial collapse of 2008 and the rise of reactionary nationalism in industrialized nations go hand-in-hand, a premise so bipartisan that Bernie Sanders and Marine Le Pen could probably agree upon it. Then he ransacks his personal papers, unchanged since 1968, to craft a justifying explanation.

Basically, I expected better from someone of Badiou’s standing. I want to say, take what you need and leave the rest; but a right conclusion from wrong reasoning is still wrong. Badiou crafts just enough useful slogans that I suspect he understands the core of the common situation. Then he lards it with weird source citations and intellectual cow paths. I just can’t figure where he’s coming from.

Thursday, May 18, 2017

Why I Still Don't Want Genetically Modified Food

Much modern farming less resembles gardening than strip-mining

A friend recently shared another of those articles “proving”—to the extent that science can prove anything—that genetically modified foods are perfectly safe. Perhaps they are; I don’t know. However, the article included multiple references to “conventional” agriculture, insisting that GMO foods are perfectly equivalent to foods produced through selective breeding, which we’ve relied on for millennia, and here I definitely know something. Conventional agriculture, as currently practiced, is deeply dangerous.

That seems controversial to say. Americans today enjoy the cheapest food in world history, quite literally: on your typical grocery run, you probably pay more for the packaging than for the food inside it. Massive technological investments constantly increase yields and ensure a continued, affordable supply for whoever can afford it. Selective breeding has produced more fruits, vegetables, meat, dairy, and grain than ever before. Am I calling this improvement dangerous?

That’s exactly what I’m saying, and I’ll offer examples. According to a recent Atlantic article, a single bull that lived in the 1960s produced so many offspring that fourteen percent of all Holstein cattle DNA descends from this one specimen. Anyone who lives in cattle country knows prize cattle semen fetches premium prices at auction. This bull’s DNA quadruples per-cow milk production, but also increases the likelihood of spontaneous abortion in utero. Hardly an unqualified success.

Equally important, though, and something the article scarcely touches on: fourteen percent of Holstein DNA is now genetically homogeneous. This resembles the degree of crop homogeneity that preceded the Irish Potato Famine. The rise of genetically similar cultivars, some GMO, some developed through conventional selective breeding, has produced remarkable vulnerability to crop blight, resisted only through petroleum-based chemical pesticides and intrusive technological interventions.

Pigs don't live in pens anymore; this is where your pork comes from

One episode of the Showtime TV adaptation of Ira Glass’s This American Life features a visit to a contemporary Iowa hog farming operation. The selectively bred hogs raised there produce more piglets per birthing, and therefore more meat overall, a seemingly desirable outcome. But the resulting pigs so completely lack native immune systems that they cannot survive outdoors. They’re raised, at massive expense, in clean-room environments more restrictive than those used in silicon microchip manufacture.

So we have livestock so homogenous that they’re vulnerable to blight, so tender of constitution that they cannot handle the outdoors, and so expensive to raise that any output gains are offset by the extraordinary measures necessary to keep them alive. So agriculturalists are backing off these approaches, as reasonable people anywhere would, right? Of course not. A combination of government incentives and corporate marketing encourages increasing output, even during times of unrestrained surplus.

Recombinant bovine growth hormone (rBGH), marketed heavily by Monsanto and Eli Lilly, promises to increase milk output. This despite known effects on the cows’ health, including distended udders and pus in the milk, and suspected side effects—rBGH is a possible, but frustratingly unconfirmable, human carcinogen. And this also despite the fact that the U.S. government has purchased excess American dairy stocks and dumped them on the ground to prevent prices going into freefall, a practice dating to the 1930s.

I use livestock as examples because images of living creatures suffering tug at our heartstrings. But this pattern obtains across all farming: fear of shortfall justifies constant excess. According to agriculture journalist George Pyle, America grows twenty times as much corn as Americans could possibly eat. So most of the oversupply gets fed to cattle, making meat insanely cheap. But cattle cannot digest corn starches, turning their shit acidic, a perfect environment for toxic E. coli strains.

That’s saying nothing of the economic impact. When NAFTA became law in the 1990s, some Americans worried that manufacturing jobs would emigrate to Mexico, which partly happened. But when subsidized American agriculture hit Mexican markets below the cost of growing, rural poverty, especially in the agrarian south, hit record levels. Mexico’s poor sought work where work existed: in the U.S. And Americans elected a demagogue promising to build a wall keeping those impoverished workers out.

Old MacDonald had an assembly line, E-I-E-I-O
Corporations sell GMO seedstock by promising increased yields. But conventional farming already produces enough food to feed 150% of the current world population, mainly driven by petroleum-burning equipment, with fertilizers and pesticides derived from petroleum. (The Rodale Institute estimates that farms currently produce more greenhouse gases than cars.) When food is already so oversupplied that it’s cheaper than the packages it’s sold in, increasing yields makes no sense.

Yet, as George Pyle notes, American farm policy has assumed that an imminent food shortfall justifies continual increases ever since the nation devised its first farm policy, during the Lincoln Administration. One friend justifies continuing this approach because, he believes, near-future environmental collapse will require genetically modified foods to save the human race. Two problems: we cannot predict environmental outcomes any better than we could predict post-nuclear-war conditions. And, Pyle writes, heirloom varietals are more adaptable anyway.

Starvation exists today, and chronic hunger exists close to home. But increasing supplies, whether through conventional or GMO means, makes little difference. People lack access to food, which usually means they lack money. MLK noted, back in the 1950s, that fresh vegetables cost twice as much in poor neighborhoods as in rich ones. High-yield GMO seeds, often pitched as a cure for global famine, are expensive. People too poor to buy and plant heirloom varieties cannot afford to trade up.

So basically, the demonstrable safety of individual GMO varietals doesn’t much matter. (Rampton and Stauber question that science anyway.) If they’re similar to selective breeding, well, breeding hasn’t been benign either. And they’re customized for an economic demand that doesn’t actually exist outside corporate PR. The drumbeat of safety, quantity, and productivity has made these demands common coin, but it misses the point. Agriculture is hurting itself just fine right now, without gene technology’s help.

Monday, May 15, 2017

Deus Est Machina

1001 Movies To Watch Before Your Netflix Subscription Dies, Part 19
Darren Aronofsky (writer/director), π (pi)

Reclusive genius Max Cohen (Sean Gullette) has built a supercomputer in his Manhattan apartment. He hopes to compute market movements, pick stocks with machine-like accuracy, and become rich without leaving home. But his computer, nicknamed Euclid, vomits a 216-digit number and dies. Thinking he’s failed, Max discards the printout; nefarious parties recover it, and Max finds himself caught in a battle over the forces guiding modern life.

Darren Aronofsky (Requiem for a Dream, Black Swan) made this, his first feature film, on a shoestring budget, working around its physical limitations with risky camera techniques, grim understated performances, and subtle writing. Shot on black-and-white reversal film, often from unusual angles, and cut with frenetic haste, it looks like we’re watching Max’s struggle unfold through surveillance cameras. Before long, we realize this isn’t an accidental technique.

A mathematical genius, Max impresses local children by solving complex calculations faster than their pocket calculators. But he has few adult relationships. He wants reality to share math’s simple Platonic elegance, and often preplans his conversations using theory-and-experiment methods. Only his invalid mentor, Sol (Mark Margolis), shares Max’s passion for precision; they communicate mainly by playing Go, an ancient Chinese strategy game built on strict mathematical principles.

While drinking his morning coffee, Max gets accosted by Lenny Meyer (Ben Shenkman), a gregarious Hasid who introduces Max to Gematria, a form of orthodox Jewish numerology. Curiosity overcomes Max’s usual reticence, and he lets Lenny explain the intricacies of his Biblical code-breaking. He isn’t entranced enough, though, to accept Lenny’s invitation to participate in ongoing research sessions. Especially when Lenny says they’re seeking a 216-digit equation.
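Gematria’s basic mechanic, for those who haven’t encountered it, is easy to sketch: every Hebrew letter carries a numeric value, and a word’s value is the sum of its letters. The toy below is my own illustration, not anything from the film; it uses the standard letter values and the famous example chai, “life,” which sums to 18:

    # Standard gematria letter values; final (word-ending) forms
    # share the values of their base letters.
    VALUES = {
        'א': 1, 'ב': 2, 'ג': 3, 'ד': 4, 'ה': 5, 'ו': 6, 'ז': 7,
        'ח': 8, 'ט': 9, 'י': 10, 'כ': 20, 'ל': 30, 'מ': 40,
        'נ': 50, 'ס': 60, 'ע': 70, 'פ': 80, 'צ': 90, 'ק': 100,
        'ר': 200, 'ש': 300, 'ת': 400,
        'ך': 20, 'ם': 40, 'ן': 50, 'ף': 80, 'ץ': 90,
    }

    def gematria(word):
        # Sum the values of a word's letters, ignoring
        # anything that isn't a Hebrew letter.
        return sum(VALUES.get(letter, 0) for letter in word)

    print(gematria('חי'))  # chai, "life": 8 + 10 = 18

Lenny’s Biblical research obviously goes far beyond adding letters, but the encoding itself is this mechanical, which is precisely what hooks a pattern-seeker like Max.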

Almost immediately, Max meets Marcy, an agent of Wall Street speculators who somehow know about Max’s experiments with Euclid. They think his equations could help predict market movements, benefitting whoever controls the supercomputer. They offer Max a powerful new processor chip in exchange for access to Euclid; realizing this chip could complete his experiment (and possibly unaware of how finance works), Max accepts, permitting the agents full access to his creation.

Max Cohen (Sean Gullette) seated at his supercomputer, Euclid, in Darren Aronofsky's π

Here we see Aronofsky’s themes expressed: mathematical constancy proves reality exists, but little more. Lenny’s Jewish colleagues believe reality demonstrates God’s beneficent existence, while Marcy places her faith in market forces. Two conflicting interpretations of an imperfectly glimpsed truth each demand validation, and those demands spiral toward violence. Meanwhile Max grasps vainly for truth unvarnished by human interpretation, yet cannot manage basic relationships with adults as equals.

Sean Gullette plays Max with dark, soft-spoken urgency. He narrates his own situation aloud, as though he can only understand reality when filtered through the dispassionate lens of language. This doesn’t work out well. Gullette himself apparently wrote many of Max’s narrations, playing up Max’s difficulty understanding sensory reality. Though Max believes objective reality exists, he also has delusions about surveillance and entrapment. At least one character exists only in his head.

Sol quietly encourages Max’s quixotic pursuit of undifferentiated reality. The movie implies, without stating, that Sol is a Holocaust survivor, jaded about all ideologies, but also unable to reconcile his belief in objectivity with his imminent death. When Euclid begins repeatedly producing the same elaborate code, Sol cross-examines Max. It appears Sol produced the same 216-digit sequence previously, and his health has declined rapidly ever since.

Throughout, images of mathematical precepts appear, some more overt than others. Besides his stock-picking supercomputer, Max is fascinated by the Golden Spiral, a geometric paradigm that often serves to pique students’ interest in higher math. Number theory looms large in his calculations, but as those calculations become more elaborate, chaos theory overtakes his thinking. Confronted with the dueling theisms of Lenny’s Hasidim and Marcy’s capitalists, Max becomes more doggedly agnostic.
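For readers who don’t remember their geometry: the Golden Spiral widens by a factor of the golden ratio, φ = (1 + √5)/2 ≈ 1.618, every quarter turn, and φ also emerges as the limit of ratios between consecutive Fibonacci numbers, a connection the film toys with as well. A minimal sketch of that convergence, again my illustration rather than the movie’s:

    # Ratios of consecutive Fibonacci numbers converge on the
    # golden ratio, phi = (1 + sqrt(5)) / 2 = 1.6180339887...
    a, b = 1, 1
    for _ in range(20):
        a, b = b, a + b
    print(b / a)               # ~1.618033988749895
    print((1 + 5 ** 0.5) / 2)  # phi itself, for comparison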

This movie also marks writer-director Aronofsky’s first collaboration with composer Clint Mansell. The atmospheric soundscape creates a psychological resonance with Max’s increasingly strident paranoia. Driven primarily by synthesizers and small-ensemble sound, Mansell’s score can career from bucolic afternoons in the park to a texture like an electric drill on your teeth, with amazing speed, while never sounding out of place. It meshes so smoothly that audiences often won’t notice the score at all.

Aronofsky’s works have frequently toyed with the incompatible forces driving modern life: the push between art and commerce in The Wrestler; between beauty and mental illness in Black Swan. Here, he insists humans need faith in something, anything, yet must also confront reality’s chaotic, seemingly meaningless surface. He offers no solutions, and his resolution admits multiple interpretations. But his approach shakes viewers from their preconceived notions.

Thursday, May 11, 2017

The Ancient Art of Writing Well

1001 Books To Read Before Your Kindle Battery Dies, Part 81
Aristotle, On Poetry and Style

Quality of language mattered to Aristotle. In classical Athens, all language was public: poetry, which mostly meant hymns and staged tragedies, was always performed for the assembled citizenry, and written prose was meant to be read aloud, particularly in courts. Therefore Aristotle, like his fellow Athenians, placed great stock in good-quality language. He was only one among many to write style manuals. Few others have had his durability, though.

This volume combines two Aristotelian classics. First, the full text of Poetics, his consideration of high-minded public verse. For Aristotle, this mostly means tragedies. University professors often emphasize Aristotle’s response to Sophocles’ Oedipus Rex, as though this book specifically criticized one play. Actually, on reading, he quotes liberally from multiple plays, sometimes preserving the only remaining evidence of once-important dramas. Aristotle was clearly familiar with his genre.

Following that, this book includes twelve chapters from Aristotle’s Rhetoric, the ones on creating a moving prose style. He strenuously emphasizes distinctions between poetic and prose forms, sometimes referring audiences back to the Poetics (including chapters now lost) to underscore the importance of picking your form. And what forms he describes. Aristotle gives a spirited introduction to the creation of warm, dynamic prose that stirs the audience, mind and soul.

Taken together, these two books represent a widely circulated primer on connecting with one’s audience for maximum impact. Concepts that remain widespread and influential in college writing courses, like the poetic foot or arguments from ethos, pathos, and logos, receive their first surviving descriptions from Aristotle. Though often dry and prolix himself, Aristotle concisely describes the decisions writers across ages have needed to make in composing their words.

Plato and Aristotle, depicted by Raphael
As already noted, for Aristotle, all writing is public writing. The poems he analyzes are mostly plays, performed in Athens’ annual festival of Dionysus, before the assembled citizenry. His prose mostly means speeches, delivered first in the Agora, then copied for distribution afterward. These distributed copies were mostly read aloud in salons; the idea of reading alone, silently, arose largely after Gutenberg. Language, for Aristotle, happens aloud.

To demonstrate these concepts, Aristotle cites extensively from poets and politicians, many then still living. He often alternates praise and criticism: he dislikes the tragedian Euripides overall, yet finds generosity enough to extol his ear for natural dialog. He credits Herodotus with realizing the value of writing history in prose, though he admits the historian often gets high-flown and needlessly poetic. He concedes that even the great Homer could’ve been improved by brevity.

It’s somewhat unclear how widely these works were distributed in Antiquity. Aristotle’s precepts were widely known and cited, though not always under his name; these ideas were perhaps common coin, and only familiar to us in this form because Aristotle transcribed them. His Poetics particularly seems to have been lost for some generations. And since he cites passages we no longer have, some scholars believe a second volume was lost.

This edition runs right at 100 pages, plus front and back matter, slim enough for a purse or jacket pocket. Yet it broaches enough topics to keep students and educators engaged for months. Translator George Grube provides liberal annotations to help readers unfamiliar with the Greek context decipher some of Aristotle’s more obscure passages. This part-time classicist appreciates a skilled guide holding his hand.

Some of Aristotle’s descriptions apply specifically to Greek-language writing. Entire chapters meander on topics like verb tenses and infixes, which don’t translate well; Grube, on multiple occasions, advises readers that they can safely skip these pages. However, a remarkable amount of Aristotle’s advice, on topics like choosing your audience, constructing active metaphors, and creating rhythmic language, remains current and practical. Many Aristotelian precepts could use more airing in our frequently ineloquent age.

Aristotle’s style often requires great endurance. Unlike Plato, whose entire body of work historians believe has survived, Aristotle left us nothing written in his own voice. Scholars conjecture that we basically have his lecture notes, like reading the handwritten sketches from which university professors extemporize. If Aristotle sometimes seems dry and vague, it’s because, like the style he advocated, he used language publicly, and we lack his voice.

Still, Grube, one of his generation’s most accomplished classicists, annotates Aristotle sufficiently that, where possible, we have the information the Great Man’s notes elided. Some passages remain obscure, details lost to history, but compared with the unannotated Poetics I read in graduate school, this is remarkably lucid. Literary criticism basically begins with these classics. And with them, we begin understanding how literary style works.