Wednesday, August 30, 2017

The Virtue of Not Burning Out

Rachael O'Meara, Pause: Harnessing the Life-Changing Power of Giving Yourself a Break

America’s work ethic idolizes ninety-hour workweeks, absentee families, and staycations. If some work is good, we reason, more is better; waking hours spent not working are wasted. Sometime Google executive Rachael O’Meara disagrees. As a beneficiary of that increasingly rare beast, an American corporation that encourages sabbaticals, O’Meara has firsthand experience with therapeutic stopping. And she wants to evangelize her discoveries to you.

“Pausing,” for O’Meara, may mean taking a ninety-day personal leave, as she did, an option supported by only a minority of employers (and bill collectors). Or it may mean scheduling time throughout ordinary days for personal reflection, breaking from ordinary pressures to evaluate one’s trajectory. Many options exist, she reveals, most of them adaptable to your unique circumstances. Though I admit some trepidation, her description sounds both promising and uplifting.

O’Meara mixes autobiography, business philosophy, and scholarly research to convey her point. When the ambitious Internet start-up that employed her got bought out by Google, she found herself on a career fast-track. But one promotion proved particularly poorly chosen, and she couldn’t adapt to the requirements. When management said her position was jeopardized, she embraced Google’s paid sabbatical program to evaluate her future options.

The choice proved well founded. She not only had the opportunity to rediscover her personal goals; she also investigated how others have approached the same problem, in themselves and in others. Her list of citations includes entrepreneurs, psychologists, theologians, behavioral economists, and more. They share a remarkable consonance on one premise: if you live to work, and structure your lifestyle around employment, you’ll burn out and become unemployable in a heartbeat.

Rachael O'Meara

Pauses, in O’Meara’s structure, should be substantially unstructured. You should permit yourself opportunities to explore, without schedules and checkpoints. However, this isn’t a chance to drift listlessly through life, falling backward into circumstance after circumstance. She explains systems of approach to your otherwise self-guided time, to ensure you have a guiding goal keeping you oriented toward your desired ends. Don’t restrictively schedule yourself, but don’t float, either.

O’Meara describes multiple approaches she and others use to evaluate themselves during their pauses. Some of my favorites include mindful journaling and a form of mental reprogramming she lists under the acronym TASER. (Not that one.) Many of these approaches involve taking assumptions you’ve held implicitly for years, and making them explicit, at least to yourself. You cannot redirect your trajectory if you remain stuck on attitudes and suppositions you’ve followed passively for years.

As O’Meara writes, around 70% of people go through life without examining their choices. I’d propose that even if you do examine your choices, you probably leave around 70% of them unexamined, because you haven’t noticed them. Therefore, cultivating newer, better pre-conscious tendencies makes for a more fulfilling, mind-positive lifestyle. In this way, O’Meara overlaps significantly with Gretchen Rubin, who has written extensively on building habits that better serve our goals.

Though I appreciate O’Meara’s thesis, she carries some class-based suppositions she probably doesn’t realize she holds. Extended sabbaticals, or even mindful daytime pauses, are often the exclusive domain of those who can afford them. I have colleagues working two jobs and raising families; their only private time is their commute. Telling such people they’ll escape their trap if they only approach the day with more intentionality sounds elitist and risks unintended backlash.

Many captains of industry, especially tech entrepreneurs like Elon Musk and the late Andrew Grove, extol the virtue of ninety-hour, seven-day workweeks; a documentarian recently captured Musk calling coders who took Saturdays off “soft.” Young strivers commencing their careers, like the former self O’Meara describes, encounter a “move up or move out” ethos from day one. Tell them to take ninety semi-scheduled days for solitude and introspection? They’ll call you nuts.

Therefore, many readers, mostly those at the bottom and top of America’s economic ladder, will have to adapt O’Meara’s principles pretty freely. Fortunately, by her own admission, O’Meara has created a system that welcomes adaptation. If your only available mindful journaling time is five minutes in the parking lot before clocking in, do what I do: carry a miniature notebook in your pocket beside your wallet. Because five minutes is five minutes.

To her credit, O’Meara doesn’t assume your journey will parallel hers. She provides a buffet of options, which she and others have tested and found effective, but she assumes you’ll experiment and find what works for you. She provides guidance, not a checklist. As the average American workweek approaches fifty hours, the point of professional burnout, let’s consider the value of stopping. Before reality stops us first.

Monday, August 28, 2017

A Prophet Without Honor

Alain Mabanckou, Black Moses: A Novel

An orphan comes to a Congolese hospital wholly anonymous. The local priest saddles him with a name so lengthy, everyone calls him by a shortened version: Black Moses. Old Papa Moupelo apparently believes, with all seriousness, he’s discovered the prophet to deliver Congo-Brazzaville from tyrants and colonialists. But Marxist-Leninist revolution strikes Congo, religion is banned, and this becomes just the first time Moses gets thrown back on his own devices. Which he mostly doesn’t have.

Francophone author Alain Mabanckou proffers a novel guaranteed to frustrate most English-speaking audiences. Readers reared on American fiction will stumble repeatedly: Mabanckou introduces multiple plot threads which go unresolved. Characters antagonize Moses to the brink of outrage, then disappear. Our protagonist doesn’t act upon the story, but is strictly acted upon. Which is a common thread running through much post-colonialist literature, but Euro-American readers, unfamiliar with this tradition, may mistake Moses’ confusion for passivity.

Moses’ story begins inside the orphanage in Loango. First built by Catholics, it became a government institution upon Congolese independence. Black Moses enjoys Papa Moupelo’s lessons in religion and folk tradition, but questions his heritage. Then revolution strikes, priests are banned, and the orphanage becomes a reform school. Bullying twins take Moses under their wing when he resists their dominion over the boys’ dorm… but their mentoring proves as damaging as the rule of the bureaucratically appointed director.

In the second half, the twins convince Moses to escape with them. Working together, the trio seize control of the juvenile gangs roaming Pointe-Noire, Congo’s richest city. But the twins demonstrate a “four legs good” attitude toward their gang, emulating the capitalists who keep them down. Moses flees to the shelter of a nearby brothel, working as the madam’s right-hand man. He graduates to keeping her property, until a purge of prostitutes challenges Moses’ loyalties.

Alain Mabanckou

If this sounds choppy and very busy, I won’t disagree. It’s necessary to resist the temptation to analyze this book by Western standards. Dust jacket blurbs comparing this novel to Dickens’ Oliver Twist could mislead Brit-Lit majors into seeking Anglophonic parallels. That would be incorrect. Mabanckou creates a character who drifts, rudderless, into the most important social upheavals of his people’s history, repeatedly witnessing shocking moments of widespread national tragedy, without understanding what he’s seeing.

Mabanckou does something fairly unusual in literary fiction, providing a thesis statement. Describing trapping feral cats for meat in Pointe-Noire, Moses ruefully describes the animals’ inability to understand their situation: “they have chosen to stay domestic, rather than go and live in the bush, where they could live with their feet up, away from the Bembés. Now, cats don’t know that true freedom is to be found in the wild.” Moses apparently misses the irony.

Like those cats, Moses doesn’t realize he has options. He vacillates between being somebody’s pet—the orphanage director, the mercurial madam, the witch doctor curing his “madness”—and being grist chewed up and consumed by others—the twins, the dockyards where the madam eventually dumps him, and… well… worse. He never sees the third option of rejecting the roles written for him, going wild, and charting his own course. Eventually he ends at the beginning.

I admit, I missed this message until Mabanckou spelled it out for me on page 114. Before that, I thought I was witnessing a passive character drifting helplessly, refusing to take responsibility for his own actions. Once Mabanckou offered his thesis statement, and I recognized how this story fitted into a tradition previously utilized by authors like Salman Rushdie and Sara Suleri, things came together for me. Moses’ helplessness wasn’t accidental, it was the point.

But I question whether audiences unfamiliar with postcolonial literature will recognize this message. Euro-American audiences expect through-lines; they expect plots commenced to be resolved; they expect protagonists to advance or be destroyed. Moses doesn’t do that, because that isn’t Mabanckou’s point. I enjoyed this book, flaws and all, because I recognized the literary conversation Mabanckou joins. I struggle to imagine other audiences, lacking my backlog of post-imperialist reading, will understand just what Mabanckou has accomplished.

Which is a shame, because, from within its context, Mabanckou lends a remarkable voice to the postcolonial conversation. Like many English-speaking readers, I’m familiar with Anglophone writers like Salman Rushdie and J.M. Coetzee; it’s interesting to see how the context looks from the Francophone angle. Mabanckou probably alienates audiences unfamiliar with his milieu. But once I understood where he fits within the postcolonial discussion, I realized Mabanckou had written a smart, often funny contribution.

Friday, August 25, 2017

The Crater Left By Dropping an N-Bomb

Actor Jimmy Walker from Good Times, my idea for years of what being Black looked like

Warning: this essay contains language and references some readers will find offensive.

I first heard the word “nigger” at a Cub Scout day camp in upstate New York’s Catskill Mountains region. Surely I must’ve encountered the term somewhere before age eight, given the ubiquity of racist attitudes in America, though by 1982, most people had enough discretion not to say that word aloud in public; the risk was too high. Still, my first real memory of that word is Cub Scout camp.

Members of my troop, joshing aggressively with a kid we’d never met before from another troop, answered his boisterous ribbing by calling him “nigger.” The other kid never broke his stride, so if the term offended him, it didn’t make enough of a difference to stop the goofing. I didn’t join in, not because I considered that word offensive—I didn’t even understand it—but because I never liked being on the receiving end of aggressive teasing, so I wasn’t about to dispense it.

Thing is, I never connected the word with his race. Looking back, this kid was clearly Black, but I didn’t see him that way. He didn’t resemble the very dark-skinned actors on Good Times and What’s Happening, my white suburban exposure to African Americans, so I never would’ve considered him Black. To me, he just looked that way, like any other kid would. He was just, y’know, that guy.

Michael Eric Dyson

Becoming a book reviewer has exposed me to multiple viewpoints I never would’ve considered before. Though I’ve had some run-ins with racism on this blog, the real education has come from academics. Though I disliked racism from an early age, I never much considered the nature of race until difficult books on the subject crossed my desk, books this white suburbanite never would’ve purchased independently.

For instance, I originally gave a lukewarm review to sociologist Michael Eric Dyson’s The Black Presidency, which I now regret, as Dyson’s ideas have grown on me. One stands out: white people, especially white men, have the privilege of considering ourselves not as a race, not as members of a demographic group, but as simply normal. That’s why reactionary activists see attempts to diminish racism as attacks on themselves: to them, white issues aren’t white issues; they’re just how things are.

Similarly, anthropologist Richard J. Perry provides a detailed examination of why race doesn’t really exist. Many members of my generation, though clearly not all, grew up believing that racism was objectionable, and racists were backsliders; but we never really understood why. That’s probably because race still exists in common discourse, and, like other bad ideas, never completely goes away once released. But Perry demonstrates how race isn’t merely a bad idea, it’s a falsehood.

Because I encountered the term “nigger” in an environment I didn’t consider hostile, in dialog clearly intended as jest, I thought it was a mild, even playful, term of personal contempt. The person whose chops you’re busting is a nigger, obviously. So, about a year later, while bantering with my friend Timmy across the street (a typically freckle-faced, ginger-haired Irish-ish kid), I called him a nigger. Timmy never paused. The adults around us did.

My mother dragged me off the street, into our living room, and explained what that word meant. I was horrified. Though my suburban upbringing involved little interaction with African American kids until middle school, I understood the idea of race from television, and the consequences of racism, at least as sanitized for mass-media consumption. Had I misused that word other times, possibly more hurtful times? I couldn’t remember, then or now.

Sheryll Cashin

Sometimes, when I discuss white privilege with white people, they dispense the stock response, “How exactly am I privileged?” I understand this response, since we use the word “privileged” to connote wealth and luxury, which most whites lack. As Sheryll Cashin writes, when you disaggregate the very rich (who are mostly white), race-based economic disparity falls to within $5,000 annually: big money if you lack it, but hardly Trump Tower rent.

To me, though, white privilege entails a white kid getting antsy on a suburban street, tossing the word “nigger” recklessly, because he’s never needed to think about what that word means. Did that kid at scout camp feel wounded when an all-white troop called him that? Or when one member of that troop remained passively silent, his face probably betraying he didn’t even know what that word meant?

That’s why events like we’re seeing unfold around us today bother me so much. Because it’s proof that some people just don’t care.

Tuesday, August 22, 2017

A Proper Left Bank Murder

Mark Pryor, The Sorbonne Affair: A Hugo Marston Novel

Bestselling American author Helen Hancock believes her Paris hotel room is bugged, but she dare not involve local authorities. So she contacts Hugo Marston, security chief at the American embassy. Marston doesn't take Hancock, her potboiler novels, or her suspicions seriously... until an actual pinhole camera appears behind a painting. This spy gewgaw traces back to an indebted American hotel employee, who turns up dead, crashing Marston's investigation into a wall.

Reading British-born American author Mark Pryor's seventh Hugo Marston novel reminds me of Inspector Morse, the classic British mystery series. Both feature educated, well-spoken protagonists whose gentlemanly demeanor conceals a roiling past. Both have subdued tones and cerebral characters whose occasional sudden bursts of violence feel more powerful because they surge forth unexpectedly. This comparison could be good or bad, depending.

On the one hand, Pryor writes with a pensive tone reminiscent of a primarily cerebral subgenre I haven’t much seen in years. His characters discuss facts, evaluate evidence, and have long conversations about, well, stuff. In a paperback thriller market dominated by antiheroes who love kneecappings and fistfights, this approach, with its primary emphasis on the puzzle, seems both nostalgic and a welcome relief from the constant action.

Not that the character is bloodless or boring. Like Inspector Morse, Marston continues believing he’ll find the right woman, even well into middle age. And when, in an important subplot, a convict from Marston’s past comes barging into the present, we discover his repressed capacity for naked savagery. One suspects Marston’s normally donnish approach to even ordinary conversations serves to shackle a powerful inner conflict between libido and violence, between Freud and Nietzsche.

Mark Pryor

On the other hand, this subdued tone, so welcome throughout most of the novel, does set a very slow pace. The number of important expository scenes that occur in bistros, while Marston and another important character chat over wine, becomes pointed somewhere around page 80. Especially since Pryor has a large ensemble of characters to reintroduce from prior novels, the exposition gets long and talky. Much bread and zinfandel is consumed. When there’s a body in the stairwell, maybe postpone introductions?

Pryor establishes an interesting locked-room mystery. Despite the impression I gained growing up during the Cold War, watching off-label spy thrillers, bugging somebody’s personal space is very laborious. Most remote surveillance devices have limited range and short battery life. Thus Marston and his multinational battalion of crime experts must unlock a mystery that could only take place within a limited geographic range. They successfully find the bug’s receiver… after its owner is already dead, but the data must still be going somewhere. Evidence accumulates without any clear suspects.

Helen Hancock compounds these problems with her personal quirks. Her first scene establishes she probably hasn’t learned the meaning of the word “inappropriate.” She strives to emulate Hemingway-era stereotypes about American expats in France: sexually flagrant, workshy, and moody. She fears spies pirating her yet-unfinished novel, but apparently spends little time writing. Though she came to Paris for research, her labors apparently consist of wine by the bottle, hot stone massages, and indolently mentoring American MFA students. She flirts with Marston in front of his date.

This collision between Hancock’s histrionic behaviors, and how the evidence supports her paranoia, provided momentum enough to propel me through Pryor’s more sluggish passages. Fundamentally, Pryor cares less about what happens, than about how characters respond. One suspects Pryor might rather write highbrow character novels, but his training as an attorney, and the exigencies of today’s publishing market, make genre series more lucrative.

It’s somewhat cliché to say a novel has a self-selecting audience. Don’t all books, especially in today’s niche market? Yet Marston’s distinctive retro style, reminiscent more of Agatha Christie than Raymond Chandler, deserves mention. Though Hugo Marston has the capacity for profound violence and coarse outbursts, he’s primarily a thinker, a tendency conveyed in his dialogue and slow, discursive expositions.

I found plenty to like about this novel. But I had to adjust my mental rhythms to match those of my author, a choice not all readers make anymore, in an age when television and the Internet pander to our hunger for novelty. Pryor’s audience will want an experience that takes them outside themselves, an experience more like literary fiction than the usual genre boilerplate. I’m glad I read this novel, and will probably investigate Pryor’s previous Marston novels. But ask yourself whether this book is right for you.

Monday, August 21, 2017

The Real War on White Americans

Charlottesville's Robert E. Lee statue, erected in 1924 and removed in 2017

“You will not replace us!” chanted the Charlottesville marchers at their Unite the Right rally on Saturday, August 12th, 2017. And, more tellingly, “Jews will not replace us!” This fear of being replaced speaks significantly to the almost entirely Caucasian marching party, which nominally came together to protest the removal of a heroic-scale bronze sculpture of Robert E. Lee. But the language revealed a more deep-seated fear of humans, particularly white humans, being rendered obsolete.

This idea isn’t new to Charlottesville. I haven’t studied the history of racist rhetoric, but fear of being displaced by numerically insurgent minorities has a long past: it was part of Hitler’s claims about Jews and Gypsies. The shooter at Emanuel AME church, whose name I’ve resolved never to repeat, justified his violence by telling friends “blacks were taking over the world.” Fear of imminent displacement seemingly infects racism altogether. I’d argue this fear isn’t without merit.

Marchers outed for racism have included, most prominently, a college undergraduate, facing probable graduation into a job market massively narrowed by automation, and a restaurant waiter, denizen of an industry that hasn’t seen its federal minimum wage adjusted since 1997. These are people on the front lines of the widening gulf between work and reward. Despite libertarian economic principles, hard work and diligence don’t raise most workers above their roots anymore. If they ever did.

In high school American history, I learned that Gilded Age American workers successfully organized, beat back corporate thugs, and pushed Teddy Roosevelt to bust trusts and break the industrial oligopoly. My textbooks conveniently omitted that industries answered this organization by moving work outside organized areas. Iconic industries like the Pennsylvania ironworks and Chicago stockyards exist today only as museums. The work itself is massively decentralized, moved to “right-to-work” states like Oklahoma and Indiana, or overseas.

When I first voiced a less polished version of this idea, a friend countered by indicating that several Charlottesville protesters who’d been outed on Twitter weren’t laborers. Those identified—sometimes, tellingly, in error—included college professors, regional entrepreneurs, business owners, and more. These are skilled professionals, presumably with standing in the community, who contributed their faces to a mass march that has chilled the nation. Calling them laborers, my friend implied, is classist and uninformed.

But I’d contend even these skilled professionals have reason to fear being made obsolete. One person wrongly, let me repeat wrongly, accused of participating in the racist march was an engineering professor specializing in researching wound treatments. As a former college instructor, I’ll defend the idea that his position is vulnerable, since adjunct instructors and other part-time faculty are the numerical majority in many American colleges today. Our children’s teachers have to hustle for grocery money.

We could continue the reasoning this way: entrepreneurs, particularly brick-and-mortar business owners, are constantly jeopardized by advances in online marketing technology. Where companies like Amazon once paid delivery drivers, they’re now researching drones and other human-free delivery to further trim costs. National retailers moving online shutter stores. People like Jeff Bezos call this friction-free marketing. But that friction is you. This philosophy sees human beings not as drivers of the economy, but as impediments.

Neighborhood entrepreneurs take personal pride in hiring locally, and deserve recognition for that. But to national and multinational corporations, labor is a sunk cost wise investors avoid. If jobs can be offloaded onto machines, they will be, especially since human labor is a tax liability, but machinery is a write-off. The marchers flashing Hitler salutes and brandishing torches insist Jews and minorities won’t replace them, but in many places, they’ve already been replaced by machines.

Even where work exists, it’s pretty sketchy. Anglo-American sociologist David Graeber notes that, as jobs which once produced valuable goods and services have been automated, the economy shifts toward “bullshit jobs,” his technical term for busywork labor which contributes no value. The massive rise in supervisory positions means nominal management titles, once scarce, have become commonplace, and we see what happens whenever any formerly scarce resource becomes abundant: it becomes valueless.

The Charlottesville racists therefore have foundation for their complaints. Our economy has bifurcated into a powerful management class and grunt labor, with little chance to move between them. Yet when faced with a rich, mostly white management class systematically replacing them, they turn their outrage onto politically powerless minorities. I understand their outrage; I share some of it. But in targeting minorities, they’re doing management’s work for them. They’re hugging the boot that stomps them down.

Saturday, August 19, 2017

Should Free Society Limit Free Speech?

Charlottesville protester Peter Cvjetanovic claims this photo doesn't represent his real personality. But he's chanting Nazi slogans which inherently threaten violence against others based on their demographic group. Which is the Fundamental Attribution Error, though we'll talk about that later.

This week, I’ve gotten drawn into multiple debates about when and whether a free society places limitations on free expression. Which isn’t surprising. For many of us, the boundaries of free speech were primarily thought experiments in philosophy courses. For white Americans especially, we consider free speech our native birthright, and expect nobody will circumscribe our ability to say anything. Events this week sorely test that assumption.

For most, the stereotypically unacceptable extremes of free speech are limited to obscenity and incitement. Language (or images) so vulgar and transgressive that it undermines social order should, courts agree, be squelched. And no matter how crude my language, I can disparage individuals or groups until I threaten their bodily safety. We accept these limits tacitly. But I suggest we go further. We should consider both what language says and what it does.

Imagine a cranky drunk. Well into his fifth pint, he blurts: “Christ Almighty, I hate Jews.” He might even say: “That Doctor Goldstein across town overcharges for routine shots, what a Kike.” As thoughtful people, I’d hope we agree that’s hurtful, possibly even damaging language. But it serves a purpose, even if only to signal we should avoid this guy, since most people say drunk what they think sober. We know he thinks offensive thoughts.

Things change if that guy says: “Somebody ought to put a cap in Doctor Goldstein’s ass.” We could consider that a threat. Depending on our drunk’s history, we might toss him in a cell until he dries out, for his own and Dr. Goldstein’s safety. But if he says “Get your pitchforks, boys, we’re gonna teach Goldstein a lesson,” we’d agree he’d crossed a line. Because his language no longer described something; it actually constituted an action.

When Queen Elizabeth smashes a bottle of gourmet plonk against a battleship’s prow and says, “I christen this ship the HMS Insurmountable,” her words don’t merely describe something which already exists. Her words create a new reality. She bestows upon the vessel, and its subsequent crew, a concrete sense of identity. She inducts the ship into membership in the Royal Navy. She initiates the ship into a maritime tradition of valor.

Rhetoricians call this “performative language,” meaning the words change something. The change may have only symbolic value or change the mental landscape rather than making physical changes: for instance, naming your baby doesn’t change the baby’s physical structure. But it grants the baby an identity, which permits it to participate in human civilization. Your unnamed baby is interchangeable with other groups of human cells and organs; naming bestows individuality.

This picture was taken in Charlottesville, Virginia, on 12 August 2017. The Nazi salute is a prime example of performative language, since it pledges allegiance to a particular ideology, which itself killed sixteen million unarmed civilians.

Incitement language does something similar. In Charlottesville, crowds surrounded a Black church, brandishing torches. This admits multiple interpretations, since firelight vigils have a lengthy Christian tradition. Without actual violence occurring, how do we interpret this event? By language. These crowds didn’t pray, they chanted racist slogans meant to cause fear, and keep parishioners contained. The words spoken, not the actions taken, define what happened, and what makes it execrable.

Sadly, I hear racist language regularly. Working construction, I’ve heard people drop N-bombs, and other language I consider wholly offensive. Construction is the most thoroughly segregated workplace I’ve ever seen, physically and rhetorically. But these racist spewings aren’t performative language. When my colleague says hurtful, mostly uninformed things about African Americans, he reveals himself, but doesn’t change anything. And I’ve seen him work with Black people just fine.

The dynamic would change if my colleague changed his language, however. When he drops N-bombs, his hurtful words drive wedges between himself and others. Which is jerkish, but tolerable. If he recited Nazi or Klan slogans, language historically associated with racially motivated violence, he wouldn’t be merely speaking, he’d be willfully causing others to fear for their bodily safety. Violent language creates a violent interaction, even without physical bodily violence.

Therefore, we can divide language into two categories: descriptive and performative language. Descriptive language, even when vile, is ultimately harmless. That’s why anybody can say “I hate Donald Trump.” Performative language gets further divided into benevolent and harmful performances. “I love you” is benevolent performative. “I hope Trump is assassinated” is harmful, and deserves every backlash it receives.

Language that describes even the most vile depths of human behavior deserves every protection, because slippery slopes do exist. But language that says something differs, measurably, from language that does something. Society should scrutinize all performative language to ensure it doesn’t harm others or undermine society. Which, oh wait, it already does, with obscenity and incitement.

Friday, August 18, 2017

The Art and Science of American Racism

We stumbled into the Lawrence Arts Center at the tail end of an exhausting weekend. The events in Charlottesville had hypnotized the country for days, and Sarah and I desperately needed something quiet, something aesthetic, not carrying the stench emanating from the upper echelons of American power. We thought surely an art gallery would ease the tensions. We had no idea Iola Jenkins’ textile art would greet us inside.

Jenkins, a self-taught African-American outsider artist, uses a mixture of quilting and embroidery to create dynamic, multicolored images based on her beliefs and experiences. Her textile art includes portraits of Whoopi Goldberg’s character from The Color Purple and activist professor (and sometime fugitive) Angela Davis. She also has pictures of African market days and folk scenes. The depth Jenkins extracts from what look like discarded scraps makes mere paint look one-dimensional and wan.

Sarah and I paused before twinned portraits of Barack and Michelle Obama. Rendered in exceptionally bright colors, the images had a certain quality of a 1990s comic book, but that didn’t undermine their depth. They looked like photographs taken under tinted stage lights, or possibly run through an oversaturated Instagram filter. They captured the energy and potency African-Americans like Jenkins must’ve felt seeing a couple who resembled themselves in Washington.

We stood there, dumbstruck, for several minutes. Both quilts were dated 2015, when President Obama was riding his highest crest of popular support, before the primary election campaign heated up. Jenkins must’ve made these textiles when Donald Trump still looked like a longshot protest candidate, a spoiler meant to agitate a wounded but angry Republican base. She might’ve made them before Trump declared his candidacy. I don’t know.

I threw my hat on the floor.


Many protesters in Charlottesville were unambiguous: they felt emboldened to express their vile opinions because Donald Trump became President. Though it’s overgeneralizing to say Trump caused the Charlottesville violence, his discourse—calling Mexicans rapists, spouting decades-old stereotypes about “the inner city”—emboldened whites who already had racist tendencies to express them. Trump’s failure to condemn people toting swastikas as Nazis hasn’t helped.

Unlike many American prairie communities, Lawrence, Kansas, didn’t spring up spontaneously. Activists from the New England Emigrant Aid Company deliberately founded Lawrence as an abolitionist colony during the Bleeding Kansas fighting, to provide anti-slavery forces an added edge in determining the future state’s fate. The city’s main downtown corridor, Massachusetts Street, reflects the city’s abolitionist heritage. As often happens, contemporary attitudes mainly reflect historical foundations.

Historian Ibram X. Kendi writes that beliefs tend to follow policy. By this he means that public opinion, on multiple issues but especially race, tends to reflect the ideas floating from the top of politics, economics, and culture. Racism, as we experience it in America, didn’t really exist in pre-colonial Europe. People we’d now consider “white” hated one another and fought violently: French versus English, Spanish versus Portuguese, Germans versus Germans.

After plague and warfare decimated the Native American population, rendering North America fit for colonization, Europe started dumping its undesirable denizens on distant shores. According to Nancy Isenberg, America’s first English settlers weren’t called heroic pioneers at home. The English used colonies in Virginia and Massachusetts to unload what they called “the offscourings of the land.” The word “offscourings” refers to shit that clings to your ass and needs to be wiped off with paper.

England bound this goulash of beggars, debtors, thieves, and other outcasts together, by telling them: well, at least they weren’t Black. Parliament created policies forcibly separating white settlers from slaves in work, residence, and even food. England initially regarded Indians as whites with extreme tans, until Native pushback against English adventurism turned violent; then policies changed to separate Black, white, and Red. White beliefs accommodated these policies.

So racism became a response to public policy. After the Revolution, when Northern states couldn’t reconcile their rhetoric of freedom with slaveholding, they changed policies to emancipate their Black slaves and white indentured servants. But once the policy of race took hold, nobody could undo it. Northerners still saw themselves as Black, white, and Red. Even abolitionists progressive enough to colonize Lawrence, Kansas, carried the idea of race into their new homes.


Iola Jenkins made her art during a time when it appeared the legacies of colonial policy might finally disintegrate. But electing a Black President, a political moderate with big-tent views and an even bigger smile, couldn’t reverse the trend. As Charlottesville proved, the vile colonizers simply moved underground, awaiting their chance. The persistence of abolition in Lawrence, and of racism in the Trump administration, proves boldly: problems don’t go away because head operators change.

They simply take new form. And the fight, physical and policy alike, must adjust appropriately.

Thursday, August 17, 2017

American Flags and the History of Violence

The offending banner, in its natural environment

So as we sat down to eat at a perfectly pleasant restaurant with an Early American decor theme, I happened to glance to my left and see a banner. It wasn’t really a flag; the proportions were all wrong. Fly it in the wind, on a battlefield where the infantry needs to know which colors to rally around, and it would hang limply. The only correct word for this red, white, and blue confection is “banner.”

It had a ring of thirteen stars on a blue union, and stripes in red and white. It also had a fourteenth star inside the ring. And I thought it looked uncomfortably familiar. Like any good resident of the Third Millennium, I reached for my smartphone, because what’s the point of carrying a massively powerful networked computer in your pocket if you can’t occasionally use it to Google things? So I did, and I immediately found what I feared:

This banner sure looked like the Confederate national flag.

We’re accustomed to associating the Confederate States of America with a blue St. Andrew’s Cross bearing white stars on a solid red field, a design popularly (if incorrectly) called the “Stars and Bars.” But this wasn’t the national flag of the Confederacy; it was the battle flag of Robert E. Lee’s Army of Northern Virginia. That flag gained prominence during the Civil Rights Movement, as a militant white pushback against the idea that Black Americans deserve equal standing. But that’s a later addition to the Confederate myth, and one that serves modern rather than historical purposes.

The Confederate national flag had thirteen stars in a ring on the blue union, and three stripes: two red and one white. The banner flying at my left had five stripes, three red and two white, besides the puzzling fourteenth star. I told Sarah that I thought we’d spotted a Confederate flag, just three days after the violence in Charlottesville, Virginia. As people sympathetic to civil rights and racial equality, I wondered aloud whether we could eat here.

So naturally Sarah went to speak with the manager. She needed to know whether she was about to give her money to a business that openly advertised sympathies with a treasonous pseudo-nation that fought against the United States to protect slavery. But while she did that, I kept probing, and discovered I’d possibly made a terrible mistake. This wasn’t a Confederate flag, it was a Betsy Ross flag.

The Betsy Ross flag (above) and the Confederate national flag (below)

Both flags have thirteen white stars, a blue union, and red and white stripes. At a brief glance, the only distinguishing characteristic between the two flags is the number of stripes. This probably represents a Confederate attempt to usurp American mythology. As historian Nancy Isenberg writes, Confederates and Northerners sniped incessantly about which represented the real American heritage, calling each other Crackers and Mudsills, respectively.

With five stripes, and the inexplicable fourteenth star, the offending banner clearly was neither a Betsy Ross flag nor a Confederate flag. And when Sarah returned with the manager, who explained the banner was hanging around as part of her store’s post-July 4th decorations, I realized there were other pieces of patriotic kitsch hanging around. I’d wandered into a Hall of Americana for brunch without realizing it.

Sarah and I apologized profusely to the manager. Clearly, in light of recent events, we’d made a significant mistake. Yet had we? Images of America’s slaveholding past linger everywhere. If you count colonial times, and historians do, we were slaveholding longer than we’ve been free. Movies like Gone With the Wind, which openly extols slaveholding society, are considered classics. And many of our Founding Fathers were slaveholders.

President Trump, in the video linked above, makes an equivalency between Stonewall Jackson and Robert E. Lee, and George Washington and Thomas Jefferson. This isn’t unfair, since all four defended and profited from slavery. But the equivalency is false. Jackson and Lee fought for a nation that existed to preserve and extend slavery. Washington and Jefferson set the process of liberation in motion… though failed at enacting it themselves.

We ate our French toast in relative peace, surrounded by a comfortingly multiracial dining room that clearly didn’t share our offense at the possibly Confederate banner. We probably hadn’t stumbled into a den of covert racism. Yet I still couldn’t wash the bad aftertaste from my mouth. American society, for all its virtues, still fails to redress its explicitly racist past. People are still dying for the cause. If I’m not willing to speak up, what am I?

Thursday, August 10, 2017

The Other Boy Who Could Fly

John Leonard Pielmeier, Hook's Tale: Being the Account of an Unjustly Villainized Pirate Written By Himself

First, his name isn’t Hook. James Cook, great-grandson of the explorer James Cook, is press-ganged into the Queen’s Navy, aged 14, ending his London childhood and Eton education forever. But rumors of treasure lead to mutiny, and Cook finds himself sailing under the Black Flag. Soon his ship crosses the line into a mysterious land where nobody, not even little boys dressed in tattered leaves, ever grows up.

American author John Leonard Pielmeier is probably best-known for his play, and later film adaptation, Agnes of God. Since that classic, he’s become an in-demand screenwriter, especially for adaptations of heavy, difficult literature. But, he admits, J.M. Barrie’s Peter Pan first awakened his interest in reading, and in his first novel he returns to Neverland, retelling the story from the forsaken antihero’s perspective.

Cook finds himself orphaned, expelled, and pressed in quick succession. A comforting life of middle-class London innocence surrenders to harsh sailors’ compromises. Under his captain’s Puritanical supervision, Cook toughens his skin, practices his Latin, and conquers his ignorance. Soon he’s a real sailor. Then the mutiny forces him to choose between honesty and survival. And, on a distant Neverland shore, he finds a castaway who remembers Cook’s long-lost father.

If Peter Pan is the Boy Who Wouldn’t Grow Up, James Cook is the Boy Who Has Adulthood Thrust Upon Him Violently. There’s a Luke Skywalker quality to Cook’s transition, but he often learns the wrong lessons. He abandons his post to discover more about his father. He nurses petty grudges and pursues vengeance so far, he inadvertently injures himself. He admits lying to achieve his ends—then demands we trust him, not Barrie, to tell the real story.

John Leonard Pielmeier
Peter Pan, meanwhile, proves himself capricious, controlling, and worse. Marooned by his shipmates, Cook meets Peter, and both are overjoyed to finally make friends their own age. But Cook doesn’t want to stay fourteen forever. He faces a monster so terrible, even Peter can’t stomach it, and in so doing, wins Tiger Lily’s heart. Peter, jealous that his friend doesn’t live in the eternal present, murders her. Or so Cook says.

Pielmeier strips away Barrie’s Edwardian sensationalism. Cook repeatedly insists he’s no pirate, but an orphan caught in something beyond his control. He’s certainly not Blackbeard’s bo'sun. The Piccaninnies aren’t a stereotyped Plains Indian tribe, they’re a proud Polynesian nation, the Pa-Ku-U-Na-Ini. And Neverland isn’t a haven of eternal innocent irresponsibility, it’s a land of Lotus-Eaters where all time gets compressed into Yesterday, Today, and Tomorrow.

Repeatedly, Cook insists he’s no villain. Yet he’s exactly that, if accidentally: everywhere he goes, his presence disrupts the balance. Gentleman Starkey initiates the mutiny because he finds Cook’s treasure map. Peter and the Pa-Ku-U-Na-Ini live in peaceful rapport until Cook interrupts their religious ceremony, breaks Tiger Lily’s prior engagement, and leaves Peter friendless. He even accidentally hastens Wendy Darling’s kidnapping.

Critics have seen, in Barrie’s Peter Pan, an enactment of the Oedipal conflict, as Peter battles the piratical father-figure and must choose between three ideals of womanhood. I see, in Pielmeier’s Cook, a dark mirror of Campbell’s Heroic Journey metaphor. Pielmeier hits every marker: the Call to Adventure, the Threshold, the Road of Trials, the Temptress, even the Return. But unlike Campbell’s hero, at every opportunity, Cook makes the wrong choice.

Cook insists he’s innocent. But everywhere he goes, he leaves a trail of broken souls and dead bodies. He insists upon his own honesty, and gives a detailed accounting of his actions, while he admits lying to achieve selfish ends. Though book-smart and crafty, he lacks wisdom, perhaps because his lifetime’s experiences don’t match his bodily appearance. Thus, instead of achieving enlightenment, he becomes driven by vengeance and rage.

Maria Tatar writes, of Barrie’s original play and novel, that the dominant theme is futility. The Lost Boys, Piccaninnies, and pirates pursue one another in a permanent clockwise pattern around the island, perpetually enacting time, though they never age. Pielmeier disrupts that: Cook enters a magic archipelago where time means nothing, but instead he brings change. He brings mortality into a land without age. But he never understands this.

Pielmeier isn’t the first author to rewrite Hook’s backstory. Besides Barrie himself, recent entries have included J.V. Hart, Christina Henry, and Dave Barry. However, I particularly like Pielmeier’s psychological depth and emotional complexity. Pielmeier’s Cook is a master schemer, but also a master of self-deception. He successfully complicates Barrie’s original story, but only at great cost to himself, which he clearly hasn’t begun to understand.

Tuesday, August 8, 2017

A Goddess's Guide to Folk Rock Stardom

1001 Albums To Hear Before Your iPod Battery Dies, Part Seven
Ani DiFranco, Living In Clip


Ani DiFranco gained attention for her DIY music ethos in the 1990s, as probably the most successful musician to found her own label and release her own albums. That’s how I first encountered her. During the final fifteen years when record sales still mattered, her ability to control her own sound, marketing, and image made her legendary. Frequently, this forward-thinking creative control overshadowed how profound her music actually was.

This recording showcases DiFranco’s uncompromising musicianship. Recorded over the previous two years, these songs display a performer notorious for her assertion that she lived to play before a live audience. Her ability to respond to audience energy, and the audience’s willingness to answer her cues, show a reciprocal relationship between both sides of the divide. Her intensely autobiographical lyrics clearly touch listeners through their immediate intimacy.

Though DiFranco was famous for her entrepreneurial ethic, her music was equally ambitious, a mix of acoustic austerity with indie rock drive. Though she never got much radio play, lacking connections to distribute payola, occasional songs like “32 Flavors” or “Untouchable Face” got airplay from radio programmers rebelling against the then-nascent ClearChannel monopolism. Her independence apparently rubbed off on gung-ho individualists, college students, and other freethinkers.

She certainly conveys this independence in her live recordings. Though self-identified as a folksinger (and in frequent rotation at venues like FolkAlley.com), her style combines folk introspection with punk clarity. She drives her own sound with just her voice and guitar, backed mostly by a rhythm section. She doesn’t invest in ornamentation or ensemble complexity—with exceptions, as she does front the Buffalo Philharmonic Orchestra on two tracks.

Ani DiFranco
But mostly, she carries her own weight onstage. She plays with a modified clawhammer strum, the same basic style used by Bob Dylan and John Lennon. (In interviews around this time, she described teaching herself guitar with a Beatles songbook.) Her evident love of playing comes across when she doesn’t stop strumming during stage banter. And banter she does: she uses a Lenny Bruce-style conversational rapport to establish, and respond to, her audience’s desires.

Despite her acoustic folk roots, DiFranco shows herself comfortable with innovation. Tracks like “Not So Soft” or “The Slant” utilize a hip-hop recitative style which punctuates her lyrical urgency. On other tracks, like “Sorry I Am” or “Fire Door,” she allows her sound operator to loop her vocals, permitting her to harmonize with herself, in a style other acoustic artists wouldn’t embrace for a decade after this album’s release.

DiFranco has often been the most vocal and strenuous critic of her own studio recordings, describing them as “sterile” or worse, despite serving as her own producer and arranger. This is often unfair, as anyone who’s heard albums like 1996’s Dilate can attest; she’s a masterful stylist who uses studio effects without overusing them. However, even her best studio recordings do have a certain lack of immediacy about them.

Not so this recording. Her mostly acoustic performances, with session drummer Andy Stochansky and bassist Sara Lee, showcase her power as a live performer. In an essay reprinted in the Utne Reader in 2002, DiFranco admitted she mostly made albums to publicize her live tours, largely the opposite of the then-accepted music business standard. She invested studio time to justify her passion for playing before a live audience.

Despite her personal lyrics, her writing is often intensely political too. DiFranco, an admitted pansexual agnostic, adopted opinions too liberal even for most mainline progressives back then, embracing her sexual inclusivity on songs like “Adam and Eve,” and confessing gender-based personal traumas with “Letter to a John” and “Tiptoe.” She was too aggressive even for most feminists: at her 1990s peak, she declined Lilith Fair, though she could’ve headlined, calling it too timid.


This landmark album pushed DiFranco into mainstream consciousness, drawing listeners’ attention to her muscular, unapologetic live performances. She dared audiences to join her introspective journey, and that largely self-selecting audience followed. Her mainstream acceptance followed, including larger venues and ten Grammy nominations in ten years. Though never a superstar, this album ushered in DiFranco’s moment of greatest artistic and commercial triumph.

DiFranco’s particular stretch of the 1990s produced several iconic women singer-songwriters, from fresh-faced ingenues like Fiona Apple to seasoned geniuses like Tori Amos. Like them, DiFranco saw her commercial star marginalized by the artistically anodyne stylings of the middle 2000s, and she’s returned to headlining the specialized circuit she once loved. She’s probably better for it. These pre-fame recordings display an artist most comfortable with intimacy and vision.

Friday, August 4, 2017

Dollar Store Jesus

Mitch Kruse with D.J. Williams, Street Smarts From Proverbs: How to Navigate Through Conflict to Community

The other day I reviewed a business book that had an underlying moral message. This time, it goes the other way: I have a Christian book that’s essentially an encomium to capitalism. And I have the same basic responses to both: they’re okay, provided you don’t push either to its logical extreme. Tempered with moderation and accountability, either could be uplifting and game-changing; unmoored from community, either could lead to self-sanctification and arrogance.

Reverend Mitch Kruse paid for his seminary education through proceeds from selling his Internet start-up to eBay. This duality, the transition from capitalist self-marketing to Christian humility, inflects this, his second book. He has a biblically solid exegesis, based on a fairly consistent conservative evangelical read of Proverbs and, to a lesser degree, other Scripture. But the emphasis is very first person singular. It’s about I, me, my relationship with God… which isn’t what Proverbs is about.

Kruse believes monthly rereadings of Proverbs will instruct open students in judgment, discretion, and restraint. This, for Kruse, makes a working definition of “wisdom,” a form of thinking in which faithful believers sublimate their human reason to God’s will. Because Proverbs has thirty-one chapters, reading one per day will increase opportunities for learning and self-correction. Through repetition, devoted readers will acquire the habits of right thinking and self-control which Proverbs makes available to open minds.

Pursuing this goal, Kruse is a master maker of lists and other mnemonic devices. The largest part of this book delves into what Kruse calls the Twelve Words, important themes which recur in Proverbs and define its overarching goal, Wisdom. These include Righteousness, Justice, Discipline, and Learning—words which mean something different in Biblical contexts than in conversational English. Kruse defines these terms using a mix of scholarship and anecdote, in the classic sermonist style.

Mitch Kruse
My problem isn’t Kruse’s writing. He hews to a familiar homiletic style which I assume Christian writers must learn in seminary, because it recurs regularly. I might wish Kruse broke from a mold I find boring in Christian pop nonfiction, because I’ve seen it so often that my eyes skim the page, but that’s my personal problem. Rather, his choice of Proverbs, probably the most first-person-singular book of the Bible, leaves me scratching my head.

Proverbs is part of what Hebrew scholars call Wisdom Literature, four books (seven in the Septuagint, called the Catholic Apocrypha) that differ from other Biblical books. Neither history, like the Torah, nor exhortations of the people, like the Prophets, Wisdom Literature mostly involves gnomic poems and sayings about restraint, humility, and godliness. Unlike most Hebrew Scripture, Proverbs isn’t intended for the whole people; it’s specifically for kings or (as in the case of Ecclesiastes) sons of kings.

That’s why mainline liturgical churches often avoid Proverbs in pulpit ministry. We can’t agree on how it applies to most believers today. We consider it inspired, and like all scripture, useful for instruction. But to call it controversial is underselling the situation. Kruse often has to perform theological gymnastics to make Proverbs yield the communitarian thesis he promises in his title, an approach that, I’ve noticed, doesn’t much include direct citations from Scripture, including Proverbs.

Thus we’re faced with a Christian book that seldom cites the principal Christian source, a communitarian book about hierarchies, a book about Hebrew Scripture that largely eschews the Hebrew prophetic tradition. Kruse trades primarily in contradictions, which I think even he doesn’t always see. His most recurring theme holds that Christians need to subjugate their will and intellect to God’s, yet he almost entirely emphasizes individual salvation, not what God would have us actually do.

My biggest problem turns on Kruse’s distribution of rewards. His anecdotes generally follow a reliable pattern: a person Kruse knows, or knows about, ventures into willfulness and self-seeking, which ends badly. (He particularly dreads substance abuse.) That person rediscovers God’s purpose, surrenders to divine will, and gets restored. This often ends with some earthly reward: a university degree and a family, lucrative speaking gigs, media stardom. Heavenly salvation, in Kruse’s theology, generally brings earthly rewards.

Kruse never says anything I find altogether wrong. He often shares meaningful, uplifting, theologically sound precepts. But with his emphasis on individual salvation and the journey from poverty to riches, literal or metaphorical, he’s essentially sharing a Christianized Horatio Alger story. Though I often like Kruse’s message, I cannot escape the reality that, throughout, he never stops thinking like a businessman. Christianity is, for Kruse, a transaction, where the faithful hope to make a profit.

Wednesday, August 2, 2017

The Trouble With “Isms”

John Oliver, the grumpy uncle
of pay-cable comedy
John Oliver, a man who’s described his accent as being like “a chimney sweep being put through a wood chipper,” has made Net Neutrality a personal pet issue. After successfully pushing the issue during the Obama administration, he came back to it after President Trump appointed Ajit Pai chair of the FCC. For a polemicist famous for his wide-ranging interests, coming back to any issue is significant for Oliver.

For those unfamiliar with Net Neutrality, the concept is simple. Under current regulations, internet service providers have to make all web content equally available. Whether dialing up, say, an obscure blog by a struggling provincial writer (let’s just say), or the web’s most successful commerce sites, speed and accessibility shouldn’t vary. Service providers shouldn’t charge site owners for faster or more reliable consumer access.

Seems simple enough. I thought I supported the principle undividedly. Then a friend, an outspoken Libertarian who believes everything would improve if all top-level rules vanished tomorrow, claimed that Net Neutrality rules were a form of government micromanagement, interfering with a system that worked just fine without bureaucratic interference. Let whoever charge whatever to whomever they want! It’s not government’s place to get involved.

To be clear, I don’t buy the anti-neutrality argument. If service providers could charge content providers for superior access, massive corporate systems like Facebook and Google, which between them control half the online advertising revenue for the entire earth, or mega-commerce sites like Amazon, could afford extortionate rates, while struggling artists and shoestring entrepreneurs would get shunted onto second-string connections and forgotten. That includes my friend, a strictly regional business owner.

Jeff Bezos, whose Amazon controls
half the Internet commerce in the world
(Full disclosure: this blog appears on a platform owned by Google. And my book reviews link to Amazon. I couldn’t afford this service if I had to pay out-of-pocket.)

But thinking about it, I realized: I’m protecting the very big against the very big. And so is my friend. As stated, half of all online ad revenue travels through two corporations and their subsidiaries. Google owns YouTube, Zagat, Picasa, and the Android operating system; Facebook owns Instagram, Oculus, and WhatsApp. Just for starters. I’m protecting the already massive from paying to get their product onto my computer screen.

Some Google subsidiaries, like Boston Dynamics, actually produce marketable product. But Google mostly sells ads on their search engine—ads that frustratingly often lead customers to something they already wanted to find. They’re already charging small operators, like those artists and entrepreneurs I mentioned, access to get seen by people like me. Same for Facebook: it’s primarily an ad vendor. And if you don’t buy these two companies’ ads, you probably won’t get seen.

So the Net isn’t Neutral right now.

Nearly a century ago, G.K. Chesterton wrote that, for most citizens, the difference between communism and capitalism is vanishingly small. The only choice is whether we prefer to be ruled by government bureaucrats, or corporate bureaucrats. That’s clearly happening here. On careful consideration, Net Neutrality is essentially protecting the very large corporations, who are imposing their own rules on smaller companies, rules that are proprietary and therefore both invisible and arcane.

So we’re faced with the choice between protectionism, which will ensure Google, Facebook, and to a lesser degree Amazon (which controls half of all Internet commerce) can charge small operators like me to get seen by anybody whatsoever; or Libertarianism, which… um… will make these companies pay Comcast, Time Warner, and Charter Spectrum to continue doing the same thing. On balance, neither choice really protects small operators like me.

G.K. Chesterton thinks your neutrality
rules are sweet and naive
I’ve chosen, at present, not to pay either Facebook or Google for increased visibility on this blog. This means I mainly get seen by people clicking through on my personal Facebook and Twitter accounts, and my daily hits seldom exceed 100 to 150. (Oh, who owns Twitter? It’s also a corporate umbrella; it just hasn’t grown nearly as fast.) I’ve chosen not to kiss corporate ass, and the price I’ve paid is vanishingly small readership. Sad trombone.

Both “isms” depend on the premise that, if we create the appropriately utopian regulatory system (too big? Too small? Any at all?), information will flow freely. Except we need only open our eyes to realize information isn’t flowing freely. The commercially accessible Internet, as it currently exists, is essentially a joint-stock partnership between Sergey Brin and Mark Zuckerberg. There’s no sweet spot of appropriate regulation on the Internet. Because there’s no freedom for information to move on a platform that isn’t already free.

Tuesday, August 1, 2017

Business, Ethics, and the Risk of (Self-)Righteousness

Scott Sonenshein, Stretch: Unlock the Power of Less—and Achieve More Than You Even Imagined

Professor Scott Sonenshein divides the business world into two categories: chasers, who pursue more and better resources to achieve their objectives, and stretchers, who make what they already have perform double duty and produce maximum return. Sonenshein, who teaches management at Rice University, uses language from business and behavioral economics to convey his message. I was shocked, however, to notice how he made a fundamentally moral point.

A popular mindset persists, Sonenshein writes, particularly among business professionals born into the wealthy class, or among people with very narrow, specialized educations. If we had more money, this mindset asserts, or better tools, or more people, or something, we’d crack the success code and become unbelievably successful. If only we acquire something new, we’ll overcome whatever impediment stops us achieving the success waiting below our superficial setbacks.

By contrast, successful businesses like Yuengling beer, Fastenal hardware, and others, practice thrift in resource management, utilizing existing resources in innovative ways, maximizing worker control over business decisions, eschewing frippery, and making the most of everything they own. Sonenshein calls this “frugality,” a word he admits has mixed connotations. But he’s clearly demonstrating familiarity with once-common ethical standards, what economists still call the Protestant work ethic.

Sonenshein doesn’t once cite religion or morality, either implied or explicit. However, when he breaks successful businesses down into bullet point lists of best practices, like “psychological ownership,” “embracing constraints,” “penny pinching,” and “treasure hunting” (to cite the takeaways from just chapter three), the ethical correspondences become rather transparent. Take responsibility for your choices, little Timmy! Work with what you have! Save now for bigger rewards later! Et cetera.

Scott Sonenshein
From the beginning, Sonenshein structures this book much like religious sermons. His points are self-contained, backed with pithy illustrations showing real-world applications. He asserts his point, cites his text, backs it with anecdotes, then reasserts his point. The structure appears altogether familiar to anybody versed in homiletics. It persists in religion, and translates into business books like this one, because it holds distractible audiences’ attention long enough to clinch brief points.

But again, Sonenshein never cites religion. He frequently quotes research from psychology and behavioral economics to demonstrate how scrutiny supports his principles. But if he’s proffering a business gospel, it’s a purely secularized one. Though Sonenshein comes from the same mold as religious capitalists like Norman Vincent Peale, Zig Ziglar, and Og Mandino, he never relies upon revealed religion. Earthly evidence, not God’s law, demonstrates this gospel’s moral truth.

Oops, did I mention Norman Vincent Peale? See, there’s where doubts creep in. I had mostly positive reactions to Sonenshein’s points until I remembered Peale. There’s a direct line between Peale’s forcibly optimistic theology, and Joel Osteen’s self-serving moralism. We could achieve earthly success by aligning our vision with God’s… but often, already successful capitalists have recourse to God to justify their own goodness. I’m rich because I deserve it!

This often leads to retrospective reasoning—what Duncan J. Watts calls “creeping determinism.” In finding already successful operations, then applying his learning heuristic to them, Sonenshein risks missing invisible factors steering his anecdotes. I cannot help recalling Jim Collins, who praised Fannie Mae and Circuit City scant years before they collapsed. In reading Sonenshein’s anecdotes, like hearing Christian sermons, it’s necessary to listen for the sermonizer’s unstated presumptions.

Please don’t mistake me. I generally support Sonenshein’s principles. I’ve reviewed previous business books and found them cheerfully self-abnegating, urging middle managers to sublimate themselves to bureaucratic hierarchies and treat themselves basely. Sonenshein encourages workers to stand upright, own their jobs, and always seek improvement… and he encourages employers to treat workers like free, autonomous partners. Though Sonenshein never embraces an “ism,” he corresponds with my personal Distributism.

I wanted to like Sonenshein’s principles because I generally support his ethics. His belief in thrift, in embracing a mindset of ethical management, and in getting outside oneself is one I generally applaud, and strive to apply to myself (not always successfully). Though I don’t desire to control a multinational corporation, I wouldn’t mind leveraging my skills into local business success and financial independence. And I’d rather do it ethically.

But like any ethicist, Sonenshein requires readers to police themselves carefully. They must apply these principles moving forward, not looking backward. Financial success often inspires implicit self-righteousness, which business ethics can inadvertently foster. I’ll keep and reread Sonenshein, because I believe he’s well-founded. But I’ll read him with caution, because his framework conceals minefields I’m not sure even he realizes are there.