
Tuesday, May 28, 2024

The Police Are Lying Liars Who Lie To You

Thomas Perez, Jr., tears his clothing several hours into the interrogation for a murder
that he not only did not commit, but that never even happened. (San Bernardino Sun)

The mathematical subdiscipline called Game Theory uses an influential thought experiment called the Prisoner’s Dilemma. In this exercise, police question you and your friend separately for some crime. Interrogators claim to have substantive proof of your guilt. If neither of you confesses, you’ll receive moderate sentences. But if one confesses and implicates the other, the confessor walks free, while the other does hard time. Using subjective measures of likelihood, should you stand fast or confess?
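(For readers who've never seen the dilemma formalized, here's a minimal sketch of its payoff logic in Python. The sentence lengths are illustrative assumptions of mine, not figures from any actual textbook or case.)

# A minimal sketch of the Prisoner's Dilemma payoff logic.
# Sentence lengths, in years, are illustrative assumptions.
PAYOFFS = {
    ("silent", "silent"):   1,   # both hold fast: moderate sentences
    ("silent", "confess"): 10,   # I hold fast, partner confesses: hard time
    ("confess", "silent"):  0,   # I confess, partner holds fast: I walk
    ("confess", "confess"): 5,   # both confess: both convicted
}

def expected_sentence(my_move, p_partner_confesses):
    # Expected years served, given my move and my subjective
    # probability that my partner confesses.
    return (p_partner_confesses * PAYOFFS[(my_move, "confess")]
            + (1 - p_partner_confesses) * PAYOFFS[(my_move, "silent")])

for p in (0.0, 0.5, 1.0):
    print(p, expected_sentence("silent", p), expected_sentence("confess", p))

Taken at face value, confessing yields a shorter expected sentence whatever probability you assign your partner; that is the dilemma's famous sting. But the whole calculation silently assumes the interrogators told the truth about the evidence and the deal.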

I’ve seen this exercise repeated in popular science and mathematics books, and in introductory undergraduate textbooks. But the narration always focuses on the likelihood of you or your friend confessing. It consistently omits one other subjective likelihood: how likely we consider it that interrogators are lying. Under the Supreme Court’s ruling in Frazier v. Cupp (1969), courts tolerate what jurists call “deceptive interrogation tactics.” That is, police are permitted to lie with impunity.

According to the San Bernardino Sun, police in Fontana, California, arrested Thomas Perez, Jr., in August 2018 on suspicion of murdering his father. Thomas Perez, Sr., wasn’t dead; he hadn’t even been missing particularly long. Four officers grilled Perez Jr. so long and hard that, per the Sun, he struck himself, screamed, and tore his clothes. According to the Guardian, the interrogation lasted seventeen hours, long enough for Perez to become “sleep-deprived” and possibly delusional.

According to both sources, the police response looks wildly slipshod. Though Perez Jr. himself reported his father missing, the dispatch officer who took his call deemed him suspicious for being insufficiently attentive. Police therefore named him the prime suspect and brought him in for questioning before opening an investigation or gathering any evidence. In other words, police unilaterally decided not only that a crime had occurred, but who was responsible for it.

Worse, though, were the tactics employed. Police told Perez they had overwhelming evidence that didn’t exist. They insisted they had Perez Sr.’s body in the building although, as noted, he wasn’t dead; he’d simply gone on an unannounced wander. They withheld Perez Jr.’s depression and hypertension medications, claiming he didn’t really need them. There was no crime, no evidence, and trivially little investigation. Police simply fabricated everything, then used their lurid fantasies as “proof.”

After coercing a confession out of Thomas Perez Jr., police left him alone with his
dog. Moments after this still was taken, he used the drawstring from his shorts to attempt
to hang himself. (San Bernardino Sun)

Since the 1950s, police interrogators have mostly used the Reid Technique, an approach based on the most up-to-date psychological assessments of the Eisenhower era. Rather than physical force, the preferred prior technique, Reid Technique interrogators apply psychological pressure to achieve what courts still consider the gold standard of evidence, a confession. One problem: according to informed critics, the Reid Technique produces false confessions over fifty percent of the time. Its outcomes simply are not reliable.

The Reid Technique actively aims to leave suspects desperate, isolated, and dependent on police. Lies and leading questions are totally permissible. If you’ve ever watched cop dramas and heard an interrogator say “Here’s what I think happened,” that’s the Reid Technique. The technique also permits threatening suspects. Many commentators have expressed greatest outrage at interrogators’ threat to euthanize Perez Jr.’s dog, though that too was probably a lie; they likely lacked the authority to do it.

All these tools are completely impermissible when interrogating foreign combatants under the Geneva Conventions. If you remember the transnational outrage surrounding the Abu Ghraib prison scandal, you know this. Attempts to psychologically degrade prisoners are crimes against humanity. But these same techniques—lying, torture, threats, withholding medical care, and more—are perfectly acceptable when applied by American police to American citizens. Legality doesn’t matter, as any crimes get swept together under the rubric of “qualified immunity.”

Beyond the legal and moral implications, the Reid Technique just doesn’t work. Fontana police coerced a confession from Perez Jr. and continued holding him for psychiatric evaluation for several days after they knew Perez Sr. was alive. As noted, around half of Reid Technique confessions are demonstrably false, but juries often don’t know that, and treat confessions as conclusive. In my adopted home state, the Beatrice Six case demonstrates how powerful and destructive false confessions can be.

Qualified immunity bakes dishonesty into police procedure. Evaluating police work by case closures rather than accuracy creates perverse incentives to produce confessions by any available means. Even police who mean well and work honestly are measured by the same yardstick, pressuring them to adopt specious methods simply because those methods close cases. We can argue whether the police can be reformed, but one thing is clear: abuses like those of the Fontana police will always squeak through on qualified immunity.

Wednesday, August 9, 2023

The God of Justice, and the Justice of Humankind

Jesse Watters

“He just believed the election was stolen,” Jesse Watters said last week on his recently minted prime-time Fox News show. The “he” in this statement is, of course, former President Donald Trump, arraigned last week for his part in fomenting the January 6th, 2021, insurgency. According to Watters, if Trump sincerely believed his legitimate reelection was stolen, violence was justified. As Watters and Greg Gutfeld both state, proving Trump didn’t believe this is nigh-on impossible.

Hearing this last week, I mentally time-traveled to President George W. Bush’s second term. As Operation Iraqi Freedom dragged on, suffering terrible mission drift and causing incalculable harm, a right-wing talking point arose that President Bush didn’t necessarily lie in falsely claiming Iraq harbored weapons of mass destruction. Calling it a “lie,” conservative prognosticators claimed, implied Bush knew his statements were false. A “lie” wasn’t necessarily a false statement; lying required intent, which is unprovable.

In both cases, we witness conservative pundits defending Republican Presidents based not on actions, but belief. If President Bush believed, in the chambers of his heart, that WMDs existed, then he wasn’t morally culpable for deceit; he was as misled as the American people. (We now know this is measurably untrue.) Likewise, if President Trump legitimately believed the 2020 Electoral College outcomes were insidiously doctored, then his sincerity shields his actions from legal consequence.

We should immediately reject this argument. If one’s moral state shielded one’s actions from legal consequence, then Americans would never prosecute minors as adults, even for violent crimes. Yet American prosecutors do this frequently, asserting that the heinousness of crimes committed by minors, especially Black minors, overrules the diminished moral capacity of youth. In these cases, action defines morality. But pundits claim that Presidents—America’s most morally culpable people—are somehow shielded by their sincerity.

Even beyond this prima facie contradiction, foregrounding belief unearths a vipers’ nest. It introduces a twisted variation on the Christian doctrine that only God knows the contents of a human soul. Despite what we’d sometimes prefer to believe, humans can neither let somebody into Heaven, nor condemn somebody to Hell; these options belong exclusively to God. Shifting the parameters away from what Bush or Trump did, to what they believed, makes justice a divine prerogative.

The Accused

At least nominally, jurisprudence focuses not on the defendant’s morals, but upon actions. Did the accused actually hurt, steal, or kill? We may consider aggravating factors, such as whether the violence seems disproportionate. Prosecutions for first-degree murder may consider whether the actions demonstrated “depraved indifference to human life,” as by elaborate advance planning or coordination. But even in these cases, we don’t question the impurity of the defendant’s soul, but the severity of their actions.

Using these standards, we can evaluate the Presidents’ actions without considering their mental or spiritual state. Even if President Bush believed, with the solemnity of church, that Iraq possessed WMDs, members of his administration stated unequivocally that no such weapons existed. Bush notoriously overruled their objections. Likewise, Jack Smith’s indictment of President Trump takes Trump’s beliefs off the table in Paragraph 3; Smith spends 45 pages unpacking Trump’s actions, not his mental or spiritual state.

These right-wing pundits negate all questions of action by asking: did the accused know they were committing a legal or moral crime? Even laying aside the base hypocrisy of applying this question only to Presidents, they replace a legal question with a theological one. They declare Presidents, or at least Republican Presidents, members of the Elect, saved from earthly sin by God’s inscrutable movement. Their only judgement is the Final Judgement.

Earthly courts obviously cannot judge human hearts. That’s why jailhouse conversions usually don’t create thorny legal issues: if the incarcerated is legitimately penitent, well, the penitentiary has done its (supposed) job. Keep up the good work. Legitimately run courts, in the English Common Law tradition, care only about the accused’s actions—and, in Trump’s case, those actions played out on live television. Bad actions for benevolent reasons are still, in the court’s eyes, bad actions.

Even if this premise weren’t bad-faith partisanship, we should still resist this intrusion of spiritual judgement into the earthly justice system. The law does not, indeed cannot, judge what happens inside a person’s heart or mind. Though courts have some latitude to judge purposes, for instance self-defense, these conditions should remain exceptional and rare. Once courts start judging anyone’s beliefs or intentions, the state assumes God’s role. And that fact alone should cause bipartisan concern.

Wednesday, August 2, 2023

Neil Gaiman and the Road to Truth

Michael Sheen and David Tennant as Aziraphale and Crowley in Good Omens 2
This essay contains spoilers.

Neil Gaiman and Terry Pratchett’s novel Good Omens specifies that Aziraphale and Crowley, its angel and demon protagonists, don’t have a sexual relationship. Though Pratchett passed away in 2015, Gaiman maintained this parameter when adapting the novel for the 2019 BBC/Amazon joint production. Though he didn’t deny anybody their personal headcanon, he rejected the idea that Aziraphale and Crowley’s relationship was anything but platonic.

Therefore it’s sudden and jarring in the final minutes of Good Omens 2 when (seriously, spoilers) Crowley grabs Aziraphale roughly and kisses him. This is the first moment that concretely sexualizes the characters. Throughout the season, Aziraphale and Crowley struggle to engineer a meet-cute between their neighbors, Nina and Maggie. But they fail miserably, because their knowledge of human romance comes entirely from Richard Curtis movies and Jane Austen novels.

Understanding the change requires understanding the context. Though Gaiman and Pratchett share billing on the original novel, Pratchett did most of the actual writing; Gaiman, a novice prose writer, wasn’t equipped to write an entire novel. Pratchett wanted to remain faithful to the Abrahamic mythology their novel satirized, which meant that transcendent beings lacked binary gender. To pinch a Kevin Smith line, angels are as sexless as Ken dolls.

Although Good Omens 2 is co-written by Gaiman and John Finnemore, it’s the first time the setting reflects exclusively Gaiman’s vision. And it bears noting Gaiman’s other recent streaming success: Sandman on Netflix. Not only does Sandman contain a noteworthy number of same-sex couples, Gaiman even gender-swaps John Constantine, a longstanding DC Comics character, to create increased Sapphic tension. Same-sex partnerships mean something to Gaiman.

In Sandman episode 5, Bette, a diner waitress, expresses purblind views about sexual identities. She claims Judy, a regular customer, is too pretty to be a lesbian, and engineers a meet-cute (another theme) with another customer, Mark. But when John Dee, empowered by Dream’s magic ruby, forces everyone to stop lying and shed their inhibitions, Bette and Judy find themselves entangled in a passionate embrace. That, the story implies, is their truth.

Throughout Sandman, Gaiman uses same-sex relationships as shorthand for characters who follow their own moral code. Johanna Constantine, Bette and Judy, Hal Carter, and Chantal and Zelda are all depicted as characters unbeholden to convention, free of judgement, and wholly alive. This freedom isn’t necessarily “good” in any moral sense, as The Corinthian’s ravenous sexuality is second only to his murderous impulses. But it does mean one is unbound.

Shelley Conn and Jon Hamm as Beelzebub and Gabriel in Good Omens 2

Good Omens depicts a world deeply bound to binaries: good and evil, Heaven and Hell. We glimpse both eternal realms: Heaven is orderly, brightly lit, and aseptic, while Hell is noisy and cluttered, and several denizens show signs of gangrene. Both realms also keep demanding that transcendent beings like Aziraphale and Crowley, and their human allies, make binding declarations for one side or the other. They demand complete moral absolutes.

Crowley and Aziraphale, however, spend the entire series finding ways to thread the moral needle. Both beings balk, for instance, at the biblical Job’s predicament, with its requirement to kill, and gradually devise a workaround. When they find an urchin robbing graves to escape poverty, Aziraphale learns that humans face degrees of wrong, while Crowley decides that death doesn’t resolve his sympathies. Broken moral bromides litter this story like flies.

Therefore, when the series culminates with (again, spoilers) the Metatron offering Aziraphale command of Heaven’s forces, the moment pushes the limits of Gaiman’s disdain for moral absolutes. By accepting the offer, Aziraphale must accept Heaven’s moral straitjacket, something Crowley can’t do. Crowley would rather continue mapping his own moral landscape, something both beings have done successfully for millennia. But, as Aziraphale notes, they have little to show for it.

The textual evidence suggests Gaiman believes that all absolute moralities eventually collapse. But that doesn’t mean seeking one’s own moral resolution makes everything better. Johanna Constantine, John Dee, and now Aziraphale and Crowley have manufactured their own moralities, evidenced by their rejection of sexual identity myths, but they’re also terribly lonely. They occupy society’s margins, with only a few friends. Their stand requires courage and durability that most people lack.

When Crowley kisses Aziraphale, therefore, it’s arguable whether the action is sexual. Maybe the characters remain, as both authors asserted, essentially sexless. But Crowley demands, with his kiss, to know whether joining Heaven’s moral absolutes will, as Aziraphale claims, make a difference. Does morality, without context, mean anything? Neil Gaiman seemingly thinks not. Truth may be a lonely road, Gaiman suggests, but it’s the only one worth walking.

Monday, January 23, 2023

Dark Rogers and the Illusions of Power

So I’ve seen this image drifting across social media, as they do, and it got me thinking. (No, surely not you, Nenstiel!) Fans of both Fred Rogers and Star Trek will recognize this thought process. The “Mirror Universe,” a recurring Star Trek thread, depicts an interstellar civilization controlled not by a United Federation of Planets, but by a brutal Terran Empire. The Mirror Universe is antidemocratic, violent, and ruled by the strong.

Star Trek fans love the Mirror Universe to the exact extent we generally admire Gene Roddenberry’s underlying humanist ethic. Roddenberry believed the modernist myth that all human history marks an unwavering path from caveman savagery, through an arc of warring tribes and nations, toward an ultimate gleaming future defined by peace, prosperity, and the shedding of divisions. Human societies are always moving from mud-dwelling to utopia.

This involves a sort of civic Calvinism. Okay, Roddenberry himself was personally atheist, and believed the arc of history would move away from religion and “blind faith.” But in accepting the modernist myth, Roddenberry believed the Calvinist precept of “Total Human Depravity,” the principle that humans, left to ourselves, are selfish, venal, and angry. John Calvin believed Jesus Christ would redeem this innate venality; Roddenberry trusted in an evolving state.

Fred Rogers, an ordained Presbyterian minister (Presbyterianism being a form of Calvinism), had a difficult relationship with Total Human Depravity. He didn’t accept the maxim that humans are innately bad, and that Christianity must purge our sinful desires. But alongside icons of childhood like Roald Dahl and Maurice Sendak, Rogers recognized that children live in long-term states of intrinsic powerlessness. Children lash out, sometimes violently, because they have no other tools available.

Many ex-kids, especially those of us who preferred to think and build rather than compete and vanquish, have chilling memories of childhood authority. I understand now, as I couldn’t then, how many schoolyard bullies were simply reenacting the power dynamics they learned at home. But that didn’t matter when they’d literally form crowds intended to corner me, shouting and screaming and shoving. The only thing I understood then was my own powerlessness.

Rather than stopping the constant low-grade violence, school authorities encouraged us to mollify the bullies around us. Not even metaphorically, either: my parents literally told me that bullies acted out of their own terror and pain, and needed somebody willing to embrace them, regardless of their inappropriate behavior. It was, they said, my Christian moral duty to befriend the kids who made me terrified to go to school.

Current anthropology suggests early humanity wasn’t especially violent, despite what Freud, Marx, and Calvin believed. Little evidence of warfare exists before humans became settled. But then agriculture, with its limits of space and resources, took over. Just as the Biblical Cain, a farmer, killed his brother Abel, a herdsman, so early civilization invented war, and the scars which war produces. Humans had to learn how to be angry, violent, and resentful.

The meme jokes about Mirror Rogers claiming children are weak. But our Fred Rogers acknowledged that same weakness. The difference is, Fred Rogers didn’t see weakness as something to exploit or punish. Like Jesus, Fred Rogers encouraged children to embrace their weaknesses. He authorized us to care more deeply, to reach out to those suffering, to love without reserve. He didn’t promise it wouldn’t hurt, only that we would come through it.

Star Trek’s Mirror Universe, like its Klingon Empire, depicts a society ruled not by humanist values, but by chest-thumping displays of strength. Roddenberry realized, despite his ideals, that some people never outgrow their childhood vulnerability. We’ve all worked in jobs where the biggest assholes have the most friends, because just like on the schoolyard, scared peers never stop trying to mollify them.

Roddenberry, an atheist, and Rogers, a Christian, shared an underlying belief that rule by fear seems enormous when we’re in its midst, but can never truly last. Roddenberry believed society itself would reach a stage of development where it shackled our violent impulses, but the shadow self, the dark mirror, would remain. Rogers believed the fight would never truly be won, but which side we chose would define our lives.

Because yes, fundamentally, children are weak. Humans are weak. We spend our lives vulnerable to money, violence, and injustice. We never outgrow our weaknesses; we only reach a stage of development where our weaknesses no longer rule us. Whether we reach that individually or together, we’ll all, hopefully, reach a point where weakness is no longer shameful.

Friday, June 3, 2022

A Manichean Guy With a Gun

President & Dr. Biden at Robb Elementary, Uvalde, TX

“The only thing that stops a bad guy with a gun is a good guy with a gun.” NRA executive vice-president Wayne LaPierre popularized this bromide in the days following the Sandy Hook shooting in December 2012. Like its close compatriot, “thoughts and prayers,” LaPierre’s saying quickly became unmoored from its roots and warped into a caricature. It’s just another piece of faux morality clogging American rhetoric now.

But it’s also a demonstration of how American public morality has become meaningless. The division of humanity into “good guys” and “bad guys” is a form of Manichaeism: the belief that the universe is divided into the altogether good, holy, and virtuous, against the altogether bad, godless, and depraved. In traditional Christianity, Manichaeism is considered a heresy. And for good reason, too: because it makes us insensible to the world around us.

As usually happens following mass shootings, the narrative surrounding last week’s massacre in Uvalde, Texas, has turned on identifying the menace, the crime, or the “evil.” The shooter is described as a manifestation of ultimate evil—or perhaps of “mental illness,” which is frequently a sloppy shorthand for evil. Likewise law enforcement who dithered outside the school for an hour are described in terms from a medieval morality play: “cowardly,” “shameful,” “dishonorable.”

These attempts to define “evil” situate it externally, and in absolute terms. These people are defined by one overwhelming, Aesopic moral trait and, by implication, so are we. We’re reassured that we, mercifully, didn’t kill anyone, didn’t stop parents from rescuing children, didn’t cause or enable the violence. Therefore we can confidently condemn those whose actions or inactions violate our code. Calling others “evil” defends our perceived goodness.

Marjory Stoneman Douglas High School, site of the Parkland, Florida, shooting

Problem is, things are never that absolute. The Uvalde police join a host of security professionals, school resource officers, and other armed guardians who protected their own lives over those of schoolchildren. In schools from Littleton, Colorado, to Parkland, Florida, school police fled, and civilian teachers who resisted the violence were the first to die. Armchair strategists love claiming they’d rush into the fire, but few of us really would.

(On a personal note, I have rushed into violent conflicts between people better prepared for them than me. Those conflicts only involved fists, though. I strongly doubt I’d ever rush optimistically into an active crossfire. That’s why I scoff whenever somebody says “I would’ve taken the risk.” No you wouldn’t, and neither would I.)

I remember taking exception to the “good guy with a gun” narrative when it first appeared. How do we define good guys and bad guys? One acquaintance, an NRA life member and gun rights advocate, insisted we could identify good guys by their licensure. He believed that good guys had training, certification, and permits to handle and carry weapons. Goodness, for this person, is bestowed by official declaration by esteemed professionals.

Sadly, I lost contact with this individual long before last week’s shooting. I wonder how his inner narrative of goodness as accord with authority has survived the knowledge that official local, state, and federal law enforcement did nothing while civilians died. If we’ve learned anything from COINTELPRO or Ruby Ridge, it surely must be that authority and goodness aren’t synonyms—yet many people, evidently, forget quickly.

The quest for external, measurable evil has produced terrible outcomes. By imputing “evil” to others and making that characteristic their defining trait, we have, at different times, segregated people by race, economic class, national or regional origin, and the kitchen sink. When we believe that evil lives in cities, or has a certain skin color, or prays a certain way, we make it acceptable to keep others at arm’s length.

The now-infamous photo of concert-goers fleeing the shooter
at the country music festival in Las Vegas, 2017

One needn’t be Christian and believe in Original Sin, though, to realize that humans aren’t neatly sorted into good and bad. We all have the inner capacity to do tremendous good or epic evil, depending on the choices we make, the influences we let inside, and how we react to how others treat us. Moral Manichaeism lets us assign blame and, most importantly, lets us assign it away from ourselves. But that’s always false.

People who quote the Manichean heresy inevitably see themselves on the good side of the split. Wayne LaPierre’s division of humanity into “good guys” and “bad guys,” or anybody’s reliance on good vs. evil, innately ignores that these words have no external meaning. Goodness and evil aren’t things we are; they’re descriptions of things we do. And as such, they’re completely meaningless in defining how we respond to external calamity.

Friday, July 30, 2021

The Return of Cartoon Morality

A promotional still from Netflix’s Masters of the Universe relaunch

Masters of the Universe wasn’t really my thing in the 1980s. I watched it occasionally when it aired, but it mostly ran when my parents expected me to do homework, so I never followed story arcs (such as they existed in Reagan-era animation) or savvied the characters. So when Netflix proudly announced their intent to resume the show where it ended, my first reaction was to shrug.

I realize I’m at the age when the popular culture of my childhood has become mythologized. I grew up watching the Eisenhower Era and its hangover treated as somehow sacred. Nick at Nite reran Mister Ed, The Donna Reed Show, and Rowan & Martin’s Laugh-In with reverence once reserved for Bach chorales and High Church Mass. The Wonder Years was one of network TV’s biggest hits, and Classic Rock Radio first became a thing.

So maybe I should’ve been better prepared, thirty years later, when the mass media icons of my childhood became similarly fetishized. From Michael Bay’s Transformers movies to Masters of the Universe, I’ve watched the chintzy animation of my childhood, little better than commercials when they first aired, turn into half-serious mass-market art. The result has left me curious and dumbfounded.

He-Man and Optimus Prime began life, of course, as toys. With media largely deregulated during the Reagan Administration, the strict boundary between TV content creators and the advertisers who subsidized them melted away. The syndicated weekday cartoons, few of which lasted beyond a single season, mostly existed to sell toys. And I, desperate to fit in, purchased some of them, particularly Transformers.

Looking back, these shows’ storylines had messages I didn’t necessarily catch as a child. He-Man and Optimus Prime both had American accents, with greater or lesser degrees of slow Southern drawl; Prime practically sounded like a stand-up comedian doing a John Wayne impression. By contrast, Skeletor, Megatron, and Cobra Commander all spoke in British RP accents, offset by their raspy voices. Villains were easy to spot: they spoke like aristocracy with laryngitis.

The morality of these shows had the complexity of comic strips in Boy Scout magazines. Heroes and villains were unambiguous, and heroism or villainy was simply each character’s nature. Villains lived sumptuously, ate caviar, and shat upon their sidekicks, while heroes ate red meat, slept rough, and had tight friend networks. It’s tough to avoid correlating these stories’ simple morality with Late Cold War propaganda.

Michael Bay with Bumblebee, perennially the most popular Transformer

Therefore, though it’s tempting to treat recent reboots of She-Ra and Voltron as mere nostalgia, the moral complexity these characters introduced strikes me. The revamped Voltron Force’s attempts to maintain enthusiasm for their mission, set against an empire driven by strange and often poorly defined motivations, seem intended more for adults than children. Are we as Americans, the story seemingly asks, the plucky adolescent heroes, or are we the empire?

I have avoided watching Michael Bay’s Transformers movies, not out of love for cinematic grandeur and distaste for Bay’s notoriously excessive spectacle, but because the Transformers represent a piece of my childhood I’m not proud of. I purchased Transformers toys, not because I wanted to participate in their story, but because I thought I might fit in with other boys. I thought I could purchase my way into normalcy, a late-capitalist attitude that’s hard to shake in adulthood.

Therefore, when I see these artifacts of my childhood revamped for adults, with the complex morality that grown-ups know and fear, I’m left puzzled. We adults know that reality doesn’t break down neatly into good and evil, that villainy is contextual rather than innate, and that carefully placed violence doesn’t really make social problems go away. Yet exactly that attitude is being marketed back to us.

Netflix’s decision, with their Masters of the Universe relaunch, to resume the existing story rather than reboot it as they did with She-Ra, makes me wonder who these products are for. Despite attempts to popularize binary morality with stunts like the War on Terror, we just don’t live in the Cold War anymore; pitting bare-chested American virility against the skeletal Red Menace surrogate makes little sense. Who asked for this?

Then I realize: maybe we did. As the oldest Millennials have already turned 40, and Generation X has rollover IRAs, maybe we wish we really had the moral clarity we imagined in childhood. In a world where our greatest enemy is a virus, and where our fellow citizens are the greatest threat to democracy, maybe we want a bad guy we can punch. Maybe we just miss morality.

Friday, February 19, 2021

Rush Limbaugh and the Texas Freeze

Rush Limbaugh, before the consequences set in.

Two images should persist in America’s imagination during this, the harshest winter many can remember: Ted Cruz leaving the country, and Rush Limbaugh leaving the Earth. Two prominent Americans refusing to face the consequences of a situation they, as much as anyone, created. Their bizarre, inflexible moral codes have left America unprepared to face unexpected circumstances, and Texas, right now, is paying for it.

It seems almost too obvious to say Rush left America uglier and more divided. I’d rather emphasize how he turned politics into a moral imperative. No longer was politics about finding the best interpretation of facts, and basing policy on evidence. Rush’s onanistic obsession with cost-cutting and low taxes became a moral absolute, pursued with Inquisition-like zeal. He wasn’t a politician, he was a radio preacher, expounding belief and punishing sinners.

Which leads directly to events in Texas now. An entire state refused to plan for unlikely disasters, including the current extreme freeze. The power grid wasn’t winterized, and there weren’t enough snowplows. People are literally dying because the state made no preparations for a circumstance they considered unlikely. In a state that’s repeatedly cut costs and privatized its grid, spending on unlikely circumstances seemed like an unnecessary luxury.

This is particularly ironic in the state where the Galveston Hurricane killed thousands, simply because nobody planned for it. Because a massive, devastating hurricane had never struck Galveston, city fathers believed none ever could, and refused to build seawalls and other disaster preparations. Galveston has never fully recovered from the hurricane. And Texas, apparently, has never fully learned from it, either.

Rush Limbaugh didn’t cause this, of course. Having never held elective office, he had no decision-making authority in Texas, or anywhere. He did, however, create a moral landscape based on belief that economics is absolute and ineffable. Other right-wing commentators came before and after Limbaugh, certainly. But he created a cultural landscape where we define politics in exclusively moral terms, pugnaciously deny any common good, and kick the weak.

This, which we might call “Limbaughism,” is a reversal of Marxism. Where Marx believed economics drives morality, Limbaugh believed morality drives economics. Chosen economic principles don’t necessarily produce the best outcome, but are morally right, even Godly. We can argue whether those who propounded this view really believed that moral hokum; fact is, that’s the half-religious liturgy Limbaugh sold the nation.

Texas Senator Ted Cruz, caught on camera trying to leave the country

An entire state disconnected itself from the national energy grid, because its leaders, empowered by a slim majority of voters, believed doing so was moral. People are freezing, without water or power, because the moral imperative said cutting costs, no matter the consequences, would pay for itself. The mere fact that it didn’t, and the program’s chief engineer, Enron, went tits-up, didn’t impede the moral argument.

Because that’s how moral absolutes work.

Our entire system is obviously worse for lack of preparation. The Texas deep-freeze is a metaphor for America’s COVID-19 response, which itself is a metaphor for global warming. We could’ve prepared for this disaster, or any other, but we didn’t. Unlikely events don’t seem worth the expense until they strike. Even when they do strike, some people refuse to adapt, rejecting the mask, or taking their previously scheduled holiday in Mexico.

Tim Boyd, the mayor of Colorado City, Texas, was hounded from office this week for suggesting the Texas deep freeze was an opportunity to cull the weak. Those willing to burn their possessions, he implied, deserved to live. This is Limbaughism invested with public power. Elected to serve the public trust, Boyd instead insisted the public trust doesn’t exist, and told Texans they were on their own. His constituents, mercifully, hooted him out of office.

Limbaugh, like Ted Cruz and Tim Boyd, actually recognized a moral need in America. On the Right, White Christianity has failed to adapt to social changes since around 1955, becoming morally slippery and inchoate. The Left, meanwhile, has become increasingly unwilling to call anything wrong, except calling things wrong. Americans flailed for moral guidance. Limbaugh, visibly angered by the same things that angered his listeners, stepped into the void.

The morality he sold, however, was worthless. It left America unprepared for Black Swan events because money became the ultimate moral indicator. Now Limbaugh leaves Earth, just as Cruz tried leaving America, during a moment of payback. As millions of Texans pay for this morality, some with their lives, Limbaugh had better hope that transcendent justice doesn’t exist. Otherwise, he faces the consequences of the world he leaves behind.

Tuesday, December 22, 2020

Truth, and the Metaphors That Make It

If, like me, you care about politics and the relationship between people and power, then chances are your recent social media feed has looked like this:

Seven weeks after the incumbent President lost the election, after multiple court cases to overturn the vote have been dismissed prima facie for lacking evidence, after the Electoral College has certified the results, and after the Supreme Court (one-third of which the incumbent hand-selected) has unanimously refused even to consider a case it deemed specious, loyalists continue demanding their definition of truth.

This demand for “truth” strikes me. Thousands of all-caps tweets continue pouring in, asserting the truth, dammit, that Joe Biden could only have won the presidential election through dishonesty. The only material evidence of electoral malfeasance has come from inside the incumbent administration; time and again, the incumbent’s claims of cheating die for lack of evidence. Yet loyalists continue demanding “the truth.”

President-Elect Joe Biden

Late in the George W. Bush administration, Berkeley linguist George Lakoff published his book Whose Freedom? Lakoff, whose career has focused on how linguistic metaphors shape how humans perceive the outside world, observed the late-Bush-era arguments over how to define freedom. He realized that, though conservatives and progressives used the word “freedom” generously, they imputed it with very different meanings.

Electoral politics generally stands or falls according to bromides, not principles. If you tell voters you’ll tax anyone holding remunerative jobs, or that you’ll let the poor starve, they’ll vote against you in numbers sufficient to torpedo your career. So working politicians learn to traffic in generalities: letting the rich hoard resources is “freedom,” and so is lifting the poor from penury. “Freedom” is an eternally elastic metaphor.

So, I’m coming to realize, is “truth.” When the tweeter above, and the thousands of her cohorts demanding the “truth,” shout that Biden stole the election, we progressives respond, almost in unison, “Where’s your evidence? Show me the proof!” Because, for us, “truth” means accordance with reality. For claims to have truth value, they must have some real-world correspondence, something measurable enough to stand up in court.

But capital-T Truth, for defenders of the status quo, doesn’t require evidence. Truth is ultimately moral, not evidentiary; truth derives from accordance, not with the world, but with purity. If the world contradicts their Truth, then the world must amend itself, for the world is immoral. Just as morality requires us to examine ourselves and change our ways, it also requires us to change our world, by force if necessary.

Notice who, in public life, continues most assiduously defending the lame-duck administration. It’s mainly public moralists, people who think in black-and-white terms. From religious leaders like Franklin Graham and Kenneth Copeland, to secular moralists like Edwin Meese and Rick Santorum, the figures most inclined to defend this administration have a history of dividing the world into camps of good and evil, and acting accordingly.

Dr. Jill Biden

In fairness, I sympathize with this position. When I witness how our legal system continues to exclude certain populations, even as we’ve excised naked bigotry from the ledgers, I see a world plagued by moral compromise. When I swallow my objections to my coworkers using racist and homophobic slurs, because that’s the culture of industry, and purging it would leave us without skilled workers, I realize my world is unjust.

However, this doesn’t mean I can ignore reality. Elections frequently break in ways I find morally objectionable, and I’d love to overturn the outcomes; in my world, Elizabeth Warren would be preparing her incoming administration. But that’s not how the election happened. Electoral processes are devised by humans, and Arrow’s Impossibility Theorem shows that no voting system can satisfy every fairness criterion at once, so we must accept a certain level of arbitrary injustice.
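(For the curious: Arrow’s theorem concerns ranked voting systems, and the intuition behind it shows up in the classic Condorcet cycle. Here’s a minimal sketch in Python, with three hypothetical voters and ballots of my own invention, not data from any election.)

from itertools import permutations

# Three voters rank three candidates, best to worst.
ballots = [
    ["A", "B", "C"],  # voter 1
    ["B", "C", "A"],  # voter 2
    ["C", "A", "B"],  # voter 3
]

def prefers(ballot, x, y):
    # True if this ballot ranks candidate x above candidate y.
    return ballot.index(x) < ballot.index(y)

for x, y in permutations("ABC", 2):
    votes = sum(prefers(b, x, y) for b in ballots)
    if votes > len(ballots) / 2:
        print(f"A majority ({votes} of 3) prefers {x} over {y}")

# Prints that majorities prefer A over B, B over C, and C over A.
# The "will of the majority" runs in a circle, so any winner is,
# to some degree, an artifact of the counting procedure.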

To people who think in mainly moral terms, this acceptance of injustice is intolerable. Whether it’s progressives railing against the fact that we haven’t purged racism’s stain from economics, or conservatives wailing that their champion of continuity lost, any level of perceived injustice is unacceptable. Boring old evidence isn’t a meaningful counter-argument to morally founded truth; only a superior morality can reverse their purified thinking.

We must relinquish the idea that we can out-argue defenders of the status quo based upon evidence. They don’t want arguments based upon facts (though some, like Ben Shapiro, claim they do), because to them, this world is less true than the Truth. Even quoting their holy scripture generally doesn’t dissuade them. Truth, to them, is an eternal verity, not a worldly fact; we can only respond with better verities.

Monday, July 27, 2020

There Is No Opioid Epidemic (Part 2)


Three years ago, I wrote an essay entitled “There Is No Opioid Epidemic.” It got several positive responses and, when shared by a popular Colorado yoga practitioner, became my first essay to go viral. So I assume it must’ve struck a chord with readers. Amid a one-note media chorus of professional chin-waggers complaining about the opioid epidemic, readers were apparently happy to hear somebody say their suffering wasn’t invalid.

When I re-shared it yesterday, on its three-year anniversary, a friend contacted me with her story. She suffers a chronic disability whose symptoms include immobilizing chronic pain. (This friend, and the stories she’s shared recently, have made me aware how pervasive disability-based prejudices are in American society.) She told me she’d been prescribed opioids, but had the prescription yanked for fear she might, in the future, become addicted.

Not, please note, that she’s currently addicted; rather, that she might become addicted in some hypothetical future. Her doctor admitted that, under pressure from the DEA and other authorities, she’d been forced to restrict the number of opioid prescriptions she wrote. So my friend got handed a prescription for another drug that doesn’t work as well, leaving her in a constant state of pervasive, low-level pain—not debilitating, but restrictive.

Thinking about this, I’ve struggled to comprehend the reasoning. Clearly, the authorities responsible for regulating American drug behaviors believe chronic addiction is a worse affliction than chronic pain. We’ve certainly been conditioned to believe this through generations of “This Is Your Brain On Drugs” messages, coupled with police interventions and DARE programs. We’ve heard for years that becoming addicted is always, innately, a fate worse than pain.

British journalist Johann Hari writes, in Chasing the Scream, that Harry J. Anslinger, America’s former top narcotics officer, had a personal aversion to addicts. Owing to a complex and nuanced experience in childhood, he believed addicts weren’t just suffering, they were actively bad people. Because Anslinger had a hand in negotiating the end of World War I, President Wilson gave him a blank check for his post-war public service career.

Anslinger chose to dedicate his life to punishing addicts. He headed the Federal Bureau of Narcotics (a forerunner of the DEA) for thirty-two years, one of the longest public appointments in American history. Using this leverage, he pushed legislators to enact laws intended to punish addicts, enforce private morality, and force people to be, in Anslinger’s view, good. All because Anslinger had an inflexible moral attitude.

Harry J. Anslinger, near retirement in 1962

Even this explanation doesn’t really satisfy. Though the unprecedented power Anslinger enjoyed (if that’s the word) allowed him to enforce his moral prejudices, President Nixon declared the War on Drugs in 1971, nearly a decade after Anslinger retired, and years after abundant science demonstrated that many illegal drugs, particularly cannabis, aren’t that dangerous. Clearly the prejudice persists beyond one person’s preconceptions.

In 1994, John Ehrlichman, Nixon’s former deputy, admitted to a journalist that the administration turned hostile to drugs, in brief, because they needed camouflage. Busting drugs became a convenient legal condom that allowed Feds to harass Civil Rights leaders and anti-war demonstrators without contracting the taint of racism or war-hawk-ism. In other words, for Nixonites (and Reaganites after them), addicts were bad people because they were different.

This has been extensively documented. Legal scholar Michelle Alexander records that, though it’s illegal to stop somebody simply because they’re Black, race is nevertheless an acceptable qualifying condition. In other words, police can randomly stop somebody because they’re Black and something else. The law is so inclusive that “something else” can include basically anything that a reasonably healthy imagination could conjure. And it’s almost always something drug-related.

So basically, when drug enforcement officers force my friend’s doctor to take her off pain control medications that work, and put her on something marginally better than a placebo, they say they’re preventing her from becoming an addict. But really, they’re preventing her from becoming a nonconformist. Our drug regulations, which doctors must comply with to remain licensed, are based on moral fears of hippies and African Americans. We can’t let a White woman become… that.

It boggles my mind that, in 2020, these prejudices remain so persistent. Worse, the moral panic surrounding opioid “abuse” has been perpetuated by media forces marching in lock-step. The same newsrooms that want congratulations for holding Donald Trump to account continue reinforcing Harry J. Anslinger’s class-based prejudices, and Nixon’s racism. Eighty years later, it’s time to show some moral independence, and let these old, outdated ideas finally, mercifully, die.

Monday, June 1, 2020

The Danger of Worshiping Saint Martin

Dr. King, center, commences a march

White people love trotting Doctor King out of retirement whenever civil unrest happens in America. Like, say, now. We love pictures of him crossing the Edmund Pettus Bridge, linking arms with a mixed-race coalition, resolute and heady. We especially love tossing orphaned quotes around heedlessly, stripped of context: “content of our character,” perhaps, or “Hate cannot drive out hate; only love can do that.” We love that.

Having grown up surrounded by anodyne White suburbia, in neighborhoods and schools with minimal diversity, I was exposed to a sanitized, low-risk version of Martin Luther King, Jr. I heard some of his speeches, like “I Have a Dream,” or “I Have Been to the Mountaintop,” soul-stirring examples of rhetoric which fired a boy’s imagination. I grew up never not knowing who Doctor King was, for which I’m certainly thankful.

However, the version of Doctor King I knew was sanitized and made into a simple morality narrative. This version challenged racism, embodied in naked bigotry of Jim Crow segregation and Bull Connor crackdowns, and won. The stories placed an individual of surpassing morality and personal rectitude, against a system so riven by internal rot that its defeat was written into its structure. The end was always inevitable.

In short, the schoolbook version of Dr. King I learned was a saint.

This version excluded the great complexity of King’s struggle. For instance, I too saw the pictures of King crossing the Edmund Pettus Bridge. Not until college did I see photos of what happened next: Alabama police hitting his marchers with batons and loosing German shepherds on protesters who’d already fallen. I learned in fourth grade that King received the Nobel Peace Prize. In college I learned many Americans protested this award.

Nor did King’s complexity end externally. He also had significant problems within his own soul. Famous for his forward-thinking engagement on racial issues, King had retrogressive attitudes about women, and held organizational positions vacant for years rather than let women take charge. This gendered thinking manifested in his now-extensively documented adultery, and accusations of far worse.

Dr. King depicted as a literal saint,
Martin Luther King of Georgia

Further, as Ibram Kendi demonstrates, the centrist, cooperative King beloved by White moderates, drawn from early in his career, is often at odds with the frequently darker, more confrontational King of his later career. Early King often soft-pedaled White abuses, and urged Black Americans to live up to impossibly high White standards to achieve an ever-elusive level of acceptance. He later acknowledged this mistake, but the record already stood.

Even his strengths were far from perfect. While he made great inroads against American racism, and saw many forms of outright oppression banned, he didn’t really end racism; it simply went underground. Nor did his concerns end with racism. Later in life, he described the Giant Triplets of Evil: Racism, Militarism, and Materialism. Remember, he died in Memphis, where he had gone to support striking sanitation workers; his economic concerns remain largely untouched.

This reveals an important rift between Saint Martin, the man whose bold stands make clear moral lessons for today’s children, and Dr. King, the man whose struggles remained largely unresolved at his death. The schoolbook version of King, trotted out in Internet memes whenever Black Americans become restive, is definitely the sainted version, always beneficent, never ruffled. The real man became frequently angry and frustrated.

Sainthood can provide powerful instruction in moral goodness. When somebody has accomplished something so outstanding that their work becomes memorable after their deaths, we can study what they achieved, and how. Then we can mimic them until we internalize their moral strengths and become able to act independently. That, presumably, is why churches canonize saints and other holy figures.

But sainthood also freezes people in moments without context. The rush to canonize, say, Mother Teresa collided with important questions raised in her journals, published only posthumously. We now know, as her contemporaries didn’t, that she struggled with deep doubts, wondering not only whether God noticed her actions, but even whether God existed. Her records show she didn’t find the answers she sought in this life.

Freezing Dr. King this way, eliminating his dark side and ignoring the fights he didn’t win, teaches today’s audiences the wrong lessons. It makes us perceive setbacks as permanent, doubts as disqualifying, and sins as irredeemable. If even Dr. King, the great saint of my childhood textbooks, could hold awful opinions, and lose his most important battles, it means my efforts, however thwarted, still matter. That’s much more valuable to me than Saint Martin.

Wednesday, May 22, 2019

Death Can Only Be Understood By the Survivors


This beautiful long-haired house panther is my boy Pele. I know remarkably little about his backstory. I don’t know where he came from, how many homes he’s had before mine, or even how old he is. The only clear, inarguable fact I know about him is that I adopted him from my co-worker Jeff during the upheaval surrounding Jeff’s expensive, acrimonious divorce. And that Jeff died last week.

Pele loves to cuddle in my lap and bury himself under my arm. He loves laps so much, in fact, that I’m typing this essay with great difficulty, because he’s draped himself across my forearms, with his paws wrapped around my left arm in a big bear hug. He desperately craves human attention, and when he discovered that I sleep lying on my side, curled into a semi-fetal position, he decided the center of that curl is a pretty cool place to be.

Jeff loved alcohol. Though he was a frequently diligent employee, who didn’t hesitate to accept additional task assignments and overtime hours when needed, he also repeatedly showed up to work at 7 a.m. with beer already on his breath. He told giddy stories of various exploits he’d accomplished, stories which almost invariably began with him already being wasted. He had one of the worst cases of alcoholic rosacea I’d ever seen.

I hesitate to say too much, because Jeff was actively in my life less than one year (I’ve known his cat nearly three times as long as I knew him), and because he leaves behind a ten-year-old son who doesn’t need a dark cloud over his adolescence, or anyway a darker cloud than he’ll have growing up without a father. But I’m a writer, and like most writers, I can’t comprehend difficult situations without writing about them. I hope Jeff will forgive me.

Problem drinking is widespread in my workplace. At least three co-workers are capable of drinking a twelve-pack on a weeknight and still showing up for work the next morning. One co-worker won’t be eligible to have his driver’s license reinstated until 2021. I’ve witnessed colleagues arriving for work so thoroughly hung over, they needed to find secluded spots away from bosses’ gaze to grab quick naps before beginning the productive day.

Alcohol is, of course, a painkiller. Before anaesthetics were invented, doctors used brandy and bourbon to numb patients before surgery and dentistry. Nowadays, people use alcohol to numb their brains against the maladaptive effects of lifelong trauma. Scratch below an addict’s surface, I have learned, and you’ll find somebody who survived something horrific, usually at a very early age. The issue isn’t whether, it’s what.



Sadly, I never knew Jeff well enough to understand his full story. He fleetingly mentioned an adversarial relationship with his own father, but always changed the topic quickly. He was determined to not repeat his father’s mistakes with his own son; but he also hadn’t yet grappled with his own history, and therefore needed to numb the pain artificially. So he surrounded himself with living things he could love.

Besides Pele, he had two dogs, an energetic little lapdog and the chillest retriever mix I ever met. As his marriage crumbled, and he saw less of his son amid a difficult custody battle, he doted religiously on his animals. But as his divorce dragged on interminably, he couldn’t make house payments, and eventually needed to move back in with his mother—a humiliating concession for a 46-year-old man. So he needed to re-home his animals.

He was red-eyed, and even drunker than usual, the day I arrived to take Pele home with me.

Not much later, Jeff got into a heated argument with a manager and walked off the job forever. I only saw him twice after that. Both times, he had his son with him, as well as his wits. But I heard stories from other colleagues who ran into him without his son. His drinking had apparently intensified; one reported he’d begun suffering minor hemorrhages because his capillaries were shot. I also heard he’d begun shuffling when he walked, like a much older man.

I wonder whether Pele is capable of understanding Jeff’s absence. Like me, he knew Jeff less than one year. I’ve cuddled Pele and talked to him about what happened, but he just blinks his pretty golden eyes, so I don’t know. I’m typing this through tears, while Pele purrs contentedly in my lap. Maybe he knows he’s loved right now. Maybe that’s enough.

Thursday, October 18, 2018

In Praise(-ish) of Conformity

David Sloan Wilson

In the school where I attended second grade, our classroom was two doors down from a kindergarten class. The kindergartners had to walk past our door to reach theirs. Several of my classmates had a favorite taunt they employed whenever the kindergartners wandered too close:
Kindergarten babies!
Stick your head in gravy!
Wash it off with bubblegum
and send it to the Navy!

I resisted singing along as long as possible. First, because it seemed just mean, running little kids down for being little. Hell, I'd been a kindergarten baby just two years earlier. Then, because I'd just moved into that area myself, and had as little in common with my classmates as with the kindergartners.

Yet before long, the dirty looks from my classmates became overwhelming. My silence marked me as an outsider. And be real, I had to interact with my classmates daily, while the kindergartners remained virtually strangers. What else could a kid with few friends do? To my later shame, I started singing along with the bullies’ taunt.

We're accustomed to thinking of “conformity” as something weak-minded people do, a zombie-like behavior. We often couple conformity with the word “mindless.” Yet evolutionary biologist David Sloan Wilson, in his book Darwin's Cathedral, lists conformity as a necessary precondition to build human society. We can't get along unless we accord with others’ behavior and expectations.

Several benign actions serve to advance productive (rather than mindless) conformity. Small talk is one, though I cringe to admit it. Clichés in speech and writing are another, since they let speakers share a background of reference. As any football fan, science fiction convention-goer, or political party devotee knows, engaging in chants and songs is a powerful group-building act.

We see this in religious songs. When Lutherans sing “A Mighty Fortress Is Our God,” or Methodists sing “O For a Thousand Tongues To Sing,” they confirm their group identity. These songs contain the germinal forms of their group theology, but for religious purposes, the lyrics are secondary. The point is, we sing them together.

Colin Kaepernick
Nobody would mistake “Kindergarten Babies” for secular hymnody, but it served the same purpose. By singing it together, we confirmed we'd passed beyond the ignorance of infancy (“those who dwelt in darkness have seen a great light”). We also confirmed our identity as mature, diverse minds prepared for life's strange and dangerous exigencies. Duh, we were seven!

One of today's most inflammatory issues concerns the correct way to handle a national identity song. Must we all, as one side contends, stand at attention in absolute unison? Or may we, as the other side contends, kneel and pray as our conscience dictates?

This isn't a thought experiment. The two sides feud for control of how we express our group identity. One side says we're a martial people defined by our loyalty to the hierarchy (remember, the national anthem is a military song). The other says we're a people of morals and principle, and sometimes we're most American when we defy the American state.

Arlie Russell Hochschild, in her book Strangers In Their Own Land, interviews several people living in strongly conservative areas. She discovers that many have what, to her, sound like progressive values. Some are committed to environmental protection, others to economic fairness, others to their own causes. Yet in the voting booth, time and again, they vote for the party that opposes their pet issues.

Arlie Russell Hochschild
Hochschild, a scholar, avoids attributing intent to this disconnect. I have no such restraint. Like me, swallowing my principles to sing “Kindergarten Babies,” they'd rather get along with the people they have to live with every day, than be morally pure and lonely. This uniformity makes these individuals into a people.

But does it make them a people they'd like to live with?

The difference between productive conformity and mindless conformity is often visible only at a distance. I don't mean physical distance, outsiders standing around passing judgment. I mean time distance: I now regret singing “Kindergarten Babies” because I'm an adult who knows the difference between building community and buying fifteen minutes’ peace. Once-popular actions like Operation Iraqi Freedom mean something similar to the nation.

In short, we need conformity to survive. But we're lousy judges, in the moment, of the difference between productive and mindless conformity. We need constant guidance and reminders, and even that isn't foolproof. I have no answers yet. But I think I have better questions, and that's maybe more important than facile answers at this point.

Wednesday, March 28, 2018

Ethics For Unbelievers: a Ten-Point Plan

James Miller, A Better Ten Commandments: A Guide To Living With, and On, Purpose

Over 130 years ago, Friedrich Nietzsche postulated that a new philosophy was dawning, a moral structure without recourse to God. But Nietzsche punted on what that philosophy actually looked like. Ever since, unbelieving philosophers great and small have attempted to step into that gap, but their philosophies have generally suffered terminal vagueness, lacking a firm foundation. They’ve mostly offered bromides like “Live Well” and “Be Creative.”

Entrepreneur James Miller becomes the latest to join this chorus, and his moral philosophy is remarkably good. His code, though sometimes suffering the broad generalities that plagued Camus and Russell, nevertheless provides a framework people can use to live a fulfilling and meaningful life, without turning to higher powers. Though a believer myself, I find plenty to like about Miller’s slim, plain-English philosophy. But it isn’t really Ten Commandments.

In his introduction, Miller gives a brief autobiographical précis, including how he came to disbelieve conventional religion, and explains his reasoning process. He pooh-poohs Moses’ original Top Ten because rigid obedience to concise laws creates moral contradictions—a conclusion that wouldn’t surprise theologians from Augustine to Bonhoeffer. Modern, pluralistic society needs more elastic ethics, responsive to today’s difficult moral throes.

Yeah, probably.

Then Miller disparages Moses’ Commandments as “common sense.” That’s problematic. I’ve read sociologists like Duncan J. Watts, who agree that common sense seems obvious mostly because we already know the answer. Though unbelievers could seriously dispute stuff about graven images and keeping the Sabbath, laws about murder, theft, and adultery still must be written down, because some people need them spelled out in black and white.

Okay, laying that trepidation aside, what do Miller’s actual “commandments” look like? Pretty good, actually. Exhortations like “Be the Best Version of Thyself,” “Find Perspective,” and “Cultivate a Rational Compassion” make admirable life goals, especially as Miller unpacks exactly what he means. Dedicated opponents could nitpick Miller’s precepts for contradictions, of course, but that’s true of all moral codes, religious or secular. Miller’s code is malleable enough to encompass difficult conditions.

James Miller
My problem isn’t Miller’s precepts; it’s his reasoning. Moses’ Commandments represent a floor. God supposedly declared these ten rules a baseline below which we cannot sink and still call ourselves His people. Levitical law works upward from that, creating more intricate rules about why we can’t eat shellfish, razor our beards, or weave cloth from two fabrics. Moses’ Commandments are bottom-up reasoning.

Miller’s Commandments, by contrast, represent top-down reasoning. His Commandments are well-meaning goals we should strive toward, but they require justification for why they’re important, or what they even mean. The justifications he provides are quite good, drawing on an impressively catholic selection of Eastern and Western philosophy, different religions, science, and more. His reasoning is well-supported, yet somehow never quite finds its floor.

Anyone who’s ever had or worked with children knows it’s possible for obstreperous two-year-olds to lapse into an infinite regression of “But Why?” Adults do something similar when pushing moral boundaries, coming up with reasons why fusty ethics don’t apply in my situation. Parents eventually fall onto the “just because” argument, and religions do much the same, declaring that God unilaterally forbids murder, theft, and adultery, just because, so shhh!

Miller implicitly accepts some moral floor exists somewhere. Early on he writes: “Of course, no rule is perfect, so I must insist on a few caveats. If being the best version of yourself includes unethical or unsustainable behaviors, this rule doesn’t really pan out.” Literally, that’s on page two; versions of this thinking get repeated periodically throughout. So ethics and sustainability exist beneath Miller’s given Top Ten. What, then, are they?

I can’t really say. Though Miller accepts that a Platonic ideal of goodness exists somewhere, it remains abstract beneath his standards. Therefore his standards, good though they are, remain beholden to some foundation, somewhere, which we still hope to discover. Without an absolute bottom line, Miller’s “commandments” remain subject to “But Why?” thinking which could regress infinitely, every transgression rationalized one step at a time. That’s how social corruption works. We must draw the line eventually. What does that line look like?

I still don’t know.

I’ll give Miller credit: he provides a workable second-tier moral code without reliance on religion or divine revelation. Readers who already have some idea what ethics and sustainability mean to them could apply his principles productively. But his moral floor remains vaguely defined, the problem which has plagued skeptical philosophers since at least David Hume. Miller provides a good moral framework, admittedly. But it’s too soon to call these precepts Commandments.

Friday, January 12, 2018

The Week America Finally Surrendered

Yes, Oprah, I agree. This shit needs to stop.
Oprah for President, a pastor getting a standing ovation for admitting statutory rape, and “shithole countries.” The second week of 2018 really feels like the week America went off the rails. I’ve tried processing everything that’s happened in the last seven or so days, and been unable to do so. It seems too radical, violent, and spasmodic to permit definition. Until I recognized the overarching theme: a willful embrace of unreason.

It’s become commonplace over the last two years for the punditocracy to claim we’ve finally crossed a bridge too far. Donald Trump has finally alienated his base. Coal-burning companies have finally overloaded the climate. Papa John’s comments about NFL kneelers prove the far-right’s moral vacuity. “Look!” the pundits scream. “Proof, proof I say, that we’ve hit rock bottom and are prepared to reverse course!” Somehow it keeps not happening.

Yet somehow, things feel different this week. We didn’t just see somebody doing something awful. Despite left-wing pledges one year ago, we’ve already permitted truly awful behavior from public figures to become sufficiently “normalized” that we’re not shocked anymore. But this isn’t merely awful behavior. These three incidents represent America completely abandoning historical precedent, moral foundation, and common decency, to embrace… well, I’m not entirely sure what.

It began with the “Oprah for President” outcry following her Golden Globes speech. Though probably well-meant, this push is the exact leftist equivalent to Donald Trump’s overthrow of Republican hierarchy. Pinching concepts from linguist George Lakoff, if Donald Trump is America’s “strict father,” Oprah is our “nurturing parent.” But both share an ideological core of rejecting expertise and routine competence, in favor of giving the political establishment a massive middle finger.

Pastor Andy Savage received a standing
ovation when he admitted a "sexual incident"
with a parishioner. He was 23. She was 17.
Before Oprah’s dust settled, Pastor Andy Savage confessed a “sexual incident” in a Sunday sermon, a confession that garnered a twenty-second standing ovation. Like David Letterman before him, Savage confessed his indiscretions to forestall his accuser going public. But he sought forgiveness without repentance; he sought what Dietrich Bonhoeffer called “cheap grace,” the forgiveness we bestow upon ourselves. Importantly, he hasn’t relinquished his liturgical authority.

The collapse culminated (hopefully—the week isn’t over yet) with President Trump’s “shithole countries” comment. I’ll resist my temptation to condemn these comments on Biblical “least of these” grounds, because doing so won’t persuade anyone not already persuaded. However, Trump’s comment actively spit upon American commitments going back at least to the Marshall Plan, when Americans agreed we have obligations to poorer, bleaker, less fortunate nations globally. It’s an abandonment of history.

These three incidents demonstrate a certain subset of America has come unmoored from the principles it claims to represent. By embracing Oprah, the American left has admitted commitment, competence, and dedication no longer matter in governing Earth’s most powerful nation. By escaping any demand for penance or surrender of authority, Andy Savage proves even Christians prefer established power over moral foundation. And Trump has essentially relinquished America’s claim to morality on the world stage.

Somebody mounting a counter-argument might observe that, in all three cases, only a minority actually believes that. Oprah ginned up a strong reaction, but the Democratic party remains committed to process and organization. Andy Savage represents only one congregation, and has received massive Christian pushback. And Donald Trump has the lowest approval ratings of any President at this stage of his administration.

I respond: yes, but it doesn’t take a majority. Donald Trump won only approximately one-third of his party’s primary votes, and finished second in the general election’s popular vote. What matters isn’t the majority, but the process. Wing-nuts and lunatics can seize the process without actually winning the debate. And that’s what we’re seeing happen: because Democrats now have to answer Oprah fanatics rather than creating policy, for instance, Oprah has appropriated the system.

Irrationality isn’t entirely bad. Behavioral economist Dan Ariely has demonstrated humans’ irrational tendencies have firm foundations, which actually drive a just-minded and functional society. Indeed, complete rationality, of the homo economicus model, is both untenable and potentially downright harmful. But I’m not discussing ordinary, moment-to-moment irrationality. I’m describing a deliberate, long-term rejection of reason, and the lessons of the Renaissance and Enlightenment.

No, the problem isn’t irrationality, which is inevitable, even beneficial. The problem is defiance of what we know, an active retreat from thinking, preferring animal-level gut reactions over evidence and proof. American public discourse now apparently prefers stupid over smart. We’ve relinquished our past, sat on our asses, and forgotten our identity. I seriously question whether we’ll now ever get it back again.

Tuesday, August 1, 2017

Business, Ethics, and the Risk of (Self-)Righteousness

Scott Sonenshein, Stretch: Unlock the Power of Less—and Achieve More Than You Even Imagined

Professor Scott Sonenshein divides the business world into two categories: chasers, who pursue more and better resources to achieve their objectives, and stretchers, who make what they already have perform double duty and produce maximum return. Sonenshein, who teaches management at Rice University, uses language from business and behavioral economics to convey his message. I was shocked, however, to notice how he made a fundamentally moral point.

A popular mindset persists, Sonenshein writes, particularly among business professionals born into the wealthy class, or among people with very narrow, specialized educations. If we had more money, this mindset asserts, or better tools, or more people, or something, we’d crack the success code and become unbelievably successful. If only we acquire something new, we’ll overcome whatever impediment keeps us from achieving the success waiting beneath our superficial setbacks.

By contrast, successful businesses like Yuengling beer, Fastenal hardware, and others, practice thrift in resource management, utilizing existing resources in innovative ways, maximizing worker control over business decisions, eschewing frippery, and making the most of everything they own. Sonenshein calls this “frugality,” a word he admits has mixed connotations. But he’s clearly demonstrating familiarity with once-common ethical standards, what economists still call the Protestant work ethic.

Sonenshein doesn’t once cite religion or morality, whether implicit or explicit. However, when he breaks successful businesses down into bullet-point lists of best practices, like “psychological ownership,” “embracing constraints,” “penny pinching,” and “treasure hunting” (to cite the takeaways from just chapter three), the ethical correspondences become rather transparent. Take responsibility for your choices, little Timmy! Work with what you have! Save now for bigger rewards later! Et cetera.

From the beginning, Sonenshein structures this book much like a religious sermon. His points are self-contained, backed with pithy illustrations showing real-world applications. He asserts his point, cites his text, backs it with anecdotes, then reasserts his point. The structure appears altogether familiar to anybody versed in homiletics. It persists in religion, and translates into business books like this one, because it holds distractible audiences’ attention long enough to clinch brief points.

Scott Sonenshein
But again, Sonenshein never cites religion. He frequently quotes research from psychology and behavioral economics to demonstrate how scrutiny supports his principles. But if he’s proffering a business gospel, it’s a purely secularized one. Though Sonenshein comes from the same mold as religious capitalists like Norman Vincent Peale, Zig Ziglar, and Og Mandino, he never relies upon revealed religion. Earthly evidence, not God’s law, demonstrates this gospel’s moral truth.

Oops, did I mention Norman Vincent Peale? See, there’s where doubts creep in. I had mostly positive reactions to Sonenshein’s points until I remembered Peale. There’s a direct line between Peale’s forcibly optimistic theology, and Joel Osteen’s self-serving moralism. We could achieve earthly success by aligning our vision with God’s… but often, already successful capitalists have recourse to God to justify their own goodness. I’m rich because I deserve it!

This often leads to retrospective reasoning—what Duncan J. Watts calls “creeping determinism.” In finding already successful operations, then applying his learning heuristic to them, Sonenshein risks missing invisible factors steering his anecdotes. I cannot help recalling Jim Collins, who praised Fannie Mae and Circuit City scant years before they collapsed. In reading Sonenshein’s anecdotes, like hearing Christian sermons, it’s necessary to listen for the sermonizer’s unstated presumptions.

Please don’t mistake me. I generally support Sonenshein’s principles. I’ve reviewed previous business books and found them cheerfully self-abnegating, urging middle managers to sublimate themselves to bureaucratic hierarchies and debase themselves. Sonenshein encourages workers to stand upright, own their jobs, and always seek improvement… and he encourages employers to treat workers like free, autonomous partners. Though Sonenshein never embraces an “ism,” his outlook corresponds with my personal Distributism.

I wanted to like Sonenshein’s principles because I generally support his ethics. His belief in thrift, in embracing a mindset of ethical management, and in getting outside oneself is one I generally applaud, and strive to apply to myself (not always successfully). Though I don’t desire to control a multinational corporation, I wouldn’t mind leveraging my skills into local business success and financial independence. And I’d rather do it ethically.

But like any ethicist, Sonenshein requires readers to police themselves carefully. They must apply these principles moving forward, not looking backward. Financial success often inspires implicit self-righteousness, which business ethics can inadvertently foster. I’ll keep and reread Sonenshein, because I believe he’s well-founded. But I’ll read him with caution, because his framework conceals minefields I’m not sure even he realizes are there.