Monday, January 30, 2023

Humanity as a Man-Made Phenomenon

James Paul Gee, What Is a Human?: Language, Mind, and Culture

As with “freedom” or “democracy,” most people assume we carry a working definition of “humanity” in our heads, and it works adequately most of the time. But this loosey-goosey approach to human essentialism has caused harm throughout history. War and slavery have let powerful people strip the social designations of humanity from strangers, while belief in human exceptionalism currently threatens humanity’s very existence through anthropogenic climate change.

In his youth, James Paul Gee trained for the priesthood, but after losing his faith, he earned a Ph.D. in linguistics. This duality probably influenced the interdisciplinary nature of his subsequent work, on subjects such as the tacitly public nature of literacy and the social interpretation of video games. This book, written as Gee retired from active academia shortly before the pandemic, is the culmination of his life’s work.

Gee identifies human nature through a balance of extremes. Each human is utterly unique, he writes, but unique human attributes manifest themselves mainly through social context. Therefore we are separate, circumscribed by the limits of our senses in the world, but we’re never truly separate, as we rely utterly on relationships with other humans and the natural world. We lack “free will,” a sludgy and imprecise term, but that lack doesn’t justify determinism.

Past attempts to define humanity have foundered on the lack of nuance inherent in brevity. Recall Plato defining a human as a “featherless biped,” and Diogenes responding by brandishing a plucked chicken. Gee makes no such mistake here. His definition of humanity sprawls over 200 pages, sometimes narrowly focused on precise scientific findings, other times expanding to encompass philosophic maunderings and autobiographical anecdotes. Brevity isn’t Gee’s weakness.

Humans, to Gee, exist in community; he uses termite mounds as his metaphor (sometimes stretched to breaking). Obviously we rely upon others to divide labor, collaborate on labor, and amplify our thought processes. But we don’t just exist in community; we are ourselves communities, what Gee calls “transacting swarms,” made up of our microbiomes and our relationship with the earth. We live in termite mounds, and we are termite mounds.

James Paul Gee

But Gee distrusts the mechanistic materialism of Richard Dawkins and Daniel Dennett. Just as humans have organic biomes, we have “spiritomes,” the complex nest of spiritual realities in which humans dwell, individually and collectively. Though Gee, a lapsed Catholic, flinches from capital-T Truth claims, he believes human spiritual subjectivity is real enough to matter in making life-altering decisions. We all have relationships with evidently noncorporeal realities.

To this point, Gee’s thesis draws heavily on research from other thinkers grounded in the physical sciences. Not surprisingly for a linguist, Gee becomes most dense and detailed when discussing how language shapes the human mental structure. He admits coming from a Chomskian generative linguistic background—fascinating but often abstruse. But exactly how he believes language shapes humans may surprise you.

Gee admits to never reading poetry until after earning his doctorate. How he studied linguistics without at least a historical survey of poetic metaphor eludes me, but whatever. Gee waxes rhapsodic about what a revelation it was discovering poetry in adulthood, unclouded by state-school “skillz drillz.” The unsullied joy he describes bespeaks a wonder that we who still read poetry often struggle to recapture. I’m downright jealous.

Despite his sometimes scientistic mindset, the humanities offer Gee his greatest insight into the relationship between our outer, communal world and the strictly internal neural landscape of senses and higher reasoning. We perceive the world according to our senses, and also according to our ability to describe it to others. His lavish fondness for poetry, especially Emily Dickinson, reflects a worldview in which subjectivity isn’t a weakness but a defining trait.

To his credit, Gee doesn’t pretend his definition is more binding or global than it actually is. He acknowledges that any definition of humanity is provisional and circumscribed by the author’s background and prior knowledge. His language is colored by nuance and the frequent need to walk a tightrope between seemingly contradictory positions. He invites informed readers to challenge and refine his definition of humanity; he doesn’t just stand pat.

Now past seventy, Gee writes with one eye angled toward how posterity will remember him. He clearly intends this volume as a capstone to his academic career. He finished writing in the months leading up to the pandemic, and one wonders how this book might’ve looked just six months later. Yet as a prolegomenon to future humanistic studies, Gee offers an exciting, readable, and purely joyful philosophic consideration.

Thursday, January 26, 2023

Advanced Placement and Unadvanced Schools

Florida Governor Ron DeSantis in his favorite pose: angrily lecturing the crowd

The College Board’s Advanced Placement program has existed since 1952, and in that time, no state has refused to certify an AP course. Until last week. In the latest salvo in his one-sided war against “wokism” and “Critical Race Theory,” Florida Governor Ron DeSantis issued a blanket refusal to participate in a pilot program for African American studies. This despite the course not even being offered in Florida yet.

I never participated in Advanced Placement in high school. I was slated to take AP American History and AmLit in 11th grade, the first time that school offered any AP courses at that level. But circumstances changed and my family relocated that summer, to a school where AP only existed in 12th grade. I handled the adjustment with resentment, and mentally checked out of school, nearly flunking Senior Year.

Therefore I’ve watched subsequent AP developments through a lens of lingering resentment. But I’ve also watched with the memory of how I reversed my academic skid: after a few years of desultory employment, I returned to school, graduated college with a double major, and earned a Master’s Degree. I taught Freshman Comp for four years, and had students report that my course was their favorite of their entire college careers.

I say all this so you’ll know who’s speaking when I say: American high schools aren’t equipped to teach Advanced Placement courses. AP began with noble intentions of shepherding advanced youth through college-level general studies courses without wading through tedious prerequisites or paying college-level tuition. But we have abundant evidence that American public schools are beholden to pressures that make AP teaching impractical, if not impossible.

Even assuming younger teens’ brains are sufficiently developed for deep dives into collegiate liberal studies—a premise I doubt—their schools aren’t equipped for such education. Teachers’ credentials are regulated at the state level, creating uneven standards across jurisdictions. Some states may permit intensive study of academic subjects, but to earn a teaching degree, my state requires more coursework in classroom management than in any academic subject.

This isn’t a knock against teachers individually. Rather, like police or landlords, good individuals often come a-cropper against institutions designed to preserve the status quo. As Dana Goldstein writes, American public (state) schools are organized to maximize cost efficiency, not pedagogical efficiency. Even the most dedicated teachers can’t provide intensive education when working short-staffed, underfunded, and with years-old textbooks.

Charlie Kirk

Worse, the DeSantis Administration’s attempt to kneecap African American Studies isn’t the first time states have undermined AP. In 2014, Colorado students staged a mass walkout when school boards tried to rewrite advanced history courses. Rather than teaching history, with all its messy contradictions, authorities wanted to teach patriotism and libertarian economics. The language presaged current anti-CRT rhetoric, which fears that kids might not unquestioningly love America anymore.

College-level education requires academic independence, something the America First crowd abhors. Charlie Kirk, a prominent young nationalist, kick-started his pundit career by complaining about supposed anti-American sentiment in higher education, despite having dropped out of online Bible college. Christopher Rufo almost single-handedly engineered conservative America’s pants-wetting paranoia over Critical Race Theory. They and others propose tighter state controls on education as the solution.

The DeSantis Administration’s rejection of AP African American Studies explicitly cites fears about the subject’s “ambiguity.” High school teachers have long struggled with institutional fears of ambiguity. That’s why social studies teachers inevitably reduce history to lists of dates and vocabulary words, while literature teachers make students memorize plot points, and biology teachers often must include disclaimers leaving room for seven-day creationism. State schools are institutionally designed to keep ambiguity from creeping in.

But any serious academic knows that real scholarship happens in ambiguous spaces. History isn’t just what happened; it’s the great debates about why it happened, and how events continue to influence us today. Great literature always emerges from, but also reacts against, its social environment, and therefore exists in dangerous tension. As scholars like James Loewen and Gerald Graff write, only where ambiguity exists can real learning ever happen.

The College Board generously wants to help advanced students clear academic hurdles in high school. But it ultimately can’t, because high schools face administrative burdens that prevent college-level learning. Again, this isn’t a knock against teachers, who generally enter the field for love of students. But public schools are motivated by political goals, not simple generosity. Schools simply aren’t free.

The solution is to get students out of high school faster, not move college into politically hampered high schools.

Monday, January 23, 2023

Dark Rogers and the Illusions of Power

So I’ve seen this image drifting across social media, as they do, and it got me thinking. (No, surely not you, Nenstiel!) Fans of both Fred Rogers and Star Trek will recognize this thought process. The “Mirror Universe,” a recurring Star Trek thread, depicts an interstellar civilization controlled, not by a United Federation of Planets, but by an Imperium. The Mirror Universe is antidemocratic, violent, and ruled by the strong.

Star Trek fans love the Mirror Universe to the exact extent we generally admire Gene Roddenberry’s underlying humanist ethic. Roddenberry believed the modernist myth that all human history marks an unwavering path from caveman savagery, through an arc of warring tribes and nations, toward an ultimate gleaming future defined by peace, prosperity, and the shedding of divisions. Human societies are always moving from mud-dwelling to utopia.

This involves a sort of civic Calvinism. Okay, Roddenberry was personally atheist, and believed the arc of history would move away from religion and “blind faith.” But in accepting the modernist myth, Roddenberry embraced the Calvinist precept of “Total Human Depravity,” the principle that humans, left to ourselves, are selfish, venal, and angry. John Calvin believed Jesus Christ would redeem this innate venality; Roddenberry trusted in an evolving state.

Fred Rogers, a minister in the Presbyterian church (a Calvinist tradition), had a difficult relationship with Total Human Depravity. He didn’t accept the maxim that humans are innately bad, and that Christianity must purge our sinful desires. But alongside icons of childhood like Roald Dahl and Maurice Sendak, Rogers recognized that children live in long-term states of intrinsic powerlessness. Children lash out, sometimes violently, because they have no other tools available.

Many ex-kids, especially we who preferred to think and build rather than compete and vanquish, have chilling memories of childhood authority. I understand now, as I couldn’t then, how many schoolyard bullies were simply reenacting the power dynamics they learned at home. But that didn’t matter when they’d literally form crowds intended to corner me, shouting and screaming and shoving. The only thing I understood then was my own powerlessness.

Rather than stopping the constant low-grade violence, school authorities encouraged us to mollify the bullies around us. Not even metaphorically, either: my parents literally told me that bullies acted out of their own terror and pain, and they needed somebody willing to embrace them, regardless of their inappropriate behavior. My family literally told me that it was my Christian moral duty to befriend the kids who made me terrified to go to school.

Current sociology suggests early humanity wasn’t violent, despite what Freud, Marx, and Calvin believed. Little evidence of warfare exists before humans became settled. But then agriculture, with its hard limits on space and resources, took over. Just as the Biblical Cain, a farmer, killed his brother Abel, a herdsman, so early civilization invented war, and the scars which war produces. Humans had to learn how to be angry, violent, and resentful.

The meme jokes about Mirror Rogers claiming children are weak. But our Fred Rogers acknowledged that same weakness. The difference is, Fred Rogers didn’t see weakness as something to exploit or punish. Like Jesus, Fred Rogers encouraged children to embrace their weaknesses. He authorized us to care more deeply, to reach out to those suffering, to love without reserve. He didn’t promise it wouldn’t hurt, only that we would emerge.

Star Trek’s Mirror Universe, like its Klingon Empire, depicts a society ruled not by humanist values, but by chest-thumping displays of strength. Because Roddenberry realized, despite his ideals, that some people never outgrow their childhood vulnerability stage. We’ve all worked in jobs where the biggest assholes have the most friends, because just like on the schoolyard, scared peers never stop trying to mollify them.

Roddenberry, an atheist, and Rogers, a Christian, shared an underlying belief that rule by fear seems enormous when we’re in its midst, but can never truly last. Roddenberry believed society itself would reach a stage of development where it shackled our violent impulses, but the shadow self, the dark mirror, would remain. Rogers believed the fight would never truly be won, but which side we chose would define our lives.

Because yes, fundamentally, children are weak. Humans are weak. We spend our lives vulnerable to money, violence, and injustice. We never outgrow our weaknesses; we only reach a stage of development where our weaknesses no longer rule us. Whether we reach that individually or together, we’ll all, hopefully, reach a point where weakness is no longer shameful.

Saturday, January 21, 2023

Making a Law About “Gender”

State Sen. David Clemens (R-ND)

Online journalists, like crows, often get distracted by shiny novelties, and this week’s North Dakota “Don’t Say Trans” bill is no different. Many outrage reporters have seized on State Senator David Clemens’ proposal to compel public institutions to identify residents by their biological sex, not their preferred gender. The desired outrage has appeared, but the bill is a guaranteed nine-days’ wonder. Local reports show the bill has no support beyond Clemens himself.

I care less about this worthless spectacle, and more about the motive behind it. Clemens’ bill aims, for the first time ever, to create legally binding definitions of gendered words. Legally binding definitions give words power, and not just the moral power that language always has. Once words have legal definitions, the state has authority to enforce those definitions, without regard for context. Lawyers love creating legally binding definitions.

Read over any contract you might have lying around: your student loan documents, the lease or mortgage on your home, any nondisclosure agreement you signed with whatever celebrity you last slept with. These contracts will make the same statement multiple times in multiple ways. Because of this excessive wordiness, “lawyerese” is notorious for its obscurity, a form of language that seems purpose-built to obstruct communication.

Except that impenetrable lawyerese serves an important purpose. By requiring you to agree to multiple definitions of the same concept, lawyers minimize your ability to weasel out of contracts. Ambiguous language permits free riders and bad-faith actors to interpret the contract in ways that advantage them, regardless of its intended spirit. You personally might not do that, but it only takes one bad actor.

Consider real-world examples of what happens when language is ambiguous. Religious discussions bog down over minute nuances of Greek and Hebrew terms, and what those words meant to Bronze- or Iron-Age prophets. Christmas songs written around playful vaudeville in-jokes have been condemned as predatory because we no longer remember the context. What seemed clear when it was written down is vague and scary now.

Linguistic ambiguity allows people to find the worst possible interpretations in words that were written with the best intentions—and vice versa. Rhetoricians speak of the 100% Natural Fallacy, a logical failure that creeps in because people put the best possible spin on the word “natural,” which has no agreed-upon definition. When words have no binding definition, naïve optimism and bad-faith cynicism rush in to fill the gap.

Lawmakers like David Clemens attempt to foreclose this ambiguity by giving binding definitions to words we formerly just assumed we knew. I might disagree with Clemens’ long-term goal of making it illegal to be transgender, but I understand his desire to weed out vagueness. After all, in my teaching days, I told students to structure their language to be as free from vagueness as possible. Boldness and specificity always beat uncertainty.

Plato and Aristotle, as painted by Raphael

Except…

Sometimes vagueness is good. We argue about how to interpret Levitical law, not because we lack our ancestors’ moral confidence, but because we aren’t a poor hill-dwelling nation on the margins of Bronze-Age Asia anymore. Even True Believers who insist that Moses and the prophets understood the Truth must acknowledge that how we live that truth has changed. Hebrew law is a foundation, not a prison.

Ambiguity lets bad-faith actors manipulate the system to their advantage. But ambiguity also lets good-faith actors grow and change when faced with new evidence, without having to abandon the foundations upon which they’ve built their lives. Why, it’s almost like there’s no one-size-fits-all moral code that means anything! Almost like we have to face situations as they are, and make decisions here and now, not relying on a dead-tree text.

Senator Clemens assumes that, if we permit English gender definitions to remain subjective, then everyone will misuse those definitions for selfish purposes. Which means that Senator Clemens assumes human nature is necessarily selfish, that humans are manipulative, that deep down, we’re all bad-faith actors. Therefore he wants to lawyer that ambiguity out of existence, reining in what he perceives as human weakness and venality.

I prefer to assume that subjective gender definitions give us, you and me, the freedom to investigate what it means to be a “man,” a “woman,” or neither. It liberates me, a cishet man, to decide how I will express being male in the healthiest and most productive terms, and how I can identify toxic or harmful maleness. Fundamentally, Senator Clemens distrusts us, but I believe humans are good.

Wednesday, January 18, 2023

Some Pessimistic Thoughts On “Plagiarism”

Last summer, when historian Kevin M. Kruse was accused of plagiarism—a serious accusation in academia, and one that submarines careers—I wrote a lengthy examination of the concept. My views on plagiarism have shifted substantially since my “zero tolerance” teaching days, when we were institutionally encouraged to distrust any writing that was too well-written. I’ve come to accept that the workload schools impose on students is simply too heavy to permit serious original writing.

So I congratulated myself for my Gandhi-like enlightenment and walked away from the discussion. Dr. Kruse was eventually cleared by a council of like-minded nabobs, and the issue retreated from public consciousness. Then yesterday, I received a Facebook message from a stranger identified as Emmett Cullinan, admitting he’d submitted one of my reviews for a classroom assignment. He wasn’t even circumspect about it. He acknowledged he hadn’t read the book, simply repurposed my written review.

Suddenly I found myself back in “teacher” headspace, cueing up platitudes about the evils of slapping your name on somebody else’s words. Though I doubt an online scolding will discourage any undergraduate brazen enough to actually tell the writer whose words he’d lifted, I at least went to Cullinan’s Facebook page to find his school. Turns out, Cullinan has set his Facebook profile to private, making him inaccessible to anybody he doesn’t already know. Crafty.

Cullinan’s message initially looked smug, boasting of an act of intellectual dishonesty. But the longer I live with it, the more I suspect there’s something more going on. Yes, this kid (I’m assuming youth and inexperience drive this cocky impudence) acknowledges slapping his name on my words. Considered in a vacuum, these words imply a galling level of self-importance, since he not only performed the plagiarism, but wanted me to know he’d performed it.

But he says this pilfering “really saved my life” and “legitimately saved my grade.” These aren’t the words of somebody rubbing my nose in his dishonesty; they’re the words of somebody scared of falling behind and losing out. Indeed, his use of the word “legitimately” suggests that although he knows his actions are illegitimate, he feels his necessity is more than legitimate, or at least legitimate enough to justify his actions, taken out of desperation.

In other words, the longer I live with Cullinan’s message, the more I realize it was written from a place of inner fear. He writes with an exterior mask of audacity and entitlement, perhaps because he believes his position as a student is precarious enough that he can’t admit terror. Students frequently learn early that professors, whose own positions are frequently shaky, can smell fear. Therefore, with professors and with me, Cullinan masks fear behind arrogance.

College degrees, once a signifier of aristocratic erudition, have become job credentials for most Americans. The time spent getting a degree has become a mandatory buy-in for a middle-class life. Many jobs that once required only an apprenticeship or on-the-job training, such as office management, industrial technology, and law enforcement, now require higher education. Sometimes the requirement is unofficial and simply customary, but some employers and jurisdictions have made it official.

This means that an upwardly mobile life increasingly requires not only college, but frequently a graduate degree, to stand out and move ahead. This means a commitment of years, effort, and debt. Undergraduates and grad students are nominally adults, but see the period of juvenile dependency dragged out for years, sometimes until they’re pushing thirty. It’s even worse for anybody hoping to change careers in adulthood, which can mean more years studying for more credentials.

Emmett Cullinan’s message sounds ballsy, at first. But it reflects childhood fears dragged into adulthood: he’s a grown-ass man, considered culpable if he mishandles a car or a beer, but not yet allowed to take responsibility for his working life. Like kids throughout history, he acts brazen because, perhaps, it’s the only power he has. Far from the unashamed plagiarist I first assumed, I think he’s just scared.

Of course that doesn’t stop me from using his real name, or screenshotting his message. I’m no longer angry enough to pursue consequences, but I’m passive-aggressive enough that, if consequences find him, I won’t cry. Because that, too, is part of higher education. Students learn to have opinions about books they haven’t read, or experiences they haven’t experienced, but they also learn to get called out when they fib. Learn to live with it.

Monday, January 16, 2023

India’s History and the War for the Soul

1001 Films To Watch Before Your Netflix Subscription Dies, Part 48
Santosh Sivan (director), Ashoka

Prince Ashoka has become the most successful general in the Mauryan Empire, a distinction he earned despite, not because of, his royal standing. Because Ashoka is a younger son of a lesser queen in the Emperor’s harem, nobody expects him to inherit, least of all his favored brother Susima. When Susima deliberately refuses to support his brother in battle, Ashoka manages a massive strategic victory, then returns to the capital, intent on vengeance.

World cinema should, ideally, offer ambitious audiences an opportunity to immerse themselves in somebody else’s culture for a few hours. Unfortunately, Hollywood’s carcinogenic influence has undercut that recently; filmmakers must appeal to English-speaking audiences to make bank. This Hindi-language movie therefore makes an interesting contradiction. It embraces the full vaudeville cheese inherent in Bollywood masterpieces, while striving to tell an important story of historical and cultural significance.

Despite his military proficiency, Ashoka proves less capable at palace intrigue. His initial plans for vengeance against Susima and his other brothers fail, and he narrowly survives an attempted assassination. At his mother’s insistence, Ashoka flees the palace, posing as a commoner and sleeping rough. This experience teaches Ashoka important lessons in humility, but it also gives him a long-overdue opportunity for love when he meets Kaurwaki, exiled princess of Kalinga.

Shahrukh Khan, India’s biggest matinee idol, plays Ashoka in a manner Western audiences might find jarring. One moment, he has smoldering, Brad Pitt-like charisma in an understated performance, stone-faced and impassive, the character happening entirely in his eyes. The next moment, he turns into a caricature, chewing scenery with the aplomb of Gary Oldman. No matter his tone, he always carries a sure and placid confidence in his star power.

These tonal shifts reflect the Bollywood culture that birthed this movie. Bollywood has certain conventions. For instance, every movie includes five tightly choreographed song-and-dance routines. Four routines directly advance or comment on the plot; the fifth is pure lowbrow spectacle. Americanized audiences unfamiliar with these conventions may feel back-footed when the prince begins singing and dancing for the first time. But that confusion is half the fun.

Ashoka is an important figure in Indian history. He pushed the Mauryan Empire to its greatest geographical expanse, and he sponsored massive artistic and public-works projects. Many of his surviving artworks are among India’s national treasures, and have weathered 2,300 years remarkably intact. But at his empire’s peak, he converted to Buddhism, forswore violence, and rededicated his empire to helping India’s most defenseless peoples. History doesn’t exactly record why.

Kareena Kapoor as Princess Kaurwaki and Shahrukh Khan as Prince Ashoka

This movie speculates on the forces leading to Ashoka’s conversion. The resulting mix is both personal and national, both contemporary and historical. Ashoka’s life among the poor and destitute reflects the Buddha’s own mythological journey outside the palace walls. But his personal romance with a foreign princess reflects important modern concerns, that while Ashoka was a product of his times, he also rejected those times for deeply personal reasons.

Santosh Sivan directs this picture in ways that reflect Ashoka’s dualism. He designs his shots with a Peter Jackson-like simplicity that makes the Iron Age setting come alive. The Mauryan palace has timber frames and beaten metal ornaments that bespeak both poverty and ambition. Important character moments happen while Ashoka hides out in windswept caverns and candlelit temples. Shadows cut deep across his face as he chews up his enemies.

And chew them up he does. Sivan recreates military conquest in images that would make Cecil B. DeMille envious. The movie cuts from conversations inside stone-walled taverns to massive cavalry charges as quickly and effortlessly as Ashoka’s military lifestyle requires. Ashoka’s relationship with his bodyguard Virat begins with slapstick that would make American directors flinch, and concludes in truly heartbreaking tragedy.

The contrast of tones, not only within the movie but within principal characters from scene to scene, creates a jarring disjunction that English-speaking audiences might find uncomfortable. Sivan includes broad physical comedy in a tragic film, and religious rumination in a war epic. Western audiences aren’t accustomed to such juxtaposition. This film dropped in 2001, about the time American TV and movies shifted to whispered dialog and solemn, unsmiling faces.

However, that very juxtaposition bolsters this movie’s themes. Sure, Ashoka lived around the same time as Alexander the Great, and we’d consider him ancient. But the concerns that forced him to reject empire and embrace transcendence aren’t only located in the past. Ashoka laughs, gets drunk, and adores his mother; he also gives rein to murderous rages and destroys entire nations. Because ultimately, so do we.

Friday, January 13, 2023

Pie in the Sky on TV's “Firefly”

Promotional cast photo of Firefly

“Burn the land and boil the sea, you can’t take the sky from me.” Bluesman Sonny Rhodes’ theme-tune performance nailed the tone of TV’s Firefly. The nine crewmembers of the bucket freighter Serenity have, for individual reasons, abandoned life on terra firma and elected to live in the vacuum between planets. Over fourteen episodes and one feature film, those individual reasons dribble out slowly, all tracing back to various forms of injustice.

American theologian Howard Thurman insists we understand Jesus best by understanding historical context. Jesus’ message of humble resistance makes most sense amid Roman occupation and Jewish subjection. Before Jesus, Jews had three responses. “What must be the attitude toward the rulers,” Thurman asks, “the controllers of political, social, and economic life?” Some, like the Sadducees, adopted Greco-Roman styles, while others, like the nonviolent Pharisees and violent Zealots, doubled down on Jewish identity.

Thurman’s third option is simply absenting oneself from society. This was the option favored by the Essenes at Qumran. Rather than submit to Rome’s bootheel, the Essenes fled Judea altogether, tended flocks among the caves, and refused to participate in society. This allowed the Essenes to maintain Levitical purity by simply not having their practices challenged. The empire can’t threaten us if we simply walk away from the empire.

This third option seems appealing when the empire’s yoke is intolerable, but also inescapable. Religions in particular, but also certain philosophies, urge True Believers to leave the empire. From ancient Essenes and Benedictines to modern hippies and Maharishi followers, morally pure people often yearn to rebuild society apart from the empire. The simple fact that these communities seldom outlive their founders doesn’t dissuade True Believers from trying again.

Nathan Fillion as Captain Malcolm Reynolds

Captain Mal, commanding the Serenity, takes this impulse to an uncommonly literal level. Saint Benedict promised believers they could, by following simple rules, reject the empire and live in the Heavenly City now. Captain Mal literally lives in the sky. Because his empire, yclept the Alliance, controls every planet in the ’Verse, he simply rejects the empire, leaves the planets, and—despite being outspokenly atheist—tries to go to Heaven.

I’m reminded of Swedish-American labor activist Joe Hill, who wrote folksongs to unify the labor movement. Among his most famous, his song “The Preacher and the Slave” coined the phrase “Pie in the Sky When You Die” to mock conservative Christians’ belief that passive compliance with unjust laws guaranteed a beneficent afterlife. Going to Heaven, in Hill’s view, seemingly meant selling out to unjust authorities here on Earth.

Both Captain Mal and his creator, Joss Whedon, would seemingly agree there. Whedon, like Mal, has nothing generous to say about Christianity. Yet throughout his works, characters continue believing in capital-T Truth, which is found through first leaving human society, then returning to it. Many Joss Whedon narratives, like Buffy or The Avengers, have a contemporary setting that, while altered, reflects our world. Firefly uniquely leaves Earth altogether.

This lets the protagonists do something seen frequently in other Whedon properties: go to Heaven now, before they’re dead. Captain Mal makes explicit something only lurking tacitly behind Buffy or Captain America. Whether fighting the Alliance, vampires, or Chitauri, Whedon’s protagonists must first leave this plane, often multiple times, then return armed to fight this world’s battles.

Joss Whedon

(As an aside, Captain Mal provides a perturbing metaphor. Whedon modeled Mal on ex-Confederates who refused to surrender, like Frank and Jesse James, and headed West as outlaws. This seems like Lost Cause apologia, and former fans have turned against Firefly for it. But the episode “Shindig” specifies that the victorious Alliance, not the defeated Independents, practices slavery. Historically, this makes sense. As James McPherson writes, the Confederacy lost the war, but arguably won the peace.)

Unfortunately, fleeing the world never works. The world eventually barges in. Throughout history, peoples have used the cloak of religion to seek solace in isolated locations like Qumran, Monte Cassino, or Canyon de Chelly. The empire eventually overran and destroyed all these locations. Each one is now a national park or historical center, demonstrating the ultimate fate of those who flee: their lives are frozen in time, as artifacts.

Like the Essenes at Qumran, or the Ghost Dancers at Wounded Knee, the Serenity crew attempts to disavow participation in the empire. But the empire keeps dragging them back in. That’s because, as Joe Hill realized, Heaven isn’t a place to camp during this life. Capital-T Truth will only be found by facing the empire on the ground. Because eventually, they will take the sky from you.

Wednesday, January 11, 2023

Your Brain Is Part Of Your Body, Sort Of

Amy Cuddy, Presence: Bringing Your Boldest Self To Your Biggest Challenges

Why do some people seemingly walk into a room and immediately own it? Charismatic entrepreneurs, tech maestros, and artists wield this magic, while we mortals watch them with awe. Harvard psychologist Amy Cuddy admits multiple variables probably influence how we manifest what she calls “presence,” not all of which are portable. But Cuddy suggests we have one powerful variable under our control: how we stand.

Reading this book’s description, I expected something like a Dale Carnegie course. The dust-flap copy implied an introduction to “the power of presence,” the tweaks of attitude and comportment that we collectively identify as charisma. We who, for whatever reason, never quite learned society’s unwritten rules in adolescence, frequently look for streamlined guidance for how to handle fraught situations, like public speaking, job interviews, and first dates.

Instead, Cuddy advocates what she calls “power posing.” This means holding your body upright, back straight, chin out, feet spread. Take up space, Cuddy advocates, and your mind will become expansive enough for even the most threatening spaces. Though Cuddy spends time on preliminary concepts like the nature of Imposter Syndrome or the importance of honesty, her book’s heart is invested in the idea that if you pose big, your brain will follow.

I’m reminded of Jordan Peterson’s first Rule for Life: “Stand up straight with your shoulders back.” Like Peterson, Cuddy believes this releases beneficial neurochemicals, and justifies this belief by citing wild animals peacocking in their natural environment. At least Cuddy cites higher primates, not Peterson’s seriocomic lobsters. But both psychologists elide one important question: are humans in society essentially similar to wild animals?

Animal social structures are essentially flat. If one chimpanzee leaves the colony, it leaves every aspect of colony life. Human lives are more dynamic, and encompass work, school, family, volunteer groups, sports fandoms, hobby organizations, religious congregations, and neighborhood taverns. Having power in one environment doesn’t necessarily transfer elsewhere. Moreover, I may leave one workplace, yet still safely grab a beer with my former coworkers.

Dr. Amy Cuddy demonstrating the “Wonder Woman pose”

Cuddy even admits her advice is deeply conditional. Power postures, which she claims empower our preconscious mannerisms, come across as deeply phony if we actually perform them in public. People seriously pulling what Cuddy calls the “Wonder Woman pose” mainly appear off-putting in group settings. That’s why Cuddy cites the behaviors great apes pull in colony environments, but recommends we mimic them in private spaces: because humans aren’t wild animals.

Further, Cuddy admits “power poses” are culturally conditioned. Western power poses, she acknowledges, don’t work in most Asian societies. Though Cuddy cites her own and others’ research to demonstrate posing’s efficacy, her subjects are mostly college students, a core sample critics call WEIRD: Western, Educated, Industrialized, Rich, and Democratic. I’d question whether these poses work in poor or rural environments, where the person who stands out most usually gets knocked down first.

Please don’t misunderstand. We all recognize the correlation between bad posture and bad attitude. Cuddy herself mentions the hunched posture of the overworked cube farmer, or the increasingly common neck problems faced by teenagers who spend hours bent over their phones. These postures put lateral stress on our spines, structural tension our bodies aren’t optimized to handle. Humans evolved in environments where standing straight meant better access to food.

However, that’s correlation; Cuddy treats these postures as causes. It’s equally plausible to say “submissive” postures are survival mechanisms. The hunching seen in cube farmers cowed by the workplace tragically resembles the “don’t hit me” posture demonstrated by survivors of childhood abuse. People make themselves small to avoid making themselves targets. Maybe that behavior is maladaptive in adulthood, but it persists because, at one time, it worked.

I don’t want to imply that Cuddy’s conclusions are wrong. I lack sufficient knowledge and standing to render such a verdict. However, reading her text, it’s impossible to avoid noting that Cuddy’s evidence, while couched in scientific terminology, relies heavily on self-reporting. I also balk at Cuddy’s dependence on testimony, since it’s easy to slant outcomes according to which users bother to write back. Disappointed end-users seldom write fan letters.

We’ve probably all stared into a mirror, holding ourselves erect and practicing our breathing and eye contact. Feels empowering, doesn’t it? But we quickly realize the benefits of peacocking are transitory in hostile environments. I don’t want to flippantly dismiss Cuddy’s precepts, which probably help the right people in the right conditions. But her presentation is too sweeping, and doesn’t leave room for people struggling to survive adverse or dangerous circumstances.

Monday, January 9, 2023

Jesus in the Living Empire

1001 Books to Read Before Your Kindle Battery Dies, Part 113
Howard Thurman, Jesus and the Disinherited

Dedicated Christians believe Jesus Christ’s message holds true in all times and all locations. But Jesus himself existed in a specific place and time: Galilee and Judea, in the narrow window between Roman conquest, and Rome’s expulsion of Jews from their homeland. Jesus spoke to a powerless and occupied nation, delivering a message emphasizing how to live when society and empire wouldn’t permit painless living. Jesus’ original audience understood this.

Dr. Howard Thurman began life in segregated America, raised by a grandmother born into slavery. Concepts of empire and occupation weren’t metaphorical to him. Therefore, he read Jesus’ teachings as approaches to living in a nation that wrote inequity into its laws, and maintaining one’s dignity and creativity in adverse conditions. Though perhaps less well-known than peers like Dr. King or Fannie Lou Hamer, his insights are equally relevant today.

Thurman pastored America’s first intentionally multiracial congregation, and later became America’s first Black dean of a majority-White seminary. In the wake of World War II, he published an article comparing Jesus’ historical context with the then-current conditions of Black Americans, a comparison that seems obvious now, but was probably scandalous then. This short (barely 100 pages) book emerged from that article and the discussions surrounding it.

In this book, Thurman breaks down the common, intuitive ways occupied peoples in conquering empires handle their occupation. Though the responses often take nuanced form in response to specific situations, Thurman organizes them into three categories: Fear, Deception, and Hate. These categories correspond to mainline Hebrew responses to Roman violence, though they’re not uniquely Hebrew, nor are they necessary to Jewish identity. They’re just how ordinary Jews handled the situation.

Against these three categories, Thurman describes Jesus’ prescription: Love. This seems counterintuitive. The opposite of Fear is Courage, isn’t it? Not so, according to Thurman. Courage is respected in powerful people, but in conquered populations, it makes one a target. Instead, Thurman proposes Jesus’ alternative, a love that cannot be broken. Those who have humility cannot be humiliated. Those who love their enemies don’t carry hatred’s frequently toxic weight.

Howard Thurman

Throughout, Thurman alternates between Jesus’ historical context and Thurman’s own times. His examination of Jesus’ life and times isn’t an abstracted sociological experiment. Rather, Thurman published in 1949, as American wars in Germany, Japan, and later Korea inflamed nationalist passions. As Thurman notes, racism frequently increases in America whenever official outlets gin up “patriotic” sentiment.

This insight isn’t original to Thurman. Historians like Greg Grandin have noted that, whenever American soldiers return from overseas wars, the homeland almost immediately sees increases in racially motivated violence. America’s commitment to World War II and its bastard offspring, the Korean War, segued directly into the racialized violence that motivated Dr. King, Malcolm X, and Kwame Ture. This book precedes the formalized “Civil Rights Movement,” but is continuous with its social conditions.

Thurman, a pastor first and therefore a veteran author of sermons, reinforces his exegesis with sermonic illustrations. He describes a sojourn in India, for instance, where he shared coffee and insights with a certain unnamed Hindu leader. Thurman elides any identifying details, but this leader may be Mahatma Gandhi, whom Thurman met and with whom he maintained a lengthy correspondence. Gandhi’s activism contributed directly to American Civil Rights, and Thurman was one important point of contact.

I don’t make the sermonic analogy flippantly. According to Thurman’s preface, much content within this short book began life as a series of lectures he delivered on multiple occasions, refining and clarifying his insights with each telling. His prose is thematically dense, but not impenetrable, and he writes without scholarly reliance on frequent source citations. His tone, rather, is that of a beloved teacher expounding important points you’ll need sooner rather than later.

This title’s current Beacon Press edition includes a foreword from historian and activist Vincent Harding. Dr. Thurman, like Jesus, addressed his teachings to a specific audience, which isn’t us. Professor Harding situates Thurman’s writing in his historical context, with the personalities and situations that Thurman’s original audience would’ve simply understood. Sadly, though our world continues changing, the underlying problems plaguing it don’t always change.

Yes, the world has changed since 1949, and American Christianity with it. But many problems remain fundamentally similar. The concerns of Black Christians which Thurman describes are now understood to extend likewise to Hispanic, Native American, and LGBTQIA+ Americans who see themselves as part of the Christian fold. They too live, as Thurman puts it, with their backs against the wall. And they too have much to teach their fellow Christians.

Friday, January 6, 2023

Who Needs the Speaker of the House?

Kevin McCarthy

By the time you read this, you’ll already know something I don’t as I write: whether the United States has a sitting Speaker of the House. At this writing, Kevin McCarthy has narrowly failed six ballots, making this the first Speaker election in a century to require multiple ballots. Because the Republicans hold a razor-thin majority, the usual procedural workarounds used to prevent intraparty entanglements won’t resolve the problem.

However, that razor-thin majority precipitates an important question: does the United States need a Speaker of the House? Does the role, as it currently exists, serve the House’s legislative role, as defined by the Constitution? Or does it impede debate, preventing testing of important ideas and the process of coalition-building? Would America be better served by dismantling the Speaker’s authority into an executive committee holding dispersed authority?

The fact that I’m asking these questions probably telegraphs my preferred answer.

The Speaker position’s existence is specified in the Constitution (Article One, Section 2), but its responsibilities are not. Like much in America’s Constitution, the actual execution remains vague. Nothing actually mandates the Speakership’s current authority to not only start, but also prevent, legislative debate. The Speaker’s control of the legislative agenda derives from written rules and unwritten traditions—any of which could be changed.

Previous Speakers, like Sam Rayburn or Joseph Cannon, exercised significant power. But the Speakership’s current authority derives from Newt Gingrich, who explicitly saw the purpose of politics as war. His goal, in exercising authority, was not to serve the common good, but to destroy his opponents. He consolidated decision-making authority specifically to ensure only bills favorable to partisan causes even made the floor; any other debate got spiked.

Nancy Pelosi

We’ve seen still more power accrue to the Speakership. Officially, the Speaker provides order and governs organizational procedure, but in practice, that lacks definition. The “Hastert Rule,” not really a rule but a practice introduced by Dennis Hastert, gives the Speaker the power to prevent voting, or even debate, on bills which the Speaker’s party won’t support. Equally important, the Speaker dispenses committee appointments, determining which bills even advance.

This matters because, under the Presidential Succession Act of 1947, currently in force, the Speaker is second in line for the Presidency. By concentrating powerful authority in the Speakership, the procedure essentially creates a second national executive—then gives that person specific motivation to undermine the existing Executive Branch. Under a divided government, that provides remarkable incentive for individual corruption, as Gingrich demonstrated.

Nothing about the Speakership requires this level of executive authority. Unlike the President, who must often act swiftly and unilaterally to preserve national interests in a volatile world climate, the Speaker seldom faces such urgency. The issues which the House handles are long-term and national in scope, deserving test through debate. Though “compromise” and “horse-trading” have become dirty words lately, they describe how we build legislation that works.

Whatever political party doesn’t hold the White House usually gains power in midterm elections. This means the Speaker develops massive power over national priorities. Which would sound great if Speakers held actual policy positions, but few do. Since Gingrich, the bipartisan procedure has been to use the Legislature to break everything, preventing solutions to growing problems. Global warming, economic inequality, and an unaccountable military go unaddressed as Speakers jockey for partisan advantage.

Newt Gingrich

Living in the shadow of Newt Gingrich and Nancy Pelosi, the Speakership has become a barrier to governance. Bills don’t receive nuanced debate in full chambers; indeed, the full House seldom meets unless a vote is scheduled. The House serves at the Speaker’s discretion, not the people’s. An entire second presidency has grown, tumor-like, on the Legislative Branch’s rump, one which isn’t popularly elected and doesn’t serve all Americans’ interests equally.

Arguably, the House needs the Speakership to organize routine business. But as the office has accumulated extra-Constitutional power, it has served to resurrect the exact conditions that doomed the Articles of Confederation: in-group loyalties, one Congress passing legislation to overrule the last Congress, and institutional whimsy. The entire office has become antithetical to clean governance and a sense of responsibility to the people.

While left-wing pundits whip themselves into a tizzy over the supposed injustices of the Senate or the Electoral College, the Speaker of the House has become the enemy of small-d democracy. The bitter, and frankly stupid, fight currently unfolding over the Speakership demonstrates this. The ballot remains unresolved because partisans are fighting over who deserves that level of power. Sadly, the natural answer seems to be: nobody.

Edit for historical context: Representative Kevin McCarthy (R-California) eventually won the election. It took four days and fifteen ballots, the longest lag in electing a Speaker since before the U.S. Civil War.