Tuesday, October 31, 2023

The New Dark Age of the Fake Self

Buffy Sainte-Marie

News broke last week that multiple award-winning folksinger Buffy Sainte-Marie isn’t Native American, as she’s claimed for sixty years. Since breaking into the Greenwich Village folk scene alongside Bob Dylan and Pete Seeger, Sainte-Marie has built her public persona on a claim of Canadian birth, into the Piapot band of the Cree nation, followed, she said, by adoption by a Massachusetts couple. New documents indicate she was born White, to an Anglo-Italian family outside Boston.

For generations, Sainte-Marie’s public career has focused heavily on breaking boundaries. She was supposedly the first Native American to win an Oscar, and she has won Grammy and Juno awards in Indigenous music categories. If the current accusations stand, she received accolades intended for legitimately marginalized persons. It means she’s participated in expropriating other groups’ stories, reselling them at a profit, and claiming the honors accruing thereunto. It makes her, morally, a common thief.

This is especially vexing in the music industry, where we expect professionals to speak from experience. “Authenticity” is the watchword among pop singers, poets, and dramatists. Nobody expects this of, say, actors; when Gillian Anderson played Margaret Thatcher on The Crown, nobody bellyached because she wasn’t really a Prime Minister. But whenever Kid Rock rehashes his “plain folks” redneck persona, somebody hastily reminds us that he grew up rich in Michigan, just cosplaying working-class roots.

Recently, the internet’s population of full-time professional offense-takers has become particularly militant about policing racial boundaries. Rachel Dolezal set the pattern eight years ago, but she’s gotten lost amidst the tide. More recently, netizens have worked themselves into a tizzy over whether BLM firebrand Shaun King might be secretly White, or whether MAGA glamour queen Kari Lake might be concealing Black parentage. Whatever the truth, we clearly expect politicians, like pop stars, to be authentic.

Yours truly, right, as an angel (there’s a stretch) with Jeff Ensz, left, as George Bailey
in the 2017 Kearney Community Theatre production of It's a Wonderful Life

The demand for authenticity sometimes produces strange effects. Actor Jussie Smollett’s case is almost comically extreme, but representative. Though both Black and gay, Smollett comes from a middle-class background and a relatively low-friction life; he evidently believed he needed some categorical oppression in his backstory. So he paid two brothers to fabricate a hate crime. While most participants in the “oppression Olympics” don’t go nearly so far, lies and street theatre are depressingly common.

But simultaneously, I fear that “authenticity” creates silos that restrict our ability to participate in culture creation. As a trained actor myself, I can’t entirely manufacture my stage identity. I can’t, for instance, use blackface. Robert Downey Jr. in Tropic Thunder will probably be the last mainstream actor in my lifetime to use blackface on camera. Even then, he played a satire, a character too lacking in self-awareness to realize the harm in his actions.

I have, however, played characters dissimilar to myself. Onstage, I’ve been Jewish, gay, a veteran, and dead—none of which I’ve ever been offstage. I’ve affected Southern, Bostonian, Yiddish, and British accents. Okay, I’ve never discolored my skin to affect another racial identity, but I’ve adopted other external signs of groups I don’t belong to. Exactly how far from my “authentic” identity can I stray before I stop being an actor, and become something harmful?

A friend suggested Buffy Sainte-Marie could take a 23andMe test to determine her genetic quotient. Leaving aside that she’s already done so, and found nothing, the suggestion testifies to the prevalence of pseudoscience in racial debates. As Richard J. Perry notes, these tests promise results they can’t possibly deliver. They only identify genetic markers that exist in statistically significant concentrations in certain geographic areas. Race isn’t genetic; it’s social. Did anyone treat Sainte-Marie as Indigenous growing up?

Please don’t misunderstand. I recognize why it’s important for public figures not to misrepresent themselves when speaking for their (putative) heritage. To cite one egregious example, political operative and notorious bigot Asa Earl Carter fled Alabama, moved west, and rechristened himself Forrest Carter, a supposed Cherokee. His fictional memoir The Education of Little Tree has actively preserved harmful stereotypes about Native Americans. Jamake Highwater and Hyemeyohsts Storm have similarly falsified their heritage and kept stereotypes alive.

However, ours is a time of shifting standards; formerly acceptable modes of imitation have become verboten, and the new boundaries haven’t solidified. While Buffy Sainte-Marie has her defenders, most reasonable people will agree she crossed a line, pretending to be something she isn’t. But most artists pretend, at least occasionally, and some forms of pretending are more acceptable than others. Some of us will certainly choose wrong; others will choose correctly, only to be judged wrong later.

Friday, October 27, 2023

Barbie, Disability, and the Death of Formal Rhetoric

Ben Shapiro expresses his well-thought-out opinion in a totally reasonable manner.

Aristotle defined rhetoric as “the capacity to discover the possible means of persuasion concerning any subject.” Pointy-headed and abstruse, yes, but a reasonably concise description of what I tried to teach in Freshman Composition. When structuring our language around contentious issues or painful controversies, we must think in terms of what will persuade our intended audience. That standard is often subjective, and moves almost whimsically.

Two recent events have re-centered this difficult process for me. I watched an unpleasant online dispute quickly spiral out of control. A disabled person noticed that a friend’s anecdote about helping a disabled stranger contained certain ableist prejudices. The story was well-intentioned, but fit a genre of short narrative sometimes disparaged as “inspiration porn”: stories that mean well, but misfire by showcasing able-bodied generosity over disabled autonomy.

As sometimes happens when disadvantaged persons ask for consideration, some observers saw this criticism as a personal attack. Like The Former President, who saw kneeling football players as disloyal “sons of bitches,” the OP’s friends closed ranks defensively, lambasting the critic for “attacking” their friend and “ripping him to shreds.” The defensive posture became so energized that they persisted even after the OP cautioned them to back off and cool down.

When Ben Shapiro, the massively online full-time professional offense-taker, protested Greta Gerwig’s Barbie movie this summer by lighting two Barbie dolls on fire, he apparently thought he was making a serious point. The entire internet, however, responded with aggressive disdain. Shapiro evidently thought this fire was an appropriate synecdoche for his internet-friendly outrage. But even his staunchest allies offered little support for a petulant boy destroying his sister’s toys.

Shapiro makes most of his living doing personal appearances on college campuses, engaging undergraduates in “debate.” He organizes his public persona around the motto “Facts Don’t Care About Your Feelings.” Superficially, Shapiro seems to advocate Aristotelian rhetoric as persuasion through evidence. Yet the Barbie incident demonstrates something Shapiro’s critics have long noted: he cares only about winning, usually by personally demolishing anyone who disagrees with him.

Proof of the standing stereotype
of what constitutes a disability

The disability debate pushed me into an awkward position. Both the OP and his critic are friends, whom I respect dearly. I struggled to triangulate a position where I supported both while clarifying that I considered the criticisms justified. This meant finding ways to say “you’re wrong” without making the statement personal, and managing the feelings of defenders whose emotions already ran high. Therefore my participation mainly consisted of overthinking and extended paralysis.

Rereading the debate afterward, I noticed something I missed in the moment: the critic and the defenders kept talking past one another. The critic offered copious evidence, including cited sources and hyperlinks. The defenders hand-waved all the evidence, focusing on the perceived personal slight in the original callout. Because the critic intended no personal slight, she never addressed it. Thus both sides’ core concerns went unanswered.

When Ben Shapiro mistakes destroying toys for pitching an argument, the core problem probably resides in who he thinks his audience is. Shapiro has garnered acclaim by performing stunts designed to embarrass progressives and dissidents. Such displays help unify his hard-right audience and create a base primed to listen (and, importantly, to buy his advertisers’ sketchy products). But they’re more likely to alienate anyone who doesn’t already agree.

In other words, Ben Shapiro, his Daily Wire media company, and similar massively online conservative outlets like Daily Caller and The Blaze create loyalty to an ideological brand. As I’ve noted before, these outlets generate an almost religious sense of unity. Sure, the ideological sense of aggrieved White masculinity coaxes new converts through the door. But once inside, the politics generally matter less than the sense that we’re traveling together.

That, I realized (with some pain), happened with the disability debate. While the critic attempted to structure a formal argument supported with evidence, the OP’s defenders formed a perimeter around group loyalty. Rereading the previous sentence, I realize it sounds pejorative. Not so; when disadvantaged groups face systemic challenges, group membership enables them to organize and support one another.

Don’t misunderstand me: I’m not here to call out or condemn anyone individually. Rather, to return to classical rhetoric, I believe the two groups had “mixed stases,” that is, they were having two different arguments. But that’s become the problem with online ideology. Too often, we care more about defending the group than seeking the truth; that goes double for us White able-bodied cishet males. The group becomes paramount; the truth gets lost.

Monday, October 23, 2023

...and the Dogs Licked His Sores

The Rich Man and Lazarus (undated Orthodox icon)

Two friends, in two completely different circumstances, have quoted Jesus’ parable of the Rich Man and Lazarus (Luke 16:19-31) at me recently. This forces me to turn inward and contemplate one of Christianity’s more opaque passages. This narrative, which only appears in one Gospel and has no explanatory text, doesn’t easily admit one-to-one interpretation. It reads more like modern literature than religious text, meant more to disquiet readers than enlighten souls.

The rich man—nameless in the text, but christened Dives in the English theological tradition—lives comfortably inside a fortress-like enclosure, never having to see the squalor around him. Jesus contrasts Dives’ comfort with Lazarus’ suffering. Not only is Lazarus hungry, he is “covered in sores,” presumably leprous, which makes him ritually unclean under Jewish law. He’s so unclean, in fact, that “dogs came and licked his sores.” Dogs, under Abrahamic law, are filthy scavengers.

In other words, Jesus isn’t simply contrasting rich and poor here. He contrasts a man so rich he never needs to touch the ground with a man so poor he can’t get off the ground. One lives so richly rewarded by this world’s standards that he never needs to participate in this world, and the other must plead for crumbs so he doesn’t leave this world prematurely. Despite this, they’re separated by one thin door, and when the time comes, they seemingly die simultaneously.

Read from a conventionally middle-class moral position, the story seemingly culminates in rewards for earthly suffering, and punishment for earthly indifference. That’s how my childhood preachers interpreted this story. Lazarus ascends to heaven, not because the poor are particularly righteous or deserving, or because pain creates holiness, but because God favors the poor, and casts away those who don’t use God’s earthly blessings to assist the needy. Seemingly straightforward.

Except that, having recently become conscious of widespread systemic ableism in Western society, I find the story has unwelcome creases. Jesus doesn’t only make Lazarus poor; he also makes Lazarus sick, so thoroughly sick that he’s vulnerable to scavenging bottom-feeders. Yet Jesus seemingly offers the ultimate counsel of patience. Wait your turn, he seems to say; in the afterlife, God will comfort the afflicted, and consign everyone else to Hades. Just shut up and wait.

The Rich Man and Lazarus
(Sir John Everett Millais, 1864)

In practice, that’s how Christian theologians have utilized this story. Shut up and remain patient, countless bourgeois ministers have asserted, and eventually, God will reward your patient suffering. Those same ministers counseled Dives’ luxurious descendants to gift charity from their surplus, an attitude that still survives whenever you hear anybody say, “charity belongs to the church, not the government.” But an economy based on resource hoarding never gets truly fixed.

Because of Lazarus’ leprous sores, lepers’ hospitals in England were termed “lazar-houses.” Mostly run by religious charities, these hospitals let professional Christians scoop lepers off the street, so they wouldn’t beg at the gate. But they walled lepers inside massive, fortress-like sanatoriums. Where Dives lived inside his palace, and Lazarus begged on the street, wealthy Victorians walked the street unmolested, while lepers lived inside their lazarettos. The upshot remained unchanged: Get them out of sight!

These aren’t historic oddities. Nancy Isenberg documents the degrading language America’s founders, including Thomas Jefferson, used to describe chronically impoverished and ill White people. America’s last “Ugly Laws,” which saw people jailed for being poor, disfigured, or unpleasant to look at, weren’t repealed until 1975. P.T. Barnum built his entertainment empire by displaying disfigured, congenitally disabled, and otherwise outcast persons in his “freak show.” Barnum’s “freaks” accepted this treatment because, otherwise, they had no means to earn a living.

Conventional bourgeois interpretations of the parable focus on Dives’ earthly prosperity, and everything he loses. Middle-class pastors warn parishioners to remember the poor, and to provide sustenance from their worldly excess. Perhaps this interpretation isn’t unfair, since Jesus himself focuses his greatest word count on Dives, and everything Dives lost. But Jesus himself isn’t neutral. Jesus does something disability rights advocates deplore: he reduces Lazarus to a mute prop, with no agency, in his own story.

Jesus undoubtedly intended this story for rich audiences. Luke’s Gospel is uniquely blunt in scolding the wealthy: where Matthew’s Beatitudes promise blessings to the poor, Luke’s parallel passage promises punishments to the rich. These warnings remain as relevant today as when Jesus spoke them. Yet in targeting the rich, Luke’s Jesus strips agency from the poor. Jesus makes Lazarus an object of rich people’s pity, and as disability advocates know, pity is one step removed from contempt.

Tuesday, October 17, 2023

The House of Usher and the Fall of Capitalism

Promo photo from Netflix’s The Fall of the House of Usher. (the pic is deliberately off-center)

Near the climax of Netflix’s new adaptation of The Fall of the House of Usher, Madeline Usher (Mary McDonnell) delivers a monologue almost directly at the camera. Half-mad with grief, Madeline rants about the American condition, taking in almost everything synoptically: runaway consumerism, prescription drug abuse, wage stagnation, the Dobbs decision. Then she asks her moon-eyed brother Roderick (Bruce Greenwood): are we culpable for this? Or do we just provide services which market forces demand?

The “we” in this question is the rich. In Poe’s original short story, the Usher family’s wealth is poorly defined, the inheritance of a waning feudal aristocracy, and Madeline barely speaks. But in this adaptation, the Usher family’s wealth is very specific: they own Fortunato Pharmaceuticals, which aggressively markets an opioid drug to pain sufferers worldwide. This unsubtle dig at the Sackler family, owners of Purdue Pharma, which manufactures OxyContin, is just one of the series’ anti-capitalist themes.

First, the overview. This is filmmaker Mike Flanagan’s fifth Netflix limited series, and the third adapting American horror properties, after The Haunting of Hill House and The Turn of the Screw (as The Haunting of Bly Manor). Like the previous two, this one samples liberally from the source author’s collected works, adapting the spirit of the text rather than the words. It’s beautifully paced, somber, and elegiac, but also gory enough for seasoned horror fans.

Flanagan’s previous adaptations have frequently commented on contemporary society. Midnight Mass called out religious intolerance, for instance, and the ways that outward piety invites moral rot into communities. Usher, however, is distinct not only for making one-to-one criticisms of contemporary issues, but for calling those issues out by name. In later episodes, several characters deliver monologues like Madeline’s, sounding suspiciously like courtroom closing arguments. Fitting, since the frame story is Roderick Usher’s “confession.”

The Usher twins have achieved earthly power by selling consumers what they think they want. Early on, in flashback, young Roderick (Zach Gilford) pitches the motivating opioid drug, Ligadone, by promising “a world without pain.” It’s unclear whether he knows, making that pitch, that his drug doesn’t cure pain, only defers it until later. By later episodes, he clearly does know. But he tells regulators and corporate execs what they want to hear.

Carla Gugino as Verna, a mysterious presence hovering over the house of Usher

Roderick’s six children represent different forms of vice. None grew up with Roderick as primary caregiver, and he bought their affection by showering them with money, and the appurtenances money can buy. Therein lies an important difference between Poe’s world and Flanagan’s. In Poe’s “The Masque of the Red Death,” fat aristocrats throw their party to flee a plague-ravaged world. Flanagan’s Prospero Usher throws the same party, but flees a world of excess, not lack.

Of Roderick’s six children, three explicitly abuse drugs, and two others indulge bizarre, distorted sexual appetites. Only Victorine, a surgeon, lacks conventional vices; but she’s a workaholic. Just because an addiction isn’t illegal doesn’t mean it isn’t harmful. As addiction specialist Dr. Gabor Maté writes, addictions are almost always manifestations of the abuse and neglect we’ve suffered, either as children or adults. The Ushers got rich selling substance abuse, while indulging covertly themselves.

In other words, the Ushers endure the same capitalist “win or die” mentality they’ve exploited to harm others. They simply game the system sufficiently to defer their suffering onto the future. Money grants the family an illusion of control, and encourages Madeline’s frequent exhortations that the Usher family is destined to change the world. (Madeline’s unflattering parallels with Hillary Clinton’s post-hippie idealism grow increasingly pointed.) But the Ushers never fix the system, only exploit it.

The series’ conclusion implies a just universe, where the more pain the Ushers have deferred, the faster it rolls over them. Flanagan, an atheist, doesn’t believe the Ushers will face consequences in “the next life,” but he declares everyone will answer for their transgressions. The script compares the Ushers to Donald Trump, the Koch brothers, and Mark Zuckerberg, all facing greater or lesser consequences for their actions. Flanagan apparently believes our universe is ultimately just.

I must acknowledge Netflix, which has recently demonstrated a willingness to platform blunt anti-capitalist messages. Black Mirror’s recent sixth season is tightly self-referential, with episodes tied together by “Streamberry,” a media platform so obviously modeled on Netflix that one can only laugh. As the Disney model of conglomerate media has grown increasingly silly, Netflix has shown a willingness to laugh at itself. Evidently Reed Hastings can do something Roderick Usher can’t: change with the times.

Saturday, October 14, 2023

The State as Enemy of the People

The “Apartheid Wall” separating Israelis from Palestinians

One early scene in Emily Raboteau’s memoir Searching For Zion helped crystallize my understanding of race in America. Raboteau, who is biracial, recalls a discussion of race issues in her grade school classroom. Her best friend Tamar, who was Jewish, was horrified to hear how thoroughly bigotry is baked into American history. Raboteau recalls her best friend turning to her and exclaiming, “I’m not White.”

As an adult, Tamar performed Aliyah, the ritual immigration of diasporic Jews to the Israelite homeland. Raboteau visits Tamar while researching the elusive concept of “homeland” for Black Americans. She finds her childhood friend living in the Israeli-occupied West Bank, dwelling in a house which had been expropriated from a Palestinian family. She recalls listening in horror as Tamar spews talking points about why Palestinians deserve to be expelled from the land.

In Israel, Raboteau realized, Tamar had become White.

We’ve watched this week as the longstanding hostilities between the Palestinian Hamas government and the Israeli state have boiled over into war, again. Both sides are committing atrocities, including targeting civilians, and justifying their actions as defense of their own civilian populations, as states do. Ordinary Palestinians and Israelis, with little influence over their governments’ actions, watch helplessly as powerful states replay the grievances of ages past.

Outsiders struggle with how to respond. American spokespeople haven’t fared well weighing in on Levantine affairs recently. Representative Ilhan Omar was famously called “antisemitic” for criticizing the Israeli state, an accusation that has subsequently spread to colleagues like Rashida Tlaib and AOC. That accusation was specious, however, spearheaded by Marjorie Taylor Greene, the candidate endorsed by the Charlottesville fascists who chanted “Jews will not replace us.”

Well-meaning politicians, journalists, and others fear that, if they criticize Israel or Hamas, their opposition will tar them as smearing Jews or Palestinians. Because that does happen. Yet in neutering their critiques, they appear willing to accept war crimes. Hamas deliberately targeted civilian populations; Israel responded by giving Palestinians 24 hours to flee northern Gaza, a logistical impossibility, since over a million people live there with no place to go.

American right-wing Christians are historically willing to excuse every Israeli excess, despite a longstanding antisemitic streak through their politics. The Israeli state looms large in their theology, because if Israel rebuilds the Temple in Jerusalem, Jesus will (their script says) come swiftly. The End Times aren’t scary for conservative Evangelicals; they believe Jesus will vindicate them, punish everyone else, and let them live justified in God’s light.

Sort of like Tamar, who needed justification for living in a stolen Palestinian house.

Hey, Israel? Those “terrorist targets” sure look like neighborhoods

Powerful states seldom really represent their citizens’ interests. The English Puritans who settled in Massachusetts were legitimately oppressed by the English ethno-nationalist state; but upon bivouacking in America, they began visiting that same oppression upon Native Americans, enslaved Africans, and Anabaptists. Similarly, the Israeli state and the Jewish people apparently learned different lessons from the Shoah.

Saying this doesn’t mean Jews, individually or collectively, bear responsibility for the current violence in Gaza, much as individual Americans living today aren’t culpable for African slavery or Indian genocide. Indeed, Israelis or Americans who resist their authoritarian state and its bigoted history often get knocked down first. The state, ultimately, always preserves itself—not its people, and certainly not its people’s morals.

States always oppose free thinkers, dissidents, and reformists. Because the state wants only to preserve itself, and has no other moral goal, it always dispenses rewards to those who support the state, and punishments to those who challenge it. This describes states of all kinds: the history of the Cold War reveals that the Eastern and Western blocs each found proprietary ways to punish dissidents, but punish them they did.

The powerful Israeli state provided Tamar the protection from antisemitic hatred which she’d lacked in America. It only demanded that Tamar relinquish her morals to the state, and agree with the state’s rationale. America saw something similar when it originally rejected Italians and Irish as murky, Latinate foreigners. America eventually accepted these groups as White, once they became wholehearted defenders of American state morals.

So it follows that one can challenge the Israeli or Hamas state without condemning the Jewish or Palestinian peoples. Indeed, we can only protect all peoples equally, without prejudice, by condemning and challenging the state. States, by their nature, reward conformists, and punish those who pursue justice. States always preserve an in-group and try to kill an out-group; the current war is between two privileged in-groups, not between peoples.

Friday, October 13, 2023

The Simple Joy of Being Wrong

Tom Baker as the Fourth Doctor

In elementary school, when people asked me what I wanted to be when I grew up—that wheezy childhood standard—I consistently answered: “A scientist.” I didn’t know what that involved, but it definitely looked cool in classic Doctor Who episodes. The Doctor collected evidence, tested hypotheses, and by Act III, he inevitably found a solution that saved humanity from itself. What could be cooler than that?

By middle school, I discovered that my giddy childhood enthusiasm didn’t match the process. Science class consisted largely of memorizing lists, performing “skillz drillz” exercises, and satisfying state-mandated competency checklists. My brief dive into seventh grade Science Club also showed me one of science’s less-appealing aspects: fundraising. We spent most of the academic year trying to pay down debts the club had run up the previous year.

This left me profoundly discouraged. There was no messianic world-saving going on! We didn’t even stick with any program long enough to understand it. One week, we’d demonstrate the states of matter by applying heat to an ice cube until it melted, then evaporated; the next, we’d dissect a pickled frog. Our teacher, facing deadlines imposed by the state Department of Education, couldn’t linger on anything long enough to spark understanding.

Because of this, I lost interest in “science.” I understand, as an adult, why teachers needed to imbue students with a satisfactory corpus of knowledge: to operate common technology and participate in modern society, I had to have a basic understanding of thermodynamics, biology, and meteorology. But I never understood any subject better than necessary to parrot answers back on the test, and I promptly forgot everything afterward.

Science fiction usually depicts rococo science. Star Trek often implied that Spock and McCoy could pull an all-nighter to invent a vaccine and instantly stop a pandemic. Nevertheless, it conveyed that science wasn’t memorized lists and data tables; it was a systematized version of “let’s try something reckless.” But the “science” I learned in school had no reckless experimentation. Every “experiment” had a pre-ordained conclusion, and a scripted take-home lesson.

Instead, I found my long-sought experimentation and recklessness in writing and literature. Sure, every English class expected me to savvy part of the literary canon, so some prescriptive learning still happened. But in writing particularly, I could try something new, and succeed or fail on my own terms. This adolescent Shakespearean sonnet clunks badly? Heigh-ho, into the bin, and I’m already trying the next fracas!

Richard Feynman

Paul Lockhart complains that students studying math in public (state) schools never have an opportunity to be truly wrong. They never have an opportunity to face a problem, self-indulgently play with potential solutions, and ultimately find the answer themselves. Schoolbook math, in Lockhart’s view, has become desiccated and lifeless, a mere husk. “Math is not about following directions,” he writes, “it’s about making new directions.”

I often wonder how my life would’ve differed, had I discovered the unsolved, and possibly unsolvable, problems underlying scientific thought. I discovered physics at age twenty-five, in the person of Nobel Prize-winning physicist Richard Feynman. His writings, many of them surprisingly comprehensible to outsiders, emphasize how much physics relies on metaphor, analogy, and imprecision—the fun, dangerous qualities I found in poetry.

Parenthetically, I realize that Feynman, personally, was more fraught than his mythology implied. That’s a topic for another time.

Feynman’s approach to physics was characterized by irresolution, play, and risk. Sometimes literal risk: he tested his hypothesis that a vehicle’s windscreen would block the harmful ultraviolet light of a nuclear explosion by watching the Trinity test from a pickup truck’s cab. Feynman exemplified the sensory immersion of just trying something that I wanted from science, but found in literature. What if I’d discovered physics sooner?

My science teachers, dedicated educators all, nevertheless taught me that science is known and precise, and nothing is worse in classroom science than being wrong. But being wrong—stepping beyond the limits of knowledge which state contractors can write into Scantron tests—is the heart of science. And that’s what school denied me: the opportunity to experience the unmitigated joy of writing my own hypothesis, testing it, and maybe being wrong.

How many others like me exist? How many would-be historians got discouraged by pop quizzes laden with names and dates, when they’d rather discover the contingencies that made history happen? How many poets never found their voices because somebody compared them unfairly to Robert Frost? How many people never got to just try something, and maybe be wrong?

Thursday, October 12, 2023

The Wheel of Time After Time After Time

The main ensemble from Amazon's Wheel of Time, season one

The Amazon Studios adaptation of Robert Jordan’s Wheel of Time has received largely warm reviews, but mostly from critics who haven’t read the novels. A minority of critics have lambasted the adaptation for its lukewarm fealty to the source material; they’ve accused showrunner Rafe Judkins of not so much translating the source as nesting inside it. I have a different problem. This series demonstrates the greatest problem of media in the streaming era: content bloat.

On one level, the series shows Peter Jackson’s influence on screen-based epic fantasy. Beautiful arboreal landscapes and rushing rivers, frequently viewed from high above with cameras in motion, create not only a sense of intricate world-building, but also a sense that real characters occupy this world, moving through space and time. Sprightly theme music underscores the lushly colored visual palette. The young, attractive ensemble cast wears vibrant, prismatic clothes, unsullied by dusty agricultural work or unpaved village roads.

These visuals call attention to themselves. Presumably the creative crew intended this, as they use many long, lingering shots of the Shire (sorry, the Two Rivers), the White Tower, elaborate medieval streetscapes, and primordial terrain. As shots linger, however, making the scenery itself the center of audience engagement, we start paying attention to the design. We start analyzing which shots were created on location, which were built virtually in 3D modelling, and which are composites, matted together in postproduction.

Like Jackson’s LotR movies, the series premieres with a wizard’s arrival in a bucolic village, followed by a party. This provides an opportunity to introduce characters through dialog and interaction. Lots and lots of dialog and interaction. Again, these scenes are beautifully designed, but they continue interminably. The introductions of character relationships, larded with soap-operatic detail, mount up relentlessly. When the Uruk-hai (sorry, Trollocs) finally attack the village, we’ve grown bored waiting for something to happen.

Maybe Judkins and the creative team watched Jackson’s “long-expected party” scene and felt they needed one of those, except theirs occupies the entire first episode. That’s an hour of content. What Jackson completed in twenty minutes, Judkins extends for almost sixty. Each season of The Wheel of Time contains eight one-hour episodes, meaning each season runs nearly as long as the theatrical cut of the entire LotR trilogy. (At this writing, there are two seasons.)

This pattern continues. Having fled the monsters, our ensemble of protagonists journeys with Gandalf and Aragorn (sorry, Moiraine and Lan) to meet destiny. The characters keep demanding explanations, but their self-appointed mentors keep answering: “Now is not the time.” Somehow, despite months of travelling together and long nights spent huddled around the campfire, now is never the time. The characters have identity-building encounters, and make occasional meaningful discoveries, but real exposition keeps getting punted down the field.

Our ensemble remains mostly unified throughout the first season, though briefly separated for… purposes. They then spend the entire second season pursuing side quests, before unifying atop a wizard’s tower. Along the way, they encounter an ageless monarch, a lost nation, and an enemy thought destroyed three thousand years ago. Though the story arc doesn’t slavishly copy Tolkien, it’s nevertheless one giant spider away from matching the beat sheet in his first two LotR novels.

Some of this overgrowth comes from the source material. Full disclosure: I haven’t read Robert Jordan’s original novels; but I have it on good authority, from friends who have, that his writing often rambled, and that the later novels particularly suffered from a lack of firm direction. Yet the transition between media should’ve resolved some of this. Traditional broadcast or cable television would’ve forced the creative team to make hard decisions, because airtime is limited, and advertisers’ patience isn’t infinite.

With no requirement to fit into scarce, valuable airtime (or to empty the cinema for the next showing), Judkins and the creative team apparently feel authorized to keep creating more and longer scenes. Again, in fairness, these scenes are often beautifully designed, and individually, introduce interesting, thoughtful character moments. But as they accumulate, we increasingly see the story as constructed. We desperately wish we could tell the creative team to just make something happen, already!

Streaming distribution removes many limitations forced on traditional television. The one-hour format, advertisers who need to be appeased, and network Standards & Practices are out; graphic language, sex, and violence are in. But traditional limitations forced creative teams to make choices; what they omitted often mattered as much as what they included. Streaming doesn’t force them to cut anything. As content gets longer, with interminable exposition and vague resolutions, we start wishing someone would make them make choices.

Saturday, October 7, 2023

The Rise and Fall of Anti-Democracy

Steven Levitsky and Daniel Ziblatt, Tyranny of the Minority: Why American Democracy Reached the Breaking Point

Did you know that the right to vote is not enshrined in the U.S. Constitution? Like me, you probably assumed it was protected in Article V, the Equal Suffrage Clause. Not so; like many components of democracy, it was never written down, a compromise necessary to push the Constitution through the feuding factions which wrote it in 1787. America has dribbled suffrage slowly to increasing numbers of citizens, but the franchise is nowhere formally guaranteed.

This is just one example of what Harvard University political scientists Steven Levitsky and Daniel Ziblatt call “anti-democratic institutions” baked into the American political system. The Constitutional Convention delegates in Philadelphia, versed in the historic abuses of ancient Greece, feared “the tyranny of the majority,” by which popular majorities could turn against fellow citizens or the state. Ancient democracies were vulnerable to demagoguery, paranoia, and political whimsy. The Framers didn’t want that in their new government.

Levitsky and Ziblatt address several bugbears which have plagued the American republic recently. Gerrymandering, the Electoral College, and the historical malapportionment of the Senate are obvious choices. American and British readers may less readily recognize others: the winner-take-all electoral system, for instance, which concentrates power in candidates who carry razor-thin pluralities and discourages third-party voting. Most other democracies use proportional representation, but not America, still governed by a hastily written 18th-century Constitution.

These anti-democratic practices have histories. In 1787, small Northern states like Delaware and New Jersey feared dominion by populous states further south, while slaveholding Southern states like Virginia and Georgia feared having their economic advantages stolen by energized minorities. Institutions like the Senate and the Electoral College, and practices like redistricting and the filibuster, were designed by generations now dead to protect themselves from changes in economics and population. Then the population and economy changed anyway.

Most anti-democratic institutions aren’t uniquely American. Levitsky and Ziblatt find other early semi-democratic societies which checked the people’s will. Even demagogic attempts to overthrow the government aren’t uniquely American; France in 1934 looms large in their historical comparison. These scholars specialize in tracking diverse bodies politic, and they don’t have to look deeply to find evidence that “free” societies, in their founding, often distrust the people. Laws are written to restrain, not empower, the masses.

Steven Levitsky (left) and Daniel Ziblatt (via Harvard University)

Our authors identify the difference between small-d democratic parties and anti-democratic parties. The differences they identify include, but aren’t limited to, a willingness to abjure violence, and a willingness to cut off the violent wing of their own party. It isn’t enough to speak the language of democracy; for Levitsky and Ziblatt, the difference between democratic and anti-democratic parties lies in their actions. They closely parse the actions of American and international anti-democratic parties.

America’s Republican Party has, historically, been a pro-democracy party. Bipartisan majorities swept voting rights legislation into law, and even paleoconservatives like Strom Thurmond and Mitch McConnell worked to expand democracy. But anti-democratic factions infiltrated the party. Rather than ostracize them, Republicans welcomed these insurgents, and now write legislation catering to them. American politics compounds this trend by treating the Constitution as sacred, and the Framers as prophets. (See also Mary Anne Franks on America’s “state religion.”)

Levitsky and Ziblatt also cover what actions other democratic societies have undertaken to reverse the anti-democratic trend. Most have taken steps to broaden the franchise. Most societies with bicameral legislatures have disestablished the upper house, like France and Germany, or severely reduced its power, like Britain. Norway, which modeled its constitution on America’s, has amended its constitution over 200 times since 1814; America, in that time, has ratified fifteen amendments, and none since 1992.

Anti-democratic slide isn’t inevitable. Other societies have faced constitutional crises as catastrophic as America faces today. (These authors don’t mince words: the Trumpist faction is, at the very least, a threat to democracy.) Some societies salvage democracy only after collapsing into totalitarianism; the authors mention both Vichy France and Thailand, which continues to suffer periodic coups d’état. Other societies, like Britain, Argentina, and Australia, have become more democratic simply because it’s the right thing to do.

Preventing collapse means doing something, though, not just expecting state institutions to protect themselves. It means we who support democracy must become as energetic as those who oppose it, and must participate before catastrophes happen. Levitsky and Ziblatt agree that American society has become more democratic over time, and can continue doing so, but only if motivated citizens make it happen. Democracy isn’t a spectator sport. It will wither as we watch from the sidelines.

Thursday, October 5, 2023

What Kind of University Do Americans Want?

My alma mater, the University of Nebraska at Kearney, has placed several academic programs on the chopping block. Cuts to the fine arts have attracted the most attention: university administration has proposed axing the entire theatre program, and one-third of the music department. But while these cuts attract the most attention (and the most students inclined to protest), less sexy cuts include the entire Geography, Philosophy, and Journalism departments.

Other departments would survive, but in truncated forms. Modern Languages would lose its French and German majors, functionally turning the entire program into a Department of Spanish. The English Department would allow vacant positions to remain unfilled indefinitely, furthering the program’s decline into a Department of Freshman Composition. Although the administration has proposed some Education and Business cuts, two-thirds of proposed cuts come from Arts and Humanities.

Smarter commentators than me have addressed the costs which the campus and community will suffer. UNK, once a top-tier regional university, has slid in the rankings since I left, a slide probably not coincidental with the steep cuts previously imposed on academics and the numerous tenure-track seats going unfilled. I’d rather focus on another question raised by these draconian cuts: what role will universities, and education generally, serve in coming years?

The university is slashing theatre, an art wherein people attempt to genuinely, realistically depict people dissimilar from themselves. Actors, and the myriad technicians supporting them, try to accurately channel people from other times, nations, or backgrounds, and tell their story respectfully. In other words, theatre cultivates empathy—a trait likewise fostered by, say, learning to speak and read French or German. Future students will have fewer opportunities to learn empathy.

Likewise, the university is cutting journalism at a time when Americans are historically ill-informed about world events, and lack of media savvy has produced painful consequences. The geography program is in jeopardy, as our earth faces climate shifts which have the potential to wipe out human civilization. These cuts reflect value judgments from state officials who want education to produce desired outcomes—which, apparently, had better not threaten state values.

A certain subset of American power has always wanted to limit state universities to teaching job skills. That subset, usually but not exclusively conservative, sees “liberal” arts, those knowledge domains which create free thinkers and liberated minds, as luxuries for the minority of students whose families can afford private universities. These self-appointed education pundits don’t want students asking difficult questions; they only want them learning marketable job skills.

We’ve witnessed this, in more dramatic terms, in Florida and Texas. Ron DeSantis’s Florida has banned entire fields of study, while Florida, Texas, and Oklahoma have allowed PragerU Kids, an edutainment company founded by an AM radio host and funded by fracking billionaires, to displace teachers in schools. The removal of entire academic disciplines from UNK, a school which primarily attracts regional students from poor backgrounds, is no less consequential.

Plato and Aristotle, depicted by Raphael

Throughout history, self-appointed polymaths have debated the purpose of education. Plato thought education fitted philosopher-kings to rule the benighted masses, while Aristotle thought education made citizens into good people. Thomas Aquinas thought education brought people closer to God, though later scholars have thought education broke the yoke of religious delusion. I suggest there’s no pat resolution to these differences, but education prepares wise people to differ more constructively.

“Liberal” arts, the arts which liberate people—disciplines like literature, history, math, and science—allow students to know themselves. But equally, they allow students to know themselves in context, in their society and economy and culture. Educated citizens have the tools necessary to evaluate fair use of power, just distribution of resources, or the difference between truth and lies. It isn’t coincidental that American slaveholders didn’t want their chattel to read.

Higher education shouldn’t be merely cost-efficient. Indeed, for many students, post-secondary education will be their last opportunity in this lifetime to pursue truth, beauty, science, and knowledge for their own sake. This, of course, offends those who believe every item, thought, and hour should have an owner. Students able to ask penetrating questions will inevitably ask questions that powerful people don’t want answered.

I acknowledge limits exist. Three-fifths of UNK’s budget comes from taxes and endowments, which deserve accountability. While American generational cohorts continue getting smaller, tenured professors remain in harness for decades, narrowing the academic pipeline. These concerns aren’t hay. But the proposed cuts clearly aren’t value-neutral, and serve to limit the kinds of questions students can ask. University administrators should prepare themselves for the day inquisitive students stop showing up.