Monday, February 27, 2023

Life Under the Un-Free Market

Rebecca Giblin and Cory Doctorow, Chokepoint Capitalism: How Big Tech and Big Content Captured Creative Labor Markets and How We’ll Win Them Back

Only five corporate conglomerates publish about eighty percent of America’s books, a concentration of power largely unchanged since the 1990s. But only one company, Amazon, sells about half of America’s books, and nearly all of America’s self-published and indie books. Australian law professor Rebecca Giblin and Canadian author Cory Doctorow consider this squeeze to be a harbinger of future first-world economics, if something doesn’t change soon.

Concentrations of market authority create what our authors call “chokepoints,” where one company has immense power over consumer access. They admit “chokepoint” is a casual term, and supply the more technical one: monopsony, a little-known market condition in which one or a few buyers control the market. Most monopsonies are also monopolies, since they resell their captive market to us, the general public, who can’t negotiate a better price.

Similar market concentrations obtain throughout the culture industry: three record labels publish most of the music you listen to, but most audiences now listen mainly through Spotify. And the TicketMaster/Live Nation merger gave one company a death grip on A-list live music. Five studios own most of Hollywood, but even they’re beholden to the Big Four talent agencies. And Google’s stranglehold on the online ad market gives it power over nearly everyone.

These monopsony markets didn’t just happen. Their controllers manipulated market conditions to their advantage, often by misusing antitrust regulation in ways that contradict the purposes for which it was written. But since the 1970s, a new market philosophy has dominated. Invented by Robert Bork and popularized by Chicago School economists, this bipartisan consensus holds that monopolies and un-free markets are acceptable, provided consumer costs remain low.

Markets, our authors remind us repeatedly, don’t objectively exist. They consist of laws, traditions, ad hoc regulations, and personal agreements, and therefore we humans invent and reinvent markets constantly. This book’s first half dissects the long processes through which powerful people have captured markets—and market regulations!—to serve wealthy resource owners, usually at the expense of creative workers, and to the detriment of their audiences.

Rebecca Giblin (left) and Cory Doctorow

The second half switches gears, tackling ways creative professionals have challenged the monopsony market, and potential ways these approaches could expand. Our authors don’t like traditional liberal regulatory approaches. They extend limited praise to the Biden Administration, which has detailed committed professionals to enforce antitrust regulations which have lain mostly dormant since the 1970s. But those regulations were written, they note, for 19th Century markets.

For instance, though union drives are lawful and protected in America, that only applies to directly employed workers. Independent contractors and freelancers can’t organize, because under 19th Century antitrust law, that’s a “price-fixing cartel,” and totally illegal. Unfortunately, many creative professionals aren’t directly employed. And where they are directly employed (as under the Hollywood studio system), they’re frequently employed through agencies, making their contract status legally squishy.

Our authors prefer a hybrid approach. They definitely believe collective action holds the greatest promise for fixing market concentrations, because where the wealthy control markets through money, that money doesn’t mean much if it can’t buy human labor. Collective action requires quick thinking, though: for every solution creative professionals invent, Big Tech capitalists have eventually found ways to bend it to their advantage. Modern problems require modern solutions.

They also describe public shaming operations which have worked well. For instance, when Disney bought Lucasfilm, they claimed they bought the rights to everything Lucas did, but not necessarily any obligation to pay creative workers. A loose affiliation of writers and other creatives, led by novelist Alan Dean Foster, pushed a hashtag campaign that humiliated Disney into paying its workers and honoring its contracts. Shaming the rich evidently works.

I’d go further. This book shipped before the TicketMaster/Live Nation catastrophe saw Taylor Swift tickets going for $4000. Our authors describe how Swift previously motivated her mainly young, energetic audience to overturn Spotify’s pay structure, and distribute royalties fairly. Now, that same audience has pushed Congress to hold corporate America accountable, for the first time in two generations. It’s early to say, but this may be the beginning of a movement.

These authors describe the consequences of monopsony economics on creative workers, and how creative workers have resisted. This may seem abstruse to some audiences. But they establish in Chapter One, and reiterate throughout, that the creative industry is a bellwether for the economy overall. The same forces are already spilling into gig work, and may displace white-collar work soon. This isn’t purely academic; it’s about your economic future, and mine.

Saturday, February 25, 2023

The Vanishing Experience of Winter

The thaw happening as I write

There’s steam rolling off the parking lot as I write. Though the air temperature is only 14 degrees Fahrenheit, well below freezing, a mostly sunny sky has warmed the pavement sufficiently to cause last night’s thin, fluffy snow to melt faster than it can run off. Because the snowmelt is warmer than the surrounding air, it’s evaporating rapidly, creating the kind of steam sometimes photographed rolling off sultry Louisiana bayous.

The last few Nebraska winters have been largely mild. I’ve only needed to shovel my walk once this year and, this late in February, the chance of another big ass-kicking snowfall is pretty minimal. Though some American communities have gotten socked pretty hard, we haven’t witnessed the prolonged snows of my childhood, where air stayed so cold so long that accumulations hung around for months, only disappearing in spring.

Winters swing wildly on the Great Plains. It’s too early to calculate this year’s snowfall, but according to the University of Nebraska, last winter was the least snowy on record; the year before, the sixth snowiest, with mounds that took months to melt away. As someone who didn’t live here until the cusp of adulthood, I find winter’s unique fingerprint exciting. And I dread the possibility that winters might stop.

Plenty of Nebraskans hate heavy snows, and have appreciated the change to milder winters. I’m not one of them. I’ve never known most forms of transcendent euphoria which people report with physical exercise, such as the semi-mythical “runner’s high.” But my first winter in Nebraska, in 1992-93, I discovered I can totally access the “snow shoveler’s high,” and eagerly shoveled public sidewalks up and down my block, unwilling to stop.

Yet Earth continues warming. The last eight years have been the warmest on record. Though record-setting cold events like the Texas deep-freeze do happen, they’re acute but too brief to define the season. Even the Arctic air masses that drive the now-notorious Polar Vortex aren’t as cold anymore. Where I live, Christmas Day 2022 was warm enough to rain, once an unthinkable event in Nebraska in December.

Certainly, some regions have been hit differently than others. Large parts of Wyoming were apparently immobilized yesterday, while my part of Nebraska wasn’t even seriously inconvenienced. But even this, we were warned, is part of global warming’s trend. The Arctic air masses that formerly drove winter weather in middle latitudes like mine, simply can’t muster the oomph necessary to move south like they once did. And they probably never will.

Experts and paid climate pundits have worried about the effects this dwindling will have on Earth’s macrobiome, and its ability to support humanity. I have another question, though: what happens to the culture we leave behind? The touchstones we formerly shared of distinct seasons, of the responsibilities and pleasures that each brings? What will step up to replace them when winter no longer exists?

Despite my age, I haven’t entirely abandoned the hope of marrying and having kids someday. Consider how many Western courtship rituals center around winter. Though we’ve lost traditions like “bundling bags” and winterizing the family log cabin, winter remains a prime opportunity to make memories. Consider how many Euro-American marriage rites center on Christmas—or its successor, the Hallmark Christmas Movie.

And my children: what experiences will we lose forever when the world becomes intolerably hot? Will I have the opportunity to teach them the simple sensory pleasure of the “snow shoveler’s high” that I didn’t discover until my own adulthood? What about sledding, hot cocoa, and lazy Sunday evenings around a hearth fire with people you love? Without winter, these memories will recede into the dark corners of mythology.

Sadly, I discovered partway through writing this essay that the steam off a thawing parking lot doesn’t photograph well. I can see the steam rising with my own eyes, but I can’t preserve it for you; I can only recount it using words and language. This picturesque steam becomes a story, a narrative that you experience through my depth of feeling, through the urgency of my words. It becomes, in short, a myth.

That’s what winter will become. And as happens with myth, the story will quickly overshadow the event. Just as Santa’s Laplander assistants became poorly defined North Pole elves, the experience of “snow shoveler’s high” or downhill sledding will become something distorted. When we no longer share these experiences, when we no longer shelter for winter together, the reality will become romanticized, warped, and ultimately lost.

Thursday, February 23, 2023

State of the Disunion

Rep. Marjorie Taylor Greene (R-GA)

Marjorie Taylor Greene’s call for a “national divorce” this week is, in the context of her personal history, spectacularly dumb. Though she’s tried to clarify that she doesn’t mean Fort Sumter-style secession, her fear of the federal government going “woke” extends the dog-whistle ploy used since George Wallace: associating federal power with any attempt to make laws less bigoted. We can safely ignore her demand on her own terms.

That being said, however, I find something appealing about partially disaggregating federal power. Not in completely abandoning a central government, since I’ve read enough history to know that weak federal authority provides shelter to bigots, kleptocrats, and grifters. Rather, watching how power is currently distributed in America, I see us trying to run a 21st-Century government with a 19th-Century organization. A Digital Web nation with Pony Express offices.

In particular, our Senate and Electoral College were written with no expectation that America’s population could become so unequally concentrated. In 1789, when most Americans worked in either agriculture or artisanal craftsmanship, and long-distance travel meant difficult overland slogs on foot or horseback, the idea that the population of Manhattan or Los Angeles might exceed that of entire states was unthinkable. In 1800, America’s largest city, New York, had barely 60,000 people.

Our state lines reflect 18th- and 19th-Century influences. Though five territories gained statehood in the 20th Century, all were circumscribed by domestic and international borders drawn in the 19th Century, or by geography. California remains our most populous state, but within lines drawn in 1850. That includes its ruler-straight borders with Oregon and Nevada, impervious to population density or geography, a hallmark of White settler colonialism.

America’s current internal borders reflect neither population nor economy. Most of America’s economic growth comes from a few packed urban areas, resulting in massive wealth concentration. People move to places like New York, San Francisco, Seattle, and Chicago to find work, but need three jobs to afford rent, because nobody’s building sufficient housing. Meanwhile the things that make life livable, like art, culture, and neighborhood pubs, keep getting priced out.

One proposed map for the State of Jefferson

Meanwhile, the places where workers extract minerals, grow food, and otherwise make actual stuff, continue dwindling. Nebraska, where I live, has whined constantly about “brain drain” since I arrived in 1992, and probably longer. Even within the state, net migration is leaving the rural west and moving to Omaha and Lincoln. Entire counties have responded by pulling their claws in and hoping the Eisenhower Era returns quickly.

Surely there’s a better way.

I first learned of the State of Jefferson movement over ten years ago, though it’s apparently existed since the Great Depression. A coalition of citizens in primarily rural counties in northern California and southern Oregon want to secede from their existing states and form a new one. The proposed state would be the only consistently Republican state on the West Coast, and would significantly reshuffle the distribution of federal power in Washington.

Now, the State of Jefferson movement is far from perfect. First, it’s riddled with racists who resent how California has become relatively welcoming to immigrants. Second, its values are often contradictory. It wants to direct local tax revenues locally, despite being a primarily agrarian region dependent on federal subsidies. It wants law-and-order policing, even though cannabis is the region’s only lucrative crop. Jefferson is a satisfactory abstract idea, probably.

Despite its problems, Jefferson has virtues. Residents of rural northern California and southern Oregon have more in common with one another, culturally and economically, than with their respective state population bases. Requiring Siskiyou, California, to follow laws written in Los Angeles and San Francisco is frequently counterproductive. And indeed, original Jefferson complaints remain valid: major agricultural roads remain unpaved, devaluing their output.

Similar complaints accumulate. Laws written to protect or constrain Chicago wind up applying to Illinois generally. (Change place names appropriately for, say, Wisconsin, Washington, Louisiana, etc.) Federal regulations written to rein in Manhattanite fiscal recklessness choke cash flows in Wyoming and the Dakotas, undercutting rural development. One-size-fits-all lawmaking generally hits poor, rural, agrarian areas hardest, in often invisible ways.

Therefore, though Marjorie Three-Names’ ridiculous Palestinian-style two-state solution is a non-starter, I find something appealing in a system that rewards local knowledge and regional interest. Relaxing the federal stranglehold on disparate local areas, within agreed-upon limits, could unleash regional ingenuity.

No, I don’t yet know how to disaggregate federal power without empowering cranks. But it’s definitely time to start discussing the process, before local friction overtakes the federal system.

Friday, February 17, 2023

The “After” Part of Revival

A recent photo of the Asbury University revival

As I write, an event being termed a “revival” continues in the Asbury University campus chapel in Wilmore, Kentucky. Since Wednesday, February 8th, hundreds of faithful have continuously occupied the chapel building: singing, praying, preaching, and lifting hands unto God. The 24-7 religious outpouring is giving some Christians hope, in a time of seemingly unbridled selfish behavior and the continued numerical decline of American Christianity.

I’ve sat through two previous “revivals,” and therefore have definite opinions. Wrapping oneself in the moment of transcendent unity with fellow believers can definitely feel like communion with God. But, like Jesus tempted in the desert or Buddha planting himself resolutely beneath the Bodhi Tree, that moment only matters in light of what we bring back into the world. The historical track record of that “after” moment leaves me skeptical.

My first “revival” took place at a Christian rock concert in high school, the only Christian rock concert I attended. People were dancing on chairs, singing along, becoming one with the crowd, and then the lead vocalist finished with an altar call. Yes, I responded. But when the crowd dispersed, and we returned to our normal lives, the moment of exultation passed. Without the “worship high,” motivation to repent quickly dwindled.

Years later, a charming young associate pastor at the local United Methodist Church began holding Sunday evening services with a full band. Once again, the experience of crowds, music, and emotional exaltation created a perfect storm of transcendental giddiness. Unlike the rock concert, this service happened regularly, and also involved group Bible study, prayer circles, and other sustained community. This “revival” showed signs of lasting.

This pastor successfully packed a mainline Protestant sanctuary wall-to-wall every Sunday, something most conventional services only accomplish on Christmas and Easter. Donations rolled in, and money was channeled toward the common good, like scholarships, community improvements, and overseas disaster relief efforts. Weekly altar calls were warmly received; even my dad, Christian but ordinarily allergic to displays of overt religiosity, walked up to “receive Jesus.”

But as the program continued, something happened: numbers began falling off. Members of the worship band, which peaked around forty-five members, began begging off. The congregation, which briefly topped 750 worshipers—remarkable for a small-ish town—shrank to a hundred, then seventy-five. Worship-song instrumental breaks ran longer than the Allman Brothers at the Fillmore East, and the service’s electric bills started exceeding what the collection took in.

A recent photo of the Asbury University revival

Because the service happened weekly, the falling-off didn’t happen abruptly, as it had after the concert. Rather, things gradually tailed off. People experienced the transcendent worship high, but then returned glumly to regular lives of jobs, school, and cooking dinner. Without discipleship efforts to offer anyone a genuine new life, a genuine straight-and-narrow to walk, the worship high began feeling hollow. Interest waned, and soon, so did the service.

Don’t misunderstand me: what happened in those moments wasn’t hay. The dissolution of self that happens in concert environments distinctly resembles the ego death which Christian and Buddhist mystics describe at moments of salvation or enlightenment. However, concert transcendence depends on the crowd, and ends when everyone goes home. Likewise, when religious people leave the sanctum, if there’s no continuation of community, the emotional response dissipates.

Events like what we’re seeing happen at Asbury University make True Believers feel connected to God and one another. But eventually, everyone has to leave the sanctum and return to daily life. If revival offers nothing beyond that moment of emotional bliss, the pull of ordinary tedium quickly overwhelms grandiose feelings. Like cocaine, a worship high requires greater and greater quantities to overcome the flesh. Mere mortal pastors just can’t provide that.

However, churches can provide community. When “church” is a temporary respite from a world of exploitation, and we return to lives where others profit from our efforts, religion (or anyway religiosity) seems frivolous. Christians need forms of continuing discipleship, opportunities to participate in something larger than themselves. Living the Beatitudes is tiring when you do it alone except for an hour on Sunday. But it’s easy when Christians work together.

I don’t want to diminish the Asbury revival, or the feelings its participants share in that time and space. The defining question, though, is: will they carry those feelings, that experience, into the world? Historically, White Protestant churches are pretty bad at the “after” part of revival. I hope I’m wrong, though, because this world really needs weekday Christians to get busy living by the words we claim to believe.

Wednesday, February 15, 2023

The Trouble With the Chosen One

The last days of a dying Chosen One

Luke Skywalker, Harry Potter, and Neo from The Matrix were all destined for greatness before birth, and recognized as The Chosen One by others. Science fiction and fantasy love the Chosen One mythos and recycle it endlessly. The Chosen One’s coming is always foretold; indeed, the Chosen One may be the last to recognize his own (and it’s usually “his”) greatness. But once he does, he unleashes righteous fury on the nations.

While science fiction loves literal Chosen Ones whose presence purges humanity’s impurity, other genres love a more subdued Chosen One. Hero teachers like Jaime Escalante in Stand and Deliver and Erin Gruwell in Freedom Writers, or hero cops like Popeye Doyle and Dirty Harry Callahan, serve the same role, though their reach is circumscribed by their form. Seems like storytellers love finding that one superlative individual who will instantly repair all our woes.

Sort of like Jesus or the Buddha.

Most people, I suspect, would agree that we don’t fall on hard times because one person did us dirty. We might point fingers at an unpopular president, or the public face of a designated “other” like Hitler or Saddam Hussein. But realistically, most people realize these highly visible individuals didn’t cause widespread social crisis, they simply exploited it for personal or ideological ends. Society feels like it’s been skidding for a while.

Yet despite knowing individuals didn’t cause our unhappiness, we nevertheless seek individuals to reverse that perceived skid. Whether we seek that salvation in fictional characters, like Luke Skywalker or Katniss Everdeen, or in (putatively) factual individuals like Jesus or Muhammad, we want one person to take responsibility for solving the crisis. Solving social problems collectively is hard; we’d rather throw everything on the person of a Chosen One.

That’s where things get sticky. Jesus Christ claimed a unique ability to guide his people out of darkness. So did Donald Trump. The former president’s 2016 mantra of “I alone can fix it” seemed, for many people dispossessed by a changing economy, to be a necessary tonic for the Obama Administration’s embrace of collective responsibility and systemic reform. Obama’s approach was difficult, slow, and unpleasant; Trump’s response was easy and instantaneous.

Not every Chosen One necessarily brings salvation through their person. (I use the word “salvation” loosely here; please bear with me.) Jesus Christ said “I am the Way, the Truth, and the Life,” making himself individually the bearer of transcendence. The Buddha, by contrast, placed his teachings above his person. Had Siddhartha Gautama not achieved Enlightenment, somebody else would have, and the Truth would’ve shone through anyway.

A Chosen One and his girlfriend walk into a bar...

Few modern Chosen Ones would be so modest. Whether religious messiahs like Jim Jones or Sun Myung Moon, or secular deliverers like Donald Trump or (dare I say) Adolf Hitler, they always proclaim themselves, personally, the source of redemption. Jesus, Buddha, Hitler, and Trump all arrived at times of deteriorating empire, when old ways of government and religion weren’t working anymore, and promised deliverance. Some succeeded better than others.

Don’t misunderstand me: I don’t mean Hitler and Trump are morally equal to Jesus and Buddha. Others might say that, but I won’t. Rather, Jesus and Buddha promised to free humanity by stepping outside this world’s social, political, and economic limitations. Jesus promised to make all things new, while Buddha promised deliverance from this world’s ignorance. Political messiahs, however, inevitably pledge relief by reinforcing this world’s doctrinaire tendencies.

For this discussion, the particular deliverance each messiah offers matters less than their promised methods, and how their congregation receives them. The Chosen One always promises to relieve the suffering congregation’s pains, whether that congregation is genuinely oppressed or merely feels itself oppressed. Peace and consolation await True Believers willing to invest their every hope in the Chosen One.

Secular Chosen Ones generally don’t end well. Donald Trump’s congregation progressed from nodding agreement at his campaign rallies, to the violence of January 6th, 2021, with almost Life of Brian-like haste. And while Jesus told his disciples to sheathe their swords and not answer evil with violence, two millennia of people speaking on Jesus’ behalf have fomented Christian crusades and sectarian holy wars. That’s pretty bad PR for the Prince of Peace.

Fundamentally, a Chosen One lets True Believers relinquish their agency. When 909 people committed suicide at Jonestown, no individual congregant was truly responsible. Indeed, it’s likely that hundreds looked for one other person to say “no” first. Because when a Chosen One goes bad, the only hope is another Chosen One.

Monday, February 13, 2023

Are We Living In an X-Files World?

David Duchovny and Gillian Anderson in the 2016 X-Files relaunch

Yesterday, the United States military shot down a third unidentified object in American airspace, this time over Lake Huron. (Technically it was in Canadian airspace, just barely, but stick with me.) These unknown objects invading American skies have become so common, so ubiquitous, that even I have joined in jokes comparing them to UFO paranoia and kaiju movies. Yet I find myself worried about our increasingly militant response.

After the Biden Administration waited several days and several thousand miles to shoot down the first Chinese spy balloon, members of Congress, mostly the Republican Party’s hard-right flank, went berserk. The Administration has met each subsequent object with swift finality, and they’re still not happy. Unidentified objects (I’m reluctant to call them UFOs) traversing American airspace have become yet another rallying point for a fear-based, xenophobic American worldview.

When The X-Files first aired in the 1990s, I wasn’t savvy enough to notice the overlap between TV stories and current events. The narrative of extraterrestrials conducting a slow, covert invasion of Earth corresponded with fears of invasion at home. Pete Wilson ran two successful campaigns for Governor of California based on overt, undisguised appeals to racism. Demagogues stoked similar fears of refugees entering America from Haiti, Somalia, and elsewhere.

However, it’s worth contrasting the literal “invasion” of refugees and undocumented immigrants, against Chris Carter’s science fiction invasion. The X-Files depicted a categorical invasion based on strategy and fear. Chris Carter’s invading aliens cultivated allies among the powerful, controlled the media message, and used psy-ops to silence anyone calling them out. Refugees simply showed up, hopeless, scared, and desperate for food and a shower.

Watching events unfold around us, I can’t help noticing who’s using psy-ops to actually divide us. While Marjorie Three-Names and Lolo Bobo scream bloody murder about invasion, about undocumented immigrants bringing disease as biological warfare, and about the enemies at our gates, America’s rich and powerful continue finding ways to narrow citizens’ access to information. We’re busy looking for external enemies, and failing to see them at home.

Elon Musk

Elon Musk’s invasion is probably the most glaring, if only because it’s so ham-handed and clumsy. While he continues running one of America’s best peer-to-peer information sources into the ground, he’s had ten children by three mothers. Overwhelming human genetic codes and planting alien offspring was a literal X-Files strategy for invasion. Despite still having numerous admirers, Musk has devolved into a 1990s sci-fi villain.

Other internal enemies are less visible, but more numerous. Corporate media conglomerates like Sinclair Media and iHeartRadio (formerly Clear Channel) have a stranglehold on American media, and share a notoriously right-wing bent. Though it’s been a while since they did anything as obvious as submarining a platinum-selling country group, their ability to silence dissent is unmatched since the heyday of William Randolph Hearst.

It congeals into a massive goulash of influences where we can, as Fox Mulder famously warned, “trust no one.” While our political leaders hold power by lavishly identifying enemies inside and beyond our borders, the moneyed interests that bankroll their reelection campaigns continue finding ways to foment disunity and paranoia. Law and society persist in undermining the tools of community organization, and we’re left atomized, unable to defend ourselves.

Don’t misunderstand me. The X-Files showed the alien invaders’ human allies as a tight-knit cabal, so friendly that they could gather in one room and make binding decisions quickly. There’s no material evidence of such literal collusion outside a TV writers’ room. In reality, it’s more likely a sloppy agglomeration of mutual back-slappers whose needs happen to coincide. But the patterns are, if not instructive, at least illustrative.

Based on the evidence, I don’t believe it goes too far to say that America’s rich are currently using the playbook from 1990s science fiction to overthrow the existing society and engineer one that suits their needs. Though the January 6th insurrection failed to coalesce into a coup, it didn’t need to. Subsequent years have seen us descend into exactly the bureaucratic intransigence and partisan backbiting that Mulder and Scully fought against for eight seasons.

Spotting UFOs in American airspace is, arguably, the endgame of this paranoia. It has Americans literally looking into the clouds or across the ocean to spot the enemies that have already seized the levers of American power. Our country’s wealthy and powerful have taught us to live in constant fear and distrust, which undermines our most rudimentary tools to protect ourselves against them. Flying saucers have become just another decoy for power.

Friday, February 10, 2023

The Quest for the Elusive “Fact”

A bill floated in the Montana Legislature this week would forbid science teachers in public schools from teaching anything other than “facts.” The bill is doomed to fail for multiple reasons: like North Dakota’s recent attempt to define “gender,” it is unenforceable, and has no visible support besides its sponsor. Further, the bill’s language apparently provides no meaningful definition of “facts,” something philosophers of science have debated for generations.

The bill itself is beneath contempt and wouldn’t deserve commentary on its own. But situated in context, alongside the North Dakota gender bill and Florida’s ongoing efforts to purge entire disciplines of “ambiguity,” a pattern becomes visible. One political party wants to engineer an educational system completely devoid of debate, complexity, or doubt. All subjects become lists of facts, answers are either right or wrong, and entire disciplines become foregone conclusions.

Smarter commentators have spilled copious ink over the current right-wing push to ban books and censor classroom discussions. Yet as new fronts in the battle become visible, a pattern emerges. These attacks aren’t directed at any individual fact or discipline, despite the attention paid to science or African American studies. Rather, taken together, they show a desire to produce a generation too uninformed to formulate questions or demonstrate rudimentary curiosity.

This desire makes itself most visible in studying topics where clearly right answers exist. The mass removal of books from Florida classrooms has reportedly swept up anything dealing, even tangentially, with sexuality, religion, and race; baseball icons Roberto Clemente and Jackie Robinson are too controversial for Florida today. But as I’ve noted elsewhere, absolute facts are seldom interesting. Only the dynamics of controversy make facts worthwhile.

Science, including both the physical and social sciences, reveals the importance of controversy. If I drop a baseball, it will fall. This claim isn’t controversial. But why does it fall? Not until Isaac Newton did anybody formulate an answer that withstood scrutiny, without relying on God or a circular claim of “Because it does, duh.” Newton’s theory of gravitation created models which other scientists could replicate, and therefore could verify or disprove. That’s what a theory is.
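To make that concrete (my illustration, not anything in the Montana bill): Newton’s answer fits in a single line,

$$ F = G\,\frac{m_1 m_2}{r^2} $$

where $F$ is the attractive force between two masses $m_1$ and $m_2$, $r$ is the distance between them, and $G$ is a constant anyone can measure. Because the formula produces definite numbers, any scientist can check it against observation, and that checkability, not the memorized fact, is the interesting part.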

When legislators reduce science to memorizable facts, they strip science of meaning. We don’t care about science as a description of obviously true events; we care about science because it permits informed and testable predictions. When Newtonian mechanics accurately forecast the location of the planet Neptune, the theory gained credibility. When it inaccurately described the orbit of Mercury, Albert Einstein formulated a new theory.

Anybody who remembers the tedium of grade-school arithmetic “skillz drillz” can easily imagine how stultifying an education based around only facts would be. Especially when those facts come pre-filtered to protect White parents’ tender sensibilities. Without enough knowledge of controversy to test opposing sides, and without enough awareness at times to even realize a controversy exists, students are reduced to memorizing lists, a task kids famously hate.

In other words, an entirely fact-based education seems designed to make kids mentally check out early. Without questions to solve, without patterns to identify, without friction to resolve, students predictably become bored and stop caring. Anything that students can correctly identify on a Scantron sheet, still the hallmark of standardized tests, is probably also boring. As this pervasive boredom slowly overtakes academia, one wonders if this isn’t deliberate.

Although the Montana bill makes science central here, the same principles apply to all topics. Paul Lockhart writes that many freshmen enter college thinking they’re good at mathematics, when they’re actually good at following directions. (Replace “good at” with “bad at” where appropriate.) Ron DeSantis’ attempts to whitewash history only make explicit what James Loewen claims has always been implicit in schoolbook history.

Taken together, the pattern emerges: students don’t learn to ask questions. They learn topics only shallowly, reduce all subjects to memorized lists, and mostly forget everything after the standardized test. The system creates deeply incurious graduates who comply with authority simply to make the moment go away. Students emerge perfectly suited for essentially robotic industrial jobs that, ironically, don’t much exist in America anymore.

As a sometime teacher myself, I don’t like these conclusions, but I can’t avoid them either. I remember spending months trying to reignite students’ innate childhood curiosity, but by the time they reached me, the system had already squelched it. Bills and laws like these serve neither teachers nor students. But they do serve to create a permanent underclass of supposedly schooled graduates forever dependent on arbitrary authority.

Wednesday, February 8, 2023

Death of the American Expert

Dr. Steven Hayne giving testimony in an undated photo
This essay is a follow-up to Southern Comfort and Down-Home Injustice

For over a quarter century, Dr. Steven Hayne and Michael West, DDS, drew healthy paychecks as “expert witnesses” for Mississippi prosecutors. As Balko and Carrington describe, the two doctors maintained a robust schedule of medical examinations, written and oral testimony, and full-time medical practices. Radley Balko himself has been key in bringing national attention to Hayne and West’s quackery, and the injustice they precipitated.

However, reading Balko and Carrington, I felt distinct discomfort. The authors wrote before the COVID-19 pandemic, which brought dueling definitions of expertise, and high-profile media debates about who to trust. Anti-mask and anti-vaccine advocates used almost exactly the same language to disparage Anthony Fauci as Balko and Carrington use to disparage Steven Hayne. This backs me into a difficult corner regarding how we identify and use experts.

Our authors describe the Daubert standard, the current rule of evidence in American courtrooms, which authorizes judges to determine expertise. Essentially, judges make rulings on whether expert witnesses have sufficient standing in an accredited field to voice opinions that mean anything. This requires opposing counsel to be sufficiently well-informed (and sufficiently well-funded) to actually challenge expertise, and judges to know enough to make such rulings.

However, as we’ve seen since COVID, public officials frequently aren’t sufficiently well-informed to make such decisions. The perception of “scientific consensus” may swing on the Availability Heuristic: that is, if a field like Michael West’s bite-mark analysis gets media attention, judges will know of it, and therefore deem it plausible. Bite-mark analysis helped bring Ted Bundy to justice, and still gets admitted in courtrooms, even though it’s widely regarded as pseudoscience today.

In Balko and Carrington’s telling, Hayne’s biggest sin was probably maintaining an autopsy and testimony schedule that made no sense. One autopsy can take four hours, plus weeks of preparation for testimony; but Hayne, at his peak, claimed to complete five or six autopsies per day. The numbers simply strain plausibility.
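A quick back-of-envelope check (my arithmetic, using the figures above, not a calculation the authors perform) shows why:

$$ 6\ \text{autopsies/day} \times 4\ \text{hours/autopsy} = 24\ \text{hours/day} $$

That schedule leaves zero hours for courtroom testimony, for his full-time medical practices, or for sleep, before even counting the weeks of preparation each trial theoretically demands.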

West, by contrast, apparently invented scientific disciplines from whole cloth. He pioneered an ultraviolet bite-mark identification system that supposedly provided ironclad evidence. However, nobody but West ever successfully replicated his process. Worse, his process involved damaging the supposedly bitten tissue itself, ensuring no third party could either verify or dispute West’s findings. Because only West could do it, and nobody could possibly contradict him, his claims were perfectly self-sealing.

Michael West, DDS (Netflix photo)

Without meaningful peer review of Hayne and West’s techniques, they could claim anything, no matter how extravagant. Which, of course, is exactly what Balko and Carrington assert. Hayne and West’s lucrative “expertise” mainly involved courtroom acclamation, not studied give-and-take. In practice, these doctors’ scientific expertise involved oodles of laboratory jargon that mainly served to confuse jurors.

No wonder, when COVID hit America, that so many heel-draggers thought they could answer the Centers for Disease Control with (and I paraphrase) “Nuh-uh!” In this environment, science isn’t a process of inquiry; science is what scientists do. When Anthony Fauci presented steps for preventing disease spread—and importantly, when those prescribed steps changed with new information—his opponents, including the President, challenged his person, not his evidence.

A recurrent social media meme asserts that “if you can’t question it, it isn’t science.” Which is true as far as it goes, but this often reduces to the most ridiculous possible minimum. Popular anti-mask spokespeople, like Judy Mikovits or Stella Immanuel, engaged in fault-finding missions, conflated dissimilar facts to bolster a flimsy narrative, or simply made stuff up. These “questions” were treated seriously by credulous politicians and media giants.

One knock against Fauci was a March 2020 60 Minutes clip where he specifically said mask-wearing wasn’t necessary, at a time when there weren’t enough quality masks to go around. Months later, new information, and more widespread mask availability, led Fauci to endorse mask-wearing. But Facebook and Twitter exhumed that original claim, asserting that Fauci was inconsistent. If a scientist’s claims are inconsistent, his science is disproved, Q.E.D.

Hayne and West emerged from a system wherein expertise was accorded by authority figures, not refined through testing and verification. Courts deemed their statements scientific because they themselves were deemed scientists—and, not coincidentally, because they told state prosecutors whatever they wanted to hear. This wasn’t just bad for individual cases; it redefined “expertise” in ways that privilege individuals over the scientific method.

In that environment, I can’t fault Americans for failing to recognize legitimate science, even when ignorance caused human death. COVID, global warming, and industrial pollution are serious concerns supported by serious evidence. But Americans need responsible individuals to translate that evidence for us, and when our credentialed experts act in bad faith, we’re left paying the price.

Monday, February 6, 2023

Southern Comfort and Down-Home Injustice

Radley Balko and Tucker Carrington, The Cadaver King and the Country Dentist: A True Story of Injustice in the American South

If your loved one was murdered in Mississippi sometime between 1980 and 2010, chances are their postmortem was conducted by Dr. Steven Hayne. That alone should give you cause for concern. But if Dr. Hayne spotted any unusual marks during the autopsy, he would’ve called in Dr. Michael West, the state’s foremost bite-mark expert. Between Hayne and West, there’s an appalling likelihood that the wrong suspect got railroaded.

Radley Balko and Tucker Carrington have strong professional investments in Hayne and West, from their different perspectives. Balko, a journalist with ties to the libertarian Cato Institute and Reason magazine, has written extensively about an American law enforcement apparatus that’s long since parted company with justice. Carrington, an Ole Miss law professor, moonlights as head of Mississippi’s Innocence Project, and has faced down Hayne and West in many a contentious courtroom.

This story begins with two gruesome murders, a sure hook for today’s audience of true-crime podcast listeners. Two very young girls were sexually assaulted and murdered in rural Noxubee County, Mississippi. Despite the murders’ similarities, investigators rushed to suspect men from inside the victims’ own households. Levon Brooks and Kennedy Brewer, who apparently didn’t know one another, were quickly singled out, and police, aided by Hayne, began assembling evidence.

Even though both girls’ bodies had been submerged in water for hours or days, Hayne found suspicious marks on them, and called in West for bite-mark analysis. Here’s where things become significant: neither Hayne nor West identified new suspects, or excluded existing ones. Both “experts” found evidence inculpating the men police already wanted. Both Brooks and Brewer were convicted, Brewer sentenced to die, with no physical evidence except those bite marks.

Our authors aren’t satisfied with naming and shaming two “expert” witnesses who, by the time this book debuted, were already discredited. Their target is larger: they expose a political system that made this injustice possible. That means not only identifying the people who railroaded two innocent men, and almost certainly did likewise to others, but also unpacking how we got here, and where the problem still exists.

Most Americans probably don’t know the difference between a coroner and a medical examiner. The words are used interchangeably in the media where most Americans learn about law enforcement: TV crime dramas and detective novels. But they’re very different positions, with very different responsibilities. Start with the fact that coroners are elected, and therefore beholden to voter sentiment, not necessarily the facts. That should scare you.

Radley Balko (left) and Tucker Carrington

County coroners regularly hired Dr. Hayne, a private pathologist, to perform autopsies, especially in rural counties with limited forensic resources. At his peak, Hayne performed eighty percent of Mississippi’s autopsies, at a breakneck pace of six per day—while maintaining two full-time hospital jobs and giving testimony. Not surprisingly, obvious errors crept into many postmortem reports. Yet Hayne, often with West, remained indispensable to prosecutors for decades.

Exactly why Hayne and West remained popular isn’t difficult to explain: they delivered convictions. Amiable and charismatic on the witness stand, the men charmed juries while baffling them with barrages of medical jargon. Faced with defendants who mostly couldn’t afford competitive attorneys, their bullshit went substantially unchallenged. Hayne and West gave lip service to justice and evidence, but built their careers carrying the prosecution’s water, and got paid well for it.

Balko and Carrington describe a death investigation system driven by convictions, not justice. Most coroners, prosecutors, and (in Mississippi) judges are elected, not appointed. Therefore being perceived as “soft on crime” can kill otherwise promising careers. Prosecutors choose expert witnesses based on their ability to deliver convictions, not their dedication to truth. And if those convicted are disproportionately Black, well whoopsie-daisy, I guess.

Worse, these problems aren’t limited to Mississippi. The lingering legacy of tough-on-crime politics has provided defendants with limited recourse at the federal level. Even when persuasive evidence exists that exculpates those found guilty, judges often feign helplessness. This means the innocent remain imprisoned, but equally awful, the guilty go unindicted, and often, as in Brooks and Brewer’s case, are free to kill again, because the justice system just stops looking.

These authors tell a gripping story of two innocent men, two charismatic experts, and the fatal dance tying them together. But their book also describes a justice system that, in many ways, has advanced little. Expert witnesses serve the same role in modern trials that theologians served in colonial witch trials. The outcomes are heartbreakingly similar, for those of us who believe that justice exists and that we have a duty to find it.

Friday, February 3, 2023

What Marie Kondo Means To Me

Marie Kondo (Netflix photo)

The internet collectively lost its mind last week at the revelation that home management guru Marie Kondo has largely abandoned tidying up. Now a mother of three, Kondo simply has other priorities. The online response has been an embarrassing pile-on, in which I bashfully admit I participated. Millions worldwide felt validated by the revelation that Kondo’s household management strategy doesn’t work if you have kids, a job, or a life.

Now that the heat of excitement has largely dissipated, I’d like to consider what Kondo’s admission really means. Because I don’t think it means we have permission to live sloppy, chaotic lives. Though I can’t confirm it, I’d imagine that the messiest parts of Kondo’s apartment probably look more orderly than the cleanest parts of my bachelor pad. She practiced habits of cleanliness long before real life derailed that ethic.

We humans enjoy seeing the mighty brought low. The public schadenfreude demonstrated whenever Elon Musk finds another way to mismanage Twitter is just one current example. We feel gratified knowing “the great” are fragile like us. Unfortunately, I know something about falling short of goals. As a Christian, an American, and a progressive, I’m a member of several groups that are historically bad at living up to our own standards.

Social psychologists speak of the “Fundamental Attribution Error,” a bias in everyday reasoning. Briefly put, we assume our own misbehavior is a consequence of circumstances, but we assume other people’s misbehavior stems from their personality. When I personally fail to uphold my voiced values—my religious morals, intellectual standards, or just proper etiquette—I’ll tend to blame something outside myself. But when someone else fails, they’re personally culpable.

I see something similar happening with Marie Kondo currently. People who previously defended their own haphazard housekeeping by citing demanding jobs, small children, or whatever, now see Kondo scheduling her life around her kids and construe it as a failure of her philosophy. People who would never blame their politics or religion for their own failings cast exactly those aspersions on Kondo.

But I’d contend that no philosophy ever encompasses every possible circumstance. The entire point of Marie Kondo’s tidying philosophy wasn’t to live permanently inside a Better Homes & Gardens display; it was to practice in moments of limited pressure, training yourself how to think. Because when the pressure hits, when you have kids demanding your time and attention, it’ll be too late to learn how to maintain your house.

Previous moral scandals have erupted because people thought their religion or philosophy protected them from failure. Famed televangelist Jimmy Swaggart preached against sexual indiscretion, then got caught with a hooker. Swaggart wept on television, pleaded for believers’ forgiveness, and then… got caught with another hooker. He believed his faith didn’t just steer him away from sin; he believed it made him invulnerable to sin.

Other examples accumulate. We know teenagers who sign “virginity pledges” are no less likely than the general population to actually have premarital sex. However, because they believe they’re morally immune to premarital sex, they’re less likely to have birth control available when it actually happens. Because these teenagers didn’t practice their moral responses in low-pressure conditions, they had no response when the pressure was on.

Paging Bristol Palin, stat.

Moral philosophies, whether Christianity or pragmatism or Marie Kondo’s systematic cleanliness, aren’t accomplishments one finishes and owns forever. One doesn’t rest on moral attainments like a fat puppy. Rather, we need to exercise our moral positions in safe environments of controlled pressure, just as athletes practice weight training in the gym between bouts on the field. Practice doesn’t make perfect; rather, I’d say that practice makes prepared.

Permit me to overextend the athletic metaphor. Anybody who’s worked out knows the person able to lift the heaviest gym weights isn’t necessarily the strongest person. Rather, somebody with practiced form and moderate strength can outlift somebody with sloppy form and great strength. That’s what Marie Kondo, and other philosophers, should offer us: not strength, but form. Because we still have form even when our strength inevitably fails us.

We haven’t seen Marie Kondo’s post-kids house. But I’d bet money that we’d enter and, while she blushed and apologized for the mess, we’d gawk at nearly everything in its place and free of dust. Because through years of practice, she’s learned the habits of making good decisions about what to keep or discard. She knows what she values. And unlike us, she knows the right way to get it.

Wednesday, February 1, 2023

The Only Path Through the Midnight Woods

Sarah Hollowell, A Dark and Starless Forest

Nine children and their teacher live a cloistered life in a secluded house in northern Indiana, practicing their magic and avoiding the outside world. Though these children, mostly girls (and one enby), will eventually grow into awesome powers, they’re currently too young to defend against a hostile world. Until they learn, they live by a few simple rules. Don’t use magic recklessly; don’t go into the forest; and above all, don’t cross Frank.

This novel’s Shirley Jackson-esque premise drew me in without hesitation. Like Jackson, Hollowell places her characters in a setting that physically isolates them from the larger world, forcing them to rely upon one another. Then she makes them, in different ways, unreliable. Throughout the novel, some oppressive presence looms constantly over the characters. But what is that presence and what does it want?

Derry, our first-person narrator, has a deeply conflicted relationship with her surrogate father, Frank. He’s taught her to control her superpowers since her parents unceremoniously dumped her at his doorstep. Frank is gentle, paternal, and patient, as long as Derry and the other students comply. But he harshly punishes the slightest sign of willfulness. And Derry is a restless teenager, learning to chafe at his authority.

Meanwhile, Derry has a secret: she and a few other students have discovered a hidden tunnel leading to the forbidden forest. A tunnel only magically gifted students can access. This liberates them from Frank’s often arbitrary dominion, but at a cost: Derry and her sister Jane saw something in the wood. Derry won’t tell us what, but when Jane apparently disappears into darkness one summer midnight, Derry believes it’s connected.

Hollowell creates a world defined by narrowness and constriction. Frank collects his students from parents frightened of their children’s nascent powers. Yet as the story develops, and Frank seems no closer to discovering where Jane went, Derry slowly realizes they only know what Frank tells them. What motivates Frank to teach his students, while protecting them from a putatively hostile world? He won’t tell, and Derry can’t guess.

Sarah Hollowell

To find answers, Derry must improve her supernatural abilities, and use them in ways Frank has never taught her. Though Derry describes her fellow students’ powers as magical, and Frank calls his students “alchemists,” the students don’t learn diverse spells from grimoires. Rather, they have unique powers which Frank helps them cultivate. The school thus looks less like Hogwarts, more like Xavier’s School for Gifted Youngsters.

Outside Frank’s door, the forbidden forest starts calling Derry. Soon enough, she breaks one of Frank’s inviolable rules and ventures in alone. There she finds a mysterious figure who promises to reveal the truth which Frank has spent years concealing. But how can Derry decide who’s telling the truth? And what must she do when the forest begins demanding more from her than she feels prepared to give?

It’s tempting to seek parallels in Hollowell’s narrative. Derry, our narrator, resembles Hollowell herself: bespectacled, fat (her word, not mine), and gender-nonconforming. The symbolism of the isolated childhood, the authoritative father figure, and the desire to see the outside world, are pretty glaring. On a tenth-grade book report level, Hollowell’s themes of young adulthood and the need to break parental chains loom large.

But such one-to-one interpretations place limits on Hollowell’s gripping, fast-paced story. Derry wants answers, while Frank continues treating her and her foster family as permanent children. One after another, Frank’s students vanish, first under darkness, later in broad daylight. They leave no trace, and Frank’s efforts to maintain order become increasingly draconian. Derry ultimately wants what everyone wants: to see the world with her own eyes.

Hollowell handles the tone shifts an engaging story requires. Derry describes the house, the forest, and her foster family with the lush detail you’d expect from Frank’s low-tech curriculum. But when she needs more muscular storytelling force, she has it, and key scenes explode with vigor. The book runs over 350 pages without ever feeling long. And she keeps twelve mononymic characters in constant play without cluttering the story landscape.

This book is marketed as a Young Adult novel, which perhaps isn’t surprising, considering its mainly young cast and coming-of-age themes. But like the best children’s literature, it offers plenty for adults: themes of authority versus independence, for instance, and exactly how terrifying the outside world really is. Hollowell writes with nail-biting tension that keeps readers up past their bedtimes, and she tells a story as timely and pertinent as it is scary and fun.