Sunday, March 31, 2019

Robert Mueller, Failed Messiah

Robert Mueller
I felt a certain weary familiarity in the recent revelation that the long-awaited Mueller Report wouldn’t unseat the President. For around two years, Blue Facebook and Blue Twitter impatiently awaited the magic text that would depose the usurper, reverse injustice, embolden the keepers of truth, and make everything good again. I kept my yap shut, trepidatious about upsetting people already on edge, but we’ve seen this before: True Believers disappointed in yet another secular Messiah.

As religion has receded from modern life, the impulse for moral clarity and deliverance remains ruggedly persistent. Émile Durkheim described this in 1912: though faith in transcendence was dwindling, human-scale forces had stepped into the gap. Father, Son, and Holy Spirit were displaced by Liberté, Égalité, Fraternité. Gilded saints’ icons gave way to heroic portraits of Washington, Lafayette, or bare-breasted Liberty On The March. The justification was worldly, but the language was clearly religious.

But just as the Second Coming keeps not happening, so does salvation through philosophy or politics. Neither Utilitarian happiness nor Nietzschean power has made anyone one whit better a human being once they get hungry or angry enough. During the Cold War, the language of Capitalism versus Communism took on undeniably eschatological trappings, as both promised some measure of salvation. But both proved disappointing, terminally inept Messiahs, forcing True Believers worldwide to continue the frustrating search.

Robert Mueller becomes the latest in whom Americans invested a Life of Brian-like zeal he neither courted nor accepted. He was predicted to save, or damn, the nation, depending on one’s political viewpoint. Progressives likened him to Dr. King; conservatives, to Julius and Ethel Rosenberg. Nobody seems to have considered him a professional, simply following the evidence where it leads and permitting actual prosecutors and courts to indict or dismiss, which seems the role he preferred.

Progressives are responding to the disappointment surrounding Mueller in two ways. Serious lawyers and legal scholars smile benignly and say they knew this all along, that they never really promised one simple investigative report would overturn an election and impeach a President. Anyone who’s watched the news during the last two years knows this is buncombe, of course. Like Cleopas on the road to Emmaus, they’ve dejectedly forgotten their former messianic fervor and returned to drudgery.

More numerous, especially among late-night comedians and the punditocracy, are those doubling down on what they previously believed. Release the actual report, they demand, insisting that if they see the original text, gnostic wisdom will descend and everyone’s eyes will be opened. Like the religious devotees Leon Festinger investigated in his classic When Prophecy Fails, the mere absence of the foretold incarnation doesn’t dim their belief. If anything, it makes True Believers believe even harder.

“Liberty Leading the People,” Eugène Delacroix, 1830
Political revolution presented as undisguised secular religion

The religious qualities of both responses are telling. The disappointed return to awaiting the next Annunciation, while the fervent resume evangelizing. Audiences familiar with religious history, however, can already predict what happens next. Occasional religious innovators like Jeremiah, Jesus, Mohammed, or Joseph Smith create movements that outlive them. Most would-be prophets, however, burn out quickly, leaving adherents looking for another prophet to alleviate their despondency. Dwindling numbers of the faithful dedicate their lives to evangelizing the converted.

Certainly I understand this impulse. Progressives imbue Robert Mueller with their moral aspirations, because if he can reverse the perceived injustice of a racist, misogynist also-ran in the Oval Office, then obviously we live in a righteous universe. Like the apostles, progressives wanted Mueller to overthrow tyranny and lead the Chosen People into a Golden Age, in this life. They don’t want to live forever with God, or whatever God substitute we’ve chosen this week.

Theologians debate why Jesus needed to die ignominiously—scorned, humiliated, and naked. Whatever the reason, though, it surely outstrips the fate awaiting Robert Mueller as disgraced Messiah, returning to anonymity, a government careerist in a government that no longer wants him. Meanwhile, his would-be apostles are already scrambling for the next Messiah who, they hope, will deliver them from this evil and restore moral certainty. It would be hilarious, if America weren’t on the line.

Unfortunately, this cycle will repeat itself indefinitely, as long as humans perceive a gap between their conceptions of justice and the visible world around them. In former generations, we believed a literal God would restore harmony someday, or perhaps that centrifugal force would pull everything apart in a catastrophic Ragnarok. Now we swing between hopes of political salvation and fears of Armageddon via global warming. The impulse remains the same. Only the name we give salvation ever changes.

Wednesday, March 27, 2019

The Passion, the Pride, and the Copyright Law

Promo still from Disney's The Lion, the Witch and the Wardrobe

I never heard of Francis Spufford, the British literary critic turned novelist, until this week, when he garnered stray headlines over a book he’ll almost certainly never publish. Spufford wrote a book intended to close a gap I didn’t realize existed in C.S. Lewis’s Chronicles of Narnia, a work which the few people who have read it describe as unacknowledged genius. But he wrote this book without first getting clearance from the Lewis estate.

My opinions on today’s overly long copyright protections are already well-documented. In countries signatory to the Berne Convention (which is most of the world), artistic works are protected for a minimum of fifty years after the creator’s death. But that’s only a minimum: Britain protects works for seventy years after the author’s death, while America protects works published before 1978 for ninety-five years after publication. So Narnia won’t enter the public domain, or support unauthorized derivative works, in the English-speaking world’s two largest markets until 2034 in Britain and 2046 in America.
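For the curious, here’s that arithmetic as a back-of-the-envelope sketch in Python, assuming the commonly cited rules (life plus seventy in Britain; ninety-five years from publication for pre-1978 American works) and remembering that copyright terms run to the end of the calendar year:

```python
# Back-of-the-envelope public-domain arithmetic for the Narnia books.
# These are the commonly cited rules of thumb, not legal advice.

LEWIS_DIED = 1963        # C.S. Lewis died in November 1963
FIRST_PUBLISHED = 1950   # The Lion, the Witch and the Wardrobe

# Terms run through December 31 of their final year, so works enter
# the public domain on January 1 of the following year.
uk_public_domain = LEWIS_DIED + 70 + 1        # UK: life + 70 -> 2034
us_public_domain = FIRST_PUBLISHED + 95 + 1   # US: 95 years from publication -> 2046

print(uk_public_domain, us_public_domain)  # prints: 2034 2046
```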

Nearly seventy years after the first volume dropped, Narnia remains among those rare few books that actually continue making money for their author’s heirs. It remains a steady seller at Christian and mainstream bookstores, and has been adapted for mass media twice, by the BBC and Disney. (A Netflix adaptation has been reported.) Most authors make all the royalties they’ll ever make on their books in the first year, but Narnia remains lucrative three generations later.

Meanwhile Professor Spufford, like most authors, needs a day job to subsidize his writing, in his case as a writing instructor. His works are well-regarded, by people whose job it is to regard such things, but not widely read. Not only will his work probably not remain in circulation in seventy years, I’ve had some difficulty tracking it down today; if I wanted his books, I’d need to pay to have them imported from Britain.

Which is where this becomes interesting.

Arguably, Spufford has committed a PR coup. I’d never heard of Professor Spufford before this story erupted this week, but investigating him to write this essay, I discovered that his criticism dovetails with a nonfiction book I’m writing; I have a title on order. Congrats, Professor, you made a sale. Even without publishing the controversial novel, Spufford has ridden Lewis’ coattails to relevance, selling some copies along the way.

Francis Spufford
Spufford claims to have written a novel that reconciles the gap between The Magician’s Nephew, Narnia’s creation myth, and The Lion, the Witch and the Wardrobe, the first-published Narnia book. I hadn’t realized there were any particular inconsistencies; like Brian Herbert and Kevin J. Anderson’s Dune continuation novels (based on Frank Herbert’s notes), nobody knew there was a missing middle until somebody filled it. And the few critics who’ve seen it apparently praise Spufford’s writing.

This isn’t without precedent. Novelist Gregory Maguire made his name rewriting popular children’s fantasies from the antagonists’ viewpoint. His breakout novel, Wicked, forced audiences to reëvaluate their preconceptions about The Wizard of Oz. But Maguire waited until the source material went out of copyright. Admittedly this was easier before the Copyright Term Extension Act of 1998 extended protections until almost Doomsday.

And, just a brief reminder: that copyright extension was pushed through by lobbyists paid by Disney.

This bears mentioning because Disney made its reputation adapting works from the public domain. Their classic The Jungle Book (1967) hit theatres only one year after Rudyard Kipling’s copyright protections ended, which, given the long lead time in movie production, means work began while the book was technically still protected. Nevertheless, both Disney and Maguire established their names repurposing society’s common cultural pool, not by taking property that technically belongs to somebody else, or their estate.

I’m not sure Spufford is wrong, though. Narnia has become so widespread in popular culture that it’s unmoored itself from its author. Considering just one example, American author Lev Grossman’s The Magicians clearly takes Narnia as its inspiration, changing proper nouns just enough to create plausible deniability. The result masterfully addresses Lewis’ deep moral omissions. Grossman, like Spufford, engages with Narnia’s widespread cultural influence. He just cooperates with established intellectual property law to do so.

In short, I find myself torn. Narnia has become widespread cultural mythology, a shared experience many readers recall fondly from childhood. But it also remains somebody’s livelihood, and Spufford’s decision not to collaborate with the Lewis estate bespeaks a certain intellectual arrogance. Which, as I struggle to establish my own writing career, I almost admire, since modesty doesn’t launch an arts career. Maybe more writers should kick the beehive. Maybe Spufford is the hero we need.

Monday, March 25, 2019

Paying the Monster For His Monstrosity

Jon Finch (left) and Francesca Annis in Roman Polanski's Macbeth

Roman Polanski’s adaptation of Shakespeare’s Macbeth is among my favorite movies. I owned it on VHS, and now I own the DVD. The way Polanski preserves Shakespeare’s poetic language while abolishing academic theatre’s false decorum makes it a classic, and his seamless integration of bloody tableaux reminiscent of the Manson Murders that took his wife, Sharon Tate, makes it intensely personal. I’ve shown it to friends with the same pride as Casablanca and Butch Cassidy.

In the 1990s and early 2000s, I could overlook Polanski’s history as a confessed, convicted child rapist, basically because everyone else did. Polanski continued winning Oscars even after admitting kiddie diddling in court, because his movies remain among the best cinema ever made. But our standards shifted in the 2010s. Critics and film scholars have become less willing to separate art from artist. Older audiences, like me, become, to an extent, accessories after the fact.

I recalled this while reading Caitlin Flanagan’s The Art of a Monster, examining whether audiences can still enjoy Michael Jackson’s music, knowing he was a sexual predator. Flanagan says yes. Even with the increased scrutiny following HBO’s Leaving Neverland documentary, Flanagan writes: “Art isn’t something mere; it doesn’t exist as the moral bona fides of the person who made it. That person is a supernumerary.” To Flanagan, art, including Jackson’s, provides its own moral justification.

Yet I can’t get past one nuance: as long as Jackson’s work remains under copyright, his estate continues drawing royalties from its distribution. When he was still alive, this payment subsidized his bizarre kid-centric love nest; now, it probably goes toward the out-of-court settlements he doled out like autographs. Which means, if I download his albums, I’m probably paying the ongoing cost of his known predilections. Doesn’t that make me part of the problem?

Nor, anecdotes suggest, are Polanski and Jackson alone in this. Charlie Chaplin, Errol Flynn, and David Bowie all had a known taste for underage girls. Bill Cosby and R. Kelly have been charged and/or convicted of rape; Woody Allen, Kevin Spacey, and Ben and Casey Affleck remain uncharged but publicly accused. Until recently, the “casting couch horror story” was one of Hollywood’s acknowledged career speed bumps. We simply used to take that shit for granted.

Left to right: Roman Polanski, Bill Cosby, and David Bowie

I dare not create an itemized list of these substantiated accusations; not only is it too long, it’s morally degrading. It’s natural for America’s culture industry to take stock as we approach the ten-year anniversary of Michael Jackson’s death this summer. Yet the longer the self-scrutiny continues, the more difficult it becomes to avoid the conclusion: creative men (and I do mean men) are damaged people who, as damaged people do, inflict their damage on others.

And, as long as their work remains under copyright, any transaction on some level subsidizes their damage. It’s too late to avoid giving Roman Polanski money; I already bought my DVD, and he already received his royalty (which, for my purchase, probably totalled about a dime). But what if I want to watch The Pianist? Chinatown? Rosemary’s Baby? Caitlin Flanagan can reassure herself that Michael Jackson is dead, and therefore not profiting. Polanski soldiers on.

That doesn’t even account for artists who, themselves, did nothing wrong, yet became collateral damage. As I wrote nearly a year ago, Harvey Weinstein’s public implosion took several striving young artists with him; I singled out actress Paige McKenzie simply because I know her work. (McKenzie later tweeted me that her experience was “more of development hell, less #metoo, Thankfully.” That’s reassuring, but it still means she had dreams within her grasp, only to have them ripped away again.)

Michael Jackson, Bill Cosby, David Bowie, and Roman Polanski created art that changed the world, arguably making life somewhat better for their having been in it. Caitlin Flanagan assumes art justifies itself, separate from the artist. After some initial resistance, I must concede she’s right. These awful human beings created transcendent art distinct from themselves. But even if the art doesn’t justify the artist, it does subsidize the artist, which carries its own moral weight.

For economic reasons, I cannot reconcile great art with the flawed people who create it. Or, more accurately, with the people who perpetrate their flaws upon others. The issue, fundamentally, isn’t moral, it’s economic: every dime given to Polanski is another dime keeping him unaccountable for his crimes. Because art, ultimately, is also a commodity, and our appreciation is their paycheck. We can enjoy a monster’s art, but can we, rightly, subsidize a monster’s monstrosity?

Thursday, March 21, 2019

The Frontier, the Fortress, and American Mythology

The frontier as depicted by Currier and Ives

It’s impossible to escape America’s history with Manifest Destiny. If we believe, as Frederick Jackson Turner did, that American democracy begins with our frontier—and I think we should at least consider the concept, given the ubiquity of cowboy hats in American politics—then the American desire to expand outward, taking our beliefs and culture and generalized Whiteness with us, is the core American idea. At least, it used to be.

Working, as I do, in a blue-collar occupation surrounded by White people in the most segregated workplace I've ever seen, I’ve become immersed in the language of White American conservatism. I’ve seen who and what the political Right considers important, not only in my co-workers’ love for loudly blasted talk radio, but also in which talking points they repeat to one another. And I’ve noticed a change, from a frontier myth to a fortress myth.

This begins with the literal. The fetishization of The Wall has become a point so obsessively repeated, I think it wouldn’t go too far to call it liturgy. Chanting “Build That Wall” has the same group identity value as “I believe in God the Father almighty.” Where once American identity involved carrying our values (and race) into the mysterious distance, American-ness has become something we must physically defend against others coming to us.

This marks an important departure from American history. Where our White European ancestors had to live inside walled cities, fearing bandits and barbarians and invading armies, Americans took pride in living on the land they worked. American expansionism often needed temporary stockades when the Army took the leading edge, but once the land was securely in White hands, Americans boasted we didn’t need to live behind walls.

Before anybody interrupts, let me quickly say I’m aware that the “frontier” didn’t objectively exist. First, White armies needed to actively chase Indians (and, frequently, the runaway slaves they harbored) off ancestral land. Then, less obviously, the first poor White settlers needed to be chased off the land they’d reclaimed, so rich Whites could profit off it (see Nancy Isenberg). In reality, Americans revere the abstract frontier, not the actual lived experience.

Nevertheless, the concept of “going forth into the wilderness and building a civilization” looms large in American mythology. Nation-building wasn’t just a moral good, but a religious calling executed with religious zeal, sometimes literally, as spreading Christianity was synonymous with spreading America. From the first time John Winthrop called Massachusetts a “city upon a hill,” America defined itself first by action.

The frontier as depicted in current American politics.

But even beyond the literal fortress mentality, walling America off at the southern border, Americans have increasingly accepted a more metaphorical fortress mentality. Our current administration has withdrawn us from the Paris Climate Accord and the Trans-Pacific Partnership, and moved to replace the North American Free Trade Agreement. It’s instituted protectionist economic policies unseen since the Gilded Age. It even threatens to withdraw from NATO, an unprecedented move.

Like a medieval village under attack from marauders or the Mongol Horde, America has pulled its resources inside the wall, barred the gates, and decided to endure the outside world privately. While our candidates continue to wear cowboy hats and sing country music, our actions reflect the image of the lonely skald strumming his lute to distract the villagers while we outlast the faceless attackers beyond the wall. We have surrendered our doer heritage.

Let me repeat, I’m aware how loaded and explosive America’s frontier mythology is. I know it’s been abused by power-hungry demagogues to mislead crowds about where we’ve come from, and where we’re going. In order to repurpose American frontier myths for the Twenty-First Century, America’s visionaries will need new ways of seeing the outside world, ways that don’t involve conquest, expropriation, and ever-expanding Whiteness.

But surely, if seeding Whiteness throughout the world is poor policy, so is ensconcing Whiteness behind a literal wall and disengaging from the larger world. American policy has apparently collapsed into a binary: we must conquer, or be conquered. Since our most recent conquests, in places like Iraq and Afghanistan, quickly turned into massive national embarrassments, our elected officials have shifted into a defensive posture and hunkered down.

I dare not advocate some return to a rosy-eyed view of America’s past. We already know America’s past had some pretty awful components, and misplaced nostalgia is creating problems in our explosive present. But if we want America to have a future, and I believe some of our values are important enough to preserve, then we need a better mythology than the fortress. Because we can’t live long under constant attack.

Sunday, March 17, 2019

Our Lady Mary Magdalene of Ireland

1001 Movies To Watch Before Your Netflix Subscription Dies, Part 30
Peter Mullan (writer/director), The Magdalene Sisters

In 1964, three women from different parts of Ireland find themselves ripped from their lives and forced into the Magdalene laundry. Overseen by battalions of hard-faced nuns, the girls, none older than twenty, are forced to toil as penance for sexual sins most haven’t even committed yet. They struggle under the convent’s harsh rule, which aims to control their very souls. But they keep one eye on the outside, and plan for their eventual escapes.

Officially called “Magdalene Asylums,” the Magdalene laundries started out as places prostitutes and other “fallen women” could rebuild their lives and achieve redemption. Many were established throughout the world, including Britain and America; the Irish laundries, however, became an unmitigated horror show. When the nuns operating the laundries discovered they were making a profit, their original Christian mission went by the wayside.

Writer-director Peter Mullan focuses on three women among the dozens held captive at the laundry. Rose (Dorothy Duffy) is an unmarried mother, anathema in a Catholic country. Margaret (Anne-Marie Duff) was raped. And Bernadette (Nora-Jane Noone), an orphan raised in a church home, is simply too pretty and too receptive to boys’ attention; the nuns at her orphanage believe she’ll inevitably produce more orphaned children for the church to raise.

All three get shipped to the laundry by relatives or caretakers—it’s somewhat murky where the story takes place, though it was inspired by UN reports of abuses at a laundry based in Cork. Mother Superior of the facility, Sister Bridget (Geraldine McEwan), comes across as soft-spoken and amiable to her wards. However, it quickly becomes clear she relishes power, demonstrated by moments of casual sadism, and cares mostly about money.

The girls remain trapped in the laundry for four years, working ten-hour days and six-day weeks. As the only reliable operation able to process the washing produced by a large swath of Ireland, the laundry is in constant demand. And we see it pays well: the nuns eat buttered toast and bacon for breakfast. The girls who do the actual work, however, eat oatmeal and water.

Worse, the girls are subject to constant abuse. Not only are they overworked by the nuns, and physically punished for insignificant infractions, but the priest, Father Fitzroy, who wants to reform the laundry, becomes corrupted by the culture and starts sexually abusing a developmentally disabled girl. The men who drive the delivery lorries, meanwhile, the girls’ only contact with the outside world, often trade favors for sex.

Mother Superior (Geraldine McEwan) leads a line of trapped workers (left to
right: Anne-Marie Duff, Nora-Jane Noone, and Dorothy Duffy) in The Magdalene Sisters

Mullan focuses his drama on character, letting larger history speak for itself. Unlike Neil Jordan, writer-director of Michael Collins, Mullan doesn’t lecture about history, or make Irish facts digestible for international audiences. He instead forces characters into an intolerable situation, and lets their actions speak for themselves. His heroines have two choices: either conform to a corrupt system, or break out by force.

The Magdalene nuns repeatedly promise the girls that, once they’ve achieved salvation, they’ll be permitted back into the world. As the years drag on, however, and the girls find themselves unconsciously mimicking the power hierarchies that control them, we start to realize: not everyone will escape. They’ve internalized the nuns’ system of abuse. The convent has lost interest in salvation; as Bernadette once observes, they only care whether the work gets done.

The cozy relationship between Church and government during the early Irish Republic often corrupted both institutions. Police helped dirty priests cover their sins, while idealistic young clergy often tried to change the system from within, but found the system changing them. Ireland consistently proves a point I’ve long believed: individual Christians often create powerful good, but the Church, like any other institution, serves mainly to protect itself.

In 1993, long after the events depicted here, property developers working land formerly owned by Dublin’s Magdalene laundry uncovered a mass grave containing 155 skeletons of unidentifiable girls. By this time, the Dublin convent was Ireland’s last Magdalene laundry, and the outcry generated by this discovery forced its closure. Only after the laundries ended did anyone officially discuss their existence, or the church-state relationship that made their abuses possible.

Peter Mullan made this movie partly to raise awareness of the Magdalene abuses, which weren’t officially redressed until 2013. Within Ireland, this movie helped make these crimes visible, but international audiences should watch too. Only by staring directly at the history of religious intolerance and state corruption can mass populations, Christian or secular, ensure these crimes aren’t repeated. Because bigotry like this still exists in our world.

Monday, March 11, 2019

Two Modern Dogmas

Back when I worked at the factory, I remember a co-worker approaching me at breaktime, holding a bottle of 7-Up he’d just bought from a vending machine. My co-workers often treated me as smart and authoritative because I have a good head for memorizing facts, and I’d grown accustomed to answering whatever questions might arise during the day. This guy pointed to some words printed on the label. “It says here ‘All Natural,’” he said, smiling smugly. “That means it’s good for you, right?”

Part of me hated to let the poor guy down, but not enough to shut me up. “You gotta be careful,” I said, “that’s one of the ways they fool you. They depend on you to think that way. But there’s no legally or scientifically binding definition of the word ‘Natural.’ You have to do more research than that to keep healthy.”

I recalled that exchange, and my co-worker’s crestfallen face, during a more recent disagreement. My friend, whom I love and respect, nevertheless said something I completely disagree with: that America can never abandon high-tech farming, despite its lousy environmental impact and its grotesque overproduction of food that often gets landfilled, because if we do, we’ll run out of affordable food, and the poor will suffer.

My friend is a big believer in GMOs and their potential to produce healthier, more abundant food. This despite the fact that they haven’t done so, and most GMOs have proven to be more expensive, more bland-tasting, and generally more disappointing versions of existing food crops. In the unlikely event my friend reads this essay, I’m confident he’ll feel motivated to defend his existing positions by asserting I’m just an enemy of “science.”

So we have two conflicting attitudes, which I’ve encapsulated in two people I know personally, though I’ve seen both repeated by other people and in mass media. On the one hand, we have the belief that “natural” means accord with human needs, and a general tinge of moral goodness. On the other, the belief that “science” is a humane progress through layers of understanding to the light of secular salvation.

Both attitudes are wrong.

The belief in nature’s goodness, as a sort of countercultural push against the heedless embrace of technology, is so completely mistaken that it has its own name: the Appeal to Nature Fallacy. It’s sometimes called the 100% Natural Fallacy, after a popular brand of breakfast cereal sold in the 1980s. From this fallacy, we get quack medicine, homeopathy, herbal “medicines,” and gullible people drinking their own pee.

The opposing belief, that what’s created in a laboratory is superior to dirty old nature, isn’t widespread enough to have its own name. Yet it demonstrates a remarkably similar willingness to trust a dogmatic interpretation of evidence. Human ingenuity gets presented as innately morally good, and scientific advance becomes an end in its own right. But it requires an equal willingness to trust an abstract conviction without question.

Charles Darwin
When my friend argues, and he has, that we shouldn’t worry about mechanized farming damaging soil fertility, because we can replace lost fertility with synthetic chemicals, he makes the exact same appeal as homeopathy: that whatever made us sick will also make us well. This may have made sense sixty years ago, when we had less empirical evidence. We formerly had to venture into new territory without a map.

But we don’t anymore. We have abundant evidence that peach pits don’t cure cancer, trace amounts of arsenic don’t reverse poisoning, and petroleum-derived fertilizers burn the soil, making future harvests less abundant. Blind trust in either nature or science has produced serious consequences, even cost lives. Experience tells us that calling something “natural” doesn’t make it healthful. And calling something “scientific” doesn’t make it good.

I don’t write this to accuse either my co-worker or my friend personally. Both men simply want a concise, intellectually coherent explanation for today’s difficult and often inconsistent circumstances. I frequently catch myself doing likewise. Unfortunately, modern complexity doesn’t permit such prefab consistency. Clinging to dogmas we never test against evidence serves the same impulse religion formerly served.

Tragically, both “nature” and “science” make pretty poor gods. From tapeworms to opioids, both have a track record of turning viciously on their worshipers. It isn’t comforting to say we have to review the evidence constantly, especially in today’s environment, saturated as it is with information. Yet we have to. The era of comforting dogmas, the era both men are seeking, is long over.

Thursday, March 7, 2019

The Will of the People, and the Won't

Do we think these people represent the majority of Americans?

I remember, back in 1992, when Colorado passed Amendment 2, a state constitutional provision ensuring neither the state, nor any local government chartered by the state, could provide any specified legal protections based on sexual orientation. It was the first state law anywhere specifically targeting sexual orientation as a non-protected status, and it passed with a straight majority of Colorado voters, by a six-point margin.

However, it never took effect, because court challenges paused implementation. The amendment, which again passed a statewide referendum with a simple majority, languished in legal limbo for nearly four years before the U.S. Supreme Court struck it down in 1996. At the time, I lived in Ogallala, Nebraska, fifteen minutes from the Colorado line. And I began hearing the common line: “Why can’t politicians just give people what they want?”

A friend of mine began answering that question with a response I’d never previously considered: “The people of Alabama under George Wallace wanted segregation by force. Should America have given it to them?” I doubt that argument changed anybody’s mind, because we know facts seldom persuade anybody whose positions are ironclad. However, it certainly ended the conversation, because people never had any counterclaim that didn’t make them sound ringingly bigoted.

I thought about these two clashing arguments recently when I read about a recent Kaiser Family Foundation survey, which found a straight majority of Americans approve of Medicare For All—at least hypothetically. That support goes up or down, conditionally, when surveyors begin explaining different ways politicians might implement the policy in real life. But at least in theory, Americans support extending Medicare provisions to everybody, regardless of ability to pay.

We could go on: a majority of Americans think abortion should be legal in most or all cases. Most Americans think we should protect the unemployed, the elderly, and the environment. Popular support for the Green New Deal runs above eighty percent, commanding a clear majority of even Republicans. Straight majorities support America’s longstanding weak-tea social platform, and depending on the survey, a majority even favor expanding it.

But majorities have historically approved some pretty awful stuff. Recent complaints about “gentrification” make the mass removal of Black and Brown communities from cities seem sudden and urgent, yet highly popular urban revitalization projects in postwar America, many supported by a majority of returning White veterans, so disproportionately hurt urban POC communities that James Baldwin famously called urban renewal “Negro removal.” That’s just one ready example.

Do we think these people represent the majority of Americans?

Reasonable people, studying history, would probably conclude that just because a majority wants something, doesn’t mean they should receive it. But categorically denying majority appeals defies the principles upon which liberty-minded republics, like America, were founded. If a sufficient proportion of the populace demands something, we should consider why they consider this goal desirable, even if we don’t necessarily acquiesce to their demands.

The largest number of Americans want clean water, clean air, clean soil. They want citizens to have access to top-quality medical care, unconstrained by their private economic circumstances. They want work, and not just meaningless work as bean-counters or burger-flippers, but work which actively contributes to improving humanity’s condition on this fleeting globe. To my Distributist-minded philosophy, these sound like perfectly reasonable demands.

But what if people’s demands change? History demonstrates that very rapid changes in widespread political philosophy can reverse themselves equally rapidly. Only in the last fifteen years have majorities of American voters approved gay-rights initiatives by ballot; remember that California’s Proposition 8, now one of the most widely reviled pieces of legislation, received a straight majority in one of America’s most progressive-minded states. Much like Colorado’s Amendment 2.

If the people are free, it follows that the people are free to be wrong.

At this writing, a bill banning anti-LGBTQ discrimination has stalled in the Nebraska legislature. It’s difficult to gauge, in a state like Nebraska, how widespread any support might be for anti-discrimination measures; but multiple state agencies, private companies, the mayors of the state’s two largest cities, and several philanthropic groups have endorsed the measure. I find little evidence of organized opposition, yet the bill remains stalled.

The people wanting something, anything, doesn’t necessarily justify that thing. America’s constitutional checks specifically aim to thwart an unjust majority. But if enough people believe something, shouldn’t we at least seriously consider that thing? Well, no, because majorities can shift overnight. I wish I had the answers, or at least more convincing evidence. But the longer I contemplate this question, the more adamant and insoluble the questions I uncover become.