Thursday, March 21, 2019

The Frontier, the Fortress, and American Mythology

The frontier as depicted by Currier and Ives

It’s impossible to escape America’s history with Manifest Destiny. If we believe, as Frederick Jackson Turner did, that American democracy begins with our frontier—and I think we should at least consider the concept, given the ubiquity of cowboy hats in American politics—then the American desire to expand outward, taking our beliefs and culture and generalized Whiteness with us, is the core American idea. At least, it used to be.

Working, as I do, a blue-collar occupation surrounded by White people in the most segregated workplace I've ever seen, I’ve become immersed in the language of White American conservatism. I’ve seen who and what the political Right considers important, not only in my co-workers’ love for loudly blasted talk radio, but also in which talking points they repeat to one another. And I’ve noticed a change, from a frontier myth to a fortress myth.

This begins with the literal. The fetishization of The Wall has become a point so obsessively repeated, I think it wouldn’t go too far to call it liturgy. Chanting “Build That Wall” has the same group identity value as “I believe in God the Father almighty.” Where once American identity involved carrying our values (and race) into the mysterious distance, American-ness has become something we must physically defend against others coming to us.

This marks an important departure from American history. Where our White European ancestors had to live inside walled cities, fearing bandits and barbarians and invading armies, Americans took pride in living on the land they worked. American expansionism often needed temporary stockades when the Army took the leading edge, but once the land was securely in White hands, Americans boasted we didn’t need to live behind walls.

Before anybody interrupts, let me quickly say I’m aware that the “frontier” didn’t objectively exist. First, White armies needed to actively chase Indians (and, frequently, the runaway slaves they harbored) off ancestral land. Then, less obviously, the first poor White settlers needed to be chased off the land they’d reclaimed, so rich Whites could profit from it (see Nancy Isenberg). Americans realistically revere the abstract frontier, not the actual lived experience.

Nevertheless, the concept of “going forth into the wilderness and building a civilization” looms large in American mythology. Nation-building wasn’t just a moral good, but a religious calling executed with religious zeal, sometimes literally, as spreading Christianity was synonymous with spreading America. From the first time John Winthrop called Massachusetts a “city upon a hill,” America defined itself first by action.

The frontier as depicted in current American politics.

But even beyond the literal fortress mentality, walling America off at the southern frontier, Americans have increasingly accepted a more metaphorical fortress mentality. Our current administration has withdrawn us from the Paris Climate Accord and the Trans-Pacific Partnership, and moved to scrap the North American Free Trade Agreement. It’s instituted protectionist economic policies unseen since the Gilded Age. It even threatens to withdraw from NATO, an unprecedented move.

Like a medieval village under attack from marauders or the Mongol Horde, America has pulled its resources inside the wall, pitched its tent, and decided to endure the outside world privately. While our candidates continue to wear cowboy hats and sing country music, our actions reflect the idea of the lonely skald strumming his lute to distract the villagers while we outlast the faceless attackers beyond the wall. We have surrendered our doer heritage.

Let me repeat, I’m aware how loaded and explosive America’s frontier mythology is. I know it’s been abused by power-hungry demagogues to mislead crowds about where we’ve come from, and where we’re going. In order to repurpose American frontier myths for the Twenty-First Century, America’s visionaries will need new ways of seeing the outside world, ways that don’t involve conquest, expropriation, and ever-expanding Whiteness.

But surely, if seeding Whiteness throughout the world is poor policy, so is ensconcing Whiteness behind a literal wall and disengaging from the larger world. American policy has apparently collapsed into a stark binary: we must conquer, or be conquered. Since our most recent conquests, in places like Iraq and Afghanistan, quickly turned into massive national embarrassments, our elected officials have shifted into a defensive posture and hunkered down.

I dare not advocate some return to a rose-tinted view of America’s past. We already know America’s past had some pretty awful components, and misplaced nostalgia is creating problems in our explosive present. But if we want America to have a future, and I believe some of our values are important enough to preserve, then we need a better mythology than the fortress. Because we can’t live long under constant attack.

Sunday, March 17, 2019

Our Lady Mary Magdalene of Ireland

1001 Movies To Watch Before Your Netflix Subscription Dies, Part 30
Peter Mullan (writer/director), The Magdalene Sisters

In 1964, three women from different parts of Ireland find themselves ripped from their lives and forced into a Magdalene laundry. Overseen by battalions of hard-faced nuns, the girls, none older than twenty, are forced to toil as penance for sexual sins most of them haven’t even committed yet. They struggle under the convent’s harsh rule, which aims to control not just their bodies but their souls. But they keep one eye on the outside, and plan for their eventual escapes.

Officially called “Magdalene Asylums,” the Magdalene laundries started out as places where prostitutes and other “fallen women” could rebuild their lives and achieve redemption. Many were established throughout the world, including in Britain and America; the Irish laundries, however, became an unmitigated horror show. When the nuns operating the laundries discovered they were making a profit, their original Christian mission went by the wayside.

Writer-director Peter Mullan focuses on three women among the dozens held captive at the laundry. Rose (Dorothy Duffy) is an unmarried mother, anathema in a Catholic country. Margaret (Anne-Marie Duff) was raped. And Bernadette (Nora-Jane Noone), an orphan raised in a church home, is simply too pretty, and too warm to boys’ attention; the nuns at her orphanage believe she’ll inevitably produce more orphaned children for the church to raise.

All three get shipped to the laundry by relatives or caretakers—it’s somewhat murky where the story takes place, though it was inspired by UN reports of abuses at a laundry based in Cork. Mother Superior of the facility, Sister Bridget (Geraldine McEwan), comes across as soft-spoken and amiable to her wards. However, it quickly becomes clear she relishes power, demonstrated by moments of casual sadism, and cares mostly about money.

The girls remain trapped in the laundry for four years, working ten-hour days, six days a week. As the only reliable laundry service for a large swath of Ireland, the facility is in constant demand. And we see it pays well: the nuns eat buttered toast and bacon for breakfast. The girls who do the actual work, however, eat oatmeal and water.

Worse, the girls are subject to constant abuse. Not only are they overworked by the nuns, and physically punished for insignificant infractions, but the priest, Father Fitzroy, who wants to reform the laundry, becomes corrupted by the culture and starts sexually abusing a developmentally disabled girl. The men who drive the delivery lorries, meanwhile, the girls’ only contact with the outside world, often trade favors for sex.

Mother Superior (Geraldine McEwan) leads a line of trapped workers (left to right: Anne-Marie Duff, Nora-Jane Noone, and Dorothy Duffy) in The Magdalene Sisters

Mullan focuses his drama on character, letting the larger history speak for itself. Unlike Neil Jordan, writer-director of Michael Collins, Mullan doesn’t lecture about history, or make Irish facts digestible for international audiences. He instead forces characters into an intolerable situation, and lets their actions speak for themselves. His heroines have two choices: either conform to a corrupt system, or break out by force.

The Magdalene nuns repeatedly promise the girls that, once they’ve achieved salvation, they’ll be permitted back into the world. As years drag on, however, and the girls find themselves unconsciously mimicking the power hierarchies that control them, we start to realize: not everyone will escape. They’ve internalized the nuns’ system of abuse. The convent has lost interest in salvation; as Bernadette once observes, they only care whether the work gets done.

The cozy relationship between Church and government during the early Irish Republic often corrupted both institutions. Police helped dirty priests cover their sins, while idealistic young clergy often tried to change the system from within, but found the system changing them. Ireland consistently proves a point I’ve long believed: individual Christians often create powerful good, but the Church, like any other institution, serves mainly to protect itself.

In 1993, long after the events depicted here, property developers working land formerly owned by a Dublin Magdalene laundry uncovered a mass grave containing 155 skeletons of unidentified girls. Ireland’s last Magdalene laundry, in Dublin, closed in 1996, amid the outcry generated by this discovery. Only after the laundries ended did anyone officially discuss their existence, or the church-state relationship that made their abuses possible.

Peter Mullan made this movie partly to raise awareness of the Magdalene abuses, which weren’t officially redressed until 2013. Within Ireland, this movie helped make these crimes visible, but international audiences should watch too. Only by staring directly at the history of religious intolerance and state corruption can mass populations, Christian or secular, ensure these crimes aren’t repeated. Because bigotry like this still exists in our world.

Monday, March 11, 2019

Two Modern Dogmas

Back when I worked at the factory, I remember a co-worker approaching me at breaktime, holding a bottle of 7-Up he’d just bought from a vending machine. My co-workers often treated me as smart and authoritative because I have a good head for memorizing facts, and I’d grown accustomed to answering whatever questions might arise during the day. This guy pointed to some words printed on the label. “It says here ‘All Natural,’” he said, smiling smugly. “That means it’s good for you, right?”

Part of me hated to let the poor guy down, but not enough to shut me up. “You gotta be careful,” I said, “that’s one of the ways they fool you. They depend on you to think that way. But there’s no legally or scientifically binding definition of the word ‘Natural.’ You have to do more research than that to keep healthy.”

I recalled that exchange, and my co-worker’s crestfallen face, during a more recent disagreement. My friend, whom I love and respect, nevertheless said something I completely disagree with: that America can never abandon high-tech farming, despite its lousy environmental impact and its grotesque overproduction of food that often gets landfilled, because if we do, we’ll run out of affordable food, and the poor will suffer.

My friend is a big believer in GMOs and their potential to produce healthier, more abundant food. This despite the fact that they haven’t done so, and most GMOs have proven to be more expensive, more bland-tasting, and generally more disappointing versions of existing food crops. In the unlikely event my friend reads this essay, I’m confident he’ll feel motivated to defend his existing positions by asserting I’m just an enemy of “science.”

So we have two conflicting attitudes, which I’ve encapsulated in two people I know personally, though I’ve seen both repeated by other people and in mass media. On the one hand, we have the belief that “natural” means accordance with human needs, and carries a general tinge of moral goodness. On the other, the belief that “science” is a humane progress through layers of understanding to the light of secular salvation.

Both attitudes are wrong.

The belief in nature’s goodness, as a sort of countercultural push against the heedless embrace of technology, is so completely mistaken that it has its own name: the Appeal to Nature Fallacy. It’s sometimes called the 100% Natural Fallacy, after a popular brand of breakfast cereal sold in the 1980s. From this fallacy, we get quack medicine, homeopathy, herbal “medicines,” and gullible people drinking their own pee.

The opposing belief, that what’s created in a laboratory is superior to dirty old nature, isn’t widespread enough to have its own name. Yet it demonstrates a remarkably similar willingness to trust a dogmatic interpretation of evidence: human ingenuity gets presented as innately morally good, and scientific advance becomes an end in its own right. It requires an equal willingness to trust an abstract conviction without question.

Charles Darwin
When my friend argues, and he has, that we shouldn’t worry about mechanized farming damaging soil fertility, because we can replace lost fertility with synthetic chemicals, he makes the exact same appeal as homeopathy: that whatever made us sick will also make us well. This may have made sense sixty years ago, when we had less empirical evidence and had to venture into new territory without a map.

But we don’t anymore. We have abundant evidence that peach pits don’t cure cancer, trace amounts of arsenic don’t reverse poisoning, and petroleum-derived fertilizers burn the soil, making future harvests less abundant. Blind trust in either nature or science has produced serious consequences, even cost lives. Experience tells us that calling something “natural” doesn’t make it healthful. And calling something “scientific” doesn’t make it good.

I don’t write this to accuse either my co-worker or my friend personally. Both men simply want a concise, intellectually coherent explanation for today’s difficult and often inconsistent circumstances. I frequently catch myself doing likewise. Unfortunately, modern complexity doesn’t permit such prefab consistency. Clinging to dogmas we never test against evidence serves the same impulse formerly served by religion.

Tragically, both “nature” and “science” make pretty poor gods. From tapeworms to opioids, both have a track record of turning viciously on their worshipers. It isn’t comforting to say we have to review the evidence constantly, especially in today’s environment, saturated as it is with information. Yet we have to. The era of comforting dogmas, which both men I’ve described are seeking, is long over.

Thursday, March 7, 2019

The Will of the People, and the Won't

Do we think these people represent the majority of Americans?

I remember, back in 1992, when Colorado passed Amendment 2, a state constitutional provision ensuring neither the state, nor any local government chartered by the state, could provide any specified legal protections based on sexual orientation. It was the first state law anywhere specifically targeting sexual orientation as a non-protected status, and it passed among Colorado voters by a straight majority, with a six-point margin.

However, it never took effect, because court challenges paused implementation. The amendment, which again had passed a statewide referendum with a simple majority, languished in legal limbo for nearly four years before the U.S. Supreme Court struck it down in 1996’s Romer v. Evans. At the time, I lived in Ogallala, Nebraska, fifteen minutes from the Colorado line. And I began hearing the common line: “Why can’t politicians just give people what they want?”

A friend of mine began answering that question with a response I’d never previously considered: “The people of Alabama under George Wallace wanted segregation by force. Should America have given it to them?” I doubt that argument changed anybody’s mind, because we know facts seldom persuade anybody whose positions are ironclad. However, it certainly ended the conversation, because people never had any counterclaim that didn’t make them sound ringingly bigoted.

I thought about these two clashing arguments when I read about a recent Kaiser Family Foundation survey, which found a straight majority of Americans approve of Medicare For All—at least hypothetically. That support goes up or down, conditionally, when surveyors begin explaining different ways politicians might implement the policy in real life. But at least in theory, Americans support extending Medicare provisions to everybody, regardless of ability to pay.

We could go on: a majority of Americans think abortion should be legal in most or all cases. Most Americans think we should protect the unemployed, the elderly, and the environment. Popular support for the Green New Deal runs above eighty percent, commanding a clear majority even of Republicans. Straight majorities support America’s longstanding weak-tea social platform, and depending on the survey, a majority even favors expanding it.

But majorities have historically approved some pretty awful stuff. Recent complaints about “gentrification” make the mass removal of Black and Brown communities from cities seem sudden and urgent, yet highly popular urban revitalization projects in postwar America, many supported by a majority of returning White veterans, so disproportionately hurt urban POC communities that James Baldwin famously called urban renewal “Negro removal.” That’s just one ready example.

Do we think these people represent the majority of Americans?

Reasonable people, studying history, would probably conclude that just because a majority wants something, doesn’t mean they should receive it. But categorically denying majority appeals defies the principles upon which liberty-minded republics, like America, were founded. If a sufficient proportion of the populace demands something, we should consider why they consider this goal desirable, even if we don’t necessarily acquiesce to their demands.

The great majority of Americans want clean water, clean air, and clean soil. They want citizens to have access to top-quality medical care, unconstrained by their private economic circumstances. They want work, and not just meaningless work as bean-counters or burger-flippers, but work which actively contributes to improving humanity’s condition on this fleeting globe. To my Distributist-minded philosophy, these sound like perfectly reasonable demands.

But what if people’s demands change? History demonstrates that very rapid changes in widespread political philosophy can reverse themselves equally rapidly. Only in the last fifteen years have majorities of American voters approved gay-rights initiatives by ballot; remember that California’s Proposition 8, now one of the most widely reviled pieces of legislation, received a straight majority in one of America’s most progressive-minded states. Much like Colorado’s Amendment 2.

If the people are free, it follows that the people are free to be wrong.

At this writing, an anti-LGBTQ discrimination bill has stalled in the Nebraska legislature. It’s difficult to gauge, in a state like Nebraska, how widespread any support might be for anti-discrimination measures; but multiple state agencies, private companies, the mayors of the state’s two largest cities, and several philanthropic groups have endorsed the measure. I find little evidence of organized opposition, yet the bill remains stalled.

The people wanting something, anything, doesn’t necessarily justify that thing. America’s constitutional checks specifically aim to thwart an unjust majority. But if enough people believe something, shouldn’t we at least seriously consider that thing? Well, no, because majorities can shift overnight. I wish I had the answers, or anyway more convincing evidence. But the longer I contemplate this question, the more adamant and insoluble the questions I uncover become.

Thursday, February 28, 2019

Tears For the Methodist Church

I’ve struggled in recent days to comprehend recent decisions in the United Methodist Church. I grew up a Methodist, and though I’ve attended mostly Lutheran churches for the last twenty years (with brief diversions into Episcopal and AME congregations), my entire family remains Methodist, and the denomination retains a place in my heart. Yet I cannot help recalling something I only learned in 2017: the entire denomination’s history stems from division.

Though Methodism’s history in America stretches back to pre-Revolutionary times, the United Methodist Church has only existed since 1968. It formed from a merger of the Methodist Church (USA) and the Evangelical United Brethren. The Methodist Church was primarily English in heritage, while the much smaller EUB was mainly German, but both were Wesleyan traditions. The reason they merged, however, was pretty horrific.

The Methodist Church came into being in 1939, from a merger of the Methodist Episcopal Church and its lost child, the Methodist Episcopal Church, South. These denominations had split in 1844 because the Southerners refused to renounce slavery, while the denomination overall had abolitionist teachings. John Wesley himself had published tracts on abolitionism, besides vocally advocating against oppressing Indians and other colonized peoples.

The two Methodist Episcopal traditions were almost identical in doctrine and teaching, and the Southern denomination was suffering massive losses of people and resources, so there was little to lose from the merger, while the Southerners in particular had plenty to gain. However, while slavery had ended in the interim, racial bigotry had not. The Southerners attached a stipulation to the merger: the reunified Church absolutely needed to maintain segregated congregations.

Among many tragedies in American history, one looms large: progressive Whites advocated the abolition of slavery, but once slavery ended, they lost interest in racial issues. As Dr. Ibram X. Kendi writes, the Republican Party was founded on abolitionist principles, but demonstrated no interest in civil rights issues after the passage of the Enforcement Act of 1870. The pattern was widespread. To their lasting disgrace, the Northern Methodists acquiesced.

As a digression, whenever I visit my second home of Lawrence, Kansas, I’ve lately attended the local African Methodist Episcopal church. This group, informally called the AME, split from the original Methodist Episcopal Church in 1816 because the original MEC refused to ordain Black ministers. This despite John Wesley’s stated opposition to slavery and racism; Wesley was a personal friend of leading British abolitionist MP William Wilberforce.


So to recap: in 1939, Southern Methodists wanted a Christianity stratified by race, and Northern Methodists didn’t care enough to fight. The old Methodist Church therefore had racist inclinations written into its foundation. But by 1968, Methodist leadership found this embarrassing, and wanted to revoke this stipulation. That’s where the Evangelical United Brethren come in: their tradition was doctrinally anti-racist. Many EUB leaders also served as Civil Rights volunteers.

It’s sad to think, but Methodist leaders understood it would be easier to create an entirely new tradition, and institute an entirely new charter, than amend the one they already had. The Methodists were numerous, and relatively secure; they didn’t need the EUB merger to reinforce themselves. The Methodists wanted the merger for only one reason: they knew that, without it, heel-dragging traditionalists would keep segregation written into their charter forever.

Which returns us to what happened in St. Louis this week. Given the opportunity to widen their organizational reach, Methodists instead opted to cross their arms and deepen their opposition to LGBT+ inclusion. Just as their previous leaders once predicted, changing existing doctrine is exceedingly difficult, even when the status quo apparently puts select populations outside the welcome of God’s love. How, I wonder, does this serve any Christian interest?

Throughout history, many individual Christians have done the world great good. Christians have taken the lead on abolitionism, Civil Rights, anti-colonialism, and other flashpoint issues. But whenever sufficiently large numbers of Christians come together that they need to create a hierarchy and doctrinal statement, that good becomes diminished. Christians are often leaders in overdue social changes. Churches, however, tend to impede change as often as enable it.

The UMC has struggled with this issue for decades. I still attended a Methodist church when the denomination defrocked Reverend Jimmy Creech for officiating covenant services (not, please note, “marriages”) for two same-sex couples. I was more conservative back then, admittedly. But even then, I struggled to explain why declaring certain expressions of love forbidden was any better than saying mixed-race groups couldn’t sit in the pews. Clearly, twenty years later, nothing has gotten better.