Wednesday, September 29, 2021

The Problem With Wizards

“Yer a wizard, Harry.” These words, snarled in Rubeus Hagrid’s West Country drawl, have achieved life beyond their franchised origins. They confirm for one stir-crazy Surrey pre-teen that his beige suburban life isn’t inevitable, that he could, simply by embracing it, fulfill his destiny. Like the X-Men’s or Ender Wiggin’s, Harry’s revelation echoes the desires of countless drab childhoods, acknowledging that their greatness merely hasn’t been recognized yet.

But it’s always made me uncomfortable. Hagrid doesn’t promise Harry, “Ye’ve got magic in ye, Harry”; wizard isn’t something he becomes. Rather, it’s a pre-emergent state, something Harry already is, without his consent or awareness. Like discovering you’ve always been White, or middle-class, the condition precedes his knowledge of it; it’s part of his essential being. “You’re already a wizard, Harry,” Hagrid basically says, “so decide how to use that fact.”

Author JK Rowling has latterly achieved internet notoriety for her inability to shut up about gender issues. Rowling believes one’s gender corresponds, one-to-one, with one’s genital anatomy, existing from birth, and any deviation is aberrant. Scholars call this position “gender essentialism”; I consider that title a misnomer, but that’s another discussion. What matters, for our purposes, is that Rowling believes gender precedes conscious awareness.

Many fantasy novelists use some concept of “the gift” or another euphemism for pre-emergent magical identity. Maybe this reflects Sir Thomas Malory, who depicted Merlin as a human-demon hybrid, or Tolkien, who presented wizards as ageless emissaries of God. Either way, wizards aren’t like us plebeians; they have a nature that makes magic available. No amount of practice or education will close this gulf; wizardry isn’t like golf, or playing the guitar.

This perhaps makes sense with Rowling. As a Scottish Presbyterian, Rowling has some familiarity with Calvinist doctrines of predestination: God already knows the roster of the saved. To the Calvinist, we indeed have pre-emergent natures. As this conundrum fermented in me, however, I realized I’d spotted this elsewhere: Rowling’s wizards, like George Lucas’ Jedi, can never emerge from pedestrian bloodlines like ours.

Lucas’ original Star Wars, latterly retitled A New Hope, presented Luke Skywalker as mastering the Force through self-discipline and mindfulness, but also through genetics. From the beginning, Obi-Wan Kenobi clearly considered Luke an apprentice, and wanted him to reconcile ancestral transgressions. Luke is one of us, but not really. This sense of inherited responsibility became only more glaring in Lucas’ prequel trilogy, with the introduction of midi-chlorians.

Calvinist predestination becomes especially pointed because Luke Skywalker, like Harry Potter, is a child of prophecy. Like Katniss Everdeen, Aragorn, or Neo, these promised Messiahs come at pre-ordained intervals to deliver humanity from thinly coded Nazis. Individually, these stories seem hopeful, leading benighted humans into liberty. Together, a pattern emerges: the Nazis were necessary to purge senescent society, and now the Messiah will restore the balance.

Who, then, are these predestined Messiahs saving humanity from? The tyrannical Empire, Ministry of Magic, Panem, or machines? Or are they saving us from the doddering human dominion that let the tyrants in? Are we fortunate enough to receive these Messiahs’ salvation exclusively because we were born after the necessary cleansing? And therefore, does the Messiah’s existence loom as a threat, that we, too, might get cleansed next?

Predestination might make sense in an environment with a literal God. As a Christian myself, I’ve always found Calvinist predestination uncomfortable, since it implies God has already cut some people loose; it’s a short leap from there to assuming God approves of discrimination. But in science fiction and fantasy environments, where transcendence is, at best, murky, the potential for misuse becomes glaring. As Rowling has tragically discovered.

Yet such abuse seems implicit if humans have a pre-emergent nature. If being a wizard, or mystic space samurai, or techno-Jesus, is something somebody just innately is, then it follows that it’s also something everyone else innately isn’t. Even if the muggles and non-Jedi aren’t explicitly inferior, that potential is implicit. Observe when the unwashed masses submit themselves to Jedi judgement, or wizards alter muggle memories to protect their secrets.

Like millions of audience members, I let Harry Potter and Luke Skywalker propel me along their narratives when the stories were new. As in real life, we don’t have time to ruminate on the moral implications during the action. Only later, in quiet recollection, do a story’s moral contradictions suddenly intrude upon our appreciation. But the fact that we needed years to recognize the contradictions doesn’t make them less real.

A world with real wizards is just too depressing to contemplate.

Friday, September 24, 2021

Christianity and Cusswords

Is it ever acceptable for Christians to use cuss words? The question recently arose on Christian Twitter, spreading like wildfire, as Twitter debates often do. By the time you read this, it will probably have already dissipated. But I've spent time thinking about this before, so I thought I'd take the opportunity to air my opinions. Because what else is the internet for?

Like you, I grew up hearing that "good people," implicitly coded as Christians, don't use certain words. To this day, George Carlin's seven famous words get bleeped on broadcast media, because their very sounds are considered so offensive that hearing them will cause sensitive hearers distress. Schools routinely punish children for saying these words, and they're a firing offense in many workplaces.

Like many children of Christian households, my adolescent rebellion included learning to use vulgarity. Working the words "shit" and "fuck" into my vocabulary became a goal in its own right. When I began, I sounded artificial, like a pretentious undergraduate forcing references to Plato and Derrida into lunchtime conversation. Eventually, though, the words sounded natural coming from my mouth. You decide whether that's much of an accomplishment.

Then, in 10th grade, I discovered a fact that would change my views. Which words we consider innately offensive isn't decided by their content. We don't object to people saying "shit" because feces are innately terrible. And we object to "fuck" not because sex is terrible, but because certain people describe sex that way.

The two English words used most commonly to describe coarse language, "vulgar" (from Latin) and "lewd" (from Anglo-Saxon), literally describe the common people. Even calling them "coarse" refers to the texture of their homespun clothing. The language and behavior classed as inappropriate by schoolmarms and church wardens across history are the language and behavior of us numpties.

In other words, these concepts are offensive, not in themselves, but because they're the actions of the common masses. One could continue in a similar vein: the term "pagan," in its roots, describes country people, who worship loudly and fervently, not like us quiet, dignified city aristocrats. The term "villain" originally described a farmhand bound to a villa, before it came to mean someone wicked. And so on.

Standards of polite discourse are designed to exclude ordinary people. Even to describe discourse as "civil" associates it with the civitas, the city where the king or magistrate holds court. Our standards of refined dress, cleanliness, and grammar are designed to ensure that nobody mistakes us for bumpkins from the provinces. They're designed to identify us as gentry.

Recent trends in Christian culture, especially Protestantism, have relaxed standards on swearing. As a subset of American Christians have become more accepting of marginalized groups, particularly LGBTQIA+, they have likewise become willing to stop policing language, both their own and others'. I think this is a positive development, and foretells needed changes in how Christians relate to the world.

I first noticed it with public Christians like Nadia Bolz-Weber. A recovering substance abuser herself, Bolz-Weber comes from a "vulgar" background, where cusswords are simply the vernacular. To best serve her congregation, she felt that she should speak their language, which is often four letters long. When she began public life, though, this opinion was still an outlier.

As social media, Twitter in particular, has become part of Christianity's mission field, plain-English discourse has become mandatory. I follow multiple clergy, seminarians, and public theologians who feel increasingly disinhibited about using the language of the "great unwashed" in public. Where many pastors I know once felt obligated to apologize if the occasional "fuck" slipped through, it's now not even that controversial.

Like anything, moderation and self-discipline remain important. Just as public Christians can enjoy the occasional beer, but should refrain from drunkenness, there's a difference between cussing sometimes and talking like a sailor on shore leave. But prissy censoriousness, once the signature trait of the socially disconnected clergy, is a dying form. And good riddance, too, as it only separates Christians from "the least of these" that we're called to serve.

In our economically polarized, deeply divided world, strict rules about language mostly serve to keep the poor excluded. Demanding that Christians never cuss is tantamount to saying you can't attend church unless you own a fitted suit: it creates a buy-in so high that the poor can't afford to join. We're called to be a church, not a country club, and shouldn't have a bouncer on the door.

Wednesday, September 22, 2021

Two Modern American Myths

Mounted border patrol agents break up a refugee camp. Photo by Paul Ratje for AFP

Two stories achieved critical levels of awareness in America this week. In Texas, Border Patrol agents on horseback, wearing Stetsons and chaps, broke up an encampment of Haitian refugees. Many Haitians had hiked from as far as Brazil, where they’d fled after the devastating 2010 earthquake. The agent in the most widely shown photograph waves something at a refugee; commentators argue whether it’s a whip, a strap, or a lariat.

Almost simultaneously, a massive dragnet in Wyoming’s Grand Teton National Park found human remains consistent with missing “van life” vlogger Gabby Petito. Her fiancĂ©, with whom she’d been traveling when she disappeared in August, has gone missing, apparently with his family’s assistance; circumstantial evidence and bad PR make it appear increasingly likely he’s responsible for her death. Millions of Americans follow the story with tightly held breath.

The jagged contrast between these stories speaks volumes about our identity as Americans—volumes that, sadly, don’t reflect well on us. The copious attention paid to a young, good-looking White woman’s disappearance contrasts poorly with the many disappearances of Native American women in the same area. As tempting as I find it to make flippantly cynical comments about looks and influence, I fear a darker narrative is in play here.

Narratives about defending women, especially White women, permeate American propaganda myths. “Wives and daughters back home” get used to justify atrocities by troops abroad, and by militarized police at home. Historian Kathleen Belew writes how White Power cells use language of marriage and motherhood to justify race-based violence against the state. Womanhood, implicitly White in the mythology, continues justifying America’s worst impulses in the world.

(I’ll briefly acknowledge that many people following Gabby Petito’s story are women, including women of color. As cultural critic Sady Doyle notes, women’s aggregate attraction to true-crime narratives may reflect their awareness of their precarious position in a violent, patriarchal society. This true-crime myth, and the “wives and daughters” myth, likely have wide overlaps. But I only have space to address one.)

Historically, the myth that White womanhood needs defended against an encroaching Black criminal class has justified egregious vigilante violence. Carolyn Bryant’s claim that teenager Emmett Till wolf-whistled at her justified her extended (male) family torturing and murdering Till, even though we now know Bryant lied. That’s just one highly visible instance where protecting White women weighed heavier than either Black men’s lives or American ideals of justice.

A still of “van life” vlogger Gabby Petito, taken from her video history

In fairness, I doubt strongly that anyone popularizing Gabby Petito’s story has consciously racist motivations. Nevertheless, the absence of matching coverage for the area’s many missing Native American women, and the massive outlay of resources and manpower to investigate Petito’s tragedy, follow the time-honored script. And events in Texas, while perhaps not directly motivated by Petito, still reflect how myths of White womanhood steer myths of White manhood.

The Border Patrol agents’ cowboy garb, and slave-catcher tactics, reflect a belief that some uniquely American essence is under attack. Maybe it is. The increasing popularity of “van life” culture, propelled by Internet charisma, suggests that important parts of the American Dream, including home ownership, stable employment, and parenthood, are dwindling. Whether the Dream is dying because of kids living frugally, or adults paying starvation wages, goes unexamined.

It’s always been easier to blame dark-skinned outsiders for unwanted change than to examine the economic and power dynamics that make people’s choices for them. Fear of refugees at the southern border joins past narratives of “uppity” Black men, “Indian outrages,” and the “Yellow Peril” to provide a ready-made narrative. As Yale historian Greg Grandin demonstrates, racist violence on America’s margins always increases after overseas wars, especially unsuccessful ones.

American youth increasingly don’t believe home ownership and participation in capitalism are desirable, or even possible. American military might can’t silence restive populations internationally. Human greed and environmental decay leave young people doubting they have a future at all. The generation that still holds power in America’s government and economic institutions believes their systems of control are failing, because in fairness, they are.

Gabby Petito’s death provides a neat, linear narrative that White womanhood is beleaguered, and not, the powerful rejoice, from above. Rather than address patriarchal attitudes that make violence less emasculating than a breakup, America has retreated into that other myth we foolishly believe, that shows of strength can bring the hammer of justice, in the best John Wayne style. The result is ugly all around.

America is showing the world its worst face. Worse, we’re showing that our mythology doesn’t work anymore.

Monday, September 20, 2021

Black Afterlives Matter

Cadwell Turnbull, No Gods, No Monsters: a Novel

It begins, as so many stories do today, with a leaked police video. A police body cam catches a Boston PD officer putting two bullets in a suspect. The twist is, the suspect this time is a lycanthrope. Though City Hall tries to bury it, evidence soon gets out, and the whole world knows. We have proof that the monsters and demons of folklore exist in the world of science and technology.

Cadwell Turnbull's second novel exists on multiple levels simultaneously. We can browse his text for parallels with our modern world, and for the ways it speaks directly to us. It's easy, and tempting, to point out resemblances to BLM, QAnon, late capitalism, and other looming contemporary issues. Turnbull's monsters are metaphors for violent, segregated postmodern America.

Such rigid analysis, though, loses the nuance of Turnbull's prose poetry. Turnbull tells multiple overlapping stories of ordinary, civic-minded people trying to pay their bills and improve their communities. These individual stories, however, unfold against the background of an American society where the monsters of myth and fable, the phantoms of campfire tales and bedtime stories, have been forced back into daylight.

In and around Boston, a group of "smash the system" anarchists find their loyalties torn. Their commitment to peaceful resistance has always included BIPOC, LGBTQIA+, and other marginalized groups. But do werewolves and techno-mages count as marginalized? Ideals that seemed unambiguous in theory, turn vague when tested, because seriously, do vampires and shapeshifters need civil rights?

In the US Virgin Islands, a territorial senator with high ideals has concealed that she's a weredog, and her sister walks through walls. But a secret society brings her information that could bring closure to her childhood trauma. Should she risk her career, and the trust of her constituents, to solve the biggest mystery of her life? And where does she turn when her personal trauma proves to be part of a secret civil war threatening humanity?

Meanwhile, the story's "Third Person Omniscient" narrative voice struggles to reconcile the seemingly divergent secrets and lies driving the characters. How, the voice wonders, can I speed these poor, blind people toward their resolutions? Soon the voice peels off and begins cross-examining the characters in their moments of vulnerability. We realize that the story has become self-aware.

Cadwell Turnbull

Does this sound like a lot of threads? Turnbull doesn't deny it. With his shifting perspective, his cast of thousands, and his multiple short stories converging on one destination, he has created the Winesburg, Ohio of contemporary dark fantasy. He uses horror tropes and political hot buttons to tell a story of literary depth and subtlety. The product dwells in the liminal space between genre fiction and literature.

Don't let that intimidate you, though. Turnbull writes with a casual voice and brisk pace that never lags (except maybe once, for two pages around page 185). Even when discussing revisionist economics or the many-worlds theory of quantum physics, he doesn't forget that writing is about the relationship with his audience, and he keeps us engaged with characters and story.

This hybrid form works in service of a plot with a message. Turnbull's characters, human and monster, draw from America's marginalized population: Black and Latinx, queer, traumatized, colonized. People whom the majority culture has kept at arm's length throughout living memory. People the majority culture has dismissed as criminals, ingrates, and monsters.

What, Turnbull wants us to ask, happens when America's monsters become visible? When we can no longer pretend we don't see them? Like Slender Man, these monsters have always been there, and once they force their way into our perception, we see the footprint they've left on history. Once we've seen them, we can no longer forget them… until, Turnbull suggests, we start to forget them anyway.

In the final pages, Turnbull demonstrates the lengths to which some will go to ensure our collective forgetting. This opens new and darker doors, forcing a conflict that didn't have to happen. But, as with so many issues today, the powerful minority's refusal to face facts means someone has to take action. We ordinary people are forced along.

It isn't necessarily obvious at first that this is the beginning of a series. Though the back of the dust jacket says "Book One of the Convergence Saga," it's printed small enough to miss. I'm eager to see how Turnbull handles the story threads he's introduced in this volume. Maybe that's the highest praise I can offer: to say I'm ready to see where things go from here.

Wednesday, September 15, 2021

Some Thoughts On “Do Your Research”

Cast your memory back to the “research” papers you wrote in high school and college. The hours spent in libraries, the days spent reading sources. If your education was anything like mine, you spent precious little time with primary sources and laboratory-grade investigation until you were in graduate school, assuming you went that far. “Research,” for most of us, involved compiling citations from acknowledged experts to create a consensus.

Watching the debate unwind over vaccine “research,” I’m noticing two opinions arise. Those reluctant to trust new and innovative mRNA vaccines have scolded: “Do your research!” By this, they mean searching publicly available sources for evidence which confirms or contradicts the widely accepted narrative. The “research” they prescribe resembles high school research papers, which often consisted of compiling lists of expert testimony to support a chosen thesis.

This version of research, of course, suffers because it’s vulnerable to confirmation bias, the tendency to only accept evidence which supports what one already believes. The wide availability of testimony, and low barriers to “expert” status, mean the Internet and cable news are awash in putative evidence for every position. One needs only modest patience and a little “Google-fu” to support any position, no matter how pseudoscientific or outlandish.

Contra this attitude, I’ve witnessed vaccine True Believers insisting that, unless you’ve established a laboratory, conducted double-blind studies, run statistical regressions, and published in a peer-reviewed journal, you haven’t really done “research.” To these people, only original research, conducted in rigorous laboratory conditions, counts for debate. Everyone else should defer to the experts in their respective fields, who have actually studied the topic.

This may alienate fellow Leftists, but I find this attitude dangerous. Beyond the elitist attitude that only those who have research grants and official standing should “do science,” this mirrors other important American trends. Consider legislation written by lawyers, so arcane that only other lawyers can comprehend it. Chances are, your car is currently in violation of some road laws, and you’ll never know unless the police stop you.

I appreciate my fellow Leftists’ desire to assert that one evening browsing Google isn’t serious research, that too many “skeptics” merely seek confirmation for whatever fish-eyed opinion they already have. But the opposite of lousy research isn’t relinquishing everything to credentialed experts and demanding general obedience. We don’t rescue science from the numpties and full-time professional ignoramuses, by decreeing only the chosen few voices really count.

Consider the parallels. We’ve witnessed the damage caused by legislators making laws without consulting the electorate, artists creating art without thought for the audience, theologians making religious pronouncements that alienate hurting souls, and engineers building appliances we mostly don’t want and can’t use. This has given us a blighted atmosphere, bureaucratic intransigence, intractable spiritual malaise, and long, joyless trips to unpleasant theatres and art museums.

Don’t misunderstand me. I appreciate expertise, in its place: if I have a medical emergency, I definitely want a doctor, not some rando off the street. But I’ve also had painful and disheartening interactions with doctors who make snap decisions, disregard their patients, and prescribe treatments for illnesses the patient doesn’t actually have. I’ve watched conditions go untreated for years, because experts considered themselves immune to consultation.

Even serious researchers don’t think only new research counts. Universities and laboratories conduct metastudies frequently, because only taken collectively do scientific studies yield insights. As Thomas Kuhn has written, science is the process of discovering new information, but it’s also the process of putting that information in a narrative context. Science is about fact-checking, certainly, but it’s also about story-telling.

“Skeptics” Googling information are only seeking the story that unifies the evidence around them. I agree that they generally accept thin evidence and don’t scrutinize their sources. Maybe, when our middle-school teachers assign “research papers,” they need to dedicate a unit to evaluating sources, rather than deferring that until graduate school, as happened to me. But barring that, Google researchers just want the narrative that ties everything together.

Certainly, we need better discussions about what constitutes “evidence,” who really is an “expert,” and why “science” sometimes changes its mind. Skeptics make bank harping on minor inconsistencies, or on scientists reversing course after new evidence, which betrays a misunderstanding of how the process works. But the solution isn’t telling unbelievers to shut up and obey state-sanctioned authority. This shouldn’t be a choice between ignorance and subservience.

People do their “research” because they want to understand. That deserves respect, not mockery, and they deserve a better class of experts.

Saturday, September 11, 2021

Bright Midnight in the Fake Japan

Mary Elizabeth Winstead and Miku Martineau in Netflix’s Kate

Kate (no last name) stalks Tokyo’s midnight streets, enforcing terminal contracts on behalf of… someone, it’s never made particularly clear. Despite her gaijin status, she’s become one of Japan’s top contract killers, available on a moment’s notice. Until, that is, someone slips her a lethal dose of radioactive poison during her latest caper. With mere hours to live, Kate has to find her killer and exact her revenge.

What exactly about Japan makes filmmakers believe round-eyes develop superpowers? This isn’t the first movie I’ve watched where the creative team assumes a White character can wander into a world of paper houses and Armani-clad assassins and begin moving fast enough to dodge bullets. The White Euro-American fetish for Japan as a land of comic-book exaggeration worked when it only happened occasionally, but now it’s become clichĂ©, bordering on racism.

Netflix’s Kate is merely the latest Western movie I’ve watched that depicts Japan generally, and Tokyo specifically, as a manifestation of anime excess. Like Kill Bill and The Wolverine before it, Kate’s Tokyo teems with bright colors suffused against a background of steel-framed technocratic excess; in several scenes, anime is literally projected onto the surfaces of gleaming skyscrapers. This, of course, when Kate isn’t barging into tatami-mat paper houses.

We’ve seen movies like this before. The tall, unflappable protagonist strides, god-like, through a world of highly choreographed violence, and somehow never gets hurt badly enough to stop her. In substance, Kate is neither revolutionary nor controversial. It’s just another Western attempt to recreate the magic of John Woo “gun-fu” thrillers like Hard Boiled and A Better Tomorrow. It’s silly and grotesque, but not particularly dangerous.

The parts that frustrate me appear in the background. Kate’s Tokyo is always night, and frequently rainy; the movie’s only visible daylight occurs in the prologue scene, in Osaka. (Even then it’s overcast, and the wet pavement suggests recent drizzle.) The blackened midnight gloom is anything but dark, however, as oversaturated neon colors occur everywhere, from the backlit advertisements littering every street, to the kids’ brightly painted hair and clothes.

Hugh Jackman in James Mangold’s The Wolverine

I’m reminded of James Mangold’s The Wolverine. Though Mangold permits Japan more daylight, multiple important scenes occur in color-soaked midnight. Mangold repeatedly frames scenes so traditional rice-paper houses and cherry-blossom landscapes exist in the foreground, against a skyline of glass-and-steel skyscrapers. Even Nagasaki, for Mangold, becomes a moment of transcendent glory. Mangold’s Japan, like Kate’s, is a deliberate mix of Orientalist exoticism and excessive modernity, the hungry Japanophile’s dream landscape.

And, like Kate, Mangold’s Logan engages in battles that only make sense if we pretend we aren’t aware of flying rigs and fight choreographers. The characters feel compelled to engage in hand-to-hand combat, or even ritualized swordfights, despite everyone carrying fully automatic guns in armpit holsters. Throughout these battles, the White hero never gets hurt, not badly enough to stop fighting anyway, while sharp-suited Yakuza extras die like flies.

At least Mangold’s White hero has literal superpowers: a healing factor, metal skeleton, and retractable claws. Logan’s inability to suffer real pain makes sense, within the character’s X-Men context. Kate somehow suffers advanced radiation poisoning, multiple bullet wounds, fall injuries, and plain old exhaustion, yet nevertheless keeps killing anyone who challenges her. Because of course she does, she’s a White gunslinger in Japan.

Uma Thurman in Quentin Tarantino’s Kill Bill

I’m reminded of Quentin Tarantino’s Kill Bill, in which the nameless antihero apparently gains killing power by purchasing a katana. Throughout the movies, Uma Thurman’s “The Bride” character slaughters every challenger, apparently because her sword gives her superpowers. Historically, most katanas were mass-produced munitions blades, workaday weapons for common soldiers, despite Western myths of mystical craftsmanship. But give a katana to a White woman, and she apparently becomes Death incarnate.

These movies share the mythological backstory of ancient Bushido traditions kept alive amid technocratic modernism, an oversaturation of colors, and a warrior ethos. Japan, for action filmmakers, isn’t a place; it’s an ethical situation into which they ship White characters. Like Neverland or Narnia, Japan becomes a place where laws of physics are suspended and death is paused, so White people can test their mettle and emerge renewed.

Ultimately, these stories, with their White protagonists and dreamlike settings, aren’t really Japanese. For too many White filmmakers, Japan isn’t a place where people live and work and aspire and die; it’s a color-soaked fairyland. It becomes a recipient of Western ideals of magic and transcendence, stripped of anything authentically Japanese. It becomes a cartoon, in the worst sense. Maybe it’s time for Westerners to give Japan back.

Thursday, September 9, 2021

What, and Who, Is Art For?

Banksy, Snow, 2018 (source)

Someone recently hit me with a shopworn Banksy quote: “Art should comfort the disturbed and disturb the comfortable.” The anonymous British graffiti artist, whose success is as much a triumph of public relations as artistry, rewrites an axiom beloved by creative professionals, satirists, clergy, and politicians worldwide: comfort the afflicted and afflict the comfortable. This chiasmus has precedents in the Bible, where Mary says:

He has brought down rulers from their thrones
    but has lifted up the humble.
He has filled the hungry with good things
    but has sent the rich away empty.

Sounds great, certainly. But one wonders what exactly this means, coming from Banksy. Earlier this year, one of Banksy’s canvases sold for $20 million, a new personal best, and a price range beyond anything us pedestrians could afford. Though Banksy became famous for semi-illicit guerilla work in outdoor spaces, the artist’s ability to continue making public art is subsidized by producing portable canvases which only the insanely wealthy can afford.

I began actor training in the early 2000s, when one couldn’t go thirty feet in any American theatre department without hearing somebody loudly singing excerpts from Jonathan Larson’s Rent. That play’s exhortations against post-Reagan malaise and the racism and casual homophobia of the late Twentieth Century took on an explicitly rebellious edge. The play implied all of Manhattan’s bohemian Alphabet City would rise against stultifying conformity and change the world.

Twenty years later, my acting career sputtered following some poor choices; Alphabet City has gentrified; and though casual homophobia isn’t instantiated in law anymore, the revolution never actually came. As YouTube critic Lindsay Ellis has explained, theatre often embraces the rhetoric of insurrection. But it absolutely requires the financial backing of corporate donors and rich patrons, because the soaring overhead means theatre bleeds money most of the time.

Artists, including me, consider ourselves incipient revolutionaries. We have messianic delusions that, like Mary’s Magnificat, we’ll overthrow rulers and raise the proletariat. (Mary sounds almost Marxist.) Yet art regularly loses money, and requires someone else’s generosity to cover the bills. Maybe it’s slightly better in Britain, where public subsidies mean the working class can afford theatre, orchestra, and opera tickets, but even that makes art beholden to the state.

Since my acting career has translated strictly into community theatre, I’ve discovered how risk-averse management frequently is. Not only will boards avoid anything with raunchy themes or controversy, which is perhaps understandable, but they’ll also avoid anything too new. I’ve witnessed decision-makers moot the idea of producing material written locally, but it always dies quietly, as only something road-tested on Broadway will likely pull audience numbers sufficient to entice sponsors.

Jackson Pollock, Blue Poles, 1952 (source)

This means our company will never do anything likely to challenge our community’s religious, political, and economic suppositions. I don’t even mean Augusto Boal’s sometimes self-righteous precepts that theatre shows society to itself, exposing our worst sins and social rot. Sam Wasson writes that Paul Sills and Mike Nichols invented Chicago improv to wrest theatre from elites, yet the companies soon became dependent on high-demand ticket prices.

It’s tempting to say these points describe in-person art, which always suffers from high overhead. What, one wonders, about recorded music and movies, which can amortize costs across much larger audiences? Even that doesn’t bear scrutiny, as most recording artists lose money and depend on concerts and personal appearances to get paid. And with Disney’s acquisition of Fox, Lucasfilm, ABC, and Marvel, one company has a stranglehold on Hollywood.

Fundamentally, art needs the system it rebels against to remain solvent. Not even profitable, just above water. Artists individually may believe we’re doing God’s work, but in the aggregate, like politicians, we kowtow to whoever carries the checkbook. Great art, like a pearl, originates in friction and suffering, yet we wind up defending the system we abhor, because sooner or later, we get hungry. We can’t help becoming complicit.

These complaints aren’t unique to art. I’ve heard parish pastors voice the same struggle: they ultimately can’t drive the moneychangers from their temple, because somebody has to keep the lights on. Politicians campaign against the very fundraising practices that subsidize their eternal reelection bids. I only focus on art here because, as I struggle to find buyers for the manuscripts I’ve written, I still need rent and groceries.

But I’ve surrounded myself with art, and its baggage, too long to read quotes like Banksy’s without flinching. That statement seems directed at me. But, like Banksy, I still need someone to buy my art.

Sunday, September 5, 2021

Why Men In Skirts Are Dangerous, Part 2

This essay follows Why Men In Skirts Are Dangerous
Me in a kilt in my 30s

After another lively weekend spent badly faking a Highland Fling at the Kansas City Irish Fest, I find myself crawling reluctantly back to daily life. Having missed the 2020 festival owing to the pestilence, I loved the return to live music, cultural exploration, and being around people who, like me, enjoyed being entirely in the moment. And I took the opportunity to do something I seldom do anymore: I wore my kilt out in public.

Two years ago, last time I had this opportunity, I wrote about my entirely-male co-workers, many of whom expressed astonishment that I actually wear kilts. “I could never do that,” one said, sending me on a spiral of contemplation about what might make a man express such aversion. I decided the opinion must express sublimated class-based anxiety about the consequences of rocking the boat. These conclusions seemed obvious and inescapable, from my personal vantage point.

That’s the problem with anecdotal analysis, however: it reflects my vantage. My friend Jerry, who works a white-collar job, expressed a counter-opinion: “I think, for me, it's more of a case of it would make me feel physically vulnerable, not having my loins girded?...I like having those dangly bits more protected.” I love such reciprocal give-and-take, because it forces me to refine my thinking. Especially when my comments have larger implications, they require testing.

Jerry makes a valid point: men, in most English-speaking cultures, grow accustomed early to having our genitals covered by fabric. This practice has practical implications, especially in manual trades: the thought of operating a circular saw or pneumatic hammer without my junk swaddled in cloth makes me reflexively cringe. We probably had similar reactions in humanity’s primordial environment: imagine fleeing a sabre-toothed tiger across the savannah with your gonads flapping.

Okay, now unclench your thighs.

Counterpoint, though: anybody who’s raised children, even in a helping capacity, knows the difficulty of convincing toddlers to wear trousers. Small kids often need coaxed, cajoled, even threatened to not strip their britches off. Clothing isn’t naturally comfortable; humans need conditioned, at early ages, to assent to having our bodies packed in cloth. Anybody who has worked outdoors in high summer knows the sensual joy of stripping your jeans off and letting your legs breathe.

Not everybody outgrows this feeling. In recent years, we’ve witnessed the phenomenon of teenagers, mostly boys, refusing to wear long-legged trousers, even in winter. As rigid social standards surrounding clothes have relaxed, some youth have rejected the imperative to cover their legs. The most commonly cited reason is that having their knees and calves encased feels unnatural and intrusive. Simply, they refuse the conditioning necessary to see pants as mandatory. For good or for ill.

When I purchased my first kilt, I assumed it’d be my last. I intended to wear it in certain limited circumstances, mostly in oppressive Great Plains summer conditions. I started wearing it to university, basically because it got a reaction from certain squares who needed a good jolt. But I kept wearing it, and buying more until I had four kilts, because it was exceedingly comfy. As noted before, I became known as “Kilt Guy.”

Me in a kilt in my 40s

Then, I began teaching. I wore a kilt once, and realized it didn’t work: I needed my students paying attention to my lessons, not my clothes. I stopped wearing kilts on teaching days. Then I left teaching for the manual trades, where having my legs fully covered was important, and loose-fitting fabric that could get caught in the equipment was a critical safety hazard. First, I got accustomed to wearing kilts. Then I got unaccustomed.

At Irish Fest this weekend, I wore my kilts for the first time in two years. As the opportunities to wear kilts and feel comfortable in them have diminished, I’ve grown uncomfortable in the clothing choice I once embraced. Even before the plague forced everyone into isolation, I became uncomfortable wearing kilts to the park or the pub, places I once most enjoyed wearing them in my thirties. I’ve become a conformist, pants-only stiff. Again.

Therefore I suggest that both my previous semi-Marxist reading of the jobsite, and Jerry’s explanation of protecting his goolies, are canards. My co-workers’ defense of britches arises because britches are workplace safety gear. They see pants as manful because they see themselves as manful, and they wear pants. My definition of manhood doesn’t require pants; but my definition of comfort has evolved to demand them. Then, the Irish Fest made me comfortable again in a kilt.

Every explanation I’ve encountered is a post-hoc construction. Humans need conditioned from toddlerhood to wear clothing. We wear certain kinds of clothing because we’re conditioned according to culture, economics, and climate. And our conditioning changes according to how society rewards and punishes us. Our explanations aren’t wrong, necessarily; they’re just retrospective. All explanations arise from needing to defend an answer we’ve already reached.

Therefore I stand by what I wrote before. Except when I don’t.