Tuesday, December 31, 2019

The Pleasure and Pain of 1969

1001 Albums To Hear Before Your iPod's Battery Dies, Part 15
Fairport Convention, Unhalfbricking and Liege & Lief

In the mid-1960s, American musicians like Bob Dylan and Roger McGuinn had a pioneering idea: why not play traditional folk singer-songwriter tunes with a pronounced rock backbeat? This made American folk-rockers into international superstars, but for some reason, British musicians hesitated to follow suit. Fairport Convention, founded in 1967 and named for the house where they first rehearsed, jumped into the resulting void. But in 1969, their sound suddenly changed.

At the beginning of the year, they basically played Americana. Their Greenwich Village sound attracted a small but loyal audience to London folk clubs, but their albums and 45s didn’t chart. New female vocalist Sandy Denny, already semi-legendary on the British folk circuit, sang in Judy Collins’ vocal range. However, she wanted to record music more distinctly British than American. And the band, somewhat adrift, was (mostly) willing to accommodate her.

Fairport Convention cranked out a remarkable three full LPs in 1969. Their first with Denny, What We Did On Our Holidays, is mostly of historical interest. But with their second, Unhalfbricking, they took British music by storm. It’s somewhat silly to call their sound “mature,” as most band members were only nineteen and twenty years old; bassist Ashley Hutchings was downright elderly, at twenty-four. Yet they sported a weathered, old-soul sound.

This album includes three Bob Dylan compositions, drawn from his stockpile of unreleased demos. Dylan overtly influenced Fairport, who included multiple Dylan covers in their live sets, and Dylan reciprocated, unlocking his demo vault and allowing them to record three songs that hadn’t had mainstream releases yet: “Percy’s Song,” “Million Dollar Bash,” and “If You Gotta Go, Go Now” (recorded in French, as an in-joke).

Unhalfbricking opens with guitarist Richard Thompson’s first songwriting credit, “Genesis Hall.” Now a living legend among folkies, Thompson, twenty years old, was just coming into his own as both a songwriter and multi-instrumentalist; on this track, he plays dulcimer, then as now an unusual instrument in mainstream music. Side A then swings straight into a Dylan song, Sandy Denny’s “Autopsy,” and the traditional “A Sailor’s Life,” and stays deeply melancholy throughout.

Side B includes Denny’s “Who Knows Where the Time Goes,” which she’d already recorded both as a solo singer-songwriter demo and with the bluegrass-rooted Strawbs. It’s now a recognized classic, and this is probably her most famous version. Another Thompson composition and two Dylans finish the set. This album straddles the divide between American folk-rock, which was already established, and British folk-rock, which they were just creating.

Before the album dropped, however, Fairport’s tour van suffered a catastrophic road accident, killing drummer Martin Lamble, aged only nineteen, and injuring nearly everyone else aboard. The band went into a months-long hiatus; before its release, Unhalfbricking already memorialized a band which no longer existed. The late-1960s milieu, however, demanded immediate action. Rather than breaking up, Fairport reconvened with Liege & Lief.

Fairport Convention in late 1969

Partway through recording Unhalfbricking, Fairport’s most staunchly Americophile member, Lincolnshire-born Ian Matthews, quit to record Americanized folk-rock. (His covers of Steve Young and Jackson Browne are legendary.) Liege & Lief thus reflects Sandy Denny’s aggressively British inspirations. Unlike Unhalfbricking, which includes only one “Trad. Arr.” song, this album has five, with only three new compositions. Traditional stemwinders like “The Deserter” and “Tam Lin” form this album’s backbone.

However, the track which most thoroughly reflects Liege & Lief is the traditional “Matty Groves,” a Northern English murder ballad from the 1600s. Anybody who says metal is the darkest music has clearly never heard a British murder ballad. This song’s grim, fatalistic lyric, in which pleasure always contains karmic payback, suggests a band blaming itself for Martin Lamble’s death. Though played in a traditional style, the song ends with a long, self-flagellating hard-rock coda.

Liege & Lief is much darker and more pessimistic than Unhalfbricking. Songs like “The Deserter,” in which the title character receives a pardon on condition that he return to service, suggest that every respite carries its resulting doom. But it’s also a musically ambitious album. Fiddler Dave Swarbrick, a guest contributor on Unhalfbricking, becomes a full member, and his love of minor keys gives every track an epoch-making, almost orchestral depth.

After recording Liege & Lief, Sandy Denny quit, and the band never reclaimed this level of success. Their best songwriters eventually drifted away, and a vestigial group now tours the nostalgia circuit. But these two albums, which bookend a year that began with unprecedented optimism and ended in bleak resignation, are a memorial to a classic band, and to a year like no other.

Saturday, December 28, 2019

Tic-Tac-WHOA!

Blaise Müller (game designer), Quarto

Have you ever wanted to recapture the speed, simplicity, and ease of Tic-Tac-Toe, without the annoyance of knowing from the get-go who’s going to win? Swiss mathematician Blaise Müller apparently shared this desire, because he invented Quarto, a similarly themed grid game which shares the goal of creating a straight line of symbols. Then he added one complication: your opponent picks your symbol.

Müller gives you sixteen pieces, and a playing board with sixteen round “squares.” The pieces divide along four binary qualities: pale or dark finish, short or tall, round or square, dimpled or not-dimpled on top. Your goal is simply to create a line of four pieces across the board, where all four pieces have some quality in common. But you don’t get to pick which piece you set from move to move, oh no. Your opponent hands you your pieces.
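
For the programmers in the audience, that structure is satisfyingly tidy: four binary qualities means every piece maps onto a distinct 4-bit number, and “four in a row sharing a quality” becomes a bitwise test. Here’s a minimal Python sketch of that encoding; it’s my own illustration of the win condition, not anything from Gigamic’s rulebook:

```python
# A sketch of Quarto's win condition, assuming each piece is encoded as a
# 4-bit integer: one bit each for tall/short, dark/pale, round/square,
# dimpled/flat. Sixteen combinations = sixteen unique pieces.

def line_wins(line):
    """True if all four pieces in `line` share at least one quality.

    A quality is shared when its bit is set in every piece (the AND of
    the pieces is nonzero) or clear in every piece (the AND of the
    complements, masked to 4 bits, is nonzero).
    """
    all_set = line[0] & line[1] & line[2] & line[3]
    all_clear = ~line[0] & ~line[1] & ~line[2] & ~line[3] & 0b1111
    return bool(all_set or all_clear)

# Four pieces that are all "tall" (high bit set) win, however much
# they differ otherwise:
print(line_wins([0b1000, 0b1010, 0b1100, 0b1111]))  # True
# Four pieces with no quality in common don't:
print(line_wins([0b0001, 0b1110, 0b0110, 0b1001]))  # False
```

Sixteen pieces, one bitwise AND; the strategy, of course, lies entirely in deciding which of those sixteen numbers to hand your opponent.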

I’ve recently become a fan of games like Onitama and Pylos, with simple boards and few rules, which nevertheless admit multiple forms of strategy. Quarto, manufactured by the French company Gigamic, which also manufactures Pylos, has rules filling less than one page of the included instruction booklet. Yet once you start playing, you realize the game admits hundreds of possible lines of play. Minutes to learn, they say; years to master.


You need to think strategically, just like in Chess or Connect Four, not only in placing your pieces, but in how you block your opponent in placing theirs. But you get the added complication that you relinquish control, allowing your opponent to make some level of decision about the choices available to you. Think of it as a metaphor for the capitalist economy: you both are, and are not, in control of your decisions.

Most games run under fifteen minutes. Like in Chess, players often make early moves hastily, almost arbitrarily, then slow down as the game’s middle forces deliberative strategy. Then at the end, you find yourself moving lickety-split again, as the conclusion becomes inevitable. And when it’s over, you’re ready to play again, because you’re awash in the heady excitement of new discovery.

Or is that just me?

Friday, December 27, 2019

The Existentialist Jesus

1001 Movies To Watch Before Your Netflix Subscription Dies, Part 34
Martin Scorsese (director), The Last Temptation of Christ


In northern Galilee lives a carpenter who makes crosses for the occupying Roman forces. The man Jesus has dark dreams of making all things new, but these dreams all end in his violent death. Jewish rebels assign Jesus’ childhood friend, Judas, to kill the man they consider a race traitor, but as Judas rediscovers his friend’s torments, he begins to suspect something nobody expected: Jesus might be the long-expected Messiah.

Martin Scorsese was no stranger to controversy; classics like Taxi Driver and Raging Bull demonstrated his willingness to push limits and defy critics. But this, his eleventh directorial outing, really angered people, who proved more than willing to voice their feelings. His deviation, not only from the Gospels, but also from Nikos Kazantzakis’ existentialist novel, enraged believers and doubters alike. Yet audiences willing to overlook the controversy will find something deeper waiting herein.

For Jesus (Willem Dafoe), the call of God isn’t sweetness and light. He knows representing God will place him between Romans and Jews, never able to live peacefully. He makes his living as a Roman collaborationist, but isn’t particularly sympathetic to the occupiers, though he also has no commerce with the rebels. Life in occupied Galilee is permanent suffering, especially for a tradesman who simply wants to live quietly.

Judas Iscariot (Harvey Keitel) pulls Jesus from one extreme. His demands for violent rebellion against the occupiers have a certain appeal. The Jewish people are crushed, silenced by their status as lesser people, a status Judas doesn’t accept. He isn’t particularly eloquent, but his actions speak volumes. When he interprets Jesus’ dreams to mean Jesus might be the Anointed One of prophecy, he thinks maybe his rebellion isn’t in vain.

From the other extreme, Jesus feels drawn to Mary Magdalene (Barbara Hershey). Like Judas, Jesus has known Mary all his life. Now working as a prostitute, Mary fucks Jew and Roman indiscriminately; her assimilation buys wealth unparalleled in rural Galilee, and a level of personal solitude Jesus envies. Mary attempts to seduce Jesus into the peaceful life of an acculturated sell-out, but Jesus finds this as dissatisfying as Judas’ rebellion.

Inspired by the half-naked visionary John the Baptist, Jesus flees into the wilderness to confront God directly. Instead, he meets Satan, who—but let’s pause there. Here’s where conservative Christians’ objections to this movie seriously begin. Scorsese, like Kazantzakis before him, presents a list of temptations differing from the Gospels. Satan no longer offers Jesus mere nourishment, glory, and power. Because Scorsese acknowledges that this isn’t really Jesus.

Willem Dafoe as Jesus of Nazareth, and Barbara Hershey as Mary Magdalene

Nikos Kazantzakis had a massively complicated relationship with Christianity. He denied Jesus’ consubstantiality with God, and believed humans could have the same Messianic relationship Jesus did. But he also resisted the Greek Orthodox patriarchate’s attempts to excommunicate him, wrote books on prayer and pilgrimage, and ordered a religious funeral. He simply didn’t believe the Christological message could be preserved in amber by embodying it in only one person.

Scorsese carries this hybrid religious existentialism into cinema. This Jesus isn’t Jesus Christ, the historical figure who lived, died, and was putatively resurrected in First-Century Palestine. This Jesus is the Christological impulse in all human spirits, the desire to become closer to God’s will, coupled with the dread of what happens when we achieve this desire. Because Scorsese’s Jesus knows, like all theologians, that God’s call leads to this-worldly death.

Returning from the wilderness, Jesus assumes his missionary role, carrying to the people a message of God’s unconditional acceptance. But he manages to alienate both Roman occupiers and Jewish nationalists, because in Jesus’ telling, this world has no comfort for you. You must abandon the illusion that you’ll find peace and stability in the flesh. This message sways, but never quite converts, both Judas and Mary.

But as Jesus confronts the violent ends which this world reserves for all non-conformists, he discovers he never left Satan in the wilderness. Sure, he resisted the three temptations put before him, and began his mission. But Satan, king of this world, has saved one temptation in reserve: the invitation to relinquish one’s moral purity and become normal. That’s the Last Temptation of the title, which we all face.

The conservative controversy which enshrouded this movie’s 1988 release was out of all proportion to its actual message. Even if we don’t accept Kazantzakis’ (and Scorsese’s) moral that becoming Christ is an option available to humankind, we all nevertheless face the existential struggles this Jesus overcomes. We only survive by coming prepared. That’s the Gospel of this Jesus: that we have the tools to overcome.

Wednesday, December 25, 2019

Help Thou My Unbelief

1001 Books To Read Before Your Kindle Battery Dies, Part 103
Daniel L. Pals, Nine Theories of Religion, and
Stephen Prothero, God Is Not One: the Eight Rival Religions That Run the World


Everyone knows the religion they believe, or anyway the religion they rebel against. But what is religion as an overall phenomenon, an apparently common aspect of the human experience? What is religion when you strip away one denomination’s specific beliefs? There you hit a much thornier aspect of humanity. Attempting to understand “religion” without parsing individual beliefs leads to an interdisciplinary academic field which American scholars call “religious studies,” and the British call “comparative religion.”

In his survey textbook, Daniel Pals provides a brief overview of the history of religious studies, and the nine most important theories yet devised to explain human religion. (That’s his nine, so the selection obviously isn’t without controversy.) In a short introduction, Pals explains that religious studies began in the 19th Century as an effort to justify Christianity. However, the field quickly transitioned into something more scientific and secular. The scholarly drama is almost biblical.

Early theorists, like the anthropologist James Frazer and the psychologist Sigmund Freud, assumed religion was an ancestral holdover from humanity’s pre-rational days. Karl Marx believed the wealthy used religion to pacify and control the masses. Even Émile Durkheim, less actively hostile to religion than his peers, assumed religion served an essentially socializing purpose, channeling humans into an organized society. Funnily enough, these early sociologists, psychologists, and economists all saw religion as a sociological, psychological, or economic phenomenon.

You get well into the Twentieth Century before anybody considers religion as a religious phenomenon. But once you do, the transition is remarkable. When serious scholars like Mircea Eliade or Clifford Geertz attempted to understand religious behavior through believers’ own eyes, they reached remarkably diverse conclusions. These more sympathetic general theories of religion are harder to apply, because they’re more diffuse and require larger bases of evidence. But they reflect religion as it materially exists.

Most importantly, these general theories solve nothing; even the most recent theories make assumptions that serious critics have credibly attempted to refute. No single theory, from any scientific discipline, has ever successfully explained religion, not without excluding swaths of evidence. But by surveying multiple theories, we readers can selectively apply their various insights to real-world circumstances and better understand particular religious choices. Just don’t let any single theory become some form of new secular dogma.

The single most insidious prior assumption is probably the conjecture that all religions are identical, that all lead equally to salvation. This presupposition has confounded secularists, who don’t bother separating conflicting claims, and also religious people, especially Christians who want to resist their religion’s universalist assertions. But as Boston University religious historian Stephen Prothero writes, all religions can’t possibly lead everyone equally to salvation, since salvation is a uniquely Christian claim. All religions aren’t interchangeable.

Contra this feel-good globalism, Prothero posits a need for familiarity with humanity’s many religions. Toward that end, he surveys the eight religions he considers most influential in today’s international milieu. (Again, it’s Prothero’s eight.) These are the three Abrahamic religions, the four most widespread “Eastern” religions, and the Yoruba tradition, which became global owing to the trans-Atlantic slave trade. These eight religions, in distinct ways, exert diverse pressures on politics, economics, and lifestyles daily.


Outsiders often assume certain transcendent claims describe all religions. Not so, says Prothero. Many precepts absolutely necessary to Christianity, like God or an afterlife, don’t exist in other religions. Some religions, like Confucianism, are entirely this-worldly, while others, like Buddhism and Daoism, require some separation from this world. And even seemingly unitary religions like Judaism disagree wildly on important points: not all Jews agree on an afterlife, or even on God’s literal or symbolic existence.

Prothero isn’t a theorist. But he postulates a simple heuristic for understanding how religions work: they identify a problem, offer a solution, construct a path to achieve that solution, and offer human or superhuman exemplars of how to follow that path. This heuristic isn’t airtight—it could apply to the Marine Corps, for instance—but he makes a persuasive case that, to understand various religions’ incompatible claims, we must see them on their own terms.

These two books aren’t simple bedtime reading. Between them, they cite many sources, offer occasionally incompatible evidence, and drop so many names, I recommend taking notes. However, if you hope to understand religion as a human phenomenon, they provide plain-English introductions to complex topics which often aren’t explained in ordinary language. And both include enough source notes to continue self-guided study beyond the introductory level. They raise more questions than they answer, but they’re good, important questions.

Monday, December 23, 2019

What J.K. Rowling Says About You and Me

Celebrated Scottish author J.K. Rowling’s rapid collapse into Internet pariah status last week demonstrates a whole range of issues. The way rich people with massive platforms apparently can’t pipe down while they’re ahead, on the one hand; and on the other, the way Leftist puritans will excommunicate one of their own for not toeing the entire party line. Or the question of whether we can appreciate art while criticizing the artist.
 
I’d like to focus on another question, though: how much of Rowling’s downfall happened because we readers projected ourselves onto her? When she tweeted in favor of economic justice for poor and working citizens, like she herself used to be, we flocked around her. Then she tweeted comments that showed her lack of awareness regarding sensitive racial issues, especially in America, so we turned against her. And she won us back by tweeting against Donald Trump.

This whiplash movement between extremes makes me wonder exactly how much we’re really responding to her. Yes, some comments implied Rowling has a problem with implicit, unexamined racism, to give just one example. However, as Ibram X. Kendi writes, we all do, sometimes. Even Kendi, a university professor specializing in the history of race and racism, admits finding poorly sublimated racism in himself, and constantly struggling to identify and expunge it.

Rowling’s comments defending a transphobic troll, laced with dog-whistle language pinched selectively from right-wing web discussions, hit many readers pretty hard. My observations suggest, purely anecdotally, that many audience members took Rowling’s comments so hard because they expected her to demonstrate their treasured values. Though remarkably few would admit it, I suspect many people fear they might harbor unsavory exclusionary opinions too.

This possibility struck me when I read a comment on a Harry Potter-themed Tumblr post a friend had shared on Facebook. “No Harry Potter memes!” this friend-of-a-friend demanded. “She’s nothing but a TERF hoe.” This comment jolted me severely, not just because this man wanted to defend trans women using the misogynistic term “hoe,” but also because he said “nothing but,” crushing Rowling’s accomplishments into the most repellent moment in her public life.

In other words, in wanting to condemn an outspoken TERF, this person, this man, used the exact kind of ugly, misogynistic bigotry against which outspoken womanist leaders have fought for generations. This man demonstrated internalized anti-woman attitudes which he probably doesn’t even realize he possesses. Like President Trump insisting he’s “not a racist,” this man would probably insist he isn’t sexist. But he did an ugly sexist thing.

Again, I recognize my observations are purely anecdotal. It’s difficult to get wide cross-sections of public data on sexist attitudes, especially when people’s actions diverge so widely from their self-image. But purely from my individual experience on Left-leaning social media, many people most aggressively condemning Rowling’s statements have revealed deep, unexamined bigotry in themselves. And you know the risks in pointing such bigotry out.

Moreover, I suspect this conflict, with someone as public as Rowling in the center, may well make things worse overall. Rowling is unlikely to change her mind after airing her opinion in public, because as any psychologist can tell you, being wrong hurts. Not just in an abstract, spiritual manner either; being wrong, and especially being called out for your mistakes, can fire your brain’s pain centers. It hurts like being punched.

Meanwhile, those condemning Rowling’s statements are equally resistant to dialog, because they invest their public presence in being anti-whatever we’re angry about today. Anger can feel good, releasing adrenaline and endorphins, giving a person the high we associate with being righteous. So groups encourage one another to anger, everyone involved gets high, and everyone feels vindicated. Thus we get today’s ugliest swarming behavior, Internet shaming.

Please don’t misunderstand. I think Rowling picked the losing side, used ugly language that encourages nasty behavior, and history will judge this moment in her life poorly. Still, this moment will probably drag longer than anyone involved expects, and looking back, even those who currently feel righteous will experience deep shame over their own statements. Many people are revealing more about themselves than they probably realize.

Public attitudes about sexuality and gender have changed very rapidly in recent years. People formerly perceived as progressive now often look like backsliders, and even those who keep abreast of developments frequently must walk back statements from their pasts. Rowling set herself up for this criticism. But those condemning her must take care they don’t prep their own future condemnations at the same time.

Friday, December 20, 2019

The Second-to-Last Jedi



With today’s long-anticipated release of Star Wars: The Rise of Skywalker, there’s an apparent popular reëvaluation of The Last Jedi, the previous franchise entry. When it debuted two years ago, the consensus said the movie had many good ideas, but was too long and talky, with characters explaining the story’s thesis to one another. Now, at least on Blue Twitter, the new consensus is that it’s the best SW film since The Empire Strikes Back.

So, dedicated to both scrupulous honesty and pop-culture deconstruction as I am, I made the only reasonable choice and re-watched the film. In fairness, without the cloud of expectations lingering from the original trilogy, it’s better than I remembered. It still takes too long to make its point, then keeps going after it’s made. But it has charming characters, an interesting take on class consciousness, and more well-rounded female characters than most mainstream genre films.

Yet I remained dissatisfied, and couldn’t quite place why for over two hours. It mirrors Empire, still the best SW movie. Then suddenly, as the last vestige of the beleaguered Resistance settled into the realization that help wasn’t coming, General Organa eased herself into a chair, groaning: “Hope has died. The spark has gone out of the galaxy.” And I realized why the story continued to bother me.

This movie ties together two incompatible moralities.

Empire also features the Rebellion in retreat, moving into desperation as the student Jedi, cut off from his larger cohort, learns that ancient morality. Yet in watching this retreat, I felt something very different. Han and Leia fleeing Vader have highs and lows: when they find refuge with Lando Calrissian, sure, their surcease is false hope. But they don’t know that. We never get, as with the Resistance, the sense that they’re trying to convince themselves.

The Last Jedi features the Resistance going from defeat to defeat, digging deeper to find reserves of courage to keep believing. Luke Skywalker re-enters the fight and turns the tide only after the Resistance has literally given up hope, and prepares to die. Empire never involves surrender to fatalism, even at the nadir. This one-way trek through defeats to the lowest possible ebb reflects the standardized screenwriting beat sheet from Blake Snyder’s Save the Cat!



If you’ve noticed that movies, especially genre blockbusters, look remarkably similar recently, blame Blake Snyder. His screenwriting guide gives a breakdown of story progression which, he contends, creates instant audience recognition and emotional bonding. He gives instructions for writers to make stories instantly relatable, with clearly defined heroes who sacrifice everything, including themselves, to prevent monolithic evil. But they find victory, in Snyder’s “beat sheet,” only by progressing from defeat to humiliating defeat.

Sound familiar?

Snyder’s beat sheet, though, isn’t just a storytelling device. It includes a specific moral mindset, one reminiscent of early Christian martyrdom. Snyder teaches that characters’ suffering is ennobling, and that only through suffering do we discover their true golden nature. Characters should constantly do right, even suffering great personal violence, and remain undeterred as the universe repeatedly demolishes their every accomplishment, confident that the universe will ultimately vindicate them. It’s like a medieval saint’s hagiography.

Except Star Wars isn’t based on this morality. George Lucas famously wrote the first movie with two books on his desk: a Webster’s dictionary and Joseph Campbell’s The Hero With a Thousand Faces. Campbell, who was raised Catholic but abandoned faith when he discovered Jungian psychology, contends in this book that all religions, philosophies, and myths descend from the same internal struggle all humans face when passing into adulthood. Morality, to Campbell, is basically psychological.

Most important, to Campbell, suffering is common, but it’s not necessary. It’s simply a by-product of becoming yourself in a world that demands you conform rather than mature. In Empire, the Rebellion struggles because it believes in capital-T Truth, in a universe that is deeply impersonal and frequently unjust. In Last Jedi, the Resistance struggles because a morally invested universe needs to purge human illusions, before it can permit the Conversion Experience, and ultimate redemption.

There you have it. That’s why I still can’t get behind Last Jedi: because it tries to impose a humanist demi-Christianity on a narrative founded on moral psychology. The two moral forms don’t fit together, and the result feels awkward. Y’know what, though? I’ll still probably see The Rise of Skywalker this weekend. Because through it all, Star Wars is founded on faith in hope. And hope, I guess, will always keep me coming back.

Thursday, December 19, 2019

Doctors vs. Accountants©: Part III

Alexander Fleming, discoverer of penicillin
I began writing this series with the best intentions. Affected by a single, emotionally raw tweet, I simply wanted to express why the most common proposal to remedy insurance bureaucracy, nationalizing American healthcare, will only put new bureaucrats in charge of the same system. I’m no fan of private insurance, which privileges corporate interests over the patients it’s supposedly organized to serve. But even less do I like bureaucracy.

As I’ve attempted to clarify my positions, however, I’ve apparently made things even murkier. A dear friend, recently diagnosed with a rare and painful genetic disability, praised my thoughtful responses, but replied, “it feels like you’re saying that people like me are going to be screwed no matter what.” This criticism pierces my heart, because I struggled with this very thought while writing, fearing I was condemning my friend, and people like her, to a lifetime of suffering without remedy.

The more I thought about things, however, the more I’ve realized this isn’t necessarily true. Yes, advances in medical technology have meant people once doomed to early, painful deaths can now live longer, do more, and thrive abundantly in ways once unthinkable. And as I wrote earlier this week, that technology is cripplingly expensive, meaning some bean-counter somewhere needs to make life-or-death decisions about who gets priority access.

But, conversely, the most important advances in medical history haven’t been either high-tech or particularly expensive. Scottish microbiologist Alexander Fleming discovered penicillin accidentally, after leaving a petri dish unattended while on holiday. Louis Pasteur, famous for discovering a rabies treatment, and Edward Jenner, who invented vaccination, worked in similarly low-tech environments, pioneering practices that, today, aren’t very expensive or difficult to acquire.

Louis Pasteur, discoverer of multiple medical procedures
Even my friend’s diagnosis of Ehlers-Danlos Syndrome, a rare connective tissue disorder, didn’t come through expensive, specialized innovations. Her doctor simply listened to her symptoms, compared them to known research, and referred her to a geneticist. It took years for her to reach the point where a physician would trust her enough to simply listen, but I suggest that’s a whole other problem.

So let me suggest a remedy.

Let’s return to a source I cited at the beginning of this series, Princeton economic historian Jerry Z. Muller. He writes that doctors and accountants have always sparred over treatments: doctors often want to pursue heroic lifesaving measures, even when these measures are expensive and divert resources from a larger pool of deserving patients. Accountants, by contrast, strive to keep costs down, sometimes even when a patient’s treatment would be both affordable and cost-efficient.

This ordinary conflict becomes exacerbated by both private insurance and nationalized health care, because the accountants responsible for the economic side no longer work within the medical treatment environment. They occupy external positions in corporations and governments, making decisions based on Xeroxed checklists, without consulting the doctors, and without the doctors having any avenue of appeal. In other words, the problem isn’t accountants, it’s their bureaucratic framework.

Professor Muller suggests a straightforward response to this problem: trust those intimately familiar with the problem to also have intimate familiarity with the solution. Muller doesn’t make this suggestion only for health care. Teachers know better than corporate standardized-test writers whether their students are prepared to graduate. Farmers know better than ConAgra or Monsanto how to husband the land. Doctors and on-site accountants know how to treat patients.

Edward Jenner, discoverer of vaccines
My friend’s Ehlers-Danlos diagnosis took twenty years, but I propose the problem isn’t the doctors. I witnessed several, though not all, of her interactions with medical professionals, and I feel confident in saying, they rushed through the diagnosis process, asking checklist questions and assigning her a pharmaceutical diagnosis, because that’s all that insurance would pay for. In order to cover operating expenses, doctors need to rush their patients, lest they go broke.

Return, please, to the image I first conjured: Doctor House struggling for days, weeks even, over one patient. Staying up all night consulting medical journals and concordances. Working to parse the diagnosis. That’s not realistic for most patients, certainly, or their bills would run into the millions of dollars. But imagine if doctors, and their on-site accountants, had time to converse with their patients before settling on whatever diagnosis the insurance bureaucrats will cover.

I suggest my friend might’ve gotten her diagnosis years ago, if the bureaucracy hadn’t held the process hostage for profit. Government bureaucrats, cognizant of the next election and pot-luck outrage over “taxpayer dollars,” will do no better. We need to trust skilled professionals, intimately familiar with their field, to make these decisions. The solution to bureaucracy isn’t reshuffling the bureaucrats; it’s removing them entirely.

Monday, December 16, 2019

Doctors vs. Accountants©: Part II


Last week, I wrote that current attempts to expunge for-profit influences from American medicine are doomed because that simply involves reshuffling the medical bureaucracy. I came in for some mild criticism, suggesting I missed the point. “If we funneled a lot of the money that we put into welfare for corporations into helping people get the medical treatment they need,” a good friend wrote, “we could reduce a LOT of the bureaucratic harmful determinations.”

I understand the appeal of this argument, but I don’t believe it’s true. Yes, I’d love to see some of America’s massive federal subsidies to, say, hydrocarbon mining and tech giants, redirected to serving ordinary citizens’ interests. Certainly we need to stop using public funds to protect the wealthy while ordinary people lack access to common-pool resources. But I’m not sure that’s what’s really happening in medicine. It’s something more subtle and insidious.

All economies are structurally organized around some resource that forms a bottleneck. Something that’s scarce and desirable forms the foundation of every economy. Whether it’s gold, or land, or human labor, every economy needs something that exists in finite amounts; the medium we use to value and exchange that resource, we call “money.” The key to unlocking and reforming an economy lies in identifying that bottleneck resource.

What, in medicine, is the bottleneck? Two or three years ago, I might’ve said “human labor.” Because medicine is highly specialized, and requires years of technical training to begin a career, doctors and nurses will always be in finite supply. And while for-profit hospitals can outbid for the services of experienced, highly-skilled professionals, access to quality, affordable medical care for numpties like me will always founder on money.

But recent news makes me question this assumption. To give just one example that’s gained recent news-cycle traction, we’re also facing a potential shortage of Earth’s helium supply. When I say “helium,” you possibly think of party balloons and silly voices. But much highly specialized technology, including, say, MRI machines, relies upon helium, which is so scarce on Earth that we risk running out in about 35 years.

Therefore we must make hard decisions about allocating our finite helium, and other nonrenewable resources. Simply turning free markets loose sounds desirable to strict libertarians: if doctors need helium more than party planners, let them outbid. But forcing doctors to compete in resource auctions will increase already-staggering prices, ensuring that ordinary people can afford medical care even less than we do now. Some authority needs to intervene.


One cannot overstate the importance of scarcity. Those of us old enough to remember the Clinton Administration’s attempts to reform health care, derisively dubbed “Hillarycare,” will recall conservative anger at the attempt to nationalize one-seventh of the American economy. But by the time Obamacare rolled around, that number had risen to one-sixth. The cost of medical care is growing faster than the overall economy, apparently.

Somebody has to make life-or-death decisions about allocating limited resources. Conservatives want private corporations to make those decisions, while progressives trust the altruism of the state. In both cases, some poorly-paid underling with looming deadlines will have to make decisions, about which they might be under-informed or even ignorant, based on one-size-fits-all metrics handed down from authority. That’s the living definition of bureaucracy.

Not only is medical technology limited and expensive, so is skilled labor. So is physical space for providing patient treatment. So is decision-making to determine who receives treatment: a Dutch study found doctors widely agree that older patients are under-treated compared to younger patients, while the American Medical Association prizes “duration of benefit,” which privileges the young. So, despite our best efforts, we’re still asking bureaucrats to make important decisions.

Though I recognize the limits of a layperson’s thought experiments, I cannot separate bureaucracy from medical treatment in a capitalist structure. Western hospitals grew out of Christian monastic orders, notably during the Crusades, and hospital treatment was initially a form of religious devotion. Perhaps we could separate money from medicine if we recaptured this sense of spirituality, but how? Money changes people’s values, and the change usually can’t be reversed.

Please don’t misunderstand: I dislike the status quo greatly, and believe it abuses poor and minority people in its lopsided distributions. But, unless we remove all resource bottlenecks, which cannot happen if we want high-tech treatments available, I can imagine no remedy that isn’t just another reorganization of the bureaucracy. Whether you prefer state bureaucrats or corporate ones, they’re still bureaucrats. And we’re still running out of helium.

Friday, December 13, 2019

Boris Johnson, Donald Trump, and the Road We're On


Well, Boris Johnson has popped his head from the freezer and apparently seen his shadow, guaranteeing Britain another five years of bullshit. Despite a tanking economy, scathing international ridicule, and scant support from his own party, the British public nevertheless gave him the electoral mandate which, until now, he has lacked. As a longtime Anglophile, I care about this sort of thing. But I suggest other Americans should care, too.

Because American and British politics have a history of moving in tandem. Margaret Thatcher’s election preceded Ronald Reagan’s, and their two hand-picked successors, John Major and George H.W. Bush, collapsed almost simultaneously. Bill Clinton revived Churchill’s old term “special relationship” to describe his liaison with Tony Blair, a relationship that continued with George W., cementing American perceptions that the two major parties are more similar than different.

Most important, the Brexit vote came just months before Donald Trump won the Presidency on a technicality. Neither Trump nor Britain’s Conservative Party won a straight majority (which is pretty common in both countries), but both nations have a first-past-the-post election system that means a weak candidate can strategically half-ass their way into an overwhelming legislative majority, which Johnson’s Conservatives did yesterday.
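
To see how that half-assing works, here’s a toy sketch, with invented numbers, of how plurality-by-district converts a minority of votes into a supermajority of seats:

```python
# A toy illustration (hypothetical numbers, five imaginary districts) of
# first-past-the-post. Party A wins four districts narrowly and loses one badly.
districts = [
    {"A": 40, "B": 35, "C": 25},  # A wins the seat with 40% of the vote
    {"A": 41, "B": 39, "C": 20},
    {"A": 42, "B": 38, "C": 20},
    {"A": 40, "B": 30, "C": 30},
    {"A": 10, "B": 80, "C": 10},  # A's one blowout loss
]

seats = {"A": 0, "B": 0, "C": 0}
for d in districts:
    winner = max(d, key=d.get)  # first past the post: plurality takes the seat
    seats[winner] += 1

total_votes = {p: sum(d[p] for d in districts) for p in seats}
print(seats)        # {'A': 4, 'B': 1, 'C': 0} -- A takes 80% of the seats
print(total_votes)  # {'A': 173, 'B': 222, 'C': 105} -- on about 35% of the vote
```

In this contrived example, party A actually polls fewer total votes than party B, yet takes four seats in five. Let your opponents pile up wasted votes in safe seats, spread your own votes thinly, and the map does the rest.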

(For Americans, and others unfamiliar with the British system, the Prime Minister isn’t exactly like the President. Queen Elizabeth remains Britain’s nominal head of state, though no monarch has publicly contradicted Parliament since Queen Victoria. The PM is an elected Member of Parliament, who gets further elevated by Parliament itself. This means the PM is usually the head of the majority party, and therefore rarely faces meaningful opposition from the legislature.)

So, follow me here. Britain looked at a government with unpopular policies, a history of racism, and demonstrations of both criminal intent and widespread procedural incompetence, and said: sure, we’ll have some more of that. This should worry Americans already concerned with the Trump administration’s visible lawlessness. The English-speaking world is apparently ensnared in some multinational cultural moment where anything goes, as long as aging White people permit it.

I don’t say “aging White people” flippantly. In Britain and the U.S., the longstanding demographics are changing quickly, as former majorities are increasingly displaced by immigrants. Bruce Cannon Gibney writes that, at present, White Baby Boomers outnumber all minority voting blocs together in America, but their youngest members currently qualify for AARP discounts at most restaurants and grocery stores. And when they die, Whites will slip from a majority to a plurality.

Immigration from Latin America, in the U.S., and from the former Empire, in Britain, has changed both countries’ voting and cultural profiles. Not that the heritage has disappeared; William Shakespeare and Edgar Allan Poe won’t lose their status in the near future. But we cannot trust that the “normal” aggregate citizen will, in coming years, look like me. I think that’s great. Others consider that an assault on everything they treasure.

Many voters respond by aggressively rejecting the outside world. Donald Trump ran on a platform of undisguised racism, economic policies that kick the weak, and promises to quit the international alliances that have secured America’s political and moral leadership since World War II. And he won. So Boris Johnson ran that same platform on steroids. Britain’s economy has been dwindling since the Brexit vote, and British voters apparently don’t mind.

America’s economy is apparently more fraught than Britain’s; our current growth rate mostly depends on who you ask. Trump’s greatest accomplishment is, apparently, to not submarine the boom economy he inherited from the previous administration. But his racism, his profiteering from office, and his undisguised disdain for democracy remain on public display. I truly fear, if something doesn’t change soon, we’ll see the electorate next year shrug and continue unperturbed.

I don’t say this flippantly. Just yesterday, the same day the British electorate returned Boris Johnson to power despite his historically unpopular policies, former Arkansas governor Mike Huckabee announced his intent to shepherd Donald Trump to a third term, in defiance of the U.S. Constitution. This President’s apostles have taken their hero, who is, remember, an admitted sexual harasser and probable rapist, and elevated him to the status of secular Messiah.


Throughout my lifetime, the pattern has been clear: America and Britain aren’t the same, but they move on substantially parallel tracks. And the current track calls for the electorate to return to power a leader whose policies are massively unpopular, but who is somehow personally well-liked, despite demonstrated incompetence in office. We who believe America has a moral mission in this world should take this as open permission to panic.

Tuesday, December 10, 2019

Doctors vs. Accountants©: the Role-Playing Game


I just encountered another of those audience-grabbing stories about insurance companies failing to provide medical coverage. The bean counters apparently denied somebody’s necessary life-saving medical care. I understand the outrage this story causes, because we all imagine ourselves, with years ahead of us, suddenly facing mortality because an actuary somewhere said “no.” Putting ourselves in those shoes, the prospect seems horrific.

Is it, though? Princeton economic historian Jerry Z. Muller writes that, while medical metrics are frequently overused in ways that undermine doctors’ autonomy, that isn’t always bad. It certainly can be, when insurance executives who don’t understand medicine overrule a doctor’s opinions based on shoddy math. But throughout medical history, Muller says, hospitals have hosted tension between doctors who want to take heroic life-saving actions, and accountants who tally the costs.

Generations of TV medical dramas have convinced laypeople that medicine consists of earnest, energetic professionals making split-second decisions while lives hang in the balance. This might make sense in Emergency Room conditions, where people come in broken and bleeding, staving off burst appendixes and suppurating aneurysms. But most medical care is slow, deliberative, and costly. Asking whether continued costly treatment will have meaningful outcomes isn’t always unreasonable.

The model we’ve all seen in stories like House M.D. looks exciting, dangerous, and fun. Somebody enters the hospital with a twitching thigh muscle, and the glamorous doctors, who have only one case, piece together clues proving how an aortic dissection threw a clot to the brain, resulting in testicular obstructions: medicine as logic puzzle. I might’ve paid better attention in middle-grade biology had I thought I’d get a job like that.

But an NPR human-interest story broadcast at the peak of that show’s popularity traced the costs of just one episode, landing on a $300,000 price tag—and that’s a conservative number, because I added up their annotated costs and got something far higher. And that was nearly ten years ago; advancing technology and added administrative bureaucracy probably mean it’s far higher now. Even under the best circumstances, medicine is expensive.

The early-seasons core cast of House M.D.

Let me interrupt myself here and note: I don’t mean accountants should be more active in scaling back costs and denying medical care. A good friend recently received a medical diagnosis that mercifully ties all her disparate symptoms together. She should’ve received this diagnosis twenty years ago, but overworked, underfunded doctors made hasty short-term determinations, almost certainly rushed along by bean-counters. A little more time and money could’ve saved years.

So yes, I acknowledge that there’s no simple, arithmetic formula to strike a balance between the accountants and the doctors, whose desires often conflict. And the arena of that conflict is a patient’s body. The outcome, as seen in the tweet quoted above, can be tragic for individuals—yet unavoidable for the medical economy overall. Anybody hoping to solve this problem concisely should also wish for a pony, because it’ll do as much good.

Because, let’s be honest, when people complain about private insurance’s interference in medical decisions, their solution is often to nationalize medical care, to a greater or lesser degree. Even as America’s political Left wants to institute Medicare For All, Britain’s political Right is actively considering privatizing the NHS, the very system American progressives often brandish as a model of more efficient medicine. The trouble is, the NHS isn’t much better.

British medical journalist Dennis Campbell, writing in the Guardian, a newspaper with undisguised Leftist allegiance, notes that NHS bureaucrats regularly overrule doctors’ opinions. Campbell cites a panoply of reasons bureaucrats do this, but buried in this list, he names “resources”—a weasel word meaning “money.” So the NHS, like America’s private insurers, overrules doctors’ wishes to keep costs down and funnel money where it’ll do the most good.

If we expect a constant stream of on-demand, high-tech medical treatment, we’ll inevitably run into the impediment that all resources, including money, are finite. Somebody needs to make decisions about who gets costly, invasive treatment. That “somebody” will inescapably be a bureaucratic functionary whose official duties, whether funded by the state or private capital, will conflict with life-saving desires. Somebody’s life will take priority over another’s.

The system is heartless to individuals. I’m sure that woman whose life-saving cancer treatment got denied is suffering greatly. But trading corporate bureaucrats for state bureaucrats won’t solve anything. Unless we abjure high-tech medicine, which will never happen, we’ll always have to make finite resources cover infinite needs. That’s what bureaucracy does. Sometimes, that means making painful decisions and letting human lives go.

See also: Doctors vs. Accountants©: Part II

Wednesday, December 4, 2019

OK Go Boomer Computer

Billie Eilish
Pop singer Billie Eilish’s recent admission that she doesn’t know who Van Halen is has reignited the generational debate that apparently flares every two weeks anymore. Van Halen’s fan base of older Gen-Xers has to admit they’re officially as old as they once thought their parents were. Since Van Halen never much spoke to me, I wouldn’t much care about this faux controversy, if it didn’t come so soon after the now-obsolete “OK Boomer” joke.

When I was 16, I desperately wanted to be a Baby Boomer. Looking around, I saw the cultural and political power Baby Boomers possessed, the authority they held over my choices, and the simple ubiquity of their presence. When you're a kid, you have two options when faced with authority: you rebel against it, with the moral incoherence for which teenagers are justly famous, or you emulate it. I chose column B.

While still in elementary school, I remember watching PBS documentaries about the cultural milieu in which Boomers grew up, from the images of wealth they witnessed on shows like Leave It To Beaver and Bewitched, to the socially acceptable rebellion they helped create at Woodstock. Based on the exposure I received in childhood, I believed Boomers won the Civil Rights conflict—although Malcolm X and Dr. King were both born in the 1920s.

We didn’t have that. As a member of Generation X, I struggled with communal identity. We didn’t even have a generational handle until 1991, over twenty-five years after our generation began; we bounced among “Latchkey Kids,” “the Baby Bust,” “the MTV Generation,” and “Generation 13” before Douglas Coupland’s novel gave us a name. If television documentaries gave me a false impression of Boomer unity, media treatment convinced me that my generation barely even existed.

However, we did exist. We test-marketed the claims many Millennials now use to make bank, including the claim that wages were flat while work was up, meaning we faced the likelihood of a chronically impoverished future; the claim that cultural institutions treated us exclusively as consumers, and demonstrated little patience with us as makers; and the claim that the economy was saddling us with debts we’d never be able to repay. History has vindicated us.

So you understand my frustration when, several weeks ago, New York radio host Bob Lonsberry tweeted that “Boomer is the n-word of ageism.” After forty years of constant saturation with the idea that Boomers were cultural innovators, political liberators, and better musicians than every subsequent generation, somebody purporting to be a Boomer spokesperson claimed Boomers are as oppressed as African Americans. This takes oppression chic to a new level.

Eddie and Alex Van Halen
(don't ask me which is which)
The “OK Boomer” tag, like every other fad in our media-drenched environment, had a remarkably short lifespan. In November, it pushed people born between 1945 and 1965 to heights of spittle-flinging rage; but it’s already getting used ironically by comedians and basic-cable news commentators. Just like “MeToo” and “Black Lives Matter,” overexposure defanged the monster in its cradle. Yet the reaction it generated speaks volumes about its intended audience.

I have one problem with the Billie Eilish controversy, though: approaching fifty years since the Beatles released their final album, their music still appears timeless and relevant. Van Halen doesn’t. Van Halen’s peak album, 1984, is larded with okay songs like “Jump,” and shitty pieces of self-indulgence like “Hot For Teacher.” As the movie Yesterday recently demonstrated, playing the Beatles moves many generations equally. Van Halen sounds very early-1980s.

People slightly older than me feigning distress that a 17-year-old doesn’t know Van Halen is like demanding that a 17-year-old in 1984, the year the album dropped, remember Frankie Laine and Perry Como, stars from thirty-five years earlier. I sure didn’t when I was 17. In both cases, the Billie Eilish faux-controversy and the “OK Boomer” outrage, we see people old enough to be parents or grandparents demanding they be treated as permanently, eternally young.

Essentially, we have two generations publicly demanding to act like Peter Pan, and it’s embarrassing. Boomers and their first-generation offspring expect history to freeze, their childhoods to be treated as sacrosanct, and post-Millennials to treat them like peers. They want congratulations for the innovations they nurtured, and then, they want all innovation to stop.

And making a teenage pop singer the embodiment of this desire for nothing new to happen, is the height of irony.

When “OK Boomer” began, I thought we were seeing old people refusing to get old. But clearly my generation is refusing to get middle-aged, too. Here’s hoping we can all stop being narcissistic, and appreciate that the world keeps growing and blooming, even when we’re not in the center of it.

Monday, December 2, 2019

Why Is Making a Superman Film So Hard?

Henry Cavill in Man of Steel
DC’s continued struggles to create an economically viable cinematic enterprise have reached a new low. Writing in Forbes, pop-culture critic Dani Di Placido describes DC's struggles to find something for Superman to do. You’d think an alien living in America, believing in justice and honor, would have important resonance in today’s divided cultural landscape. Perhaps movie studios, notoriously jittery about public opinion, find this too pointed.

The rush of armchair critics on FaceTube and InstaTwit has readily condemned this timidity, for obvious reasons. I’m tempted to echo these positions, because that’s low-hanging fruit. Superman’s backstory would seem ripe for utilization in today’s America. His historic connection of “truth [and] justice” with “the American way” should address our trend toward conflating Americanism with untruth, subjectivity, and avarice. This shouldn’t be a hard sell.

Yet the economic landscape controlling this discussion lies hidden beneath the cultural issues. Superman, Batman, and the MCU don’t just objectively exist; they are properties controlled by media conglomerates, which make remarkable money off their holdings. The rentier economy allows DC and Marvel to get wealthy by owning and licensing their intellectual property, but that wealth makes them risk-averse.

DC Comics is owned by WarnerMedia, so naturally Warner makes DC movies. WarnerMedia is America’s second-largest media conglomerate, controlling about sixteen percent of America’s media revenue. That’s slightly under half the revenue controlled by America’s largest media conglomerate, Disney, which owns Marvel Comics and Star Wars. Following its Fox buyout, which reduced America’s major media conglomerates from six to five, Disney controls approximately one-third of America’s media revenue.

Superman on the cover of Action Comics #1
So the continued runaway success of the MCU, and DC’s inability to compete, isn’t about the competition between these two comics companies; it’s a proxy feud between America’s two largest media corporations. As I wrote recently, talking about Martin Scorsese’s condemnation of the MCU’s box-office domination, this is somewhat misguided. Yes, major-studio franchises control America’s box office. But only two franchises, Star Wars and Marvel, are currently very successful.

And both are owned by Disney.

I’m loath to offer suggestions to major media conglomerates for controlling their more lucrative properties, since history suggests they’ll misuse that power to further limit the government. These corporations have a history of kissing tyrannical ass to ensure their continued revenue flow: Disney itself has become particularly risk-averse since it temporarily lost China’s import market following its 1997 dud Kundun. Conglomerates suck just fine without my help.

Nevertheless, Superman has loomed large enough in American culture, for long enough, that abandoning his principles to corporate timidity resembles a form of surrender. So I’ll weigh in anyway. Let’s start by remembering what made children embrace comic-book superheroes over eighty years ago: they embodied American moral convictions. They believed in the same things we believed in, and then followed through.

The runaway success of Wonder Woman should teach DC what audiences want. The titular protagonist, who believes moral right exists as an objective force and sets out to kill war amid history’s most pointless conflict, demonstrates what traits audiences reward with money. We want moral confidence, and a will to act upon this confidence. We want characters who remind us to do what’s right.

Christopher Reeve in Superman
Following the immense, brooding darkness of Man of Steel, many critics, including me, condemned the movie’s bleak, ends-justify-the-means attitude. Charles Moss, writing in the Atlantic, pointed out that Superman’s origins were remarkably grim and violent. But I’d suggest that misses the point. Early Superman had lite-beer socialist leanings, and his willingness (in a White, urban way) to confront slumlords and corrupt bankers reflected America’s unexpressed urge toward revolution.

Christopher Reeve’s Superman expressed the morals of another time. America was flush with cash in the post-WWII years, before media conglomerates began hoovering everything upward, and this Superman reflected an era whose proletarian values were less communist, more communitarian. Reeve’s Superman busted villains whose machinations interfered with everybody’s desire to live well together, rather than punishing criminals by forcing them to live under the conditions they created.

Superheroes today must express contemporary moral sentiments. What do Americans fear today? The political insurgencies of Donald Trump and Bernie Sanders bespeak distrust of political and economic establishments, including media monopolies, that serve themselves, rather than the people. Both Trump and Sanders campaigned against a fossilized social order stealing from the people. What if Superman, and the Justice League generally, fought the same fight?

This would require WarnerMedia to turn against its business model. But whose hand feeds it, ultimately? The shareholders’? Or ours?

Friday, November 29, 2019

New Millennial Pop-Folk Blues

Sharon Van Etten, Remind Me Tomorrow

“Sitting at the bar, I told you everything,” Sharon Van Etten sings mournfully to open this album. “You said ‘Holy shit. You almost died.’” Van Etten doesn’t much explain what “everything” means in this song, “I Told You Everything.” But it clearly involves a youthful sexual experience that leaves her shaken and scarred, yet, she implies, compelled to eternally repeat. That sets up this album’s mingled themes of dread and disappointment.

If, like me, you encountered this album through its advance singles, particularly “Comeback Kid” and “Seventeen,” you probably anticipated these themes. Heavy with melancholy and a sense of mortality, these songs reflect an artist who, thirty-seven years old when she recorded them, recognized an unmet need for pop music for a grown-up audience. But they don’t really reflect the album’s larger soundscape, which is unremittingly grim, verging on bleak.

Previously noted for a substantially acoustic singer-songwriter sound, Van Etten shifts on her fifth album to atmospheric electronica notable for its minimal guitars. And by “atmospheric,” I mean a minor-key bass chord on Farfisa organ runs through nearly the entire album, making your teeth vibrate like a 1980s horror movie soundtrack. This chord is so understated, though, that you may only notice its persistence on the fourth or fifth listen.

Yet this isn’t a horrific album. It’s sad and pensive, perhaps, often preoccupied with the past despite the pressing imminence of the present, but the only scary thing is how we interpret it. I noticed this on “Jupiter 4,” which has lyrics of unalloyed love—“Our love’s for real, how’d it take a long, long time to let us feel?”—played ironically against chords that sound like a breakup song.

Put another way, Van Etten inverts Hank Williams, who often played gloomy lyrics against bouncy tunes. Like Williams, she puts her words and music in direct opposition, just in the reverse direction. Listening to this album, you may feel a growing sense of dread. It comes not only from that somber chord, but from Van Etten’s contralto voice, which contrasts with the music without quite opposing it. Her voice seems separate from the instrumentation.

Sharon Van Etten
This has good and bad qualities. This album’s first three tracks are so uniform in sound and tempo that, if you’re listening with half an ear while driving or studying, you could be forgiven for thinking it’s one fourteen-minute song. Only with “Comeback Kid” does the sound become differentiated enough to feel a change. This is also where Van Etten’s vocals become distinct enough to follow without a lyrics sheet.

After that point, however, everything opens up. Her dynamic changes, and her voice becomes a more prominent instrument. Though the music remains atmospheric and dense, she becomes more willing to step up or fade back, as the message she conveys requires. Yet she never loses that reverse-Hank Williams trick, because her songs remain sonically stark, regardless of how optimistic or despondent her words are.

Please understand, this isn’t a timeless sound. My earlier reference to 1980s soundtrack music isn’t flippant. Her foregrounding of Farfisa or synthesizer on every track harks back to the music that dominated the soundscape of Van Etten’s childhood. As the oldest Millennials approach forty, yet frequently still can’t afford a down-payment on a house, this lingering backwards gaze will speak to their situation precisely.

Hearing this album as a unit, one wonders whether Van Etten intended it to announce her planned retirement. The single “Seventeen,” with its themes of generational angst, became her first to hit any Billboard chart. Like most Millennials, she both is and isn’t an adult, with all the responsibilities of a career and new motherhood, but a paucity of trust from the economy. One suspects she’s touched a nerve.

Motherhood in particular lingers throughout this album, though usually not overtly. On the final track, “Stay,” she muses, “Imagining when you’re inside, when you make those kicks inside. Don’t want to hurt you. Don’t want to run away from myself.” The prospect of caring for a helpless life, when her generation frequently can’t care for itself, scares her. But, she continues, “You won’t let me go astray.” Adulthood persists, regardless.

Pop music often requires artists to remain eternally teenaged and rebellious, because kids have more disposable income. But Van Etten’s compositions reflect a generation that never had economic stability enough to rebel, and now faces impending middle age. I find a kindred spirit in her: grown-up, yet still carrying the unfulfilled impulses of youth. This album starts slowly, sure. But by the end, it’s pop for a newly older generation.

Wednesday, November 27, 2019

The Happy-Dance of Fake Gender Roles

Rigaud's portrait of Louis XIV
(click to enlarge)
Whenever I get entangled in discussions of gender, my mind inevitably drifts to Hyacinthe Rigaud’s legendary portrait of King Louis XIV. Louis, the “Sun King” responsible for the Baroque atrocity that is Versailles, commissioned several paintings of himself, though Rigaud’s is probably the most famous. Rigaud depicts Louis, like nearly every portrait does, wearing high heels, leggings, a fur-trimmed wrap, heavy cosmetics, and a wig.

This painting depicts one of the most powerful men—and, for this discussion, I do mean “men”—who ever lived. Louis commanded a global empire with an iron hand, untrammeled by the oversight of any parliament or circumscribing aristocracy. He controlled France’s political, economic, and even religious culture with absolute authority. And he did so while wearing clothing that, to modern eyes, looks like a woman’s, if not a drag queen’s.

I recalled this painting again recently when a tweet went viral. Ashley StClair, a woman who calls herself a “freedom fighter” and “patriot,” shared an Instagram video of a father and son prancing happily in off-the-rack Halloween versions of the protagonists’ dresses from the movie Frozen. StClair captioned the video with “The testosterone is being sucked from our men right before our eyes.” Because obviously one Instagram video is a universal data point.

The reaction has been both swift and predictable. Defenders of the status quo have condemned the father for not forcing his son to adhere to masculine stereotypes by requiring him to play catch or work on a farm. Progressives have called for people to let families have fun, or insisted that gender inclusiveness is sexy. I can’t help thinking both sides miss the point: this post is built on entirely wrong premises.

StClair’s whole “testosterone” comment implies that gender, or anyway sex, is biologically determined. If that’s the case, why must we police gender roles, and punish anybody who transgresses? Is biology so brittle that it could be shattered by children being happy while trying on possible identities for size? Of course not. StClair feels compelled to enforce gender standards because she knows she’s protecting something artificial, for artificial reasons.

Louis XIV demonstrates this artificiality. High-heeled shoes, a fashion accessory connected today with women, were popularized at Versailles, specifically by Louis, who, at five-foot-five, was shorter than most of his courtiers. He feared his height made him look weak when holding court, so he ordered shoes designed to protect his image. Only after he abandoned wearing them, late in life, did women inherit this formerly masculine fashion accoutrement.

Medieval illustration of peasant dress
(click to enlarge)
We invent gender standards, just like standards of class, race, and nationality, to defend power arrangements. Medieval illustrations show that, though the sexes didn’t dress exactly alike, the differences were much slighter than we expect today. Billowy skirts allowed workers to move freely in an economy where both sexes did manual labor, while lace-up leggings protected workers from nettles and insects. Clothes weren’t gender markers; they simply existed.

So gender standards, as we understand them, are fake. Yet the opposite isn’t true, either. I have the same attitude regarding gender that Ibram Kendi has regarding race: the distinctions might be artificial, but their consequences are real and lasting. Working construction, a heavily gender-segregated occupation, I see men rigorously defending their gender identities, because it’s frequently the only credential they have for wage-earning in a stagnant labor economy.

Therefore, gender roles both are, and are not, real. And don’t bring me examples of traditional societies which had more than two established gender niches. Unless they had the same gender definitions, and those definitions never changed—which isn’t so—that doesn’t tell us anything useful. It only reinforces my position that gender roles are socially conditioned. Those who police categories, and those who contravene them, operate from the same expectations.

King Louis adopted his look for specific political reasons. He wanted kings of other nations (Rigaud’s painting was originally commissioned for Philip of Spain) to understand his wealth, power, and eternal youthful vigor. His laborers, wearing skirts to work the fields and vineyards, needed no such cultural reinforcement. They dressed in ways that freed them to work, and made them happy.

Happy. Like a little boy dancing in a dress.

I’m sure the anonymous father didn’t intend a political protest against intrusive gender standards. But he gave us one anyway. He showed us that one needn’t obey somebody else’s socially conditioned categories to be happy. The rules don’t objectively exist; they need self-righteous censors like Ashley StClair to police them. And if we don’t need those rules, we don’t need the rule-keepers, either. Maybe we’d all be happier dancing in dresses.

Monday, November 25, 2019

Classical Folk Fusion for a Discerning Ear

The Arcadian Wild, Finch in the Pantry

Listening to the title track, an instrumental, from The Arcadian Wild’s second album, I couldn’t help noticing something unusual: the fiddle and guitar were playing different underlying rhythms. The fiddle, largely the lead instrument, followed a looping six-beat rhythm suggestive of an Irish seisiún, while the guitarist played a four-square supporting chord progression sounding almost like a chorus of snare drums. They weren’t in different time signatures, in the Stravinsky style, but came mighty close.

This musical erudition characterizes the entire album. The acoustic instrumentation comes straight from the bluegrass tradition, with guitar and mandolin trading roles as percussion and driving force, and a layer of fiddle atop them. But they use staggered arrangements, syncopated beats, and cathedral choir vocals. Clearly these musicians learned their craft from the honored elders of musical lore, but aren’t beholden to tradition. They have their own story, and their own way of telling it.

From the opening song, “Hey Runner,” the band establishes their impatience with convention for its own sake. Singing from the viewpoint of a volunteer concert organizer dealing with an arrogant showcase performer, guitarist Isaac Horn expresses the disappointment inherent in becoming a musician today. Mandolinist Lincoln Mick does something similar on the more upbeat, possibly radio-friendly “Food Truck Blues.” They occupy a Nashville where paying one’s dues has become a full-time career stretching out for years.

Clearly this album comes from superior musicians more interested in creating music than in kissing record executives’ rings. Complex arrangements, which require more planning than today’s common studio jams, and dense, allusive lyrics reflect artists who spend time thinking about their music. It wouldn’t be accurate to say these artists aren’t listenable, because they emphatically are; but they don’t permit listening with half an ear while driving or studying or cleaning. They write for active listeners.

Tracks like Mick’s “Silence, a Stranger” reflect this ethos in their intricate lyrics. If you’re like me, you hear songs several times before you really begin processing the lyrics; initial attention stays on the music, which in this case is ethereal and dreamlike, but never wispy. Only on the fifth or sixth hearing do I catch lines like: “Stillness is a woman I’m too cowardly to kiss / A hallowed thing too holy for my unclean lips.”

The Arcadian Wild, l-r: Paige Park, Lincoln Mick, Isaac Horn

Seriously, I heard this album several times before catching how laden the lyrics are with references to literature, the Bible, and other sources. In today’s Nashville, songwriters dash off tracks hastily inside the studio, building them around radio-friendly hooks, because that’s what makes money. The Arcadian Wild are more contemplative and intricate, reflected in lyrics like Horn’s, from “Oh, Sleeper”: “I wonder who I’ll need to be today / They don’t need to change, I’ll relate.”

It bears noting that, though this isn’t a Christian band, they’re clearly influenced by Christianity. Besides Mick’s quoted lyrics above, which reference Isaiah and Proverbs, it’s possible to spot other Biblical references throughout. Mick does this more directly, while Horn’s influence comes from liturgy (his compositions use cathedral vocals more than Mick’s). This culminates in the long final track, “A Benediction,” which includes Irish-style blessings, lyrics in Latin, and the line “Because death has lost already.”

This level of musical sophistication has fittingly won The Arcadian Wild a small but loyal following throughout the independent folk circuit; they maintain a busy schedule of house concerts and intimate club dates. That’s how I encountered them, playing a small storefront church. Speaking to mandolinist Lincoln Mick after a concert, I mentioned I’d struggled to pinpoint why their sound felt familiar before finally placing it: they resemble Nickel Creek. “We’ll take it,” Mick said.

Horn and Mick, as composers and vocalists, share a thread of, let’s say, optimistic disappointment. They reflect the belief that things will get better soon, though they don’t know when, or why they haven’t already. (Fiddler Paige Park is billed as a full band member, but doesn’t compose or sing lead on any track; she replaces a banjo picker from the prior album. This band isn’t a boys’ club, necessarily, but the men obviously have creative control.)

Without a clear genre niche, this band may struggle to find an audience. Are they bluegrass? Classical fusion? Americana? My comparison to Nickel Creek isn’t flippant: both bands cultivate small but dedicated audiences, which spread by word-of-mouth rather than slick Nashville promotion. Maybe a review like this will bolster their public awareness. They certainly deserve that attention, because nobody else is creating music like theirs right now, and that’s a shame.