Tuesday, July 30, 2019

A Prayer for Joshua Harris

Michelangelo's The Creation of Adam, from the Sistine Chapel ceiling

Dr. Andrew Newberg, research psychologist at Thomas Jefferson University Hospital, devised an experiment: have people draw a picture of God. It’s disarmingly simple once somebody else already thought of it: don’t tell me your denomination, show me your God. Adults offered various pictures, including an eye, hand, or both in the sky; planet Earth; the Milky Way galaxy; a mirror; blooming trees and flowers; a blank page, because you can’t draw what doesn’t exist; and more.

Children, by contrast, consistently presented the same picture: an old White man with a long white beard, the image we recognize from Michelangelo’s Creation of Adam. According to Newberg’s book How God Changes Your Brain, this applied to children of both biological sexes, all races and national backgrounds, all religions, and no religion. Children have a consistent view of God.

I couldn’t help recalling Newberg’s experiment upon learning that former pastor Joshua Harris had recently announced the end of his marriage. Harris wrote the 1997 Christian bestseller I Kissed Dating Goodbye, the manifesto of the Purity Culture that permeated White Christianity in the late 1990s and 2000s. Like many religious people, Harris devised a system that worked for him, then imposed it on everybody, believing his experience was portable and universal.

Except, obviously, it didn’t really work for him.

After writing his name-making book at 21, and marrying his wife at 24, Harris became an East Coast megachurch pastor, heavily supported by his book’s durable reputation—few pop Christian books receive a second printing, but his remained available for over twenty years. That durability created tension: his position depended on never outgrowing the religious views of his early adulthood. He couldn’t leave the old White man for the mirror, trees, galaxy, or whatever.

Harris remained stuck in the religious views of a fervent 21-year-old who preached a black-and-white theology of “purity,” and that moral straitjacket apparently began strangling him. In 2015, he left his pulpit after it became common knowledge that his congregation had buried sexual abuse allegations, an increasingly common accusation these days. In 2018 he asked his publisher, Multnomah, to remove I Kissed Dating Goodbye from publication (they agreed), and apologized for its very existence.

Purity Culture taught young Christians they were either pure or impure, either holy or fallen. Like the Temperance Movement of generations gone, which separated Americans into Wet or Dry, Purity Culture had a Manichaean edge, and told adherents that once you turned apostate, forgiveness was impossible, or at least incomplete and conditional. Not only couldn’t Purity Culture youth have premarital sex, they were deemed “impure” if they even kissed or held hands before becoming engaged.

I somewhat understand this impulse. Harris’ book dropped less than one year after a Friends episode where Chandler and Joey disparaged Ross as “a freak” after it transpired that Ross had only ever had sex with his wife. Former pastor Harris and I are only months apart in age, so I understand, from direct experience, the strong Christian revulsion to what we probably both perceived as profligate sexual mores in 1990s culture. So he reacted against it.

Former pastor Joshua Harris

But by taking those mores to the opposite extreme, Harris created a public ethos based on shame and accusation. He made young Christians intensely self-conscious, telling them their sins were so heinous they couldn’t be forgiven. Many Christians of Harris’ (and my) generation continue to struggle with internalized guilt, believing minor sexual transgressions are so grave that God will hold them forever accountable, even as adult faith tells them otherwise.

Essentially Harris tied an entire generation to Michelangelo’s old White man. Twenty years of Christians, mostly White, mostly conservative, weren’t free to start seeing God in transcendent, galactic terms, because doing so would’ve made them formally Impure. To change, in Purity Culture, is to fall. I’m reminded of Oscar Wilde’s jibe that one mustn’t trust priests, because they’re required to believe at eighty the same things they believed at eighteen.

That includes Harris himself, who has announced he no longer considers himself Christian.

Harris is probably being too hard on himself. Christianity is still open to him; he just needs to see past his homeschooled straitjacket. The harm he’s caused probably means teaching and preaching are forever closed to him, but Christianity is a relationship, not an absolute. He just needs to stop believing the lies he’s told himself for over twenty years.

God isn’t an old White man. Nor is God a galaxy, plant, or mirror. But importantly, God also isn’t a pastor, something Harris and others still need to learn.

Thursday, July 18, 2019

“Stranger Things” and the Illusion of Capitalism

The Starcourt Mall, as seen in Stranger Things 3

The producers of Stranger Things presumably thought placing a hidden Soviet research base underneath an American shopping mall in season 3 would make an ironic juxtaposition. After all, the brightly lit, party-like atmosphere couldn’t be further from the grim, featureless government installation in which several characters find themselves trapped through most of the season. What gulf could be wider than the one between the Evil Empire and a capitalist acquisition extravaganza?

I propose, though, that the producers knew the gap was actually quite narrow. Sure, the brightly lit Starcourt Mall, where shoppers walk surrounded by endless choices and seemingly infinite social interaction, looks like an individualistic paradise. It certainly suckered countless suburban youths, like me, who grew up during the Reagan years, into believing that these compressed venues of commerce represented the necessary end point of free enterprise. Having uncountable options in one place feels exciting.

However, when your shopping options are limited to which multinational corporation sells you shoes and fast food, the choice is illusory. The Starcourt Mall brims with chain stores, many of which peaked around the time this story is set (1985) but scarcely exist anymore, if at all. Names like Sam Goody, Waldenbooks, Orange Julius, and Radio Shack once tempted GenX into air-conditioned malls with the promise of endless choice, overseen by supposedly benign corporate overlords.

Steve Harrington, who in previous seasons strutted around Hawkins High School with swagger pinched from a Duran Duran video and dated the prettiest girl in school despite treating her like shit, now finds himself thrust into the adult economy. Rather than running things, he, like everyone who ever existed, must abase himself before Mammon. In his case, this means scooping ice cream while wearing a sailor suit so ridiculous, it makes Buster Brown look dignified.

Within moments, though, Steve and his coworker Robin discover that the Starcourt Mall is a Soviet front; suited and booted Communists operate in subterranean caverns, conducting illicit research. Through means that matter to the plot, but not to this commentary, Steve and Robin find themselves interrogated by Soviets in full Stalinist regalia while still wearing their own degrading work uniforms. The parallel between these forms of enforced conformity is glaring.

The Starcourt Mall succeeds, as mall capitalism does, by convincing consumers their every desire is met inside climate-controlled, aesthetically pleasing indoor space. The sloppiness and chaos of downtown capitalism, with its sidewalks and rain, gets subordinated to the mall’s constant sensory overload. And superficial choice reigns supreme; if you dislike Sam Goody, try Tower Records instead. This illusory freedom finds its apotheosis in that acropolis of excess, the Food Court.

Yum yum! I'll have a double scoop of capitalism, please! (Netflix promotional image)

This putatively beats Soviet Communism because you have the freedom to choose your employment, commerce, and loyalties. You don’t work the commune, shop at GUM, and salute Party logos. But if Steve and Robin’s employment is circumscribed by choosing between Scoops Ahoy and Taco Bell, how much liberty do they really have? They can’t afford the buy-in to start a funky take-out shawarma joint across the Food Court.

That’s even before the limitations of privately-held public space. Take a picket sign into any American mall, or even a guitar. I dare you. Malls, owned by corporations which usually aren’t headquartered locally, have no legal or moral obligation to respect free speech or regionally generated culture. And if you nominally have the right to say anything, but are precluded from venues where anybody can hear you, are you really free? Really?

Online conspiracy theorists postulate that Stranger Things 3, appearing almost simultaneously with HBO’s Chernobyl miniseries, reflects a possible growing fear of anti-capitalist rebellion. Our media sovereigns must squelch this cultural trend through intellectual mockery immediately! But watch Stranger Things 3, and Reagan-era mall capitalism doesn’t come off looking any better. If Soviet Communists can successfully disguise themselves as corporate maestros, is there any real difference?

(As an aside, malls and other aggregate capitalism have definitely narrowed the economy, but that doesn’t mean pre-mall capitalism was some Eden of entrepreneurial vigor. Anyone who visits communities where city fathers resisted the mall impulse knows that locally owned businesses can be just as imperious and anti-democratic as large corporations. Don’t fall prey to false nostalgia; unrestricted capitalism has always been deeply anti-individuality.)

Writing between the World Wars, G.K. Chesterton observed that, for ordinary citizens, the difference between capitalism and communism is vanishingly small. The only choice is whether you prefer your life dominated by state bureaucrats or corporate bureaucrats. The Starcourt Mall demonstrates what that means. Steve and Robin as workers, and the people of Hawkins as consumers, are supposedly free under mall capitalism. But they’re free in ways that just don’t matter.

Monday, July 15, 2019

Wonder Woman and the War We Don't Talk About


When Patty Jenkins’ record-breaking movie Wonder Woman debuted in 2017, a friend and I had a disagreement. My friend thought Wonder Woman was a classic piece of national security state propaganda, a movie made with Pentagon interference to sway public opinion to pro-military stances. I disagreed; its depiction of war’s horrors, its bumbling generals leading from the rear, and a protagonist who strives to kill War, would convince very few people of war’s innate heroism.

Neither of us persuaded the other. Two years later, we probably never will. But when Jamie Woodcock’s book Marx at the Arcade appeared this summer, I found something buried midway through the volume that arguably supports my position. Woodcock describes Battlefield 1942, a first-person shooter videogame released in 2002 and set during World War II, which retells the common Allied myths about WWII as a Manichaean battle of good versus evil.

These myths are twaddle, but they persist. Responsible historians have acknowledged that the Allies did some pretty horrible things, including targeting civilians, pillaging, and rape. But WWII permits citizens of Allied countries, particularly Americans (who didn’t face actual fighting on the home front), to believe that we fought a clearly moral fight against a clearly villainous enemy, and therefore war is heroic, soldiers are virtuous, and TAKE A BATH AND GET A JOB YOU SLOPPY HIPPY JESUS CHRIST!

Battlefield 1942 spawned a series of similarly themed games featuring you, the first-person protagonist, fighting various military conflicts throughout the Twentieth Century. But Woodcock expresses surprise at one series entry, Battlefield 1, set during World War I. Before this game, designers, like filmmakers, largely avoided WWI. We just don’t talk about the First World War, except in tones of regret and high-minded anomie. Unlike World War II, World War I is too awful for mythology.

Which is why Patty Jenkins’ Wonder Woman isn’t national security state propaganda. Because when William Moulton Marston created Wonder Woman, he deliberately set her into World War II. Dressed in stars-and-stripes pinup girl underwear, she trounced Nazis, deflected bullets, and generally participated in the war’s pro-American narrative. She was a GI’s dream. This Wonder Woman actually was wartime propaganda, and didn’t even pretend otherwise.

Jenkins subverts this by taking the story back one world war.


Movie makers have largely avoided World War I because, in our collective memories, it completely lacks any heroic through-line. When remembering personalities from that war, most people recall Alvin York and the Red Baron, and nobody else. Campaigns like Gallipoli, which was pure slaughter, or Verdun, which dragged on for the better part of a year before anything decisive happened, are the complete opposite of heroism. This war isn’t a triumph of national security state marketing; it’s a shared international disaster.

Like the Battlefield game designers, Patty Jenkins establishes the war’s persistent, nihilistic futility. Diana wanders the trenches of France, horror-stricken by the walking wounded and carts filled with corpses. Moved by a widow’s plight, she breaches the line and liberates a Belgian village; the people celebrate with wine and dancing. Then, in the next scene, the Germans destroy that village with mustard gas. Diana stands, powerless and numb, among the dead who had celebrated her the day before.

This isn’t a function of individual action. The movie takes pains to emphasize that no one person caused this war, and no one person will solve it. Diana enters the conflict with dreams of heroism at Troy or Thermopylae; she exits it acknowledging that she can only face the enemy before her. She’s a superhero, admittedly, but she cannot heroically alter the course of history. She can only know her own values and keep fighting.

Wonder Woman stands as the highest-grossing WWI movie ever made. But that doesn’t surprise me, since not many WWI movies have been made. Hollywood and its satellites in places like London and Mumbai love narratives of perfect moral symmetry, from Rambo to Casablanca, and World War I persistently refuses to reduce that way. Before this one, the most successful screen treatment of WWI I’d seen was Blackadder Goes Forth, drenched in claustrophobia and complete paralysis.

From Wilfred Owen’s poems to Pat Barker’s novels, World War I has provided some touching, substantive art. It hasn’t, however, jibed well with screen adaptations, which demand a morally neat universe. Wonder Woman succeeds because it forces its heroine to adjust her definition of victory, and us with her. It resists the military’s exhortations that whatever it does is, perforce, right. For that reason, I’m pretty sure the national security apparatus didn’t bankroll this production.

Friday, July 12, 2019

The Risks of Believing in Magic

1001 Movies To See Before Your Netflix Subscription Dies, Part 32
Christopher Nolan (writer-director), The Prestige


Early in Christopher Nolan’s 2006 thriller The Prestige, a scene features a supplicant named Angier (Hugh Jackman) begging assistance from Nikola Tesla (David Bowie). Tesla approaches from a raised platform, surrounded by electrodes discharging plasma lightning. It’s tough to watch this scene without recalling The Wizard of Oz, and Dorothy’s first encounter with the Wizard, all lightning and booming echoes. This is probably deliberate.

Like Dorothy’s Wizard, the magicians Angier and Borden (Christian Bale) create wonder and spectacle which keeps their audiences captivated; in return, audiences provide the magicians a living. But also like Dorothy’s Wizard, these magicians increasingly believe their own hype. They demand worship from the hoi polloi, which, in their minds, includes one another. When they don’t get it, their demands become increasingly destructive.

Angier and Borden apprenticed together in Victorian London, paying dues as audience plants and stage machinists. (Their mentor, Milton the Magician, is played by Ricky Jay, an actual illusionist and veteran David Mamet actor, which lends his performance both verisimilitude and a Mamet-like air of bored cynicism.) But an accident during a routine stage trick escalates quickly, costing Angier his wife’s life. This begins a cycle of distrust and revenge, onstage and off.

Going solo, Angier and Borden face two imperatives: establishing their personal brands and sabotaging one another. Both illusionists attempt exceedingly dangerous tricks; both sabotage the other’s tricks to cause him to hurt himself or others. Both performers get maimed when the other sees through his tricks. Yet each answers his setbacks by doubling down, creating even bigger and more spectacular illusions.

Two themes run through this story. First, how much of the spectacle depends not on the event itself, but on how the performer sells the event? Angier, under his stage name “The Great Danton,” is clearly the superior showman, spicing his performances with a flair unique to his personality. But Borden, calling himself “The Professor,” is the superior engineer, inventing new ideas first. Imagine what these two could accomplish if they collaborated.

Second, how important is belief to stage magic? Naturally, the audience knows the performer has enacted some sleight of hand, but we tacitly pretend to believe the performer’s story, because we cannot see the stage mechanics. But how thoroughly must the performer believe his own banter? Both these performers invest themselves so thoroughly in their stage performances that, if they doubt themselves, they’ll probably shatter.

Christian Bale (left) and Hugh Jackman in The Prestige

Borden has devised a trick which only makes sense if he can exist in two places at once. Angier wants to crack this trick. Angier’s stage engineer, Cutter (Michael Caine), proposes a simple, elegant, self-contained solution; Angier rejects it, believing Borden must have perfected something bordering on supernatural. Because Angier believes Borden has a high-tech solution, he resolves to recreate it, and does; his belief in Borden’s banter causes him to change the world.

One recalls Arthur C. Clarke’s statement about sufficiently advanced technology.

Convinced that a technological solution exists, Angier consults Nikola Tesla. Like a stage magician himself, Tesla is famous for having some important ideas, but also for selling himself as his most important product. Largely shunned during his lifetime in favor of his rival, Edison, Tesla has become a downright mythological figure in subsequent decades. His very presence in this story provides a CliffsNotes summary of the dominant themes.

Angier, Borden, and Tesla tell elaborate, cantilevered lies to one another, edifices of untruth destined eventually to crumble when somebody pokes the weak brick. That’s what magicians do: they lie to audiences desperate to believe something exists. But these men don’t just lie to each other, they lie to themselves, so often and so elaborately that they forget the narrative they’ve constructed is false. They deceive themselves so often, they believe their own deceit.

Not surprisingly, for a story about stage magicians, this movie builds to a twist reveal. I got accused of “thinking like a screenwriter” for spotting the twist in advance, tipped off not by anything onscreen but by what the movie omits. For me, the story’s culmination came not with the reveal, but with the impediments these characters overcame to reach that moment. These characters face setbacks normal humans would consider insurmountable, and persevere.

Nolan’s surface-level thriller certainly has enough twists and complexity to keep popcorn viewers entertained. But beneath that veneer, Nolan crafts a parable about how belief both empowers and entraps the believer. Both magicians believe they’re morally right, for separate but compelling reasons. But ultimately, everything they believe proves a lie. Sadly, a liar can’t see through his own lies.

Tuesday, July 2, 2019

The Pixel Revolution

Jamie Woodcock, Marx at the Arcade: Consoles, Controllers, and Class Struggle

Why should Marxists care about videogames? Perhaps even better, why should videogamers care about Marxists? The digital play world has gotten along spectacularly without interference from a cadre of po-faced academics who interpret the world through an industrialized economic lens. Yet Dr. Jamie Woodcock, Oxford University researcher and lifelong videogamer himself, decided to bring his two passions together. The result is surprisingly uplifting.

Woodcock approaches videogames from two angles. (And he spells “videogame” thus too, as one word. It looks weird to me, but let’s stipulate his spelling for now.) The first and longer section considers how games are made. This sounds straightforward, especially given recent revelations about the near-sweatshop conditions games studios famously maintain. In the latter section, Woodcock looks at how we play games—which proves more difficult and loaded.

Discussions of videogames often begin with Atari and “Pong,” introduced in 1972. But Woodcock cites an early mainframe computer designed especially to play simple games as early as 1940. Is that, he asks, an example of a videogame? Which leads us into questions of definition, like what exactly is a videogame? What even is a game? These questions prove more difficult than you’d expect, and also more rewarding.

Games of various kinds grew alongside computers, which means they inevitably grew alongside the American defense network. This relationship between videogames and the military-industrial complex recurs throughout Woodcock’s book. But Woodcock notes early programmers devised their games as ways of resisting the regimented hierarchy of military contractor life. So early hacker culture was deeply controlled, yet simultaneously anarchic by nature.

This contradiction spills into game-making culture. Hackers and coders, being innate individualists, refuse to unionize. But without collective bargaining abilities, games workers are routinely exploited by management, who demand high-handed control, marathon hours, and few contract benefits. Woodcock isn’t the first to notice the contradiction between “individualism” and individuality; without a union, hackers get reduced to cogs in the machine.

Not that hackers take management demands quietly, no. Woodcock describes various ways programmers have written anti-capitalist messages into their videogames. Much of Woodcock’s book expounds upon a side mission written into Assassin’s Creed Syndicate, where the player has an opportunity to join Marx himself in organizing textile workers and sabotaging management. The metaphor is too spot-on to be coincidence.

Dr. Jamie Woodcock
In a later chapter, Woodcock explores a recent development: Game Workers Unite (GWU), a hackers’ grassroots activist… um… event, where several game-makers banded together to expound, for lack of a better term, class consciousness. It isn’t a union really, not yet, but Woodcock admits events are progressing so rapidly, he can’t keep abreast of them; by now, hackers could, after two generations, finally be organized to fight the system.

Woodcock’s second section, dealing with game play rather than game creation, runs shorter. It also has a different tone from the first. Where the first section draws on interviews with several game-makers, including forward thinkers engaged in organizing the industry, here Woodcock doesn’t much speak to fellow game-players. Though he occasionally quotes reviewers, he mostly just analyzes the game experience himself, in the manner of Michel Foucault.

What he analyzes, though, is remarkable. In my youth, moralistic elders criticized videogames as isolating and lonely, stealing youth from communal outdoor play. But since around 1997, videogames have become increasingly team-oriented, a shared experience. This has only become more so as improving internet technologies make streaming games worldwide realistic. Games have elaborate, cinematic storylines, detailed graphics, and a team experience more nuanced than football.

This doesn’t make games socially neutral, though. Many games, particularly first-person shooters, tend to glorify war, and travel hand-in-glove with American military imperialism. Woodcock even describes how, with only minor modifications, the military has repurposed popular videogames as recruitment tools. Games cannot shed hacker culture’s roots among military contractors, apparently.

But that doesn’t make games relentless war machine propaganda. Since around 2010, an increasing number of first-person shooters have emphasized war’s futility. Early games stressed World War II, where American GIs fought an explicitly evil enemy, or the Cold War, where evil was implicit. Recent games have shifted focus to World War I, Vietnam, or the War on Terror, foregrounding war as interminable, degrading, and bleak.

In games, as in game-making culture, Woodcock sees potential for social engagement and expanding consciousness. Marxists, he admits, have often avoided addressing issues of play, and what the proletariat does after hours. But videogames are increasingly the way people organize themselves into communities and groups. Coupling his sophisticated research with plain-English writing, Woodcock opens games as a legitimate forum for social analysis.

Monday, July 1, 2019

Free University? Meh.

I’m tired of hearing about free higher education. Which presidential candidates, we wonder fretfully, will make universities free? Let’s lionize this venture capitalist who wrote off an entire graduating class's debt! In various ways, the cultural trend runs toward making postsecondary education free, or anyway cheaper than dirt. More people, ideally everyone everywhere, should have at least the opportunity for postsecondary education. Or so the argument goes.

I taught university-level English for four years, and would like to return to teaching at that level someday. I especially liked teaching students from working-class and disadvantaged backgrounds, students who’d often spent years in high school being told they’d just pass into the same workaday grind their parents occupied. Seeing their eyes light up as their liberal studies core opened new paths was a constant pleasure and education for me.

So I believe in education. Given the choice between education and unawareness, I’ll always foster education. But I have two important objections to thrusting every high school graduate into university. First, education is not synonymous with schooling. Second, we already have students and their parents mistaking a four-year degree for job credentials; how much worse will that become if every job assumes every applicant has a degree?

In eight semesters teaching Freshman Comp, I noticed a recurrent trend: my students regarded anything that wasn’t career-oriented as extraneous. This included my class. The idea that communicating well in writing isn’t a professional skill continues to astound me; but it also meant that most considered philosophy, Shakespeare, higher math, economics, and history merely ancillary to whatever career goals they nursed. Only their future employment mattered.

Imagine that: an entire generation of students who want their education locked into professional silos. Business executives who only know how to execute business. Computer technicians who only know computers… and so on. Meanwhile, we generate a crop of voters with no understanding of history; entertainment customers with no savvy for literature; lower-rung employees without the adaptability to learn their industries and ascend the corporate ladder.

What a bleak, dystopian world this postulates—and yet, it’s the world we increasingly have. Throughout America, liberal arts programs are dwindling, and liberal arts colleges seek new justifications for their programs. We consider graduates educated, not if they’re able to face life’s massively complicated facets with depth and equanimity, but if they’re able to slot neatly into somebody else’s capitalist machine. Education’s outcomes are measured entirely in dollar signs.


Students themselves don’t want this, mostly. I had multiple students who wanted to study art, but got channeled by parents into “graphic design,” for career purposes. An aspiring English major instead acquiesced to a journalism degree, which these days amounts to a career in advertising anyway. And American lawmakers, over educators’ objections, have actively attempted to change what liberal studies even mean. For the students’ best interests, of course.

However, as I’ve written before, one group consistently appreciated their liberal studies core, not only for its own value, but for the way it influenced everything else about their education: non-traditional students. I had multiple students who took time after high school to work, travel, join the military, or raise children, and returned to school in their twenties or thirties. These students consistently enjoyed, even wanted, their liberal studies classes.

They wanted them for aesthetic reasons, certainly, for the appreciation of Shakespeare’s tragedies or Heidegger’s phenomenology, or the elegance of analytic geometry. But they also understood the practical relevance of these classes. Understanding history was a prerequisite for understanding the present. Reading literature let us step outside ourselves and see through other people’s eyes. Science and math taught us to approach the world systematically, rather than flailing helplessly.

Therefore I propose we redefine what we value in education. If we want students to acquire job skills, we should make trade schools free, and encourage students to attend those first. Sure, let them acquire professional skills, if that’s what we value. But send them, at our shared expense, to schools which specialize in professional skills, where they’ll get through faster and with fewer prerequisites anyway.

Then, remove the stigma from students who return to four-year university later in life. Not everyone is prepared for university education just because they’ve finished high school—and that especially includes students who attended impoverished schools, rural students, and to a degree, men. If they aren’t ready at eighteen, but develop the skills and self-discipline to “do school” at twenty-five (as I did) or later, more power to them.