Friday, March 31, 2017

Poor-Shaming in the Self-Help Industry


Earlier this week, New York-based MSW Melody Wilding published a column declaring that people prone to poor decision-making could improve their choices, and their lives, using the HALT method. This approach requires people making any particular choice to first take care that they aren’t choosing while Hungry, Angry, Lonely, or Tired. I reached the end and wondered: does this author realize she just described the experience of being poor?

I don’t mean the experience of being destitute. Of course people living in illegal squats or sleeping on sidewalks make poor choices. Living day by day, hour by hour, they lack freedom and security enough to even make long-term decisions; they’re more focused on ensuring the few possessions they still have don’t get pilfered while they’re chasing the next meal. We expect the truly indigent to make bad decisions.

Rather, I mean ordinary working poor. People like me, living paycheck to paycheck, spend much of our time Hungry, Angry, Lonely, or Tired. I’ve sometimes punted meals onto the next pay cycle, banking on the reality, well known to survivalists, that the sensation of hunger dissipates after two to three days. I’ve postponed sleep to dedicate time and energy to looking for other jobs, hoping to improve my material situation enough to rise above a constant HALT state.

And the job conditions absolutely require a tolerance for both loneliness and fatigue. I know, writing this, that by the time you read it, I’ll probably be at my job, working under solitary or near-solitary conditions. The equipment I use is too loud to permit me plugging earbuds into my phone, much less conversing with whatever co-workers might wander past. Even in a crowd, work conditions are intensely isolating.

Just last week, my boss assigned me a side task, “when you need a break.” So I assumed the task was low-priority, and continued my main assignment. The next morning, I found the parts I’d been assigned to locate and collect, when I needed a break, jammed atop my regular tools so precariously that, if I moved one piece, the others would collapse, possibly breaking my tools. My boss admitted pulling this passive-aggressive move out of pure petty spite.

So yeah, I’m Hungry, Angry, Lonely, or Tired most of the time. Nor is this unique to any particular job. My blue-collar career has consisted of both construction and assembly line work, jobs that pay poorly because they’re considered low-skill and unimportant, and leave workers constantly surrounded by people but unable to talk. Before that, I was a university adjunct, a job that left me socially stimulated, but chronically broke and overextended.


These conditions are complicated by issues that seem incidental, but prove very important. For instance, the poorer you are in America, the further you live from work, statistically speaking. And probably, the further you live from access to public transportation, too. So you’re economically dependent upon your car, in which you probably spend a significant portion of your non-work hours. That’s time not spent with family, friends, or doing anything constructive or ennobling.

Also, a growing body of evidence indicates that most addictive behaviors, including substance abuse and behavioral addictions like gambling or risky sex, originate in some form of pain. This may include physical pain, or the psychological pain induced by loneliness. As Pulp’s 1990s song “Common People” puts it, people “dance and drink and screw, because there’s nothing else to do.” Put concisely, despite the common wisdom, addictions don’t cause poverty; poverty causes addictions.

America’s elected officials prefer to invest resources in expanding what’s already Earth’s biggest military, offsetting costs by cutting half-invisible programs like Meals on Wheels. Why, these officials ask, can’t the elderly get their children to provide food and other protections against economic freefall? But these same officials do nothing to prevent the concentration of wealth in a few major cities, forcing young adults to move to New York, Chicago, and San Francisco in pursuit of work.

So yeah, America’s poor are Hungry, Angry, Lonely, and Tired most of the time. This forces us into cycles of lousy decision-making that prevent us from improving our own lot. We recognize our own situation; we don’t need Gothamite experts reminding us. Yet self-help writers, from Jack Canfield to Suze Orman, constantly invent new ways to imply the poor are poor only because they lack Wall Street’s well-worn methods of saving, investing, and other good decisions.

I believe Melody Wilding means well. I certainly attempt to improve myself and my decision-making constantly. But when my job sometimes requires me to work thirteen days on, one day off, telling me that I’m making bad decisions because I’m tired isn’t just insulting. It’s a sign of deep-rooted ignorance of today’s economic atmosphere.

Monday, March 27, 2017

Who Is This Jon Ronson Person Anyway?

Jon Ronson, The Elephant in the Room: A Journey into the Trump Campaign and the “Alt-Right” and The Amazing Adventures of Phoenix Jones: And the Less Amazing Adventures of Some Other Real-Life Superheroes

Welsh journalist Jon Ronson probably came closest to hitting the big time when Ewan McGregor played him in the 2009 film The Men Who Stare At Goats. He lacks Hunter S. Thompson’s name recognition, despite being visibly influenced by Thompson’s gonzo style, but continues producing prolifically. Having recently discovered Ronson’s work, I decided to investigate him further by reading some of his standalone digital essays. Sadly, I emerged more confused than when I went in.

I cannot dispassionately consider Ronson’s essay about the Alt-Right, in large part because he published it before the 2016 general election. Like me, he considered Clinton’s victory a foregone conclusion, and treats Donald Trump as a nine days’ wonder, bound to blow over quickly. He presents interactions with actual conservatives as low comedy. Thus he doesn’t so much unpack Trump as mock Trump’s various hangers-on, especially Alex Jones, to whom Ronson has a personal and professional connection.

More important, for an essay whose title promises to go "Into the Trump Campaign and the Alt-Right," it offers precious few inside views. Ronson wanders into Alex Jones' studio a few times, where he's greeted with polite distrust. But he never meets Trump, Steve Bannon, Paul Manafort, or Kellyanne Conway. His Trump campaign reports originate in the press gallery way above the Republican National Convention. He’s one among the crowd; there's no insidership on display.

Completing this essay, I feel no better informed about the Alt-Right than going in. Who are they? What is their unifying position? When did they uncouple from the mainstream right? Are they more than PR jargon? For a term that's gotten ballyhooed widely in the last six months, I realize I know little about them. I undertook this essay, partly, hoping it would clarify my confusion. Instead I feel as flummoxed and adrift as ever.

This essay doesn’t know what it wants to be about, probably because its author doesn’t know what he wants to accomplish. Were I grading this whirlwind in my classroom, I’d say it lacks a thesis statement. Without that, the author lacks focus, and the reader lacks grounding. Ronson apparently expected a message to emerge from this stream of consciousness. But it’s just a mélange of liberal journalist buzzwords opportunistically grabbing a waiting, but confused, audience.

Ronson’s essay about the Real-Life Superhero Movement has the opposite problem: where he analyzes the Alt-Right he never meets, he meets the Superheroes he barely analyzes. Fluke Internet publicity made Phoenix Jones a viral sensation in the late 2000s, though he was hardly the only unarmed, costumed crimefighter in America, or even in Seattle. Equipped with a press agent and a theatrical flair, Jones made civilian street patrols, formerly a fringe pursuit, suddenly visible, if still unpopular.

Late-night patrols with Jones’s Rain City Superhero Movement reveal a strikingly banal truth: they don’t do much. Though Jones claims he’s been shot on patrol, his principal activity apparently involves making people stare, turning public gaze on unsavory people doing unsavory things, until they back down. They’re less crime-fighters, more public spectacle. In many ways, mostly unspoken but once briefly acknowledged, these “superheroes” channel the existential malaise inherent in Alan Moore’s classic graphic novel Watchmen.

Though the Rain City Superheroes have garnered media attention, they’re hardly America’s only self-proclaimed superheroes. Ronson visits other cities, only to find himself deeply disappointed. New York’s organized “superheroes” are effective at dispersing criminals, but have little flair; Ronson calls them bullies. San Diego’s superheroes make good PR, but patrol safe, gentrified downtown streets. Only the Rain City Superheroes combine theatre with (sadly paltry) results.  Despite many fine-sounding promises, this ain’t the Justice League, folks.

Unfortunately, Ronson only makes these “heroes” into punch lines. He observes them only one or two nights, which, combined with cursory interviews, denies them any reliable long view. He makes no attempt to get viewpoints from police, policy experts, psychologists, or basically anyone outside the movement. He simply contrasts the superheroes’ high-minded rhetoric with their banal results and rolls his eyes. Fans of pop psychology will see avenues for deeper investigation, which Ronson basically ignores.

These essays don’t instill confidence in Ronson’s journalism. The one trait they share is Ronson himself. Like P.J. O’Rourke and other Rolling Stone alumni, his “journalism” involves reporting what happened to him, and assuming it’s representative. If a student presented such content to me, I’d explain the Fallacy of Composition and suggest avenues for rewrite. But maybe that’s why I’m still slaving in the salt mines, while Ronson gets movie royalties and TED Talk invites.

Friday, March 24, 2017

The “Gig Economy” and the American Work Ethic

Wages, insurance, guaranteed work? Screw you, who needs 'em?

Here’s a riddle for you: how can we call somebody “manager” of a facility if we never see him? I wondered that when the titular factory floor manager visited our shift one night. And when I say “one night,” I mean for forty-five minutes at the beginning of the shift. In over three years working third shift, I never saw our supposed factory manager before or after that all-crew meeting. Third shift was organizationally disconnected from the rest of the company.

Halfway through that meeting, a virtual monologue of bromides reheated from Stephen Covey and Kenneth Blanchard, the “manager” asked the assembled workers: “How many of you have second jobs outside this company?” About a quarter of workers raised a hand. “See there? That’s what I call initiative. When my son asks for a bigger allowance, I tell him to be more like the workers at [Company Name] and get a job. Take responsibility for yourself.”

Let’s be clear. In a room composed entirely of adults working full time, with frequent mandatory overtime shifts, this manager considered it an indicator of high ethical character that roughly one in four workers cannot make ends meet. These workers, the majority of whom have spouses, partners, and/or minor children at home, spend half their waking hours inside this man’s facility, then decamp to an entirely separate job to cover their bills.

Earlier this week, New Yorker staff writer Jia Tolentino published a short article, “The Gig Economy Celebrates Working Yourself to Death.” Tolentino quotes PR from inside “sharing economy” companies like Lyft, which praises a pregnant contract driver who picked up another fare after labor began, and Fiverr, which encourages freelancers to forgo sleep. The images are pretty terrifying, especially if you amortize actual pay across work done.

But frankly, it isn’t surprising to anybody who’s recently worked jobs with little growth potential. Readers may recall the notorious McDonald’s Employee Budget, which assumed workers had two jobs, while including no money for groceries or childcare. The jobs available to people lacking skills or connections, or who simply cannot reconcile their skillset with local economic demands, literally assume you can’t make ends meet on full-time employment.

Companies like Fiverr, where skilled work sells for five bucks a task, are merely the culmination of economic trends, dating to the 1970s, that have progressively separated work from reward. The site literally invites workers to charge five dollars for services like graphic design, a field that often charges in the hundreds, if not tens of thousands, of dollars for services rendered. Those fees aren’t unreasonable, because buyers often reap rewards disproportionate to the value of the actual labor performed.

Political cartoonist Ted Rall.

If you pay five dollars for, say, the cover design on your next self-published book, you’re either getting a substandard product done hastily in Microsoft Paint, or you’re profiting off somebody else’s desperation. The latter option, to the current economy, isn’t so bad. Thing is, when Tolentino associates this trend exclusively with the “gig economy,” she’s arguably missing the larger point. This isn’t a niche flaw; it has become characteristic of our entire working economy.

White House Budget Director Mick Mulvaney last week defended President Trump’s proposed budget, which controversially cuts block grants to school lunch programs and Meals on Wheels to near-zero, by calling it “compassionate.” This is common reasoning in conservative circles: refusing to protect the poor gives them incentives to work. If there isn’t a floor through which people are prevented from falling, the logic goes, people will work even harder to prevent their own total economic collapse.

Yet at that same factory where I worked, where the absentee manager praised employees who needed second jobs to feed their children, I watched one young woman forced to quit a job she actually loved. She received a merit-based pay raise, which gave her just enough money to no longer qualify for protected childcare subsidies. Forced to choose between paying more out-of-pocket or leaving her kids home alone unsupervised, she had only one choice: find another job that paid either more or less.

The gig economy makes visible something that’s been real but concealed for two generations now: Americans no longer value work. We give lip service to the American work ethic, to myths of self-reliance and bootstrapping. But our structural refusal to pay for the things we buy reveals our actual core values. My manager, and the PR flack who praised a Lyft driver working through birth pains, couldn’t have been clearer: If you’re poor, go fuck yourself. That’s all you deserve.

Wednesday, March 22, 2017

In Praise of Fanfiction

Margery Kempe, history's first fanfiction writer
As an undergraduate, I once had a creative writing class with a guy who wrote science fiction. I have no complaints about genre writing; of the three manuscripts I workshopped that semester, two of mine were also science fiction. But this guy’s stories were based on video games. This was easy to tell with one story, where the entire plot turned on monsters jumping out from behind furniture and yelling “Boo.” He admitted the second manuscript was the backstory for his online role-playing game character.

Ordinarily, I’d pay such people no mind, besides warning them to recognize the differences between media while writing. First-person shooter games reward simple, repetitive action, while RPGs favor exploration and exposition over action and dialog. Pick your medium, and stick with it. But I spoke with our teacher later, who informed me that, because this one student had serious problems with what she called “fanfiction,” she forbade any genre writing in future semesters.

That’s where I have a problem. Because fundamentally, I understand what this guy wanted to do. He had joined what USC professor Henry Jenkins calls “participatory culture,” a niche that includes not only fan fiction, but fan conventions, cosplaying, RPGs based on popular franchises, and other reciprocal creation. Audiences have never been entirely satisfied simply, passively receiving their favorite stories from the dream factories that create them. They’ve always wanted to join the creation.

In my pre-internet youth, I had no idea such participatory culture existed. Sure, I played with action figures, and schoolyard games involved adding to the canon of TV shows. We kids loved recreating Voltron episodes, and sometimes came near blows over who played the Black Lion. Looking back, it’s funny how the kids wearing football jerseys and recreating Monday night’s scrimmage lines were considered school heroes, but students trading hand-drawn Star Wars comics were “losers.”

But dedicated fan culture went much further. Mimeographed fan magazines, including fiction based around popular franchises like Star Trek, were frequently hand-distributed at conventions and circulated among friends. Burgeoning digital technologies made distribution of fan-made works more practical, and formerly narrow fan networks began trading stories globally. Continuing stories like James Cawley’s Star Trek: New Voyages, or Nicholas Briggs’ Doctor Who spinoff Auton received international distribution.

It’s easy for cultural snobs and university professors to dismiss fan creations as “mere” juvenilia. Better writers would naturally create original works, duh. But this hasn’t always been so. Well-respected writers have long attached their creations to existing works. Some have called English Christian mystic Margery Kempe a writer of fanfiction for inserting herself into New Testament narratives. Surely the tradition is older, as many scholars consider certain New Testament epistles later imitations of Saint Paul.

Promo poster from James Cawley's Star Trek: New Voyages
Only with the rise of affordable print technology and widespread literacy did originality become something desirable in literature. When only limited resources existed for distribution of written material, originality was regarded as theft from the common store. Why bother creating something new, when you could better spend your time hand-copying the important works of bygone masters? Until the Industrial Revolution, words like “innovation” and “newfangledness” were deployed as insults.

Fanfiction writers don’t merely derive from existing works. They attempt to join an ongoing discussion, adding to the experience. And certainly, James Cawley’s episodes will never have the overarching influence the original Star Trek had, particularly as his distribution license with Paramount forbids him from showing a profit. But for an intimate circle of fans, such new content deepens the experience, particularly because in creating, they more wholly immerse themselves in the act of sharing.

In graduate school, I read research indicating that teachers could broaden students’ subject understanding by having them write new material within the subject. In sciences, this could mean creative writing about a discipline: deepening students’ grasp of sociology, for instance, by having them write from the perspective of someone of a different race, sex, or nationality. In literature, writing “continuations” brings students into the process. I didn’t understand A Raisin in the Sun until writing a scene where Beneatha packs her belongings.

I remember telling a Freshman Comp student that art becomes art, not because we appreciate it, but because we have a relationship with it. And when we have a relationship with something, we want to return our feelings. We want that give-and-take with friends, spouses, children. Art is no different. If we passively receive it, and create new work only at right angles, we have missed the opportunity for true reciprocation with what we love.

Monday, March 20, 2017

One Day In an IKEA Showroom

Notice the arrow on the floor so that, like a theme park, you experience IKEA in the sequence its designers intended

Entering an IKEA showroom distinctly resembles entering an airport. After finding parking in the subterranean garage, so huge that you have to remember alphanumeric codes if you ever hope to find your car again, you have to ascend two different escalators just to reach the front door. These escalators pass multiple displays of featured product, lovingly arranged just like airports arrange displays from local history and tourist attractions.

This demonstrates just one way IKEA attempts to structure its store as a full-immersion experience. The faux-Scandinavian place names (the children’s playground is called Småland) and the notorious free meatballs remind customers we’re entering an embassy from another nation. And quite an embassy it is, too: if you don’t claim your free map at the door, you’ll certainly get lost, and forget your intended purchases besides.

Because the IKEA showroom isn’t, fundamentally, a store. Within the first few displays, it becomes clear you’ve entered a museum. The massive, intricately curated displays of furniture, arranged to recreate examples of how you could arrange your room, don’t involve any stock you’d actually buy. IKEA doesn’t invite you into their store to shop, they invite you in to witness their multiple lush displays of simple-colored middle-class aspiration.

One display near the door, in the Living Room area, features a fake-leather couch facing a wall-mounted media center. The wood veneer on the media center is color-coordinated with the couch, with doors and sliding drawer pulls in contrasting colors. Upon the media center, a 70-inch television is running cupcake competitions from the Food Network, that orgy of bourgeois pretension. IKEA hopes to sell you a lifestyle.

A life completely free of clutter. Or, y'know, windows. So you don't have to see the messy world outside.


In today’s economy, this makes their approach particularly insidious. Where Target or Walmart assumes customers already have intended purchases in mind, and permits them to create their own itinerary for finding their products, IKEA requires you to pass through its showroom in a particular sequence, Living Room to Workspaces to Dining Room to Bedroom, before making any purchases. You have to immerse yourself in the IKEA experience.

The furniture is all arranged to mimic how you’d arrange your house, in a perfect world free from children or pets or workaday fatigue. Beds are carefully made, with fluffy comforters and hospital corners. Dining chairs are pushed in. TVs are turned on, but not loud. Perfect people who don’t get tired or frustrated or take their surroundings for granted could live here. It jibes perfectly with progressive Americans’ idealized Scandinavia.

Despite what I said previously, the showroom has occasional items for purchase. Displays of small items—paper napkins, macramé accessory hangers, collapsible storage boxes—serve to occasionally remind guests they’re here to purchase lifestyle aspirations, not merely drool over them. I never saw anybody carrying anything from these displays, merely picking them up and examining them. But they aren’t really there for purchase, just as reminders.

At points, the showroom includes homages to the Tiny Home movement. Areas walled off from the showroom floor, but with open doors inviting exploration, show customers how to organize a bachelor pad in 270 square feet, or a young couple’s first home in 370, using design elements available in-store. They advertise a beautiful world, completely free from laundry on the floor or dishes in the sink, a world we’d live in if we didn’t have, y’know, stuff.

Designers try to make IKEA shopping as much about life as possible, down to the family photos on the headboard


After the showcases, you pass into a restaurant area. Like an airport, or Disneyland, IKEA expects customers to dedicate most of their day to experiencing the showcase, and recognizes that you’ll need sustenance. Unfortunately, my brain was already in shutdown mode, overwhelmed by crowds and colors and constant sensory input. I’d already found the floating shelves I wanted; I longed only to buy and escape.

So down the escalator, to the actual shopping space. Only after the halfway mark does the company even provide shopping carts. The shopping happens below the showroom, as though filthy lucre is closer to earth. And unlike the lavishly appointed showroom, the shopping space is remarkably sparse. The weekend I shopped, heavy weather had isolated distribution centers in the Northeast from retail spaces nationwide.

We emerge, blinking, into the sunlight, clutching most of our intended purchases, full of meatballs and lingonberry soda. IKEA has eaten an entire day. We’re bleary-eyed and confused, as the clutter-free showroom surrenders to the dirty, carbon monoxide-choked parking garage. We must leave the aspirations behind and return to life. But we have the map, and can return any time. IKEA, like Disney, is more real than mere reality.

Monday, March 13, 2017

Living the Latin American Nightmare

Mariana Enriquez, Things We Lost in the Fire: Stories

A driftless young woman finds an abandoned skull in a Buenos Aires park, and becomes obsessed with reassembling the body. An apparently abandoned house turns out to be full of arcane artifacts and ethereal light, and an unhappy young girl wanders within, never to return. An angry urban husband mocks his wife, scorns the hickish truck driver who rescued them, and apparently packs his bags and wanders into an urban legend.

I can find precious little prior information on Mariana Enriquez. Though she’s apparently a well-respected journalist and novelist in her native Argentina, this is her first book-length publication in English. She comes to Anglophonic readers a virtually blank slate, provided we can avoid the temptation to make her resemble Jorge Luis Borges. Her short stories more resemble Edgar Allan Poe or Thomas Ligotti anyway.

Like Poe or Ligotti, Enriquez’s fiction uses foundations in the real world, incidents of the massively commonplace, as entry points into moments of overarching dread. When a woman, a sort of Argentinian do-gooder hipster, reaches out to a starving street child, we recognize a social justice warrior in action. When that child mentions a gripping fear of the monsters living across the railroad tracks, we wonder what monstrosities this child has experienced. And when that child disappears, we start seeking the real monsters.

This sense of creeping dread dominates Enriquez’s storytelling. As we read, we adjust our mental rhythms to Enriquez’s slow, sometimes soporific pace, and enter a sort of dreamland. As in our own dreams, this guided tour of somebody else’s phantasmagoria dwells more on mood than content. We start conjuring images of what could be, and our anticipations drip with creeping dread. We wonder: am I worse than the pending monster?

Some stories include actual monsters. “An Invocation of the Big-Eared Runt” features a local tour guide having visions of an historic Buenos Aires murderer (an actual person, though English speakers will need to Google this). In “The Neighbor’s Courtyard,” a disgraced social worker looking to redeem herself breaks into a house where she suspects abuse is happening, only to find a cave of horrors worse than her frequently vivid imagination.

But many stories involve no literal monster, or something glimpsed only in passing. Three young girls on self-destructive benders watch an anonymous woman get off a bus in the wilderness, in “The Intoxicated Years,” only to see her years later, untouched by time, luring them into the forest. Another girl, in “End of Term,” mutilates her own body to appease an invisible man behind the mirror. Is she merely schizophrenic, or is her illness somehow contagious?

Two themes emerge as the stories mount up. In some stories, young women on the cusp of adulthood do something vindictive and ruinous, to themselves or others, and suffer consequences they never anticipated. In others, an unhappy wife’s inability to express her gloom leads her or her husband into a death spiral. Either way, a woman’s inner turmoil manifests itself upon the outside world, often at great cost to human life.

At her best, Enriquez couples this inner violence with Argentina’s history of literal violence. In my favorite story, “The Inn,” two teenage girls, one a closeted lesbian, attempt to gaslight a local hotelier. But the hotel they target was a police academy—read, “torture chamber”—during the Peronist years. When the ghosts of Argentina’s bloody past chase the girls through the present hallways, it’s impossible not to wonder who’s passing judgment upon whom.

Parapsychologists like Joe Nickell and William G. Roll have long noted the apparent correlation between deep emotional turmoil and seemingly supernatural occurrences. This seems especially prevalent with poltergeists; it seems the movies weren’t wrong in associating that phenomenon with an emotionally high-strung adolescent girl. Enriquez simply assumes these correlations are real, and asks herself: how would they manifest in my homeland today?

As in the best horror fiction, Enriquez conjures the most powerful scares, the most lasting nightmare fuel, by withholding information. She creates rich mindscapes, certainly; her storytelling is resplendent with small but telling details that immerse us in her world. But she conceals the Big Evil. Stephen King this ain’t, and anybody expecting the big reveal moment American horror writers savor waits in vain.

But audiences willing to suspend their Anglophonic expectations will find Enriquez rife with crawling disquiet, the kind that gets under your skin. Like Borges, Enriquez creates an interstitial world on the borderline between reality and dreams. Unlike Borges, she reminds us that our dreams are something to fear.

Monday, March 6, 2017

This is the Future Americans Have

William F. Buckley, Jr., founded the magazine National Review in 1955, a time when conservatism seemed in disarray following World War II. Though a Republican, Dwight Eisenhower, was in the White House, he was a committed centrist who brought conservative legislators to heel and openly disdained Joseph McCarthy. Vestigial conservatism still survived, but the side lacked what later Republicans considered a “movement.”

In his magazine’s inaugural op-ed, entitled “Our Mission Statement,” Buckley wrote that his brand of conservatism “stands athwart history, yelling Stop.” The image of an ideological purist, attempting, by sheer force of personality, to stop history from happening, conjures images of somebody trying to shout down a river, or perhaps an oncoming train. Whatever your preferred metaphor, the fact remains: that person must either get with the movement, or get killed.

I couldn’t help remembering Buckley’s strange, retrogressive philosophy last week. If you follow Blue Facebook, you already know the seriocomic outrage explosion that happened when a hard-right-wing shill posted the following tweet:


For conservatives, the outrage here seems clear: a world where minority religions and alternative sexual identities feel no compunction in public is a marker of social decline. Progressives, by contrast, noted two people with very different cultural expectations sharing public transit peaceably, which they considered a triumph. Exactly what anybody takes from this encounter depends on what they brought into it.

My oldest friend asked, over the weekend: “Please explain to me what's so bad about this picture?” He intended the question for the self-professed conservatives among his friends, but as a former conservative myself, I think I’m qualified to answer: they find this picture objectionable because it represents change. The America they grew up in, an America defined by homogeneity and unified values, is sundering into different ethical codes.

But I’m also qualified to say this explanation is moonshine. It relies upon something rhetoricians call the Fallacy of Composition, the mistaken assumption that the whole resembles its visible parts. Presumably the pseudonymous tweeter offended by Muslims and transvestites sharing public transit grew up, like me, in an economically and racially uniform suburb. And, unlike me, that tweeter never questioned that reality.

America’s historical lack of homogeneity has been extensively documented. For mass audiences, authors like Howard Zinn and James Loewen have combed the primary sources to create readable accounts of America’s racially, religiously, socially diverse Mulligan stew. Escaped African slaves settled among Native Americans in the territory that’s now the United States a generation before whites did, according to Loewen. Diversity is history’s rule, not an innovation.

For me, becoming aware that the standardized blandness of my childhood didn’t really represent America forced me to evaluate, and finally change, my political views. But I suspect this tweeter didn’t perceive diversity as something he (she?) became aware of, but as something new, forcing itself onto society’s existing frame. If, like me, this tweeter knew only other white, or slightly off-white, faces on the school bus, diversity on public transit may seem new.

The photo as it originally appeared, posted on Instagram

For anyone who perceives all change as decline, all history as something we must stand athwart, yelling Stop, the evolution from the white uniformity of childhood to today’s vast, multi-hued complexity becomes something to resist. Childhood’s whiteness was fixed and comprehensible. The world outside childhood, which requires constant reconsideration as its color wheel constantly evolves, forbids citizens to rest on their haunches. Some people find this threatening.

Diversity’s inherent threats have been common philosophy at least as long as there has been a United States. George Washington, in his farewell address, urged Americans not to be divided by politics or geography. But this same Washington, as President, prosecuted Indian wars in the Ohio River valley and mustered the militia against frontier tax protesters, to create a unity he knew didn’t really exist. Even then, diversity needed quashing.

The superficial irony of this controversy is that the political forces who would force Muslims, transsexuals, and other non-conformists into isolation, simultaneously believe Libertarian economics would free society’s innovators to create beneficent change. So they want to select society’s outcomes based on their preconceptions. When President Obama funneled money to clean-energy research, Rex Tillerson condemned “picking winners.” But who’s really picking winners here?

This isn’t the future liberals want. It’s the present Americans have. How we respond defines who we as Americans are. Will we stop history, roll back development to some utopian childhood that only existed under controlled circumstances, and pretend the last fifty years didn’t happen? I certainly hope not.

Friday, March 3, 2017

Boston Murder Supernova

Peter Swanson, Her Every Fear: A Novel

Young Londoner Kate Priddy worries about everything; she has a major anxiety disorder. Too bad some of her anxieties are real. Kate accepts an apartment swap with an American cousin she's never met, to escape the memory of the recent violence she's endured. But the very day Kate arrives in Boston, her new next-door neighbor turns up murdered. Kate tumbles into an investigation so intense and baroque, it threatens to undo all the healing she's accomplished.

Throughout reading this massively intricate thriller, I kept looking forward to writing this review. Peter Swanson crafts a complex plot, populates it with interesting characters, and kicks it into motion so that it gathers momentum as it rolls. I really enjoyed reading this book. Then we get to the resolution, and... um... squeak? It doesn't really resolve, just end, and Swanson kicks the victory to the wrong characters. What a letdown.

The story alternates between four viewpoint characters. Kate seems younger than her years, having lost prime years to an abusive lover's violent jealousy. (Swanson withholds exactly what happened for nearly 100 pages, but the suspense is undercut because the secret is explained in the dust flap synopsis.) Her disorder has her seeing menacing boogeymen in dark corners, a tendency she restrains with prescriptions and self-talk. This, sadly, means she winds up missing the real threat.

Alan Cherney lives in the same complex as Kate. His strange obsession with Audrey Marshall, the murdered woman, gives him insight into the investigation, but also makes him creepy. He and Kate have immediate chemistry; perhaps their contrasting neuroses make them soulmates. But the investigation's surprise turns put these two damaged people at odds, and everyone quickly starts doubting everyone else. As if murder wasn't intense enough, who could've guessed romance would make things worse?

Peter Swanson
Corbin Dell, Kate's American cousin, has secrets too. Last time he visited London, he left a trail of destruction nobody's yet cleaned up. He's weirdly cagey about his relationship with the deceased Audrey, and his motivations are contradictory at best. The more the police seek his statement, the more evasive he becomes. Soon, Kate worries that a mere ocean isn't wide enough to protect her from a stranger who might kill to protect his secrets.

Then there's the fourth character. Experienced crime fiction readers know enough to start a suspect list, and test it against mounting evidence, so we quickly determine who really killed Audrey Marshall. The motive remains less clear, and we wonder whether the characters will twig who their real enemy is before the violence has time to escalate. The killer dribbles clues slowly, and not always inadvertently, daring the others to act before becoming the next victims.

Swanson plays out the theme of how differently people can see the same event. His story unfolds mostly from Kate's perspective, as she attempts, mostly ham-handedly, to assist the police investigation. Then suddenly, he'll shift to another viewpoint, Alan's or Corbin's or the killer's, and replay the same events with new knowledge, forcing us to re-evaluate what we thought we understood. By replaying single events through multiple frames, Swanson demonstrates the difficulty of truly understanding anything.

We readers progress thus, seeing the same events several times, becoming aware of the real story only by increments. It's a dark story, too: Swanson creates a Stieg Larsson-ish world of subtle, invisible brutality, a world deeply divided between savage criminals and desperate victims. Though I disagree with that Manichean worldview, Swanson nevertheless spins it into a taut, gripping yarn, populated by tragic heroes and ambiguous villains. I found sticking with his story very easy.

Then, in the final forty pages, it whirls apart under its own weight. Swanson creates an overpopulated climax, tosses viewpoint scenes onto previously minor characters, and lets someone else vanquish the monster. I won't reveal the conclusion, since somebody may want to read this book. Its first 285 pages are quite awesome; as I say, only in the final forty pages does it unravel. I just wish the ending had the setup's tense, exciting texture.

I can't entirely blame Swanson. Many of my favorite authors have difficulty writing endings. And perhaps he's set the standard so high, with his rising action, that he couldn't possibly craft a conclusion to measure up. It just hurts, after enjoying the book so much, to see Swanson's story splatter like an egg dropped on a sidewalk. These characters deserve better. They've paid their dues; their conflict deserves a proper resolution, not a sudden stop.

Wednesday, March 1, 2017

Race and the Frontier in HBO's Westworld

Teddy Flood (James Marsden) and Delores Abernathy (Evan Rachel Wood)
This essay contains spoilers
HBO’s viral sensation Westworld is profoundly dependent on Frederick Jackson Turner’s “Frontier Thesis.” First published in 1893, Turner’s essay “The Significance of the Frontier in American History” pushed the idea that the frontier, the opportunity to move outward and settle wilderness, defined not just American culture, but individual American character. Westworld character Maeve Millay channels Turner directly in her scripted speech: “This is the New World, and we can be whatever the fuck we want.”

The frontier belief looms large in American thinking. Even after the Census Bureau declared the “frontier line” closed in 1890, Americans pursued new, less literal frontiers. Alaska declared itself “the last frontier” long before Captain Kirk called space “the final frontier,” a phrase echoing President Kennedy’s New Frontier. We praise the “frontiers of science,” likewise of medicine, of technology, of the Internet, and whatever. And Westworld implies, if we can’t find a frontier, we’ll make one.

This all, unfortunately, overlooks how profoundly racist Turner’s Frontier Thesis actually is. Turner stratified cultures into degrees of development, from white Euro-American settlement at the top, to nomadic Indians at the bottom. The frontier, Turner said, gave whites opportunities to shed settled, emasculated habits and “live like Indians,” tempting death, pushing their limits, surviving by their own wits, discovering their inner, half-savage nature. With the implication they could, at will, return to settled white civilization.

Except, historically, Indians didn’t “live like Indians” unless they had to. The traditional image of nomadic Native Americans, sleeping in tipis and following the bison herds, arose only after white settlers chased Indians off lands they’d cultivated for centuries. The frontier of “unsettled wilderness” was an entirely white invention, based on the belief that any land untouched by anything they considered civilization was theirs for the taking. Though outright taking has ceased, the theft remains.

Dr. Robert Ford (Anthony Hopkins) and some of his creations

Audiences will notice Westworld has few Indians. It’s racially progressive in some ways, including having an integrated Union Army even though America didn’t desegregate its armed forces until 1948, and the technical personnel making the park possible are thoroughly integrated. But Indians rarely enter except as background characters. Though I didn’t keep an exact count, I noticed few or no speaking lines from Native characters. They simply exist as enemies for white characters to fight.

In one episode, William and Delores get chased by the Ghost Nation, a band of scalp hunters. In another, a corporate story agent proudly displays an array of “hosts,” artificial people who occupy Westworld, explaining the characters for an exciting new storyline, displaying the brothel whores players can seduce, outlaws they’ll defeat, and Indians they'll flee. He says this standing before a tall, buckskin-clad native with feathers and facepaint, stock villain from a dime novel.

Without stating anything explicitly, Westworld accepts that frontier-style self-discovery absolutely requires prior humans. The true frontier is never unoccupied; it’s been cleared of brush and boulders by someone before, who then cedes it to the permanent settlers. This extends to all frontiers: Star Trek postulates a universe teeming with life, which Starfleet then sublimates into a larger, homogenized Federation. Battlestar Galactica, possibly the first mass-media franchise showing a lifeless universe, also treats “settling” as giving in.

The Man in Black (Ed Harris)

Please note, the actual American frontier didn’t last very long. Though the Oregon Trail opened in the 1830s, only the Transcontinental Railroad, completed in 1869, made mass white settlement in the American interior possible. Thus, the Wild West happened between 1869 and 1890, when the frontier formally closed. James Arness played Marshal Dillon on Gunsmoke longer than that. And in Westworld, we learn the Man in Black has been attending the park for thirty years.

So not only has the frontier myth lasted longer than the practical frontier, but Westworld keeps replaying the same single generation for decades, as paying customers keep indulging the same vices. White people keep chasing Indians off their land, Mexicans from their towns, and squatters off their farms, then returning to “real life.” In essence, they keep paying money to replay America’s racist past, a past many Americans would, in present times, like to forget.

The big, season-ending reveal, that our gentle hero William grows into the callously destructive Man in Black, makes plain the implications Turner’s original Frontier Thesis never addresses. When conquest stops being a means, and becomes an end, it strangles the human part inside us. We invent moral justifications for continued conquest. Without a purpose, we become the thing we once hated. If America really is Frontier Nation, we need to settle down soon, or die.