Wednesday, June 29, 2016

Will the Doctor Ever Be In Again?

Peter Capaldi as the Twelfth (current) Doctor
Readers encountering Joseph Campbell’s “Hero’s Journey” structure, from his monograph The Hero With a Thousand Faces, could be forgiven for seeing that structure in the BBC’s internationally popular series Doctor Who. The Doctor’s long journey, fraught with constant peril and usually resulting in a world improved by his presence, would make perfect hero fodder. The entire show is premised on passing into the unknown, encountering otherworldly dangers, and emerging transformed.

But this would be a false conclusion. Campbell’s Journey, which he terms the Monomyth for its cross-cultural resonances, has a selection of persistent elements that cross genres, from myth and religion, to folktale and campfire story, to popular novels and movies. Yet Doctor Who misses one crucial Monomythic component: the Doctor leaves home, faces trials, and achieves apotheosis, but somehow, he never returns home.

For Campbell, who intended his classic as a work of comparative religion, the return to Earth with bestowed divine revelation carried prime importance. At the top of the book, he describes how the Buddha, newly enlightened from his struggles with himself, faced temptations to remain in place, revelling personally in his newfound apotheosis. Yet circumstances (Campbell credits the gods, while acknowledging the impetus was probably more primordial) hastened his return.

We see this throughout religious history. Elijah achieved prophethood dwelling in some distant cave, but carried his mission to King Ahab in Samaria. Jesus began his ministry by retreating to the desert, withstanding temptations, but ultimately returned to civilization. Gabriel dictated the Koran to Mohammed in a mountain cave, but Mohammed, who couldn’t write, re-dictated his revelations to humans in Mecca and Medina, establishing the first pan-Arab nation and religion.

Paul McGann as the Eighth Doctor
Even fairy tales and epics continue this tradition. Odysseus faces trials, and is tempted to remain with Calypso, but ultimately returns to Ithaca and restores his people’s order. Red Riding Hood and her grandmother pass through the darkness of the wolf’s body, but return aware, Red promising never to deviate from the path again. And imagine how disappointing Lord of the Rings would’ve been had Frodo never returned to the Shire.

The Doctor, however, never returns home. Oh, sure, he occasionally pops ’round to Gallifrey, chastising the plush-bottomed Time Lords for inaction and corruption. The most recent season, at this writing, concludes with the Doctor rescuing his homeworld from its time loop, chasing off its reprobate president, and humbling his people. But as always, he vanishes again, heading for another, better adventure. His mission never returns him to his people.

Partly, we can blame this wandering on the demands of episodic television. As possibly the BBC’s most lucrative export, which undoubtedly subsidizes other innovative and risk-taking programming, Doctor Who is too valuable to ever quite end. The Doctor ceaselessly roams because ceasing would stall the story—unless the story transitions to the Doctor’s preaching to his assembled Time Lords, like a prophet, which would make sedentary, talky TV.

But even this doesn’t explain so much as describe. The Doctor doesn’t wander to achieve enlightenment, or to save anybody else. If he intended such, he could’ve stopped wandering in episodes like “Last of the Time Lords,” which ends with all humanity praying for salvation from the Doctor, or “Journey’s End,” wherein the Doctor, in public view, saves Earth and twenty-six other worlds, personally putting Earth back in orbit.

Tom Baker as the Fourth Doctor
It’s tempting even to attribute the heroic journey to the Doctor’s companions. After all, unlike the Doctor, his companions eventually return home. Except, usually, they don’t. Rose and Amy return to different worlds than the ones they left. Clara never really returns, spending an unspecified eternity journeying aimlessly. And Donna, well, she returns, but blissfully unenlightened. In the revived series, only Martha has really brought enlightenment home with her.

Thus, somehow, the Doctor never returns, nor helps others return. He’s neither messiah nor prophet, neither buddha nor bodhisattva. For him, the journey becomes all-encompassing, always headed somewhere, never arriving. Even his longest sojourn, the centuries he spends defending the town of Christmas in “The Time of the Doctor,” proves temporary. After vanquishing the massed hordes of monsters, the Doctor ultimately resumes his wandering, forever chasing an enlightenment that never quite arrives.

Where this leaves the Doctor’s journey, who can say. Everybody’s journey is unfinished, until it’s finally finished. But one wonders whether the Doctor’s journey ever could end. In Campbell’s Monomyth, the Hero faces a temptation to avoid returning. The Buddha could’ve sat under the Bodhi Tree forever; Christ could’ve accepted sovereignty over reality. But true heroes return. Except the Doctor: somehow, his heroism never involves rejoining his people.

Monday, June 27, 2016

The Robots of the Rio Grande

Cat Rambo, Altered America: Steampunk Stories

I have mixed feelings about steampunk fiction. I’ve read some really good steampunk, but most of the subgenre is anti-modernist nostalgia, boilerplate fantasy with magic replaced by sufficiently un-advanced technology. But I really like short story master Cat Rambo, arguably the truest living successor to genre doyen Damon Knight. So when a beloved author undertook a subgenre I distrust, I had my doubts. The resulting hybrid truly could go either way.

On balance, Rambo does pretty well. Her work isn’t immune from cliché, sometimes falling into the trap George Orwell called “phrases tacked together like the sections of a prefabricated henhouse.” But she owns those clichés boldly, coöpting shopworn phrases to tell superior stories. Sure, she relies on period nostalgia, as paperback readers frequently expect. But she retunes and subverts nostalgia formulae to tell the story she needs, regardless of expectation.

Rambo’s ten stories pinch a medley of influences, and spread across time, from the late 18th to the early 20th centuries. A mysterious nobleman courts an unconventional inventor from under her judgemental fiancé’s nose. A hospital for Civil War veterans harbors dark secrets, as human soldiers are recycled for parts. A werewolf motorcar enthusiast races a vampire’s train for the ultimate prize: a human woman’s heart. The stories mix recklessly.

Together, Rambo’s stories create a history familiar enough to evoke wistful sentimentality, but distorted enough to challenge preconceptions. By inverting readers’ historical reminiscences, she questions our received narrative. American forces fight an unnamed enemy Out West for control of a powerful fuel, phlogiston. Which isn’t really petroleum, stop saying that. Aristocrats vie to control a deeply class-ridden Europe, but those aristocrats are werewolves and vampires, literally feeding on their people.

The Civil War looms large in Rambo’s history. Her Abraham Lincoln won a Pyrrhic victory by allying with Haitian warlords to create a zombie army. (I’ll forgive her occupation-era stereotypes, provisionally.) But in peacetime, what Lincoln created cannot be silenced, and the racially diverse living find themselves unified against the remaindered zombie hordes ravaging the countryside. Thus Rambo asks readers, is winning worth the price? She revisits this situation repeatedly.

Cat Rambo
By her own admission, Rambo began this collection without realizing she’d commenced a unified alternate history. Thus, some early stories disagree about their timeline: “Memphis BBQ,” about self-propelled mechanical men chasing a zeppelin, offers a different, less dark post-Civil War America than subsequent entries like “Snakes on a Train” and “Rappaccini’s Crow.” Later stories show greater continuity. Throughout the second half, Rambo’s appallingly grim history develops its own internal equilibrium.

Not that it becomes uniformly bleak. “Snakes on a Train,” which Rambo admits began with its title pun, retains a playful, sexy humor even when emphasis shifts onto monsters. “Rappaccini’s Crow,” by contrast, takes its cues, as you’d expect, from Nathaniel Hawthorne, with distinct hints of Poe. But its “horrors of war” theme arguably shows equal influence from Ernest Hemingway and Dalton Trumbo. This is science fiction consciously as literature.

Rambo doesn’t limit herself to just one form. Despite the Altered America title, her alternate history begins and ends in Europe. She opens with a Regency romance, then spills straight into a Dickensian protest tale. She has two Elspeth and Artemus mysteries, Western thrillers featuring twin outcasts: a clockwork man and a Jewish psychic. From there, she caroms through straight-up Western adventure, train heists, fairy tales, and more sprawling genres.

One theme permeates the entire book. Of ten stories, only two, “Web of Blood and Iron” and “Seven Clockwork Angels, All Dancing On a Pin,” don’t feature a female protagonist rejecting the romantic ingenue role. Not that they’re opposed to love: Pinkerton agent Elspeth Sorehs, female lead in two stories, openly embraces it, and other heroines fight for love they’re doomed never to receive. But they don’t define themselves romantically.

On one level, like much steampunk, Rambo exhibits flip sentiment and corny nostalgia. Admittedly, readers like that. But Rambo doesn’t make a nest in old history textbooks; she uses history to question readers. Frequently, especially in war-related stories, she threatens our understanding of the present. Thus she rises above a highly stereotyped subgenre to present tales ranging from the merely mawkish, to the downright dangerous, often in the same story.

At this writing, this book is available only in digital format. It has some visible scars from its manuscript formatting, including editorial notations that should’ve been removed before publication. These mistakes are few, and pretty widely spaced, so patient readers can simply read around them. But it does take patience.



See Also:
Lord of the Pings
Jane Austen Presents a Sherlock Holmes Extravaganza

Friday, June 24, 2016

Who Are EU?

One of my friends has compared the results of this week’s Brexit vote, where the UK took the unprecedented step of electing to leave the European Union, to voting for Donald Trump. This makes sense, considering the almost entirely white, rural, post-industrial breakdown of the results. Another friend compared it to Anschluss. I dread Nazi comparisons, which reduce discussions to name-calling competitions, but sadly, based on Europe’s larger context right now, the comparison is justified.

As an American, I try to remain aloof from international politics. We Yankee Doodles have an unfortunate history of weighing into discussions we don’t truly understand. While I’m no isolationist, I appreciate FDR’s preference to keep America outside World War II until circumstances were clearly defined, and America’s interests were inarguable. But events of recent years have me significantly concerned about European, and world, politics, because I can read, and we’ve seen these situations before.

The popularity of nativist groups, like German PEGIDA, the French Front National, and the UK Independence Party. American cluelessness, embodied in a Presidential candidate lacking policies, but brimming with charisma and paranoia. Governments pushing austerity on citizens already facing stagnant wages and soaring commodity prices. And citizens desperate for someone to blame, even if the creepy foreigners and vaguely brown-skinned people they choose had little influence on their situation. Tragically, we’ve seen all this before.

Let’s be clear. I’m no fan of the EU, or specifically of the Eurozone. The concentration of European monetary policy under the Maastricht Treaty has damaged member states’ ability to control domestic economies. Recent upheavals in Greece, Spain, and Ireland have reflected the international export of monetary policy: pre-2008 paper millionaires flooded spongy economies with cheap money, buoyed by Eurozone guarantees, then fled when things got rough. Greg Palast and Matt Taibbi have explained this volatile situation perfectly.

But as a non-Eurozone state, Britain didn’t really face such concerns. One could levy charges of economic blackmail. What Britain did last week is little better than screaming “It’s my ball, I’ll make the rules.” As Earth’s fifth-largest economy, and Europe’s second-largest (after Germany), besides being home to one of Earth’s largest stock exchanges, Britain has significant power in Europe. Taking that power outside the EU throws the European economy, already drifting, into uncharted waters.

A withdrawal like the Brexit arguably could’ve happened elsewhere first. The name gained traction because we already faced the “Grexit,” Greece threatening the same maneuver, and other countries throughout the Eurozone have threatened to pull a runner. None so far have done it, because they need Germany’s influence, as the dominant Eurozone power, to stabilize their economies. Having delegated their economies to the European Central Bank, based in Frankfurt, would-be European leavers have too little domestic standing to leave.

But just because nobody’s done it yet doesn’t mean nobody will. Britain, one of Europe’s largest trading destinations, is leaving the EU with no alternative policies prepared. This means countries that trade with Britain, from French goods to Spanish tourist resorts to Slovakian white slavers, literally have no idea where their money’s headed; Weimar-era trade tariffs may appear next Thursday. Sending anything to Britain right now would be foolish. Money’s about to become very scarce.

Sadly, this makes international paranoia worse, not better. Britain already has a government pushing exactly the same austerity measures the Rothschilds pushed on France between the World Wars. France didn’t surrender to Germany because it was cowardly; it surrendered because it was broke from trying to shrink the economy enough to go back on the Gold Standard. A standard which, don’t forget, is being heavily pushed by economic reactionaries in Europe and America right now.

We’re witnessing the reconstruction of conditions which existed in Europe before weakened central nations began electing Fascist governments. Fortunately right now, even states with reactionary governments, like Germany and Austria, are committed to not repeating these mistakes. It’s states that fought against fascism that currently support its modern successors. Farage in Britain and Le Pen in France lack Hitler’s snarling charisma, but seem committed to pushing a photogenic successor to his policies on naïve Europeans.

Fortunately, we have the perspective of history now. If pre-WWII conditions exist again, we can recall the past, and warn against them. The EU was created to shackle nationalist tendencies and privilege broader commitments to human rights, and if British voters regret their hasty exit—which they will—we have perspective enough to prevent regression. But it will require international actors, including America, to resist playing tit-for-tat. With Trump’s candidacy in full swing, I dread the possibilities.




See Also: Those Who Don't Learn From History

Monday, June 20, 2016

The Last Days of College Sports

Joe Nocera and Ben Strauss, Indentured: The Inside Story of the Rebellion Against the NCAA

It’s impossible to read Joe Nocera’s economic history of the NCAA without seeing an implicit metaphor for abusive industrial capitalism. We have a proletariat, valorized for their unceasing hard work. We have a managerial class, who perform intellectual and technological gymnastics to keep the proletariat’s wages artificially low, while acclaiming the market forces that commercialize the game. And there’s the bourgeoisie of Division I coaches and athletic directors, made rich from the other classes’ labors.

Nocera, a veteran business journalist and pro-business moderate who has recently examined sports issues for the New York Times, is joined here by junior Times contributor Ben Strauss. The extra hands probably help unpack a complex issue that affects many people who may not even realize it. Though dissatisfaction with the NCAA’s business model has become increasingly commonplace, few fans probably understand how we reached this impasse. This historical illiteracy makes fixing the problem difficult.

Anecdotes about NCAA abuses of unpaid “student athletes” (a term coined specifically to avoid calling players “employees”) abound, though it’s difficult to derive meaningful statistics, since the intensely private NCAA keeps its books closed. We know the Association forbids its athletes to unionize, a position inconsistently supported by the Department of Labor, since the players’ one compensation, academic scholarships, isn’t considered taxable income by the IRS. College players aren’t paid, given workers’ comp, or contracted.

Even without statistics, however, it’s possible to determine patterns. This book runs rather long, and lush with narrative, partly because Nocera needs to establish a pattern of circumstantial evidence. This includes players’ personal experiences, public and private quotes—some quite long—from NCAA officials, and numbers where they exist. The Association’s secretive policies make smoking-gun proof elusive. But recurring narratives, accumulated across decades, make a persuasive case that the NCAA profits off “amateur” athletes.

The NCAA was founded in 1906, basically to standardize rules and safety procedures in college football. Not until 1951 did the Association gain any governing authority over college sports. From this time, the NCAA made the players’ amateur status league dogma, asserting that players’ unpaid positions enhance the game. This dogma has influenced everyone from the league to the Supreme Court. And the Association will undertake massive investigations to ensure its star players remain unpaid.

Joe Nocera
Nocera’s accumulation of anecdotal evidence is both huge and frequently bizarre. Players get targeted for accepting groceries, car rides, and discounted pants. While denying players enough pocket money to buy toothpaste (Nocera estimates eighty percent of student-athletes live in poverty), the Association basically ends college careers for students accepting dinners from booster organizations. And the Association’s investigators, including former federal agents, are disproportionately likely to target poor black students, whom it expects to remain poor.

Meanwhile, the NCAA’s academic requirements are both onerous and inconsistent. The Association maintains strict academic accomplishment policies to justify its claim that student-athletes are scholars first. Its standards reach back to 9th grade for many players. Like amateurism demands, academic requirements are enforced with racial differentiations, though in the opposite direction. Well-off white students are more stringently policed, while poor blacks from struggling, underfunded schools often get waivers. You’d almost think the Association supported racial divisions. Hmmm...

Nocera is quick to assert that, in writing all this, he doesn’t describe every sport. Collegiate water polo and tennis, though prestigious in their circles, aren’t money makers, and thus not particularly abusive. His criticisms specifically describe what he terms “the revenue sports,” football and men’s basketball. These two sports employ over 15,000 uncompensated players, mostly poor, netting revenues measured in the billions. Though again, monetary numbers remain vague with the NCAA’s notoriously closed books.

Nevertheless, the accumulation of evidence is overwhelming. The NCAA makes literally billions off laborers whose work remains perennially unpaid, and takes remarkable steps to ensure market forces never reach the players at the bottom. It demands off-the-books overtime from workers too poor and disorganized to oppose management. While some revenue subsidizes less-prestigious sports, top Division I coaches make more than some NFL coaches. And almost no revenue reaches academics. The consequences are almost Marxian in their pervasive devastation.

Nocera is an excellent storyteller. He weaves players’ personal anecdotes, some of which are almost Stephen King-ish in their bleak tone, with journalistic passages of statistics and quotes. His investigative prowess doesn’t overwhelm the human costs of the Association’s practices. As he notes, recent history has the patterns getting worse rather than better. This mix of history and current events will make a brutal wake-up call, for sports fans and believers in economic justice alike.

Friday, June 17, 2016

The Poet of Hollywood Boulevard

Martin Ott, Interrogations: Stories

A young man, who cannot reconcile himself to married life, must return to Mosquito Island to repair his relationship with the first woman he loved: his mother. An aging mother, whose daughter has been praying to a mysterious Virgin Mary sculpture, realizes she must make her own miracles. A husband and wife, drifting apart, discover their very literal bond when their daughter physically glues them together. Twenty short, powerful snippets, given brief but luminous life.

As a poet, Martin Ott has a distinctive voice. Blending his military experience, society’s suffusion with media, and the intricacies of making a life in entertainment, his verse has a concise punch often missing in poetry written by tenure-track professors. Ott’s fiction somewhat lacks that confidence, appropriating elements from other authors he respects and emulates. Not that his fiction isn’t good; he’s a skilled mimic. Rather, as a fictioneer, he’s clearly early in his career.

And what influences he mimics. Reading his stories, veteran audiences will recognize Ott emulating Annie Proulx, TC Boyle, and Deborah Eisenberg, among others. His eclectic borrowing gives this book an encyclopedic feeling, like a Best Contemporary American Short Fiction anthology filtered through an ambitious student’s viewpoint. Sometimes one suspects he’s imitating established authors because he lacks confidence in his own tales. Other times, it’s like uncovering a lost work by some favorite writer.

Ranging from under two to nearly thirty pages, Ott’s stories span a gamut of styles, voices, and influences. Some stories have overtones of magic realism, especially as characters create their own realities, then drag others with them, willfully or otherwise. Sometimes Ott limits himself to strict realism, hitting readers directly with a jarring overload of detail. Stories occasionally hint at mysteries and thrillers, though he avoids recourse to detectives and other professionals. Ott’s voices swell.

At his best, Ott’s language resembles the poet he usually is. Momentary glimpses of powerful, incisive language strip away characters’ pretensions, especially in his shortest stories, where a single moment becomes an entire life. A little girl promises her faux boyfriend: “We’ll do dangerous things, then we’ll fight about it.” A former military interrogator (not the author himself, surely) “yearned to break men like bread sticks.” The image does more than mere twig-snapping could; it invests family, hearth, and religion into the violence.

Martin Ott
Ott’s geography is somewhat uneven. His best stories emphasize two regions: his adopted home of coastal California (some highlight San Francisco, but experienced coast-dwellers will recognize it’s transparently a cipher for Los Angeles), and small-town Michigan, a region he revisits often enough that one suspects it’s his home domain. California, for Ott, represents dreams made manifest, the admixture of sun-kissed opportunity and bitter disappointment, the two experiences most Californians recognize from working overtime in the sun.

Michigan, however, is something Ott’s characters mainly reconcile themselves with. His Michigan stories mostly involve somebody, not always the viewpoint character, returning after fleeing, confronting some long-buried truth. “Home,” to these characters, represents something they escape, even while living there (underage drinking and drug abuse, which numb users to the present, are ubiquitous). But a bad home is still home, and Ott’s characters return because they need stability. Even if they must build it themselves.

Besides these two locations, Ott liberally uses images from Wyoming, Alaska, Seattle, and elsewhere. These sites, unfortunately, are more general and vague than California or Michigan, giving rise to the suspicion that Ott has simply elected to imitate other authors (Proulx in Wyoming, or Boyle in Alaska) he finds influential. These settings become generalized non-places rather than actual locations. If we can accept the dreamlike conditions, the places are okay. But they lack Michigan’s detailed, meaty realism.

Thus accepting Ott’s stories requires accepting Ott. Though a master poet, he remains a journeyman fiction writer, and demands an audience that can accept his learning curve. I mostly can; only very late in this volume do Ott’s inconsistencies become prominent enough to bother me. Even when he presents Wyoming, a state I know pretty well, as more archetype than location, I feel only minor twinges. Ott’s still learning fiction, and that’s okay.

At his best, Martin Ott’s fiction peels away the layers of pretense to uncover the underlying facts, like the interrogator he once was. Narrative, for Ott, exposes characters’ inner journey, as most literary fiction does, but it also exposes the factual core beneath subjective experience. And often, Ott exposes the jarring friction between reality and experience. Like an interrogator, Ott pierces pretense, laying reality bare to criticism and to brisk, informed response.

Monday, June 13, 2016

The Creation of the Modern World

1001 Books To Read Before Your Kindle Battery Dies, Part 70
Stephen Greenblatt, The Swerve: How the World Became Modern

One fateful day in 1417, Poggio Bracciolini, former Apostolic Secretary to Antipope John XXIII, now unemployed, walked into a monastery in southern Germany. History doesn’t record which one. And the journey wasn’t accidental; Poggio went deliberately, with one goal in mind: to uncover ancient Latin books forgotten in dusty scriptoria. This monastery proved a treasure trove, including one book so important, its worldview arguably changed the world.

Harvard professor Stephen Greenblatt is primarily famous as a Shakespeare scholar. But like most public intellectuals, Greenblatt lets his interests run pretty polyglot. This multiple-award-winning nonfiction narrative combines history, philology, philosophy, and other topics into an impressively exuberant stew linking several stages of European history together. The shared focus: the last surviving copy of Titus Lucretius Carus’ epic philosophical poem, De Rerum Natura.

Poggio was one among several Italian thinkers known as the Humanists. Despite that term’s present association with atheism, early Humanists were fairly religious. Their founder, Petrarch, was a cleric, while many lay Humanists, like Poggio, were nevertheless employed by the church. One early Humanist was even elected Pope. In an era of Inquisitions and Ecumenical Councils, church status provided pioneering Humanists with significant protection and economic stability.

The Humanists shared a conviction that ancient Roman culture, and, in time, the Hellenic culture that preceded Rome, represented an apex of human accomplishment. Notwithstanding their religious alliances, the Humanists sought to recover Roman history from a millennium of abandonment and willful suppression. Some, like Petrarch, doubled as creative artists, while others were content as scholars. Poggio was the latter, using his church income to subsidize a career in book hunting.

That winter’s day in 1417, Poggio used his papal connections, immense historical knowledge, and impeccable handwriting to gain access to a neglected monastic library. Greenblatt postulates it may have been Fulda, a one-time center of knowledge and education, since fallen on hard times, though he concedes that’s speculation. Wherever it was, the library yielded several irreplaceable texts, including an astronomy guide and a verse history of the Second Punic War.

Poggio Bracciolini: a posthumous engraving
made from a sketch done during his life
It also yielded a single copy of Lucretius. This poet was known from throwaway references in Cicero and St. Jerome, but his work was considered lost. Poggio couldn’t have known the full significance of his discovery, because he had time enough only to skim a few pages before entrusting it to some underpaid copyist. But he’d rediscovered the most thorough introduction to Epicurean philosophy, a complete worldview written without reliance on gods, spirits, or afterlife.

Lucretius recorded, in unrhymed Latin verse, a Greek philosophy declaring that reality, not Platonic ideals, is the starting point of conjecture. Epicureanism postulates that matter is not infinitely divisible, but composed of atoms. These atoms combine and separate, creating the movements of reality. Everything, from gods and stars to humans and dust, shares this atomic nature. Atomism, to Lucretius, makes humans part of reality, free from fear of divine retribution.

Greenblatt isn’t so naïve as to believe Poggio’s discovery transformed Europe overnight. Indeed, Lucretius threatened worldly authorities so much that his verses circulated surreptitiously in learned circles for a generation. But Greenblatt situates this discovery amid a European culture where recovered Latin learning was already beginning to transform arts and sciences—a sort of Renaissance, if you will. Poggio was part of the metamorphosis already sweeping European Christendom.

Epicureanism’s deist (not really atheist) structures led scholars to reëxamine their approaches to thorny issues, both religious and secular. It led devout Catholics like Thomas More to give increased weight to scientific reasoning. It prompted Giordano Bruno to question received dogma, eventually exposing the Inquisition’s moral rot. European scientists began testing hypotheses, not just receiving them, while moralists began examining human consequences, not just divine mandates.

Sometimes Greenblatt speculates beyond the limits of evidence. His descriptions of medieval Christian asceticism, for instance, assert that Christians flagellated themselves to purge the influences of Epicureanism. That seems like a reach, given how frequently even pre-Christian pagans distrusted and caricatured Epicureanism. Christians probably tortured themselves, rather, in reaction against the sybaritic behaviors common among their Roman persecutors. In this and some other circumstances, Greenblatt arguably oversimplifies complex historical trends.

Nevertheless, Greenblatt’s massive historical narrative, stretching from the third century BCE to the early American Republic, demonstrates history’s arc. While we moderns hope that progress is a universal imperative, Greenblatt shows it’s more contingent than that, with movements toward both knowledge and ignorance. History isn’t vague or impersonal; it’s driven by human activities and momentary choices, which may initiate consequences we cannot fully comprehend until years, sometimes even centuries, later.

Friday, June 10, 2016

Why Must You Talk To Me While I'm Reading?

Short story writer Lindsay Merbaum, writing last week on women-centered website Bustle, asked a very valid question: Why Won't Men Leave Me Alone When I'm Trying To Read In Public? She describes men accosting her, practically cross-examining her behavior, challenging the validity of actually reading in public. She quotes friends who have faced everything from rude rebuffs to sexual propositions while reading, alone, in bars and restaurants.

Merbaum makes many excellent points, which anybody who’s tried to read in public places will find painfully familiar. But she makes one fundamental error: she assumes the challenges she faces reflect her gender, that these men wouldn’t treat her this way if she weren’t a woman. She even equates book-based boorishness with leering and sexual harassment. I disagree. In my experience, the operative challenge here isn’t Merbaum’s gender; it’s her book.

Speaking as a dude who reads constantly, I’ve observed people—mostly, indeed, men—see my book as a public invitation for conversation. Some are dedicated readers too, and want to share that experience. Others see my book as threatening them for reasons buried inside subtextual strata. Some just talk. But somehow, my book’s physical presence invites certain people, a small but vocal minority, to start talking while I’m trying to read.

This week at work, I’ve been reading Stephen Greenblatt’s The Swerve, about the rediscovery of Lucretius’s poem De Rerum Natura, a pivotal moment in the Italian Renaissance. On Tuesday, a goateed electrician asked the title, then began holding forth, at length, about something he’d read recently about Tipler cylinders. In fairness, the topic sounds fascinating; but I had thirty minutes to eat my sandwich and read, and didn’t solicit this conversation.


On Wednesday, a plumber noticed me reading, and began asking about the book. Who was Lucretius? Why is one poem so important? Why would Lucretius write about physics in verse? The plumber’s engagement was gratifying, I confess, but again, lunch break is brief, and I read partly to separate myself from the constant chatter around the workplace. Then on Thursday, the same electrician wanted to know more about the book.

Holy schlamoly.

Partly, I suspect, people notice the contrast between an essentially private, internal activity, reading, and my public venue. Like napping or changing clothes, reading publicly isn’t necessarily offensive, but does make people feel awkward. Should I keep silent? Why would someone read in this noisy, cluttered environment? Is this person judging me because I’m not reading? My book creates a cognitive dissonance they reconcile by trying to talk.

The fact that men mainly engage me at work proves little, since my workplace is overwhelmingly male. Of over a hundred workers, only four are women, and one speaks little English. But reading in other environments merits comments too, mostly from men. Reading in bars, as Merbaum notes, elicits male comments. Women leave me alone—even when I’d rather they spoke. Men feel compelled to talk about my book.

Restaurants, with seating turned inward, are somewhat better. Strangers don’t intrude. Except waiters. When women take my order, they respect my privacy, offering only whatever conversation is necessary to keep me fed. Male waiters talk. And not just waiters. Anneli Rufus recommends introverts eating alone dine at the bar, as it elicits fewer comments and judgmental stares. I say, don’t bring a book. Your bartender will try to converse.

This realization, that others plague me with unwanted conversation, makes things awkward when I see women reading publicly. As an aging bachelor, I still hope someday to marry and start a family. And since I got unceremoniously ejected from my university job, I have vanishingly few opportunities to meet erudite women. Women reading in public are advertising their cerebral tendencies, their intellectual curiosity, their willingness to encounter new ideas.

They’re also advertising that they’d like to read their book. Watching the collision between reading, a private activity, and the public milieu gives me pause. Are they open to approach because we’re amid people, or should I respect their solitude while reading? I usually give them their seclusion, because there’s no simple answer. But my trade-off, for respecting women’s privacy, is that I’m unmarried in my forties.

Unfortunately, there’s no etiquette for handling public readers. There’s no out-and-out censure, as there is for clearly undesirable behavior like fighting or groping. Neither is there a clearly acceptable test of willingness. Sighting a beautiful woman, I can sound her out by buying her next drink… unless she’s reading. So I understand Merbaum’s complaint, from both sides. I lack any suitable solution.

Wednesday, June 8, 2016

The Last Gasp Of 40-Hour Weeks


As we approach the final stretch in building the city’s new high school, many subcontractors have realized they’re behind schedule. The site must close by July 31st, and most subs must complete their responsibilities before that. As deadline-induced panic sets in, bosses have begun redoubling demands on workers. This involves common annoying activities, like micromanaging and exhaustive checklists. It also includes slave-driving practices guaranteed to create new problems.

Chief among these counterproductive procedures: the twelve-hour workday. Though I haven’t asked every subcontractor, the electricians, plumbers, and window hangers I’ve spoken to complain about mandatory ultra-long workdays. These aren’t just some days, either. Workers are skipping scheduled activities, like bowling leagues, their kids’ softball games, and religious observances, to work twelve hours a day, six and seven days per week. The hours show in their drawn, colorless faces.

Nearly everyone affects a cheerful demeanor when discussing their twelve-hour days. “I’m rolling in the overtime,” they say. “That’s my kids’ college fund,” one drywaller boasted. But they speak with manic, Joker-like grins, desperate to convince themselves. Most roll home around sundown, exhausted, to catch a reheated dinner and collapse into deep, comatose sleep, only to arrive back at sunrise and try again. Incipient problem drinking is epidemic.

I ran the math. Assuming the workers pull the Bernie Sanders wage of $15 hourly—which few do—a forty-hour workweek comes to $600 before taxes. But working twelve hours a day, six days per week, entitles workers to 32 hours of overtime at time-and-a-half, or $22.50 per hour. That’s $720 in overtime, or $1,320 combined, more than doubling workers’ wages without doubling time worked. Does this return justify the investment?
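For readers who want to check that arithmetic, here’s a minimal sketch in Python. The $15 base rate and the standard time-and-a-half multiplier are the assumptions from the paragraph above, not figures from any subcontractor’s books.

```python
BASE_RATE = 15.00          # assumed hourly wage, per the example above
OVERTIME_MULTIPLIER = 1.5  # standard time-and-a-half beyond 40 hours

def weekly_pay(hours):
    """Gross weekly pay for a given number of hours worked."""
    regular = min(hours, 40) * BASE_RATE
    overtime = max(hours - 40, 0) * BASE_RATE * OVERTIME_MULTIPLIER
    return regular + overtime

standard = weekly_pay(40)      # 40 hours -> $600.00
crunch = weekly_pay(12 * 6)    # 72 hours -> $1,320.00
print(crunch / standard)       # 2.2x the pay for 1.8x the hours
```

The same function makes the employer’s side of the trade-off visible: every hour past forty costs half again as much as the hour before it.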

Probably not. Besides the increased wages, longer hours involve other added costs. Tired, bored workers are more likely to commit errors in their jobs, errors that require them to redo tasks until they’re done correctly. Having watched electricians rip out previously installed wiring, plumbers re-hang cockeyed roof drains, and concrete workers jackhammer out mislaid sidewalks, I can’t help wondering: how much did that run up their overall costs?


Nerves are fraying around the jobsite. For many subcontractors, finishing this project will mean not just ending the present task, but relocating to pursue the next job. Many of my co-workers are functionally absentee husbands and fathers, regardless of whether their families live nearby. Well-paid but tired, with nothing in their lives but more work, these employees shuffle around, looking uncannily like reanimated corpses. There have been fistfights.

Some problems are unavoidable when you jam a hundred men together in a confined space with little autonomy and low wages. Let’s not kid ourselves, men drawn to careers in construction probably aren’t big-spirited yoga practitioners in their off hours. But the long hours under high-pressure conditions, redoing tasks well-rested men could’ve done better the first time, make problems more likely. These explosions, real or embryonic, are completely avoidable.

Imagine if subcontractors offered forty hours’ wages for thirty hours’ work. A thirty-hour workweek averages to five six-hour days. Given the length of daylight in Nebraska in summer, subcontractors could easily manage two six-hour shifts daily, getting workers in, getting their most productive hours, then sending them home before fatigue and boredom begin causing significant problems. Less stress, fewer do-overs, and less overtime pay. Sounds like a winner to me.

Under these conditions, even if subcontractors needed to pull six-day workweeks (which they might not, if they don’t need to repair botched jobs), workers would still have plenty of time for family, community activities, and rest. Employers save on overtime, OSHA-reportable injuries, and rework. Budgets go down, schedules get streamlined, and tempers don’t fray. This idea isn’t new to me. Senator Hugo Black pushed a thirty-hour workweek over eighty years ago.

But then as now, management would rather pay more for tired, bored workers. We’ve accepted the idea of work as a moral good in its own right, separate from the worker’s outside life, and think there’s nothing unusual about requiring America’s poorest-paid workers to pull hours so long that the ASPCA would object if we inflicted them on a horse. Then we judge them morally for widespread class-specific alcohol abuse problems.

This contradicts the entire conservative attitude regarding wages and regulation. If we’d rather pay more for tired workers to redo mistakes than hire enough workers initially, the entire supply-side argument collapses. But perhaps that’s the point. Perhaps, below the surface, capitalists realize they need workers more than workers need them. And they’d rather fatigue their workers than face the consequences.

Monday, June 6, 2016

When Bread Could Kill You

Paul Graham, In Memory of Bread: a Memoir

Paul Graham, an upstate New York English professor and gastronome, established elaborate rules for himself: Cook your own food. Use local ingredients. Keep fat, sugar, and glycemic index low. Cooking for his wife, eating the bread she baked, and home-brewing beer with his best friend were staples of building a sustainable, locavore lifestyle. Everything food hipsters say will keep us, and the land, healthy. So he couldn’t understand the sudden, shooting bursts of abdominal pain.

Diagnosed with celiac disease at age 36, Graham found himself in an increasingly common situation. Diagnosis rates worldwide have skyrocketed. But are celiac, and other gluten intolerance disorders, really more common today? Or are people previously misdiagnosed now being recognized? (This isn’t academic. I have two gluten-intolerant friends, one of whom was tested for everything from cancer to lupus for over a decade.) Graham resolved to do what scholars everywhere do: research the situation, and report.

This volume starts with Graham’s own situation. It’s primarily a memoir of Graham’s own struggle as he goes wholly gluten-free. Fortunately, his wife joins him on the journey. I wish I’d been that brave; when my then-girlfriend was diagnosed gluten-intolerant, I selfishly hoarded coffee cakes and cinnamon rolls. But Graham and his wife, Bec, find they’re not just giving up one ingredient. They’re walking away from buffet spreads, pub nights, and food’s historic social implications.

Wheat agriculture, it appears, helped form Western civilization. As Graham’s investigation expands into the history and science of gluten, he finds wheat so basic to Western history that to abjure eating bread (Graham loves the phrase “wheaten loaf”) means to not participate in our culture. Food-sharing rituals, from pot-luck brunches to Catholic communion, underpin Euro-American culture, and eating bread looms large. Maybe that’s why humorists and hipsters treat gluten-free dieters as mere figures of ridicule.

Since Graham, an award-winning food writer besides his professorship, cooked for himself, and his wife baked, food wasn’t just bodily sustenance; it bolstered the intimacy of his marriage. Thus, for him, the macro-scale and the micro intertwined. Many recipes, and many prepared ingredients, involve wheat where you’d never look for it, especially as a stabilizer. As he abandoned the cultural history of eating wheat, he also lost the personal history of preparing his own dinner.


Our isolated, private society today often loses the community aspect of food. But the simple act of sharing conversation around the table has historically underpinned our society. When he had to walk away from that history—not just the cultural history of shared food, but the personal history of knowing how to prepare his own dinner—Graham had to relearn everything he knew. Not just about food, but about himself, and his place in society.

For one, he has to rediscover how to be Paul Graham in a world where hobbies like baking and brewing are suddenly off-limits. He needs to relearn cooking. Many store-bought gluten-free (GF) foods simply substitute rice, tapioca, or sorghum flours for wheat, assuming the process remains unchanged. Not so, as Graham discovers in actually preparing edible GF bread. His mentors, though meaning well, taught him concepts that no longer apply. Cooking is an adventure again.

Is bread even really necessary? Graham suggests many deeply ingrained expectations regarding food, the centrality of bread among them, are learned, not innate, though nearly impossible to discard. With time, he internalizes the systems necessary for understanding the new world he finds himself thrust into. Though by the end he returns to home-baking his own GF bread, he acknowledges that even then, it means unlearning habits he’d previously mastered. He must embrace everything his teachers told him to avoid.

By his own admission, Graham set himself many food-related rules well before the onset of celiac disease. His “locavore” proselytizing sometimes gets intrusive, and his quest for celiac-friendly foods at farmers’ markets seems quixotic. But everything he says sounds familiar to anyone forced, by health or circumstance, to abandon wheat. The discomfort at public food gatherings (can I eat off this buffet? How do I know what’s safe for me?). The mockery one faces merely for eating differently.

If it’s true that only the intimately personal is truly universal, Graham achieves that here. No two gluten-sensitivity sufferers have identical symptoms; that’s what makes diagnosis so difficult. However, everyone who abandons gluten endures the same isolation: the same withdrawal from easy carbohydrates, the same alienation from bread-eating friends, the same journey through dietary blandness. His memoir of struggle can inform all readers, and offer hope that leaving gluten doesn’t mean leaving good food forever.

Friday, June 3, 2016

Are Vitamin Supplements Bad For You?


Let me begin by answering my title question bluntly: yes. The overwhelming preponderance of scientific evidence indicates that consuming high-dose vitamin supplements has negative long-term health effects for most people. If, like me, you find perusing scientific literature sleep-inducing, the findings have been condensed by everyone from The Atlantic to The Daily Mail to comedian Adam Conover (see above). Only the vitamin industry’s well-funded trade association disagrees anymore.

That said, I only recently discontinued my daily vitamin regimen. Despite having read the relevant reports; despite understanding the health risks associated with throwing my natural bodily harmonies out of balance; despite the fact that they’re so damn expensive, I continued taking daily multivitamins for years. If even an educated individual like me, someone who takes pride in staying abreast of facts, continues doing something harmful, we should ask: why?

Please don’t misunderstand me. I don’t believe I’m representative of humanity in general, even considering the well-documented tendency all people have to consider themselves normative. Rather, I only wonder why, when the preponderance of evidence weighs so heavily on one side, and the known history of the debate is weird at best, anybody who reads would continue consuming something known to cause harm. I suggest the alternative simply feels worse.

In my case, having been pressured by people I considered trustworthy to consider a daily multivitamin regimen, I purchased an inexpensive supplement, gave it a try—and immediately felt better. Not in some abstract psychological sense, either. (Warning: grossness follows.) The second day of the regimen, I passed a massive bowel movement. Also the third day, fourth day, fifth day… massive, soul-shakingly cleansing bowel movements, every day for over two weeks.


Whatever revolting toxins my body had been stockpiling, the vitamin apparently helped purge. The improvement was immediate. I had energy to begin a moderate exercise program, spent more time engaged in creative activities and less watching television, and was more productive at work. With that waste gone, I had more energy, better moods, and greater mental acuity. Simply put, I took vitamins and felt well.

Rationally speaking, I know that consequence accrued from the supplement’s probiotic content. Having eaten a diet rich in starches, sugar, and fat, and light on fiber, green vegetables, and roughage, my intestinal flora was unbalanced and weak. The probiotics, which I probably needed in small doses for short times, pushed long-held waste from my body. From there, I should’ve adjusted my diet, embraced healthier living, and moved on.

But we’re all human, subject to the same anchoring biases and post hoc reasoning as anybody. I took multivitamins and felt well; therefore, I reasoned, the multivitamins caused my discernible improvement. Intellectually speaking, I know they didn’t. Having restored my health by simple, brief interventions, I had an opportunity to adjust my lifestyle for improved health. Instead, I latched onto the one visible change that preceded my palpable bodily improvement.

In today’s fast-paced world, most Americans eat badly. Nine out of every ten Americans don’t get their complete nutritional needs from food alone, a popular advertising campaign warns, before urging us to purchase supplements. The advertisers never question the common American diet of restaurants and processed foods, overloaded with red meat, sugar, and heavy starch. If Americans can’t get their nutrition from food, maybe the problem isn’t us, it’s our food.



Admittedly, we face complex restrictions. Our work lives are increasingly performed indoors, seated, without needed exercise, sun, and human companionship. Sweet, fatty foods fill the resulting psychological hole… deeply, but briefly. Busy two- and three-income families have little time for home cooking. And the US Recommended Daily Allowance (USRDA), little changed in two generations, measures the nutrients needed to stave off deficiency, not to thrive abundantly.

Nevertheless, multivitamins, like drugs, alcohol, and television, provide the promise that we’ll feel better, healthier, restored. People embrace fad diets, like gluten-free or macrobiotic, without considering their individual health needs, because they hope to feel better. Some do: many people going gluten-free experience rapid weight loss. This happens mainly because they stop eating processed foods, but, as with my multivitamins, they credit whatever immediately preceded their improvement.

Maybe, faced with massive dietary shortfalls and unsupportable lifestyles, Americans should consider the forces making them feel bleak. Reaching for superficial solutions feels good, but does nothing. And multiple magazine articles scolding vitamin buyers have produced little effect. The problems underlying lopsided lifestyles, like basic poverty and little autonomy, loom so large, they’ve become invisible. If vitamin buyers hope to feel good, let’s investigate why they feel bad in the first place.

Wednesday, June 1, 2016

Wild Weeds and the End of the City

David Seiter with Future Green Studio, Spontaneous Urban Plants: Weeds In NYC

The very name “weed” implies a plant doesn’t belong. We use it to describe dandelions in a manicured lawn, or crabgrass in a tomato patch, or sunflowers in somebody’s cornfield. We rip them up by their roots, spray them with noxious chemicals, and otherwise seek to destroy them whenever they appear. But what if our conception of weeds is all wrong? What if weeds represent the future of nature, and hope for our changing cities?

“Spontaneous Urban Plants” began as an Instagram hashtag specifically spotlighting New York’s thriving untended gardens. Red Hook-based landscape designer David Seiter became interested in how plants flourish without human attention, even despite human opposition, in the midst of humankind’s vast built environments. This book mixes the best elements of policy manifesto, wildlife identification guide, and coffee table art book, for a product that could have implications, and consequences, far beyond Seiter’s Brooklyn home base.

Despite being one of Earth’s most densely populated places, Seiter writes in his introduction, New York dedicates approximately one-fifth of its land area to parks, greenbelt, and other protected nature. (Compare Shanghai’s two percent.) But nature doesn’t remain politely confined to cultivated spaces. Sidewalk cracks, untended lawns, abandoned factories, and other disturbed soils provide sustenance for massive arrays of plant life. New York’s wild and untended plants, under Seiter’s camera, have a lush, edenic abundance.

Aided by his Future Green Studio collective, Seiter has made a dedicated study of the way plants refuse to obey human limitations. He makes a persuasive argument that, for many city children, untended weeds are the first, sometimes the only, nature they’ll ever see. Goose grass and pokeweed may annoy urbanites who believe sidewalks should remain flat, grey and lifeless. But many city kids discover nature through weed growth—and discover how nature remains uncontrollable.

But this isn’t some romantic paean to nature’s abhorrence of vacuums. Weeds aren’t only good for their own sake; Seiter argues that unintended weeds contribute materially to city life. Some control stormwater runoff and prevent soil erosion. Others make good food for wildlife, and even for humans. Some weeds have medicinal properties. Some weeds even restore soil seemingly irredeemably damaged by human callousness. Thriving urban weeds have measurable human benefits beyond their superficial annoyance.

Ailanthus altissima (Tree of Heaven) in its common urban environment
But to receive these benefits, we must reëvaluate what makes plants weeds. Many landscape designers have a romantic attachment to “native” species, and expunge plants imported globally, like red clover or Tree of Heaven. But Seiter contends, with substantial evidence, that many “native” plants are ill-suited for urban environments. Frequently, plants categorized as “invasive” and treated with massive sprayings of Roundup (a chemical so dangerous, it requires hazardous waste disposal) are better-suited for city soils.

Following his reasonably brief, but informationally dense, introduction, Seiter transitions into two-page spreads on individual plant species. Each spread includes three photographs of a common weed in its urban habitat, with detailed descriptions of its foremost benefits and liabilities, identifying characteristics, and most common growing conditions. The descriptions cover the ecological benefits individual plants provide, and their human uses, like edibility or medicinal value. Seiter provides an engaging introduction for urban foragers, amateur botanists, and others.

He also makes a charming art book. He takes pains to show plants in their most beautiful conditions, though those conditions aren’t always pretty. When he shows us Virginia pepperweed growing through a sidewalk pothole, flowering silk trees reaching long branches over a concrete wall topped with razor wire, or Queen Anne’s Lace in the shadow of a Coney Island roller coaster, we understand nature and man-made space exist in tension. Hint: nature always wins.

Seiter isn’t naïve regarding nature. He realizes many plants aren’t always beneficial. He acknowledges when weeds provide what he calls “ecological disservice,” like spreading allergens or choking out other plants. For most weeds, this means acknowledging a mixed nature: milkweed can quickly overtake urban meadows, but it also provides the only sustaining food for monarch butterfly larvae. Some plants aren’t mixed. Seiter has an entire chapter on plants like ragweed, which just need uprooting.

This book focuses on plants of New York specifically, and the American Northeast more generally. Don’t use this book to forage edible weeds in Minneapolis or Fresno. But it isn’t just about one narrow area. Seiter exhorts readers, regardless of their place, to reconsider their relationship with weeds, and with untended urban nature. Because humankind, and our spaces, don’t exist alone. We’re part of nature, and someday, if we’re lucky, we’ll return whence we came.