Monday, October 30, 2017

How Often Should You (Yes You) Shower?


The tease on The Atlantic’s Facebook page was simultaneously tempting and disgusting: “What Happens When You Quit Showering?” It’s just one more such story to trickle out through the years, alongside Chip Bergh, CEO of Levi Strauss & Company, telling people never to wash their jeans, or Buzzfeed citing that beloved go-to, “science,” to tell Americans we shower too often. A real cottage industry has developed recently selling the idea that we just don’t need to bathe frequently.

I understand the mindset behind this thinking. Where Americans once bathed about weekly, unless they carried an absolute reek, the marketers behind products like Ivory soap and Listerine mouthwash have convinced us our bodies are petri dishes of disgusting microorganisms, and we need to confront BO and “halitosis” daily, if not more. Nor is it just Americans. Per The Atlantic again, the global average includes a shower or bath daily, and a shampoo every other day.

Besides the health concerns of ordinary people washing necessary, symbiotic organisms off their bodies, this anti-bathing trend also reflects a pushback against advertisers who profit from our insecurities. I certainly don’t mind seeing corporate profiteers getting a firm public comeuppance for peddling self-loathing to the public. But I spot an unexamined class-based assumption in these articles. They implicitly believe typical Americans don’t get dirty and sweaty enough to even require that much bathing and laundry.

Since I reluctantly accepted my move down society’s economic ladder, and got a blue-collar job, I’ve come to understand that sweat and dirt delineate American social class. When I worked in the factory, indoor temperature often approached eighty degrees Fahrenheit even in the dead of winter; in summer, temperatures nearing a hundred degrees weren’t uncommon. And that’s nothing beside conditions working construction, where we’re expected to work in blazing summer sun and Arctic winter chill.


If I don’t shower daily, I smell like somebody left bologna on the kitchen counter for about a week. Sweat pours off me in quantities you’d not believe without seeing it. Many joggers, cyclists, or other high-impact aerobic workout fanatics think they sweat; I certainly did when I worked white-collar and biked regularly. But I didn’t discover what sweat meant until it bucketed off me for eight hours or longer like wringing a kitchen sponge.

Showering isn’t optional for people who work like that. Since humans expel nearly a third of our bodily waste through our skin, the exceptionally moist environment of a sweaty construction worker creates a thriving breeding ground for microorganisms that, yes, produce an odor. Many of these microbes are necessary for human health… in limited quantities. But my sweat-soaked body creates conditions where my otherwise healthful, necessary microbiome makes me stinky, and could unbalance my health.

And that’s saying nothing about other class markers. I remember watching Star Trek years ago; Dr. Crusher complained to some hirsute crewmates that, since the invention of the razor, having facial hair was an affectation. I wondered then why the hair growing naturally from men’s faces was affected, while barbering it daily was not. Now I realize something deeper: facial hair keeps hostile conditions, like wind and precipitation, off my skin. Shaving jeopardizes my safety.

Considering that Levi Strauss invented his heavyweight dungarees specifically to withstand the punishment gold miners put their britches through, Chip Bergh’s whine about washing your jeans seems remarkably obtuse. My jeans get battered, torn, grease-stained, and worse, daily. If, atop that, I had to slip them on without thoroughly washing my microbiome off (remember what we said about bodily waste above), that would be like wearing the same tattered, shit-stained BVDs, every day, for years.


So when The Atlantic and Buzzfeed insist I should stop showering, they clearly think people like me don’t read their content. They occupy a rarefied world where pundits in suits scold one another about internalizing commercial hype and selling their self-esteem to our corporate overlords. If they realize people like me exist, they certainly don’t give us further thought. They accept the hierarchy, where building structures and making stuff is too menial for their audience.

Maybe bankers and attorneys could afford showering every two or three days. Many probably should. But audiences are diverse, and many haven’t achieved the lofty standards high-gloss magazines promised us in college. These one-size-fits-all hygiene tips innately assume wealth, and indoor work, are normal. And they exclude the people who build their offices, maintain their server farms, and cook their cafeteria dinners. People like, well, me. Do I need to explain why that’s a problem?

Friday, October 27, 2017

If Time Doesn't Exist, Why Am I Always Late For Dinner?

1001 Books To Read Before Your Kindle Battery Dies, Part 83
James Gleick, Time Travel: A History


When H.G. Wells first floated the idea of a time machine (a term he invented) in 1895, it received fierce opposition from philosophers, scientists, and book critics. For an idea so well-fixed in popular consciousness that TV networks use it for everything from high drama to low comedy, the idea of critics rejecting time travel seems weird today. Yet highbrow scholars vehemently rejected the idea, even as the novel flew off British bookstore shelves.

Given this title, you might expect science historian James Gleick to write about time travel in popular culture. I certainly did. Instead, we get a gripping overview of how the human relationship with time has evolved across the last 120 or so years. Theologians, physicists, novelists, and ordinary people have struggled to understand what time means, and come to grips with whether it’s malleable. We accept the idea that time is travelable now… sort of.

Our metaphors for understanding time, the most popular being time as river or arrow, always disintegrate when pushed. If time flows like water, why can we not watch from the shore? Why does time’s arrow never hit its target? Despite our best efforts, the ways we understand time always imply its absence. Given the implications of time as reversible and optional, the idea of time travel isn’t illogical; one wonders why nobody invented it sooner.

Problems with understanding time are hardly new. Gleick quotes sources as venerable as Saint Augustine insisting they understood time’s existence, but couldn’t conceive a working definition. Wells quoted a then-developing hypothesis that time (“duration”) was the fourth dimension of space, an idea now entrenched in pop-cultural philosophy; but that idea doesn’t withstand scrutiny. We experience time, but unlike space, cannot rearrange it. We cannot escape time, or resequence time as we reorganize our living rooms.

James Gleick
Philosophers have struggled to define time, too. Besides Plato and Augustine, Gleick quotes extensively from Henri Bergson, whose path-breaking postulations on time… um… I didn’t quite follow. What matters, though, is that Bergson’s ideas gained critical acclaim and sparked new ideas in fellow philosophers between the world wars, then fell into disfavor almost overnight, because they proved less than useful. This theme, of the gap between structural soundness and actual utility, recurs consistently throughout Gleick’s book.

Religion, philosophy’s close cousin, has a difficult history with time. God or the gods, we’re assured, are eternal—that is, they exist entirely outside time. They aren’t forever, since time implies change and, frequently, decay; rather, God stands outside time looking in. How, though, can a personal and loving God, invested in our ever-changing lives, stand outside change? Theologians have devised several work-arounds, but have most often simply ignored the question. Because time always creates paradoxes.

Most importantly, time has proven fraught for science. Newtonian physics absolutely relies upon time to measure motion… but it uses motion to measure time. The circular nature deprives science of necessary constants. Quantum physics finesses this by denying time’s objective existence. Yet time, as a concept, has proven remarkably resilient. A growing minority of physicists are rebelling, insisting time exists, and their reasoning is so fascinating, I’ll let Gleick have the satisfaction of telling you.

So basically, time remains so close at hand that we cannot understand it objectively. We really only consider its weight when someone suggests escaping its pull. Gleick uses pop culture time travel, a distinctly Twentieth Century phenomenon, to understand the larger zeitgeist about time. For him, Doctor Who and Doc Brown aren’t interesting in themselves; he considers these a bellwether for others’ workaday understanding of time. We refine our understanding, by moving outside the question.

Gleick spends only two chapters (and the odd throwaway digression) on time travel fiction. The stories he explores favor philosophical concepts. Though he makes multiple passing allusions to Doctor Who, the only episode he explores deeply is “Blink,” the story that gave us the catchphrase “wibbly-wobbly, timey-wimey” to explain time’s behavior when viewed from outside. This encapsulates the idea that somehow, time exists, without being constant or reliable. That’s the stage of understanding we’ve achieved.

Time travel arose, essentially, when Western philosophy grew disillusioned with constants. If nothing is forever, then “forever” doesn’t exist. We step outside time, looking for realities we prefer: some cling to disappearing pasts, while others attempt to hasten longed-for futures. Time travel becomes one more among the metaphors which describe truths we glimpse, but cannot explain. James Gleick does a remarkable job explaining this evolution. Something like time exists; now we have to travel it.

Wednesday, October 25, 2017

No, Not #MeToo

The #MeToo hashtag began because the truth
about Harvey Weinstein flooded out.
A few weeks ago, when the #MeToo hashtag drew attention to the widespread sexual violence women suffer regularly in Western culture, I did something appallingly crude: I posted #MeToo myself. Because, yes, it has happened to me. Though I’ve never suffered out-and-out violence, nor had it happen consistently over my lifetime, I have known the degradation of having a stranger’s hands on my body while lewd and humiliating language flew.

I call my action “appallingly crude” not to pretend what happened to me isn’t serious; nobody should endure such treatment, even if it’s just one-off. Rather, I feel shame at hijacking this message because I engaged in bandwagon behavior and tried to make it all about me. Yes, I’ve had strangers make me feel ashamed of having a body… twice. I can count the number of times it’s happened without removing a mitten. And that’s the point. For me, it’s exceptional.

As I’ve spoken to women in recent weeks, I’ve come to understand that, for most of one entire half of the human species, it happens so often, so persistently, that they can’t keep count. I’ve recently spoken to waitresses whose customers think the fact they’re paying for food gives them permission to paw the hired help. Wives who meet strangers that consider wedding vows optional. Daughters whose fathers think there’s any way whatsoever they can comment on their girls’ growing bodies that doesn’t cause shame.

The #MeToo tag surged into public consciousness following Harvey Weinstein’s public meltdown in early October. It burned hot and fast… and, like most social media trends, it faded just as quickly behind some celebrity stunt or presidential cock-up. Though it still attracts some comment from high-minded publishers like The Atlantic or The Washington Post, it mostly burned out after about a week.

But during its heated run, it got remarkably wide coverage. Yes, the usual bastions of po-faced liberalism, like The Daily Beast and The Huffington Post, shared their stern disapproval of how men handle being dudes. I was most shocked when right-wing columnist Cal Thomas, who formerly served as Jerry Falwell’s press flack, wrote movingly about his own family members’ experiences being subject to constant, unrelenting sexual pressures.

Sadly, the brevity required by tweets and Facebook status updates means stories get stripped of detail. In 140 characters, sexual harassment and assault, even rape, happen largely in the passive voice. These things happen to women; they aren’t performed by men. In this construction, I (or someone who looks like me) didn’t pressure a woman for sex, make degrading comments, or coerce sex. It just happened, like the weather.

Many women have written about the #MeToo
experience recently; Jennifer Lawrence is
probably the most famous to do so.
The worst part, I’ve come to realize, is the fatuousness of men like me, men who claim to be progressive and fair-minded, joining the bandwagon. Desperate to prove the pervasiveness of sexually motivated bigotry, we commandeer women’s experiences and tell our uniquely male stories. Because hey, a stranger groped me in a public restroom when I was seventeen, so clearly, I understand the feeling of getting leered at daily from birth to the grave.

I fear this arrogance, a heavily (but not exclusively) male phenomenon, may partly doom this hashtag campaign to early obsolescence. Like “All Lives Matter,” the male #MeToo shows a fundamental inability to recognize that society’s fundamental fault lines aren’t distributed randomly. Some people literally get treated so badly in the casual violence sweepstakes that their abuse tilts the tables. My mistreatment was degrading, but it was individual, not systemic.

Because hey, I’ve engaged in the kind of barroom banter where a table-full of men, trying to out-dude one another, get increasingly crude and demanding with waitresses and bartenders. I’ve reached through a woman’s personal space bubble and put my hands on her body, thinking I was being flirty, when I was just ignoring her boundaries. I’ve perused the kind of online porn that profits from degrading women for male gratification.

Rather than taking the opportunity to see women holding a mirror to my violently inappropriate behavior, I wanted to claim victim status too. I wanted to be martyred for the cause. Maybe this doesn’t make me evil, since in embracing the hashtag, I at least recognized the awfulness of that behavior in general. Rather, my behavior makes me selfish, since I ignored my complicity with the system I condemned.

It’s impossible to know how common sexual violence really is. But I’m with memoirist Mary Karr, who estimates that if you include harassment, the number reaches nearly 100%. And I’m not the victim here. The sooner I realize this, the better we’ll all be.

Monday, October 23, 2017

The System Eventually Eats Its People: the Eric Garner Story

Matt Taibbi, I Can't Breathe: A Killing on Bay Street

On July 17, 2014, plainclothes NYPD officer Daniel Pantaleo applied a banned chokehold to a fat, middle-aged, diabetic street hustler named Eric Garner. Bystander Ramsey Orta’s cellphone video caught Garner wheezing out “I can’t breathe!” eleven times before losing consciousness. These would be Garner’s last words. Garner’s unconscious body lay untended, possibly already dead, for eight minutes, while paramedics parked over a block away, and cameras kept rolling.

Garner’s death electrified the nation. Police killed other African-American men and boys, like Michael Brown and Tamir Rice, often on flimsier pretexts, but Garner’s death had the distinction of being caught on camera with sound, from beginning to end to badly bungled aftermath. This could’ve been the moment that changed American race relations forever… but nothing happened. The reasons nothing happened matter as much as the death itself.

Rolling Stone contributor Matt Taibbi previously wrote The Divide, an investigation into who actually went to jail after the financial collapse, and for what. His conclusions were stark. He takes a similar approach here to exactly one case, Eric Garner’s controversial death. He describes a Staten Island so economically and racially divided that residents will do anything, however unlawful, to get paid… and cops will do anything, however violent, to keep order.

As Taibbi shows early, Garner was no angel. He’d previously done time for crack distribution, then got into untaxed cigarettes because he became too aged and infirm for the cocaine business. He established a remarkably sophisticated network of buyers nabbing cigarettes by the trunkful in Virginia, which has America’s lowest tobacco tax. NYC’s nicotine tax was so high, Garner could charge a 100% markup and still clear a robust profit.

Matt Taibbi
New York had America’s highest cigarette tax, largely because mayor Michael Bloomberg made a campaign promise not to raise income taxes. But like all sales taxes, cigarette taxes hurt most those who can least afford the expense. Saying poor people should quit is unfair, not only because it implies poor people shouldn’t have nice things, but because nicotine, with its soothing anti-anxiety properties, makes life tolerable for workers living precariously.

So yes, Garner was a street-corner Escobar, breaking the law in broad daylight. But the NYPD targeted him out of all proportion to the money he cost the city, or the disorder he actually caused. Garner got caught in a campaign to disproportionately target black and brown communities, assuming that darker-hued neighborhoods innately caused crime. This isn’t hypothetical, either; internal NYPD whistleblowers caught commanders, on tape, ordering racially targeted sweeps.

Taibbi goes into the history of Broken Windows policing, which arose from Sixties-era childrearing theories, before being adopted by Rudy Giuliani in the 1990s. Even the theory’s originator admitted, before it became policy, that Broken Windows had the capacity for deep abuse. But he trusted elected officials to not break public trust and misuse their authority. In interviews with Taibbi, that originator now admits he naïvely trusted bad people.

Police are hardly villains, though. Their harsh responses to penny-ante crime reflect top-down enforcement of technocratic rules. City managers demanded police keep crime down by making multiple arrests and completing laborious paperwork. Civilian complaints led to added layers of Big Data interference. Many cops get into law enforcement because they believe they’ll make neighborhoods safer for families and children. But, conscripted into punitive bureaucracy, youthful ideals dwindle in a hurry.

Taibbi’s first half focuses tightly on the friction between police, represented by Daniel Pantaleo, and communities, represented by Eric Garner. Following Garner’s death, Taibbi’s second half becomes more diffuse, reflecting the decentralized public response. The story is further complicated by the split between local and national reactions, and by the fact that Garner’s death coincided with Michael Brown’s in Ferguson, Missouri. Thanks to internet video, a local story became a national crisis.

Newly minted mayor Bill de Blasio campaigned on a promise to reverse Broken Windows policing. But he appointed its chief architect as police chief. Protesters considered de Blasio too cozy with police, while police thought him too conciliatory with protesters. Massive, leaderless demonstrations gained national support, then lost it overnight when one march turned into an attack on police. Frictions only became worse, the adversarial relationship between police and city more ingrained.

Taibbi paints a heartbreaking picture. Though his sympathy, measured in column inches, clearly lies with community members, the police he interviews appear dedicated, misunderstood, and yoked to an administration that treats them badly. Protesters demanded change, but discovered that the system only exists to protect itself, as it always has. This isn’t easy reading. But in today’s divided society, it’s very much necessary.

Friday, October 20, 2017

That Beatles Parody You Didn't Know You Needed

1001 Movies To Watch Before Your Netflix Subscription Dies, Part 22
Eric Idle (writer/director), The Rutles: All You Need Is Cash


1001 Albums To Hear Before Your iPod Battery Dies, Part Nine
The Rutles, The Rutles


Sometime in the early 1960s, a mop-topped quartet of British musicians took the world by storm. No, not that one. This quartet gained international fame almost overnight, fame for which they proved supremely unprepared. The Rutles, so-named because they began as a one-off sketch on Eric Idle’s show Rutland Weekend Television, hit so close to the Beatles’ actual history that Paul and Ringo supposedly couldn’t watch the finished show.

Eric Idle has a history of weak, uninspiring choices following his Monty Python years. But this one choice probably rescued his name from premature anonymity. Teaming with Neil Innes, who wrote some of Monty Python’s funniest musical segments; Saturday Night Live producer Lorne Michaels; and a selection of top-quality British session musicians, Idle managed to create a band that both honored the Beatles, and challenged Beatlemania’s continuing cult-like adoration.

Emerging from the Cavern Rutland, the band finds an unlikely champion in a middle-aged tradesman who doesn’t understand music at all. A series of ham-handed business arrangements makes the Rutles a lucrative proposition for record producers, merchandisers, and filmmakers—but the Rutles themselves get ripped off, seeing tiny percentages of the money made off their names. It doesn’t take long before drugs and infighting threaten to overtake the band.

The parallels with the actual Beatles are more than slight. The sudden rise, global popularity, and massive flame-out mirror the Beatles’ trajectory point-for-point. Ringo Starr reported having difficulty watching the finished mockumentary, which hit too close to home, and Paul McCartney had a frosty response. John Lennon, however, called it hilarious, and George Harrison contributed to the production, even appearing onscreen. (The next year, Harrison co-produced Life of Brian.)

Neil Innes’ compositions, most supposedly written during a two-week hot streak in 1977, sound so close to the Beatles, they scarcely count as parody. Early tracks like “Goose-Step Mama” and “Hold My Hand” mimic the Beatles’ early, American-influenced rock-and-rollers. Later tracks venture into nostalgia with “Doubleback Alley,” psychedelia on “Piggy in the Middle,” and rootless anger on “Get Up and Go.” The soundtrack plays like an unironic Beatles retrospective.

This earnest, ambitious musical texture, available as a separate album for those who appreciate its artistry, contrasts with Idle’s glib tone. Idle, who plays both a Rutle and the video host, guides viewers through the Rutles’ tumultuous arc, which we watch with pained awareness of where everything will end. Though Rob Reiner’s This Is Spinal Tap is often credited with starting the “mockumentary” fad, Idle pioneered the format six years prior.

Idle’s characters show glib self-awareness, often speaking directly into the camera: they know they’re in a documentary, and probably know where they’re headed. Interviews with the Rutles’ purported contemporaries, including Mick Jagger and Paul Simon, indicate a deep appreciation of the band’s art, but also an awareness that the group was ultimately doomed. With a “knew-it-all-along” shrug, witnesses describe a ship setting sail with its decks already on fire.

The Rutles, from left: Neil Innes, Ricky Fataar, Eric Idle, and John Halsey

Of the actors playing the Rutles, only Idle (who lip-synchs his vocals) and Innes have significant speaking lines. The other band members, bassist Ricky Fataar and drummer John Halsey, speak little; they were hired primarily as musicians. Fataar cut two albums and toured extensively with the Beach Boys, while Halsey was a regular session musician for Lou Reed, Joe Cocker, and Joan Armatrading. Their musical bona fides are unimpeachable.

As stated above, the audience already understands where the Rutles’ trajectory is headed. While happy lyrics and playfully inventive composition keep Rutlemania fans distracted, the band’s internal dissensions become increasingly visible. As they work less closely, the band’s art starts suffering, and they begin displaying embarrassing, sprawling pseudo-creativity. It becomes clear the band members need one another, but can’t stand each other.

Eventually, we already know, the band splinters. Some members return to the anonymity from which they originated, while others keep trying to produce art, but remain haunted by their past. Asked directly whether the Rutles will ever get back together, Mick Jagger, looking like a man caught with his pants around his ankles, gasps: “I hope not.” So do we, because they’re worth more as a memory than a living force.

Idle and Innes, plus part-time contributors George Harrison and Michael Palin, infuse the Rutles story with fast, Python-esque humor. But it’s the comedy of a perfectly choreographed train wreck. We almost feel guilty taking pleasure in watching the Rutles self-destruct. Yet the Rutles’ tragedy is so woven into our cultural consciousness, we need that laughter, just to understand the depths of our own pain.

Wednesday, October 18, 2017

We're All Just Szechuan Sauce Now

Before the Great Szechuan Sauce Meltdown of 2017, I hadn’t thought about Adult Swim’s breakout hit Rick and Morty for over a year. Having witnessed the Internet sensation around the series, I wanted to understand the fervor it engendered. So I watched two episodes, concluded this series wasn’t for me, and didn’t think about it again. Clearly I overlooked something important, because the Szechuan Sauce debacle chills my bones.

For those playing the home game, the third-season Rick and Morty opener included an extended gag pleading for McDonald's to bring back “Szechuan”-flavored McNugget dipping sauce, a short-lived promotional item from 1998. McDonald’s, without consulting the production house, went along with a one-day gimmick resurrection. But they didn’t plan appropriately, distributed sauce haphazardly, and fans were disappointed. Worse, many were outraged. Some fan protests turned into mini-riots.

It’s difficult to calibrate just how imbecilic this debacle really is. Multiple sources have published open-source sauce recipes, most of which have been around since the sauce first gained admirers nineteen years ago. But for fans, the sauce as comestible doesn’t really matter; geek culture is clearly about shared experiences—and the experience of getting Rick Sanchez’s favorite dipping sauce from his favorite restaurant matters more to show fans than the food.

Never mind that audiences understand Rick Sanchez is supposed to be an obnoxious human being. Never mind that, in pursuit of his appetites, he broke up his daughter’s marriage in that episode, and prior episodes have included murder and implications of incest. The series’ anti-hero wanted a specific experience, and fans wanted to share that experience. And McDonald’s failed to anticipate the zeal fans bring to sharing his goals.

Unfortunately, McDonald’s apparently doesn’t understand new manifestations of fan culture. Until this week, McDonald’s, like me, thought fan culture still revolved around small handfuls of geeks wearing Star Trek uniforms or Jedi robes, hand-distributing mimeographed fanzines which only a handful of friends would probably ever see. I've personally excused revolting fan behavior in the past, largely because my belief in fan culture hadn’t much evolved since 1991.

But those days are gone. The Internet now permits fans to organize without regard for geography. No more do fans need to organize conventions in hotel ballrooms in hopes of meeting fellow Trekkers beyond those they attended high school with; I could meet Trekkers in London, Sydney, Trinidad, and the Ross Ice Shelf by logging on. The capacity for unified fan action has never been greater… which is awesome, for essentially benign fandoms.

But we’re not talking about Star Trek, or its uplifting humanist values, anymore. The spread of organized fan culture has coincided with some pretty terrible fan objects. Some of fandom’s most influential properties, like Mad Men, Arrested Development, and Game of Thrones, star characters who exist specifically to be revolting, amoral throwbacks. But fans don’t make that distinction. So we get nerdboys deliberately mimicking Rick Sanchez.



Rick and Morty’s creators have tried to distance themselves from their fandom, especially as the primarily male base has reacted violently to hiring women writers for Season Three. But that’s too little, too late. The entire show exists to spotlight a sociopath whose sense of entitlement overwhelms things like family, common decency, and human life. And it’s attracted a fandom which, to a shocking degree, shares these values.

McDonald’s has learned that it ignores fan dynamics at its peril. Entitled fans screamed abuse at minimum-wage workers, disrupted business, and demanded their wants be treated as sacrosanct. I could understand such disruptions for noble ends, for instance, protesting that a corporation which claims billions of dollars’ profit every quarter insists it can’t afford giving front-line workers a modest raise. But this was idiot man-children throwing a tantrum over a cartoon.

This isn’t a McDonald’s problem, friends. Recent accusations of sociopathic behavior from superstars like Harvey Weinstein and Ben Affleck have gotten papered over to keep money rolling in, while controversies haven’t stuck to Woody Allen and Roman Polanski. Because hey, the fan base prints money. And this same appeal to sociopathy has given powerful, culture-defining careers to Roger Ailes, Bill “Falafel” O’Reilly, and Donald Trump.

McDonald’s will shrug this controversy off. As the only ready source of cheap, ready-to-eat food in many poor neighborhoods and overbuilt suburbs, its principal customers can’t afford a meaningful boycott. But the images of screaming, entitled fans will remain an American shame for years. Because this fandom didn’t just happen; it was cultivated for profit. And fandoms like it will arise as long as there’re man-babies with money.

Friday, October 13, 2017

Maybe the Problem Is Just Men Having Power

Harvey Weinstein
Hollywood greasebag Harvey Weinstein’s descent into pariah status has happened with haste I never expected. It took months for Bill Cosby’s rape accusations to gain sticking power, and he even headlined a successful tour while accusations kept dribbling out. How people feel about Bill Clinton, even after DNA evidence, still largely breaks along party lines. Malcolm Forbes and Jimmy Savile didn’t even get seriously accused until they were dead.

This happens so consistently, though, that we should contemplate the moral. We keep discovering powerful men with their trousers around their ankles. This may mean literally, as with former House Speaker Dennis Hastert, or figuratively, like JP Morgan Chase CEO Jamie Dimon. Either way, we face a discomforting reality: men with egos big enough to pursue and achieve global power have egos big enough to consider themselves immune from consequences.

Weinstein’s described behavior should sound familiar to people who follow these issues. Like Malcolm Forbes, he greeted targets wearing only a bathrobe, or less, and when his targets refused his advances, he’d masturbate, or otherwise gratify himself, in front of them. Like James Woods, he evidently approached very young women with grandiose offers in exchange for favors. Like Joss Whedon, he did this while publicly ballyhooing his progressive credentials.

In fact, the described behavior is so similar that, like medieval witch hunts, I’d almost believe the accusers were jumping on public hysteria and repeating claims they’d already heard from others. Except that we keep seeing the same script play out from the accused. They, or a handful of paid shills, deny the accusations and disparage the accusers. They throw themselves on the mercy of the courts. Then, they get convicted.

We’re still so early in the Weinstein scandal that we’re just seeing the “non-denial denial” stage. That’s when the accused insist they… something. At this stage, Bill Cosby simply went quiet, refusing to confirm or deny anything. Donald Trump issued a statement insisting that his recorded boasts don’t really reflect his identity. Bill Clinton took the unusual step of out-and-out lying. The effect is identical, however: “It’s not my fault!”

There’s also the attempt to paint oneself as the victim. Weinstein has issued a statement complaining that his wife and children left him, while his board fired him from the company bearing his name. Sob. Donald Trump mustered several of Bill Clinton’s accusers to redirect his story onto “crooked Hillary.” Roman Polanski fled the country and made several award-winning films to distract Americans from his rape confession.

Often, but not always, the accused gets found guilty. After DNA proved the stain on Monica’s dress really came from Bill Clinton’s peter, Clinton admitted his lies, but evaded impeachment, retired at the peak of his poll numbers, and made a cushy bankroll on corporate speaking engagements. Marv Albert pled to a lesser charge to avoid a trial. Mike Tyson did three years on a six-year sentence.

But too often, the accused skate. Sometimes they should; accusations against Tucker Carlson, Jerry Lawler, and Kobe Bryant were deemed baseless. But Michael Jackson stood trial twice without a conviction, and R. Kelly pushed procedural options so far that his ultimate trial became tragicomic, with a pre-written conclusion. And Woody Allen, Errol Flynn, and Al Gore? Hell, they just skated. It’s hard to prove sexual crimes, especially against famous people.

Any individual accused of sex crimes, of course, represents only himself. There’s no magic individual who represents the entire male population, even that male subset comprising the famous, wealthy, and powerful. No stink of sexual impropriety ever clung to Barack Obama or George W. Bush. And the occasional woman has been accused (Britney Spears). So it’s wrong to draw hasty conclusions, or assume all rich, powerful men are guilty.

However, after enough accusations, the pattern becomes visible. Men who grow accustomed to thinking of themselves as bigger than the general rabble, who believe their impulses more worthy of satisfaction, will eventually believe themselves bulletproof. Harvey Weinstein has been in the media production business for forty-eight years, and evidently considered himself a kingmaker. Maybe he started to believe that “divine right of kings” bullshit.

Plato wrote, over two millennia ago, that those most eager to achieve power deserve it least. This applies in politics, finance, or pop culture. The young, hungry Harvey Weinstein may have produced decades of culture-defining hits; but accusations of impropriety now go back two decades, to when he became an institution. Maybe we need a statute of limitations on power. Maybe we need more women.

Wednesday, October 11, 2017

Little Pieces of America All Around Us

Yeah? What America is that?
I really, really like Creedence Clearwater Revival. But the reason why is pretty embarrassing: when, at sixteen, I rebelled against my parents’ popular culture, as sixteen-year-olds do, I wasn’t ready to embrace Nirvana and Pearl Jam like my peers. I feared getting into anything “new,” and getting left behind, like my friends who’d previously enjoyed Nu Shooz or Duran Duran. Novelty was risky; old stuff came pre-screened. So I started listening to the oldies station.

As half-hearted rebellions go, mine probably seems mild. Given the recent popularity of steampunk, crypto-fascism, and hipsters dressed as Canadian loggers, digging the rock’n’roll of a prior generation isn’t that bad. Except, I’ve increasingly realized, I didn’t really embrace that generation’s vision. Any listen through Casey Kasem’s back catalog reveals that American Top Forty radio has long been dominated by tedious music, driven by labels and producers who manipulate, rather than listen to, the market.

So yeah, I understand the impulse driving people made uncomfortable by today’s cultural divides. I witness friends, people I like and trust, embracing the “Make America Great Again” motto, creating excuses for everyone from Bill Cosby to Peter Cvjetanovic, and calling anything that doesn’t support their power structure “fake news.” Meanwhile, the political party that represents organized progressives offered voters a choice, in the last presidential primary, between nostalgia for the 1990s or the 1950s.

This massive aversion to risk comes at a time when America’s structure is already changing. Our demographics are in motion, as immigration from Asia, Latin America, and the Middle East gives this country an increasingly brown complexion. Our range of media options continues to increase as the carrying capacity of TV and Internet sources improves, and we’re drowning in new ideas. Even commerce has become chaotic, with the hectic panoply of chains and online retailers.

Naturally, a large fraction of Americans retreat into what’s comfortable. Whether that means pining for a sunlit Norman Rockwell townscape, or voting for the candidate who promises to restore what we consider our glory days, or listening to “Bad Moon Rising” with the volume at eleven, we’re seeing the same motivation. People intimidated by change, which happens faster now than we can (or choose to) manage, naturally retreat into their favorite version of the past.

I understand this impulse, but I fear it, too. Back in the 1980s, when I began paying attention to social issues, I remember people already complaining that suburban sprawl, with its lack of shared common spaces like parks and downtowns, created vast “communities” bound together only by geographical proximity. Residents sorted themselves into real communities by their workplaces, churches, watering holes, and their children’s schools. Ideas, like people, became unofficially segregated in our diverse America.

Today’s media landscape sees that segregation happening even more quickly. We watch Fox News or MSNBC and have our favorite prejudices ratified by well-coiffed pundits, and equally importantly, we see our ideological challengers reduced to manageable caricatures. We choose our radio stations to ensure we hear only what we know we already enjoy, and, as Gretchen Rubin writes, streaming services like Pandora and Spotify actually narrow our exposure. We’ve reduced the avoidance of anything new to a science.

Nor am I immune to this. After resisting new culture for decades, I embraced indie rock when I was pushing forty. But at a recent concert, I realized: this audience is almost as white as the Charlottesville Nazi rally. I could excuse even that as the natural self-sorting nature of crowds, except that I’d driven over 320 miles to see this concert, which I’d heard advertised on an out-of-town radio station I listen to online.

Sadly, I have no ready solutions. I see how aversion to novelty reduces me to a stereotype, the middle-aged white “kid” listening to indie with other honkies. But the alternative is switching my listening habits to locally available radio, which not only bores me, but is overwhelmingly owned by out-of-town corporations famously unresponsive to local needs. I could complain that corporations shattered my community… but I’d have to admit they did it with my assistance.

If America is shattered, as the nostalgia vendors claim, then we have broken it, you and I. We could, as many do, pin responsibility on corporations, or government, or millennials. But that’s just punting the issue down the field. We elect a government, but we lack leaders. We join social networks, but we don’t organize. We look at the little pieces of America all around us and, like good little passive citizens, we do… nothing.

Saturday, October 7, 2017

One Million Ways To Die in the Star Wars Universe

Greg Stones, 99 Stormtroopers Join the Empire



One stormtrooper fails to shoot first.
One stormtrooper doesn’t let the Wookiee win.
One stormtrooper fails Lord Vader for the last time.

Back in 1963, macabre cartoonist Edward Gorey published a storybook for grown-ups called The Gashlycrumb Tinies, in which twenty-six children meet horrible ends. Did you ever wonder how that would look if nerds rewrote it for their favorite franchise? Yeah, me neither. But Greg Stones, author of Zombies Hate Stuff and Sock Monkeys Have Issues, apparently did. And boy am I glad, because this book is funny.

Stones imagines different ways stormtroopers die grisly deaths. Stomped by AT-AT Walkers; frozen in carbonite; fed to the Sarlacc; stationed on Alderaan. The deaths incorporate images from all eight live-action movies, though mostly the original trilogy. Some deaths probably refer to ancillary material I haven’t read yet. All are hilarious in the deadpan delivery of frankly gruesome content that the characters probably hated.


As with Gorey, however, the real life comes from Stones’ illustrations. His flat, cartoonish look contrasts with the three-dimensional, computer-generated style favored in so many picture books these days, a deliberate nod to his adult audience’s nostalgia for their childhood reading. The approach is playful, with oversaturated colors and not-quite realistic proportions (nobody casts a shadow). The stormtroopers are drawn wearing armor from the original trilogy.

Stones’ poker-faced prose, never more than one sentence per page, and childlike folk illustrations, give the gruesome content its ironic comedy. There’s always something hilarious about stating awful things like you’re discussing the weather, especially when you know it’s fiction. Stones’ understatement of the truly awful gives his storybook a Gary Larson-ish tone of gallows hilarity. Who doesn’t love laughing in the face of certain death?

This book is, undoubtedly, part of a marketing push to make Star Wars timely with Disney’s purchase of Lucasfilm. Funny enough, I’m okay with that. Despite the cynical marketing edge, if publishers can release books that bring happiness into customers’ lives, I say let them. Stones’ playfully grim take on Imperial incompetence will give nostalgic grown-ups the boost they need while awaiting the next movie release.

Wednesday, October 4, 2017

On the 500th Anniversary of the Protestant Reformation

Craig Harline, A World Ablaze: The Rise of Martin Luther and the Birth of the Reformation

According to tradition, October 31st, 1517, marks the beginning of the Protestant Reformation. On that day, Brother Martin Luther, Augustinian, nailed his Ninety-Five Theses to the parish church door in Wittenberg, Saxony. Except, Craig Harline warns, it might’ve happened differently: the university beadle often did the actual nailing. Or the nailing might’ve been purely metaphorical, and Luther simply published his theses. Or he might've only mailed them—the supposed nailing wasn't attested until decades later.

Harline, a BYU historian specializing in Renaissance European religious history, assembles a brief, plain-English history of the five years most readily associated with the German Reformation. Though he includes details of Luther’s life and works before this time window, he mainly covers the period from 1517 to 1522. During these years, Luther’s Theses, presented as routine academic disputation, generated unprecedented controversy in Catholic Europe. Luther’s critics turned him into something he never meant to become.

Even many people unfamiliar with Protestant theology know the history: Luther questioned plenary indulgences, writs sold by the Vatican which excused purchasers from earthly punishment for their sins. Except, in 1517, Vatican emissaries were selling indulgences on the claim that they excused buyers from Purgatory. Luther, who’d lived a flagellant life of self-recrimination until he discovered the Apostle Paul’s injunction that “The righteous shall live by faith” (Romans 1:17), claimed this missed the point altogether.

But Harline reveals something even most Lutherans probably don’t realize (I hadn’t): the 1517 indulgence was already deeply unpopular. It didn’t make the money its sponsor, Archbishop Albrecht of Mainz, had expected, because German churches were already oversaturated with divine orders and magic tokens. Germany lost interest in Vatican witchcraft before 1517. Luther inherited a church already discouraged with Italian power-mongering; his gift wasn’t in creating dissension, but in channeling dissension toward theologically sound ends.

Brother Martin Luther, painted
by Lucas Cranach the Elder
By 1517, monarchs in Britain, France, and Spain had already softened Roman authority inside their kingdoms. Rome depended upon German loyalty to finance increasingly grandiose building schemes, and subsidize that great sybarite, Pope Leo X. Luther seized a moment, which could’ve portended international anarchy, and provided philosophical backbone to popular grumblings. Faith, Luther said, makes all people free. However, at the beginning, he had no intention to create a new church or break with Rome.

From there, Harline describes how Luther’s challengers forced him to revise his theology. His disputes with Johann Eck and Cardinal Cajetan, in particular, forced Luther to explore the ramifications of his scriptural exegesis. Challenging papal indulgences, these learned churchmen believed, inevitably challenged the papacy itself. Luther didn’t want this conclusion, but the longer he disputed, the more he realized the inevitable: popes, however sanctified, cannot bring salvation. That lies between God and individual Christians alone.

Unfortunately, then as now, powerful people misinterpreted theology to their own ends. When Luther promised freedom to believers living by faith, some people assumed that meant earthly freedom from rules and expectations. Aristocrats began picking fights with bishops, while citizens began disregarding the law. Radical reformers like Thomas Müntzer encouraged the faithful to rebel against human authority. Luther started events in motion, then struggled to keep them under control, sometimes more successfully than others.

Throughout Harline’s history, one recurrent image dominates: the importance of pamphlets. Many previous theologians had challenged Vatican primacy—the name Jan Hus looms large—but most lacked sufficient reach to carry the discussion. Luther enjoyed the advantages of political turmoil, benevolent patrons, and a conveniently corrupt Pope, certainly; but he couldn’t have parlayed that into a Reformation without the invention of moveable-type printing. The Reformation testifies to the importance of literacy and the portable word.

Harline writes this history with the panache of a novelist, paying particular attention to keeping both the sequences of coinciding events, and history’s inevitable cast of thousands, comprehensible enough for general readers. This includes supplementing the text with historic portraits of the major players and images of the relevant cities, mostly etchings and woodcuts. When the story covers this much territory, and involves this many characters, having authentic period images anchors everything in our minds.

By 1522, with Luther formally excommunicated and Germany splitting along sectarian lines, Luther’s Wittenberg descended into chaos, an outbreak history calls Karlstadt’s Rebellion. Luther returned from exile and, without institutional authority, resumed his pulpit, essentially beginning the native German church. Harline ends there, with a world transformed, and power devolved to regions. This isn’t a full Luther biography, just five years of rapid, world-shattering transformation. Harline delivers a punch as concise as his time frame.

Monday, October 2, 2017

SuperSuit: a Business History of a Non-Linear Business

Reed Tucker, Slugfest: Inside the Epic 50-Year Battle Between Marvel and DC

At a party recently, two fellas got into a heated tangle over Marvel vs. DC. Marvel, one insisted, has grown too snooty living atop the comics sales heap for decades. The other insisted DC was stuck in World War II and hadn’t had a good idea since Eisenhower without pirating it from Marvel. As somebody with no dog in this fight, I found the conflict confusing. But watching just two guys argue kept my focus narrow.

Freelance journalist and sometime radio sidekick Reed Tucker takes a wider view. Spanning the period from Marvel’s launch to the present, he describes the parallel development of two industry titans who latch onto the wonder inside readers and speak to beliefs in justice. Launched in 1961, Marvel dominated the market by 1972, and has ever since. Tucker gets the business right, but something feels missing from his analysis.

After a very brief introduction to DC’s history, Tucker dives into Marvel’s launch and its industry impacts. Marvel started so shoestring that it relied upon DC to distribute its titles. But heroes like the Fantastic Four, who fought among themselves, or Spider-Man, who often couldn’t pay his bills, touched a nerve for teenage readers. DC assumed audiences stopped reading comics around age 12; Marvel caught older kids longing for something meatier.

Marvel’s heroes had complex inner lives that touched Baby Boom readers, while DC’s heroes remained patriotic pin-up characters from a prior generation. Marvel encouraged pathbreaking artists like Jack Kirby and Steve Ditko, while DC maintained a house style so generic, literally anyone could draw any hero. Marvel took risks during an era when risk-taking paid handsomely, while DC conservatively clung to a portfolio worth more in licensing than publication.

Thereafter, Marvel led while DC followed. DC’s Carmine Infantino plundered Jack Kirby, Frank Miller, and other Marvel talent, but shackled them, and their talents sputtered. Marvel pioneered event crossovers, in-universe continuity, and other now-vital aspects of graphic storytelling. DC copied. Even when DC pioneered one domain, live-action cinema, they failed to parlay that into marketing success.

Tucker takes the relatively unusual tack of focusing on business and production, spending little time on stories and art. He acknowledges that early Marvel comics had a nuanced depth of characterization that DC, stuck in post-WWII kiddie schlock, didn’t match. But he doesn’t explicate why, as DC matured and Marvel became a factory, Marvel kept outselling. Especially since around 1986, DC’s stories have competed with Marvel’s for psychological complexity.

This is especially perplexing considering how many personalities, like Jack Kirby, Jim Shooter, and Frank Miller, crossed between publishers. DC literally had the ingredients for Marvel-style revolution, but couldn’t translate them into more-than-mediocre sales. Tucker limply says that DC’s in-house management style couldn’t unleash such talent. But that sounds unconvincing when talent moved between the houses throughout the 1980s. Something deeper is at work, and Tucker keeps focus elsewhere.

Tucker offers mere glimpses into even large story developments, like Secret Wars or the Death of Superman, mostly superficial descriptions which anyone who read the actual comics already knows. If Marvel really succeeds from psychological depth and complexity, why not pause on important points? Almost as weird as what Tucker includes is what he omits. Influential writers like Alan Moore, and non-Madison Avenue publishers like Malibu Comics and Dark Horse, get only passing mentions.

On a personal level, the period Tucker identifies as the high-water mark for printed comic sales, the early to middle 1990s, is actually the period I stopped following comics. Stories became too intricate, universes too massive, and keeping abreast became a full-time job—one I didn’t want because, with young adulthood upon me, I had a literal full-time job. The qualities that drove record sales drove me away.

That being the case, I’d have preferred more attention to stories and art. The business is fascinating, particularly to fans, but sales figures and market dominance follow audience interest, not lead it. Myself, the comics I’ve most enjoyed recently have come from DC, but tellingly, have generally been non-canon graphic novels like Grant Morrison’s Arkham Asylum. Stories that don’t require decades-long immersion in character backstories and universes.

Speaking of Grant Morrison, a book already exists which addresses the psychology Tucker mostly overlooks. Morrison’s Supergods mixes Jungian analysis with Morrison’s own autobiography of comics experience to plumb how each generation’s new superheroes address their time’s unique needs. Maybe fans should read Morrison and Tucker together. By itself, Tucker’s MBA analytics are interesting but anemic, lacking clear insight into what drives readers and their loyalties.