Friday, May 31, 2013

Country Music Objectively Sucks, Part Two

How Has the Music Gotten Here?

Garth Brooks
When Garth Brooks hit the scene in 1989, my local radio DJ said of him: “This kid is so young, his skin still shines.” His breakthrough single, “Much Too Young To Feel This Damn Old,” had a simple acoustic arrangement and a spirited singalong chorus. Later songs like “Rodeo” and “Friends In Low Places” cemented his reputation. This guy felt like a country singer.

But unlike former rodeo champion Chris LeDoux, whom Brooks helped shepherd to stardom, Brooks always seemed embarrassed by country music. He recorded covers of Little Feat, Don McLean, and KISS that grew progressively less distinct from their originals. He hired sidemen without country credentials. Prior to his Chris Gaines album, when he announced his next CD would not be country, many fans shrugged and said, “How’s that any different?”

Through it all, paying audiences rewarded Brooks for his ambivalence. He sold out arena venues, and moved more albums than anybody but the Beatles. Fans ignored his burgeoning gut and documented extramarital affairs, and his winning streak ended only when he took himself out of the game. The music scene Brooks left behind had been permanently changed in his image.

Country musicians have always felt torn between raw authenticity and commercial success. The honky-tonk of Hank Williams and Lefty Frizzell gave way to the Nashville Sound championed by Chet Atkins. In my childhood, radio provided the field where spare, muscular Outlaw Country competed with more commercial Countrypolitan. In pure dollar terms, slick, highly produced country has always done better than the honest, naive stuff.

Retracing music history, though, these subgenres didn’t squeeze each other off the air. A country station might play a slick track by Ray Price or Dottie West, and swing without pause into Kris Kristofferson or David Allan Coe. Honky-tonk survivor George Jones revived his career with the crisp duets Billy Sherrill produced for him and his wife, Tammy Wynette. Different influences mingled, but it all remained country music.

Tammy Wynette and George Jones
Why, then, today’s dominant divisions? The genre’s slick half runs the commercial airwaves, while traditional-minded acts like Robbie Fulks, Neko Case, and the Drive-By Truckers get squeezed to the Internet ghetto. Why is radio driven by music that verbally acknowledges the honky-tonk past, but sounds more like 1970s Southern rock? The answer may come from outside the music itself.

Garth Brooks’ electrified country-lite ascendancy coincided with the Telecommunications Act of 1996, which significantly deregulated mass media. Where companies had once been limited in the number of outlets they could own, and in their ability to dominate regional markets, conglomerates like Clear Channel and Sinclair could now own nearly all the radio in an area. And these conglomerates demanded returns that would make Mexican drug lords blush.

Country music is big business. Country is the most common music radio format in America, with a more unified audience base than rock and pop genres. Thus, singles that follow Brooks’ highly commercialized model have unprecedented reach. Get your song picked up by conglomerate media, Cowboy, and you can wipe your ass on Benjamins.

Record labels and radio conglomerates conspire to push songs onto the airwaves that, apart from twangy vocals and the occasional fiddle, are indistinguishable from classic rock programming. Taylor Swift and Luke Bryan make money by attracting large crossover audiences. The fact that traditional country listeners like me drift away in droves doesn’t matter. There aren’t enough of us.

But historically, slick country doesn’t produce music that lasts. Many artists honor the idea of Skeeter Davis or Charlie Rich, but nobody actually listens to them. Acts that didn’t have the same influence in the short term, like Dwight Yoakam or Billy Joe Shaver, remain listenable decades after they laid their music down. Of course, music studios have to pay their bills right now. But that doesn’t justify artistic short-sightedness.

Chet Atkins
When I saw Darrell Scott in concert, he made a point of emphasizing what he did as “real country music,” then ticked off on his fingers: “pre-Toby, pre-video, pre-mechanical bull.” The audience applauded forcefully, because we knew what he meant: country music isn’t about selling more copies. It isn’t about winning. Country music is about telling the truth. And the truth is seldom commercial.

No one audience should monopolize the genre. Just because I don’t like Blake Shelton doesn’t mean he should go away. But only one kind of music now dominates the country mainstream. The competition and difference that made the country of my childhood have largely disappeared. Without them, the sound has become incestuous. That’s why country music objectively sucks.

PART ONE:
Where Has the Music Gotten?

Wednesday, May 29, 2013

Country Music Objectively Sucks, Part One

Where Has The Music Gotten?

Hank Williams
Blake Shelton’s annoying country anthem “Boys ‘Round Here” begins with this strange boast:
Well the boys ‘round here
don’t listen to the Beatles,

run old Bocephus
through the jukebox needle

at the honky-tonk
where they boot stomp

all night (That’s right)

So... let me get this straight. The Beatles broke up in 1970, while Bocephus (Hank Williams, Jr.) hasn’t had a significant chart hit since 1990. So Shelton seems to declare that “the boys ‘round here” embrace their parents’ pop culture as a means of rejecting, what, their grandparents? That makes plenty of sense.

We see the same problem in Brantley Gilbert’s “Country Must Be Countrywide,” when the artist boasts: “In every state, there’s a station/Playing Cash, Hank, Willie, and Waylon.” By “Hank” he must mean Hank Williams, since nobody calls Hank Junior or Hank III “Hank.” And nobody listens to Hank Snow anymore (your loss). So that’s three dead superstars, and one who hasn’t had even a minor hit since 2002.

Jason Aldean’s strange country rap “1994,” an invocation of novelty artist Joe Diffie, who hasn’t mattered in nearly two decades, merits mention here. But nothing more.

I love country music. My parents had me singing with George Jones and Tammy Wynette about the time they got me onto solid food. Ask me anything I’ve ever learned about environmental science, and maybe I’ll remember something about storm clouds; but I remember all the words to Eddie Rabbitt’s “I Love a Rainy Night.” Willie Nelson’s Red-Headed Stranger album is the pride of my collection.

That’s why it bothers me when mediocre stars playing uninspiring music dominate country radio. Brantley Gilbert’s right, I’ve never lived anywhere in America where you can’t dial in a hillbilly station; but since the mid-1990s, the music has grown small in its ambitions, blandly slick in its execution, and timid in its themes. Yet artists attempt to establish their credibility by name-checking superstars they scarcely resemble.

June Carter Cash and Johnny Cash
Shelton seems proud that his boys are “keeping it country,” a turn of phrase Gretchen Wilson also uses in “Redneck Woman.” I don’t know what it means in either case. But it sounds especially weird coming from Shelton, a man singing in a twang-rap hybrid. His song seems designed to reach white suburban youths who would never actually put a quarter in the jukebox for Bocephus.

Country music, with its conservative rural ethic, has a long history of looking backward. Waylon Jennings’ “Are You Sure Hank Done It This Way” complained, in 1975, that country music had fallen in love with its own mythology and lost its roots. The title character in the Bellamy Brothers’ 1985 hit “Old Hippie” “gets off on country music, because disco left him cold.”

But today’s over-the-shoulder credibility grabs feel different. By name-checking older artists, Shelton and Aldean seem to admit the music they and other hitmakers record today can’t hold a candle to their heroes. This feels like a tacit confession that the crap Nashville studios push out under the country music rubric isn’t very good. They’re saying what I’ve said for fifteen years: today’s country music objectively sucks.

I still listen to Hank Williams, Loretta Lynn, and Johnny Cash because their rustic authenticity speaks to me in ways Top 40 cannot. The very lack of studio polish means that nothing stands between the musician and the audience but a guitar. When Hank sings “I’m So Lonesome I Could Cry,” by damn, I believe him.

When the Dixie Chicks recycle Fleetwood Mac’s “Landslide,” or Darius Rucker cuts the gonads off Old Crow Medicine Show’s “Wagon Wheel,” I don’t feel that. I’m conscious, hearing these songs, of the studio environment, the background vocalists holding lyric sheets, and instrumental tracks someone will dub in later. In today’s country music, the producer, not the soloist, is the star.

Waylon Jennings and Jessi Colter
Not that good country music doesn’t exist. Acts like The Gourds and the Old 97’s still record recognizable country music that maintains that raw cry from the heart. They just persevere without mainstream radio support. Johnny Cash won a Grammy for Unchained and a CMA award for “Hurt,” despite being ignored by radio for the final two decades of his life.

If anyone doubts that country music objectively sucks, I dare you: Google the Bellamy Brothers’ 1982 hit “Redneck Girl.” Then Google Gretchen Wilson’s 2004 “Redneck Woman.” Go on, listen to them back to back. I’ll wait. Then tell me, with a straight face, that country music is not on a decades-long slide.

Part Two:
How Has the Music Gotten Here?

Monday, May 27, 2013

First Contact and Nightfall

S.G. Redling, Damocles

When the crew of the human exploratory vessel Damocles settles over the planet Didet, they think they have a years-long observe-and-report mission ahead. But mechanical failure forces an emergency landing for which neither the humans nor the Dideto are prepared. Suddenly, a beach drenched in eternal sunlight plays host to negotiations that will change two species forever.

Redling’s second novel, and first out-and-out science fiction, blatantly combines elements of Octavia Butler’s Lilith’s Brood with Isaac Asimov’s “Nightfall” to create a cerebral journey sci-fi devotees will find comfortably familiar, yet engrossingly new. Redling keeps focus on the scientific aspects of first contact—language and cultural barriers, incompatible technology, finding food—without ever losing pace or bogging down in jargon and effluvia.

Linguist Meg Dupris feels like an alien on a ship full of engineers and scientists. Where her five colleagues deal with empirical precision and absolutes, her discipline relies on guesswork, delicate balances, and false cognates. But on the surface, her affinity for the fuzziness of language and culture makes her indispensable. Sadly, it just doesn’t make her any more liked.

Loul Pell has been seconded to peon work since his thesis on how to handle First Contact got him laughed out of respected public service. But when the tall, lithe humans land on his planet, he accidentally finds himself between them and the trigger-happy generals who stand completely unprepared for interspecies dialog. Only Loul and Meg have the kind of thinking required for something as imprecise as First Contact.

Where many writers skip past such mundane details as language and technology, Redling revels in specialized detail. These species have no C-3PO, no TARDIS translation circuit. They have to painstakingly overcome the simple barrier that their languages, learned gestures, and other forms of communication have no correlation. Making contact requires diligence and trust.

Redling spares no detail, relaying brass tacks with the kind of humane care few authors have captured well since Asimov. But where Asimov believed technology and dispassion could close the gaps in human fallibility, Redling trusts “hard” science to readers’ wisdom. (Self-regenerating crystal propulsion? Pshaw.) Instead, she focuses on traits which make us, and the aliens we encounter, most innately human: the ability to build bonds and communicate.

As Meg and Loul make incremental linguistic gains, slowly developing rapport and a working pidgin, Redling’s shifting perspective allows readers to see First Contact from both sides. The Dideto evolved under different conditions, leading to profound genetic distinctions. (Apologies to Butler, but there will be no species melding tonight.) The two races explore the similarities, and differences, between them, some lovingly, others with profound distrust.

Yet the races share remarkable resemblances. Loul communicates through text messages, loves comic books, and is known to his friends and colleagues as a “nerd.” Just as in human society, the powers that be mock and belittle nerds for their excessively specialized interests. Until, that is, they need a nerd’s expertise to bridge gaps that generals and presidents find abstruse.

As First Contact lingers, though, complications arise. The humans cannot yet return to their damaged ship, and Meg doesn’t want to leave Loul; but The Purpling approaches, that one night per decade when Didet’s seven suns all set and the sky goes dark. The Purpling is a Dideto religious celebration, but it’s also the time when their world is at its most vulnerable. Someone must make a decision, but they lack the vocabulary to explain.

Perhaps nowhere else in this long, careful book does Redling so blatantly channel another author. But she doesn’t just imitate one of science fiction’s most beloved moments; she also seems to refute it, arguing that Asimov, that pedant, missed the point. She proposes her own interpretation of that moment, one which will follow readers as surely as Asimov’s slant does.

This perhaps proves what Redling does so well, that other writers strive after but fall short: she takes the components that readers love, and makes them her own. A plot breakdown would make this novel feel bloodless, derivative, and flat. But Redling injects it with such verve and purpose that it develops its own momentum, separate from the classics it mirrors.

I grew up on science fiction, but have found so little recently that recaptures the smart, cerebral wonder of my youth. I’m pleased to report, I’ve finally found it. No ray-guns, no chest-bursters, no nuclear whatsits impinging the mitochondrial do-funny for Redling. Just a high-browed psychological peek into the cogs that make us most profoundly, vivaciously human.

Friday, May 24, 2013

Self-Publishing Delirium

Maxwell Perkins
Two friends with literary aspirations recently voiced a desire to escape the publishing grind, dominated by media conglomerates who demand higher profits than drug lords, and to print their own books through subsidy publishers. As a reviewer, I receive (sometimes after specifically refusing them) numerous such books. To my friends, and anyone else reading, I request: in the name of all you hold sacred, don’t do it!

People who subsidy publish are nuts. Not everyone, of course; some are sweet people with noble intentions who want more creative autonomy. Some buy into hype without sufficient research. Some think they’ll be the next breakout sensation, like Christopher Paolini. But subsidy-published authors have a disproportionate likelihood to be gibbering, delirious, cuckoo-for-Cocoa-Puffs unhinged. Avoid their company.

Authors who pay to “publish” through Outskirts, CreateSpace, and AuthorHouse choose not to submit to informed scrutiny before turning their words loose on the audience. These companies don’t publish works so much as print them. They provide no assistance on manuscript preparation, physical design, or efficient publicity. Most important, they provide no editing services, which such authors consistently need.

Some people see the word “editor,” think “copy checker,” and respond that their best friend, spouse, or roommate did the proofreading. But editors don’t just untangle grammar; they serve as surrogate audience, carefully anticipating paying readers’ response to a piece of writing. Without such intervention, subsidy-published authors show no sign of awareness that human readers exist at the other end of the continuum.

Legends recall young F. Scott Fitzgerald pushing a manuscript variously titled The High-Bouncing Lover and Under the Red, White, and Blue on several uninterested publishers. Because the book relied on concept more than execution, staff readers bounced it out the door. It took a visionary, Maxwell Perkins, to see Fitzgerald’s embryonic genius through his unfocused prose, and coax The Great Gatsby to maturity over three strenuous years.

Walt Whitman
Many subsidy-published authors refuse such difficult nurturance. I receive inchoate, sprawling books that clearly haven’t seen correction since the outline dribbled off the authors’ fingers. These manuscripts, often formatted with tiny type and little margin to cut costs, look like the first drafts they probably are. Such authors demonstrate overt indifference toward their audience, expecting praise for their every brain dropping.

Several subsidy publishers advertise that classic authors like Walt Whitman and Virginia Woolf published their own books. That’s true, they did. Their work stood so far outside the mainstream that they had to take publishing into their own hands. That’s a legitimate reason to publish your own book. But Whitman and Woolf didn’t just have their books printed; they undertook the full publishing process, which isn’t easy.

Authors like Dara Beevas and Joel Friedlander write on this topic. If you would publish your own book, you must assume full business responsibility, including not just hiring a third-party editor, but also a designer, marketer, and publicist. You must stop thinking like an artist, and become a legitimate entrepreneur. This isn’t Field of Dreams; they’ll only come if you build something worth seeing.

Subsidy publishers discourage such frank humility. Because technology cuts printing time to weeks, even days, the lag between scribbling your first draft and seeing your book on Amazon.com can run remarkably short. But this doesn’t mean you’ve published a book, only that you’ve printed a manuscript. And in pushing an unfinished book onto the market, you’ve stolen time and resources from paying customers.

I’ve read authors who base entire books on half-remembered lectures heard two decades ago. I’ve read authors who recite bigoted stereotypes as contributions to interracial dialog. I’ve read authors who think plagiarizing 1970s pop songs and Victorian quatrains makes them poets. These authors have two traits in common: they paid to publish their own books. And they thought that having a byline entitled them to unstinting public praise.

F. Scott Fitzgerald
Again, self-publishing is both possible and noble. My friend Jerry published his first novel with a mainstream conglomerate, which dropped the ball on publicity, letting his book languish. He’s published multiple books since then, going guerilla on contemporary digital platforms. But he has taken the full entrepreneurial approach. He doesn’t just print his books, he publishes them. He pays the costs, and reaps the rewards.

Online retailers are choked with books published without scrutiny, often looking scarcely better than mimeographed liberal newsletters from the disco era. Even if your book shines like silver, it will not emerge from that morass. Save your money for the real grind. It may be difficult and dispiriting, but it continues to exist for a reason.

Wednesday, May 22, 2013

Barukh Atah Adonai

Rabbi Ted Falcon and David Blatner, Judaism For Dummies

Until recently, Jewish people and their faith were lumped into two categories in the public imagination: either stereotyped lawyers and entertainment executives, or abstract cultural “heroes” like Anne Frank. That is, when they weren’t hated for centuries-old fictional slurs. But recent trends have moved Judaism to a central position in public discourse, without necessarily answering important questions in outsiders’ minds.

Rabbi Ted Falcon and David Blatner come from a background in multi-faith outreach and cultural clarification. They have experience answering queries, some of them quite naive, and know what doubts and misinformation linger most in outsiders’ minds. They spot the gap between reality and what people think they already know. That makes them good choices to write a “For Dummies” book about the faith that founded the Abrahamic tradition.

If you’ve ever read a “For Dummies,” this book’s format is familiar. It’s designed to read out of sequence, dipping into the reference as questions arise. But it also rewards conventional reading, as the authors progress in inquirers’ most common sequence, from broad strokes of belief, through the people’s history, into brass tacks of practice, finishing with fine detail about what makes Judaism unique.

In their introduction, Falcon and Blatner say they write for two audiences: non-observant Jews interested in rediscovering their heritage, and outsiders curious about one of Earth’s oldest continuously observed religions. As such, the text is essentially bilingual. It gives a plain-English survey of Jewish religious precepts, then for those who want it, proceeds to a detailed investigation of exacting practice. The authors are good about defining terminology.

Jews have maintained their identity as a people over centuries of diaspora, in large part because they have retained their traditions in ways other scattered peoples have not. Their elaborate mix of written history and oral tradition, bound in ritual that gives observant Jews a body of shared experience, preserves their mutuality. This includes their tradition of controversy, which outsiders have long mistaken for disunion.

Falcon and Blatner do a remarkable job keeping the balance between Orthodox and more Liberal traditions, especially considering that some parties in such divides consider their opposite numbers apostate. Controversy is at the heart of Judaism, as any Talmud student knows. Our authors carefully recount the debates that shape readers’ understanding, while remaining studiously neutral themselves—sometimes, the debate matters more than the solution.

Not that they are completely impartial. They say some controversies aren’t worth having. They completely exclude Messianic Judaism, as even the Israeli Knesset does, saying it constitutes a wholly separate religion. Also, though they address humanist Judaism briefly, they preponderantly assume Jews share belief in God, while acknowledging that the word “God” admits multiple definitions. “Israel,” after all, means “he wrestles with God.”

The authors include glossaries of Hebrew and Yiddish terms, useful in understanding Jewish thought, and several standard Orthodox prayers, including the ritual blessings, famed for their salutation: “Barukh atah Adonai.” Because Judaism, like any religion or philosophy with a long history, has its own vocabulary, these glossaries help readers understand more, better, faster.

I initially felt frustrated that the authors didn’t cite sources for some of their claims, especially for important rabbinic controversies, which they report in a “some say... others say” manner. But Appendix C cites several valuable books, magazines, websites, and organizations for readers wanting in-depth study. I still wish the authors integrated their citations, but they do pave the way for readers who’ve had their interest piqued.

Christian readers will especially enjoy this book. Our Sunday School history of Judaism often stops in the late Second Temple period, ending when the Gospels diverge from the Talmud. But Judaism, like Christianity, is a living faith. We need to understand where it is, now, if we want to understand where we came from ourselves.

As inclusive as this book is, readers should remember what it is not. Falcon and Blatner craft a synoptic introduction to the Jewish religion, not the Jewish people; you’ll find nothing about Jewish art, culture, or non-religious history. It’s also a layperson’s overview, not a rabbinical textbook, and will make nobody more spiritual, or more Jewish. Remember, this is Judaism for Dummies, not Judaism for the already learned.

But for non-observant Jews seeking a connection to their heritage, or Gentiles wanting deeper understanding of Judeo-Christian roots, this book makes a good primer. The authors’ plain-English explanations, laced with gentle but pointed humor, keep the reading brisk. Any readers interested in browsing Jewish beliefs have here a good reference to begin their research.

Monday, May 20, 2013

Mom and Dad as Learning Coach

Jen Lilienstein, A Parent's Playbook for Learning

If I learned anything in my teaching years, it’s that most “remedial” students don’t really have a problem with the subject. They have a problem with the system. Teachers and students talk past each other, and even eager students become discouraged because school seems like an adversarial environment. Education innovator Jen Lilienstein wants to give parents and teachers the tools to make kids better learners.

Many learning experts don’t actively analyze students’ learning until roughly high school, or later. Lilienstein focuses on grade school ages, adapting the concept of “multiple intelligences,” as popularized by researchers like Howard Gardner and Thomas Armstrong. This holds that human cognitive abilities, like your child’s learning ability, are separate, distinct components, not one big “mind.” Individual students have readier access to certain intelligences than to others.

The classroom model we take for granted, which all of us who went to public (state) school shared, is not necessarily the best way to learn. Lumping kids together based on age and geography, and stuffing them into a classroom with one teacher who may or may not understand them, is cost-effective, but pedagogically inadequate. Even more so today, when budget cuts pack fifty kids into many urban classrooms.

But unless you can afford to homeschool your kids, which most working parents can’t, you rely on schools to prepare your children for their adult roles. That means parents must translate often prolix concepts into approaches children can understand. Your child hasn’t learned to close that gap. As a former teacher, I can attest that if you and your child don’t close that gap early, you never will.

Lilienstein uses an abbreviated version of the Myers-Briggs Type Indicator (MBTI), an inventory test designed to highlight personality strengths. She divides kids into eight learning categories, each of which could hypothetically subdivide further—use this book as an introduction, not a blueprint. Each learning type has its own distinct processing patterns, and parents and teachers can maximize learning by playing to these strengths.

Imagine your child loves activity learning, like art or sports, but has difficulty with reading. Lilienstein suggests teaching your child to finger-spell words in sign language, as a way to make English an activity. Or what if your kid prefers short bursts of activity over the tedium of book learning? Consider adapting Trivial Pursuit to make learning competitive, ensuring a measurable goal at the end of the process.

And not just your kids; Lilienstein suggests ways her principles can smooth communications with their teachers, too. Though she writes primarily for parents, Lilienstein encourages teachers to participate in the learning customization process. She has a lengthy section on group learning, allowing teachers to partner students with peers whose complementary abilities let them go farther. I don’t fully trust this idea—research on collaborative pedagogy is at best contradictory—but for teachers who share this value, Lilienstein’s analysis will help design better group environments.

Lilienstein divides her book according to learning category, signaled by helpful visual icons. This will especially come in handy for parents whose kids have different learning styles. My parents sincerely tried to help, but because my brain doesn’t work like theirs, their tutoring sessions frequently ended in tears. If they’d had this book thirty years ago, my life might look very different, and our relationship would feel much less strained.

I see two inherent risks with this book. First, kids could easily conclude adults will cater to them. Parents and teachers must emphasize that, while we want to utilize their learning strengths, they must learn to take the initiative. Lilienstein calls this a “playbook,” and illustrates the cover with a coach’s whistle, on purpose: while adults may call the play, students must run it in a field they cannot predict.

Second, parents could approach this book too passively. Many adults, like me, graduated from the “come in, sit down, shut up” pedagogical approach, and we learned to run the system by going along to get along. Teaching our children to be active learners requires breaking our own molds and thinking in innovative ways. We must constantly adapt Lilienstein’s guideposts to children’s growing minds, meaning we must grow, too.

Lilienstein wrote this book as a companion to her website, Kidzmet.com. Consider using both together to ease your child through the difficulty of school. Because we all need to learn, and cannot all afford private tutors, Lilienstein’s thoroughly researched assistance can make the difference between kids frustrated and discouraged by the system, and self-guided learners, ready for adult life.

Friday, May 17, 2013

The Right Fear at the Right Time

1001 Books To Read Before Your Kindle Dies, Part 15
Barry Glassner, The Culture of Fear: Why Americans Are Afraid of the Wrong Things


Recent news reports noted that amid heightened fears of gun violence, actual firearms homicide statistics have dropped precipitously since the early 1990s. Despite high-profile shootings like Sandy Hook and Aurora, Americans in the aggregate reach for guns less than a generation ago. It’s hardly unmitigated good news, since America still has the developed world’s highest homicide rate. But it raises the question: why the discrepancy between fear and fact?

Times have changed since sociologist Barry Glassner wrote his most influential book. In 2000, American media personalities performed elaborate mental gymnastics to blame anything but guns for the prior decade’s alarming violence. Now, when fewer Americans keep firearms at home, and violence sits at a low unseen since before Lyndon Johnson, a preponderance of public opinion appears willing to blame guns first. Why the about-face?

Perhaps, though, circumstances haven’t changed that much. Glassner notes that the anti-drug hysteria of the 1980s, when Nancy Reagan added “Just Say No” to the American lexicon, came at a low ebb for drug use. In the 1990s, we stood paralyzed in fear of “Gulf War Syndrome,” a portmanteau diagnosis into which we threw every ailment ever suffered by Desert Storm veterans. Popular fears historically have little factual foundation.

Americans seem remarkably susceptible to fears based more in emotional reflex than external evidence. This vulnerability gets amplified by our willingness to trust perceived authority figures, particularly in media and politics. When multiple authorities overlap, as happened with drugs in the Reagan era, or with gun homicide today, this creates the appearance of corresponding evidence, sufficient to pass for “proof” in the court of public opinion.

Consider President George HW Bush’s famous 1989 speech, when he waved a baggie of crack at television cameras, claiming federal agents had purchased it across the street from the White House. Despite its telegenic qualities, we now know this speech stank of inaccuracies: the dealer had to be coaxed to Lafayette Park, unfamiliar turf regularly policed by federal smokies. The President manufactured evidence to unify Americans behind his pet scare.

But crack, a drug favored by African Americans, made sense in DC, a predominantly black city. Americans wanted to believe drug abuse was prevalent, and accepted specious testimony that supported their existing prejudices. Even conservative commentator PJ O’Rourke noted, at the time, the racial subtext behind Bush’s speech. Yet Americans so wanted to fear somebody for something that many (including me) swallowed the tale whole.

Glassner is merciless in identifying culprits behind such pandering techniques. Magazine and TV editors who fan middle-class white paranoia; lobbyists with an agenda; even, appallingly enough, demagogues who believe their own overheated rhetoric. Glassner shows particular annoyance with then-President Bill Clinton, a man so fearful of being called “weak” that he habitually pimped the very fears he campaigned to resist.

Because the media/government confluence tends to create its own cultural narrative, Glassner approaches popular paranoias using the techniques prior critics used to dismantle myths. Reagan-era drug hysteria, for instance, masked official rejection of 1960s culture, and liberalism in general. Gulf War Syndrome allowed Americans to obliquely face our popular support for an overseas engagement, without making concessions for soldiers’ return to civilian life.

Because we refuse to meet such challenges directly, we surrender ourselves to scares du jour. Each bête noire holds our attention long enough to feel we’ve “done something about the problem,” then the circus moves on to the next hip nightmare (Casey Anthony, anyone?). Monday, we whitewash the Seventh Amendment to bust street gangs; Tuesday, we suspend habeas corpus to combat road rage. Nothing gets fixed, but we feel good about ourselves.

How, then, should we read today’s misplaced gun violence fears? Tempting as it is to blame such phobias on delayed reaction to past crimes, or tyrannical government overreach, such partisan responses overlook the wider cultural climate. Even if violence has grown relatively uncommon, America remains a remarkably violent society, conflating saber-rattling with strength and substituting payback for justice. Bold gun laws to address 1993’s social ills only defer real action.

Glassner unequivocally states that anti-intellectual willingness to nurture fear impedes real solutions. Especially when professional fear merchants exaggerate sentiment regarding small risks, we surrender ourselves to paralysis, living from crisis to crisis. We must stop waiting for the white-hatted hero to arrive, talking points in his six-shooter, to solve everything for us. Let us face life as it is, without hiding in somebody else’s prefab world.

Because the next crisis will certainly come, whether we fix the one before us or not.

Wednesday, May 15, 2013

Go Ask Ashley

Ashley Dukart, Nothing More, Nothing Less

Young Brandon’s spirit has become a festering cancer of guilt, which he drowns with drugs. He has progressed from consuming to dealing, a progression that has him walking a knife’s edge with the suburban PD. His brothers fight to pull him back, but he will only get better when he wants to get better—and his time is growing short.

I like the idea of this book. Freshman author Ashley Dukart has a strong premise, turning on the inherent duplicity of suburban life. While his brothers try to steer him back to the confines of Caucasian normalcy, Brandon (no last name) rebels against its tedious constraints. Sounds like the sitcom Weeds, but with more tears.

But Dukart’s execution suggests she has memorized these components from a textbook. Like Beatrice Sparks, whose Go Ask Alice described acid highs cribbed directly from government pamphlets, Dukart’s descriptions of drug effects on Brandon, his brothers, and their lives feel entirely familiar. They have the comforting vagueness of 10th-grade health class.

As our first-person narrator, Brandon describes several highs, drug-fueled sexual encounters, and casual fistfights, all of which he uses to conceal the pain growing inside. His life is driven entirely by guilt, which he admits to himself without coming to grips with it. He would rather numb the pain than actually do anything about it.

The word “numb” comes up frequently, as does “happy.” Brandon uses these words to describe the benefits he thinks he gets from drugs, raves, and house parties. His vocabulary never gets any more specific than that. Read interviews with real drug users: not only do they describe, sometimes in wrenching detail, the temporary benefits of using, but they also describe what they’re trying to get away from.

Instead, Dukart caroms through descriptions of highs and lows that repeat the same few details so often, we clearly recognize their memorized nature. Brandon's vague highs are followed by predictable crashes, characterized by repetitive strings of physical symptoms. Dukart seems weirdly fascinated by the fact that hangovers and withdrawals make users vomit.

Brandon pukes so often, so powerfully, and so close together that I find myself losing track of the story to wonder: when does this guy eat? Because vomiting is the only concrete detail Dukart gives us, the mechanics of malfunctioning digestion loom large in readers’ attention. Only around page 150 does another character finally observe that you can’t puke on an empty stomach.

For about fifty pages, Brandon avoids directly addressing the cause of his guilt, though it’s poorly concealed, since it’s in the back-cover synopsis. He blames himself for his mother’s suicide because... well... because her lengthy, well-constructed note told him not to. His guilt becomes circular. He feels he’s still letting her down, which only prompts more self-destructive behavior.

Brandon’s pain, not that unusual for children from divorced or domestically violent homes, seems larger than life to him. On the one hand, he seems to have remarkably little empathy for others’ pain, believing his own more important. This lack of empathy is emphasized when his part-time squeeze gets busted for possession, and he doesn’t even bat an eye.

On the other hand, Brandon’s apathy finds its mirror in his brothers, Ace and Cole (wow, them’s some white names). Ace wants to be paternal, but only knows how to solve problems with his fists. Cole would keep the peace, but doesn’t apparently know what he wants from his own life. He’s a middle child from Central Casting.

Instead of guiding us through a fully fleshed human soul in torment, Dukart recites a thesaurus of cautionary stereotypes. We never feel for Brandon’s struggles outside bog-standard white suburban malaise (Mom’s death feels distant, reported rather than experienced), nor share the pain he wants to numb. He just grabs more drugs, and flushes his life away, because that’s what people in these stories do.

That encapsulates my problem with the whole story. Rather than push deeper into Brandon’s pain, Dukart seems driven by a checklist. Personal trauma that needs narcotized away: check. Meaningless sex that makes him feel cheaper rather than loved: check. Painful jolt that motivates him to change: check. True love as salvation: check.

Again, I like the idea of this book. But rather than describe a real journey, Dukart recites bromides recollected from prior books. She doesn’t need to have experienced drug washout to write about it. Stephen Crane wrote convincingly about the Civil War without serving, but he steeped himself in veterans’ firsthand accounts. Dukart hasn’t gone that far.

Monday, May 13, 2013

The Jung and the Restless

Caroline Myss, Archetypes: Who Are You?

Most of us, at times, feel like guest stars in life’s drama. Caroline Myss suggests that’s because we don’t know the real role we were born to play. In our dreams, we envision ourselves in one position—victorious warrior, vibrant artist, rebellious firebrand. But we accept some other workaday role because we don’t grasp which is our real life. We have missed, or failed to act upon, our true archetype.

Archetypes, as they appear in this book, derive from the theories of psychologist Carl Jung, who broke with Freud over spiritualism versus materialism. Myss’ archetypes, though, don’t strictly accord with Jungian archetypal theory; her ideas have more in common with Joseph Campbell’s “heroic journey” model. Though Myss doesn’t directly cite Campbell, his Hero with a Thousand Faces casts a long shadow over this book.

Briefly, each of us has a life role for which our disposition is uniquely suited. Pause a moment, here, and consider: what do you see yourself doing five years from now? Raising a family? Captaining a rising business? Signing copies of your first novel? The answer provides a firm clue to what role, what archetype, dominates our psychology. Our life’s mission, then, is to reconcile our actions and situations with our innate archetypal role.

But just because we know our archetypal role doesn’t mean we’ve finished the journey. We may carry an immature archetype, which needs proving through ordeal. Lingering trauma may warp or distort our archetype, requiring us to turn inward and rediscover our correct path. We may even have a false archetype; many of us were ramrodded early into the role our parents, schools, and society chose for us.

This last problem is especially prevalent for women. Though Myss states that both genders have their archetypal prerequisites, she admits she wrote this book primarily for women, who often get trained to accept submissive roles that leave them unfulfilled. Creative, adventurous, or visionary women frequently squelch their true archetype, thinking that they must play an “appropriate” role. I’ve seen such women. They move through their own lives like ghosts.

But if we shed our blinders and accept our natural roles, not only will we know individual fulfillment (Myss says), we’ll increase our ability to promote common good and benefit our society. By undertaking the task for which our dispositions best suit us, we’ll accomplish something nobody else could do for us. Only we ourselves know what that is.

Myss collates ten common archetype families, explaining not only what they mean, but also what unique challenges they face, what gifts they present to themselves and society, and what journeys these people will face as they consummate their roles. Her work rewards careful browsing. Take time to familiarize yourself with them. Some people have more than one. Even if one isn’t your own role, you know people in each category.

At times, Myss ventures into unnecessary mysticism, as Jung himself did. Her postulations on where archetypes originate (Genes? Society? God?) confuse the issue more than they clarify it. A brief, separate overview of various theories might interest some readers, perhaps in an appendix or afterword. We want the practical applications, the elucidation of the stories we never realized we tell ourselves, not woo-woo spirituality.

Myss also hits heavily on the concept of myth. Don’t misunderstand what this word means in context. “Myth” signifies the stories we tell which allow us to understand topics too vast to address directly. Have you ever had a friend try to sum you up, and thought, no, that’s not me at all? You and your friend have conflicting myths. Perhaps you should examine your myth to understand where your life and your archetype got out of sync.

This book has a companion website, ArchetypeMe.com, to help you find what archetype in Myss’ lengthy gallery explains you. I don’t care for the website. It’s all flashy design and radio buttons, forcing readers into excessively narrow categories. Standardized testing works as poorly for psychology as it does for schoolchildren. Instead, take time with the book, finding which implicit narrative best describes you.

Don’t mistake this book for a profound exegesis on Jungian psychology or the human condition. Myss writes to help individuals take control of their lives by recognizing their roles and attendant myths. As Joseph Campbell asserted, each of us has some task we were born to fulfill, and until we do that, we will shamble through life, hungry and never satisfied. Caroline Myss wants to help you find your role. Accept her help with care.

Friday, May 10, 2013

How Can You Be Healthy When You Aren't Even Awake?

Dr. Janet Bond Brill, Blood Pressure Down: The 10-Step Plan to Lower Your Blood Pressure in 4 Weeks--Without Prescription Drugs

Forgive my rush to the conclusion, spilling Dr. Brill’s thesis first: Americans, and increasingly other peoples too, are just not conscious of what we put in our bodies. We eat packaged filth because it’s easier than thinking about food or paying attention to health effects. We don’t cook at home, and we don’t ask about what goes into the recipe. As a result, hypertension now sits at epidemic levels.

High blood pressure afflicts around a third of Americans. Worse, it’s a ripple effect disease. People with hypertension have higher risk of heart disease, kidney disease, stroke, and certain cancers—many of the most common causes of preventable death. Doctors habitually treat hypertension with drugs, which aren’t worthless, but don’t do everything. According to Brill, solutions and preventions exist which don’t involve costly medical interventions.

I’m old enough to remember when everyone thought they could control blood pressure by watching their salt. But Brill, a nutritionist with specialization in cardiovascular disease, collates the latest science suggesting that salt is only one part of a much larger machine. Many of us regularly consume foods that, in small amounts, keep us running, but in large quantities, bog us down. And we think we’re eating healthy.

For instance, what foods hit you with the greatest sodium content? Did you say potato chips or french fries? While nobody should mistake these foods for healthful, foods which taste salty are often a fairly low sodium risk, because sodium forms compounds besides salt. Most packaged bread and cheese contains more sodium than salty-tasting foods. Same with commercial sauces, marinades, and salad dressings. Many supposedly healthy foods are hypertension bombs.

More important than just one element, though, Brill emphasizes the interaction of complex forces on human health. Many readers flinch at books like this because authors inevitably recommend weight loss. Yes, so does Brill. She urges readers to lose five pounds in four weeks, not an unreasonable standard. Many of us can lose five pounds by using stairs rather than elevators, taking a daily walk, and biking on weekends.

Once we’ve committed to weight loss and sodium control, Brill graduates to foods she wants us to consume more. If Americans get too much sodium, we get too little magnesium, potassium, and calcium. Brill goes into the science, but the thumbnail version goes thus: human physiology is optimized (whether by evolution, God, or whatever) for environments where sodium is rare, but other elements are common. That doesn’t describe today’s society.

Less bread, more bananas. Less cheese, more yogurt. Brill’s DASH Diet—Dietary Approaches to Stop Hypertension—isn’t about self-denial. She stresses establishing good habits, including which fresh or nutritionally packed ingredients she wants us to introduce. This includes, no kidding, red wine and dark chocolate. But it does require one sacrifice: Brill wants us to cook and eat our meals at home.

Much take-out or delivery food relies heavily on bread, cheese, and cured meat. The preparation process for these foods requires heavy infusions of sodium, including salt and baking soda, to ensure long, stable shelf life. Moreover, storage strips these foods of necessary nutrients. Many people, including me, fail to check nutrition labels on packaged convenience foods, and wouldn’t dare ask restaurants for nutrition details. Remaining unconscious of the consequences is easier.

The main body of Brill’s book emphasizes the science underlying her prescriptions. She says readers can cherry-pick which chapters they want to read, but I strongly recommend reading all of them, because if we understand why we make a dietary choice, we’ll resist the desire to stray. By combining her prescription with repercussions, Brill forces readers to remain conscious of the choices we make.

Brill moves her brass tacks to the appendices and back matter. Here she gives the checklists, charts, and nitty-gritty instructions on how to live out the plan she put in the main text. She also includes fifty pages of recipes, four weeks of nutritionally rich, flavor-packed meals that help us maintain needed bodily balances. Readers with food allergies should plan substitutions, but with a little reading ahead, building healthy habits should come easily.

Look around any grocery store, and notice people tossing food blindly into baskets, looking hypnotized. Before reading this book, that was me. If that’s you, and you’re happy sleepwalking through deciding what to feed your body, avoid this book. But if you’re ready to wake up, pay attention, and take responsibility for your own health, let me introduce Janet Bond Brill. She’ll guide you to the world of attentive eating.

Wednesday, May 8, 2013

Werewolves In the Mist

Benjamin Percy, Red Moon: A Novel

The Lycans live among us. They could be the pretty girl at school, the quiet guy at the factory, or the person in the next seat on the plane. They are ordinary people, but they could become cold-blooded killers. Every time some isolated Lycan lashes out at the humans around them, hateful political rhetoric notches up; and with every rhetorical increase, the chance of violence grows.

Benjamin Percy won’t let readers dismiss his werewolf novel as mere fantasy. He works hard to spotlight the political ramifications, practically grabbing readers’ lapels to demonstrate how simplistic solutions—on either side—fail to remedy a painful situation. But by keeping the focus on his nuanced, melancholy characters, he prevents it from spiraling into mere drum-beating propaganda.

When Lycans attack three transcontinental airplanes, leaving a trail of dead throughout America, young Patrick Gamble survives by sheer accident. It will be the first of several times he survives by refusing to fight. But politicians don’t learn from him, and every American Lycan gets treated like a criminal because a few acted violently. Not surprisingly, previously law-abiding Lycans feel oppressed, and rebel.

One such rebel is Claire Forrester. At an age when other girls pick colleges and boyfriends, she’s on the run from the FBI, paying for her parents’ sins, sins she never even knew about. Now she finds herself thrust into a world of radical politics, legalized oppression, and criminal shapeshifters. She never wanted this fight, but it chose her, and she has no choice but to see it through to the end.

Meanwhile, firebrand governor Chase Williams’ knee-jerk rhetoric draws Lycan attention. One bite, and he finds himself slowly turning into the enemy he has promised to bring down. While he keeps to the old rhetoric, riding anti-Lycan sentiment into the White House, he has to take increasingly extreme measures to conceal the fact that, on full moon nights, he feels less and less like himself.

Percy juggles these three convergent narratives, and enough subplots to fuel a cast of thousands, in a mostly satisfying manner. His savage, austere language, reminiscent of Mickey Spillane or Raymond Chandler, strips away all pretense and demonstrates characters at their most raw. He manages to keep characters engaged and stories moving forward without making them feel busy.

Throughout, Percy’s story mirrors the way America has treated subsets of its population in recent years. Readers who like their political symbolism oblique may not appreciate Percy’s straight-on approach. But by removing race and religion from the limelight, Percy is able to address common, widely held concerns without having to seem pedantic or preachy.

While America deals with restive Lycans at home, American troops struggle to keep the peace in the Lycan Republic. Or, let’s say, “keep the peace,” because commanders on the ground don’t even pretend they’re there for any reason other than to keep the supply lines open and raw uranium flowing back to the States. Many soldiers enter the field thinking they’re doing right, and leave with more questions than answers.

A scientist with military connections stands on the verge of inventing a cure for Lycans, a mission that gains extra urgency when President Williams needs his help to conceal his growing illness. But his own ghosts won’t let him be. And Patrick finds himself needing the professor’s help when he discovers his soldier father’s strange connection to the growing research. All society may rest on an appallingly fragile foundation.

Percy’s storytelling chops resist easy designation. Though his language makes easy reading, his non-sequential structure takes some getting used to. He may jump forward several months without warning, leaving us hanging for pages before closing the gaps in flashback. His characters’ penchant for long bouts of introspection comes across as either touching or tedious, depending on your taste.

And while Percy makes a good novelist, he’s less of a fantasist. His unnecessary attempt at a scientific justification for Lycans just thuds, leading to scattered moments of unintentional comedy. It feels as though, in moments of visceral terror, he forgets that horror is as horror does. He feels that he ought to explain, which isn’t his strong suit; he’s far, far better at creating characters and situations.

These problems notwithstanding, Percy creates a smart, engaging allegory that rewards attentive audiences. His structural complexity does not let us read this as a “mere” novel, demanding we read with our critical faculties engaged. Diligent readers will find plenty to like in this book that reads smoothly, but refuses to let readers down easy.

Monday, May 6, 2013

Who Are U?

Jeffrey J. Selingo, College (Un)bound: The Future of Higher Education and What It Means for Students

Let us start with a statement college professors, homeschool advocates, and Jeffrey Selingo can surely agree upon: American higher education is too expensive. Budget cuts have jacked tuition, schools spend scarce resources outside the classroom, administrative roles have become patronage plums, and deregulated loans put many working-class students in debt they may never beat. The question becomes: what do we do about it?

Books like this one matter, not because they attempt to answer the question, but because they advance the debate. No 250-page book can truly address all the options. Believe me, several noble attempts have crossed my desk. But they inevitably reflect the authors’ preferences for what American education should resemble. Therein, maybe, lies the problem, that American higher ed has become perilously homogenous.

Selingo, a respected educational journalist, addresses the question from multiple angles, gathering diverse sources with divergent views, reflecting real trends in recent debate. Because he addresses so much, I find myself swinging wildly. At one moment, I pump my fist and shout “Yes! Yes! Yes!” Then the next moment, I palm my face and mutter “No! No! No!” Then I ask myself the real question: why do I feel so strongly?

Education, Selingo says, has suffered in the last decade from a “race to the top” that involves little actual educational content. Highly groomed campuses and pricey sports championships attract new enrollees and alumni donations. But colleges, particularly private colleges, have offset these expenses by hiring adjunct instructors, concentrating efforts on grant-earning grad students, and packing undergrads into lecture halls of questionable pedagogical value.

We could reverse such trends by re-evaluating what education is for. The emphasis on defined disciplines and mandatory curricular trajectories is cost-effective and requires little effort from professors. Prestige majors with putative professional applications lock students into career tracks early. Instead, we should recall that employers, and society, love college grads not for their subject mastery, but for their wide-ranging ability to face new and unprecedented challenges.

I’ve made similar claims myself. But where the rubber meets the road, Selingo has a frustrating tendency to get giddy over unproven options. He especially shares contemporary reformers’ uncritical love for technology. Selingo writes: “Every new study of online learning arrives at essentially the same conclusion: students who take all or part of their classes online perform better than those who take the same course through traditional instruction.”

That’s just not true. Earlier this year, Selingo’s own publication, the Chronicle of Higher Education, ran the headline “Online Courses Could Widen Achievement Gap Among Students.” The more online courses students take, the greater their dropout risk. This especially applies to minorities, men, and students less prepared for the rigors of self-guided education. Multiple studies in multiple sources confirm this.

Consider: Selingo praises Thrun and Norvig’s celebrated 2011 online Stanford course, which attracted 160,000 enrollees worldwide. But according to his own numbers, the class had a completion rate of only 13.75%, barely a third of Fairleigh Dickinson University’s graduation rate, which Selingo calls “dismal.” Sure, a thousand students got job referrals, but that works out to one referral per 160 enrollees. Would you pay to enroll in such a class for those odds of professional advancement?

In fairness, Selingo repeatedly ventures in the right direction, but not far enough. He drops a one-sentence reference to St. John’s College of Annapolis and Santa Fe, which is one more sentence than I’ve seen elsewhere. Schools like St. John’s, Deep Springs, and Reed College, which soften or eliminate disciplinary divisions, graduate high numbers of desirable employees. Why not more in this direction?

Similarly, Selingo makes a fleeting reference to competencies earned through “internships outside of the classroom.” I’ve often suggested students could benefit from non-classroom education, particularly vocational students who could learn their fields faster through old-fashioned apprenticeship. I know research exists on this, because I’ve read it in authors like John Taylor Gatto. But Selingo just name-drops it and walks away.

Don’t mistake me. Despite my critical tone above, Selingo says plenty that educators could stand to hear. We need to remain responsive to our students, providing personalized education that accommodates their needs, especially those of working-class students. And we must eschew discipline-based “skillz drillz,” instead empowering students’ higher reasoning ability. I may dispute Selingo’s details, but his thesis is spot on.

On balance, I do recommend this book as part of a balanced library on the reforms awaiting American higher ed. We may embrace his principles while rejecting his brass tacks. I simply encourage would-be readers to approach this book with their critical-thinking caps on.

On a similar topic:
Living For the New U
The Next University

Friday, May 3, 2013

The Perils of Unsanctioned Thought in School

Kiera Wilmot
By now, you probably know Kiera Wilmot’s story. This past Monday, Wilmot, 16, and an unnamed peer combined aluminum foil and toilet bowl cleaner in a closed bottle to see what would happen. This is called a “works bomb,” for obvious reasons, and when the bomb worked, Wilmot got escorted off school grounds in handcuffs. Now the former straight-A student has been expelled, though her expulsion remains eligible for appeal.

Mere days later, Wilmot has become a cause célèbre, attracting attention especially for the way schools treat seemingly minor infractions by minority students. A well-liked student at a middle-class Southern high school, Wilmot was treated like a criminal for reproducing a chemical reaction so simple that it’s often used as demonstration fodder in classrooms. But the racial aspect may be this story’s least interesting implication.

Nobody should deny that Wilmot’s experiment was dangerous. Garage scientists have been grievously harmed building similar devices. But all learning carries inherent risk. I once sloppily took the push block off a sheet of plywood I’d just cut in shop class, letting the wood kick backward across the circular blade, right into my groin. Half an inch to the right, and my dreams of starting a family would have ended before they began.


I care more about the attitude some, including the school, show regarding this incident’s pedagogical value. Where I see a classic teachable moment, an opportunity for teachers and administrators to discuss safety management and personal protection, many jump straight to treating it as a crime. One Facebook friend commented: “Her blowing up a water bottle had nothing to do with any class she was in. To blow stuff up at school, out of curiosity, that's ok?”

I find the presumption in this comment telling. To this person, “science” happens in Science Class, under credentialed supervision. “Experiments” should not be undertaken unless outcomes are predictable. “Curiosity” has nothing to do with learning, unless teachers can channel it into approved curriculum. Students should essentially sit down, shut up, and receive information from on high.

Thank God this killjoy didn’t govern the schools I attended. I attempted many experiments that, under today’s Zero Tolerance regime, could have gotten me jailed. I remember multiple occasions, under teachers’ watchful gaze, when I literally mixed two chemicals grabbed at random in a corked test tube over a Bunsen burner. I only quit my childhood dream of becoming a scientist when I realized it’s damned hard to explode a test tube.

America today suffers an epidemic of Magical Thinking. Far from a mere buzzword, Magical Thinking is a technical term describing what happens when people see events as essentially disconnected, happening without cause. They refuse, for instance, to acknowledge links between cigarettes and cancer, or between burning hydrocarbons and climate change. We project our thoughts onto the world, rather than subjecting thoughts to evidence.

This manifests most prominently in a form of religious conservatism that rejects empirical evidence which does not accord with beliefs. Some Christians, for instance, wholly discount evolutionary biology, because it differs from the Genesis creation. To bolster these positions, Magical Thinkers mine scientific texts for evidence of any contradiction whatsoever. For such people, science is not a process, but the words which dribble from scientists’ mouths, or pens.

Richard Feynman
Scientists, by definition, doubt. That does not mean they disbelieve, but they want someone, possibly themselves, to subject any position to tests before accepting it. While about half of American scientists say they subscribe to some religious creed, they do not receive beliefs passively; they examine texts, evidence, and lived experience to the point where, as Richard Feynman puts it, they reduce doubt to the smallest possible level.

Feynman made his name on the Manhattan Project. While there, he determined, against majority opinion, that nuclear flash blindness was manageable, not necessarily permanent. To prove it, though his evidence was not absolute, he sat in the cab of a pickup truck, shielded only by the tinted windscreen, to watch the Trinity test, the first nuclear explosion. He thus became the only person to look directly at a nuclear fireball without going blind.

Kiera Wilmot, like Richard Feynman, wanted to know. So, though her evidence was incomplete, she made an effort. And for such unsanctioned curiosity, she may carry a permanent black mark on her record. But American education now bears something far worse: an open confession, on the public record, that mandatory schooling isn’t about learning; it’s about obedience. Learn without permission at your own risk.


On a related topic:
Gatto's School of Immeasurable Meaning

Wednesday, May 1, 2013

The American Evolution

Gar Alperovitz, What Then Must We Do?: Straight Talk About the Next American Revolution

If 1992 sounded the death knell for global communism, 2008 and the worldwide stagnation thereafter have left many disillusioned with market capitalism. Yet for generations, media, business and education have trumpeted the socialist/capitalist coin-toss so persistently, many citizens cannot imagine a third option exists. But what if a respected economist said forward thinkers have spent years building new options, right under our noses?

Maryland political economist Gar Alperovitz wants citizens to use their imaginations. We’ve become so trapped by the past, yearning for mid-20th century prosperity, that we try to recreate long-gone conditions. But neither triumphant American postwar prosperity nor aggressive liberal reforms are likely to recur, because the conditions that existed, roughly from 1930 to 1966, will never reappear. We may praise the past, but we must look to the future.

According to Alperovitz, the spectre of failed state socialism keeps economic discussion, left and right, beholden to corporate capitalism. Whether conservatives treat money as a barometer of virtue and would entrust governance to the wealthy, or liberals want government and organized labor to counterbalance corporate power, both sides assume corporate might is somehow normal. We have let a minority narrow our thinking.

Instead, Alperovitz calls attention to maverick innovators on the edges of our economy. As early as the 1970s, trailblazers pioneered new approaches to ownership, industry, and democratic wealth. When workers engineered an unprecedented buyout of a floundering Ohio steel mill, Alperovitz was there, helping them forge a new kind of corporate charter. He witnessed the rise of agricultural co-ops, pathbreaking land trusts, and philanthropic investing.

Alperovitz calls this a “checkerboard strategy.” Instead of waiting for national intervention, economic innovators focus on one town, area, or economic sector. Their innovations may reach as small as one business, or as large as an interstate compact. But they share common goals of transforming the way business, government, and people respond to the ever-changing needs of a volatile globalized economy.

At first blush, such pathbreakers have a long fight ahead. Looking around, we see stagnant global economies where long-term unemployment and bleak prospects have become normal. We can say, without exaggeration, that inequality has hit medieval levels. It’s easy to wallow in our own gloom. Alperovitz admits, “All of us have a vested interest in pessimism. We don’t have to do anything if nothing can be done!”

Realistic, ambitious attempts at economic reform have repeatedly failed because activists put their trust in politicians and policy. That approach succeeded during the New Deal and the Great Society because of unique historical circumstances. Today, kowtowing to politicians will help little, because government policy is part of the broken, lopsided system. Politics can do little about a society hobbled by more workers than work, and by massively undemocratic concentrations of wealth.

But faced with such circumstances, some individuals and communities refuse to cave. They prove the long-term viability of collaborative business models and public-private hybrids. They invest money in new industries, even during recession, and draft new business categories driven by what Alperovitz calls a “triple-bottom-line” model: people, planet, and profits. They stare economic despair in the eye and answer, “No.”

Alperovitz doesn’t waste readers’ time with pie-in-the-sky notions of how economies ought to run. He focuses instead on how dauntless innovators have successfully built new economic models that should serve to inspire millions to seize their own futures. Looking forward, Alperovitz challenges us to imagine ways we can build upon these successes, forging a society of democratic wealth and renewed community.

These suggestions don’t come easy. Alperovitz doesn’t promise quick fixes. Quite the contrary, he emphasizes the generational nature of his proposals. We intend to build new social institutions that serve the greater good, Alperovitz repeatedly avers. The biennial focus on elections permits others to control our lives. Such wholesale surrender has effectively stripped democracy from our treasured democratic institutions.

Instead, Alperovitz extols visionary new ways we can assert power autonomously. While nobody can predict the future, we have the choice whether we let the future roll over us, or take a steering hand ourselves. We can continue along our current path, circumscribed by the options others offer. Or we can, in small and growing ways, blaze our own trail.

America is Earth’s richest nation, and among its largest and most populous, a point Alperovitz stresses repeatedly. The experiments we undertake don’t stop at our own borders. We have a choice: a new, democratic economy, or more of the same, with the disastrous consequences we see around us. Our past does not control our future.