Monday, March 31, 2014

Renewed Learning, Book One

Ron Berger, Leah Rugen, and Libby Woodfin, Leaders of Their Own Learning: Transforming Schools Through Student-Engaged Assessment


Early in this book, the authors say, “Teachers frequently fall into the trap of simply saying, ‘try harder’ without giving students specific targets, feedback, time to revise, and a purpose for doing quality work.” I know, back during my teaching days, I often fell into that trap. Since I intend to resume classroom duties someday, I find this book bracing, with its new, startlingly active approach to continuous engaged assessment.

Our authors cut their teeth at Expeditionary Learning, a charter school network stressing cumulative learning, interdisciplinary evaluation, and portfolios. Expeditionary Learning schools have refined their techniques for over twenty years, building modular course approaches designed to put the principal burden of learning on students while increasing their ownership of their own learning. One part of this is assessment. Their most important lesson: assessment isn’t just for culminations and report cards anymore.

Having decided to make their discoveries available to teachers and administrators outside their network, Expeditionary Learning anchors their first book to their assessment process. And what an exciting topic they make of a frequently dull activity. If your classroom experience was anything like mine, you got assessed at the end of some arbitrary interval (semester, quarter, etc.), and the letter grade felt vague and abstract. Assessment was discouraging, not supportive.

EL assessment involves approaches that, at first blush, appear consonant with existing techniques. The elucidation of clear goals and “learning targets,” for instance, superficially resembles common lesson planning. But the authors emphasize that these components emerge from different roots and pursue distinct goals. By emphasizing students rather than classrooms, this approach takes what’s traditionally the teacher’s sole responsibility and makes everyone, students included, equally liable for outcomes.
Other components involve students understanding beforehand what teachers expect. Our authors spend an entire chapter on “Models, Critique, and Descriptive Feedback” (clearly, EL stresses writing as evaluative learning). Looking back, my teachers expected me to produce critical writing, and critique others’ papers, as early as seventh grade, but I never saw what critical writing should look like until graduate school. I wish I’d had this approach in my youth.

Beyond classroom organization, our authors describe techniques to foster parent and community engagement. Since many teachers report their number one problem is parental apathy, with students getting reinforcement at home that school doesn’t much matter, my many teacher friends will surely appreciate this inclusion. Though parental engagement will require time to overcome enculturated apathy, EL’s time-tested techniques will provide educators with valuable shortcuts.

Our authors also spend copious time explaining how to reconcile their innovative evaluation techniques with Common Core standards, which often impede individualization. The standards, as written, are frequently opaque, and even trained teachers have difficulty making sense of them. With their specific, plain-English learning targets, EL schools can potentially address multiple Common Core targets simultaneously. This transforms Common Core’s top-down hierarchical approach into real, measurable learning outcomes for diverse communities.

Besides simply telling teachers how they ought to assess students, this book includes a DVD of EL techniques in practice. Fairly short videos, cued according to chapters in the book, provide object lessons in how EL approaches work in real classroom environments. Thus the authors don’t just lecture at their audience, as my pedagogy teacher did. We get to see how innovative, groundbreaking techniques actually work.

Though our authors aver that their approach applies at any learning level, they clearly focus on the K-12 school environment, particularly its long, rolling progression toward college and career preparedness. That’s not to say that creative, diligent instructors couldn’t adapt this approach to post-secondary education, skills training, or remedial and GED schooling. By applying these techniques, ideally school-wide, inventive teachers could construct an educational environment conducive to multiple learning styles.

I confess one trepidation. EL asks teachers to dedicate generous one-on-one time to students, helping them customize learning goals and evaluation. That sounds good in charter schools, which are publicly funded but nominally private. What happens, though, when these techniques hit perennially understaffed, cash-strapped public schools? (Cf. Jonathan Kozol’s Savage Inequalities.) Educators will need time to develop workarounds for schools with less money and time to spend.

EL has evolved over two decades of in-the-trenches use, so presumably it’ll absorb public school challenges gracefully. Though it would be a mistake to regard this book as the final word, it offers intriguing, ambitious guidelines for creating school-wide learning cultures where students own their process and teachers serve as guides. In a school environment favoring “reform” over practice, this provides a compelling outline of truly renewed learning.

Friday, March 28, 2014

Did America Lose the Cold War?

Mystery novelist Tyler Dilts, who moonlights as an adjunct English instructor on the Left Coast, has recently posted several articles to his Facebook page about the shameful treatment of adjunct instructors in American universities. Adjuncts work for universities on an “as-needed” basis, often making below minimum wage once their pay is averaged across their actual hours, with no health benefits or job security. Considering his career, Dilts’ interest in this subject is unsurprising.

But other forces have weighed into this debate recently. From Jim Hightower to President Obama to the Catholic Church, powerful advocates have loudly decried “the Wal-Martification” of higher education. Having played the adjuncting game myself for several semesters, I feel I should support my brethren in this well-earned fight. Yet a throwaway reference in a tangentially related book persuades me we’re seeing a bigger battle unfolding here.

Rhetorician Gerald Graff, my favorite source for overviews of academic culture, notes that the government began pouring funds into American universities after the Soviet Union successfully launched Sputnik in 1957. Leaders and elected officials feared America could win the military conflict with the Soviets, and lose the brain game, a true Pyrrhic victory. Universities became battlegrounds for fighting the Cold War, and philanthropists quickly followed government’s lead.

But that makes university spending part of a larger matrix. Politicians competed to spend more money more effectively, subsidizing public broadcasting, interstate highways, public arts projects, urban renewal, and NASA, all in pursuit of moral and ideological advantage over the Soviets. Look, America told the world. Admire our roads, our schools, our Moon rockets! Red aggression and Stalinist statism could never offer you such well-earned grandeur!

And America was right. America tied abstract moral and intellectual goals to real-world consequences. Interest in pure science drove developments in technology that continue driving global economic markets. The web-connected computers that permitted me to write this essay, and you to read it, derived from government spending that targeted investments in new technologies whose ultimate implications wouldn’t become visible for decades. The future was worth our money.

The American government, and flag-waving capitalist philanthropists, were willing to spend money on investments with outcomes they couldn’t foresee. Generous donors might endow scholarships, knowing the students they paid to educate might choose to work for their competitors. Educational tracks with no immediate market value, like English and Physics, nevertheless merited massive charitable gifts, because educating America’s future was worth it.

After the Soviet tent folded, Americans apparently stopped caring. NASA, which at its peak in 1966 consumed nearly five percent of America’s federal budget, is now too cash-strapped to send humans into space. Authors have written books and books and books about how Neil Armstrong’s televised Moon landing reversed global anti-Americanism during the Vietnam war. Now we ship astronauts to Kazakhstan if we hope to chuck them into orbit.

We previously believed that a well-educated, scientifically advanced, economically mobile society defined America not just as a people, but as a distinctly superior choice to communal Soviet mediocrity. Though we frequently fell short of our best ideals, Americans believed that, in Jim Hightower’s words, “Everybody does better when everybody does better.” And though it’s hard to tell when that ethos ended, clearly it has ended.

Perhaps one could argue that Liberal Arts or Astrophysics contribute little to American society. I disagree, but the argument at least exists. Can we really say the same about grade schools, good roads, and safe food? We’ll pay billions of dollars to salvage bankers who treat their banks like personal casinos, but every nickel spent fixing potholes gets called creeping Bolshevism. Congressmen cross-examine scientists spending public funds exploring the nature of reality.

But money is never just money. As Christ says, “Where your treasure is, there your heart is also.”

This week’s preliminary NLRB ruling, permitting college football players to unionize, reflects something beyond sports. Since the NLRB rigidly forbids graduate students to unionize, the government has thrown its tacit support behind the idea that sports matters more than learning in today’s America. This is just the latest, most blatant declaration that America no longer values any investment that pays off in the indeterminate future.

“Inequality” has become today’s political watchword. Everybody, we hear, deserves equal opportunity to climb society’s ladder. But there is no ladder anymore. We climbed that ladder to best the Soviets, then we didn’t just kick that ladder over; we chopped it up for firewood. We won the Cold War, then abandoned the values that made that victory possible. If that’s true, did we really win anything?

Wednesday, March 26, 2014

Country Music Used To Be Dangerous

Late in The Power of Habit, Charles Duhigg extols Outkast’s “Hey Ya.” Originally on track for abject failure, the song baffled record executives, who’d run it through an analytical program and determined it had the makings of a hit. They could tell, because it so completely resembled prior hits that its success should’ve been assured. So radio programmers manipulated playlists to ease “Hey Ya” to number one.

I read that story and thought: holy shit! Computer programs determine which songs deserve hit status based on their parallels to prior hits? Radio programmers hold doors for sponsored songs? No wonder pop music and commercial radio all sound identical! “Hey Ya” is a good song, sure, but did it deserve more attention in 2003 than The Mountain Goats, who’d just released their first major-label album? Probably not.

At a restaurant recently, an employee surprised me by hijacking the in-store music player and switching to the classic country station. After a brief commercial, the very first song was Loretta Lynn’s “One’s On the Way.” A brief Conway Twitty love song followed, then the Statler Brothers’ classic “Bed Of Rose’s.” Beyond the simple acoustic instrumentation and clear vocals, I felt impressed by these songs’ raw, frankly subversive themes.



My dislike for current hit country music is well documented. But hearing these old songs, some of which I hadn’t heard for years, I realized that the old stuff isn’t just more aesthetically pleasing. This older, less commercial country music has a shockingly countercultural message. Loretta Lynn’s outright rejection of popular culture hype, or the Statler Brothers’ exposure of the weakness in conformist morality, really slap mainstream America’s face. Hard.

This outsider ethos contrasts brutally with the insider cliquishness dominating today’s country radio. Lee Brice’s “Parking Lot Party,” Blake Shelton’s “Boys Round Here,” or Toby Keith’s remorselessly pandering “Red Solo Cup” all desperately want to be liked. Sure, they may twang things up, trying to sound more muscular than the Statlers’ deliberate close harmony. But their motivating force is ultimately urban, flaccid, and aggressively slick.



While songs like Merle Haggard’s “Mama Tried,” Hank Williams’ “My Son Calls Another Man Daddy,” and the Carter Family’s “Can the Circle Be Unbroken” come from very different backgrounds, they all represent outsiders struggling for a fair shake. They may lack specific political motivations, like Loretta Lynn’s feminist “The Pill” or Merle Haggard’s (apparently) satirical “Okie From Muskogee,” but there’s more to outsider culture than current events.

However, we’ve seen an inverse relationship between country music’s wealth and its dangerous countercultural thrill. Listening to classic honky-tonk music, the songs of people whom life punched in the heart so often they grew thick scars, it’s tough to miss that they just see life differently than today’s glossy hit-makers. Don Gibson’s “Oh Lonesome Me” is remarkably frank, not just in his heartbreak, but in his alienation from America’s mainstream.



Thus we see two conflicting forces: corporate-owned media, including record labels and radio stations, embrace slick party-time music. Just yesterday, while I was browsing YouTube, a sponsored ad urged me to sample Luke Bryan’s “new spring break anthem,” a sign of his corporate handlers’ desire for insider acceptance. Country music, as an organized force, resembles the fat nerd in high school, just begging for a seat at the lunchroom cool table.

Meanwhile, various artists at various times have resisted the push toward country insidership. In my youth, artists like Steve Earle kicked Nashville’s balls with music that defied white-suit politesse. Besides his Memphis-influenced musical pugnacity, his politics shocked the establishment—in interviews, he recounts his drug history and his close call with Death Row. Earle earned his bullish outsider standing honestly, and at times almost bloodily.



But less obviously, Dwight Yoakam threatened country insiders in ways Earle never could. Lacking Earle’s post-Vietnam swagger, Yoakam actually embraced something far older. He willfully channeled Merle Haggard, Buck Owens, and other Bakersfield icons at a time when “Urban Cowboy” porridge dominated Nashville. Sure, his singles were often straightforward love songs. But we’d grown unaccustomed to straightforward.

Growing up, I didn’t realize how subversive Dwight really was. But by the early 1990s, his music had become so muscular and dark, while rejecting corporate cowardice, that any radio programmer playing his songs was almost performing civil disobedience. Dwight’s bleak, downbeat songs poked Nashville’s enforced happiness in the eye. It’s surprising how often he hit the Top Ten, considering how completely he rejected trendy optimism.



This Spartan outsidership survives, though you undoubtedly have to seek it out. Artists like Robbie Fulks, Neko Case, and the Drive-By Truckers don't generate chart hits. But they exist, and they continue to speak for the audience that made stars of Hank, Johnny, and Lefty Frizzell. For audiences alienated by corporations that pick our hits for us, “Hey Ya” style, seeking these bold outsiders returns real power to real listeners.

Monday, March 24, 2014

Lonely At the Top

Ben Horowitz, The Hard Thing About Hard Things: Building a Business When There Are No Easy Answers

I knew this hybrid memoir and advice manual was different on page 20, when Horowitz writes: “We finished the third quarter of 2000 with $37 million in bookings—not the $100 million that we had forecast.” These numbers are so huge, they sound fictional. His massive inter-business contracts and repeated fiscal brinksmanship resemble Jack Ryan adventures. I wondered how they’d reach neighborhood entrepreneurs seeking $30,000 for rent and payroll.

Then I realized: this business book sounds different because it is different. Some books aim for middle managers, people with some authority but little power, and others offer a moral framework without strategic guidance. Horowitz writes for CEOs, division heads, and other top-rank executives who make powerful decisions in essential isolation. Horowitz’s intended audience has probably read innumerable books about how business should work; he illustrates how business really works.

That’s good and bad. CEOs, venture capital entrepreneurs, and other soaring-eagle outliers are probably an underserved market. Middle managers generally have in-house mentors and have so many books written for them, they could get bulk-buying discounts at Books-A-Million. CEOs frequently have to re-invent the wheel, because only a handful ever exist at Horowitz’s level. Horowitz steps into the mentor role, dispensing hard-won advice when every decision costs millions of dollars.

But CEOs at Horowitz’s level remain rare for good reason. When he describes selling his corporation to a competitor, but retaining intellectual property rights, which he leases out for $30 million annually, he clearly operates a business model that only functions among the One Percent. Could you sell anything you made, but still own it, and license it back to your buyer? Unless you’re a software cartel, everybody knows your answer.

Horowitz goes, with shocking haste, from saying something reasonable and necessary, to something so frankly stupid, I wonder if he’s listening to himself. For instance, he discusses the CEO’s importance in creating business culture. Managers at all levels, he says, including CEOs, should train their own workers, because hands-on involvement optimizes productivity and employee retention. Essentially, Horowitz wants leaders to lead, not delegate glamorless basics to subordinates. Huzzah!

Then he admits to business models so lopsided, even this English teacher turned forklift driver wondered how he could be so tone-deaf. Horowitz entered the tech startup business after the 2000 NASDAQ collapse Roto-Rootered the tech stock sector. He clearly thinks this makes him a bold maverick. Maybe. But his company, Loudcloud, made only one product, which only international mega-corporations and governments could afford. That’s a weak foundation for an IPO.

Horowitz repeatedly discusses needing enterprise capital at Defense Department levels, then admits concentrating hundred-million-dollar ventures on only one customer. When that customer takes a $30 million bath, Horowitz must scramble for replacement finance. And I repeatedly pull my beard, screaming: “Diversify, dammit! You have one product, one customer, and one revenue stream; you’re a deathtrap waiting to happen!”

Seriously. If naifs like me are catcalling your business model, you’re in deep shit.

Perhaps you’ve noticed my praise for Horowitz’s book is vague, sweeping, and global. My condemnation runs very specific and detailed. There’s a reason for that. Horowitz propounds principles I find downright admirable; but when the rubber meets the road, he doesn’t honor his own precepts. His doctrines are bold, jargon-free, and exciting. His actions give me the willies. I don’t know how to reconcile the gap.

British psychologist Kevin Dutton has spent decades studying psychopaths. He notes, despite Hollywood stereotypes, that you’re more likely to meet psychopaths in corporate boardrooms than dark alleys. When Horowitz describes himself and his career, he repeatedly rings bells I recognize from Dutton’s writing. Horowitz comes across as a charming, controlling narcissist who doles out swift punishment, but evades culpability. By Dutton’s standards, Horowitz is a classic psychopath.

Maybe that’s why there’s never been a book quite like this. Maybe corporate leaders like Ben Horowitz truly don’t see life like ordinary humans. Horowitz verbally advocates putting people first, which I applaud. Then he makes others eat the consequences of his actions. He extols swift, decisive actions, including layoffs, to avoid Dunder Mifflin-ish rumor-mongering and status games. Then he describes scattering pink slips like Halloween candy.

I tried my best to like this book. Horowitz hides little moments of surprising candor and self-awareness like Easter eggs, and I briefly suspect he understands something other business writers miss. Then he apparently fails to notice the gaps between his precepts and his actions, or says something that makes me want a shower. Sadly, I think the rich are just different.

Friday, March 21, 2014

My Lament For Fred Phelps

Jeff Chu, a gay Christian journalist, interviewed Pastor Fred Phelps some years before his recent death. In his report, Chu describes a man accustomed to media harangues, who started out profoundly defensive, expecting to get attacked. But when Phelps realized Chu, who was forthright about both his faith and his sexuality, had no malice in his interview, Phelps loosened up. Chu describes him as warm and grandfatherly.

But besides how cordial and gracious he found Phelps, Chu also records surprise at something most Americans might find astonishing: in pride of place in his church office, Phelps displayed a personalized award he’d received from a regional chapter of the NAACP. Before Westboro Baptist Church became notorious for its anti-gay vitriol, it was deeply involved in anti-racism and civil rights. Phelps’ brand of religious literalism, apparently, defies pat categorization.

Most Americans, of any religious or non-religious inclination, might feel shocked by this revelation. I certainly was, despite knowing that Charlton Heston, renowned Republican activist and NRA firebrand, was formerly famous for marching shoulder-to-shoulder with Dr. Martin Luther King, Jr. Media portrayals of outspoken activists frequently circumscribe their opinions, reducing them to their most odious or least palatable statements.

Phelps himself didn’t help his isolation. He bifurcated his public life: his nuanced sermons and diverse awards remained behind church walls, while his interactions with the larger world often turned on vulgar language, ad hominem attacks, and shouting. Phelps, and his small but dedicated congregation, probably saw his confrontational style as akin to Daniel’s before Nebuchadnezzar. Outsiders saw him photographed waving nasty signs, mouth frozen open mid-scream.


Westboro Baptist Church existed for over thirty-five years before Phelps’ loathsome “God Hates Fags” campaign commenced. They expressed an iconoclastic Calvinist theology that alienated them from other churches—WBC began as a satellite congregation of another Topeka church, but Phelps broke ties shortly after it opened its doors. But despite his liturgical lonerism, Phelps formerly forged fruitful, even loving relationships with outside secular groups.

WBC was never large. By its own numbers, membership hovered around forty, mostly related to Phelps by blood or marriage. Chu reports meeting a few converts, including one who first attended hoping to film an anti-Phelps documentary, but was persuaded by Phelps’ sincere scriptural literalism. The congregation subsidized its extensive international protest tours largely through lucrative anti-government lawsuits, which netted some large paydays.

The transition from anti-racist progressive to anti-gay bombthrower boggles the mind. Having once stood on history’s winning side, Phelps later tried to freeze public morals in place, an essential reversal. He even registered his congregation’s web URL as godhatesfags.com. In his youth, Phelps reached across racial lines to help upend unjust power structures and spread justice. In old age, Phelps reinforced crumbling hierarchies and defended structures of privilege.

So what changed?

Fred Phelps hardly invented Biblical literalism. Many theologians through the ages have had diverse opinions on what it means to take the Bible literally, even though, as Nadia Bolz-Weber notes, it’s physically impossible to obey every Biblical dictum. Still, Phelps believed that upholding certain moral stipulations took priority. Reading the Levitical law or the Book of Ruth, Phelps found anti-racist morals. The same law called homosexuality “an abomination.”

Thus, Phelps’ opinions were theologically consistent. The way they cut across secular political alliances didn’t matter. Phelps wanted God’s approval, not humankind’s. Yet his latter career’s extreme verbal violence shows inconsistency. He stopped talking to anybody else. In his anti-racism, he allied with the NAACP. In his anti-gay activism, he flew solo. He became the sole arbiter of God’s will. Like Jim Jones or David Koresh, he became his own religious idol.

Six months before Phelps’ death, Westboro Baptist Church, the congregation he founded, formally excommunicated him for suggesting that their protests soften their brutally confrontational tone. Not that he wanted them to stop protesting; evidence suggests Phelps’ fundamental views remained unchanged. He just couldn’t remain angry that close to the Pearly Gates. The all-male board that excommunicated its own pastor had six members. Four were Phelps’ sons and grandsons.

The monster Phelps created ultimately devoured him.

If Fred Phelps were the mere caricature media portrayals have shown, it would be easy to piss on his grave. Before reading Jeff Chu, I’d have forgotten Phelps before he was cold. But one brief interview forced me to evaluate my prejudices. We mustn’t condemn Phelps on his passing, or we’ll become his equals. Like the Pharisees, Fred Phelps self-righteously damned perfect strangers. To save ourselves, we must first know our neighbors.

Wednesday, March 19, 2014

Neil DeGrasse Tyson Is Wrong

Neil DeGrasse Tyson
Periodically, back in my teaching days, I’d perform a simple demonstration: I’d take a piece of chalk, hold it about head-height, and ask my students to speculate what would happen if I let it go. Naturally, everybody knew that it would fall. So I’d ask: “Who would like to write a paper about that?” No takers. Unsurprisingly. Then I’d ask: “Now who’d like to write a paper about why this chalk would fall?”

Sometimes I’d get takers right away. Often, though, my students would require some prompting. When I’d ask them what Newton said about how gravity works, what current science declares about gravity, or about gravity’s place in a still-hypothetical “theory of everything,” they’d realize how little they understand about something as seemingly obvious and comprehensible as, well, gravity. Such a fundamental force remains poorly understood by even our brightest minds.
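For the record, here’s the textbook answer, Newton’s inverse-square law. It predicts exactly how fast the chalk will fall while saying nothing about why masses attract each other in the first place (the figures below are the standard rounded values for Earth’s mass and radius):

$$F = G\,\frac{m_1 m_2}{r^2}, \qquad g = \frac{G\,M_{\text{Earth}}}{R_{\text{Earth}}^{2}} \approx \frac{(6.67\times10^{-11})(5.97\times10^{24})}{(6.37\times10^{6})^{2}} \approx 9.8\ \text{m/s}^2$$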

I greatly admire American astrophysicist and media personality Neil DeGrasse Tyson. Unlike my college science instructors, Tyson is affable and telegenic, a master communicator who makes abstruse scientific concepts comprehensible to mass audiences. That’s why it worries me, in his current media tour promoting his reboot of Carl Sagan’s classic TV series Cosmos, that Tyson tells interviewers science is beyond debate. Because that’s just not true.

Let me define my point. Like Tyson, I don’t believe CNN and other media should permit “flat earthers” and seven-day creationists to present their anti-empirical positions as essentially equal to science. Any scientific position must rely on demonstrable evidence and, religious as I am, I deny that citing Genesis constitutes evidence. Ken Ham and his various cohorts who willfully deny confirmable evidence are, to pinch some technical terminology, nutcakes.

That said, while we can clearly spot what isn’t science, we have a harder time assessing what science actually is. Isaac Newton believed science relied upon experiment. The apocryphal story of his apple shows what that means: he observed a phenomenon, described a mechanism that explained that phenomenon, then devised experiments to test that mechanism. Because the mechanism accurately predicted experimental outcomes, he considered the mechanism proven.

Richard Feynman
When Tyson speaks of experiments and their results, he’s describing the Newtonian process. But he overlooks something important: the concept of gravity made Newton squeamish. What seems obvious to us now, because we’ve grown up with the idea, was so controversial that even its own inventor conceded that he’d found a sketchy preliminary conclusion, and he or somebody would have to devise a better, more seamless explanation for a strange and inexplicable phenomenon.

Thinkers like Leonardo and Galileo anticipated Newton’s precepts about experiment. The (fictional) tale of Galileo dropping different-sized balls off the Leaning Tower of Pisa demonstrates this principle. But not everyone agrees. Albert Einstein had no patience for experiment. He believed higher math accurately described the universe, so he ran equations and, if the math held, he believed his principles proven. Demonstration was mere gravy for the plebs.

This is no small difference. Physics has made great leaps in recent decades, and Tyson derives the physical cosmology he describes on TV from current physics. But concepts like string theory and supersymmetry remain hotly debated in scientific circles because we only have math to go on, and while multiple equations prove sound and self-sufficient, they’re mutually contradictory. The theories hold up internally, but they can’t all be right.

And none has accurately predicted any experimental outcomes in thirty years.

Is science, then, what we produce when we test hypotheses and the outcomes confirm our predictions? Or is it the math that predicts the outcomes? And how do we reconcile irreconcilable outcomes? I contend that it is these debates, and not obvious physical facts, that constitute science. My students have no interest in writing about something obviously true. If I drop the chalk, it will fall. But as soon as I show them why the explanation is deeply controversial, they become interested.

Rhetorician Gerald Graff writes that “uncontroversially true statements are by definition inarguable and therefore not worth making, at least not as an essay’s main thesis.” This echoes what physicist Richard Feynman wrote about why we perform science: because the outcomes are in doubt. Once we resolve the controversy, the science stops. Obviously true ideas, like gravity, may be foundations for future science, but they are not science themselves.

Surely a decorated scientist like Tyson recognizes the importance of controversy in scientific thinking. This doesn’t mean journalists should give alien theorist Harold Ickes or climate denier Noel Sheppard equal time. But trying to deny the importance of debate makes science something it is not. And that premise, that controversy drives science, contradicts what Tyson has said in recent high-profile interviews. Frankly, this makes me sad.


See Also:
The Perils of Unsanctioned Thought in School

Friday, March 14, 2014

If War Is the Answer, What Was the Question?

Christopher Coker, Can War Be Eliminated?
The war is not meant to be won, it is meant to be continuous. Hierarchical society is only possible on the basis of poverty and ignorance.... In principle the war effort is always planned to keep society on the brink of starvation. The war is waged by the ruling group against its own subjects and its object is not the victory over either Eurasia or Eastasia, but to keep the very structure of society intact.
            —George Orwell
Like other war-weary eras before ours, we’ve begun seeking alternatives to violence to solve our global problems. And like prior eras, we’ve begun realizing, however dimly, that alternatives aren’t exactly forthcoming. Christopher Coker, professor of International Relations at the London School of Economics, seems a likely candidate to ruminate on humanity’s future military options. But speaking as a guy who’s marched for peace, I find his prognosis rather worrisome.

Briefly, Coker answers his title question early, and often: war persists because war makes us human. I repeat myself, as Coker does: war makes us human. Seriously. Screw art, philosophy, science, religion, industry, or even cuisine. We become human by killing others into agreement. This argument might’ve needed less defense (though probably more than Coker offers) before two world wars and the spectre of nuclear extinction reframed the debate.

Don’t mistake me: Coker isn’t some crypto-fascist warmonger somehow immune to the Twentieth Century’s lingering lessons. He adroitly demonstrates how anti-war advocacy has produced slovenly thinking, particularly among New Agers and similar utopians. Abjuring war will require radically transforming human global politics and public morals, which will come only with great difficulty, even with violence. Global disarmament isn’t on our horizon. War will remain common for now, because it’s familiar.

Coker has many valid points. He astutely describes how war reproduces itself, through myths of valor and in-group identity, in human culture. And he rightly faults war’s opponents for failing to define peace as anything besides “not war.” If that’s all peace is, then peace cannot exist without wars to oppose. But Coker’s inarguably accurate points don’t excuse strange, overstated assertions that dedicated newshounds and part-time Quakers could dismantle.

Even in the very early pages, Coker makes sweeping, easily refuted errors of fact. For instance, pitching war as an ever-evolving force, Coker cites UN missions to Congo encountering rape as a “new” weapon in 2010. But Newsweek reported on Bosnian military rape tactics in 1993, and Edwidge Danticat described rape as a weapon of Haitian civil repression even earlier. One could perhaps cite Vikings as pioneers of militarized rape.

Likewise, Coker quotes Edward Luttwak quoting the old maxim: “If you want peace, prepare for war; if you actively want war, disarm yourself and then you’ll get it.” One wonders, then, why nobody attacks Costa Rica, which abolished its army in 1949. Costa Rica is so peaceful, the Organization of American States (OAS) centers its Inter-American Court of Human Rights there. Likewise, Panama and Haiti disbanded their armies, and military coups mysteriously ceased.

Coker might counter that smaller countries enjoy American and international military defense, and there’s something to that. But William Blum observes that America has been involved in a shooting conflict with someone, somewhere, continuously, since 1946; we took a brief breather after WWII and dove back in. Advanced civilizations cannot keep large standing armies, with expensive military technology, and not use them. Idle armies get rebellious.

Throughout, Coker repeatedly declares that because war exists, war should exist, QED. He asserts this in ways great and small, correlating it with human evolution, societal norms, and religious dogma. (Coker seems strangely obsessed with religion. The Prince of Peace might take issue.) Logicians call this approach “the naturalistic fallacy,” assuming that whatever exists is, ipso facto, good, or anyway normative. Tell that to land mine amputees.

I could continue, but laundry-listing Coker’s logical omissions gets wordy. I could scarcely read two pages in this mercifully brief monograph without encountering something so intellectually unsteady, it felt disrespectful. Coker never subjects his assertions to evidentiary testing; he rejects what Peter Elbow calls “The Believing Game,” never assessing his ideas by viewing them from the opposite perspective. This leaves his thesis appallingly vulnerable to frankly rudimentary counterargument.

As I write, world powers stand poised before a possible second Crimean War. Watching Vladimir Putin bait NATO into an unnecessary battle nobody could possibly win, I have difficulty believing this great global pissing contest is merely, as Coker asserts, “a product of the social complexity of life.” The exigencies of unfolding history, unencumbered by faux Darwinian jargon, conspire to spit in Christopher Coker’s eye.

Coker’s title implies he’ll investigate the debates surrounding a powerful, world-defining issue. But he essentially answers his own question in the preface: “No.” Then he spends about 110 pages (plus back matter) explaining why there is no debate. War is important enough to justify a broader, more even-handed discussion. Coker instead proffers a manifesto so lopsided and easily rebutted, informed readers will find it insulting.

Wednesday, March 12, 2014

Success—an Owner's Manual

1001 Books To Read Before Your Kindle Battery Dies, Part Thirty
Malcolm Gladwell, Outliers: The Story of Success, and
Charles Duhigg, The Power of Habit: Why We Do What We Do in Life and Business


While American schools invest ever more deeply in standardized tests and “accountability,” regular citizens recognize that material success doesn’t come from consensus benchmarks. But where, then, does success originate? While no single answer encompasses every success story, Malcolm Gladwell and Charles Duhigg, if read together, suggest an intriguing pattern which intrepid apprentices could apply to their own self-improvement.

Gladwell, a longtime New Yorker veteran, gathers diverse success (and failure!) stories and reverse-engineers their patterns to find what traits flourishing professionals share. From immigrant Jewish garment manufacturers to Asian rice farmers to the Beatles, Gladwell finds remarkable degrees of commonality. Readers may be surprised to discover what they share with Bill Gates, Robert Oppenheimer, and Michael Jordan.

First, Gladwell demonstrates that innate talent and natural genius have little bearing on success. Many highly gifted prodigies languish in obscurity, because life doesn’t reward inwardness. Success tends instead to cluster around certain traits, many learned in childhood or bequeathed to us by culture, others acquired slowly through effort and struggle. Overnight successes don’t exist. Victory comes to those most prepared, whether by nurture or by determination.

Some success traits originate from individuals. Gladwell explains “The 10,000 Hour Rule,” meaning those who practice exhaustively, for long periods of time, become outstanding in their fields. Likewise, despite longstanding patterns of childhood education, those who emerge successful from their schooling generally resist institutional pressures, such as (gasp!) summer vacation. It’s possible for determined individuals to supervise their own progress.

But we’re all beholden to forces we cannot control. Certain traditional cultures, like those found in the Great Smoky Mountains, reward anger and vengefulness, while other cultures, like Asian rice-farming societies, favor patience and humility. Even something as seemingly insignificant as your birthday, Gladwell demonstrates, can have sweeping consequences for long-term success. Bucking the influences that surround us daily requires more than mere grit.

That’s where the New York Times’ Charles Duhigg comes in.

While Gladwell describes the external factors that foster success, Duhigg parses the intricate psychology. Humans rely on habitual actions to function. We couldn’t traverse daily life if we needed to contemplate every option. Habits finesse us past ordinary moments, letting us concentrate our brains on unexpected or unusual circumstances. But how can we change when habits prove injurious or counterproductive?

Emerging neuroscience helps us understand how habits form. Generally, they emerge from a cycle of cue, routine, and reward that Duhigg calls the Habit Loop. But while this seems obvious, our Habit Loops rely on complex internal motivations often clouded by our conscious minds. The forces that drive us vanish behind the stories we tell about what forces ought to drive us. Identifying our habits requires unaccustomed levels of studious honesty.

Thus, science describes patterns that apply to everybody equally; but our individual circumstances cause universal effects to manifest in unique ways. This means our habits aren’t deterministic. We can change seemingly hardened behaviors, if we’re willing to examine ourselves forthrightly. Understanding our cravings and influences lets us revise our choices. Understanding society’s influences lets us start revolutions.

Duhigg describes habits in areas we wouldn’t normally associate with neurology. Besides individual habits, organizational habits drive complex groups, like companies and governments, while societal habits percolate throughout culture, steering us as invisibly as water steers fish. All these habit structures arise in similar ways (organizations may lack nervous systems, but leadership culture behaves similarly). And all these habits are liable to revision.

Like Gladwell, Duhigg is no high-minded aesthete, discussing scientific precepts in isolation. Both authors doggedly apply their sometimes-cryptic science to practical, lived problems. Reading Gladwell’s “Ethnic Theory of Plane Crashes” or Duhigg’s explanation of how discount stores manipulate our habits may seem depressing in the near term. But both offer workable approaches to turning powerful forces toward our ultimate benefit.

Both Gladwell and Duhigg rely on difficult scholarly sources, and their bibliographies cite authors us peasants couldn’t possibly understand. But as seasoned journalists, they translate very difficult concepts into vernacular English, guiding generalist readers through thorny disciplines. Though they deal in principles already well-known to scientists and other advanced thinkers, they bring often-abstruse knowledge into common Anglo-American discourse.

I recommend reading these two books together. Though written and published separately, the overlap of their themes is remarkable. Gladwell describes what success looks like, not just in dollar signs, but in individual sacrifice and social impact. Duhigg provides concise, scientific steps for translating our longings into meaningful action. Taken together, these books describe real, viable plans for turning your life into a world-changing success.

Monday, March 10, 2014

Queens of the Bronze Age

Anne Fortier, The Lost Sisterhood: A Novel

Sophomore novelist Anne Fortier does something fiction writers seldom do: she states her thesis boldly near the beginning. Her mouthpiece character, Oxford philologist Diana Morgan, gives a speech about the mythic Amazons, which we jump into for her final summation. She spares us the academic details, gifting us with this straightforward closing nugget:
“[T]he knowledge that these bloodthirsty female warriors were pure fiction did not stop our writers from using them in cautionary tales about the dangers of unbridled female liberty.”
Except Diana knows the truth. The Amazons weren’t fiction. She is descended from a line of warrior women, who’ve somehow kept their secrets unchanged since time immemorial. When a mysterious benefactor gives Diana the opportunity to prove what she secretly already knows, everything changes around her.

Anne Fortier might’ve crafted an interesting woman-driven historical fantasy if she’d recognized her limits. Sadly, she apparently takes her mythological conceits seriously, and wants to strike a blow for sisterhood and freedom. Even then, she might’ve produced merely a windy didactic novel for fellow concerned eggheads. Instead, she requires us to disregard everything we know about archaeology, mythology, human societies, and academia.

Fortier’s present-day frame story scavenges images from various movies to add action to an essentially talky exposition. Diana’s Oxford uncannily resembles Hogwarts, while her excavations of bronze-age monuments channel Indiana Jones. Come on, Oxford philologists write Hobbit novels; they don’t rappel into archaeological sites under cover of moonlight!

Meanwhile, Diana gets shadowed by an enigmatic stranger, Rick Barrán, whom she claims to despise, though she describes him in wholly sensual terms. Rick so completely resembles Clive Cussler’s famed adventurer, Dirk Pitt, that I envision him played by Matthew McConaughey. Perhaps Fortier, a sometime film professional, should shut off the DVD player occasionally.

But Fortier’s parallel narrative has the real human elements her frame story lacks. When ancient huntress Myrina finds herself exiled from her Bronze Age village, the priestesses of the Moon Goddess recognize her martial prowess and welcome her (mostly) warmly. But Greeks sack the temple, slaughter the priestesses, and enslave the survivors. Myrina refuses to die, and her pursuit turns the sisterhood into a legendary Tribe of Women.

It feels like Fortier wrote two different novels, using two different templates. Her contemporary novel recycles traditional “literary romance” components. Diana emphasizes early on her own physical beauty and her bad luck with men. She then evaluates other characters by their appearance, and we understand how virtuous each person is by how appealing Diana finds them. This slows the story way down.

And what a story. Though a philologist, Diana never does anything philological. When confronted with a lost ancient language, she simply consults Granny’s old handwritten glossary. Then she enters archeological digs via subterfuge, ducks conspiracies masterminded by international billionaire anarchists, and escapes ancient sites one step ahead of very modern explosions.

One scene was so blatant, I couldn’t help recalling the poster art from Robert Zemeckis’ Romancing the Stone.

Fortier’s other novel features a heroine, Myrina, whose entire milieu appears designed to break her will. But Myrina chooses her friends wisely, accepts fights but doesn’t go looking for them, and refuses to defer to the Patriarchy. When men try to break her sisterhood, and enslave her actual sister, she steals a leaky boat, raises an army of women, and crosses the ancient Mediterranean in pursuit of justice.

I wanted to read more of this other novel. In Myrina, Fortier has created a character of strength and determination, a character who will not break just because patriarchal Greeks want to break her. Myrina is a much more interesting character than Diana Morgan, who spends so much time expounding her theories that one suspects she’s basically Fortier’s authorial sock puppet.

In Diana, Fortier has committed the Isaac Asimov Error: she’s created a character not to do something, not to face challenges and undertake a personal journey, but to expound the author’s point. Though Diana crisscrosses the ancient world, making connections and rebuilding a history lost beneath time and chauvinism, by the end, she’s essentially changed very little. Her entire story serves to vindicate the point she makes, literally, on page two.

Well, Dr. Asimov wasn’t always constrained by his Error. He created Bayta Darell, one of sci-fi’s most compelling female characters. And Fortier has created Myrina, easily Bayta’s equal. If only she hadn’t also concocted an intrusive, pseudoscientific goulash that keeps interrupting the real story, she might’ve written a great novel. Sadly, Fortier, like Dr. Asimov, keeps standing in her own authorial way.

Friday, March 7, 2014

Paging Doctor Asimov, Stat!

Andreas Eschbach, Lord of All Things

Growing up poor in the shadow of Tokyo’s glimmering prosperity, Hiroshi Kato has a vision. He foresees a world where technology banishes poverty forever. Humanity’s established order, including his best friend, a wealthy ambassador’s daughter, mocks Hiroshi’s vision. But in an epic spanning decades and crisscrossing the globe, Hiroshi Kato stops at nothing to realize his dream. If that means remaking human civilization in Hiroshi’s image, so be it.

Award-winning German author Andreas Eschbach crafts a vision American science fiction readers will find numbingly familiar. Eschbach’s themes precisely mimic those in Isaac Asimov’s classic I, Robot, though at much greater length. Eschbach requires a cast of thousands, globetrotting narratives, and decades upon decades, to retell Asimov’s story. At least Asimov offset his talky, intellectually dense novels with rapid pace and action-driven scenes.

Like Doctor Asimov, Eschbach uses characters and situations to expound authorial principles. Character dialog resembles academic discourse, even from children’s mouths, because they’re less human beings than personifications of the author’s message. Yet Asimov redeemed his story through concision, running about one-third Eschbach’s glacial Teutonic length. Eschbach might have saved this massive, brick-like book if, like Doctor Asimov, he’d excised everything that didn’t serve his story. Which is quite a lot.

Moreover, it’s hard to swallow Eschbach’s message when it contradicts everything an informed audience already knows. Technology isn’t morally neutral. Hiroshi, and perhaps Eschbach through him, lives in a world untouched by environmental catastrophe; a world free from Wendell Berry or James Howard Kunstler; a world where more technology can fix problems existing technology has created. In real life, as in casinos, doubling down is a stupid strategy.

Asimov published I, Robot in 1950, when technology’s potential seemed limitless. Machines would replace human labor, making every human necessity free, liberating us for lives of intellectual fulfillment. I, Robot is an excellent book (though a lousy movie). But sixty-four years later, our understanding of technology’s social impacts has evolved considerably. Eschbach’s expectations nevertheless remain mired in Jet Age utopianism, making grandiose promises already three generations outdated.

Eschbach divides humanity into two groups: those who support Hiroshi’s dreams, and tragicomic straw men Hiroshi demolishes effortlessly. Besides a handful of close friends, every character has one or, at most, two traits, and exists to enact an allegorical role in Hiroshi’s morality play. The ambassador’s migraine-prone wife, the MIT professor who can’t successfully debate an undergraduate, even Hiroshi’s own mother, all provide colorless background chatter while Hiroshi redeems humankind.

Meanwhile, as the book advances, Hiroshi’s human community becomes increasingly one-dimensional while his technological dreams grow ever more detailed and specific. What begins as a sweeping desire to eliminate poverty bogs down in abstruse descriptions of robotics, nanotechnology, and other sci-fi buzzwords. Technology, for Hiroshi, is a Platonic ideal, free from human interference. Hiroshi understands humans so vaguely, and technology so precisely, that I wonder whether Hiroshi is perhaps autistic.

It’s dangerous to assume a character represents the author’s message. Authors sometimes foreground characters who exist to get demolished, or who represent societal failure; consider everything cyberpunk pioneer William Gibson ever wrote. But because Hiroshi’s vision brooks no argument, and every circumstance eventually breaks Hiroshi’s way, I suspect Eschbach at least doesn’t dispute Hiroshi’s technological romanticism. I work in a factory, so I’ll say it plainly: technology makes work harder, not easier.

When I reviewed Stephen Kiernan’s The Curiosity, which made similar attempts at scientific moralism without relying on empirical science, several people criticized my review, saying, “it’s just science fiction.” So I ask: when did science fiction become unmoored from science? We should evaluate this novel’s technological dictums based on what we know of science, including human psychology and technology’s social history. Doctor Asimov would have.

Essentially, this book has the same problem Patricia Cori had earlier this week. Our viewpoint protagonist, and presumably our author too, has an idea early, and life happily conspires to see this idea germinate. Hiroshi’s ideas don’t endure any tests, don’t get revised by life, and suffer only token resistance from opponents so trivial, he demolishes them effortlessly. In the final analysis, our hero isn’t so much triumphant as vindicated.

Every time I set this book down, picking it up again became increasingly laborious. While I support Hiroshi’s economic egalitarian dreams, he bases his dreams on naive faith in eternal human progress, unencumbered by boring Newtonian physics in a finite world. This novel brooks no dissent, bends all characters to serve its protagonist’s themes, and plays to an inevitable end. You deserve a book that respects your valuable reading time.

Wednesday, March 5, 2014

A New Classic Philosophy of Philosophy

Rebecca Newberger Goldstein, Plato at the Googleplex: Why Philosophy Won't Go Away

Modern Euro-Americans can’t venture outdoors or watch television without encountering some concept which began with Plato. Politics? Plato wrote entire books on public service and leadership. Art? Plato couldn’t restrain himself from voicing opinions on artists’ responsibilities and role. Science? Okay, he didn’t invent experimental technique, but he pioneered ideas in physical cosmology. Yet moderns like us are monumentally resistant to Plato, at least directly. Rebecca Newberger Goldstein wonders why.

Philosophy, as we understand the word, begins with one fundamental question: “Why?” Why do we consider certain ideas obvious and true, rather than their opposite? Why do we do our jobs specific ways? Why do we spend our time on such-and-such? Plato’s mentor, the semi-legendary Socrates, wandered ancient Athens, asking politicians and scholars and tradesmen questions. Whatever somebody considered self-evident, whatever certainties left citizens numb, Socrates punctured with simple dialog.

Notwithstanding his foundational position, Plato did not invent philosophy. The process began with nigh-forgotten Ionian scholars ruminating about what we’d now call science. Their speculative cosmology, roughly on par with seven-day creationism, leaves Thales and Anaximander mere relics today. Plato shifted philosophy’s focus off physical science and onto human spirits. He initiated questions about education, politics, and morals that still pay off in modern schools, elections, and daily life.

Despite this persistence, not everyone agrees Plato remains relevant. Goldstein quotes people she calls “philosophy jeerers” on why changing times have (putatively) rendered conventional philosophy obsolete. But using their own words, she demonstrates the circularity of their arguments, and how attempts to discredit classical philosophy are themselves ultimately philosophical. She concedes that not everything Plato records remains relevant. But we can only understand that by performing legitimate Platonic philosophy.

Goldstein, a humanist thinker who has written award-winning popular books on Kurt Gödel and Baruch Spinoza, brings exhaustive familiarity with philosophic history to her inquiries. She can correlate Enlightenment-era innovations, such as individual rights, which we consider commonplace, with Plato’s thoughts. The correspondence may surprise us. Plato had remarkably progressive ideas about, say, women’s rights and governance. But he didn’t believe we existed primarily as individuals; the notion of “rights” might have scandalized him thoroughly.

These discrepancies are themselves fascinating. Goldstein imagines Plato wandering modern American settings, encountering public thinkers and social pathfinders, testing contemporary ideas against pure reason. Platonic philosophy allows us wide latitude, Goldstein asserts. Unlike Enlightenment thinkers, Plato brings few presuppositions to his thought. He has principles, but remarkably few ironclad demands. For us to test ideas like Plato, we need only ask one important question: can this idea withstand its opposite?

Plato’s works make for difficult reading. Even very dedicated audiences struggle with his frequent, densely mystical asides. I personally enjoyed Meno, but found Phaedrus almost unreadable. That makes authors like Goldstein profoundly valuable, translating Plato’s millennia-old ruminations into modern English. Because if Goldstein’s right, and Plato remains relevant to modern life, we need a contemporary Virgil to guide us through the dense thicket of his prose.

Some reviewers will certainly misunderstand Goldstein’s intentions. One would-be critic anchored his entire review on one line, around the one-third mark, where Goldstein’s viewpoint character disparages Amazon reviewers (don’t look at me that way). But Goldstein herself isn’t saying that. Like Plato, Goldstein uses Straw Man arguers who are always wrong. Goldstein, like Plato, requires readers to pay attention, separating intermediate arguments from the final take-home lesson.

Goldstein’s oblique loyalties, combined with her extremely dense style, often make for slow, effortful reading. Her chapters average over forty pages apiece, though some are much, much longer, and without natural integrated pause points, her polysyllabic prose is monolithically imposing. This ain’t beach reading, folks. Schedule generous sit-down time before reading, because Goldstein, like Plato, won’t let you consume her ideas with only half a brain.

Worse, Goldstein forbids readers to reach for pat answers. Humans often seek to resolve questions neatly, excluding ambiguity and doubt. But Goldstein, like Plato, often ends debates with key issues still unresolved. We’re more confused, not less, though perhaps confused in more sophisticated, productive ways. This can feel painful; but Goldstein notes early: “Philosophical thinking that doesn’t do violence to one’s settled mind is no philosophical thinking at all.”

Readers willing to honor Goldstein’s stipulations will find, here, an engaging précis of Platonic thought, and a persuasive justification for why Plato continues to matter. Her two-pronged approach lets us understand Plato’s techniques while also witnessing his mental processes in action. Because Plato’s questions on topics like virtue, education, and good governance remain alive today, Goldstein gives modern readers new opportunities to join this ancient debate.

On a similar theme:
Thinking About Thinking is Harder Than You Think

Monday, March 3, 2014

So Long, and Thanks For All the Fibs

Patricia Cori, The Emissary: A Novel

Today’s vocabulary word is “Mary Sue.” This term originated in Star Trek fan communities, describing authors who insert themselves as protagonists in fan fiction. Over time, the meaning has broadened. Authors needn’t insert themselves into existing franchises to get called Mary Sue (or Marty Stu) these days. Consider Patricia Cori’s new, appallingly self-aggrandizing mess.

Psychic dilettante Jamie Hastings has a moment on a New Zealand beach. Her soul touches a dying whale, and she becomes an emissary for Earth’s suffering biosphere. A lifelong Talent, she now has her mission. So naturally she does what any psychic environmentalist would: accepts a commission from a Texas industrialist to dowse for offshore petroleum in the northern Pacific. Wait, what?

Without even knowing Cori, one suspects Jamie is her authorial avatar because no challenge seriously jeopardizes her. She handles desert isolation, Houston oil money luxury, and life at sea with equal aplomb. She successfully out-argues a seasoned capitalist who’s accustomed to snow-jobbing his corporate board. Plus, everyone she meets compulsively comments on her great physical beauty.

Jamie’s résumé is impressive. She’s dowsed for water in the Australian outback, conducted sweeping scientific research, and helped the LAPD solve nearly sixty murders. (Never mind that no psychic has ever provided criminally actionable evidence in American history.) Everybody from California’s governor to Oprah Winfrey talks up her talents. Everyone holds doors for her. It’s like she can’t fail.

Plus, hell, she’s a psychic. Considering that Patricia Cori has published several books which she claims she psychically transcribed for various Ascended Masters, spotlighting a psychic scientist naturalist industrialist in her novel makes us realize she’s writing an idealized version of herself. This only makes it more frustrating when Cori treats her literary doppelganger gingerly, like a favored friend.

Roger Ebert once wrote, in panning a movie, that we don’t sympathize with characters when things look easy for them; we sympathize when things look hard. That’s really stuck with me. While good characters always have potential to triumph over adversity, they should really face the risk that they could fail. Even when they have right on their side, we’ll only care if they struggle for that final victory.

Patricia Cori disagrees with Ebert there. At no point in nearly 300 pages did I feel her heroine might collapse. She sweeps into every room, charms or outwits or overpowers everyone who might challenge her, and ultimately (spoiler alert) transcends this mortal coil, becoming an Ascended Master while swimming among the whales. In Cori’s mind, Jamie’s apotheosis is a foregone conclusion.

So essentially, Cori creates a situation based on her own principles, inserts herself into the story, then sweeps a clear path to her counterpart’s ultimate vindication. Heroic characters submit to Jamie; villains just get plowed under. Thus this novel isn’t a story with a moral; it’s Cori’s manifesto for her spiritual-environmental-transcendental vision of pseudoscientific hoodoo. We don’t feel anything for her characters, because she’s too busy telling us how we should feel.

We read novels to feel somehow transformed. Despite what your high school English teacher said, good authors don’t write to “mean something.” Themes and symbols usually arise from authors’ subconscious, and we recognize them only retroactively. When authors start with a message, and ramrod the characters and situations into their own morality play, audiences tend to feel manipulated, and resent it.

This goes double when the protagonist is a blatant Mary Sue. Throughout this novel, Jamie sententiously preaches Cori’s metaphysical message, and while others may muster token resistance, they inevitably fold, usually after only one or two pages. Nothing ever threatens Jamie’s presuppositions. Anyone who doesn’t already share Jamie’s (and presumably Cori’s) message coming in will probably finish this preachy, long-winded book feeling confused, disappointed, and ripped off.

If Cori wants to write a treatise, she should do so. Lee Van Ham did so beautifully, if perhaps with less parapsychology and religion-lite wizardry. Novels may have messages they hope we’ll receive, but characters should always take precedence. Authors should take us on journeys, not propound philosophical stances. If a work has one-to-one correlations, like crossword puzzle clues, it shouldn’t be a novel.

In her official biography, Cori claims she’s “often called a ‘real life Indiana Jones’ by fans and readers around the world.” But this book gives me less of an Indy Jones vibe, and more Deepak Chopra in middle school. Cori remains stuck between genres, and this feels like apprentice-level work. North Atlantic Books, one of America’s top indie publishers, should be above publishing a book like this.