Saturday, December 31, 2011

The Comic Book That Made All the Difference


Duane Pesice is a writer, musician, and self-taught expert on cephalopods. He blogs regularly at PlanetModeran.net.

I got to thinking about the subject of "the book that has most influenced me" and kind of surprised myself. Had to go back a really long way, back to 1966, to find the one book that set me on the path to tomorrow. It wasn't even a book. It was a comic book.

More specifically, it was an issue of Marvel's The Avengers, #33 to be precise. It had most of the recognizable Avengers crew (Thor, Cap, Iron Man), a lot of movement and color, and it had some vaguely scientific stuff that filled my five-year-old brain with wonder (the Serpent Squad, I seem to recall, were the villains). I remember just being "opened up" by that comic in a way that has only happened a couple of times since. I read it in the little drugstore at the corner of 55th and Kildare, on Chicago's South Side. I had allowance left and blew that on the same month's issue of the Fantastic Four (don't remember which).

That led to a seventy-issue run of the Avengers for me (I missed #104 because I was sick for a couple of days. It took me twenty years to find out what happened to Hawkeye.), and a 200-issue run of the FF. The writing, the art, everything just clicked. The stories led me to real stories. I was naturally attracted to Bradbury and was able to collect that first round of Tolkien Ballantine paperbacks. I, Robot was in my bookcase by the time I was eight, the same year I discovered Lovecraft through reading Dr. Strange (who I found out about by reading the current issues list in the middle of each mag, under the fan mail).

My first piece of writing, in 1967, was a comic book. I remember distinctly reacting badly to Gene Colan's art and the villain Stilt-Man in an issue of Daredevil and deciding that I could do better. I had only drawn boats and things before, very childish renderings. I remember drawing the people from TV Guide over and over to get the relative sizes of facial features right, and the development of my villain (an antihero named Chevron after the gas station two blocks away). My first short story starred Cthulhu and Randolph Carter.

Stan Lee name-dropping Harlan Ellison in the letters column provided the final kick toward where I am today. I recognized the name and spent my allowance for two weeks on the two-volume paperback copy of Dangerous Visions. On the way home from buying that set, I found a copy of The Magazine of Fantasy and Science Fiction lying on the sidewalk. It had Cordwainer Smith in it.

By the time I was twelve, I had somewhere around 200 books, having discovered used-book shops, piles and piles of comics (nearly all Marvel with some Gardner Fox stuff), and had written my first novel (fanfic starring Black Bolt and the Inhumans).

It's possible that I haven't grown up much since then, or maybe the circle remains unbroken. Comics and HPL and SF are still my comforts both for reading and writing, and I've just begun work on a webcomic/graphic novel containing all of the above and Hunter Thompson and Elvis to boot. And Groff Conklin and James Baen and Bernie Wrightson and especially Jack "King" Kirby, and Roy Thomas and Stan "The Man" Lee: I can't thank you enough.

Monday, December 26, 2011

Casual Discoveries and Real Learning

To break the monotony, I pull completed fuel managers from the seam setter and put them in various locations on the conveyor belt. Previously I’d placed them largely at random; now I set them in specific locations. All along the far edge of the conveyor, for instance. Along the near edge. Along the middle. And wherever I place them, when they reach the overflow guard at the end, they cluster in the same pattern.

As the conveyor tries to push the cylindrical fuel managers off the end, and the overflow guard prevents that, the units form six-point radial symmetry. Kids who have arranged pennies on a flat tabletop will recognize this pattern: six circles completely surround a seventh of equal size. The central circle has six radial contact points in a symmetrical array. This pattern can expand forever without interruption, as long as the circles remain equal.
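
For anyone who wants the geometry spelled out, here is a rough back-of-the-envelope sketch (the radius r is just my shorthand for the size of one fuel manager, nothing I measured on the line): the centers of the middle circle and any two adjacent neighbors sit at the corners of an equilateral triangle, which is why exactly six fit around one.

```latex
% Sketch: why six equal circles of radius r surround a seventh exactly.
% Touching circles have centers 2r apart, so the central circle's center and
% two adjacent neighbors' centers form an equilateral triangle of side 2r:
\cos\theta \;=\; \frac{(2r)^2 + (2r)^2 - (2r)^2}{2\,(2r)(2r)} \;=\; \frac{1}{2}
\quad\Longrightarrow\quad \theta = 60^\circ,
\qquad \frac{360^\circ}{60^\circ} = 6 \text{ neighbors.}
% Repeating the arrangement forever gives hexagonal packing, whose density,
%   \pi / (2\sqrt{3}) \approx 0.9069,
% is the densest possible for equal circles in the plane.
```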

The fuel managers assume this pattern, no matter where I set them on the conveyor. Once they leave my hands, they face three basic forces: the horizontal impetus that wants to push them off the far end, the overflow guard that will not permit that, and gravity, which ensures they don’t just fly off willy-nilly. These three forces always produce the same results. My attempts to manipulate conditions don’t matter.

I feel the swelling potential of insight challenging factory work’s tedium. If they always assume the same pattern, that must mean something. The laws of thermodynamics say that, in a closed system, components form an entropic jumble. These units would not form such orderly patterns if I dumped them on the floor. But, because the conveyor provides constant inputs, this is not a closed system. Therefore it strives for geometric order.

The fuel managers don’t just sit there, absorbing their three inputs, however. They are tall and cylindrical, with smooth metal shells. As the conveyor’s momentum builds tension with the unmoving guard arm, the fuel managers rotate, accumulating friction that turns into surface heat. Good thing another worker will soon remove these fuel managers from the line, or they might overheat, become warped, then... what?

What would happen if the thin metal cylinders became distorted? Would they warp equally, or would random chance disturb the orderly system? Perhaps entropy would halt my orderly experiment. Like the ocean, a system which seems orderly on a granular level could prove chaotic on a large scale. Is a system truly open if all forces exist equally and behave in unvarying ways? Oh, the questions!

The history of science abounds with stories of profound discoveries arising from mundane events. Consider Newton’s apple tree, or Archimedes and his bath. The myths surrounding Pythagoras describe him making several discoveries while sitting, watching wind tousle the long grass. The intricate patterns produced, which appear random, actually reflect the interaction of simple forces in predictable, but not controllable, ways.

Sort of like my fuel managers on the line.

But modern schoolhouses, like factories and discount stores, tend to be boxy, windowless cells. This reinforces the notion that our discontents can be settled only by passively receiving what this facility peddles, whether a job, a consumer good, or a diploma. Students who engage in open-ended rumination outside the book are dismissed as daydreamers, and some who balk most demonstratively at the structure are medicated until they comply.

The very process that has produced our most profound discoveries could be easily replicated in the school environment. Youth care little for Avogadro’s constant or Planck mass if they read about them in books. Even laboratory experiments seem distant from real life. But if students see these concepts in play, as I saw thermodynamics in my fuel managers, we can kindle real curiosity that they can only sate through deep learning.

Therefore I propose that, if we really want students to learn, we need to get them out of school. Not that we should abolish institutions of learning, but that we should take them outside the inflexible, controlled classroom, and let them see the world around them. Guided opportunities to make discoveries like the one I made on the factory line would inspire growing curiosity and a love of learning.

Our students are not educated because they memorize facts in tables, or perform experiments with known outcomes. They are educated when they have the wherewithal to face unknown, unpredictable situations without fear, and pursue productive goals. We teachers can provide that. We just need to step outside our classrooms and encourage their natural desire to discover.

Saturday, December 24, 2011

A Truth So Vast, Only Magic Could Convey

Edward Blaylock has worked as a parking valet, a stadium vendor, a telemarketer, and a teacher. He writes under a (very thin) pseudonym on his blog, Walking the Earth.

Kevin has asked me to write about a book that has been a powerful influence on me. I could write about the way Have Spacesuit, Will Travel turned me on to science fiction. I could try to explain how Give War a Chance turned me on to political humor. I could wax rhapsodic about The Prophet, or talk about how Games People Play gave me the tools to escape a difficult marriage. There have been so many books; it should be difficult.

I’ve known what book it is almost from the second he asked, though. None of my friends or family will be surprised by my choice, because I’ve been a raving fan of it since I was seven years old. The book is The Fellowship of the Ring.

I love the Fellowship, and by extension the whole Ring Trilogy, with unabashed fervor. The first appearance of the Nazgul still fills me with dread. Gandalf’s stand against the Balrog and Boromir’s death bring tears to my eyes to this day. I get as choked up reading Theoden’s speech at Pelennor as I do hearing Henry V speak of Saint Crispin’s Day. The pictures Tolkien paints are heartachingly beautiful, but that’s not how the books influenced me.

I can say without hesitation that the Ring Trilogy gave me my first instruction in the meaning of moral courage. People who haven’t read the series see it as a trifle, a story about fey Hobbits and twee Elves. What they don’t realize is how much it’s a tale of facing up to real darkness. I started reading the Fellowship for the first time at seven years old, and while an awful lot of it went over my head, I clearly understood the terrifying evil of the Nazgul and the courage that it took Frodo and the other Hobbits to travel through the Shire and beyond. I understood even then the feeling Frodo had of terrible, heavy necessity when he volunteered to be the Ringbearer.

See, evil in the Fellowship is palpable. Evil is terrifying, not only when it’s swathed in yards of black fabric and riding a night-black horse, but when it’s whispering to you that you could make everything right again, if you’d just take this one easy step. Evil corrupts, whether through despair, through fear, through temptation to power. Evil is insidious and unfathomably powerful.

But here’s the other half of the equation: evil is limited. Sauron, the principal force of darkness in Middle Earth, cannot create anything, but only twist other people’s works. More important still, as much as Sauron and all his vast armies inspire terror in others, they themselves are cursed with constant, gnawing fear, the fear that comes of telling too many falsehoods and of having only that strength that one has taken from others.

In the face of evil, the heroes of the story stand up. They face their fear. They stare back into the baleful gaze of Sauron’s burning eye and they defy him. They face terrible trials, and not all of them succeed, but in the end evil is defeated by the concerted actions of good people.

Tolkien stressed over and over again that the Ring Trilogy was not meant as an allegory. He was right to the extent that he didn’t consciously write it as one. Nevertheless, his own Catholic worldview is plain to see. We’re all on a quest to make the world a better place. We have to do so keeping in mind that the end doesn’t justify the means. The road isn’t easy. We will stumble along the way. We’ll need help to get where we need to go. We may have to pay a high price. But in the end, if we listen to the better angels (or wizards) of our nature, we will win. I can’t think of a better way to summarize my own outlook on how to live a good life. That understanding of the struggle, of how hard it can be to face up to temptation and evil, is part of what has drawn me into the Catholic Church as an adult convert. I had always seen those ideas as self-evident because I heard the Ringbearer’s story as a child.

C.S. Lewis wrote the Narnia books as an allegory for children. As I mentioned, Tolkien vehemently denied any allegorical meaning in the Ring Trilogy. All his protestations aside, as a guide to living a moral life I have to say that The Fellowship of the Ring beats Narnia hollow.

Wednesday, December 21, 2011

Why Johnny and Janie Aren't Wrong to Hate School

At work a couple of nights ago, some co-workers were reminiscing about the difficulties they had with math class back in the day. Like many people, these colleagues, one of them in her forties, evaluated the class in retrospect based on whether they had put the learned skills to use. And of course, they have scarcely used any math beyond the basic arithmetic needed to balance their checkbooks.

A few semesters ago, I coached my students to debate the value of a narrow, utilitarian curriculum versus a broad liberal education. The group tasked with supporting the useful education made arguments like “I don’t expect to ever factor polynomials again in my life.” While that’s true, it wasn’t enough to sway my opinion, but the other side couldn’t counter it. In years of schooling, no one had ever explained why we expect students to learn the Three R’s.

In that same debate, the liberal studies advocates fumbled in part because they saw their position in the same pragmatic terms as the more “useful” curriculum. They looked for ways that diverse exposure introduces students to opportunities and experiences that could open new career tracks. But if either position can only offer practical options, students are clearly better served by a more specialized curriculum.

Thinking back, I cannot recall even one teacher saying I needed to learn higher math, science, or the humanities because I would ever use the skills. My algebra teacher admitted we were in her class for idealistic purposes. My literature teachers asked us to please see past supposed utility, and embrace our studies to expand our minds. Yet I recall struggling to understand why I should spend time on such useless pursuits.

I have written about how new teachers, good people in general, hit the classroom with only a sketchy comprehension of their subject. They believe in education, yet teacher training generally emphasizes classroom management over subject specialization. And considering that American teachers are required to have at least a bachelor’s degree, it seems strange that no one ever explains to them how to tie the curriculum together into a unified whole.

For those who, like me, look back ruefully on precalculus in a windowless, fluorescent-lighted room, let me remind you why we offer students a diverse liberal education. When you learn something new, you don’t just learn facts and processes. You learn new ways to make connections. You learn modes of inquiry. You learn how to learn.

This makes specialized career training seem downright frivolous. Most people would learn their careers faster, and better, through a field apprenticeship. Yet employers don’t want new hires who only know their job. Such people can’t respond to unexpected circumstances. Minor setbacks become catastrophic for workers who only know through the prism of their jobs. Bosses want workers who can learn, adapt, and think.

Nobody explained this to me until I stood in front of the classroom myself. I didn’t have the encouragement to look at my education from an integrated perspective until I was in graduate school. But why should only those who go that far have that opportunity? If I had teachers in high school who saw their role that way, they certainly never said so. Indeed, the various disciplines demonstrated what I could only call rivalry, which sometimes turned bitter.

When I dissected that frog in seventh grade, somebody should have asked harder questions. After all, few students will ever again need to understand amphibian physiology. That frog’s exposed viscera matter little. Questions about how my body works, how I can respect its complexity of design, and how I can extend and maximize its quality—those are important questions. Yet they went unasked.

Likewise, nobody asked how I could apply the logical reasoning of higher math to life’s nuanced controversies. Nobody asked how I could parse out my life’s roles like I parse British literature. Because of that, I took much too long to make these connections. And that’s from a guy who learned the school lingo pretty well. Some people, like at least a few of my co-workers, never make these connections. And this limits their horizons prematurely.

This important fact lingers beneath education, yet remains unacknowledged. Most students I’ve known, including me, never realize this independently. I suspect many teachers have never heard this, either. And the state-mandated curriculum of “skillz drillz” and teaching for the test ensure few students ever make the connection. Perhaps more teachers know this than I realize. Why, then, did none ever tell me this out loud?

Monday, December 19, 2011

In Praise of Newt Gingrich and Child Labor

Newt Gingrich stirred an almighty row recently with his controversial suggestion that poor kids could learn priceless life skills if we put them to work early. This comes as no surprise to anyone who knows Newt’s reputation, built on his tendency to make inflammatory statements, some of which he lived to regret. But let me say something you’d never expect from a registered Democrat and peace activist:

The man has a point.

I've written before that many of my best students did not come to college directly out of high school. I’ve had good luck with parents, veterans, late bloomers, and others who have some work and life experience between public school and my classroom. I believe work provides a sense of identity. Workers test their goals, explore who they are, and find out what they want from their education.

I myself didn’t start college until I was twenty-five. I bounced through several jobs for seven years before resuming my education. While I never quite managed to support myself financially in that time (minimum wage blows), I learned to support myself intellectually. Though work did not provide my liberal arts education, the school of hard knocks taught me what I wanted in life, what I was willing to do to get it, and what value I placed on my time.

Think how much further along my life might be if, instead of waiting until eighteen, I had to work for my money at, say, twelve. What if I’d had to sack groceries, wash cars, walk dogs, or otherwise hustle to fill my pockets? I suspect I’d have learned much earlier the difference between necessity and luxury. Work probably would have cured my middle class ambivalence toward school.

Before I proceed, let me emphasize what I’m not saying. First, I’m not endorsing Newt. He’s little more than a partisan bomb thrower who builds support by ginning up the fringe. His phrasing uses longstanding extremist cant to conflate poverty and race, and when he talks about firing union workers and filling their jobs with poor children, he wears his loyalties on his sleeve.

Moreover, I’m not advocating the rollback of child labor laws. Kids’ lives would not be improved if we sent them into sweatshops and salt mines. Child labor laws were written on the fairly straightforward belief that children should not work dangerous, sophisticated jobs until they have the physical strength and mental judgment to handle them. In other words, they shouldn’t be put into jobs before they’ve developed the maturity those jobs demand.

But not all labor is a Dickensian nightmare. We can protect kids from the Triangle Shirtwaist Fire without extending infantile dependency to the cusp of adulthood. If we want kids to build strength and wisdom, we could offer them a hem of responsibility, where they could test themselves, and stretch to meet their potential. We claim to do that by giving them household chores, but many might benefit from answering to someone other than Mommy and Daddy.

We could start by scaling work according to kids’ strength and maturity. No reasonable person would think a twelve-year-old could tote a hod at a construction site, but why couldn’t that kid sort screws? A teen should not drive a forklift in a warehouse, but why can’t that same kid stock shelves? Kids who have work commensurate with their ability will develop ability commensurate with their work.

Critics might respond that, in tight times, putting kids on the job market would drive wages through the floor. Yet our current lopsided economy is perfectly avoidable. We have seen skyrocketing productivity, coupled with stagnant wages and almost nonexistent hiring, because of a crooked system. If we enforced adult labor laws fairly, we would create good-paying jobs for grown-ups and starting opportunities for kids.

Many kids already do work, especially lower class kids. Go to any city and watch. Poor kids mow lawns, run errands, carry groceries, wash cars, and do whatever they can to bring in a little money—under the table. Though childhood crime is far less common than Newt implies, hustling is already commonplace. Sadly, middle class kids are more likely to remain sedentary and passive.

Former schoolteacher John Taylor Gatto, in A Different Kind of Teacher, describes setting up professional apprenticeships for middle school students on the sly. These kids built networks they could tap later in life, but reportedly, they also got better grades in school as well. If it could be administered fairly, I’d like to see more kids have such opportunity.

Saturday, December 17, 2011

All for One, and the Book for Me

Lauren Bonk is a wife, mother, sometime actress, and freelance writer. She blogs and conducts business at LaurenBonk.com.

When Kevin asked me to write about the book that has most heavily influenced me, my thoughts immediately shot to The Great Gatsby or Wuthering Heights. These two novels are simply my favorites. Of all time.

The key word here, however, is influence. While swooning over protagonists and drowning pleasurably in pages of gorgeous words is all well and good, I'm guessing he was looking for a little more meat.

Plus, I don't think anyone really wants to read 500 words of me swooning over Jay Gatsby or Heathcliff...I think we'd all come out of that one thinking I've got serious mental issues.

Anyway, the book. The book for me has got to be The Three Musketeers by Alexandre Dumas. I read it when I was in third grade. I know that sounds ridiculous, but let me set the scene:

It's 1993. Amanda and Lindsey have gotten over a fight and are best friends again...I've got a crush on Tim (he's got an earring and a rat tail), and Disney has just released The Three Musketeers starring Chris O'Donnell and about a bazillion other celebrities. BAM! My crush on Tim is shelved, and I am head-over-heels for this movie. I love the adventure, I love the clothes, I love the language, but more specifically, I loved Kiefer Sutherland.

This was when I learned that a lot of movies actually came from books. For Christmas, I got a big box of those short little paperbacks...Little Classics, maybe? I don't know, but all the classics were there, and 3M was one of them. I read it and thought, “Hey, that was pretty much the same as the movie, but there was more stuff in it...hmm...” I realized then that the pictures going on in my head were a lot better than the ones I was seeing on the screen.

My parents then informed me that the small book was actually a very shortened version of the original. Since I was completely hooked by this story, I had to get my hands on it. After school, I marched over to the library and asked for help in finding the long version. Seeing as how I was in 3rd grade, the librarian was a little skeptical. Her skepticism, however, was easily shaken off by the fact that I wasn't another kid asking her for an R.L. Stine, and she was jumping at the chance to put some real literature into a kid's hands.

I'll admit that I was pretty intimidated when I saw the book. That thing was a monster...and it only had a smattering of really old pictures in it. The little version had a picture on every other page...but darn it, I read that thing. And though I had no idea what most of the words meant, I loved it. A few chapters in, I decided to read it with a dictionary, and, obviously, that helped me out quite a bit.

Now, I had read “chapter books,” as we called them back then, but none like this. This book treated me like an adult. Now, quite obviously, I wasn't an adult (and by no means did I deserve to be treated like one)...but what I'm saying is that this book was like an honest friend. A friend who didn't talk to me about babysitters in a club or foxes dressed like Robin Hood. This friend told me an epic story that not only entertained me, but also taught me a few important life lessons.

People don't always say what they mean. Times get rough. Emotions can make us hurt and do stupid things.

These are all very important lessons... and adults don't always want to tell you about them when you're in third grade...but they couldn't have been more relevant. Dealing with friendship in elementary school is pretty much a bloody battlefield, and I needed all the help I could get.

Now, I know this post has a fairly juvenile flair to it. This isn't a book I can be intellectual about; I've just started rereading it for the first time since '93. It is, however, the piece of writing that lit the candle. I had no idea that a stack of paper, bound together, could be so powerful, until I read The Three Musketeers, and if I ever get the chance, I'm buying Mr. Dumas a fancy drink...something with an umbrella.

No, no, no, not an umbrella. Some fruit stabbed with a plastic sword...yeah. That's more like it.

Friday, December 16, 2011

The Soul of Science, the Heart of Discovery

As a farm kid who grew up without indoor plumbing, Richard Ward could have looked forward to inheriting the spread, like a lot of other rural Montana boys. He certainly didn’t set very high expectations among his teachers. But that was just the first time he surprised himself, and everyone around him. Dead Ends to Somewhere reveals how a kid who almost flunked grade school invented a vaccine that saves half a million kids worldwide every year.

Ward was born the youngest son of a large Irish Catholic family, taught to be intensely competitive, to respect hard work, and to revere God. But all that never translated into his schoolwork. The nuns despaired of ever teaching him to read or write. That is, until the day one sister read his work in front of the class as an example of what not to do, and used his name. Ward swore he would never be humiliated like that again.

But that resolution didn’t translate into a clear vision for his life. He started to excel in school, but he drifted through his actual career hopes. Vague attempts to find work in San Francisco, play college football, and pursue seminary all took him in directions that never quite paid off. But what seemed like rabbit trails in the moment ultimately offered little glimpses of the bigger secrets in his life. All of them helped advance his life in science.

His story reads like a buttoned down retelling of a Kerouac epic. He pitches his rural upbringing to discover San Fran, where he... um... gets a job in a department store. In graduate school at Berkeley, he probably became the squarest member of the Love & Freedom party, more interested in church and school than protesting. Later he accepted a prestigious job at Sandia National Labs, making earth-shattering discoveries about... sewage sludge.

Ward laces his story with a wry, self-deprecating sense of humor. He talks about the scientific research he pursued, from Berkeley to Germany to Jersey to Cincinnati, in terms even a novice like me can understand. Though he’s a prolifically published researcher, Ward doesn’t rely on the dry voice and dense jargon that make much scientific writing so imposing. He does a remarkable job communicating with a non-specialist audience.

Along his path, Ward struggles with life and purpose. Like many young people, he had to work hard to accept that he wouldn’t win his Nobel Prize before he turned thirty. Long, painstaking work seemed tedious deep in a Munich basement, but would prove valuable in a Los Angeles lab. He grew to hate a coveted government job, especially when standing in raw sewage, yet it paved the way for his greatest discoveries.

He also never seems to stop running aground on the local cultures. His small town honesty causes him more problems than solutions in Mob-infested New Jersey. His scientific rigor sets him outside the more laid-back, even chaotic UCLA ethos. Only when he finds his real career and mission as a researcher at a children’s hospital—a last ditch job he almost rejected—does his life finally come together with his work.

Ward never set out to investigate infectious diseases. He wanted to study retroviruses, which researchers formerly believed caused cancer. But when rotavirus loomed large in his discoveries, and he discovered that this virus killed 500,000 children yearly, he suddenly had something to work toward. We feel his sense of triumph when his vaccine proves to be the life-saving golden cure suffering children and their parents longed for.

Scientific memoirs often run to either banality or grandiose hot air. Einstein and Planck struggled to make their life stories interesting, while Newton, in his religious fervor, often seemed to be leading his own campaign for sainthood. Ward, by contrast, seems both down-to-earth and candid. I’d put this book alongside classics like Feynman’s What Do You Care What Other People Think?

Not that it’s perfect. It comes from a small independent publisher, and has a few formatting problems. And Ward could use a firmer editorial hand. While the details of his life reveal a great deal about his path toward discovery, not every rambling anecdote about long cross-country drives and frustrating job interviews moves the story forward.

But for all its foibles, this memoir admirably humanizes science. Richard Ward is both a sharp researcher and a savvy storyteller. And if he can make science more accessible and exciting for our society, his contribution will not go unappreciated.

Thursday, December 15, 2011

Matt Mikalatos on New Life, Unlife, and God's Life

In 1678, long overland journeys were part of cultural consciousness, and John Bunyan’s The Pilgrim’s Progress touched English readers right where they lived. But culture has shifted. That’s why Matt Mikalatos’ Night of the Living Dead Christian seems so timely. We surround ourselves with images of monsters; Mikalatos turns these images into metaphors of the struggles Christians face to live transformed lives in the modern world.

Mikalatos makes himself the protagonist of his frenetic comic novel. Just after Mikalatos discovers a mad scientist (“that’s ‘eccentric genius’ to you!”) in his neighborhood, an onslaught of zombies reveals the presence of a werewolf and a vampire in Mikalatos’ previously quiet suburb. When the werewolf invites Mikalatos to church, and the zombies start plastering houses with religious handbills, Mikalatos discovers something deeper in play.

Unlike Bunyan, who wrote for a mixed Christian and secular audience, Mikalatos writes for Christians, calling them to question exactly how Christ has swayed their lives. Though he tells his story with spirited, Monty Python-like humor, his purpose is in deadly earnest. Too many Christians, called to a new life in Christ, instead submit to a strange unlife, often without acknowledging it to themselves.

In Mikalatos’ undisguised symbolism, zombies represent Christians who relinquish their thoughts to leaders and pursue a behavioral checklist rather than living for God. Werewolves tamp down their sinful inclinations, only to see them rear their ugly heads at the worst possible moment. Vampires set themselves apart, sucking the life out of others just to keep going. All of them think they’ve been transformed, but they’ve actually been diminished.

Jesus said: “I have come that [believers] may have life, and have it to the full.” Christ promised us a renewed life, blessed with purpose and mission. But too many religious demagogues claim we can make our profession of faith, be saved, and continue living like we did before. As if Christianity only means getting into heaven when we die, we live like we have already died, shuffling through the world, neither here nor there.

Mikalatos won’t have it. This life has to pass away when Christ gives us the new life. But to achieve that, we need to place our trust in Him, not in human leaders or displays of public piety. Such exhibitions are meant for this world. Remember, we are called not to be conformed to this world, but to transform it by the power of our faith. We are not reanimated; we are resurrected.

Two narrators drive Mikalatos’ novel. Most of the book comes from the mouth of Mikalatos’ alter ego, a frenzied character who knows he’s trapped in a monster movie, but doesn’t know what to do about it. He caroms through a world that parodies the creeping helplessness of George Romero or HP Lovecraft, gradually finding the strength to take control and redeem the monsters who need his help to find the real God.

The other voice belongs to the werewolf, Luther. Discouraged by the hypocrisy of Christians around him, especially his father, Luther is grimly fatalistic, struggling with fear of his own inadequacy and insignificance. He speaks the language of hip secularism, justifying himself in ways that show he can’t quite face the nihilistic implications of his beliefs. Though a parody of Mikalatos’ opponents, he’s also a carefully constructed, realistic voice.

Both voices challenge Christians to face our own pat answers. The Christian mainstream has a tendency to let questions go unasked and to not examine its own suppositions. Though many good pastors have tried to push these concerns back into our awareness, workaday Christians have slid into the comfy patterns Mikalatos caricatures as “undead.” Ironically, the traditions that consider themselves most vital and thriving are most vulnerable to this trap.

Mikalatos makes these questions palatable by couching them in humor, but that doesn’t mean he lets us off easily. His zombies believe themselves saved, his werewolves think themselves justified, his vampires think themselves exonerated. Though he emphasizes that all the monsters still have a chance, if they trust the One who gives new life, he doesn’t sand the rough edges off life’s constant struggles.

Susannah Clements says that how our society handles mythical monsters reflects our ethical compass. Matt Mikalatos uses humor, but he treats his monsters with great respect. To him, any of us could be these monsters; we might already be, and not know it. Mikalatos’ monsters make us laugh, even as they make us ask important questions about our lives, and what we do with them.

Wednesday, December 14, 2011

Why Men Are Losing in Modern Universities


Trapped on a particularly slow-moving workstation at the factory, I cast my eyes to my fellow line workers. The guy ahead of me was setting prepared filtration elements on the arrayed spin-on base plates, upon which I would place steel tension springs, and finally the guy behind me would place a metal sheath, called a “can,” over the assembly. A machine would set the seals, and then the product was off to be painted. Regular as clockwork.

My station was tedious, because it involved placing a small, stable component. The guys on either side became flustered easily, because their components were large, snug-fitting, and easy to knock off the line. I pulled a tiny spring from a huge bin, and set it down. They pulled bulky components off pallets that weighed hundreds of pounds and had to rest on pneumatic turntables to remain under control. My station was orderly; theirs were surrounded by detritus.

I realized I could kill two birds with one stone. If I stepped away from my station for a few seconds to pick up and stow their cardboard pallet layers, pull damaged components off the line, and do other spot checks on their stations, my own job accelerated in pace, and I took a load off my flustered colleagues. Everybody wins. But after I did that for a few minutes, I noticed another benefit: the harder I worked helping them, the more I got a rush of endorphins myself.

Men have a deep-seated need to compete and win. Whether it’s learned or innate I cannot say, but I’m hardly the first to observe that, given the chance to overcome obstacles or exceed goals, men feel happy. By exceeding my goal of assisting my colleagues, I earned the reward of inherent pleasure. In a sense, though the factory is a cooperative rather than a competitive environment, I had the opportunity to “win.”


This got me thinking about my other job at the university. Women now comprise about sixty percent of college students. Men aren’t enrolling, and when they do, they’re more likely to withdraw, especially from general education classes like mine. Since a college education is increasingly necessary for upwardly mobile employment these days, that means men are setting themselves up for future economic inferiority.

I lack the knowledge to draw statistical conclusions, but I suspect men sabotage themselves thus because modern academia gives them few opportunities to “win.” Sure, the university isn’t a horse race, and shouldn’t divide the in-group from the out-group. But school does not reward conventional masculine traits. Quite apart from competition, the very way school is conducted makes many men feel like “losers.”

The emphasis on reading books and writing papers, for instance, disadvantages many men. Written language is a complex process requiring both brain hemispheres. This means only those with a mature corpus callosum, the sub-organ that joins brain hemispheres, flourish at reading and writing.

Women generally have a heftier corpus callosum than men. No surprise, then, that stats say women are over fifty percent more likely to read recreationally than men. Women also do better at memorizing arcane grammatical rules. That same difference, of course, gives men an advantage at monopolar, single-hemisphere activities, like math. Yet those same men get flummoxed, and flub their degree requirements, when asked to translate mathematical concepts into writing.


Men flourish in the one university domain that rewards their competitive nature, sports. Title IX notwithstanding, money and prestige flow to fields where men excel, particularly football. While classrooms increasingly become bastions of female privilege, football fields, baseball diamonds, and basketball courts still give men places to shine.

I’ve watched men leave my classes time and again, and wondered about this lopsided dropout rate. Only this semester did a student admit it: he felt like the women were kicking his ass. It felt like a rigged competition, one he’d lost before he walked in the door. That emotional reaction then became a feedback loop, permitting him to put off assignments, get further behind, and cede more ground to the women.

If we want men to survive in our cutthroat economy, we need them to complete their degrees. But that won’t happen while school subverts their common competitive motivation. Men will get further behind as long as they feel they’ve already lost. I have no solutions yet. I especially struggle to envision “competitive” education that wouldn’t disadvantage women. Yet men deserve the opportunity to win, and we need to find ways to give it to them.

Monday, December 12, 2011

Public Burnout, Private Fallout

When Sarah started watching MTV’s Teen Mom recently, I questioned why she would reward such blatantly unbalanced characters. Though the show debuted with benevolent aims, to let young women see teen motherhood’s actual lack of glamour, it also provides a national, even global, audience for its stars. Considering the twisted, histrionic tendencies they’ve shown, these young women don’t need the reward of anyone paying attention.

Sarah responded by pointing out that she aspires to become a therapist. Her practice will almost certainly consist of people as damaged as these girls, and she needs to see their illnesses in action. In fairness, you can’t exactly order a DVD course on “Case Studies in Borderline Personality Disorder.” If she wants to examine such illnesses before hitting the field, shows like Teen Mom provide an otherwise rare opportunity.

But Amber Portwood’s public meltdown, resulting in a domestic assault conviction and bipolar disorder diagnosis, seems hardly inevitable. While her troubled relationships and illness existed before her strange stardom, the constant attention from TV cameras and Internet commentary certainly didn’t help. A recurrence seems all the more likely with the attention paid to her recent weight loss. Narcissistic personalities don’t improve on display.

MTV pioneered reality TV nearly twenty years ago with The Real World, which was in turn based on a concept PBS had aired twenty years earlier. But The Real World recycled its programming every season, starting with a new cast in a new location. Though it encouraged obnoxious behaviors from its stars, it didn’t showcase them for years at a stretch. Exceptional snots like David “Puck” Rainey were pulled early when circumstances got out of hand.

Such is not the case anymore. Portwood, with the rest of her cast, has been renewed for a fourth season. The E! Network refuses to divest itself of the increasingly embarrassing Kardashian franchise, even after Kim’s record-setting multimedia extravaganza wedding ended in divorce barely two months later. High-profile flame-outs are a license to print money for the networks that own distribution rights. Controversy sells.

Even on shows that don’t maintain permanent casts, the rotating ensembles get great mileage out of throw-downs. Variations on the claim that “I’m not here to make friends” have become a tedious cliché, but they persist because they encapsulate the behavior that sells ads. Unfortunately, the trite slogan has become its own satire:



The hosts of these shows hardly behave any better. Few people tune in to Hell’s Kitchen to learn anything about cooking, which is treated only fleetingly anyway. It’s far more interesting to watch Gordon Ramsay pitch a fit. Likewise, Donald Trump’s comportment on The Apprentice has been so appalling that one wonders why anyone supported that hambone for President.

I can’t help wondering what motivates these people. By that I mean both the screen personalities and the viewers. Many online commentators claim they want Portwood and Kardashian to recover from their highly visible illnesses; yet controversy draws audiences, which drives advertising revenue. The networks, and their stars, get rich every time some tantrum draws viewers.

The same sense of rubbernecking spectacle made Jim Morrison and Amy Winehouse obscenely rich and famous as they signed their own hanging orders. And let’s be honest, if Kim Kardashian follows that same doomed road, it won’t much affect our lives. Even if she lives, she’ll pass out of public consciousness much like Twiggy Lawson and Gia Carangi before her. Kim is little more than a distraction, easy to forget.

Amber Portwood, however, has a baby. As twisted as she’s become, she makes a contribution to society. Presumably, she and her child will want to reintegrate into the American mainstream, hold jobs, and pass a legacy on to the next generation. Amber certainly put hard constraints around herself when her pregnancy ended her party girl lifestyle, but she and her child still have a future, however limited.

Critics easily claim that reality TV’s exhibitionistic tendencies cheapen our own thinking ability, but I think they set their sights too small. Allowing these quasi-celebrities to take up any more of our consciousness creates a system of reward for their systemic self-flagellation. They will continue to gradually destroy themselves, and those around them, as long as we pay them for the privilege.

If we pretend to care about Amber and her baby, perhaps our kindest gift could be ignorance. By breaking the system of reward, we could create an incentive for her to take care of herself. Then she can start taking care of her child.

Saturday, December 10, 2011

Don't "Kitty" Me


Heather Stauffer is a writer, historian, Alzheimer's Advocate, and blogger for OldJokesGetLaughs.blogspot.com.

My bookshelves are full of Great Plains histories, nineteenth-century novels, and mystery paperbacks, so I was surprised when, a few months ago, I checked out a copy of A Dirty Job and couldn’t put it down. Christopher Moore’s 2006 novel takes readers to San Francisco and then exposes them to the “dirty job” of being a death merchant in charge of collecting soul vessels.

Yes, a death merchant. Amid our culture’s current obsession with vampires and the un-dead, this story stands out for its wit and unconventional characters. The protagonist, Charlie Asher, is a Beta Male; when Alpha Males were out hunting food and defending the community, Beta Males stayed behind and made themselves available to the grieving widows.

The owner of a second-hand shop, Charlie just wants to be a good father to his daughter, Sophie. Charlie becomes a father and a widower in the same day, and soon after he discovers that he can see people’s souls attached to items in his store. Now, in addition to the other major changes in his life, he is responsible for collecting these souls and seeing that they are redistributed to new owners.

This is not a book for everyone (its liberal use of death and swear words tends to be unappealing to younger readers and the faint of heart), but it is an interesting commentary on identity and purpose. Charlie is simultaneously thrown into two very important roles (death merchant and single father), and at first he tries to avoid both on the grounds of his timidity and neurosis. Once he comes to terms with his responsibilities, he is more accepting of himself and his abilities. Well, to a point.

As the narrator explains, Charlie’s existence as a Beta makes him perfect for this new role in the soul business. He is dependable, overly-committed, and disappears into the background easily. His overactive imagination helps him problem-solve unorthodox situations associated with death merchantry and fatherhood, but it also inflates his ego to the point that he believes he is Death (with a capital D), and not a servant to it. This, of course, does not quite pan out like Charlie expects.

As people start dying around him, Charlie becomes convinced that he is the reason. Fueling this belief is his daughter’s ability to kill people by calling them “kitty,” and the two massive hell hounds that appear from the shadows to protect the apartment. Instead of continuing to improve on his roles as father and merchant, he seeks ways to become a super-hero Alpha Male (in the form of his “true” identity: The Big D.).

I’m not sure why this story has stayed in my mind so long after returning my copy to the library. Perhaps its humor caught my attention. In true Christopher Moore fashion, several scenes had me literally laughing out loud; who can stay somber with a character named “Minty Fresh” or an immigrant tenant who sells Sophie’s dead pets at the Chinese market?

Perhaps more telling, in the midst of the overly outrageous scenes, are the realistic emotions of Charlie. His wife was the love of his life, and her absence prevents “death” from becoming too lighthearted in the novel. As much as Charlie changes in the story, he genuinely grieves her passing the entire time. Also grounding the reader is his attachment to Sophie. He shows his best characteristics when working at fatherhood.

Though Charlie is not cut out to be a super-hero Alpha Male, he does have the ability to excel in the things that he is good at, and that might be what resonates so strongly about this story. We can all use our talents, skills, and personalities to be part of something great, even if we aren’t “The One.”

Friday, December 9, 2011

me write essay on talk nice for you



Though this event occurred several years ago, when Conan still had his pinch-hitter slot on NBC, it has received renewed attention on social media in the last few days. I suspect people like seeing Garner, who looks distinctly self-satisfied while correcting Conan, getting schooled. For a society that accords nearly divine status to celebrities, we love seeing them brought low again.

Yet I see something deeper at play here. As Conan and Garner wrangle over one stupid word, I see status games in play. That’s why Garner needs to bring up Conan’s Ivy League education, and it also explains Conan’s triumphant cackle. Each of them wants to demonstrate their superiority to the other, which they can achieve by showcasing a piece of knowledge the other doesn’t share.

American novelist Ursula Le Guin, in her writing guide Steering the Craft, asserts that such contests are inherently political. “‘Correct grammar,’ ‘correct usage,’ are used as tests or shibboleths to form an in-group of those who speak and write English ‘correctly’ and an out-group of those who don’t.” We can see Conan and Garner doing exactly that above. As Le Guin concludes: “And guess which group has the power?”

In the hippie era, counterculture leaders frequently issued mimeographed broadsides against war, inequality, and other issues. As you’d expect from shoestring operations, the smeary ink and coarse paper often contained haphazard usage. Deputies of the status quo loved to reprint these broadsides, liberally coated in editorial marks, with postscripts asking activist youth if they really wanted to follow leaders with such poor command of English.

Many grammatical “rules” we learned in grade school were flatly made up in bygone days, often by nobility who wanted to exclude the hoi polloi from polite company. Poet John Dryden invented the “rule” that you can’t end a sentence with a preposition. I happen to know you can. I’ve done so many times. Yet teachers still use this “rule” to shame and humiliate students into compliance.

My favorite comparison is table service. Victorians of means loved to invent new pieces of flatware so they could boast of being the only ones who knew how to use them “correctly.” Thus, Victorian formal dinners could involve place settings larger than some central London flats. The looks of disdain at people who couldn’t use a demitasse spoon, or couldn’t distinguish a stemmed from a stemless wine goblet, are legend.

Today, we recall that time with disbelief. How many of us ever need more than two forks, a knife, and a spoon? Maybe two spoons if we have both soup and dessert. Victorian silverware wars look abjectly silly. Yet we cling to absurd grammatical rules, which serve just as little purpose, with remarkable vigor.

Henry Fowler, whose Fowler's Modern English Usage first appeared in 1926 and is currently in its fourth edition, made up rules that nobody had ever previously observed. The New Yorker magazine, the New York Times, and Oxford University Press maintain their own style sheets, which are mutually contradictory, and contain rules most people have never seen. Their competition makes oyster forks look quaint.

Nowhere do these invented rules do more damage than in classrooms. One teacher I worked with in grad school graded students down if they wrote “on the other hand” without first writing “on the one hand.” Another had seemingly arbitrary rules about when it was acceptable to use the words “I” and “you.” Students lived in fear of these teachers, because they couldn’t predict and adapt to their demands.

One student still sounded terrified, years later, recalling how a middle school teacher reprimanded him before the whole class for saying “I’m done.” As the teacher put it: “Dinner is done! People are finished!” This rule is based on a distinction that has not been observed in spoken English in nearly 300 years. Unfortunately, my student got the real intended lesson: Shut up. And that lesson stuck far better than his academic subjects.

I understand why Conan and Garner’s pissing contest makes for good web chuckles. To an extent, we understand how stupid both of them look, getting exercised over a completely useless debate. Yet I think we also sympathize with Conan’s victory: as trivial as it is, we also wish we could have spoken that triumphantly to the pedants who wanted us to stop talking until we had internalized their rules.

I’d rather speak honestly than correctly. Yet perhaps that’s the message behind grammar lessons—My rules. My language. My truth. You shut up.

Wednesday, December 7, 2011

Screenwriting, Gatekeepers, and the Golden Stepstool

Xander Bennett earned his stripes as a Hollywood slush pile reader, culling spec scripts to make sure that producers didn’t spend their valuable time reading snoozers. Along the way he noticed that inexperienced writers tend to make the same mistakes time and again. To help real aspiring screenwriters avoid those traps, he wrote Screenwriting Tips, You Hack. I appreciate his candor and the book’s utility, but I can’t get past some serious lingering doubts.

The subtitle claims to offer “150 Practical Pointers for Becoming a Better Screenwriter,” and he undersells himself in two ways. First, he actually includes 168 tips, ranging from one-sentence nuggets to four-page essays. Second, though some of his pointers live in film’s exclusive world, most apply to any form of creative writing. I could even harvest some useful advice for my college writing students, like: try dumb in the first draft. You have rewrites to make it into art.

Bennett ranges through the whole screenwriting process, from generating ideas and organizing, through drafting and rewriting, ensuring the strongest possible structure, into troubleshooting so you send out the strongest possible spec script. He also deals with moving across genres—TV writing differs from films—and career planning. He focuses on the general, like crafting strong dialog, and the specific, like choosing the right punctuation to propel your sentence.

As I say, many of his tips apply across all creative forms. For instance, read more of the kind of literature you hope to create. You can’t write a rom-com if you only enjoy science fiction craptaculars. And you can’t write a screenplay just because you watch a lot of films. Also, write dialog for ultimate drama; don’t fall so in love with characters that you won’t let them hurt; and take all feedback seriously, even if it doesn’t jibe with your original vision.

I especially appreciate his advice on active writing. Nothing kills a piece of creative literature, whether a script or a story or a song, like egregious “to be” verbs and “-ing” endings. And while I disagree that adjectives and adverbs kill the momentum set up by verbs and nouns, too many modifiers definitely suck the action right out of a sentence. I’d like to see all writers, whether creative or scholarly or professional, apply his advice on active, energetic prose.

Bennett doesn’t just toss out advice, though. He takes the time to explain what it really means, backing it up with evidence from well-known films—acclaimed successes and notorious flops alike. He writes with a dry humor and the voice of experience, like someone who has seen it all, and has plenty of war stories to share. He reminds me of a best friend who wants to help you achieve your dreams while avoiding the pitfalls he’s already endured along the same path.

But that doesn’t mean Bennett has the golden key to screenwriting success. He offers advice to get past script readers, the gatekeepers who decide what’s worth producers’ time, so don’t think that just because you use his tips, you’ll get produced by Coppola and directed by Scorsese. Also, Bennett focuses on screenplays as finished literature in their own right, which is a far cry from seeing them turned into finished films.

Also, a Google search turns up no produced films under Bennett’s byline. Though his official bio says Bennett has made the leap from script reader to screenwriter, game writer, and graphic novelist, I turn up only one title under his name. He doesn’t even have an IMDb page. When David Mamet pens books on scriptwriting, we know he learned his lessons the hard way. Bennett gives us no such assurance.

Simply put, getting the rubber stamp from a Hollywood script reader, like getting past a New York editorial assistant, only means your work doesn’t suck. It’s a far cry from actually making it into the inbox, much less seeing your name in lights. Bennett’s advice will help you weed out the errors that torpedo aspiring writers and their run-of-the-mill spec scripts. He will not, however, close the gap between wherever you are right now and stardom.

On balance, Bennett’s advice will strengthen most writers. Weak writing, in any genre, tends to suffer the same problems time and again. But strong writing is always strong in unique and surprising ways. Aspiring writers can only close that gap by writing, a lot, until they find that story only they can tell. Bennett offers a golden stepstool, but only you can achieve your ultimate goal.

Monday, December 5, 2011

Living for the New U

Does anyone really like higher education today? Professors lament the loss of teaching as central focus; students condemn the system’s apparent aimlessness; parents fear skyrocketing costs; and legislators complain that no one seems answerable for anything. Andrew S. Rosen, president of Kaplan University, contributes an alternative in Change.edu: Rebooting for the New Talent Economy, if readers can look past the limitations of Rosen’s own counter-dogma.

Rosen begins with a tour of today’s conventional college landscape. Many schools have become obsessed with Ivy League prestige, attempting to match Harvard in research, teaching, and gravitas. Yet no American school can match Harvard’s multi-billion-dollar endowment or nearly four centuries of history. So other universities find end runs to boost various rankings, including accouterments that contribute little to education.

Too many schools, especially private non-profits and Division I state universities, compete on amenities rather than academics. The surge in colleges has not improved the student pool, and there’s no prestige margin in remedial liberal arts. So top universities become luxury resorts, without improving learning. Schools compete on athletic programs that bleed money, dorm and dining facilities that practically deserve Michelin stars, and recreational facilities that only attract teenagers who don’t need to work.

The free market has responded to this trend with private, for-profit universities, like Rosen’s own Kaplan. The rise of these schools, which currently outpaces conventional education, has earned the ire of the old guard. Yet these schools meet a real need. Since they’re primarily trade schools, they benefit from teaching by working professionals in the field. And they work well for non-traditional students.

As they should, since non-trads may be the future of education. While educational prestige turns on the ability to attract teens, many of my best students have been adults with families and careers. Adults know what they want from education, and what they’re willing to do to get it. Football championships, campus nightlife, and mall-like student unions don’t impress them much. Education outcomes and career placement mean more.

For-profit schools have the metrics to measure outcomes, while conventional colleges generally measure inputs, like test scores and GPAs. If longer-standing schools feel threatened by for-profit universities, perhaps it’s because for-profit schools accurately measure what students want, and provide it. Schools like my own small regional university could profit from studying these newcomers, shifting focus from teenagers with huge parental bankrolls to the non-trads who most want to learn.

Moreover, online learning offers untapped potential for real, cutting-edge learning. When I first dipped my toes into that water a decade ago, promises vastly exceeded what the technology could deliver. Not so any longer. While online classes lack the immediacy of classroom face time, and require significant self-discipline from students, they also offer personalized instruction and malleable scheduling that brick-and-mortar buildings never can.

I have previously been leery of for-profit schools, for reasons Rosen concedes: short-term corporate fiscal horizons don’t jibe with education’s long-term nature. By straddling these worlds, for-profit schools form a new beast, neither fish nor fowl, that must negotiate the interests of two constituencies. Yet Rosen makes a persuasive case that such schools can do so effectively.

Unfortunately, Rosen has his own blinders on. He attempts to dismantle critics’ complaints against for-profit schools using metrics that favor him. For instance, he highlights established schools like Strayer, DeVry, and Phoenix, dodging the reality that unaccredited fly-by-night “colleges” have sprung up for quick profit. This lets him write his critics off as mere sour grapes, even though legitimate criticisms exist.

This comes across near the end, when Rosen admits “Private-sector colleges have room to improve,” after systematically dismissing nearly every criticism raised. Though he scarcely mentions his own university, which keeps this book from descending into mere advertising, Rosen’s attitude toward for-profit colleges is unstintingly glowing. To hear him talk, only office park colleges offer good education these days.

Moreover, Rosen’s focus on for-profit colleges gives him tunnel vision on where real innovation is happening in education today. Ingenious developments are arising within the conventional academic environment. New curricula, new schools of pedagogy, and even new philosophies of education have all debuted in recent years to address the very shortcomings Rosen names. Yet they don’t merit even one page of Rosen’s time.

For all his limitations, Rosen brings new attention to a sector of educational innovation that has previously been merely caricatured. As Rosen asserts, America’s economic future demands we produce educated, ambitious workers. Even if for-profit schools aren’t education’s future, they at least illuminate how conventional colleges can better serve our diverse, growing student bodies.

Friday, December 2, 2011

Workaday Christians and the Modern Calling

While many adults enter ministry after prior careers, most ministers work for the church full time, and a gulf lingers between parish practice and parishioners’ daily lives. Pastor Tom Nelson resolved to bridge the gulf between his Sunday sermons and the brass tacks his flock faced when they ventured back into the world. Despite some minor hiccups, Work Matters: Connecting Sunday Worship to Monday Work succeeds admirably at that goal.

Traditional Christian narratives emphasize work as an important religious duty. According to Genesis, God placed humans in the Garden to work the soil and care for creation. After the Fall, work changed from our sacred duty to our burden and penance, yet work—and its close cousin, rest—remain some of our most holy charges. How we carry our relationship to God and our neighbor into our work space speaks volumes to our souls and salvation.

Most of us spend more time at work than on any other activity. Our families, hobbies, and volunteer commitments may be important, but our day jobs are our largest mission field. As Nelson points out, God created humans to continue God’s work. Yet we often greet work with grim resignation. The cynicism of Dilbert cartoons and “You don’t have to be crazy to work here” t-shirts is a kind of slow spiritual death.

This applies both to leaders, who have a charge to build up their employees and care for those under their authority, and workers, who perform their work as a trust both from their human bosses and from God. We were created in the image of a God who nurtures and builds. Christ Himself made a living with his hands for years before starting His ministry. How we work forms a core of our Christian outreach.

Nelson encourages Christians to embrace work and, in doing so, transform it. When we put our minds on God’s mission and our hands to the plow, we turn our workplaces into mission fields. But to do that, we must see our work as more than a burden. To achieve that goal, Nelson urges us to reclaim an idea that has grown distorted in recent years, the notion of a sacred calling.

Many churches, particularly evangelical congregations, mistakenly divide labors into “higher” work, meaning the pastorate or missions, and “lower” work, done in secular settings. This leads many ordinary parishioners to think they let God down if they stick with their workaday jobs. But Nelson reminds us that the Apostle Paul encouraged Christians to continue their work, and be Christ’s eyes and hands where people actually are.

As Jacques Ellul says in The Presence of the Kingdom, many Christians, especially converts, think they have a call to ministry because that exempts them from life’s tedium. One young man I know thought that, if he became a pastor, he could study the Bible forty hours a week. Ministry’s realities shattered that illusion. Christians are called to work. We are not a rarefied batch of contemplative thinkers; even monks earn their keep by the day.

Yet for all the opportunities work offers Christians, we cannot ignore the challenges. Work can become a form of idolatry. I’ve heard of pastors who have needed to take sabbaticals because they realized they served the pulpit more than God. And work can become a source of profound discouragement, particularly when high Christian ideals conflict with petty workplace politics.

Nelson gets a bit exercised about the risks of sexual impropriety in the workplace, an attitude he takes perhaps to an extreme. While the high tensions of daily labor can cultivate shifting temptations, most workers I know are more threatened by despair or cynicism than lust. I wish Nelson spent more time talking about maintaining God-given hope in the face of daily struggle, because that, not sex, is the biggest obstacle I face.

But even when he follows the occasional cow path, Nelson makes many good points. Church should offer workers encouragement to face our Christian missions with good spirits, despite banal routine; yet many pastors fear to address the workaday world, and Christians fear voicing frustrations to the congregation. This creates a chasm between Sunday sermons and Monday actions, but to Nelson, this is merely an untapped opportunity.

We are all called to a task in this world. God made us to work. When we divide work and worship, we lose some of God’s most profound opportunities. Tom Nelson gives us the encouragement many of us need to reclaim our work for our God.