I just can’t bring myself to finish Jim Butcher’s newest Harry Dresden novel, Ghost Story. This frustrates and confuses me, because I’ve enjoyed every book in the series up to this point. This one continues the same dynamic character tensions and intricate situations, and takes the series’ backstory to new levels. So I certainly can’t say the book is any less good than its predecessors—just the opposite, if anything.
Yet I stalled out around page 225, and I can’t bring myself to pick it up again. Why the sudden change?
It’s not just Harry Dresden I’ve strayed from. A friend put me onto the first Dresden novel back in the spring of 2008, and I read the entire series in print as fast as I could. The concept impressed me so much that I started snapping up every author I could find: Kat Richardson, Seanan McGuire, Thomas E. Sniegoski, Caitlin Kittredge, Jes Battis. As with any genre, some authors really stunk up the joint. But others were quite smart, inventive, and engaging.
I read my last urban fantasy novel in mid-December 2010. I’d read them at a rate of about four per month for over two-and-a-half years, squeezing them in amid grad school requirements, more sober novels, and other “serious” reading. When I put the last one down, I had over half a dozen waiting for my time. A couple were half-finished, the rest waiting for me to start. And I made a good-faith effort to start. Believe me, I tried.
Maybe I burned myself out. If I overloaded on urban fantasy, I have no one but myself to blame. But considering what a glut these hard-boiled fantasies have been on the market recently, why hasn’t the entire genre burned out? Is there no saturation point beyond which readers simply stop buying? Evidently not.
The problem must be me.
Publishers crank these books out so rapidly that the authors have little time to finish one before they must start the next. These companies, many of which began as labors of love by small entrepreneurs who expected only minimal returns, are now owned by multinational conglomerates. And their corporate overlords demand returns so extravagant that international drug lords would blush.
But we can’t blame publishers alone. Sure, they demand ceaseless reiterations of what they already know the market will bear, but that market consists of real humans who buy books. If authors have become factory hands, assembling uniformly predictable product, surely they do so because they know readers will buy it. Inoffensive blandness may curl my hair, but it also ensures a bottomless revenue stream.
Yet people like urban fantasy because it contains at least the germs of an older fantastic form—which is far from safe. As Richard Mathews asserts in Fantasy: The Liberation of Imagination, fantasy’s founding myth-makers knew their greatest popularity when their work was at its most unsettling. Consider the conflict between the secularist William Morris and the Christian George MacDonald.
Even better, consider the writers who brought the genre to maturity in the twentieth century. Writers like Robert E. Howard, whose classic Conan lashes out against metaphorical agents of the Gilded Age. Writers like J.R.R. Tolkien, whose best works reflect trench soldiers’ fears in World War I. Writers like Lloyd Alexander, who discovered the timelessness of Welsh myth while facing life’s transience during World War II.
The best fantasy is often mischaracterized as bucolic, when at root, it’s profoundly unsettling. It deals head-on with life’s finitude in ways that “realistic” fiction cannot. Like all good literature, good fantasy is slightly threatening. But too many readers want to be soothed like kittens. And paperback publishers, like candy makers, sell anything their customers buy, even if it’s bad for them.
Perhaps I’m the outsider, because I like challenges. As I've said before, I want reading fantasy to be like visiting a foreign country, getting lost on unfamiliar streets, and trying to order coffee in a strange language. Plenty of fantasy still provides that. And while Jim Butcher’s Harry Dresden may lack the epic magnitude of Howard’s Conan or Tolkien’s Aragorn, he at least challenges me from time to time.
But those challenges now come too infrequently. I want authors to push themselves harder, so they can push me harder. I want them to take me farther. I want to continue growing. The current crop of paperback writers, including Jim Butcher, just doesn’t offer that. And Butcher’s newest sits on my bedside table, months later, still unfinished.
A Close Look at Modern Mythology, Pop Culture, Hot Media, Book Reviews, and the Psychology That Makes Our Society Hop
Monday, November 28, 2011
Friday, November 25, 2011
Three Straight Ways to Conquer Your Waist
This holiday eating season, many will squander limited energy worrying about their beltlines. In my last weight-loss book review, I said different people gain weight for different reasons, so we must lose it through different tools. Maybe that isn’t scientifically rigorous, but it certainly resolves the issue to my satisfaction. But that doesn’t stop the book industry from printing money by insisting that every other weight-loss regimen in the world is wrong.
Take, for instance, Jonathan Bailor’s The Smarter Science of Slim, and Mike Schatzki’s The Great Fat Fraud. Both authors claim to sort mountains of scientific research, and both have the uncountable source notes to prove it. Both claim the truth of real weight control has been stifled by a monolithic corporate conspiracy that would rather sell us a million pills than see us get permanently well. That’s where the similarities end.
Bailor builds his book on the thesis that we should “Eat more. Exercise less. Smarter.” For him, the problem stems from habits that short-circuit natural metabolic processes. High-starch diets and workouts that inefficiently distribute benefits leave us more hungry, more tired, and more obese than when physical fitness went mainstream forty years ago. His solution lies in more carefully choosing what goes into our bodies, and how we use them.
Schatzki, however, sees the focus on weight as a canard. The problem, for him, is “fitlessness,” widespread inattention to core wellness that extends beyond mere weight issues. Humans evolved in unsteady circumstances, forced to eat whatever came to hand, even if it wasn’t balanced. Fitness, for Schatzki, crosses all weight classes. Instead, we should focus on whether we work our muscles in the way for which our bodies are optimally designed.
I suspect that, if Bailor and Schatzki sat on the same panel, they would be at each other’s throats constantly. They have different solutions because they favor different issues. What one sees as the real problem, the other sees as a mere subset. Though I doubt they’ve read each other’s books, each takes a sneering attitude toward the other’s solution. Here’s the kicker: I suspect they’re both right. By which I mean, they’re both wrong.
Where Bailor presents a fine-tuned diet and exercise regimen, stressing protein and focused effort, Schatzki sees such fine-tuning as beside the point. Humans evolved to walk fourteen miles per day, and if we want to keep our health, we should do so, no matter our weight. Both solutions are more complex than that, of course, and you should not undertake either without consulting your doctor. But those are their respective positions, in a nutshell.
Both systems make sense, because both stress how the body is engineered. Human physiology exquisitely demonstrates purpose-built design, for hard work, feast-or-famine food supplies, and long-haul endurance in hardship conditions. Unfortunately, starchy diets and sedentary lifestyles short-circuit those advantages. Thus both systems are perfectly correct, and woefully lopsided.
That’s why I like Don McGrath’s Dream It, Live It, Love It: Beyond Well, Beyond 50. Where Bailor and Schatzki name ways people should improve weight and fitness, quoting stats and research, McGrath interviews people past fifty who, at an age when many settle into comfy ease, have continued, or even newly begun, competitive athletic careers. No abstractions for these heroes; they’re too busy living to hypothesize.
McGrath interviews runners, cyclists, rowers, triathletes, mountaineers, Special Olympians, and even a competitive dancer. Most are in their fifties and sixties, though some maintain top form into their seventies, eighties, and beyond. Banana George Blair, McGrath’s cover model, remains a top-ranked barefoot waterskier past ninety. The accompanying photos showcase trim figures with great skin and bright smiles. Nobody would mind turning fifty if age looked like this.
The patterns McGrath sees among these heroes, salted with a hint of science, support a three-part system for identifying what separates competitive masters from the rest of us. But these aren’t rare saints. McGrath’s athletes emphasize that, with determination and the right mindset, we could achieve the heights they have. The effects our culture associates with age reflect inactivity and poor choices more than actually getting old.
McGrath doesn’t contradict anything Schatzki or Bailor say. In fact, he proves them right. How we eat and exercise makes a difference. But instead of treading the rare air of scientific research, he shows us real people who incorporate these principles into their own lives. McGrath provides inspiration; Bailor and Schatzki provide tools. Now it falls on us to make the difference in our own lives.
Wednesday, November 23, 2011
Billy Graham's Guide to Aging in God's Grace
Famed evangelist Billy Graham once led month-long revivals noted for aerobic sermons. Now in his nineties and widowed, he can’t get out of his chair unaided. Many people might consider that a sign that the ministry has ended; not Graham. In Nearing Home: Life, Faith, and Finishing Well, he reveals the great gifts the aged still have to give for the Kingdom, and the lessons the aged still have to learn from the young.
Graham admits he never thought he’d live this long. The grueling paces he put himself through in his younger days often left him exhausted. Yet the Lord has another purpose for him. He might not get to lead his citywide “crusades” any longer, but his years of accumulated wisdom make him an invaluable resource for those who have inherited his ministry. He may lack the stamina to sermonize, but he has the experience to guide and uplift.
He does not claim this position for himself alone. He makes plain that, like the man who built his house on the rock, a family or society that trusts in the wisdom of the aged—the wisdom of the ages—will withstand the shifting struggles of the moment. Of course this means the aged have a responsibility to take stock of their wisdom. For Graham, as for countless evangelists before him, “fear of the Lord is the beginning of wisdom” (Psalm 111:10).
But he doesn’t limit himself to considering the aged as mere fonts for others to draw on. Those who live long enough have a new ministry. Whether they help guide coming generations, or use years of professional experience to defend the oppressed and uphold the discouraged, or just provide a good model of how to age well and face death secure in God, the elderly never stop offering something profound to this world.
Graham also speaks to the young, from multiple angles. Adult children of the aged have something to learn from their parents, and they have something to give as well. The young deserve the lessons of the aged, but they also have the privilege of offering their clear-eyed vigor to those who need assistance. The partnership of the young and the old, often attested in Scripture but too rare today, is dear to Graham’s heart.
In Chapter 7, Graham approaches something profound. At one time, multi-generational homes were common, and children sat down to dinner with their parents, grandparents, and even great-grandparents. Their partnership kept farms and cities both flush with shared heritage and full of the energy to uphold our best angels. Homes were places of learning, sharing, and growth, for the old as much as the young, who constantly prodded each other productively.
I wish Graham would take that even further. Today we hardly know members of other generations. We hide the aged in nursing homes, and the young in schools, creating a society of people only in their middle years. Our culture exists only in the present tense, without past or future, without wisdom or energy, without tradition or rebellion. Should we act surprised, then, that depression, drugs, and workaholism have rushed in to fill the void?
Graham’s recommendations include the spiritual, as we’d expect from a lifelong evangelist. But he also looks at practical considerations. Have the aged kept their affairs and medical directives current? What can the young do to look after their parents without lapsing into patronizing? How should the elderly respond, in our strange and complicated society, when called on to fulfill roles we never previously expected of them, like raising their grandchildren?
In raising these questions, Graham rejects the temptation to seek pat answers. He guides our attention to our own experience, our shared heritage, and Scripture. The answers to life’s questions, for Graham, always lead to the Kingdom, where we all hope to live when this life is over. To the faithful, this world is not our home. We will all die to this world, and then we will live forever where God always intended. Everything serves that goal.
Some of Graham’s intended readers are already aged. Most of the rest hope to live long enough to get old, too. Whether we reach that goal, or already have, we need a vision of how to age well and use our age to serve God’s calling. Graham’s clear, thoughtful approach offers great hope while remaining graciously free of stifling dogmatism. He offers real hope for the aged, and for those of us who look forward to age.
Monday, November 21, 2011
A Writer's Guide to Taking Your Own Control
When I began reviewing, I accepted any and all books that came my way, because I wanted to give up-and-comers a chance. But after several writers who had paid to publish their own books lashed out at me for calling unpolished books unready for the marketplace, I began flatly rejecting all self-published and subsidy published books. It speaks volumes for Joel Friedlander that his book, A Self-Publisher's Companion, has convinced me to change my mind on that policy.
Friedlander zeroes in on the reasons I react adversely to self-published books. Too many are done hastily, with little interest paid to design, legibility, or whether the audience will enjoy the finished product. Subsidy publishers in particular care little for whether the book sells, because they make their money from the author, not the reader. Many people who call themselves “self published” have merely printed their books, not published them.
But instead of calling a halt there, Friedlander looks at the difference between printing a book and truly publishing one. This means a sequential approach, starting with making sure the book is actually ready for distribution. Too many authors print cringe-inducing books because they rush past the editing stage, presenting us, the reading public, with prose that might make sense to the author. Without taking all the steps, such writers advertise their own amateurish tendencies.
This dovetails with Friedlander’s second point, that self-publishers must wear many hats. At some point they must stop being writers and become entrepreneurs. They must be serious enough about publishing to ask themselves what they need to add, what they need to cut, and then act on the answers. They must ask themselves what they cannot do for themselves. And they must admit when they need to seek help from outside.
Just because you have written a book does not mean you know how to arrange the pages for easy, pleasant reading. In the same way, many authors get caught up in the words they’ve written, and forget that the cover is an advertisement for the book. They either think the cover needs to represent their literary vision, or they care little for the cover and dash off a computer-generated art project. Outside and in, the book ballyhoos its lack of insight.
Or authors never investigate their niche. Novelists and poets craft art that is a pale shadow of work already available. Nonfiction writers fail to gauge their audiences’ needs and produce work that is lackluster, uninformative, or just vague. Writers of any stripe enter the self-publishing game with poorly defined goals and little understanding of the business. But, Friedlander insists, their common mistakes are entirely avoidable—and he suggests how.
Consider all the church cookbooks or school histories you’ve bought from fundraisers. Consider the cockeyed prose, misplaced photos, and flimsy binding. That’s because the producers didn’t think of themselves as publishers. But Friedlander believes that every church circle or PTA committee could make themselves into the hometown Rachael Ray or David McCullough if they just took themselves seriously enough.
Friedlander made his publishing chops in the letterpress printing heyday, when typesetters actually set type in a frame. This made for a meticulous design approach, since each page was an artwork in its own right. When he discovered an untapped market, and no publisher with an interest in testing uncharted waters, he used his hard-won experience to publish his own book. Even in those pre-Internet days, he sold 10,000 copies over five printings.
Since then, he has guided other aspiring publishers through the design process, helping even complete novices produce professional books. And he has steered them through the post-printing arc, which is just as much a part of the publishing process. Through it all, he has stayed abreast of new developments, helping publishers utilize advancing technology, innovative publicity techniques, and emerging media.
This is not a how-to book. Friedlander will help you prioritize what goes into your publishing budget, but he will not quote figures to you. He will help you weigh options in e-publishing, but he will not coach you on using specific platforms. If you need a more nuts-and-bolts guide, I recommend Tom and Marilyn Ross’ The Complete Guide to Self-Publishing. Friedlander cares more about developing a productive publishing mindset.
Yet he clearly knows his area. He helps aspiring publishers identify the industry’s most neglected corners. And he has successfully persuaded me that self-publishing is a legitimate approach for those writers—and those reviewers—who treat it with the respect it deserves.
Wednesday, November 16, 2011
Writing a Paper, and Other Futile Acts
I’ve grown accustomed, when my factory colleagues discover that I still teach English part time, to the reflexive revulsion they express. Plenty of people dislike English as a discipline. Many of my students make no bones about the fact that they consider my class a nuisance they must endure to reach the major courses they really want. More than one of my co-workers has admitted that an inability to savvy English class kept them from getting into college.
Not that I can blame them. Academic disciplines pick up in-group codes and lingo. I have a great interest in physics, but my one class in the subject foundered on my professor’s frequent recourse to jargon. I try to make my class as straightforward and free of buzzwords as possible, but many others don’t share that value. Especially in college, where professors aren’t trained in classroom management, teachers and students often speak completely separate languages.
The other day I overheard my manager—one of the few other workers with a college degree—speaking with another employee. My boss, “Miranda,” admitted that, while she liked most of college, writing-intensive classes drove her nuts. There seemed so many rules, such onerous restrictions, that writing got between her and learning. So she had absolute sympathy with anyone, like the worker she was speaking with, who finds learning a long row to hoe.
That’s when I realized the problem. Pedagogical theorists offer all kinds of explanations for why students struggle in this class or that one. Some critics think English class divides people according to the development of the corpus callosum, the structure that joins the brain’s hemispheres, since reading and writing require both. Others suggest that English success is class-based, since the poor have little time to read. I say: maybe.
More likely, students stumble on the highly artificial act of “writing a paper.” This act bears little resemblance to any form of technical, scholarly, or aesthetic writing performed by working writers. If they compared the writing done in most classrooms with the work of novelists, critics, and other people who make their living arranging words on paper, my students might be astonished by how little the two processes have in common.
In public school, my teachers hit me with strange rules about writing. I needed to arrange paragraphs with a topic sentence, supporting claims, and a summation. Paragraphs needed a minimum of three sentences. Papers had to consist of neither more nor less than five paragraphs. The first sentence of the first paragraph had to state the point of the entire paper, and the first paragraph had to summarize everything that would follow.
I balked at these rules as early as eighth grade. The writers I read for fun follow none of them. They stagger sentence and paragraph lengths. Few writers use topic sentences, and thesis statements are optional at best. As I advanced in writing, I learned that working writers seldom use other tools I’d learned as mandatory, like outlines and note cards. This tendency to treat certain writers’ tools as universal makes a straightforward expressive art falsely imposing.
But I had the greatest problem with the route my papers followed. I would spend hours, days, sometimes weeks working on a single paper, which I would show to one teacher, and the arc would be complete. I had no audience. There was no sense of participation in any larger communication. No one would ever respond to my work, because nobody but my teacher—who may or may not read the whole paper—would ever see it.
It felt like throwing my words into a vacuum.
The ability to translate abstract thoughts into words isn’t trivial. As I’ve written before, people have fought for the liberty to speak with others who aren’t right in front of them. The neurological development that comes with the ability to read and write also correlates with business and professional skills. Simply put, even people who don’t make a living as writers see their economic potential improved by perfecting the ability to communicate in writing.
But school hampers that freedom. It shackles expression to regulations and limitations. It threatens students with punishment if they violate systems that they know are completely artificial. Small wonder so many, even those who write recreationally, balk at classroom writing requirements. Saying we write to uphold the rules is like saying we have children so they can get jobs that enrich others without nourishing their souls.
Or is that in fact what we are saying?
Monday, November 14, 2011
The Hollow Life
Sarah has recently taken to watching those home shows on cable TV. The HGTV network is dominated by two kinds of programs. On the one hand, shows like House Hunters and Property Virgins show people trying to come into possession of houses. On the other, shows like Bath Crashers and Design on a Dime let people who already own houses remake them in a newer, hipper image. A few shows, like Property Brothers, straddle the line and do both.
Shows of this sort are popular with cable programmers in part because of their low production overhead. Because they don’t involve scripts, sets, or highly priced stars, they come with fairly low budgets. For cable networks that don’t sell much ad space, that must be pretty appealing. But they couldn’t sell any ad space if the shows didn’t have a viewership. And I think the audience draw for such shows speaks volumes about our current cultural condition.
Back in the Eighties, when media boosterism and the Reagan machine ballyhooed the belief that Americans had grown rich, shows like Dallas and Dynasty consumed the airwaves with images of wealth and splendor. By the Nineties, wealth became less important than the well-scrubbed but libertine parties on Friends and Melrose Place. Both trends represented not just our society’s aspirations, but how we thought everyone else lived, and how we wanted to live.
These myths about our social values did not reflect how we lived. The Eighties were not richer, or more viciously competitive, than anything before or after. In fact, the economic disparity that culminated in the current Occupy Wall Street protests began in the Eighties. Meanwhile, the best statistics indicate that even most people who consider themselves sexually liberated seldom have more than six partners in a lifetime, a number that didn’t fluctuate in the Nineties.
Instead, these shows depicted how we thought everybody else lived. We felt we had missed out on stacks of money and rampant casual sex, so we vicariously sat through depictions of how we thought others lived. And now, as the country suffers through the longest economic doldrums since the Great Depression, we feel like somebody, somewhere, lives in a comfier, more refined house than us, and we want to watch them.
Media professionals, of course, butter their bread with their ability to sell advertising space. The shows, news, and other programming that occupy their broadcast time exist to keep us tuned in long enough to see the ads. While many content creators like to think themselves aloof from such pressures, network execs occasionally admit, sometimes accidentally, that they customize their programming according to what ads they want to sell.
Watching Alexis Carrington devise small-minded plots to seize the family fortune may seem like an odd way to sell ad space. Yet Joan Collins’ undisputed glamour also made people hungry for Dior and Versace in their own closets. A decade later, Ross and Rachel may have lacked Alexis’ splendor, but they convinced audiences that their lives were incomplete until they owned suede living room sets and could spend copious free hours lounging at the coffee shop.
Poet and philosopher Wendell Berry points out that advertisers, by nature, sell a sense of lack. They persuade audiences that our lives have run hollow and that, unless we rush out and buy the latest slick toy, we cannot plug that hole. Blaise Pascal claimed we have a God-shaped hole in our hearts. Advertisers tell us we have a product-shaped hole. Unfortunately, in our noisy and cluttered modern lives, God makes a tougher sell.
America has a longstanding ethic of individual home ownership. Like the English and Dutch who founded our culture, we aspire to not have to share walls with anyone not of our choosing. Lowe’s and the Home Depot have compounded that myth by telling us that we can live in a spotless palace. And network programmers, who get paid to find inventive ways to part us from our money, pitch that dream to us in dozens of pre-packaged forms every day.
But just as glamour and sex lost our interest, palatial environs will tarnish, too. Ad execs will invent new dreams to make us feel disappointed with real life. Human beings are not, at root, acquisitive creatures. Left alone, most people get more pleasure from friendships, art, or a well-tended garden than from collecting trophies. We have the choice whether to let salesmen blind us to ourselves. I hope we have character enough to refuse what hucksters sell.
Friday, November 11, 2011
#OCCUPY the Factory
When I set my stack of stuff to the side of the work table at the factory, Dennis noticed my book amid everything. I was reading a pre-release edition of Charles Todd’s upcoming mystery, The Confession, for my blog, and I’d brought it along for lunch breaks. “What’s that you’re reading there?”
“It’s a British mystery. Well, British-style.”
“Any good?”
“I don’t know. I just got it. I like what I’ve read so far. And I liked the last book by this author.”
But Dennis wasn’t so much interested in the content as the simple two-tone cover illustration. “I like the picture,” he said. “Who did that, do you know?”
“I can’t say. Pre-release editions generally don’t have all the credits.”
“That’s a really cool picture. I’d like to do something like that.”
“You mean cover art?”
“No, just art in general. I’d love to do comic books, or maybe movie posters and storyboards. I’ve tried to do pictures like that, with the two colors that make it look fully dimensional.” Dennis reached over and touched the picture like it was an original artifact. “I feel like I’m close, I just haven’t gotten there yet.”
I wish we could have had time to finish that conversation. We had work to do, and strangely enough, our bosses wouldn’t take it well if we spoke rather than crunched our tasks. So many questions lingered there, waiting for me to ask.
Like, why does he feel he can’t do that job? What stands between him and the ability to get paid for something he’s good at? And why does he, like so many other workers at the factory, speak of his life in the future tense?
I’ve heard many factory workers talk about what they’d like to do. One young woman, who already has a degree in commercial finance, can’t receive her CPA credentials until she completes two mandated online courses, which she can’t yet afford. Another guy wants to use his Associate’s Degree in diesel mechanics to work for the railroad, but needs to move across the state to get to where the work is.
Just days ago, Dennis and I sat in the lunch room, and he flipped through the photos in his iPhone, showing me the makeup he and his son wore on Halloween. Dennis designed it himself, turning his son into a zombie and himself into a leering death’s head. The complexity was breathtaking. If his comic art matches his makeup skills, he has profound potential alongside the likes of Jack Kirby.
Yet he has stuck to the factory for several years. He has set himself the standard of breaking even. Perhaps he feels that, to feed his family, he needs to defer his hopes. There’s something to be said for keeping body and soul together: why shouldn’t he work, if he wants to get paid?
Yet the factory relies on workers who believe they have no other option. As long as workers feel they can’t leave, the company can set wages at any level it wants. In fairness, our factory pays more than a lot of employers in our ho-hum town. But in relation to the company’s annual net, workers’ wages represent a drop in the sea.
The Occupy Wall Street protesters have repeatedly decried financiers who use rubber math to enrich themselves without producing goods. Short-selling commodity futures, leveraging one fictional financial instrument against another, and gambling on which will crumble first have made a self-selecting elite appallingly wealthy. Meanwhile, people who produce marketable goods have seen little wage movement in nearly two generations.
When I hear commentators disparaging the Occupiers, saying people are demanding handouts and wanting something for nothing, I think of Dennis. He doesn’t want anything handed to him. He wants to work, and work hard, at something he’s good at. But because of our economic conditions, he feels he can’t.
Of course, we need people to work in factories, producing goods. Several of my colleagues at the factory enjoy producing filters, and are good at it. Although this isn’t the life I anticipated in graduate school, I’ve discovered I’m pretty good at the factory, and it’s remarkably fulfilling work.
But if hard-working, industrious people can’t get work in fields where they have spent time advancing their skills, where they have proven their ability, that speaks volumes about our economy. This system has let its best people down. The Occupy protesters don’t want freebies and greedy displays. They just want work for people like Dennis.
Wednesday, November 9, 2011
All Hats, Strange Cattle
Toward a Politics of the Imagination, Part Four
David Remnick’s commentary in this week’s New Yorker, Decline and Fall, suggests that the recent bizarre flame-out by Republican Presidential contenders—mainly Rick Perry’s disjointed New Hampshire speech and Herman Cain’s non-answers to serious allegations—marks a new low in political discourse. He says that “the spectacle of the Republican field is a reflection of the hollowness in the G.O.P. itself.” With more Republican debates impending, expect further meltdowns and evasions in coming days.
Yet I must respectfully dispute Remnick’s claim, insofar as he focuses only on the GOP. The problems he describes reflect a political system turned inward upon itself, in which true believers only talk with one another. Recent events like the “Let Him Die” incident at the September 12th debate, or the booing of a gay soldier on September 22nd, suggest a partisan electorate so invested in a view that alternatives, exceptions, and nuances do not penetrate their thinking.
President Obama is seeking his party’s nomination unopposed, so we have no equivalent anecdotes yet from the Democratic side. But give it time. I attended a town hall meeting held by my state’s Democratic Senator prior to the vote on Obamacare. Despite a surprising number of progressives for a historically conservative state, the audience’s extreme polarization, and the unwillingness of either side to let their opponents speak without interruptions or catcalls, was appalling.
When people speak only to peers who already agree on everything, at least in broad strokes, the discussion usually ends with all participants feeling more entrenched, more extreme, and more doctrinaire in their opinions. Anyone who watches The Five on Fox News or ABC’s The View has seen this in action. Psychologists call this “group polarization.” But military jargon has an altogether more apt term for this phenomenon: “incestuous amplification.”
We saw this play out in the 2004 Presidential election, when liberals and conservatives lined up like cattle in a chute behind their respective major party candidates. As Edward Herman and Noam Chomsky assert in Manufacturing Consent, the media echo chamber managed to drown out the fact that neither George W. Bush nor John Kerry really reflected the views of their respective voters. But that didn’t matter. Both talked the game their media handlers expected, and the vote came down to one of the narrowest splits in modern history.
Democrats in particular felt blindsided by that result. The accumulation of leftist websites and message boards, talk shows and media outlets, let them avoid seeing the larger, more diverse electorate. Like cattle in a chute, progressives could not look left or right, and could only see the rump of the steer in front.
Adrian Wooldridge and John Micklethwait say in The Right Nation that many Europeans thought Kerry would take 2004 handily. That’s because the only Americans most Europeans know live in New York and San Francisco. Limited input created the misperception that progressives dominate the States. But saber-rattling nationalists always get their greatest support close to home. Those same Europeans repeatedly vote to restrict Muslim rights or tighten immigration rules, without apparent awareness of the contradiction.
I could say we need to listen to each other more, and seek dissenting viewpoints, but that’s a bromide, not a solution. The leaders who currently split the electorate have done so by ginning up anger that we can only resolve by turning authority over to a perceived leader. That anger will not go away easily, even if centrist voters, who make all the difference, find such demagoguery off-putting.
Instead, perhaps we should change our voting rules. As governor of California, Arnold Schwarzenegger impressed me seldom, but one (failed) proposal garnered my support. By requiring Congressional, Senate, and State Assembly candidates to run in non-partisan primaries, with the top two finishers, regardless of party, facing each other in the general election, the system could discourage pandering to the extremes. This would engage independents, centrists, and single-issue voters in a way they aren’t engaged now.
While such a simple change may seem facile, and would unfairly disadvantage third parties, it at least represents a vision. It suggests that the great mass of Americans, not just party loyalists, deserves a voice in the discussion. And by incorporating all voters at all stages, it emphasizes that we are citizens with rights, not cattle to be herded.
America’s great principles remain worth our loyalty. But “divide and conquer” politics has turned important decisions over to morally vacuous tubthumpers. If we believe that a free and sovereign people controls its own destiny, then it’s time to take those decisions back.
Part One: Toward a Politics of Imagination
Part Two: Democracy, the Heart, and All Things Healing
Part Three: The Christian in the Free World—A Survey
David Remnick’s commentary in this week’s New Yorker, Decline and Fall, suggests that the recent bizarre flame-out by Republican Presidential contenders—mainly Rick Perry’s disjointed New Hampshire speech and Herman Cain’s non-answers to serious allegations—marks a new low in political discourse. He says that “the spectacle of the Republican field is a reflection of the hollowness in the G.O.P. itself.” With more Republican debates impending, expect further meltdowns and evasions in coming days.
Yet I must respectfully refute Remnick’s claim, insofar as he only focuses on the GOP. The problems he describes reflect a political system turned inward upon itself, in which true believers only talk with one another. Recent events like the “Let Him Die” incident at the September 12th debate, or booing a gay soldier on September 22nd, suggest a partisan electorate so invested in a view that alternatives, exceptions, and nuances do not penetrate their thinking.
President Obama is seeking his party’s nomination unopposed, so we have no equivalent anecdotes yet from the Democratic side. But give it time. I attended a town hall meeting by my state’s Democratic Senator prior to the vote on Obamacare. Despite a surprising number of progressives for a historically conservative state, the audience’s extreme polarization, and the unwillingness of either side to let their opponents speak without interruptions or catcalls, was appalling.
When people speak only to peers who already agree on everything, at least in broad strokes, the discussion usually ends with all participants feeling more entrenched, more extreme, and more doctrinaire in their opinions. Anyone who watches The Five on Fox News or ABC’s The View has seen this in action. Psychologists call this “group polarization.” But military jargon has an altogether more apt term for this phenomenon: “incestuous amplification.”
We saw this play out in the 2004 Presidential election, when liberals and conservatives lined up like cattle in a chute behind their respective major party candidates. As Edward Herman and Noam Chomsky assert in Manufacturing Consent, the media echo chamber managed to drown out the fact that neither George W. Bush nor John Kerry really reflected the views of their respective voters. But that didn’t matter. Both talked the game their media handlers expected, and the vote came down to the narrowest split in history.
Democrats in particular felt blindsided by that result. The accumulation of leftist websites and message boards, talk shows and media outlets, let them avoid seeing the larger, more diverse electorate. Like cattle in a chute, progressives could not look left or right, and could only see the rump of the steer in front.
Adrian Woolridge and John Micklethwait say in The Right Nation that many Europeans thought Kerry would take 2004 handily. That’s because the only Americans most Europeans know live in New York and San Francisco. Limited input created the misperception that progressives dominate the States. But saber-rattling nationalists always get their greatest support close to home. Those same Europeans repeatedly vote to restrict Muslim rights or tighten immigration rules, without apparent awareness of the contradiction.
I could say we need to listen to each other more, and seek dissenting viewpoints, but that’s a bromide, not a solution. The leaders who currently split the electorate have done so by ginning up anger that we can only resolve by turning authority over to a perceived leader. That anger will not go away easily, even if centrist voters, who make all the difference, find such demagoguery off-putting.
Instead, perhaps we should change our voting rules. As governor of California, Arnold Schwarzenegger impressed me seldom, but one (failed) proposal garnered my support. By requiring that Congressional, Senate, and State Assembly candidates run in non-partisan primaries, with the top two candidates, regardless of party, facing each other in the general election, the system could discourage candidates from actively pandering to the extremes. This would engage independents, centrists, and single-issue voters in a way they aren’t now.
While such a simple change may seem facile, and would unfairly disadvantage third parties, it at least represents a vision. It suggests that the great mass of Americans, not just party loyalists, deserves a voice in the discussion. And by incorporating all voters at all stages, it emphasizes that we are citizens with rights, not cattle to be herded.
America’s great principles remain worth our loyalty. But “divide and conquer” politics has turned important decisions over to morally vacuous tubthumpers. If we believe that a free and sovereign people controls its own destiny, then it’s time to take those decisions back.
Part One: Toward a Politics of Imagination
Part Two: Democracy, the Heart, and All Things Healing
Part Three: The Christian in the Free World—A Survey
Friday, November 4, 2011
Manly Men in a Girly Church
Look around any Christian church service. Chances are, among adults, women outnumber men two to one in the pews. Unmarried men are largely absent. A man may occupy the pulpit, but women probably do everything else, from presenting liturgy to performing music to distributing communion. Media producer David Murrow noticed this, and set out to discover Why Men Hate Going to Church. His findings seem bleak at first, but ultimately offer a great deal of hope.
Statistics abound showing that men avoid church. This applies across congregations, denominations, races, and nations. Other religions don’t share this problem: the gap crosses every demographic divide within Christianity, yet remains a uniquely Christian phenomenon. It isn’t even new; Murrow cites one historian who traces this divide back seven centuries. Not coincidentally, this corresponds with when Church iconography started emphasizing a battered, bleeding Jesus versus a smiling Virgin Mary.
Though many have disparaged church as a patriarchal institution, anybody who watches how congregations run will notice that men occupy a thin stratum at the top of the pyramid. The pastorate remains a largely male occupation, in some churches exclusively male. But women dominate committees, volunteer organizations, and all other nuts-and-bolts aspects of the congregation. One man may “lead,” but women make the place run.
So we shouldn’t be surprised when church becomes feminine. Theology has grown spongy, emphasizing love and minimizing ethics. Music has become feminized, and many church composers write songs about being “in love with Jesus.” Many congregations have replaced altar paraments with lace doilies and flower arrangements. Even the typical Jesus portrait renders Him demure and androgynous. No wonder dudes don’t want to show up on Sunday.
This doesn’t accord with the scriptural Christ. Sure, He urged us to love one another, welcomed children, and promised us rest in Him. But He also drove out the money changers, called the Pharisees some pretty harsh names, and never backed down from a just fight. Empires don’t execute hairy provincial preachers for gently suggesting people get along and pray more. But you wouldn’t know that from the contemporary church.
This trend pushes men out of America’s churches in droves. Though most American men self-identify as Christian, they see church as an impingement on their masculinity. Six days a week, we have to be butch, mighty, and ready for any challenge; but on the Sabbath, we have to take on girly appurtenances and act dainty. Guys ain’t having it.
Murrow admits this sounds sexist at first blush. And he admits there are men in church; he’s one of them. But church as it stands attracts men who are primarily emotive, artistic, and introspective. Men like Murrow and me. Men, in short, who think like women. These are the men who go to seminary. This leads to the effete vicars so familiar to Monty Python fans.
Of course, there are also women who think like men. Just don’t look for them in church.
This is more than a gender issue. Murrow collates numerous statistics demonstrating that churches dominated by women tend to focus internally, fear making tough decisions, and resist needed changes, for fear of hurting anyone’s feelings. Without a balance of the masculine and feminine, like Jesus Himself demonstrated, congregations founder. Only those churches that attract both men and women have a future.
Fortunately, all is not bleak. Murrow spotlights many steps congregations have already taken to reverse their loss of men. One church he extols, led by a female pastor, reversed its decline by simply asking, while preparing the weekly service, what would a solid blue-collar dude think of this? Reclaiming men doesn’t require revised theology, new liturgy, or male dominion. It just requires keeping men’s unique psychological needs in mind.
Attracting men doesn’t mean discouraging women. Indeed, since women are allowed into male domains in ways men can’t enter female domains, women will come to churches that invite men. Note, this isn’t a claim about what women should do or ought to do; Murrow doesn’t deal in abstractions. He shows how real churches, taking practical steps, have boosted both men and women in the pews by keeping men in mind.
If we want men back in our churches, we don’t need new music or new ministries. We don’t need showmanship or spectacle. Churches simply need to relearn how to speak the language of dudes in the regular activities they carry on right now. Men show up when they feel needed and useful. Jesus started a movement with twelve guys. Surely the modern church can do just as well.
Wednesday, November 2, 2011
Companies That Know Our Needs Before We Do
Why did the iPod and Kindle create an appetite in consumers while the Zune and the Sony Reader didn’t? Why do grocery buyers regard visiting Safeway as a chore, but look forward to shopping at Wegmans? Adrian Slywotzky investigates these questions in Demand: Creating What People Love Before They Know They Want It. But he overtalks the point and, unfortunately, blurs the line between business writing and advertising.
Consider: did you have any idea you were unhappy with Lackluster Video before Reed Hastings launched Netflix? Did you feel the need for constant social networking before you signed up for Facebook? Of course not. Yet the creative minds behind these breakthroughs recognized needs so completely unmet that we hadn’t even noticed we had them yet. Now we can’t imagine our lives without these cushy conveniences.
Slywotzky admits from the beginning that he has no simple formula for what he calls “demand creators,” those producers and vendors who sell what we never knew we needed. Though he identifies several areas where future developments will happen, he can’t give you a step-by-step guide to creating demand. Demand creation relies on creativity and insight. Our best hope is to learn from those who have done it before.
It may seem that many demand creators make their leaps only once. Slywotzky published this book before Netflix’s recent flame-out. While Reed Hastings had a good idea, his follow-through hasn’t been exemplary. He had one idea that revolutionized the media distribution business, but he has peddled a great deal of confusion lately. Can we really learn anything from someone who misjudged his audience so badly?
Yet remember how many times Steve Jobs or Jeff Bezos hit the nail on the head. Each correctly anticipated not only what customers wanted, but how developing technology made fulfilling those wants possible. And technology isn’t even necessary. The Wegmans grocery chain has stayed ahead of the competition for decades by simply making its stores a destination, and by not expanding faster than economic realities permit.
Some demand creation requires routine awareness. Jeff Bezos recognized the Kindle’s potential when he saw a Sony Reader, which was well designed but poorly pitched, and realized it could benefit from Amazon.com’s infrastructure. Other demand creation is more studious. California-based CareMore delivers managed medical care to the elderly at steep discounts while improving quality because market research returned results that appear downright counterintuitive.
For all their differences, though, demand creators share an ethic of recognizing their customers as real people with real needs. Even non-profits like Teach For America or the Seattle Opera recognize their audiences, in very real ways, as customers. The institutions have something people need—such as education or culture—and their first responsibility is to match their product with others’ needs.
But I have difficulty understanding the mental habits Slywotzky wants me to gain because his segmented structure doesn’t let me spot patterns. He discusses two or three companies per chapter, from profitable industries to philanthropic institutions, but by the next chapter, those examples vanish. I’d love to know how Zipcar embodies Slywotzky’s various virtues, but once he’s done with it, he never mentions it again.
And his praise of companies he admires goes on at length. His praise, in particular, of Tetra Pak and the Kindle goes well beyond what we need to understand these products’ success. Slywotzky’s stories mount up, with such unflagging praise that he starts to sound like a PR flack. Is he paid by the line? My eyes glaze over, and I find myself wondering whether I’ve really learned anything in the last fifteen or twenty pages.
Business writing must respect that business innovators lead busy lives. Their workweeks generally exceed forty hours, and time spent reading should be remunerative in some way, whether by teaching something useful or providing needed psychological refreshment. Writers must put every anecdote, every discursion, every rumination on trial for its life. Anything that doesn’t advance the point must go.
This relatively long book could use a firm editor. It’s nearly a third longer than similar business books, and even at that length, feels disjointed. Much as I appreciate the individual points, and much as they advance ongoing discussions, this book doesn’t go nearly as far as it should. Inquisitive readers can gain plenty from it, but it simultaneously runs far longer than it should and doesn’t do its topics justice.
I’ve identified a demand I didn’t know I had. I demand this necessary book, with the numerous kinks worked out.