
Thursday, October 16, 2025

In Praise(ish) of Dollar Stores

The Dollar General I've grown reliant on, in a Nebraska town of only 1000 people

We’ve all heard the ubiquitous complaints about Dollar General and similar “dollar stores,” though that name is increasingly anachronistic. They keep prices low by paying workers poorly, running perpetually short-staffed, and excluding local artisans. Their modular architecture is deaf to local culture and design. They distort our ideas of what commercial goods actually cost. Even John Oliver dedicated a block of valuable HBO time to disparaging how Dollar General hurts workers and the local community.

Like a good economic progressive, I internalized these arguments for years. I prided myself on avoiding dollar stores like plague pits. I complained bitterly when a treasured local business got flattened to build a Family Dollar. And even when, on an extended Missouri holiday, fifteen minutes from the nearest grocery store, I finally gave in and entered the Dollar General only two minutes away, I rationalized it by telling myself I remained dedicated to local businesses.

Through the last year, though, I’ve found my perspective shifting. I find dollar stores, and Dollar General specifically, more necessary than I previously realized. Dollar General is like audiobooks, or a Slanket. These products, invented to streamline disabled people’s lives, have gotten derided by able-bodied elites, who don’t realize how privileged their taunts really are. Likewise, Manhattan-based John Oliver, and writers in coastal California, might not realize how dollar stores improve life in rural America.

In the year I’ve spent caring for an aged relative in rural Nebraska, I’ve become deeply reliant on Dollar General. It keeps longer hours than the local grocery store, auto parts retailer, or pharmacy. This has made it absolutely essential for buying convenience foods, over-the-counter meds, and motor oil. That’s saying nothing of other products that nobody else sells locally, like home décor, kitchen supplies, and paperback books. Without Dollar General, these commodities would disappear.

Not that semi-luxury commodities don’t exist in rural areas. But without Dollar General, the nearest big-box retailer selling these products would be over an hour’s drive away, or else Amazon, which delivers to rural areas only sluggishly. The stereotyped image of rural America, with its drab houses, faded curtains, and faded clothes, bespeaks the ways retailers don’t bother investing outside already-lucrative markets. Dollar General, by contrast, arguably creates lucrative markets in marginal or abandoned areas.

The small but diverse produce section in my local Dollar General

City slickers might not realize how inaccessible food is in rural areas. The term “food desert” often describes urban cores, especially non-White neighborhoods, where, as Dr. King recognized, fruits and vegetables are unaffordable, if available at all. But rural America is also heavily food-desertified. Because economic pressures, especially from banks, push farmers to abandon diverse agriculture for industrial monocropping, farmers seldom eat their own produce. In the northern Great Plains, farms mostly grow livestock feed, not human food.

These conditions didn’t just happen. Macroeconomics isn’t inevitable, like rain. Top-level American economic policy flooded America’s central corridor with population after the Civil War, via the Homestead Act. But into the Twentieth Century, that same economic policy largely abandoned the homesteader population in favor of urban industrialization. America still needs its rural agricultural population; it just doesn’t provide that population with meaningful support anymore, since they aren’t lucrative donors. Nobody ever got rich hoeing corn.

Dollar stores are the logical market-driven response to this abandonment. Rural communities have some money, and they want—and deserve—nice things, like attractive curtains, affordable art supplies, and small electronics. Dollar General recognized an unmet market, and met it. Sure, they manipulate wholesalers and use just-in-time restocking to keep prices artificially cheap, in ways unsupported local businesses just can’t. But they aren’t culpable for economic policies that made Pop’s old-timey general store fiscally unviable.

Even their notoriously impersonal architecture reflects America’s top-level economic policy. Sure, I’d love if dollar stores hired local architects to conform their buildings to regional aesthetics. But Walmart and Target already priced architects out of small markets, and America’s rural economic abandonment means the dominant design style is often “decay.” Dollar General’s warehouse-like design and modular construction let them move into marginal markets quickly without incurring high amortized overhead, which just makes good business sense.

Don’t misunderstand me: dollar chains’ low pay, homogeneous product, and deafness to local industry aren’t sustainable. I’ve grown fond of Dollar General, but at best, they’re a transitional response to larger forces. But John Oliver’s implicit expectation of shoving a Whole Foods into every small market is equally unrealistic. If we use the market access which dollar stores provide to build toward something better, then these chains will have helped America through its current economic malaise.

Wednesday, July 19, 2023

Those Who Walk Away From Tinseltown

While the multi-union strike promises to halt Hollywood for months, a parallel story is emerging: the box-office success of Sound of Freedom. The true-ish story of Tim Ballard and Operation Underground Railroad attracted massive advance buzz, and tapped into an already-popular theme to debut at #3, behind major-studio franchise giants. At this writing, it’s cleared six times its published budget, without major-studio backing or distribution.

Much coverage has focused on how accurate the movie is, or isn’t. But that’s another discussion. More interesting, the studio behind the project, Santa Fe Films, made the movie, found a distributor, then found another distributor after the first one bailed, and printed and shipped for under $25 million. In a movie landscape dominated by surefire blockbusters, where mainstream studios won’t dust their shelves for that money, the return on investment is huge.

Compare the major-studio blockbuster returns. Avatar: The Way of Water cleared over $2 billion for 20th Century Fox, but on a budget of at least $350 million (reports differ) before distribution. That’s more absolute dollars, but largely the same rate. The latest Marvel movie, Guardians of the Galaxy, Vol. 3, grossed $850 million on a $250 million budget, a narrow-enough rate to constitute a virtual loss. Disney executives have announced tightening Marvel and Star Wars release schedules amid dwindling returns.

The Big Five studio conglomerates rely on franchises to remain afloat. Sound of Freedom came third in its opening weekend box office; first and second were the fifth Indiana Jones movie and the seventh Mission Impossible. Conglomerates keep returning to franchises like the Transformers, John Wick, Fast and Furious, and James Bond. (Okay, James Bond is an MGM property and therefore not Big Five. Stick with me.)

Then there are sequels that nobody actually wanted and fans despised, like a fourth Matrix movie, every attempted Predator sequel, and the entire Jurassic World trilogy. I’ve said it before but it bears repeating: the Hollywood mainstream is so creatively bereft that it can’t even breathe new life into existing successful properties. Only through constant saturation marketing have studios persuaded audiences to view the pablum they keep dribbling out.

Meanwhile, Sound of Freedom is only the latest movie from Christian-themed indie production houses to draw returns well beyond its budget. Prior pious hits like Courageous, Fireproof, and Same Kind of Different as Me have attracted large audiences and robust returns despite their art-house budgets, narrow target audiences, and often unreliable distribution. They go outside the Hollywood mainstream for their funding and promotion, and audiences reward them for it.

Don’t misunderstand me. Several recent Christian mockbusters have been real stinkers, like the God’s Not Dead quartet, or Kirk Cameron’s Saving Christmas. Kirk Cameron in particular needs to realize he’s outlived his Eighties sitcom popularity, and reconsider his life’s choices. And the Christian mockbuster industry leans heavily conservative, parroting the existing moral and religious views of its largely sectarian audience. My opinions here are not uncritical.

However, a certain subset of right-wing Christians have successfully reverse-engineered an alternative Hollywood structure to make, distribute, and showcase their artwork. Their smaller studios must compete for talent, and their margins are narrow enough that they can’t waste the audience’s time with piffle. That means they have to know their audience, giving viewers what they want while denying them what they merely think they want.

Mainstream Hollywood has become highly vertically integrated. As Giblin and Doctorow write, the major agencies, talent scouts, and studios have merged. Therefore making a mainstream movie pitch means entering with a screenwriter, star, director, and designer already signed. Without competition for talent and content, the resulting returns (such as they are) don’t roll into the next movie; they go into executives’ and shareholders’ bank accounts.

Hollywood has become an unfree market. Striking for better conditions and improved contracts will bandage the wound, but it won’t fix the underlying problem, that under the current system, prices don’t float, and the major studios are more likely to collude than compete. In this one circumstance, however, participants have a solution that other mistreated groups don’t share: they can walk away and build their own system.

Our society has other underlying problems which participants must work to fix; poverty, racism, and corruption come to mind. We can’t just walk away from these problems, because we have only one government, only one nation-state. In this unique case, however, walking away is possible. There’s already a model of how others have done it. Hollywood’s mistreated grunt laborers should grab their tools and go.

Saturday, October 29, 2022

Some Incomplete Thoughts on Elon Musk

Elon Musk

Elon Musk has reportedly completed his hostile takeover of Twitter. This joins companies like PayPal, Tesla, and SpaceX that he has burrowed into, tick-like, to nourish himself off the product other people have previously built. But Twitter has a distinct character, an identity that makes it different than those other companies. Twitter doesn’t just provide a product or service; Twitter is a platform. Which opens very different implications.

PayPal, Tesla, and SpaceX need customers to do business. If customers withhold their business, the business dwindles—though, admittedly, it’s rare for large businesses to dwindle to zero. Twitter doesn’t need customers; it needs content. Without users constantly creating content, and the lucrative controversy that content often provides, there’s no business, regardless of how many advertisers and daily viewers the site retains.

Social media networks essentially exist to resell user-created content to other users. Facebook, Instagram, and TikTok all exist to the exact extent that somebody keeps creating content, and if content creators start withholding their words and images, the business model dries up. Recall, advertisers kept pumping money into Friendster, Bebo, and MySpace for some time, but the business models all collapsed because ordinary people stopped creating content.

However, that content isn’t neutral. Musk has garnered right-wing support for his high-profile buyout by championing “free speech,” a coded term for loosening Twitter’s anti-hate-speech rules. Likely beneficiaries of such loosening include Donald Trump, who used Twitter to incite violence; Jordan Peterson, who persists in aggressively deadnaming transgender public figures; Alex Jones, who peddles lies for profit; and David Duke, banned for being David Duke.

As a free speech absolutist myself, I realize why being absolute is conditional. In my hometown, a bar owner needed to ban a handful of people for their language. They used hate speech, including the N-word, quite loudly, and picked needless arguments regarding politics. Any business owner knows that, if you don’t ban Klansmen and fascists early, other customers start avoiding your store; before long, you find yourself running a store for Klansmen and fascists.

Mark Zuckerberg

The same underlying principle drives social networks. Twitter and Facebook banned Donald Trump, not because his words were hateful and incited violence, but because they knew that if they didn’t, other users would leave. Content creators won’t create content if their content rubs elbows with Trumpist bullshit. And again, no matter how many code-writers or advertisers the parent company has, without content, there’s no company.

Serial entrepreneurs Alex Moazed and Nicholas Johnson describe Twitter, and similar businesses like Facebook and Etsy, as platform businesses. That is, these businesses don’t sell anything specific; they sell platforms users can utilize to tell stories, connect with friends, or simply air opinions. Moazed and Johnson regard these businesses as cash cows because they aren’t limited by warehouses or shelf space; given sufficient server storage, they’re hypothetically infinite.

After the Cambridge Analytica scandal, Moazed and Johnson’s description sounds remarkably naïve. Even beyond knowing that social networks profit by selling customers’ metadata, and customer contacts’ metadata, the hypothetically infinite business isn’t infinite. Without people posting thoughts, pictures, and petty quarrels, social media has no business model. This gives users greater authority than standard business models—though that authority often isn’t readily visible.

Other business models favor gigantism and consolidation. Five publishing conglomerates now control the book industry, except they don’t really, because all five do fully half their business through one vendor, Amazon. That’s why boycotts seldom work, because they require such massive commitment by millions of people over vast amounts of time that they usually stall faster than a Model A. It’s hard to fight monopolies of that magnitude.

The same doesn’t apply to social media. Sure, there’s a similar monopolist impulse; Facebook founder Mark Zuckerberg has bought out numerous other social properties, including Instagram and WhatsApp. But it’s much easier for smaller numbers of customers, working concertedly, to submarine the business model. It took only two years of Rupert Murdoch’s total mismanagement to transform MySpace from Earth’s biggest website into the Mary Celeste of the whole internet.

Therefore, I contend, Elon Musk will reverse his entire course within weeks. The very “free speech” he purportedly supports will become economically toxic if not subjected to certain minimum restraints. Sure, Twitter is sometimes morally toxic, encouraging petty squabbles and polarization, but Elon cares more about economic toxicity. The minute content creators start jumping ship to avoid being contaminated by fascist twaddle, he’ll become as authoritarian as anybody.

Because rules exist for a reason, even for the rich.

Friday, October 18, 2019

Scorsese and “New Hollywood” Part 2

Martin Scorsese's longing for the auteur-driven Hollywood culture which birthed his career is, as I wrote yesterday, naïve and anachronistic. But it isn't wrong. Because the conditions which assassinated New Hollywood around 1982 currently exist again. Only this time, it'll be much, much harder to dislodge them, for two reasons. And Scorsese's abhorred Marvel Comics movies, which caused the current controversy, are only part of one reason.

I stand behind my statement that franchise spectaculars like Star Wars provided a necessary countervailing force to New Hollywood's somber introspection, while also bankrolling more esoteric fare. Initially. But it didn't take long for the economic weight of these movies to change the companies that owned them. George Lucas hadn't finished his first trilogy before abandoning his planned story and turning Return of the Jedi into a toy commercial.

However, this rapid descent of the blockbuster ethos into silliness corresponded with the first reason it’ll be hard to fix. Under the Reagan Administration, Hollywood deregulation meant studios were merging, while also buying controlling interests in the distribution networks and cinema chains which got their products in front of eyeballs. Studios that had been dying of ennui in 1974 were, by 1984, flush with cash, and began using it to consolidate their market position.

Today the big-screen market has few controls: studios own each other, their distributors, the cinemas, and now streaming services. Disney, which Michael Eisner famously saved from extinction in the early 1980s, has become a monolith, controlling Lucasfilm, Marvel, and now Fox. Amazon, once a pinch point in distribution, has become a studio in its own right. A few corporations so thoroughly control the supply end of the curve that, in essence, the entertainment market isn’t free.

This hasn’t been magic for the studios. The apotheosis of vertical integration came with Roland Emmerich’s 1998 Godzilla remake, which opened on an estimated one-quarter of America’s cinema screens, and still died. The pattern has repeated recently, with Universal Studios’ planned Dark Universe dying in its cradle, and the DC Extended Universe apparently following suit. Though pitching to the middle is supposedly lucrative, audiences have grown leery.

But, as Scorsese notes, Marvel movies remain a cash cow. They’ve forestalled the terminal malaise that doomed other series, like Jason Bourne or Pirates of the Caribbean. Audiences still love them, which is the second reason studios have little motivation to abandon the blockbuster model. If, as I stated yesterday, audiences embraced blockbusters because New Hollywood became too repetitive, I can’t explain why this problem dooms some franchises, but not others.

Though the standard model holds that audience preferences drive the industry, that isn’t always so. People are naturally drawn to what’s already familiar, and resist the truly innovative. Movies now regarded as classics, from Citizen Kane and Night of the Hunter to The Big Lebowski and The Shawshank Redemption, initially died at the box office, because audiences found them too different. Risky movies often become successful only in retrospect.

So, between the entertainment industry’s monopoly practices, and audiences’ love for the familiar, there’s little motivation for studio bean-counters to approve anything too innovative. Studios once provided homes to executives like Alan Ladd, Jr., whose gut instinct on unproven properties like Star Wars proved insightful. Today, their decisions are controlled by algorithms which test whether proposed new properties sufficiently resemble what’s been successful in the past.

And because the studios also overwhelmingly own the distributors, they have manifest disincentive to permit smaller, independent producers to get into the business. Upstarts like Blumhouse Productions occasionally break into distribution, giving self-financed filmmakers glimmers of false hope; but overall, without a contract with the majors, don’t expect your film ever to go anywhere.

So, combining what I said yesterday with what I’ve written here, where does this leave us? Martin Scorsese’s naïve yearning for New Hollywood is a non-starter, because audiences don’t want that, and the film studios have become too consolidated to do something risky. Industry thinking has become very short-term. But the success of Marvel movies notwithstanding, box office receipts suggest “blockbuster fatigue”; even Star Wars is showing diminishing returns.

Conditions exist for something new to emerge. It probably won’t resemble either New Hollywood, with its self-consciously artsy flavor, or the blockbuster era, dependent on low-risk franchises. Because the five major studios have a stranglehold on the industry, it’ll take some executive willing to defy algorithms and accountants… which is rare currently. Maybe Scorsese can find and cultivate such an executive. Maybe we’re ready.

Thursday, October 17, 2019

Martin Scorsese and the Ghosts of “New Hollywood”

Martin Scorsese
Martin Scorsese, one of the few cinematic directors from his generation still making lucrative movies, garnered attention this weekend when he publicly dumped on Marvel Studios. Extending upon previous comments, he referred to comic book-based action movies as “not cinema” and said, “We shouldn’t be invaded by it.” Street-level commentators and bloggers like me have dogpiled on Scorsese for his comments. But on further consideration, I have my doubts.

I cannot help focusing on his phrase “not cinema.” After fifty-one years directing movies, surely this fellow understands what “cinema” means. Scorsese is among the final survivors of “New Hollywood,” an auteur-driven movement in filmmaking when studios, feeling unable to compete with television, handed unaccountable stacks of cash to ambitious directors and granted permission to go crazy. But does that give him permission to define cinema for everybody else?

New Hollywood began, critics largely agree, with Warren Beatty’s notorious over-the-top craptacular, Bonnie and Clyde. It was ultimately done in by one of its staunchest adherents, George Lucas… though we’ll return to that. During its generation, observers writing New Hollywood’s obituary actually blamed Michael Cimino, whose bloated ego vehicle Heaven’s Gate lost so much money, it killed United Artists. I contend, though, New Hollywood could’ve survived that debacle.

Undoubtedly, New Hollywood had serial flaws: as studios increasingly trusted auteurs, movies became longer and slower, prone to sententiously lecturing the audience and following protagonists on somber internal journeys. Not surprisingly, these protagonists were played by performers who also wrote and directed their own stories. Many such films are rightly regarded as classics now, but in the moment, they became massively repetitive, and audiences wanted something more.

The problem is, after fifteen years of unchallenged box-office supremacy, these auteurs thought they owned the concept of “cinema.” Influenced by movies emerging from postwar Europe and Japan, where pacing and visual effects were limited by the hobbled economy, these auteurs thought “real art” happened in the moments of contemplation where mostly male heroes lived inside their own heads. Relationships, action, and women characters were subordinate to that introspection.

Please don’t misunderstand me: I enjoy several New Hollywood classics. Robert Altman’s MASH, William Friedkin’s The Exorcist, and Clint Eastwood’s The Outlaw Josey Wales are among the best movies ever made. But taken together, the movement’s trajectory becomes overwhelmingly uniform; these movies are enjoyable today because we can dip into earlier and later historical periods as necessary. Imagine how oppressive this uniformity must’ve felt in the moment.

George Lucas
Into this milieu came George Lucas. Though his first two features, THX1138 and American Graffiti, are firmly New Hollywood, both films, the latter especially, radiate stylistic nostalgia for the less self-conscious cinema that happened between the World Wars. Raised in an agrarian community with television as his lifeline, Lucas understood the world through more retro content broadcast on Saturday afternoons. This became the spine of his runaway breakout, Star Wars.

Here, not in comic books, is where the “invasion” Scorsese abhors began. Star Wars commenced the era of tentpole franchises. Its massive box office receipts also subsidized the technological innovations that made today’s intensely realistic screen graphics not only possible, but affordable. Perhaps most importantly, his characters acted rather than ruminating; even Yoda’s famously sententious homilies are made possible by the physical nature of his training regimen.

Long before Marvel, Hollywood discovered the lucrative nature of franchises like James Bond, Harry Potter, and Star Trek. Even films conceived as one-off enterprises, like Die Hard and Rocky, couldn’t withstand the demand for sequels to capitalize on existing momentum. The surfeit of sequels and series films, which critics have lamented throughout my lifetime, begins in 1977, when Hollywood realized Star Wars could bankroll their more esoteric projects.

Though it’s tempting to cite Scorsese’s age against his criticisms (he’s currently 76 years old), fuddy-duddiness doesn’t explain it, for one reason: his movies continue making money. Though his last wide-release film, the 2016 religious drama Silence, thudded on arrival, his recent CV includes such successes as The Wolf of Wall Street, Shutter Island, and The Departed. Clearly, unlike many of his contemporaries, Scorsese’s best work isn’t finished yet.

But when I heard his anticipated upcoming film, The Irishman, is three hours and nineteen minutes long, I cringed. No wonder he can’t handle Marvel films. Like Heaven’s Gate, which ran 3:39, The Irishman is directed for an audience that wants to sit for a really long time, an audience I’m not sure really exists anywhere. He’s continuing to write for New Hollywood. Which, sadly, is old news.


To Be Continued

Friday, September 6, 2019

#1 With a Bullet

Depending how you count, either Elvis Presley (below) or the Beatles had the most #1 hits

Earlier this summer, hip-hop artist Lil Nas X broke another record, when his country-rap hybrid “Old Town Road” netted eighteen weeks as number one on the Billboard Hot 100 chart. It couldn’t happen to a better guy, or a better song: though it isn’t something I’d personally seek out, it’s a genuinely good track, with fairly complex hooks and lyrics you could read like literature. Which is saying something in today’s aggressively bland Hot 100.

“Old Town Road” beats the previous record-holders for most weeks at number one, a tie at sixteen weeks between 2017’s “Despacito,” by Luis Fonsi and Daddy Yankee, and 1995’s “One Sweet Day,” by Mariah Carey and Boyz II Men. “Despacito” is, again, a pretty good song, which I don’t mind when it plays on a shared radio. “One Sweet Day,” however, is so bland and forgettable, I had to Google it to write this essay.

That represents how meaningless the pop charts have become for understanding our culture. More songs stay at number one longer. At this writing, 38 songs have stayed at number one for ten or more weeks, a number likely to change. Of those 38, 24 have been since January 1, 2000, which we can roughly designate the beginning of the download era. If we expand our horizon to January 1, 1990, the number jumps to 36.

This happened, paradoxically (not really), as more artists have more opportunities for wider audience reach. Inexpensive on-demand CD manufacturing in the 1990s, and almost-free digital distribution in the 2000s, have turned more struggling garage artists into professional recording artists than ever before. But during that same time, radio charts have become less likely to roll over. It’s almost like the major labels and the radio industry have a handshake deal to protect major-label prerogative. Almost.

As more artists and studios become capable of producing more music at less cost, the peak of commercial musical accomplishment has incongruously become less diverse. Before 1990, only two songs perched at #1 for ten weeks: Debby Boone’s “You Light Up My Life,” in 1977, and Olivia Newton-John’s “Physical,” in 1981. Let’s be honest, these aren’t good songs. Neither are most others on this list. The longest stints at #1 belong to the blandest songs.

Whenever I talk about music, I inevitably come back to Charles Duhigg. Late in his book The Power of Habit, Duhigg talks about how the music industry manipulates listeners’ fondness for tracks which resemble music they already know to create new hits. Read that again: the biggest hits are those which resemble something we already like. We embrace songs which sound familiar, not those which take artistic risks or break new ground. And the business feeds us that repeatedly.

Despite my comments about “Despacito” and “Old Town Road,” the songs which remained atop the Billboard charts longest have preponderantly been the blandest songs ever recorded. Los Del Rio’s “Macarena,” Mariah Carey’s “We Belong Together,” and The Chainsmokers’ “Closer” are songs so pugnaciously banal, one wonders whether they aren’t self-referential performance art. You may broadly remember these songs, especially if they were hits while you were in high school, but you probably don’t like them.

Even the genuine hitmakers on this list aren’t represented by their best songs. Elton John’s “Candle In the Wind 1997,” Santana’s “Smooth,” and Whitney Houston’s “I Will Always Love You” are crinkum-crankum radio fare churned out late in the artists’ careers. Though sometimes played on radio for nostalgia, these songs, the biggest hits of their artists’ respective discographies, aren’t very good, especially played against “Goodbye Yellow Brick Road,” “Evil Ways,” or “How Will I Know.”

It bears repeating: the artists we consider “classic” didn’t have this kind of chart authority. The Beatles’ longest stay at #1 was nine weeks, for “Hey Jude.” The Rolling Stones’ “Honky Tonk Women” lasted four weeks at #1; the Supremes’ “Love Child,” only two weeks. These are the biggest hits of pop’s greatest artists. Ernie K-Doe, Paper Lace, and Milli Vanilli all have #1 hits. Bob Dylan, Jimi Hendrix, and Creedence Clearwater Revival do not.

At a time when American public life is known for strife, controversy, and infighting, “Old Town Road,” though good, is also altogether uncontroversial, a commercial bid for valuable airtime that holds audiences by not challenging them. Which is a pretty good description of the most widely heard Top-40 hits altogether. We might argue that having America’s #1 hit mattered back when Elvis and the Beatles dominated the charts. But those days, sadly, are long gone.

Friday, November 3, 2017

How To See the Truth in a World Gone Blind

Isaac Lidsky, Eyes Wide Open: Overcoming Obstacles and Recognizing Opportunities in a World That Can't See Clearly

Isaac Lidsky first hit the national stage as a child actor. One of the inaugural cast from Saved By the Bell: the New Class, he had high expectations… which were largely dashed when NBC discovered they couldn’t cast new actors in old roles. He struggled to find his feet, and had nearly done so, when life dealt him a second blow: a diagnosis of retinitis pigmentosa. He was doomed to spend his life going blind.

Lidsky’s first book is cross-marketed in health, business, and self-help sections, none of which encompasses the author’s level of ambition. Lidsky mixes scholarship, autobiography, and philosophy in a book that deals deeply with what it means to see the world. Our eyes only provide raw data; if we want to see, we see with our minds. Unfortunately, at times, Lidsky also proves the adage that there’s none so blind as one who will not see.

Network interference, and his already failing eyesight, derailed Lidsky’s acting career at an age when other boys still wonder what they want to be when they grow up. But his parents leveraged his television prominence to start a charity advancing retinitis pigmentosa research (there’s still no treatment or cure). At the same time, Lidsky graduated Harvard Law School, clerked for a Supreme Court Justice, and launched a Manhattanite career at an age younger than when I even started college.

At this stage, and throughout this book, Lidsky mixes memoir, and lessons learned from both success and failure, with hard scholarship. His years as a law clerk trained him well in methods of research and writing: this book reads like a more seasoned author’s product, without the digressions and cow paths most business professionals’ first books face. Lidsky lets facts drive his argument, and when he interjects personal philosophy, he knows the purpose it serves.

Except… this taut writing lets Lidsky direct our attention, so it’s tough to notice what he leaves out. Just one example: In one chapter, Lidsky’s TV career is puttering out. His character proves less popular than Screech, and with network hopes pinned on this tentpole franchise, they need numbers. Lidsky finds his Hollywood dream turning into a real disappointment. Then—shazam, he’s nineteen, attending Harvard Law, determined to live on his own. What happened between?

Isaac Lidsky
This diversion happens so fast, I just assumed he was telling his story non-linearly for improved effect. So I forgot it. Only when I consulted my notes did I realize he just basically dropped the thread. Maybe he left it because that story basically fizzled, and there’s nothing left to describe. Or maybe he’s whitewashing something truly horrific. I have no idea, because he doesn’t tell us. It’s tough to evaluate what never gets said.

Please don’t mistake me. Lidsky uses sight and blindness as remarkable metaphors for personal and professional triumph. He tells stories, for instance, about trying to negotiate Cambridge streets with failing eyes, and later, D.C. streets completely blind. This provides insights into screening meaningless data from real content that provides him literal and figurative direction. His transition from normal, if famous, teenager, to ambitious, self-directed adult, provides lessons many adults I know could stand to learn.

Yet Lidsky apparently thinks, as many self-help memoirists do, that his successes are portable; to achieve Lidsky-like success, simply employ Lidsky’s checklist of lessons. He apparently overlooks ways he started from a position of advantage. Consider his acting career, which commenced when his parents drove him to auditions for local TV commercials. Later, they flew him cross-country, on their own nickel, to audition for NBC. He was coached to consider his own success basically inevitable.

Years later, burned out on lawyering at an age when peers were paying down student loans, Lidsky took a career aptitude survey and determined his correct career was, ahem, CEO. Nice work if you can get it… which he did. Because a well-heeled friend purchased him a foundering Florida tile company. This doesn’t discredit Lidsky’s accomplishments: he turned a building subcontractor around during the housing downturn. But maybe I could, too, if somebody fronted me the money.

I found plenty to like in this book. Lidsky’s principles of controlling what we see by yoking our thoughts have significant merit. But they apply to him too: it really feels like he doesn’t grasp how others ensured he didn’t start from zero. Like other self-help memoirs I’ve reviewed, this is a case of “take what you need and leave the rest.” Because you aren’t Isaac Lidsky, sadly, and Isaac Lidsky isn’t you.

Monday, October 2, 2017

SuperSuit: a Business History of a Non-Linear Business

Reed Tucker, Slugfest: Inside the Epic 50-Year Battle Between Marvel and DC

At a party recently, two fellas got into a heated tangle over Marvel vs. DC. Marvel, one insisted, has grown too snooty living atop the comics sales heap for decades. The other insisted DC was stuck in World War II and hadn’t had a good idea since Eisenhower without pirating it from Marvel. As somebody with no corner to back, I found the conflict confusing. But watching just two guys argue kept my focus narrow.

Freelance journalist and sometime radio sidekick Reed Tucker takes a wider view. Spanning the period from Marvel’s launch to the present, he describes the parallel development of two industry titans who latch onto the wonder inside readers, and speak to beliefs in justice. Launched in 1961, by 1972 Marvel dominated the market, and has ever since. Tucker gets the business right, but something feels missing from his analysis.

After a very brief introduction to DC’s history, Tucker dives into Marvel’s launch and its industry impacts. Marvel started so shoestring that it relied upon DC to distribute its titles. But heroes like the Fantastic Four, who fought among themselves, or Spider-Man, who often couldn’t pay his bills, touched a nerve for teenage readers. DC assumed audiences stopped reading comics around age 12; Marvel caught older kids longing for something meatier.

Marvel’s heroes had complex inner lives that touched Baby Boom readers, while DC’s heroes remained patriotic pin-up characters from a prior generation. Marvel encouraged pathbreaking artists like Jack Kirby and Steve Ditko, while DC maintained a house style so generic, literally anyone could draw any hero. Marvel took risks during an era when risk-taking paid handsomely, while DC conservatively clung to a portfolio worth more in licensing than publication.

Thereafter, Marvel led while DC followed. DC’s Carmine Infantino plundered Jack Kirby, Frank Miller, and other Marvel talent, but shackled them, and their talents sputtered. Marvel pioneered event crossovers, in-universe continuity, and other now-vital aspects of graphic storytelling. DC copied. Even when DC pioneered one domain, live-action cinema, they failed to parlay that into marketing success.

Tucker takes the relatively unusual tack of focusing on business and production, spending little time on stories and art. He acknowledges that early Marvel comics had a nuanced depth of characterization that DC, stuck in post-WWII kiddie schlock, didn’t match. But he doesn’t explicate why, as DC matured and Marvel became a factory, Marvel kept outselling. Especially since around 1986, DC’s stories have competed with Marvel’s for psychological complexity.

This is especially perplexing considering how many personalities, like Jack Kirby, Jim Shooter, and Frank Miller, crossed between publishers. DC literally had the ingredients for Marvel-style revolution, but couldn’t translate them into more-than-mediocre sales. Tucker limply says that DC’s in-house management style couldn’t unleash such talent. But that sounds unconvincing when talent moved between the houses throughout the 1980s. Something deeper is at work, and Tucker keeps focus elsewhere.

Tucker offers mere glimpses into even large story developments, like Secret Wars or the Death of Superman, mostly superficial descriptions which anyone who read the actual comics already knows. If Marvel really succeeds from psychological depth and complexity, why not pause on important points? Almost as weird as what Tucker includes is what he omits. Influential writers like Alan Moore, and non-Madison Avenue publishers like Malibu Comics and Dark Horse, get only cursory mentions.

On a personal level, the period Tucker identifies as the high-water mark for printed comic sales, the early to middle 1990s, is actually the period I stopped following comics. Stories became too intricate, universes too massive, and keeping abreast became a full-time job—one I didn’t want because, with young adulthood upon me, I had a literal full-time job. The qualities that drove record sales drove me away.

That being the case, I’d have preferred more attention to stories and art. The business is fascinating, particularly to fans, but sales figures and market dominance follow audience interest, not lead it. Myself, the comics I’ve most enjoyed recently have come from DC, but tellingly, have generally been non-canon graphic novels like Grant Morrison’s Arkham Asylum. Stories that don’t require decades-long immersion in character backstories and universes.

Speaking of Grant Morrison, a book already exists which addresses the psychology Tucker mostly overlooks. Morrison’s Supergods mixes Jungian analysis with Morrison’s own autobiography of comics experience to plumb how each generation’s new superheroes address their time’s unique needs. Maybe fans should read Morrison and Tucker together. By itself, Tucker’s MBA analytics are interesting but anemic, lacking clear insight into what drives readers and their loyalties.

Monday, September 11, 2017

Whose Career Is It Anyway?

Bob Kulhan with Chuck Crisafulli, Getting To “Yes And”: the Art of Business Improv

Back in the late-1990s through late-2000s, when improvisational comedy ruled America’s nightclubs and Whose Line secured constant ratings, certain big-city improv troupes invented an idea for increased income. They rented themselves out to corporations for team-building workshops and executive activities. These events possibly encouraged group unity and mutual trust, maybe. But improv performer and management consultant Bob Kulhan questions whether they actually improved bottom-line corporate outcomes.

Kulhan, a Second City graduate, still moonlights in improv, while running his consultancy and adjuncting at Duke University’s business school, a genuine triple threat. He brings his interdisciplinary approach to asking: does improv actually teach anything useful for business? Yes, Kulhan says, but only with modifications that full-time actors probably don’t realize they need. Arguably, though, Kulhan doesn’t realize he’s resurrecting improv’s original purpose.

Improv instructors have an activity called “Yes And.” Two (or more) performers construct a scene by agreeing with one another. One posits some statement—“Well, here we are in Egypt”—and the other agrees, while adding something further—“Yes, and destined to discover King Hatsupbashet’s lost tomb!” Ideally, the performers hear one another clearly enough to build something profound, without contradicting or opposing one another.

This, Kulhan insists, represents how business professionals ought to communicate. Rather than battling for terrain or engaging in one-upmanship, the twin banes of loners seeking individual reward, business people should collaborate, listening intently in the moment without preplanning rejoinders or seeking ways to torpedo colleagues. MBA teachers will say this freely, of course, but actual professionals, desperate to make themselves immune to automation, often squabble for insignificant territory.

Good improv teaches students to listen closely, without preplanning, but with gazes turned toward whatever will produce a unified scene. Self-seeking behavior and stardom undermine the product; improvisors learn to succeed by lifting the whole company, sometimes at individual expense. Likewise, successful business professionals can improve their outcomes by centering their efforts on the project, team, or company, even when that means sacrificing their glamorous personal promotions.

Bob Kulhan
Kulhan delves into particular ramifications, like idea generation, team-building in time-sensitive environments, and generating enthusiasm even when individuals are fatigued. He doesn’t waste busy professionals’ time with stage games like Freeze Tag or Word Ball, which hone performance skills but have questionable offstage outcomes. Instead, he side-coaches readers on productive conversations where they strive to advance others’ ideas and build team momentum, without seeking the next response or personal reward.

Having done improv in college, and having seen the disastrous outcomes of self-seeking teammates in working life, I applaud Kulhan’s enthusiasm. I’d love the opportunity to employ the principles he describes in my workplace, and perhaps someday, if circumstances break my way, I will. That said, I wonder if he realizes he isn’t actually adding anything new to the discussion. Though the original purpose has gotten lost, the ideas Kulhan describes are why modern improv was first invented.

Viola Spolin used her WPA grant to create numerous improv games, some original to her, others reclaimed from Italian commedia dell'arte tradition. She taught these games in Chicago-area schools and community centers, believing that poor children didn’t learn at home the critical listening skills common to children of the wealthy and upwardly mobile. Her son, Paul Sills, carried these games into theatre, when he co-founded Second City in 1959.

Despite his Second City roots, Kulhan never mentions Spolin in the text or index. She gets one fleeting citation in the endnotes, so transitory that I suspect he doesn’t realize how close he’s stumbled to gold. Rather than creating something new, he’s recaptured the reason Spolin invented improvisation, a reason lost behind a richly decorated history of unscripted theatre. This gives Kulhan’s message a certain poignancy, one which I suspect he doesn’t even realize he’s uncovered.

Honestly, I did improv in college, even staging a successful team performance, without ever discovering this history. I didn’t know Viola Spolin had non-theatrical ends in mind until after graduate school, when I stumbled upon the information accidentally. I presume Kulhan similarly never knew improv’s history as professional skills development, or he’d cite more sources from Spolin and her peers. Like me, Kulhan probably doesn’t know the full lost history.

So, though Kulhan doesn’t say anything necessarily new, he says something much-needed. In a business milieu long clouded by individualists seeking their rewards while fearing the eternal spectre of automation, improv skills offer the uniquely human opportunity of innovation through team unity. Viola Spolin knew this around 1940, but the information got lost. Bob Kulhan brings it back.

Wednesday, August 2, 2017

The Trouble With “Isms”

John Oliver, the grumpy uncle
of pay-cable comedy
John Oliver, a man who’s described his accent as being like “a chimney sweep being put through a wood chipper,” has made Net Neutrality a personal pet issue. After pushing the issue successfully during the Obama administration, he returned to it after President Trump appointed Ajit Pai chair of the FCC. For a polemicist famous for his wide-ranging interests, returning to any topic is significant for Oliver.

For those unfamiliar with Net Neutrality, the concept is simple. Under current regulations, internet service providers have to make all web content equally available. Whether dialing up, say, an obscure blog by a struggling provincial writer (let’s just say), or the web’s most successful commerce sites, speed and accessibility shouldn’t vary. Service providers shouldn’t charge site owners for faster or more reliable consumer access.

Seems simple enough. I thought I supported the principle undividedly. Then a friend, an outspoken Libertarian who believes everything would improve if all top-level rules vanished tomorrow, claimed that Net Neutrality rules were a form of government micromanagement interfering with a system that worked just fine without bureaucratic interference. Let whomever charge whatever to whoever they want! It’s not government’s place to get involved.

To be clear, I don’t buy the anti-neutrality argument. If service providers could charge content providers for superior access, massive corporate systems like Facebook and Google, which between them control half the online advertising revenue for the entire earth, or mega-commerce sites like Amazon, could afford extortionate rates, while struggling artists and shoestring entrepreneurs would get shunted onto second-string connections and forgotten. That includes my friend, a strictly regional business owner.

Jeff Bezos, whose Amazon controls
half the Internet commerce in the world
(Full disclosure: this blog appears on a platform owned by Google. And my book reviews link to Amazon. I couldn’t afford this service if I had to pay out-of-pocket.)

But thinking about it, I realized: I’m protecting the very big against the very big. And so is my friend. As stated, half of all online ad revenue travels through two corporations and their subsidiaries. Google owns YouTube, Zagat, Picasa, and the Android operating system; Facebook owns Instagram, Oculus, and WhatsApp. Just for starters. I’m protecting the already massive from paying to get their product onto my computer screen.

Some Google subsidiaries, like Boston Dynamics, actually produce marketable product. But Google mostly sells ads on their search engine—ads that frustratingly often lead customers to something they already wanted to find. They’re already charging small operators, like those artists and entrepreneurs I mentioned, for the access needed to get seen by people like me. Same for Facebook: it’s primarily an ad vendor. And if you don’t buy these two companies’ ads, you probably won’t get seen.

So the Net isn’t Neutral right now.

Nearly a century ago, G.K. Chesterton wrote that, for most citizens, the difference between communism and capitalism is vanishingly small. The only choice is whether we prefer to be ruled by government bureaucrats, or corporate bureaucrats. That’s clearly happening here. On careful consideration, Net Neutrality is essentially protecting the very large corporations, who are imposing their own rules on smaller companies, rules that are proprietary and therefore both invisible and arcane.

So we’re faced with the choice between protectionism, which will ensure Google, Facebook, and to a lesser degree Amazon (which controls half of all Internet commerce) can charge small operators like me to get seen by anybody whatsoever; or Libertarianism, which… um… will make these companies pay Comcast, Time Warner, and Charter Spectrum to continue doing the same thing. On balance, neither choice really protects small operators like me.

G.K. Chesterton thinks your neutrality
rules are sweet and naive
I’ve chosen, at present, not to pay either Facebook or Google for increased visibility on this blog. This means I mainly get seen by people clicking through on my personal Facebook and Twitter accounts, and my daily hits seldom exceed 100 to 150. (Oh, who owns Twitter? It’s also a corporate umbrella; it just hasn’t grown nearly as fast.) I’ve chosen not to kiss corporate ass, and the price I’ve paid is vanishingly small readership. Sad trombone.

Both “isms” depend on the premise that, if we create the appropriately utopian regulatory system (too big? Too small? Any at all?), information will flow freely. Except we need only open our eyes to realize information isn’t flowing freely. The commercially accessible Internet, as it currently exists, is essentially a joint-stock partnership between Sergey Brin and Mark Zuckerberg. There’s no sweet spot of appropriate regulation on the Internet. Because there’s no freedom for information to move on a platform that isn’t already free.

Tuesday, August 1, 2017

Business, Ethics, and the Risk of (Self-)Righteousness

Scott Sonenshein, Stretch: Unlock the Power of Less—and Achieve More Than You Even Imagined

Professor Scott Sonenshein divides the business world into two categories: chasers, who pursue more and better resources to achieve their objectives, and stretchers, who make what they already have perform double duty and provide maximum return. Sonenshein, who teaches management at Rice University, uses language from business and behavioral economics to convey his message. I was shocked, however, to notice how he made a fundamentally moral point.

A popular mindset persists, Sonenshein writes, particularly among business professionals born into the wealthy class, or among people with very narrow, specialized educations. If we had more money, this mindset asserts, or better tools, or more people, or something, we’d crack the success code and become unbelievably successful. If only we acquire something new, we’ll overcome whatever impediment stops us achieving the success waiting below our superficial setbacks.

By contrast, successful businesses like Yuengling beer, Fastenal hardware, and others, practice thrift in resource management, utilizing existing resources in innovative ways, maximizing worker control over business decisions, eschewing frippery, and making the most of everything they own. Sonenshein calls this “frugality,” a word he admits has mixed connotations. But he’s clearly demonstrating familiarity with once-common ethical standards, what economists still call the Protestant work ethic.

Sonenshein doesn’t once cite religion or morality, either implied or explicit. However, when he breaks successful businesses down into bullet point lists of best practices, like “psychological ownership,” “embracing constraints,” “penny pinching,” and “treasure hunting” (to cite the takeaways from just chapter three), the ethical correspondences become rather transparent. Take responsibility for your choices, little Timmy! Work with what you have! Save now for bigger rewards later! Et cetera.

From the beginning, Sonenshein structures this book much like religious sermons. His points are self-contained, backed with pithy illustrations showing real-world applications. He asserts his point, cites his text, backs it with anecdotes, then reasserts his point. The structure appears altogether familiar to anybody versed in homiletics. It persists in religion, and translates into business books like this one, because it holds distractible audiences’ attention long enough to clinch brief points.

Scott Sonenshein
But again, Sonenshein never cites religion. He frequently quotes research from psychology and behavioral economics to demonstrate how scrutiny supports his principles. But if he’s proffering a business gospel, it’s a purely secularized one. Though Sonenshein comes from the same mold as religious capitalists like Norman Vincent Peale, Zig Ziglar, and Og Mandino, he never relies upon revealed religion. Earthly evidence, not God’s law, demonstrates this gospel’s moral truth.

Oops, did I mention Norman Vincent Peale? See, there’s where doubts creep in. I had mostly positive reactions to Sonenshein’s points until I remembered Peale. There’s a direct line between Peale’s forcibly optimistic theology, and Joel Osteen’s self-serving moralism. We could achieve earthly success by aligning our vision with God’s… but often, already successful capitalists have recourse to God to justify their own goodness. I’m rich because I deserve it!

This often leads to retrospective reasoning—what Duncan J. Watts calls “creeping determinism.” In finding already successful operations, then applying his learning heuristic to them, Sonenshein risks missing invisible factors steering his anecdotes. I cannot help recalling Jim Collins, who praised Fannie Mae and Circuit City scant years before they collapsed. In reading Sonenshein’s anecdotes, like hearing Christian sermons, it’s necessary to listen for the sermonizer’s unstated presumptions.

Please don’t mistake me. I generally support Sonenshein’s principles. I’ve reviewed previous business books and found them cheerfully self-abnegating, urging middle managers to sublimate themselves to bureaucratic hierarchies and treat themselves basely. Sonenshein encourages workers to stand upright, own their jobs, and always seek improvement… and he encourages employers to treat workers like free, autonomous partners. Though Sonenshein never embraces an “ism,” he corresponds with my personal Distributism.

I wanted to like Sonenshein’s principles because I generally support his ethics. His belief in thrift, in embracing a mindset of ethical management, and of getting outside oneself, is one I generally applaud, and strive to apply to myself (not always successfully). Though I don’t desire to control a multinational corporation, I wouldn’t mind leveraging my skills into local business success and financial independence. And I’d rather do it ethically.

But like any ethicist, Sonenshein requires readers to police themselves carefully. They must apply these principles moving forward, not looking backward. Financial success often inspires implicit self-righteousness, which business ethics can inadvertently foster. I’ll keep and reread Sonenshein, because I believe he’s well-founded. But I’ll read him with caution, because his framework conceals minefields I’m not sure even he realizes are there.

Monday, July 24, 2017

The Doctor Is Still In

Peter Capaldi as the Twelfth (current) Doctor
I have seen more people shaming the Doctor Who haters than I’ve actually seen Doctor Who haters. With the recent announcement of Jodie Whittaker’s nod to play the thirteenth (or fourteenth, or fifteenth) version of the title character in the BBC’s Doctor Who, social media has been alight with people mocking and belittling those tender souls who cannot stomach a woman in the role. Actual tender souls, however, have been rare and hard to find.

Perhaps that’s because I haven’t been looking for them. The Internet allows such fungal undergrowth to flourish beneath various rocks on Reddit and 4Chan, but unless one of their number gets elected President, few actually venture into the sunlight, knowing the general cultural trend has moved away from them. Their sexist, homophobic, bigoted attitudes belong to a long-gone era, and they know it. They have the common decency to stay away from us normal people.

Yet rather than let such attitudes fester and die quietly, members of the progressive call-out culture feel obliged to share, retweet, and otherwise publicly announce the existence of such attitudes. They qualify the shares with snarky comments or condemnations of regressive attitudes, which perhaps lends such shares the vestments of sanctimony. But as working journalists discovered last year in amplifying Donald Trump’s message, vocally refuting regressive attitudes doesn’t diminish the value of the free media attention.

Every time some white, male yob claims that a female Doctor, female Ghostbusters, or female Jedi “ruined my childhood,” they get free publicity from those who purportedly oppose that position. I’d think a childhood so brittle that you’re still hanging “No Girls Allowed” signs on your door would be ruined without outside intervention. But by giving these self-proclaimed martyrs sunlight, my fellow progressives allow these attitudes to flourish and propagate past their sell-by date.

Paul McGann as the Eighth Doctor
Weirdly enough, while rewarding such conservative crybabies, progressives nourish their own crybaby culture simultaneously. I’ve witnessed liberals whining that the female Doctor doesn’t count because she’s too late, or because she’s white. In other words, progress doesn’t really matter unless it happens on my timetable. Besides being just plain arrogant, this attitude overlooks the commercial demands of media. Even a publicly owned resource, like the British Broadcasting Corporation, is still a corporation, beneath the surface.

Richard Walter, professor and chair of screenwriting at UCLA, writes that “American television seems to love last year’s—or last decade’s—controversy.” In other words, media won’t embrace a position until it’s safe enough to handle without getting burned. Hollywood refused to touch the anti-racist message of Broadway plays like Finian’s Rainbow until history had made the message essentially harmless. The BBC is British, yes, but it suffers under the same constraints as American television.

So yes, a character created in 1963 will carry the whiff of the white, patriarchal influences under which he (she?) was created. Though original showrunner Verity Lambert was a twenty-something woman, the character she oversaw was old, male, and white, reflecting the British power structure in her era. The Doctor didn’t get a black traveling companion until Mickey Smith in 2006, or an openly gay one until Bill Potts in 2017. Hispanic? Sorry, not yet.

But the show planted seeds for the Doctor’s more inclusive regenerations years ago. We saw our first on-screen Black Time Lord in 2008, two years after Black actor Paterson Joseph was briefly considered to play the Doctor. We saw a male Time Lord regenerate into a woman in 2016, demonstrating that such transitions were possible. (Don’t bring me Steven Moffat’s “The Curse of Fatal Death,” where Joanna Lumley plays the Doctor. Gag episodes don’t count.)

Tom Baker as the Fourth Doctor
So. Progressives who’ve already seen that the Doctor will become more diverse in the future complain that such diversity isn’t happening right now. Simultaneously, crusty back-numbers who want Doctor Who to remain a museum piece of their supposedly bucolic, girl-free childhoods complain that diversity happens at all. The BBC must satisfy the largest number of viewers, so change has to happen, but it cannot alienate large audiences, so change happens slowly. That’s what corporations do.

Which is why I cannot take either side seriously. Calls to make change happen instantaneously, even when such change is legitimate and overdue, are as unreasonable as demands to forbid change and keep things static forever. The Doctor will become a non-white woman someday soon. Possibly even a gay woman. (I favored Sue Perkins for the role, after all.) Conservatives cannot stop it, but progressives cannot rush it. Corporations are slow, but change is inevitable.

Thursday, May 18, 2017

Why I Still Don't Want Genetically Modified Food

Much modern farming less resembles gardening than strip-mining

A friend recently shared another of those articles “proving”—to the extent that science can prove anything—that genetically modified foods are perfectly safe. Perhaps they are; I don’t know. However, the article included multiple references to “conventional” agriculture, insisting that GMO foods are perfectly equivalent to foods produced through selective breeding, which we’ve enjoyed for years. And here I definitely do know something: conventional agriculture, as currently practiced, is deeply dangerous.

That seems controversial to say. Americans today enjoy the cheapest food in world history, quite literally: on your typical grocery run, you probably pay more for the packaging than for the food inside it. Massive technological investments constantly refine agriculture, improving yield and ensuring a continued, affordable supply for whoever can afford it. Selective breeding has produced more fruits, vegetables, meat, dairy, and grain than ever before. Am I calling this improvement dangerous?

That’s exactly what I’m saying, and I’ll offer examples. According to a recent Atlantic article, a single bull who lived in the 1960s produced so many offspring that fourteen percent of all Holstein cattle DNA descends from this one specimen. Anyone who lives in cattle country knows prize cattle semen fetches premium prices at auction. This bull’s DNA quadruples per-cow milk production, but it also increases the likelihood of spontaneous abortion in utero. Hardly an unqualified success.

Equally important, though, and something the article scarcely touches on: fourteen percent of Holstein DNA is now genetically homogeneous. This resembles the degree of crop homogeneity that preceded the Irish Potato Famine. The rise of genetically similar cultivars, some GMO, some developed through conventional selective breeding, has produced remarkable vulnerability to crop blight, resisted only through petroleum-based chemical pesticides and intrusive technological interventions.

Pigs don't live in pens anymore; this is where your pork comes from

One episode of the Showtime TV adaptation of Ira Glass’s This American Life features a visit to a contemporary Iowa hog farming operation. The selectively bred hogs raised there produce more piglets per birthing, and therefore more meat overall, a seemingly desirable outcome. But the resulting pigs so completely lack native immune systems that they cannot survive outdoors. They’re raised, at massive expense, in clean-room environments more restrictive than those used in silicon microchip manufacture.

So we have livestock so genetically homogeneous that they’re vulnerable to disease, so tender of constitution that they cannot handle the outdoors, and so expensive to raise that any output gains are offset by the extraordinary measures necessary to keep them alive. So agriculturalists are backing off these approaches, as reasonable people anywhere would, right? Of course not. A combination of government incentives and corporate marketing encourages ever-increasing output, even during times of unrestrained surplus.

Recombinant bovine growth hormone (rBGH), marketed heavily by Monsanto and Eli Lilly, promises to increase milk output. This despite known health effects, including distended udders and pus in the milk, and suspected side effects—rBGH is a possible, but frustratingly unconfirmable, human carcinogen. And this also despite the fact that the U.S. government has purchased excess American dairy stocks and dumped them on the ground to keep prices from going into freefall. It has done this since the 1930s.

I use livestock as examples because images of suffering living creatures tug our heartstrings. But this pattern obtains across all farming: fear of shortfall justifies constant excess. According to agriculture journalist George Pyle, America grows twenty times as much corn as Americans could possibly eat. So most of the oversupply gets fed to cattle, making meat insanely cheap. But cattle cannot digest corn starches, which turns their shit acidic, a perfect environment for toxic E. coli strains.

That’s saying nothing of the economic impact. When NAFTA became law in the 1990s, some Americans worried that manufacturing jobs would emigrate to Mexico, which did happen, somewhat. But when subsidized American agriculture hit Mexican markets at prices below the cost of growing it, rural poverty, especially in the agrarian south, hit record levels. Mexico’s poor sought work where work existed: in the U.S. And Americans elected a demagogue promising to build a wall to keep those impoverished workers out.

Old MacDonald had an assembly line, E-I-E-I-O
Corporations sell GMO seedstock by promising increased yields. Conventional farming already produces enough food to feed 150% of the current world population, driven mainly by petroleum-burning equipment and by fertilizers and pesticides derived from petroleum. (The Rodale Institute estimates that farms currently produce more greenhouse gases than cars.) When food is already so oversupplied that it’s cheaper than the packages it’s sold in, increasing yields makes no sense.

Yet, as George Pyle notes, American farm policy has assumed that an imminent food shortfall justifies continual increases in production, ever since America devised its first farm policy during the Lincoln Administration. One friend justifies continuing this approach because, he believes, near-future environmental collapse will require genetically modified foods to save the human race. Two problems: we cannot predict environmental outcomes any better than we could predict post-nuclear-war conditions. And, Pyle writes, heirloom varietals are more adaptable anyway.

Starvation exists today, and chronic hunger exists close to home. But increasing supplies, whether through conventional or GMO means, makes little difference. People lack access to food, which usually means money. MLK noted, back in the 1950s, that fresh vegetables cost twice as much in poor neighborhoods as in rich neighborhoods. High-yield GMO seeds, often pitched to cure global famine, are expensive. People too poor to buy and plant heirloom varieties cannot trade up.

So basically, the demonstrable safety of individual GMO varietals doesn’t much matter. (Rampton and Stauber question that science anyway.) If they’re similar to selective breeding, well, breeding hasn’t been benign either. And they’re customized for an economic demand that doesn’t actually exist outside corporate PR. The drumbeat of safety, quantity, and productivity has made these demands common coin, but that’s just missing the point. Agriculture is hurting itself just fine right now, without gene technology’s help.

Friday, April 21, 2017

The Wisdom of Crowds, and the Money to Do It

Michael J. Epstein, Crowdfunding Basics in 30 Minutes

The rise of crowdfunding websites has traveled hand-in-glove with the spread of social media. Savvy media customers use their web presence to solicit support for their entrepreneurial ventures, artistic experiments, medical bills, and more. A young couple I know is crowdfunding their fertility treatments. But not every crowdfunding venture succeeds. What makes some triumph, while others sputter on the launch pad?

Los Angeles-based renaissance man Michael J. Epstein has used crowdfunding to support himself as an independent filmmaker and indie musician. His familiarity with crowdfunding reflects a mix of academic research and practical experience. As a writer, he strikes a careful balance between raconteur and scholar that most working journalists should aspire to emulate. And he explains the principles of crowdfunding in ways naifs and part-timers, like me, can really understand.

Novice crowdfunders may mistake the process for the online equivalent of passing the hat. An earnest appeal, backed with some concrete example of your plans, should get at least a few people to crack their wallets, right? Not so, says Epstein. This book, longer than a pamphlet but shorter than, say, a Malcolm Gladwell treatise, delves into crowdfunding without bogging down in details. Because not everything about crowdfunding is obvious.

First, not all crowdfunding platforms are equal. Epstein doesn’t attempt a thorough list of crowdfunding websites, since in today’s economy individual sites come and go; he name-checks a few popular sites, but only as examples. Kickstarter, targeted at for-profit entrepreneurs, has an all-or-nothing funding model that encourages a certain urgency. GoFundMe aims to buoy struggling individuals, while Patreon subsidizes artists and other creative professionals.

But even beyond finding the right platform, Epstein says, certain habits of businesslike thinking apply across multiple models. In a media-saturated digital marketplace, simply having an earnest, factual appeal isn’t enough. Serious operators need a professional logo, a well-made video, a concise but informative text statement, and at least a few good audio or video clips. And that’s just the minimum. This means having a good professional network; guerrilla operators get overwhelmed quickly.

Michael J. Epstein
Finally, Epstein repeatedly returns to the idea that crowdfunders aren’t merely making a dispassionate business pitch; they’re building relationships. Which makes sense, on consideration. I favor my local grocery for convenience, selection, and value, but also because I know and trust the workers. How much more does that apply online, where we’re bombarded by appeals daily, unmoored from the urgency of needing fresh produce close to home?

Epstein’s pitch is detailed enough to inform readers, but brief enough to prevent discouragement. He makes generous use of screen captures, infographics, charts, and other aids designed for visual thinkers. Essentially, this book is laid out like a webpage, appropriately enough, since it’s designed for web semi-professionals accustomed to the Internet’s multimedia format. This makes for smooth reading for multiple audiences, without dense, discouraging blocks of text.

This encourages me to say something I don’t believe I’ve ever written in a review: maybe you’re better off getting the Kindle version. Since we read books like this for information rather than pleasure, and since you probably need the data sooner rather than later if you’re drafting a crowdfunding campaign, and since it comes conveniently pre-formatted for online reading, and hey, since it’s four bucks cheaper, having the physical book probably doesn’t help much.

Having the information contained herein, however, helps a great deal. Like many self-starting entrepreneurs, you’re probably throwing yourself at your project with more brute force than professional polish. Having a mentor like Epstein to guide you away from the most common pitfalls can save you long-term heartache, and bring more money into your project. Epstein can’t solve all your problems, but he can help you avoid the worst ones.

It may, sometimes, be necessary to separate Epstein’s content from his person. A director of small-budget vampire films, he cultivates a quirky, slightly dangerous image, a sort of off-Sunset John Waters. Many photos, including one inside this book, emphasize his wide, staring eyes and uncultivated beard. Don’t be fooled by his appearance, though. Epstein writes with a cool hand, a mind for thorough detail, and an eye toward diverse audiences.

The title notwithstanding, don’t expect to really understand crowdfunding in thirty minutes. At 73 pages plus back matter, this isn’t lunch-break reading. And that’s before the necessary time spent planning and practicing the principles Epstein lays out here. This book requires readers to think and plan conscientiously. But if it gets us thinking like business professionals, planning with a long horizon, we’re already a step ahead, right?

Monday, April 3, 2017

Why We Need Liberal Arts in the Business World

Christian Madsbjerg, Sensemaking: The Power of the Humanities in the Age of the Algorithm

It’s become dogmatic in certain circles to insist that only STEM subjects matter; disciplines like English, Sociology, and Music have become passé. Danish-American strategy consultant Christian Madsbjerg disagrees. Traditional humanities disciplines are not backward or vestigial; in his experience, business professionals and forward-planning capitalists need these fields to function. We’re not all plugged into an algorithm, Madsbjerg writes. Humane arts are necessary in a wealthy society.

We’ve all grown bored with the repetitive claims. America needs more welders and fewer philosophers, Marco Rubio said. Madsbjerg quotes Jeb Bush that psychology majors are headed for jobs at Chick-fil-A. Today’s data-driven world supposedly gives us all the important answers, and we can accurately predict outcomes if we simply have sufficient information. Businesses run on data, and we need more number crunchers, more code writers. The numbers speak for themselves.

Not so, Madsbjerg writes. Numbers almost never speak for themselves; they need humans to interpret them. In his early chapters, Madsbjerg details several high-profile incidents where numbers, adrift from human context, pointed to the exact opposite of reality. Software failed to predict movements in population, economy, even disease epidemiology. Only a well-informed human could restore the context these numbers lacked, giving them power to mean anything in the real world.

From this foundation, Madsbjerg builds five formal maxims about how humanities make American business possible. I could list them: statements like “Culture—Not Individuals,” or “The North Star—Not GPS.” But unlike too many business writers, who dispense fortune cookie platitudes with casual disregard, the real joy in reading Madsbjerg comes from his explanations. A schooled philosopher himself, Madsbjerg coaches readers through a thought process, not memorized “skillz drillz.”

Why, for instance, does the Ford Motor Company struggle to sell cars in India to people who are, demographically, almost identical to its core customers in America? The answer, which Madsbjerg teases out across several chapters, has roots in cultural circumstances unrelated to cars. On the bare statistics, middle-class Indian urbanites resemble their American peers. Understanding the difference requires setting big data aside and unpacking the respective cultural contexts.

Christian Madsbjerg
This doesn’t mean abandoning technical skills. In my favorite illustration, Madsbjerg describes a Danish architect scouting a location for a prospective Swiss bid. Important aspects of architecture, like engineering properties of glass, steel, and masonry, apply everywhere. But aspects of designing this building, to fit into this business and regional culture, involve understandings not taught in design classrooms. These require understanding language and industry and art—in short, understanding humans.

Madsbjerg, to his credit, does not produce another crinkum-crankum encomium to why liberal arts education makes us better people. I could’ve written that; I probably have. As a business consultant, Madsbjerg maintains focus on economic implications. Liberally educated professionals make better business executives, he insists, because their diverse education allows them to face difficult situations, sift conflicting evidence, and make decisions that improve everyone’s condition.

This requires a complex relationship with information. Business executives who turn data into outcomes don’t simply receive their information; they run it through filters that, for lack of better terminology, resemble anthropology, literature, and art. Business history, and Madsbjerg’s prose, is replete with examples of people well trained to do one thing (spreadsheets, double-entry bookkeeping) who stumbled badly when confronted with the larger picture. Liberally educated professionals simply adapt better.

And Madsbjerg himself is actually a good example of this. I’ve had several books cross my desk recently, offering to make readers into billionaire business icons; most either bury the audience in source notes and statistics, or tell long, rambling anecdotes that seem largely irrelevant. Madsbjerg, by contrast, creates the kind of balance that makes his advice practical: numbers where they’re necessary, stories where they’re relevant, always couched in comprehensible context.

Humans are sensemaking creatures, Madsbjerg writes, thus his title. Increases in data collection and statistical analysis have made sensemaking more powerful, nuanced, and worthwhile. But data never simply exists as-is; it always comes from somewhere, and it requires human intelligence to make it applicable. Without that human intelligence, which comes from understanding literature, social science, and other humanities disciplines, numbers mean nothing, or even create more confusion than they resolve.

I’ve read and reviewed several business books recently, and hated more than I care to recount. The worst are often mere billboards for the authors’ consultancies, comprehensible only if a Harvard MBA or the author is present. Madsbjerg has instead created a manifesto. Businesses, money, and data all serve people, he writes, not vice versa. Understanding this makes the difference between success and failure.