Tuesday, October 29, 2024

“Chemicals,” Food, and You

“3-methyl butyraldehyde is a compound in a blueberry. Think about that.”

Somebody threw this into the ether recently in an argument about whole foods. You know how wise and restrained online debaters are. This person seriously believed they’d made a meaningful point about why people who insist on whole foods and minimal processing were wrong. Because whole foods contain compounds with names that are difficult to pronounce, this person apparently believed all arguments for plant-based whole foods are, a priori, wrong.

In full fairness, the other person in this debate (not me) said something equally unfounded. “If you can’t pronounce an ingredient,” the other person wrote, “DON’T EAT IT!” This person apparently believed in the honest wholesomeness of “natural” ingredients, presuming that naturally occurring, plant-based substances must necessarily be healthful. The first person, by contrast, responded with complete trust in science and technology.

I’ve written about this before; this double-sided fallacy doesn’t bear another unpacking.

However, the 3-methyl butyraldehyde argument deserves some exploration. This person, hidden behind an anonymous sign-on handle and a cartoon avatar, claims that abstruse chemical constituents within whole foods are essentially equal to additives used in manufacturing processed foods. 3-methyl butyraldehyde, which has both naturally occurring and synthetic forms, is found in many commercial foods, both whole and processed.

Blueberries have several naturally occurring chemical constituents. Some are easy to pronounce, including protein, fat, and especially water. Others are more abstruse, such as hydroxylinalool, linoleic acid, and terpinyl acetate. Though most of these chemical compounds are harmless in naturally occurring proportions, some can be harmful if isolated and hyperdosed. Like most organisms, blueberries comprise a subtle, nuanced combination of substances.

However, no combination of these substances, in any quantity, will come together and form a blueberry, not with current science or technology. One can only grow a blueberry by carefully cultivating a blueberry bush, a commitment of time and effort, as blueberry bushes only produce fruit after two or three years. Chemical fertilizers can sometimes hasten fruiting, but at the cost of starchier fruit, which displaces both nutrients and flavor.

One recalls the common counterargument whenever hippies complain about “chemicals.” Some wag, occasionally but not often a scientist, responds: “Everything is chemicals!” To take this argument seriously, the respondent must not know (or pretend not to know) that people say “chemicals” as a synecdoche for synthetic chemicals of an unknown provenance, which, under America’s light-touch regulatory regime, are assumed safe until proven otherwise—cf. Rampton & Stauber.

Though the FDA tests and regulates pharmaceuticals (for now), many food additives, cosmetics, chemicals used in haircare products and clothes, and other things we put on our bodies, are presumed safe. This, despite years of evidence that such presumption isn’t good practice. Ethylene glycol, cyclamate, and several food dyes were regularly used in American foods before being demonstrated unsafe.

Even beyond safety concerns, the reduction of whole foods to their chemical constituents preserves a dangerous idea. Futurists once posited that food scientists would eventually isolate the basic nutrients in food, and effectively replace the tedium of cooking and eating with the simplicity of gelatin capsules. One finds this reasoning behind the mythology of vitamin supplements, now known to be useless for most people most of the time.

Human digestion doesn’t simply extract chemical nutrients from food like a Peterbilt burning diesel. We require the complexity of food, including fats, fiber, roughage, and limited amounts of sugar. I generally side with Michael Pollan’s ubiquitous advice: “Eat food, not too much, mostly plants.” Food doesn’t mean chemical constituents. You don’t make a blueberry smoothie by adding 3-methyl butyraldehyde; you make it by adding blueberries.

Please don’t misunderstand. I want to avoid the trap of assuming that “natural” equals good. Reasonable adults know you shouldn’t pick wild mushrooms or handle poison ivy. That’s an exaggeration, but the point remains that nature requires respect, like any other tool. But human agronomists have selectively bred food crops for 5,000 years to maximize healthful content, and apart from occasional allergies, agriculture is broadly trustworthy.

And pretending that food only consists of its chemical compounds is a bad-faith argument. You wouldn’t describe your friend by listing his tissues and internal organs, because humans are more than the sum of our parts. The same applies to food, including fresh ingredients. Cooking natural ingredients, then processing them with synthetic additives to make them tasty and shelf-stable, does change the food.

Pretending not to understand the other person is smarmy and disrespectful, and if your argument requires it, your argument is probably bad.

Friday, October 25, 2024

A Weird New Era in Conservative Sex

Missouri AG Andrew Bailey

Pioneering Swiss psychologist Carl Jung wrote about “synchronicity,” when two occurrences physically unrelated appear to form a pattern. For him, this meant not the objects themselves, but how the audience perceives the objects, that we imbue life’s circumstances with meaning. When two separated events or objects appear meaningfully related to us, that doesn’t describe the events or objects, but us seeing them. Humans don’t receive meaning, we create it.

In court filings this week, Missouri Attorney General Andrew Bailey asked federal courts to intervene and restrict access to mifepristone, an abortifacient, through telehealth. Missouri had trigger laws which went into effect following the Dobbs decision, making abortion mostly illegal overnight. But Bailey complains that, because recent regulatory changes make mifepristone available without a face-to-face doctor visit, women are circumventing state law to access abortion.

News junkies my age were astounded when Bailey’s filing, supported by attorneys general in Kansas and Idaho, specifically cited the desire to have more teenaged mothers. Bailey complains that, without teenagers becoming pregnant, Missouri’s economy has suffered a workforce shortage, and the state becomes poorer. I’m old enough to have had public-school sex education in the 1980s, when teenaged motherhood fueled paranoia and irrational parental crackdowns nationwide.

Reports quote Bailey, a Republican, claiming that fewer teenage pregnancies result in “diminishment of political representation and loss of federal funds.” In other words, Missouri needs more mothers too young to shoulder the psychological or fiscal burdens of motherhood, in service to the common good. Very Maoist. Bailey’s filing continues from there, including false accusations of medical risk, but media reports focus on the jaw-dropping teen motherhood component.

On Wednesday, former Fox News cornerstone Tucker Carlson introduced former President Trump at a Duluth, Georgia, rally with an extended rant. Carlson described Trump as “Daddy” who was “coming home” to deliver a “spanking” to his electoral opponent, Vice President Kamala Harris. According to reports, audiences responded with spontaneous chants, but even many Trump loyalists in attendance—who also comprise Carlson’s audience base—reportedly felt uncomfortable.

Tucker Carlson, corporate puppet

It's hardly breaking news that Republicans care deeply about American sexual habits. Trump promised at the 2016 Republican National Convention that “I will do everything in my power to protect LGBTQ citizens,” received a striking round of applause, then largely lost interest in the topic. Throughout my political lifetime, Republicans have granted increased freedom to Americans’ money, but simultaneously imposed bleak restrictions on sexual autonomy beyond man-on-top heterosexuality.

For years, the LGBTQ+ community’s supposed sexual licentiousness galvanized Republican support among suburbanites fearing change. Footage of supposed debauchery from Pride parades, caricatured stereotypes of Castro District lifestyles, and the artwork of Robert Mapplethorpe and Andres Serrano helped keep squares firmly within the (R) camp. “Don’t Say Gay” bills in Florida and elsewhere have been motivated, partly, by keeping sexuality hidden from children.

Therefore this sudden pivot to sex positivity seems weird. For Republicans to not only disclose their bedroom kinks publicly, but to seemingly encourage adolescent sexuality, contradicts everything I once believed as a teenage Republican. The conservative milieu in which I hit sexual maturity, suffused with Christian purity culture, feared youthful sex, but like anything kinky, it simultaneously wallowed in that which it feared.

The same voting base that supposedly elected Andrew Bailey (he was actually appointed) and bankrolls Tucker Carlson, also gave America the moral horror of virginity pledges and purity balls. These rituals sought to control teenage sexual exploration, but to control it, they first foregrounded it, walking youths through the temptations they’d putatively confront. Unsurprisingly, conservative Christian youths are, if anything, more likely than general American teens to have premarital sex.

As revolting as Bailey and Carlson are, they’re only two data points, not indicative of American conservatism overall. It’s unclear what fraction of Republicans they represent. Yet their apparent willingness to reverse course and condone public sexuality bespeaks a changing ethic atop American conservatism. Republicans of my generation asserted an overwhelming interest in intervening to preserve teenage celibacy. Now Bailey wants to pimp teenagers out for the common weal.

It apparently never occurs to these reprobates that, to make their states grow, they should make their states into places people want to live. Economically vibrant, demographically diverse, and yes, open to multiple sexual expressions. These attempts to weaponize human sexuality seem more likely to alienate than invite voters. Instead, these activities turn citizens into livestock, and lawmakers into pornographers.

Sex positivity doesn’t mean public luridness, or breeding teens like puppy mills. Republicans need to learn this if they hope to survive.

Thursday, October 24, 2024

Large-Group Dynamics and the Lonely Child

young woman with books leans against the school library shelves

Nobody actually likes the popular kids in high school. You wouldn’t know that from the deference they receive, from peers and teachers alike. Yet several years ago, reading Dr. Ellen Hendriksen, I found the author delving into several studies of how people make friends—and the outcomes were surprising, and frequently ugly. Our social structure relies on principles which we frequently can’t see or understand.

Quoting a 1998 study by Dr. Jennifer Parkhurst et al., Hendriksen writes that Parkhurst studied high school social dynamics, a popular field in social psychology. They concluded that popular kids are well-liked, amiable, and natural leaders. But Parkhurst took the unusual step of reading her outcomes to the students she’d studied. To her astonishment, one of her subjects stood up and said (I’m paraphrasing): “Nuh-uh!”

One of Parkhurst’s student subjects, supported by others, reported that peers often widely dislike, even despise, the “popular” kids. They achieve popularity by dominating others, throwing their weight and social connections around, and behaving in an entitled manner. Parkhurst, astounded by the outcomes (and probably suffering her own flashbacks to adolescence), reevaluated the data. Turns out, people obey popular kids mostly out of fear and fatigue.

Growing up in a military household, my family moved frequently. Many military brats say likewise, but my father served in the Coast Guard, which mostly operates domestically, and therefore can afford to move personnel more frequently than other branches. Only once did we stay anywhere longer than two years. This proved particularly frustrating because, I now realize, most schools have an unofficial hazing process usually lasting a year.

Without the longitudinal experience that comes from staying in one place for long, I truly never learned to read group dynamics in large populations. If Hendriksen hadn’t reprinted Parkhurst’s findings, translated into vernacular English, I might’ve persisted in believing that I received that hazing alone, unaware that everyone else experienced it too. I certainly would’ve remained mired in the delusion that the popular kids spoke for everyone.

(I know others, like migrant farmworkers’ kids, undoubtedly have it worse. I’m not comparing scars here.)

young child sits alone amid a crowd of active children

Put another way, I legitimately believed, not only throughout childhood but well into adulthood, that the loudest, most attention-hungry person in the room spoke for everyone. Presumably we all experience that phase, including that person. You presumably watched Mean Girls too. The persons demanding others’ attention and obedience legitimately believe they’re shepherding the crowd where it wants to go, simply keeping stragglers in line.

Something which former gymnast turned lawyer Rachael Denhollander said recently stuck with me. Speaking in the documentary For Our Daughters, Denhollander said: “It costs you something to side with the weak and the vulnerable and the oppressed. It costs you nothing to side with the one who’s in power.” Denhollander meant this about women and girls sexually abused in church, but it applies, mutatis mutandis, to all relationships with power.

For most children, public schools are our first interaction with organized power. Teachers have nigh-absolute power over their students, and I believe most wield that power with benevolent intentions. But as with most powerful people, there’s a gulf between intention and act. Whether they bend to a malicious minority, or go along with administrative dictates to get along, the outcome is largely similar for students, inexperienced at resisting injustice.

Popular kids and “mean girls” basically reproduce the regimes they witness, filtered through children’s eyes. They misunderstand the larger purposes behind adult authority; they only witness the demand for obedience and conformity, and repeat it. Meanwhile, adults don’t think like children, and attribute adult reasoning to childlike behavior. Both the popular kids and the subject-blind adults side with the powerful, which costs them nothing.

Kids could, hypothetically, organize against the popular kids and the adults who enable them. Indeed, something Malcolm Gladwell wrote recently stands out, that subgroups like Goths resist by making themselves look unapproachable, thus exempting themselves from popularity dynamics. But the outcasts shepherded by the cool kids, almost by definition, lack the leadership and organizational skills to unionize and form healthier social dynamics. They’re doomed to struggle.

My father timed his retirement to coincide with my high school graduation, whereupon the family relocated one last time, to their hometown. This dumped me into adult responsibilities with no existing social network to streamline the transition. I hope other “nerds” and outcasts at least preserved their nominal support systems, because to this day, I struggle to read rooms. No wonder so many adults still have nightmares about high school.

Monday, October 21, 2024

First Dates, Economic Violence, and Coffee

young couple on a date

Recently, I’ve seen two social media influencers who propounded the “rule” that women shouldn’t settle for coffee on a first date. These two influencers, both women, appear to be working separately, and use unique wording, so they probably aren’t bots. But they agreed that, unless a man takes the initiative to plan a complicated, and implicitly expensive, first date, he wasn’t worth a woman’s time. No coffee; meals at a minimum.

Social media has provided would-be influencers a platform to peddle their malarkey, and many have. Few have achieved the cultural permeation of, say, Emily Post’s Etiquette or that 1994 encomium to personal repression, The Rules. But others have become widespread. Thank-you notes for job interviews, a thing that literally didn’t exist when I entered the job market in the 1990s, are now virtually mandatory in most industries.

Rules exist because we need them. Imagine a society without rules against, say, murder. I believe most people wouldn’t murder, but without a specific prohibition, a tiny minority would. Rules exist not only to standardize the principles we use to operate a complicated society, but to standardize punishments for those who transgress. Rules, written in advance and enforced by putatively neutral arbiters, make a safe and reliable society possible.

But the very processes which create rules, frequently turn those rules abusive. As Joe Nocera writes, universities created the NCAA to standardize the rules for American football. But once those rules existed, the NCAA had to enforce them—and it’s become notorious for enforcing them in arbitrary, high-handed ways which frequently disadvantage student athletes. The players hit hardest tend to be disproportionately Black and poor.

The NCAA is perhaps a frivolous example; little truly depends on BCS game rules. But the same principles apply to more significant examples. The perseverance of the Electoral College shows how rules written to preserve slaveholders’ economic advantage remain part of American law. The famous difference between sentencing standards for powdered cocaine versus crack shows how lawmakers continue inserting such inequality into law.

Rules inevitably create hierarchies. Even benevolent rules, like laws against murder, enshrine some people (and not others) as worthy to enforce those laws. Rules demanding job interviewees write thank-you notes protect those who not only know to do so, but have the available time. And because rules almost always arise because somebody did something that needs correcting, rules contain moral judgement against some social out-group.

interior of a starbucks coffee store

When self-appointed arbiters create dating rules, we should ask whom these rules serve. Declaring that a man must present a lush evening’s entertainment, on his own dime, seems facially neutral. But it serves those who have money enough to observe the rule—which means it serves men who are generally older, Whiter, and from stable urban environments. In other words, only men already well-protected by American society deserve romantic love.

Because the rules don’t specifically say rich men deserve women, one can argue that the rules don’t marginalize women. But that’s like saying redlining isn’t racist because it doesn’t name race. Making first dates more expensive grants advantages to those already well protected by American law and economics. It functionally states that only those who have money deserve women—an attitude that reduces women to the level of market commodities.

The #MeToo movement arose in 2017 because a subset of American men believed themselves more deserving of women’s affection, romance, and sex. These men were mostly well rewarded by America’s unequal distribution of wealth, and regarded access to women as just another perquisite. Money encouraged America’s worst men to degrade and dehumanize women, specifically because they had money.

Remember, coffee dates were recommended specifically to let women escape potentially hazardous encounters. Any rule, no matter who writes it, reflects the moral presumptions of its authors, and the informal rule of having low-commitment first dates in public places stemmed from the reality that a dangerous minority of men cannot be trusted. If the last decade taught us anything, it’s that those men are more likely to have money to splash out.

Some critics might respond that some men use inexpensive first dates as access to cheap sex. This is true; no rule is perfect. But the sense of entitlement that money fosters has historically served women very poorly, except perhaps the minority of women with comparable social and economic status to their men.

Rules are never morally neutral, and always reflect the people who write them. If those writers are rich, well-protected, and high-status, they’re also probably untrustworthy, and so are their made-up rules.

Thursday, October 10, 2024

The Magic of Predicting the Past

Malcolm Gladwell, Revenge of the Tipping Point: Overstories, Superspreaders, and the Rise of Social Engineering

Clear back in the year 2000, a New Yorker staff writer published his first book, an overview of recent behavioral science and sociology. The Tipping Point compiled insights from disciplines that frequently don’t communicate well, and distilled their insights into readable vernacular for generalist readers. Malcolm Gladwell introduced an epidemic-like analysis of human group behavior that continues to influence how Americans perceive group behavior and social movements.

A quarter century later, real life has field-tested Gladwell’s precepts, sometimes in direct response to his book. Several of his principles have proven reliable, while others have been OTBE’d: Overtaken By Events. Gladwell takes this anniversary of his career-making book as an occasion to revisit and update it. As journalism of developing social science, it’s interesting reading, though it obviously rehashes the past. As science itself, sadly, it leaves something to be desired.

Gladwell’s dominant principle, that social movements develop like epidemics, seems especially timely because, in the intervening time, two epidemics have gripped American awareness: opioids and COVID-19. (He plays coy that he’s talking about opioids, because he wants his precepts to matter more than the details, but his target audience isn’t fooled.) We can revisit these two epidemics and determine whether Gladwell’s principles accurately describe how these outbreaks unfolded.

First, Gladwell describes the “overstory.” This term comes from ecology, describing the lush efflorescence of life which occupies the canopy of a rain forest, but Gladwell redefines it to fit the sweeping cultural context that humans occupy without conscious awareness. For example, he identifies Miami’s exceedingly high incidence of Medicare fraud with contexts like the Liberty City riots and the Mariel boatlift. My anti-racist spidey sense started tingling.

Then, Gladwell examines the necessity of diversity that protects against contagion. This means both literal diversity—cheetahs are so genetically homogeneous that ordinary diseases can devastate communities—and metaphorical diversity, like the differences of backgrounds and goals that defend high schools against pathological groupthink. Nothing here should surprise anybody who got a B in high school biology class, but in today’s cultural milieu, it definitely needs restating.

Finally, Gladwell unpacks why some individuals become “superspreaders,” a term most Americans probably first encountered during the pandemic. Some people prove contagious beyond the limits which the laws of chance would determine. Epidemiologists can usually determine fairly straightforwardly which individuals spread, and why, though usually only after they’ve infected others. Gladwell wants social science to have this same trend toward geometric absolutism.

Malcolm Gladwell

Sadly, while his principles seem plausible, they suffer because we only identify them retrospectively. He says, fairly late in the book, that the visible signs of tipping points, the big revolutions of public insight, exist in plain sight. The only reason we miss them, Gladwell asserts, is because we’re looking in the wrong direction. We persistently think change is years away, or longer, until the moment when change happens, usually quickly.

It probably reflects my prejudices, but one example persists in my memory: the sudden reversal of public opinion on gay marriage. The topic consistently failed at the polls, most notably California’s Proposition 8. Then abruptly, public sentiment reversed itself, and suddenly most Americans, including a critical mass of conservatives, became accepting of gay marriage. The tipping point, according to Gladwell? The NBC sitcom Will & Grace.

Leave aside that the show’s initial run ended nine years before SCOTUS affirmed gay marriage in Obergefell v. Hodges. Or that, at its ratings peak, it topped out at only 24 million viewers, considerably below M*A*S*H, which notably failed to turn American sentiment against war. The very idea that one TV sitcom could reverse most Americans’ moral precepts seems laughable. This is especially true for series with overt political aims, like Mr. Birchum, which is a global laughingstock.

If the conditions which precipitate the turning point are plainly visible, as Gladwell writes, then seemingly, at least one tipping point should’ve been visible. But nobody predicted Will & Grace changing American politics, nor the McKinsey corporation successfully turning Oxycontin into a public pestilence. Gladwell identifies these consequences because they already happened. In other words, Gladwell congratulates himself for successfully predicting the past. The future remains opaque.

Duncan J. Watts calls this tendency “creeping determinism.” We see outcomes as inevitable, Watts contends, because they happened. That’s exactly what happens here: Gladwell spends no thought for alternate contingencies, and doesn’t consider any attempts which failed, only those that succeeded. Thus he creates an overstory that’s factually correct, but not useful. Models like this only matter if they predict outcomes in advance, a goal which remains tragically elusive.

Monday, October 7, 2024

We Don't Have to Serve the Network

Elon Musk

On Monday, September 30th, 2024, the Verizon mobile phone and data network failed for millions of Americans in multiple markets. This had obvious immediate effects for personal communications and media consumption. But for innumerable gig-economy workers, such as drivers for Über, Lyft, and DoorDash, the outage impeded their ability to work and make a living.

This follows recent outages for other digital service providers. Since Elon Musk took over Xitter and cut support personnel, the site has suffered periodic failures. This impedes many companies’ ability to communicate en masse with their customers, but is more an inconvenience than a catastrophe. Far worse is the occasional Meta shutdown, as Facebook is the login portal for numerous other digital services.

And, holy cow, remember how many services went dark following the most recent Microsoft outage? The post-pandemic remote work boom depends, in no small part, on how Microsoft and Meta have standardized communications and data sharing. Such outages steal workers’ ability to do fundamental jobs in a decentralized, autonomous way. Modernity absolutely requires standardization.

Consider how many documents you’ve stored on the much-vaunted Cloud. Not only do virtual data systems, like Google Drive, rely on the Cloud; even Microsoft Word now steers users toward saving documents to the Cloud by default. Not to disparage the sharing and backup possibilities that the Cloud makes possible, but it also creates unprecedented vulnerabilities, both to data outages and to hackers. Everything you write is possibly exposed to mass attacks.

Bill Gates

How, though, do we remediate this? The decentralized post-pandemic intellectual workspace requires standardized data platforms and content sharing, which requires large companies to coordinate our systems. Put another way, there’s no such thing as an artisanal cellphone network. Numerous small businesses—from conventional businesses to spunky Etsy entrepreneurs—need concentrated data sharing.

This creates an interesting tension. We valorize entrepreneurs, freelancers, and the self-employed as economic drivers. But to have any market outside a reasonable driving distance, start-up innovators require massive corporations with thousands, if not millions, of workers. The “gig economy” treats every DoorDasher or Über driver as a separate small business. But without the umbrella corporation coordinating their options, their “businesses” vanish.

As a Distributist, I believe that economics should prize personal autonomy and abjure resource hoarding. When carpenters own their tools, and farmers own their land, they have the freedom to enter fair contracts, or refuse unfair ones. But the Distributist model, first postulated in the 1910s, simply didn’t anticipate today’s industrial complexity or global economy. Our world has become less fair, not more.

Therefore, the Verizon outage forces me to reevaluate my own beliefs. Distributed post-industrial economic power requires reliable communication and data storage standards, only possible when our communications systems share a language and a network. That level of coordination only happens when boosted by large corporations. Distributed freedom for some requires wage servitude for others.

Mark Zuckerberg

Worse, while the liberty of freelance work hypothetically strengthens individual workers, the coordination necessary to actually find work proves terribly brittle. Your DoorDasher can freely accept or refuse each proffered job, but when the network collapses, every job vanishes, leaving the worker without options. In practice, disaggregated freelancers don’t work for themselves, they work for the network.

One could expand this argument to encompass the entire economy. Capitalism’s defenders claim that market capitalism is small-d democratic because we can accept or reject any transaction. This may hold for each unique transaction, but we cannot opt out altogether. We eventually need food, shelter, and clothing, which inevitably means transactions, which means buying into the economy.

Pure libertarian freedom, therefore, is always an illusion. We’re never independent actors; we rely on networks of trust, industry, and solidarity. Whether that means trusting our employers to deal honestly, relying upon industrial processes to keep us connected, or practicing solidarity with our peers (join the union or perish!), we remain permanently enmeshed in networks. I never exist truly alone, but always positionally, within community.

I’m old enough to have watched Communism collapse live on network television. The retreat of the Eastern Bloc revealed environmental blight, impoverished communities, and personal alienation—the same effects we witness now, in late American capitalism. The conventional economic binary which the Cold War bequeathed us proves massively unprepared for the post-industrial information economy.

When the network leaves us vulnerable, we need to recognize that the economy is a wholly owned subsidiary of our communities, not vice versa. We require new paradigms that organize our economy, technology, and government to serve us. And I’m only just beginning to delve into how that might work.