Sunday, November 30, 2025

Internet Censors and Real Speech

The cover art from Sharon Van Etten's Remind Me Tomorrow

I had no idea, until this week, that Sharon Van Etten’s folk-pop electronic album Remind Me Tomorrow might be off-color. Specifically, the cover art. I’ve linked to my review of the album several times on several platforms without incident. But this week, I had a link yanked from Instagram by the parent company, Meta, on the grounds of “child nudity.”

As you can see, the cover image is a childhood snapshot of Van Etten and her brother. That's Van Etten half-folded into a laundry basket, partially unclothed. Small children often hate clothes and have to be conditioned to wear them in time to start school. Because of this, most people recognize a categorical difference between innocent small-kid nakedness and smut. I suspect any impartial judge would consider this the former.

This isn't my first collision with Meta over nudity. I've repeatedly had to appeal blocked links because my essays included Michelangelo's Creation of Adam, a panel from the Sistine Chapel ceiling. It depicts Adam, not yet alive, lolling naked in Eden, genitals visible. For nearly every blog essay that included this image, I've had to appeal a lewdness flag.

Any reasonable person would agree that social media needs basic standards of appropriate behavior. Without a clear, defined threshold, one or a few bad-faith actors could deluge the algorithm with garbage and destroy the common space. Consider the decline of public spaces like Times Square in the 1970s: if nobody defends common spaces, they become dumping grounds for the collective id.

But those standards are necessarily arbitrary. What constitutes offensive behavior? We get different answers if we ask people of different ages, regions, and backgrounds. My grandmother and I have different expectations; likewise, Inglewood, California, and Doddsville, Mississippi, have wildly divergent community standards. But because Facetube and InstaTwit don’t have geographic boundaries, they flatten distinctions of place, race, age, and economic standing.

TikTok perhaps embodies this best. Cutesy-poo euphemisms like "unalived," "pew-pew," and "grape" gained currency on TikTok, and have made it nearly impossible to discuss tender topics directly. YouTube restricts and demonetizes videos for even mentioning crime, death, and the Holocaust. Words like "fascism" and "murder" are the kiss of death. In an American society filthy with violence, the requirement to speak with baby-talk circumspection means that we can't communicate.

Michelangelo's The Creation of Adam, from the Sistine Chapel ceiling

Watching the contortions content creators have to perform whenever called upon to address the latest school shooting or overseas drone strike would be hilarious if it weren't heartbreaking. Americans have to contend with legislative inertia, lobbyist cash, and morally absolute thinking when these catastrophes occur. But then the media behemoths that carry the message have the ability, reminiscent of William Randolph Hearst, to kill stories by burying them.

I'm not the first to complain about this. I've read other critics who recommend simply ignoring the restrictions and writing forthrightly. Which sounds great, in theory. If censorious corporations punish writers for mentioning death and crime too directly, the response is to refuse to comply. As with any mass labor action, large numbers and persistence should remedy the injustice.

In theory.

Practically speaking, media can throttle the message. In the heyday of labor struggles, the era of the Ludlow Massacre and the Battle of Blair Mountain, unions could circumvent media bottlenecks by printing their own newspapers and writing their own folk songs. But most internet content creators lack the skills to program their own social media platforms. Even if they could, they certainly can't afford the server space.

Thus, a few companies have immediate power to choke even slightly controversial messages, power that creators cannot resist. Which elicits the next question: if journalists, commentators, and bloggers cover a story, but Mark Zuckerberg and Elon Musk stifle the distribution, has the coverage actually happened? Who knows what crises currently fester unresolved because we can’t talk about them?

This isn't a call to permit everything. Zuckerberg and Musk can't permit smut on their platforms, or even link to it, because it coarsens and undercuts their business model. But current standards are so censoriously narrow that they let important stories die on the vine. If we can't describe controversial issues using dictionary terms, our media renders us virtually mute.

Given how platforms screen even slightly dangerous topics and strangle stories in their beds, I’m curious whether anyone will even see this essay. I know I lack enough reach to start a movement. But if we can start speaking straightforwardly, without relying on juvenile euphemisms, that represents a step forward from where we stand right now.
