On Substack and Abandoning Our Collective Morality

"It’s about believing your own hype, which, untreated, can lead to acute ego-monster syndrome."

Image created with a lot of back and forth between Erica and DALL·E 3

In the wake of news that Casey Newton’s popular Platformer newsletter is leaving Substack over an ongoing dispute about the media company’s responsibility to ban Nazis, our co-founder at MoP reminded me that I’m something of an expert in comment moderation, so maybe I should write something about it.

My first reaction was that I’m not sure I have anything interesting to say about it. After all, the Internet has come a long way since a band of extremely online nerds and I convinced media orgs that comments sections should probably not be an open landfill at the bottom of their pages. Any company that promotes and re-distributes content carries a baseline moral responsibility for the nature of said content.

Nazis, you hopefully agree, are intrinsically below that baseline, given that Nazi ideology is tautologically genocidal. Genocide is evil, and as such, it is covered by absolutely any form of ‘moral responsibility.’ I suppose I can let you be the judge of whether or not that’s an interesting thing to say.

The fact that Substack and its defenders apparently found it interesting until last week or so, when they finally… well, implied they would ban Nazis going forward, is actually interesting to me. So, as usual, you win this round, Erica!

I’ve already linked it above, but I highly recommend Casey Newton’s excellent explanation of this affair in his Substack swan song piece if you need more background info on this dispute. 

Now, despite that very clear declarative statement, I do empathize with what Substack believes is the right thing to do. An Internet where ideological capture and suppression ran unchecked could catastrophically fail to reflect the real world, leaving societies entirely unprepared for extremist black swan events, leading to a more brittle and dystopian social existence. 

As a college newspaper journalist at a big conference in 2007 or so, I remember being horrified when an editor told us young reporters that they’d recently decided to simply remove comments they thought were misinformed or insulting. During question time, I made an argument in the same genre as Substack CEO Hamish McKenzie’s. (At this same conference, a Des Moines Register politics reporter told us that he never votes, so that he might maintain his journalistic remove, which continues to be one of the most befuddling public statements I have ever heard, but I digress.)

On Dec. 21, McKenzie wrote: “We believe that supporting individual rights and civil liberties while subjecting ideas to open discourse is the best way to strip bad ideas of their power. We are committed to upholding and protecting freedom of expression, even when it hurts.”

Since the universe seems to privilege irony above all else, I’d find myself in charge of all NYT comments a few short years later, taking a very different approach from both my previous McKenzie-ian outrage and that editor’s ‘just get rid of it’ credo.

But anyways, I do empathize. As the moderator of any ideologically diverse public forum, you might notice relatively extreme voices fighting to be heard, and even if you disagree, you see the value they add, the high-quality responses they inspire. So, you create structures and rules to protect them – to maintain the special uniqueness of your forum. Taking notice, the extremists quickly fill in all of the space within your rules, and begin probing outside of it. Soon, you find yourself pushing the needs of your regular users toward the margins as extremists fill the vacuum. And still, you find yourself trying to maintain the spirit of the place, that little spark of rebellion.

But really, at this point, it's not about your regular users anymore, or even the extremists, and especially not your business. It’s about believing your own hype, which, untreated, can lead to acute ego-monster syndrome. Ref: Ackman, Bill or Rodgers, Aaron

In response to the will-we-won’t-we Nazi dilemma, Substack isn’t changing its moderation policies, even as it removes a certain number of Nazi publications pointed out to it. Instead, it promises to do more to help the community flag offensive content and customize their experience. That brings me to a couple of points of disagreement on moderation I have with Casey Newton and many of Substack’s online critics:

  • I agree with the Substack team that new enforcement actions do not necessarily mean that your public moderation policy should change. A good moderation team typically maintains a fairly static set of public guidelines and an ever-changing private interpretation for what these words should mean in practice, in the context of a changing culture. However, a clear statement that Substack’s policy against inciting violence will now apply to ALL Nazi pubs has been conspicuously missing. 
  • Community involvement in moderation is not some kind of scheme to hand off responsibility to your readers and independent publishers. Every community exists in a nexus of its own local culture and wider societal trends. Predicting those shifts and how they might affect our own perceptions of individual pieces of content is akin to predicting next week’s weather in Wisconsin. Sometimes, you just need to ask someone in Milwaukee whether or not it's actually raining.

Another question for the moderation critics out there, in the name of intellectual consistency: If Substack didn't have weird reactionary-libertarian politics, how might it best deal with the challenge of moderating these publications at scale? It’s hard to find exact figures, but let's assume something like 100,000 publications and, for intractable economic reasons, a tiny team to watch over them. Here’s where I’d start:

  • Maintain a consistent public representation of your understanding of the moderation problems you face, and the actions you have taken. This will attract critical feedback from interested people.
  • The evolution of online culture is a robust area of academic study. Find universities to partner with, share some relatively significant data, and use their findings to inform your ongoing safety work. 
  • Modify a set of open-source AI tools to rank publications for necessity of internal review.
  • Empower a rotating committee of users to give a wide range of feedback on publications, including potential abuse, and find ways to feed that info back into your premium business to add subscriber value. 
  • Remember: You don’t get to ignore this stuff because of scale. If you can’t do it at all, then you are growing with all the forward-thinking of a cancer cell. Or more nicely: You also can’t tell an employee you’ll pay them sometime in the next six weeks because you don’t have an accountant. 
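To make the AI-triage bullet above concrete, here is a minimal sketch of what "rank publications for necessity of internal review" could look like. Everything in it is a hypothetical illustration, not Substack's actual tooling: the field names, the weights, and the `classifier_score` input (which you'd get from an off-the-shelf text classifier) are all assumptions.

```python
from dataclasses import dataclass

@dataclass
class Publication:
    name: str
    subscriber_count: int
    user_reports: int        # reader flags over some recent window
    classifier_score: float  # 0..1 likelihood of policy-violating content, from a model

def review_priority(pub: Publication) -> float:
    """Blend model output with reader signal, weighted by reach.

    The 0.6 / 0.3 / 0.1 weights are illustrative guesses, not tuned values.
    """
    report_rate = pub.user_reports / max(pub.subscriber_count, 1)
    reach = min(pub.subscriber_count / 10_000, 1.0)  # cap the reach bonus
    return (0.6 * pub.classifier_score
            + 0.3 * min(report_rate * 100, 1.0)
            + 0.1 * reach)

def triage(pubs: list[Publication], budget: int) -> list[Publication]:
    """Return the `budget` publications a small team should review first."""
    return sorted(pubs, key=review_priority, reverse=True)[:budget]
```

The point of a sketch like this isn't the specific weights; it's that a tiny team plus a ranking function turns "we can't read 100,000 publications" into "we read the top of the queue every day," which is how every small trust-and-safety operation actually works.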

None of this is particularly easy, and the implied answer of many Substack critics, that only the hyper-rich or incumbents that can shoulder the entire cost of moderation get to own platforms on the Internet, is deeply unsatisfying. But thankfully for the length of this piece, Substack is not a platform. Let’s circle back to Casey Newton’s post:

“But for a time, the distribution of that material was limited to those who had signed up to receive it. In that respect, I did not view the decision to host Platformer on Substack as being substantially different from hosting it on, for example, GoDaddy. 
But as I wrote earlier this week, Substack’s aspirations now go far beyond web hosting. It touts the value of its network of publications as a primary reason to use its product, and has built several tools to promote that network.”

The mission of Substack, in short, is to build a “new economic engine for culture.” This particular economic engine, like many others, is a network. A network where people interact socially with independent publications and the owner’s meta-publication. It’s a social network. 

The simple truth Substack’s top brass must face is that for the ethical user, a social network without a sense of mission is a weapon that you may be arming. That’s the source of the pinching feeling you might get when you write a new tweet, the twang you feel when you contribute to most social network machines. Every post is akin to your tax dollars, and it may go toward hurting people you love. Your posts are billionaires’ currency, and in our economic context, it is borderline childish for executives to be outraged (see: weaponizing Casey Newton’s private questions to inveigh against his purported ideology) when people see it that way. 

To be clear: It is not childish to disagree with this philosophy, but it IS childish to petulantly refuse to engage with the way their customers experience the real world, and raises serious questions about the Substack team’s judgment going forward.

If you are building a platform that can’t be a platform for economic reasons, there is only one answer: Build a community with a common goal. Leave the governing to Congress or anyone else who commands an army under the U.S. Constitution. Unless you are a government agent or happen to be in a voting booth, the only free speech you are responsible for maintaining is your own. If you want to have an active hand in the distribution of content, you must have a point of view that includes some people and excludes others. There is no way to capture and re-distribute the entire range of human thought for profit and still be a moral person.

Yes, this is inconvenient toward the goal of getting filthy rich and maximizing investor value through the power of scale. But it is the truth. No ideological truisms allow us to run or hide from our moral responsibilities, even if users can see them much more clearly than the admins.

P.S. Welcome to Ghost, Platformer! Happy to have you here. Feel free to hook us up with a Casey Newton guest spot, thank u in advance. 

