
The Algorithm and the Metamorphosis

Suppose in a strange Gregor Samsa-esque incident, you awoke one morning and found yourself transformed into Jack Dorsey. How? Why? That’s not the point.

You’re Jack Dorsey now, complete with his responsibilities and memories. You recall, a few years back, being shocked at just how easily Nazis, terrorists, conspiracy mongers, and foreign intelligence agencies had turned your platform into a weapon that was increasingly tearing the world apart.

Let’s put aside how long it took you to decide that letting terrorists recruit on your platform was a bad idea. Today sir, you have decided to clean up your act.

And then there’s a little tabloid story that really looks like tactical misinformation. It’s tied to a notorious spin doctor with a history of stunts like this, it claims to validate a debunked conspiracy theory through a series of events stranger than fiction, and there’s a pretty good chance your friends at the FBI mentioned that something like this might be coming.

It’s definitely going to get about a billion retweets.

What do you do?

  1. Let it go! And possibly let a moderately clever propagandist change the course of world history in the span of a few hours.
  2. Slam on the brakes, if only for a few hours while more information comes to light, and be accused of censorship.

This is a terrible position to be in. And nobody feels even a little bit bad for you; this is a conundrum you spent a decade creating for yourself.

The clock is ticking. You have 60 seconds to decide.

Tick, tick, tick

Of course, this isn’t hypothetical.1 And I have nothing novel to add about the specifics of the case or whether the social networks did the right thing.

But let’s look at those choices again.

  1. Let it run, which means your platform can empower a con artist to change the course of history with a well-placed lie in the span of one night. This is horrifying; nobody should have that much power.
  2. Block it. Actively prevent people from discussing this at all. Even if it’s used for good reasons, this is alarming. Nobody should have this much power.

There is no real third option. No matter what you do, the platform is an agent that either accelerates the conspiracy theory or suppresses it.

Now, hand-wringers will wring their hands and talking bobble heads will bobble about where the line is and where the risks lie, but I think they all kind of miss the point.

You shouldn’t be in a position to make this choice in the first place. The problem is a deep, fundamental flaw with the machinery that makes it necessary.

The better demons of our nature

The clearest way to see why this is so hard is to compare the big modern social networks to their alternatives.

Old forums were moderated and chronological. They could get gross, but they could only get so big, and so if you wanted to run a fringe website, you could, but you would basically be confined to your four walls.2

Reddit is another interesting counterexample. It’s built out of contained rooms, and reach is controlled by voting. Since you vote explicitly up or down, you can look at a post, say “this is stupid” and push it down.

Voting-based communities become a parody of themselves at scale. HN, which has similar mechanics, is apparently full of people who love taking notes and boost anything about note-taking software or techniques to the top. Weird? Yes. But under those same rules, vile posts can’t travel except in spaces that are already overwhelmingly sympathetic to them.

I don’t mean to describe this design as a panacea. Reddit notoriously housed a lot of hate groups and twisted niches for years. But when it finally came around to cleaning up material that no reasonable person would tolerate, the way the system was structured meant it could destroy entire networks in a way that other platforms can’t.

The platforms we’re critiquing are both globally scoped and engagement-based. If a post receives likes, views, shares, or comments, it’s shown to more people. In theory that sounds fine, but we’ve been watching these platforms for a decade, and they all trend toward the same endgame.

On every engagement-based platform, the most provocative and controversial material rises to the top, in no small part because all engagement is created equal. Watching a pandemic conspiracy video on YouTube in disgust or morbid curiosity helps it travel further. Subtweeting a racist boosts their ranking. Liking an anarchist political meme encourages the machine to show you more anarchist political memes. Commenting on a piece of spam with a grotesque dead cat in it, in a large Facebook group, to ask the moderators to delete it skyrockets the item to the top of everyone’s feeds.
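The asymmetry between these two designs can be boiled down to a toy sketch. The function names and weights below are invented for illustration; no platform publishes its real ranking code, and real systems are vastly more complicated. The point is structural: in an engagement loop, every interaction adds reach, while explicit voting gives the audience a way to push a post down.

```python
# Toy illustration of the two ranking philosophies described above.
# All names and weights are hypothetical, not any platform's actual code.

def engagement_score(likes, shares, angry_comments, hate_views):
    # Engagement-based ranking: every interaction, regardless of the
    # viewer's intent, contributes positively to reach.
    return likes + 2 * shares + angry_comments + 0.1 * hate_views

def vote_score(upvotes, downvotes):
    # Voting-based ranking: the audience can explicitly say
    # "this is stupid" and subtract from a post's reach.
    return upvotes - downvotes

# A post that almost everyone hates but can't stop reacting to:
outrage = engagement_score(likes=10, shares=500, angry_comments=2000, hate_views=100_000)

# The same hostile audience under explicit voting mechanics:
voted = vote_score(upvotes=10, downvotes=2500)

# Under engagement mechanics the outrage travels; under voting it sinks.
assert outrage > 0 and voted < 0
```

The outrage in the first model is indistinguishable from enthusiasm; in the second, it is a brake.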

These mechanics allow bad actors to take the mic and reach millions of people in minutes.

And that’s why you, having unfortunately transformed into Jack Dorsey, are in the position you’re in.

If Twitter had voting and walls, it would be a different service, but you would be free to go back to parasailing in Hawaii, or whatever tech billionaires do for fun in a pandemic, confident that this stupid post could only circulate so far.

We can’t do better

When I first started writing about social networks, I was basically a crank shouting J’Accuse! at clouds. Now it’s mainstream, complete with pop documentaries and congressional hearings.

We demanded they fix their shit. They promised to do better. It seems like they tried. Over the years we heard about all sorts of policies, processes, and more than a few purges of terrible people.

And yet, the Noise Machine™ only seems to be running hotter and spewing more toxic trash than ever.

What if the reason it’s still doing this isn’t because its operators don’t care, but because it’s impossible? What if it’s not a coincidence that every large engagement loop system grows into a misinformation machine, but the unavoidable result of that design? What if the reason it’s gotten worse year by year is just that more bad actors understand the playbook?

And this is where I believe we are. That extreme moderation steps seem so necessary is just a symptom. Under a better design, we wouldn’t sweat the conspiracy mongers because, like the JFK or Roswell truthers, their voices would be just an odd curiosity that doesn’t go anywhere important.

The Noise Machine’s algorithms, and all the mechanics around them, need to go. Breaking up companies won’t fix it, more AI won’t fix it, and digital literacy campaigns will make everyone feel better but won’t stem the tide of a system that intrinsically benefits extremists.


When Twitter shut down all the verified accounts over a Bitcoin scam, a lot of people who are still on it3 mentioned that this was the best version of Twitter they’d ever seen. It calmed down and went back to being the quirky chat full of normal people that they originally joined.

Twitter is weird because it’s more literal than its competitors. There is an algorithm, but public figures have wild amplification powers. Is it possible that, by knocking out all the people with the most powerful reply and retweet buttons, what the lockdown really did was stop the superspreader events?

The few people I know who still defend Twitter talk about private Twitter, or some other deeply limited version where they only engage with an inner circle. Which I guess makes sense?

What this might give you is a roadmap. For you, Jack Dorsey, can have a revelation. You don’t necessarily need to pull the plug on your machine to save the world—there’s probably too much money and momentum involved to let that happen without government intervention anyway.

But perhaps it’s within your power to force it through its own metamorphosis. The Noise Machine™ may wake up one day and discover that it has become something new: slower, quirkier, more personal, easier to moderate, with lower stakes, more walls, and way less virality.

Come to think of it, it sounds a bit like the old internet.

This too shall not pass

I hear a lot of complaints in tech circles suggesting this critique is a fad, or some kind of anti-tech bias, something that will come and go eventually. Which is… an opinion.

The problem won’t go away. It won’t stop when the pandemic ends, or after the election, or with some really complicated EU regulatory action.

And we should have no illusions: the Salem witch trials, European anti-Semites, and the Klan didn’t need the internet to result in a lot of murder. Every century we think the previous generation just wasn’t as culturally advanced as we are, but human nature doesn’t change when you give everyone an iPhone and serve cocktails at brunch. The virus will be with us forever.

Still, it’s time that we, and especially you (given you’re now trapped being Jack Dorsey, and again, I’m genuinely sorry about that), recognize that we’ve created an information ecosystem that aids and abets terrible ideas, that the window for observing and experimenting with small fixes has come and gone, and that there’s only one real option left.

For western civilization to survive, the engagement loop has to go. And the clock is ticking.