Facebook is censoring me
and most of you too.
Ranjan here, and this week I’ll be writing on social media and censorship. I know this might feel like a lot of Facebook posts in a row from us, but it’s hard, given the proximity to Nov 3rd, to not focus on these topics.
Last week Don Jr. tweeted:
I feel you Don Jr, I really do, because I, too, have felt the heavy hand of Silicon Valley throttling my communications to my friends and followers. Facebook, Instagram, Twitter - they've all censored me, and in this post, I'm ready to fight back.
People, Not Algorithms
To explain what happened, we need to remind ourselves that robotic, magical algorithms don't control who sees what on social media feeds. People do and they always have. Sure, there might be plenty of math and computing power in the content supply chain, but it’s people who very calculatingly craft the parameters that choose what you see.
How much of your life can we get you to give to us? We often talked about, at Facebook, this idea, of being able to just “dial that” as needed. And we talked about, you know, Mark (Zuckerberg) have those dials… “let’s dial up the ads a little bit”, dial up the monetization, just slightly… At all these companies there’s that level of precision.
Okay, the whole documentary was kind of ridiculous, but that visual of Three Petes from Mad Men and the concept of dials, while a bit absurd, is a very good mental model. Your algorithmic feed has always been about people and dials.
And then last week, there was a damning nugget from the WSJ:
In late 2017, when Facebook tweaked its newsfeed algorithm to minimize the presence of political news, policy executives were concerned about the outsize impact of the changes on the right, including the Daily Wire, people familiar with the matter said. Engineers redesigned their intended changes so that left-leaning sites like Mother Jones were affected more than previously planned, the people said. Mr. Zuckerberg approved the plans.
Engineers cranking dials. Sure, they didn't choose the exact photo or comment, but they very purposefully defined the rules that would choose exactly what you would be seeing. There were KPIs and OKRs, and whiteboards and all-nighters underlying those decisions. These decisions are as editorial as it gets, there’s no black box.
The choices made in structuring a content ranking algorithm are on par with Jack Dorsey saying you can't share that NY Post link. They may provide digital cover for those behind the curtain, but even if you didn't manually block that one story, your engineering team has blocked billions of stories from ever seeing the light of day.
The warning labels, retweet UX prompts, and messy content removals are just the most recent iteration of people at platforms choosing what users should see. The day Facebook went algorithmic, with an advertising-based business model, they started censoring the boring and mundane.
Let Boredom Ring
A few years ago, before deleting my Facebook in 2018, I ran a scenario in my mind. I'd post something boring to my Facebook feed and it would get zero engagement. The post would need to be so perfectly boring that there could be no world in which someone would feel any desire to react or engage. It couldn’t even be ironically boring. Absolutely no element of the post would trigger an iota of dopamine.
[This is it. The single most boring thing I could come up with. Could you do better?]
But it would be a setup.
I'd have some lawyer who was basically a good version of Charles Harder (the Gawker-Hogan, now Trump guy) file a lawsuit that my post had been censored. My boring post had just as much a right to be exposed to all of my friends as any engagement-bait, and Facebook was explicitly limiting my freedom of speech.
It was active censorship because Facebook has always favored the salacious over the mundane, no different than favoring the conservative over the liberal. We've oddly accepted this as a weird natural state of content, but it's not remotely neutral nor natural. It's just a choice.
It's why that image of dials stuck with me. From the day they went algorithmic, Facebook started throttling the un-engaging. The boring among us became voiceless. Tempered viewpoints are shut down. The mundane moments of life are no longer worth sharing. That off-center, not-quite-in-focus photo of your kids had every right to show up in your friends' feeds as the latest post from Dan Bongino, but it won't.
Groups and Discussions
If a post never shows up in a feed, was it ever posted?
Throttling the mundane does pose a longer-term risk to the business of social platforms as “average” users might stop posting and sharing. They’ll probably still check their feeds regularly, but that’s not enough. Mark Zuckerberg may have called Elizabeth Warren an existential threat, but losing casual users is the real one. The reserved people who don't want to share strong opinions slowly fade away and all we'll be left with are the dicks.
While Mark Zuckerberg is blind to the longer-term societal implications of his company's work, he is an oracle when it comes to understanding competitive threats, both external and internal. In January 2018, Facebook introduced a major change to the algorithm where they would push more "personal" content over "public". The 2016 election and subsequent fake news conversation were clearly having an impact, and it seemed like a reasonable reaction. According to Zuckerberg:
But recently we've gotten feedback from our community that public content -- posts from businesses, brands and media -- is crowding out the personal moments that lead us to connect more with each other.
And they were very transparent in a post called "Bringing People Closer Together":
With this update, we will also prioritize posts that spark conversations and meaningful interactions between people. To do this, we will predict which posts you might want to interact with your friends about, and show these posts higher in feed. These are posts that inspire back-and-forth discussion in the comments and posts that you might want to share and react to – whether that’s a post from a friend seeking advice, a friend asking for recommendations for a trip, or a news article or video prompting lots of discussion.
There were clearly internal active user metrics raising red flags. They needed to get the normals engaging again so they pushed people towards Groups.
Facebook is often criticized for a lack of transparency, but sometimes it’s shocking just how open they are. Adam Mosseri openly told us they would increase the weight of comments in choosing what you see. It’s right out there in the open! If your post is not worthy of “generating discussion” then it will be throttled.
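To make the "dials" mental model concrete, here's a minimal, purely illustrative sketch of weighted feed ranking. Every signal name, weight, and post in it is made up; real ranking systems are vastly more complex. But the core dynamic is the same: whoever sets the weights decides what surfaces, and cranking up the "comments" dial is all it takes to bury the mundane.

```python
# Toy "dials" model of feed ranking. All weights and post data are
# hypothetical; this only illustrates the mechanism, not any real system.

def score(post, weights):
    """Weighted engagement score: each dial amplifies one signal."""
    return sum(w * post.get(signal, 0) for signal, w in weights.items())

# Imagined dials after a "prioritize discussion" tweak:
# comments count far more than passive views.
weights = {"comments": 5.0, "shares": 3.0, "reactions": 1.0, "views": 0.01}

posts = [
    {"id": "boring-family-photo", "views": 400, "reactions": 2},
    {"id": "outrage-bait-link", "views": 400, "reactions": 50,
     "comments": 120, "shares": 30},
]

ranked = sorted(posts, key=lambda p: score(p, weights), reverse=True)
print([p["id"] for p in ranked])
```

Nobody hand-picked the family photo for suppression; the weights did it automatically, at scale, for every post. That's the editorial decision hiding inside the dial.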
Social platforms have very carefully trained us on how to "not be boring" in the very specific mold that they define. If you want to exist, you learn the Twitter jokes, you lean into the Instagram poses, or you caption the Facebook picture in just the right way. The 2018 algorithm change was supposed to connect people more closely to the things they care about, but Facebook did not restructure the system by which things are judged. The inputs changed, but the incentives did not. Maybe a comment thread replaced a 3-minute autoplay video as your ticket to the top of a feed, but it was still the crazy and loud over the commonplace and tedious.
As 2020 feels more like 2020 with every day that passes, I can’t help but wonder: Facebook needed to get the normals back. After years of censoring the boring among us, they needed them to want to take part again. But instead of making the platform reward normalcy, did they make the normals just a little bit crazier?
And if you don’t think they can influence mass behavioral change, let’s never forget that in 2015, to get people used to their flagging video product (a key business objective, because video CPMs are way higher), Mark Zuckerberg had you all dumping ice water on your heads.
The conversation around social platforms and speech will not end on November 4th, regardless of what happens. The dam has broken. As platforms become more clumsy and reactive in the coming weeks and months, it’s worth remembering the boring and uninteresting have been actively throttled for years now.
Note 1: I know the social platforms are taking a lot of heat for the messy efforts they’re making to prevent major electoral disinformation. I do want to commend some of the efforts - throttling virality is Plank 3 of the Margins Five Point Plan to fix social media! And just three days ago, Facebook apparently rolled out a new design that makes it easier to go reverse chronological!!!
That was Plank 1 of our plan, and as you can tell from this post, the single most important thing I think social platforms need to do. Please Twitter, stop telling me to “Go back home: You’ll see Top Tweets first” for switching back to an algorithmically curated feed. Reverse chrono is my home.
Note 2: There’s been a Facebook Top Ten Twitter account (inspired by, and maybe run by, Kevin Roose) which posts the ten highest-performing links according to Facebook’s own CrowdTangle data. It’s always a bit of a right-wing cesspool, and the Facebook folks regularly counter that these lists only capture posts with high ‘engagement’ (i.e. comments and reactions) versus ‘reach’.
Going back and re-reading the 2018 Adam Mosseri post does make those arguments feel a bit more disingenuous.