Hi. Can here.
People often say "if you are not paying for it, then you are the product." It is not entirely incorrect: if you are using a service that costs money to run and you aren't paying for it with your hard-earned cash, the service is most likely monetizing something you do on it. Maybe they run ads, essentially monetizing your time and attention, or maybe they just collect data on your activity and sell that. There are many ways.
It's often not as simple as that, though. It might be true that if you are not paying for it, you are being sold, but that doesn't rule out that you can opt out of the transaction entirely. The cattle on a farm might not have a choice in how they are sold and milked and butchered, but you can always decide not to use the product. There's still a ton of agency left to individuals. The products might be addictive, and there could be network effects and natural (and artificial) monopolies that dampen this agency, but it's there.
I thought of this agency problem while reading the Wall Street Journal piece on how Zuckerberg went from being a dorm-room libertarian ("Politics is hard, let's go coding") to becoming… a board-room libertarian ("Politics is hard, but I'm harder").
But he wasn't particularly political. A few years after Facebook was created, political advisers met with him to understand how his views might shape company policy, according to a person familiar with the matter. The advisers explained the differences among the American political groups, including Democrats and Republicans, and the meeting ended when Mr. Zuckerberg agreed that he was best described as a libertarian, rather than closely aligned with either major American party.
I have talked before about the mental model I have for companies like Facebook, or any big company that embeds itself this deeply into a country's political machinery. Rather than considering these companies as left- or right-leaning, it's more useful to look at them as the oligarchies they are. Zuck might be a powerful figure, but even his might and ruthlessness can't hold a candle to that of the US government. At some point, it's better to align yourself with whoever is in charge and go from there.
You can argue that this is dishonest, or even sleazy. Probably. I think the fact that we have a trashfire of an administration, led by the single worst, most undignified, and most dangerous president ever, makes the entire thing more dramatic, but the truth is that there's nothing that shocking about it. Facebook does it, and so do Google and even relatively apolitical companies like Apple. It is what it is, as the kids say these days.
What is interesting, however, is how Facebook, which is more accountable to its users' politics since it's primarily a user-generated content site, would navigate a new world where the administration's politics do a 180 from those of its users. In other words, which way will Zuck lean when Facebook's active userbase (in the US) leans heavily conservative but the administration it needs to curry favor with is now left-leaning? Which side do you pick?
If you wanted to be systematic about it, you could, for example, do something like a stakeholder analysis. For Facebook, the list is pretty small. There are the users, the advertisers, the employees, the regulators, and the media. You could also add the board of directors here, but in reality the board exists to rubber-stamp, so why bother?
You could simplify this list even further. While it's true that Zuck cares about his employees, it's less true that he cares what they think about his politics. Moreover, we all know that he doesn't really care about the media, save for wanting to be understood by them. And the advertisers are moot too; they will flock to whatever platform they can find. Sure, there are some boycotts here and there, but it's unclear to me whether they ever made a meaningful dent in Facebook's revenue.
Then, you are left with just the regulators and the users, which ironically is where we started in the first place. Ugh, sorry. So, where does this leave us? If we make the reasonable assumption that Trump will lose this election and the Senate will flip, what do we expect Facebook to do?
This is all speculation, but in a world where Zuckerberg et al. need to be on the Democrats' good side, it's hard to see someone like Joel Kaplan having as much influence. We might also expect people like Nick Clegg to be slightly sidelined. And maybe, much to my co-host Ranjan's chagrin, we might also see the one-time Clinton administration official Sheryl Sandberg exert more of her power and be the glue between the administration and Facebook. Things will get more left-leaning, for sure. It is in fact possible to read some of the recent changes to the company's content policy, such as banning Holocaust denial, through this lens.
But what about the users? While whether Facebook's userbase in the US leans conservative or progressive is anyone's guess, there's some reasonable evidence that conservatives have a slight edge in engagement. Conservatives like Ben Shapiro routinely top the engagement numbers in terms of links shared, and anecdotally, the conservative groups also host much more heated debates. What happens when Facebook itself starts to lean more heavily towards progressive causes in its content moderation policies?
Admittedly, I don't have a good answer here. It seems plausible that Facebook will sacrifice some engagement numbers to clean up its service so as not to draw the administration's ire. A less likely option is that the company benefits from conservative outlets' increased scrutiny of the administration on its platform, while trying to confine it so that it doesn't leak into the greater Facebook.
During our discussions of the WSJ piece, my co-host Ranjan made the case that he doesn't buy that Facebook's userbase leans conservative. At Facebook's scale, they can keep doing whatever they are doing as long as the company makes its employees fabulously rich. I also acknowledge arguments like those of Benedict Evans, that huge networks like Facebook are nothing but a mirror to our society.
In general, though, I believe that companies like Facebook are much more beholden to their leaders' whims, and those leaders, in turn, to the prevailing political winds. There is a new world coming. Expect some new faces and some new content moderation challenges. There'll be new winners in the content game, who'll figure out the new rules. And a few new losers, who'll either have to adapt or fade into obscurity.
Consider that the top posts on Facebook reflect the biases in Facebook's moderation and algorithm, biases that have been shaped by conservatives whining about how they've been mistreated (something they've done for decades, and which they continue to do because it works for them).