Lizzie O'Shea is a human rights lawyer. She is author of "Future Histories: What Ada Lovelace, Tom Paine, and the Paris Commune Can Teach Us about Digital Technology."
When the U.S. Federal Trade Commission sued Facebook for "illegally maintaining its personal social networking monopoly through a yearslong course of anticompetitive conduct," it signaled a potentially new direction in antitrust law that is much better suited to the digital age.
A key aspect of the complaint centers on Facebook's conduct in acquiring competitors, namely Instagram and WhatsApp. Both these platforms posed competitive risks to Facebook, a fact recognized and documented in emails sent by CEO Mark Zuckerberg, referenced in the suit.
Ultimately the company adopted a philosophy that "it is better to buy than compete." By purchasing popular platforms, Facebook was able to neutralize the competitive threat and buy time before another independent startup could gain the scale necessary to meaningfully challenge its dominance in the social networking space. By doing this, Facebook also avoided a situation where one of its big-tech rivals, like Apple or Google, could make the purchase instead.
That business strategy, says the FTC, is anti-competitive, and it is seeking to reverse Facebook's acquisitions of these companies. This probably explains why Facebook has recently sought to integrate the messaging services on Instagram and Facebook, with WhatsApp reportedly soon to follow. This technical change has the potential to make any divestiture practically difficult.
Facebook's business strategy might be ruthless but the question remains: what is the harm for consumers? Consumer welfare is a foundational objective of U.S. competition law, and it has traditionally been measured using price. The worry about monopolies is that once the competition is gone, a dominant seller can jack up prices and the buyer is at their mercy.
But Facebook did not neutralize competitors that offered a cheaper service. Facebook does not even charge users a fee. If consumer welfare is about protection from price-gouging monopolies, it is hard to see the harm caused by this business model. This disconnect -- between an outdated understanding of harm and modern platform monopolies -- has been a significant source of paralysis when it comes to regulatory action against companies like Facebook.
The answer lies in that omnipotent, metonymic idea of privacy. The FTC says that once Facebook entrenched itself as the dominant player in its market, dwindling competition degraded the quality of its service, in the form of declining privacy protections. So even if it does not cost anything to use Facebook, the quality of the service it provides has gotten worse over time as its dominance increased. There are other harms identified in the suit, including the stifling of innovation and improvement, but the privacy issue is the most profound as it strikes at the heart of the political economy of the web.
Absent competition, Facebook has not been able to resist the temptation to build its service around an insatiable desire to accumulate data. The company's profits flow from its capacity to generate highly specific and disaggregated audiences for advertising. Without competitors that might have pursued profitability in other ways, the company was able to dismantle privacy protections without attracting controversy, and without the risk of driving users away.
That gutting of privacy protections has had an enormous impact on not just users of Facebook, but pretty much everyone. The business of Facebook requires that people spend as much time as possible on their devices, sharing their desires and being nudged toward various products. Curation decisions made by Facebook have been made in service not of users but of advertisers, motivated by the need to attract revenue. This has led to a culture of online life defined by filter bubbles and extremism, as the company has sought to prioritize engagement and data extractivism above other considerations.
The concept of privacy, therefore, is not just about the right to keep things secret, or the right to be left alone. It is also about the right to engage in collective spaces without being watched, tracked and managed. It is about the right to create a sense of self that is not defined by surveillance capitalism. The harm of monopoly in social networking is not the price you pay in dollars and cents; it is the cost of living our online lives with diminished autonomy, subject to the whim of a coterie of Silicon Valley elites. Facebook is in many ways more like a government than a company, an observation Zuckerberg has made himself. Unlike governments, however, those who run the company are unelected.
This has wider implications. Since Facebook is motivated by growing its user base and keeping those users on its platform, it is far less concerned with the thorny questions posed by its business model. Put simply: Facebook does not take its moderation responsibilities seriously.
In September, for example, BuzzFeed published parts of a memo written by a former staffer, Sophie Zhang, who was responsible for tackling disinformation in elections around the world. According to Zhang, Facebook ignored or was slow to act on evidence that fake accounts and disinformation campaigns were undermining elections and political affairs in various parts of the world. But why would it act, if such users of the platform were a source of advertising revenue, if dealing with the problem properly carried costs, and if there was very little upside for its bottom line?
This episode raises questions not only about the immense challenges of dealing with complex and sometimes culturally specific content moderation questions, but also whether Facebook can -- and ever would -- invest in trying to grapple with them meaningfully.
The FTC still faces significant challenges in its suit against Facebook, and this action will not solve all the problems posed by the company. But in many ways, the framing of this case suggests that regulators may finally be starting to understand Facebook's business model, and the inherent and profound dangers it creates.