A lot has been made of the befuddled way in which senators tried to confront Mark Zuckerberg last week. Orrin Hatch seemed surprised that you can run a profitable tech business without charging customers for the service. (Zuckerberg’s deadpan reply: “Senator, we run ads.”) Brian Schatz proved he didn’t have even a basic understanding of WhatsApp. Lindsey Graham gamely pursued questions about monopoly practices, but he got lost trying to figure out exactly who and what Facebook is monopolizing.
Everybody from Silicon Valley snobs to frustrated watchdogs rolled their emoji eyes at the spectacle. Those who had hoped to see a perp walk instead watched Zuckerberg stroll to the bank: Facebook’s share price climbed 4.5 percent the next day. The strongest regulatory threat to emerge was the happy CEO’s offer to write some new rules himself.
The House did far better in its questioning the next day. But, to be fair to the senators, the most urgent questions about Facebook’s business model are complicated. Most of us don’t understand them, and the company has spent gobs of money to make sure both users and their elected officials remain foggy on the details. Zuckerberg actually pointed to the core question himself, even as he avoided meaningful answers: Is the company to be understood as a neutral platform—as it would prefer—or something more akin to a media conglomerate, or an Internet service provider, or some kind of forum for electioneering, or… well, any of the many functions Facebook serves, most of which we already agree need some form of public oversight?
I wonder, though, if even that question still skirts around the matter. I fear that, while we rightly worry about privacy and political misinformation and unaccountable power, we are rushing past a still more fundamental concern. The first horror of Facebook is that it reveals just how deeply we’ve let capitalism sink its teeth into our lives; it’s the vampire we’ve invited across the threshold of our homes. Once inside, we’re powerless against the greedy monster: Even our most intimate thoughts and relationships have become commodities.
To be clear, the peripheral questions still matter: Facebook needs to be regulated. The first proof of that fact is the sanitizing jargon we use to describe the company’s primary business. We talk about “targeted advertising,” “user segmentation,” “data collection.” What is it we’re trying not to say? The correct word is surveillance. And even if we’ve passively agreed to be surveilled by logging on and taking advantage of the unquestionably powerful tool we get in return, it’s a stretch to call that truly informed consent for most of us—even for much of the US Senate, as it turns out.
Moreover, whatever the company’s intent may have been in creating its platform, it now has 2.2 billion global users—an accomplishment of its relentlessly growth-focused strategy—and as a result, it enjoys unique power to silence or amplify everything from political speech to commerce. How it exercises that power remains a secret. The need for close regulation is plain.