Freya was planning a Christmas vacation to a small seaside town when she received a notification: Airbnb had not only canceled her booking but also suspended her account. When she asked why, Airbnb asked her to verify her identity, which she did by submitting her photo ID.
That afternoon, Airbnb sent her an e-mail: “After reviewing your account and the information available, we’ve determined that your account should be removed from the Airbnb platform…. We found that information in your account is linked to activity—online ads for adults [sic] services, which can include escort activity and commercial pornography—that doesn’t comply with the Safety section of our Community Standards.”
Within the sex work community, Airbnb is notorious for its aggressive surveillance practices and draconian banning policies. Officially, the platform’s policies prohibit “in-call commercial sex work” and “procuring sex work,” as well as “commercial pornography”—but there’s nothing barring people who sell (or buy) sex from using the service to take a vacation. (Banning someone for “being a sex worker” too closely resembles discrimination to make it past a legal team.)
That said, whorephobia—the systemic oppression of sex workers that is so ubiquitous its codification in something like a terms-of-service policy is redundant—is most acutely felt in the denial of resources to sex workers.
Rather than ease up throughout the pandemic, restrictions on sex workers have gotten increasingly severe, keeping pace with our growing dependence on technology. As I wrote this summer for Wired, sex workers are the canaries in the coal mine for state violence. One manifestation of this violence is the algorithmic surveillance of sex workers, which is both insidious and state-sanctioned: FOSTA/SESTA, a pair of bills designed to cut into Section 230 under the auspices of “fighting sex trafficking,” is intentionally vague. By banning any use of the Internet to “facilitate prostitution,” FOSTA effectively criminalizes the online presence of not only sex workers but also those who are in close proximity to us. From our roommates to our cab drivers, technological whorephobia threatens guilt by association.
Freya didn’t use Airbnb properties for sex work, but she had her suspicions about how Airbnb flagged her. In May of 2020, before adopting cryptocurrency, she was paying for escorting advertisements with her credit card. Data sharing between platforms could have led Airbnb to algorithmically put two and two together.
"swipe left below to view more authors"Swipe →
The House of Representatives Rules That Anti-Zionism Is Antisemitism
The House of Representatives Rules That Anti-Zionism Is Antisemitism
The “Harvard Law Review” Refused to Run This Piece About Genocide in Gaza
The “Harvard Law Review” Refused to Run This Piece About Genocide in Gaza
But then Freya’s partner, who has no history of sex work but was also listed on the booking, tried to sign into Airbnb. He, too, was asked to submit photo identification to access his account.
About five hours later, Airbnb sent him a similar response: “We’ve removed you from the Airbnb platform because your account is closely associated with a person who isn’t allowed to use Airbnb. For the safety of our community, we may remove accounts that are closely associated with people who aren’t allowed to use Airbnb.”
If you’re not a sex worker, then you may not see a problem with a private company’s restricting its user base, especially when targeting such an unsavory population. The Airbnb Community’s “Welcome guest” online message board reveals hosts’ deep fears that, in their absence, their properties will be overrun by “pimps” and “prostitutes.” Hosts exchange tips and tricks for how to identify sex workers, as well as how to politely decline our bookings without overtly violating discrimination policies.
If you’re not a sex worker, then you may also be incredulous that Airbnb would effectively issue a blanket ban on us—a practice that Airbnb has denied over the past several years. Even if you agree that we don’t deserve access to services like Airbnb, the mass suspension of sex workers’ accounts raises several concerns: How do they know who’s a sex worker? If the property isn’t being used for sex work, then why do they care? And if sex workers are indeed being targeted, then why would Airbnb deny the practice that seems to correspond directly to hosts’ values?
In March of 2016, Mistress Julie Simone tweeted that her Airbnb account had been suspended with no explanation. While Airbnb did not explicitly deny that Simone had been suspended for her history of sex work, it never offered an explanation for her account’s termination. Consequently, coverage of the incident by Splinter and the Daily Dot bears almost identical headlines, each asking: Did Airbnb ban a user just because she’s a sex worker?
In April of this year, when my DoorDash account was abruptly suspended, I spoke to the New York Post and referenced Airbnb’s notoriety among sex workers. It, like PayPal, has a long unofficial history of banning us from using its platform. Airbnb explicitly denied the practice: “This is false. Sex workers are not prohibited from using Airbnb on the sole basis of their occupation.” At the time of writing, Airbnb has not responded to multiple requests for comment or to tweets and direct messages alerting it that its listed e-mail address for press inquiries is inactive.
One reason companies like Airbnb are loath to admit to these practices is that civilians, the industry term for non–sex workers, are the control population against whom we’re identified. By enacting early incarnations of these technologies on sex workers, a largely self-reliant and insular demographic necessarily skilled in evading detection, companies can refine their tools before slicing into other, less stigmatized communities. As Simone told the Daily Dot in 2016, “A light needs to be shown that you can’t treat people like this, you can’t marginalize a group of people. What’s to stop them from saying they’re not going to rent to Muslims?”
Another reason these companies keep their cards close to their vest is that the technologies they’re test-driving are often dystopian, invasive, and illegal. In May of 2018, Aurora Snow (no relation) reported in The Daily Beast that porn star Sara Jay’s Airbnb account had been abruptly suspended. A representative told Jay that they had found “advertised sexual services” before linking to a Panama-based escort agency that had used her likeness in their advertising.
The detail of Airbnb’s response should alarm you. The agency, for which Jay did not work, used her photo without her consent. But even if Jay were an escort, what compels a multibillion-dollar company to search for its users’ photos on third-party websites? And following that collection process, why would Airbnb comb through its massive data storage for such details?
The likely answer is that Airbnb used Jay’s biometric data to refine its facial-recognition technology, a product with the potential to generate millions of dollars in revenue and a practice that earlier this year landed the company in court. (As reported by Sarah Wallace for CNBC on Tuesday, private companies have already begun using facial recognition to ban customers from using their services.) Airbnb more or less admitted this four years earlier when it told Jay it had found her picture: How else would it have been unearthed?
Airbnb’s open admission to Freya and her partner that their accounts were suspended as a result of Freya’s sex work suggests that Airbnb, which until recently denied targeting sex workers when not avoiding the question entirely, is confident in the surveillance technology it has been refining for years.
A contributing factor to sex workers’ being targeted is a misunderstanding of what “sex work” actually is. For example, when Kari Paul reported in Vice that another dominatrix had her account unceremoniously suspended a few months after Jay’s, Airbnb responded that “prostitution is not allowed.” As Paul explains in her article, “prostitution,” or full-service sex work, is distinct from professional domination, which is legal in many jurisdictions. Both fall under the umbrella of “sex work” along with stripping, phone sex, and pornography, all of which are also legal in most of the United States.
Given that sex work is often perfectly legal, Airbnb’s policing of its users’ sexual behaviors seems, well, illegal. But any company can ban any demographic as long as it isn’t a protected category, which sex work—despite being heavily gendered and racialized—is not. Under current antidiscrimination laws, the policies’ disparate impact on women and queer people could be significant in a sex- or gender-discrimination case. As civil rights attorney and sex worker rights activist Derek Demeri pointed out to me, this raises the question: “Are male sex workers being profiled in the same way?”
The difference between stigma against women and men who do sex work is simple: sexism, which structures whorephobia more than any piece of legislation. Demeri also noted that “adult services,” broadly defined, are not necessarily illegal in many or even most jurisdictions, at least in the United States. There are no federal prohibitions against sex work. And while many conflate sex work with full-service sex work, escorting is also legal in most of the country, provided the client pays for time and company rather than for a specific sex act.
When even the law is hazy on sex work’s legality, “how are we expecting these mega-institutions to enforce what is criminal activity and what is not?” Demeri asked. “I don’t think anyone will be surprised at people having sex in Airbnb rentals; that’s understood as part of their business.”
While Demeri agreed that the e-mail could be significant in proving sex or gender discrimination, they noted that “it’s just another example of, no matter how long ago you did sex work, no matter how infrequently you did it, to the world, you’re still a sex worker, forced to wear a scarlet letter.”
Lyra Foster, an Atlanta-based attorney who takes pro bono cases involving incarcerated trans people and sex worker rights, was more direct: “I would say the e-mail from Airbnb could definitely be read as a categorical ban on sex workers.” But more significantly, Foster said, “the e-mail shows the state of sex worker marginalization.”
As far as Airbnb’s allowing sex workers to use rentals for in-calls or content creation, Foster said she understood the company’s trepidation: “Criminal liability for prostitution, civil liability if a homeowner sued over their property being used as a filming set, that kind of thing. It sucks, but you at least understand the math.” Whatever logic Airbnb may cite, Foster, like Demeri, underscored that “it’s a chilling statement about the political climate for sex workers for a company to be willing to just completely refuse their money.”
It forces the question—if companies can get away with discriminating against sex workers based on data obtained using invasive surveillance technologies without our consent, then which demographic will they target next? And when they do, which services that have long turned sex workers away—such as housing, banking, and education—will follow in Airbnb’s footsteps and adopt these aggressive tactics to ice us out of the public sphere entirely?
While Foster listed various practical reasons—FOSTA/SESTA, or the right-wing human trafficking moral panic—for these policies, there’s a simple explanation for why sex workers are such an ideal test population: stigma. The general public rarely listens to sex workers. And when we are heard, we are typically not believed. By the time others become affected by technologies that have since silenced us, it’ll be too late for you.
In January 2020, Gustavo Turner reported in XBIZ that Airbnb owns AI technology designed to identify and discriminate against sex workers specifically—as well as people with mental illnesses. The technology rates users to determine their “trustworthiness score” and disposes of low-trust, high-risk accounts accordingly. The patent explicitly cites “involvement in sex work” and “involvement in pornography” as traits being targeted, as well as drug use, a criminal record, and “negative” personality traits like “badness” and “neuroticism.”
And yet the typical response to sex workers’ stories of deplatforming is skepticism. Even when the companies themselves confirm our suspicions, there is little if any uproar. This is by design.
It may seem like I’m belaboring this point, but experience has shown me that, despite our expertise, the general public does its best to tune out our voices. Airbnb knows it has a higher “trustworthiness score” with the general public than any sex worker, as journalists also know, and as do you.
As with sex work, “psychopathy” is not a trait the average Airbnb host wants to invite into their home, which is also why the general public will tolerate the targeting of people with heavily stigmatized mental illnesses like substance use disorder and antisocial personality disorder. “Illicit drugs” are already banned by Airbnb, and this technology offers a way to categorically ban people who use drugs.
In fact, it already has been deployed for that very purpose: As Gary Leff reported in September, Airbnb has been banning users for years-old offenses, including not only drug possession but also civil disobedience. In other words, Airbnb has banned people for exercising their right to protest, and some governments, as reported in AP News on Wednesday, have already started weaponizing similar mass-surveillance technology to stifle free speech.
Sex workers have long warned that surveillance technologies used on us will soon apply to political dissidents, if they haven’t already been targeted. In 2021, the independent sex-work research collective Hacking//Hustling published the report “Posting into the Void,” which found that sex workers and activists suffer a significant negative impact from targeted content moderation, doubly so if they identify as both. Hacking//Hustling’s focus on the 2020 uprisings protesting police violence against Black people dovetails with other reports that social media algorithms disproportionately flag people of color, queer people, and fat people for nebulous terms-of-service violations.
The danger of this mass surveillance isn’t simply that platforms can invade our privacy. Rather, as these technologies become more sophisticated, they become more valuable. Airbnb is developing a product that essentially assigns each of us a social credit score. If and when Airbnb sells that product—or suffers a data breach—the repercussions will echo far beyond banning a dominatrix from vacationing in a farmhouse.