Why the Internet Hates Gay People

A conversation with Alexander Monea about his recent book on the history of search engines, content moderation, AI, and the ways they form biases against queerness.


For kids whose formative years fell in the late 2000s and early 2010s, when home computers began to give way to widely available portable screens, the practice of covertly spelunking into salacious corners of the Web—often while our parents weren’t home—was something like a rite of passage. Chat rooms like ChatRoulette or Omegle, the old Tumblr, and, of course, porn sites offered a window into the most mysterious aspect of adult life: sex. In my case, poking around the Internet helped me realize that the range of possible self-expression and sexual desire was much bigger than what I was taught to believe in the classrooms and hallways of my suburban Ohio school.

But exploring one’s sexuality online can just as easily reinforce straight desires and suppress anything that deviates from them. In The Digital Closet, Alexander Monea, associate professor of English and cultural studies at George Mason University, explains how the infrastructure of the digital world was built by straight people and remains regulated for the satisfaction of heteronormative desires.

Monea points the reader to a glaring and harmful contradiction at the heart of modern content moderation practices: straight content, from porn to suggestive Instagram influencer photos, flows in flagrant display across nearly all search engines and social media platforms, while the vast majority of content created by members of the LGBTQIA+ community, from sex educators to trans models, is frequently over-moderated or banned by human or algorithmic content moderators.

The result, Monea argues, is an Internet defined by straight monopolies on sexual speech. The LGBTQIA+ community bears the brunt of this monopoly system, but it ultimately harms straight Internet users, too. As algorithms feed viewers exaggerated straight fantasies, they also rob them of the pleasure and possibility of sexual imagination, exploration, and encounter. This can be particularly harmful to the sexual imagination of straight young men, whose ideas about sex and desire are often shaped by troubling pornographic tropes about class and race.

Freeing our digital worlds from the logic of what Monea calls “straight code” will require a robust investigation into how content moderators, moderation algorithms, and artificial intelligence harbor biases. It will require regulatory probes into how platforms like Google and payment processing companies like Visa surveil and police their queer users. Most challenging of all, perhaps, any path forward will require a renewed and widespread commitment to the free expression of ideas about the self and sex, and a belief in the possibility that people can—and should—find purpose and form meaningful communities online.

I spoke to Monea about the origins of the straight Internet, the shape it takes today, and how we can begin to imagine a digital world that welcomes free expression, sexual education, and meaningful exploration. The following conversation has been condensed and edited for clarity.

—Jacob Bruggeman

Jacob Bruggeman: Your history of the Internet in The Digital Closet dates to the mid-1990s, when a moral panic about porn swept the US. You describe this panic as a myth, the “Pandora’s box of porn,” that has only grown in influence since the 1990s. What caused this panic then? And how does its legacy continue to shape public debate today?

Alexander Monea: Each time a new medium comes out—for example lithography, photography, film, home video, the Internet—pornographers are among the first to experiment with it (and in some cases even drive the development of the new medium). Where they are successful, a moral panic tends to ensue over the new forms that pornography takes and the ease with which it gets distributed. I start with this because I think the historical perspective is important. It lets you know that what happened with Internet pornography was not anomalous, but instead just the latest iteration of a centuries-long cycle of pornographic ebbs and flows.

In the mid-1990s, more and more American homes were getting connected to the Internet—I, for one, remember mountains of floppy disks and CD-ROMs preloaded with AOL’s software and promotionally priced plans distributed literally everywhere in my hometown: in theaters, at the grocery store, and stuffed in every mailbox multiple times a month.

Similar to the moral panic over home video brought about by VCRs and camcorders, reactionaries [and conservatives] were horrified that audio-visual pornography would be able to penetrate private homes. Anyone and everyone could now create, distribute, and access pornography. But whereas parents could still ostensibly limit their children’s access to videotapes and premium subscription channels on TV by hiding tapes or blocking channels, the Internet offered few filters to facilitate parental control. Letting your child online meant potentially exposing them to pornography, and going online was increasingly becoming a necessity for everyday life.

[The section “Pandora’s box of porn”] essentially tells a story in which Internet connectivity unwittingly released an unending torrent of pornography that cannot be and has not been contained. In the wake of this pornographic deluge, we all collectively threw up our hands and gave up on ever limiting people’s access to pornography again.

This narrative is not only untrue, but it also tends to obscure nearly three decades of sustained and successful political mobilization by anti-pornography groups working to limit the availability of pornography online. These groups have gotten pornography completely censored in schools, libraries, and on public WiFi in places like Starbucks and McDonald’s. They have pushed social media companies to adopt overbroad content moderation policies and algorithms. They’ve successfully lobbied local, state, and federal governments to regulate pornography and, in many states, to declare it a “public health emergency”—and they’ve managed to do all this under the cover of the Pandora’s box of porn narrative, so no one who might resist or counterprotest was made aware of what they were doing.

JB: One consequence of this moral panic was the conflation of pornography and what you call “sexual speech.” What was lost when these categories were collapsed into one another, and who bore the burden?

AM: I use the term “sexual speech” as an umbrella term to capture all the different ways we communicate about sex and sexuality, or even just communicate our sexuality itself, regardless of the medium or form that communication takes. Sexual speech isn’t just quotidian discussions of sex acts or sexual preferences, though it is that too. It also includes things like sex education, LGBTQ+ community building, LGBTQ+ activism, and even discussions of pornography. And it can take many forms ranging from Tweets to novels to GIFs to TikToks.

For a lot of reasons that I dig into in the book, sexual speech tends to get conflated with pornography, and thus it gets unduly censored by content moderation policies and algorithms across the Internet. When this happens, we lose one of the most utopian possibilities afforded by the Internet: the ability to learn about ourselves and each other, to make connections and build community, and maybe even to feel at home.

This burden is inordinately borne by those with the least power. Thus, while it certainly disadvantages the LGBTQ+ community as a whole, it particularly targets those whose identities and communications disobey homonormative social scripts, as well as those who might have intersectional identities that are multiply marginalized.

For example, Pete Buttigieg, a married, white, cisgender gay man who served in the military, worked at McKinsey to help destroy the planet and the working class, and posts some of the most milquetoast content available on the Internet, suffers minimally under these conditions. He lives a privileged life, and there is little chance that overbroad censorship will prevent him from earning an income, participating in society, or finding community. Contrast this, for example, with a Black, nonbinary sex worker, who may get denied online financial services and thus have to engage exclusively in in-person, cash-based sex work; who might get deplatformed and lose the ability to solicit and vet clients online; who might have to work the streets in an overpoliced and impoverished neighborhood; and who has to take what clients they can get. For this person, free sexual speech has very material implications. It can be a matter of life or death. Many LGBTQ+ readers will fall between these extremes in terms of the burden they bear from undue censorship online, and as this censorship continues to ramp up, their burden will likely increase.

JB: One legacy of the “porn myth” is an egregious contradiction in legislation and corporate content moderation: Hetero porn is permitted to circulate freely, while LGBTQIA+ content and sex workers are routinely banned or blocked from platforms and even prosecuted. What are the major effects of this digital segregation of LGBTQIA+ content?

AM: We see this repeatedly across social media: Playboy can maintain an Instagram account, but sex workers can’t. Thin women can wear bikinis in photos, but fat women can’t. Trojan condoms can access financial services, but queer and feminist condom companies can’t. The primary effects of this are to constrain our options as consumers (of both content and commodities) and to further marginalize LGBTQIA+ people in our society. This latter impact goes beyond simply denying them representation online, and includes materially impacting their ability to feed, clothe, and house themselves. In the United States, this has dire consequences, as LGBTQIA+ and sex worker communities are already systematically marginalized and we offer scant public services or social welfare to prevent them from immiseration and death.

JB: Straight porn viewers hardly benefit from this segregation of content. How does it negatively affect them?

AM: I think one of the key points of the book is that while the suffering this causes is certainly concentrated most heavily amongst the LGBTQ+ community (and even there, unevenly distributed), the burden is borne by everyone. Being able to communicate freely about sex and sexuality is a prerequisite for good sex, something I think we all want and have a collective stake in. Talking about sex and sexuality, exploring desire, building community, and learning how to have safe, good sex are all part of sexual speech and I think have a broad appeal.

I think this connects rather well with another project I’ve worked on recently—a book I coauthored titled The Prison House of the Circuit. This new age of unprecedented connectivity, circulation, and communication tends to obfuscate new and nefarious mechanisms of control. In this instance, we can see straight Internet users presented with a seemingly limitless variety and amount of content, while surreptitiously being constrained into narrow channels or flows that always funnel them back to socially normative content—or, in other words, content that abides by our puritanical and capitalistic social mores.

JB: How has the anti-porn movement reshaped the political economy of adult entertainment? Who’s benefited and who’s been harmed?

AM: The short answer is that they’ve further concentrated control of pornography into the hands of mainstream heteroporn producers (and similar arguments could be made about sex toys, sex education, and a range of other industries) at the expense of niche, low-budget, DIY producers. I think this happens for a number of reasons.

First, the pressure that these groups put on advertisers to demand puritanical social media platforms, on legislators and regulators to police obscenity online, and more directly on social media companies themselves has resulted in continuous crackdowns on online pornography. In the wake of these crackdowns, only the pornographers with the most resources can defend themselves and maintain their distribution networks online, and that can lead to the heteroporn industry gaining a larger market share.

Second, as you increasingly consolidate pornography into small subsections of the Internet, you make it easier for mainstream heteroporn producers to engage in tactics like search engine optimization to draw larger audiences. For instance, if only a handful of search terms lead to actual pornography, large, vertically integrated pornography companies can optimize for those few terms and capture most of the resulting Web traffic. If instead any search term could potentially lead to pornography, there would be too many avenues of discovery for a small set of producers to dominate.

Lastly, one of the big results of constraining pornographic content to small corners of the Internet is its concentration on “tube” sites like PornHub, xHamster, etc. While many of these sites offer the capacity to share queer content, the metadata structure they use to make content discoverable does not prioritize non-normative content. Even when non-normative porn is uploaded to a tube site, it may be difficult to locate, and this lack of views will often signal to content producers that they should make less of this non-normative content.

JB: One result of biased content moderation is what you call the “overblocking” of users and content creators from the LGBTQIA+ community. Sex education materials, art, and everyday users are blocked because of the biases in the way image recognition algorithms and human reviewers “see” and interpret concepts like nudity. What are the consequences of overblocking?

AM: When social media platforms try to identify porn, they end up with a lot of false positives: pieces of content that their algorithms falsely identify as pornographic and end up blocking. I call this process overblocking, and it leads to the censorship of art, literature, sex education, LGBTQ+ community building, LGBTQ+ advocacy and organizing, and other discourse that talks about sex without being pornography. The consequences of this are vast. Queer Internet users and content creators might feel pressure to self-censor and/or to present as more homonormative online. Queer youth might have more difficulty finding community online and exploring their sexuality in a safe space before coming out. And we all, queer or not, might have less interesting art and literature, less understanding of sex and sexuality, and, in all honesty, we might be stuck with a much more boring Internet.

JB: You argue that a “straight code” structures the digital world. How are concerned users and the LGBTQIA+ community responding to phenomena like overblocking? Are there promising regulatory solutions on the horizon?

AM: Unfortunately, I don’t see any promising legislative or regulatory solutions on the horizon, and in fact bills like the EARN IT Act, which have been proposed, would greatly worsen the censorship of LGBTQ+ content online. I also don’t see asking these companies to self-regulate as a long-term solution. They are nearly unanimously dependent on advertising dollars, and until advertisers stop associating pornography with a danger to their brand image, they will continue to put irresistible pressure on companies to be overzealous in their attempts to block it. Further, when these companies have been sensitive to consumer pressure and the negative public relations that come from censoring queer content, they tend to describe it as an isolated event, a “bug” in the system. In nearly every case I have studied, these promised fixes never come.

I’ve only just started studying user responses, but I think a lot of them fall into what I’m describing as collaboratively executed trial-and-error experiments to produce algorithmic remedies. Users experiment with what platforms will and won’t censor, collaboratively develop best practices, and engage myriad techniques to evade censorship. Perhaps the most prominent of these is what’s called “algospeak,” in which users intentionally misspell, mispronounce, or substitute words that they think trigger content moderation algorithms. Some of these may be familiar to readers, like using the word “seggs” for “sex,” “le$bean” or “le dollar bean” for “lesbian,” or “leg booty” for “LGBTQ community.” At this stage, it’s still unclear to me how successful these practices are and whether they encourage self-censorship in problematic ways.

I do think short-term goals like advocating for new policies at these companies, pushing for third-party watchdogs and algorithmic audits, demanding commitments to protecting queer users, and pushing back on regulators and legislators are worth pursuing, but in my experience, they are like bailing water out of a sinking ship. None of these actions will ensure that we get the Internet we deserve.

For that, we need to think bigger. We need to rehab our democracy so that legislatures and regulators reflect the popular will. We need to initiate much more robust conversations about sex and sexuality. We need to codify LGBTQ+ protections not only into our Internet infrastructures, but into our society. We need social welfare so that people’s material security is not tethered to the whims of an algorithm. We need to end billionaires. We need public ownership of the Internet and its platforms. We need an exit from capitalism.
