Revenge Porn Is Malicious and Reprehensible. But Should It Be a Crime?

The line between respecting civil liberties and protecting victims is anything but clear.

The purpose of Pink Meth, a notorious “revenge porn” site, is not just to share photographs of naked women that have been obtained without their consent. It’s to destroy their lives and reputations. The pictures, submitted by anonymous users, typically include a woman’s full name, e-mail address and screen shots of her Facebook profile. Commenters crowdsource the contact information for the woman’s family, friends, bosses and colleagues, then revel in sending the pictures to them. If she’s a member of a church or synagogue, the pictures will go out to her clergy, as well as others in the congregation. The users plot to make sure that the photos appear at the top of any Google searches of the woman’s name. “I’m hoping this goes on for months and years and people keep on sending the slut pictures to everyone she knows again and again,” wrote a user on one recent thread. “It would be so awesome if someone had set it up to record her on Skype as she was having all this happen and see her cry.”

Right now, in many states, all of this is legal. In much of the country, a person who submits photographs to a site like Pink Meth can’t be prosecuted unless he or she obtained the pictures illegally. It’s not clear who owns Pink Meth—its founder goes by a pseudonym, Olaudah Equiano, and the site operates on the anonymizing network Tor—but even if his identity could be discovered, he’s largely shielded by Section 230 of the 1996 Communications Decency Act (CDA), which immunizes online services from most responsibility for user-generated content. A single person who obsessively contacts a woman’s friends and associates with humiliating photos could be liable for harassment, but according to Danielle Citron, a law professor at the University of Maryland and author of the new book Hate Crimes in Cyberspace, when the abuse is diffused among a mob, it’s very difficult to hold any one member accountable.

A growing number of activists, attorneys and lawmakers are trying to change this. Since 2013, thirteen states have passed laws criminalizing revenge porn. (Lawyers and activists tend to prefer the term “involuntary” or “nonconsensual” porn, because it also covers images that have been stolen by someone without a personal connection.) Bills have been introduced in sixteen other states and the District of Columbia, and Representative Jackie Speier is working on a federal one. Victims of involuntary porn have established groups like the Cyber Civil Rights Initiative and Army of She. The recent high-profile release of hacked nude photographs of celebrities like Jennifer Lawrence has given these efforts a shot of publicity and, perhaps, momentum.

At first glance, it can be hard to imagine any decent person objecting to these laws. State-level efforts, which target people who share nude images without the pictured person’s consent, vary considerably. Most make the crime a misdemeanor, with prison sentences of up to a year, though in Arizona it’s a felony. Georgia’s law includes the “depiction of covered male genitals in a discernibly turgid state,” while a bill that passed Michigan’s State Senate applies to sexually explicit drawings as well as photographs. The ACLU objects to most of this legislation, arguing that it is dangerous to criminalize the display of material that’s not obscene and was legally obtained.

On September 23, the ACLU filed a lawsuit against Arizona’s revenge-porn statute, arguing that, under the law, a newsstand owner selling magazines with photos of naked prisoners at Abu Ghraib, or a professor showing his class the iconic Vietnam War photograph “Napalm Girl,” could be charged with a felony. These concerns are actually shared by some of the very legal activists who have dedicated themselves to criminalizing revenge porn. Citron says of Arizona’s statute: “It’s a terrible law.”

Those devoted to electronic freedom, meanwhile, fiercely object to any efforts to change Section 230 of the CDA, a provision they credit with allowing the development of an open Internet. And no one has figured out how to hold individual members of a vengeful cyber-mob responsible for their incremental contributions to harassment campaigns, particularly when those individual acts aren’t enough to qualify as harassment on their own. “It might be a situation where the law does not provide a perfect remedy,” says David Greene, senior staff attorney at the Electronic Frontier Foundation. “When there are opposing civil liberties, that’s not an uncommon situation. I know that’s not a satisfying answer for anybody, but our law frequently doesn’t have remedies for people who are injured.”

For those involved in the campaign against revenge porn, that’s not just unsatisfying—it’s intolerable. University of Miami law professor Mary Anne Franks, vice president of the Cyber Civil Rights Initiative, says that her group hears from women afraid to leave abusive relationships because their partners threaten to release nude photos, and from others whose exes use such threats in custody negotiations. Some victims—including the founder of the Cyber Civil Rights Initiative, Holly Jacobs—have had to change their names. “You have to deter this behavior from the beginning,” says Franks. “You have to make people afraid of doing this.” The question is whether there’s a way to crack down on online sexual sadism without trampling on free speech or a free Internet.

* * *

One oft-discussed solution to the legal dilemma of nonconsensual porn lies in the use of copyright law. When the involuntarily released nude pictures are selfies, the person who took them can register as the copyright owner, then demand that sites remove them or be in violation of intellectual-property law.

But if copyright is the best defense that victims and lawyers have, it simply goes to show how utterly unprotected they are. “The people who claim that copyright is a perfectly adequate remedy for the victims of revenge porn are making an argument that I think is ludicrous on its face, both in terms of the legal remedies available under the copyright statute, and in terms of the fact that these websites don’t care whether they’re infringing copyright or not,” says Mitchell J. Matorin, an attorney who represents revenge-porn victims.

For one thing, copyright doesn’t help those whose pictures were taken by others. For another, the process of getting Google to remove links to copyrighted material can be drawn out and unpredictable, Matorin says. “If the photographs are selfies, and one approaches Google and goes through their process, then, if you’re lucky, several weeks or a couple of months later, Google will remove the links that you provided. It takes time, and Google sometimes rejects takedown requests for reasons that they don’t disclose.”

One could, of course, sue the websites publishing the copyrighted material, but even if the people behind them can be tracked down—a big if—proving monetary damages from copyright infringement can be difficult. “These photographs are intended to be private, so there’s no market for the photographs that the copyright owner is being cheated out of,” Matorin says. “Even if you can find somebody to sue, at the end of the day, it’s difficult to prove actual financial damages, so the websites don’t care.”

That is one reason why some activists and attorneys insist that criminalization must be part of the solution. “The criminalization part of it, we think, is important—because honestly, this is one of those types of conduct that, once it happens, it’s very hard to undo the damage,” says Franks. “By the time the image makes it out there, even effective copyright remedies only mean that the site that’s chosen to honor your takedown notice will take it down.”

But drafting criminal laws that don’t run afoul of the First Amendment is complicated. Franks has consulted on ten of the state laws passed since 2013 and is also working with Speier’s office on the federal bill. Yet even she says that some of the state laws now in effect are problematic. Earlier this year, for example, she sent a long memo to Kevin Tanner, a Republican in the Georgia Legislature, enumerating the problems with his draft bill—among them, that the bill had no public-interest exception, meaning that it could, for example, criminalize the behavior of Sydney Leathers, the woman who publicized lewd pictures she’d received from former Congressman Anthony Weiner.

The Georgia bill, Franks says, was “way overbroad when it comes to constitutional issues—the definition of nudity was so broad that you could go to jail for publicizing a picture of a woman in a wet bathing suit.” Nevertheless, it ended up becoming state law.

Even worse, say civil libertarians, is Arizona’s law, which turns sharing a nude image into a felony punishable by almost four years in prison—whether or not there was any malicious intent. “[A]ny time someone shares a lawfully obtained, consensually viewed, or consensually possessed image without express permission, she or he may face criminal liability under the terms of the Act,” says the ACLU’s suit, filed on behalf of a coalition of newspapers, bookstores, photographers, publishers and librarians.

“The combination of the broad language and the extreme criminal penalty makes Arizona’s [law] the most troubling,” says Lee Rowland, an attorney with the ACLU’s Speech, Privacy and Technology Project. Those problems are compounded by the fact that Joe Arpaio, the notorious sheriff of Maricopa County, “has some history of investigating and going after media who print clearly protected speech,” creating the risk that the law could be used to crack down on the press.

Yet if Arizona’s law is uniquely bad, Rowland adds, none of the state laws are without hazards. “Philosophically, as a civil libertarian, I’m always hesitant that a new nonviolent crime is the answer to any of society’s many problems. I think the criminal law is rarely the best answer to grappling with large structural questions of gender, of equality, and particularly so where an issue touches on freedom of speech.”

Activists against involuntary pornography, of course, profoundly disagree. It’s entirely possible to draft criminal laws that are narrow enough to avoid constitutional land mines, they argue—and given how fully victims’ lives are being destroyed, such laws are desperately needed. “To argue that our society should not criminalize certain behavior because too many other kinds of behavior are already criminalized is at best a non sequitur,” write Franks and Citron in a recent jointly authored paper, “Criminalizing Revenge Porn.”

* * *

At the same time, there is one approach to the problem of involuntary porn that doesn’t raise any constitutional or criminal issues at all. Nothing in the Constitution gives websites a shield from being sued that real-world publishers lack. That protection comes entirely from Section 230 of the CDA, which could be amended, at least in theory, so that involuntary-porn sites can be sued for harassment, invasion of privacy, defamation and other civil offenses.

“The real problem that the victims of this practice have is the fact that we have intermediary liability that nobody is willing to take on,” says attorney Marc Randazza, referring to the protections that Section 230 offers to website publishers for user-generated material. Randazza occupies an odd niche in the debate over involuntary porn. A self-described “socialist libertarian,” he has represented a number of porn companies, and he’s a loud and often vulgar critic of attempts to criminalize revenge porn. At the same time, he has also represented at least twenty victims of involuntary pornography, and says the current law is inadequate to help them. “Section 230, it’s not a constitutional right. It’s not a civil-liberties issue. Section 230 is a deviation from the norm in a statute—and when people abuse a statute, I think it’s time to change that statute.”

He offers a characteristically colorful analogy: “Let’s say I owned a store that had a wall outside, and somebody came by and put sexually compromising photographs of [you] all over that wall. And you call me up and say, ‘Hey, Marc, would you mind taking those pictures of me down?’ And I say, ‘Fuck you, I don’t have to! And by the way, since I have those pictures, people tend to come by my store a little more, so I’m making money off of it.’ In the real world, that would be my responsibility. But in the online world, because of how we’ve interpreted Section 230, it’s tough shit for you.”

Activist scholars like Franks and Citron—people with whom Randazza feuds when it comes to criminal laws—support amending Section 230. Matorin insists that nothing would do more to help his clients: “My view is that if you amend the CDA to remove the immunity that these website operators cower under, you will go a large way toward eliminating the problem.” The ACLU’s Rowland isn’t crazy about the idea but admits it doesn’t raise First Amendment issues. “It’s not constitutionally out-of-bounds,” she says. “We could choose to regulate publishers online the same way we regulate publishers in the bricks-and-mortar world.”

Any attempt to do this, of course, would infuriate the tech industry, as well as partisans of Internet freedom. “It’s not really credible to consider that Google or Facebook will vet speech that is posted on their websites before it’s posted,” Rowland says. But amending Section 230 wouldn’t necessarily require that. As she points out, the CDA already allows for liability when the websites themselves provide the content. You can’t sue Yahoo if someone libels you on a Yahoo message board, but you can if one of its journalists libels you in a Yahoo News story. The question is what it means to be a content provider as opposed to simply a platform. Right now, Pink Meth and similar sites like MyEx.com are considered, for legal purposes, mere platforms. “Certainly, it would be constitutional to shift the line a little bit so that websites that were directly encouraging and requesting the content fell on the content-provider side of that line,” Rowland says.

But constitutional is not the same as politically feasible. Silicon Valley would likely use its political muscle to block any attempt to amend Section 230, which the Electronic Frontier Foundation calls “the most important law protecting Internet speech.” The EFF’s site has a section devoted to celebrating the law, with representatives from online powerhouses like Yelp, WordPress and Wikipedia attesting to its value and arguing against any attempt to narrow it.

“Giving a damn about the people who are being hurt by this practice is just not something that is within their own internal ethics,” Randazza says of the big Internet companies. “Eventually, we’re going to have some senator’s daughter wind up on a revenge-porn site, and then maybe somebody will be taken to task. But until somebody important enough winds up a victim, nothing is really going to happen.”
