“In Techno Parentis”: Who Should Regulate the Online Lives of Teenagers?

With TikTok, Instagram, and other platforms using algorithms to send teen viewers addictive, dangerous content—and reaping immense profits—self-regulation has clearly failed.

Zephyr Teachout

April 2, 2024

The TikTok logo is reflected in the eye of a 13-year-old boy as he looks at a computer screen in Bath, England. (Matt Cardy / Getty Images)

When Josie (a minor whose name we have changed to protect her privacy) was a 12-year-old fifth-grader, passionate about sports, her parents gave her a smartphone. She immediately started searching for sports-related content on TikTok. But TikTok’s feed is targeted based on a user’s data profile: It started sending Josie disordered-eating content and videos about how to be anorexic. The content engaged her and kept her on the app; it also isolated her from friends, family, and sports. Just after she turned 13, she was hospitalized with severe malnutrition and almost died. She spent 16 days in intensive care.

Josie’s story is not unique. The same thing has happened to millions of kids: The social media company profiles them, identifies the content most likely to keep them online, and serves it up. Research from the Center for Countering Digital Hate found that TikTok’s recommendation algorithm would suggest eating-disorder and self-harm content to new teen accounts within minutes of signing up. One account saw suicide content within 2.6 minutes; another saw eating-disorder content within eight minutes. The content did not appear on most accounts so quickly, and not all of it was eating-disorder-themed, but the premise holds: Platforms are targeting children with content that addicts them.

There has been, as Jonathan Haidt argues persuasively in his recent book, The Anxious Generation, a sharp spike in teen depression, anxiety, and mental health disorders since 2010 that is only explainable by the widespread adoption of the smartphone. Both boys and girls have been impacted, although the impact on girls is more direct and pronounced. A CDC survey from last year showed that a third of teen girls had seriously considered suicide—a nearly threefold increase from 2011. Most American teen girls (57 percent) now report that they experience persistent sadness or hopelessness, a significant increase from 36 percent in 2011.

This is not simply a straightforward health crisis, although it is certainly that. Impacts like loneliness and discomfort with risk-taking ripple beyond measurable suffering into reshaping what it means to be an adult, and a citizen, in a democratic society. As Haidt points out, moreover, social media is not like smoking or drunk driving. The cause of the mental health crisis is both direct (more time online measurably decreases well-being) and indirect (the more time your friends spend online, the less opportunity any given teenager has to build meaningful offline relationships).

Dozens of states have now taken up this issue. The policy approaches fall into four different buckets.

First, there are laws that attempt to put platforms in a quasi-parental, or fiduciary, relationship to children by requiring TikTok, Instagram, and YouTube to take on a “duty of care” to minors. This approach, first adopted in the United Kingdom in 2021, arises in part from the horrifying stories told by whistleblowers. Former Facebook employees Frances Haugen and Arturo Béjar both described repeated efforts to get Meta’s corporate management to respond to the company’s own data on teenage harm with increased safety measures. A duty of care would force companies to put children’s welfare ahead of simply maximizing profits, and it would empower engineers in their conflicts with top management. It is unclear how far a duty of care would extend, but at least it would be something: a binding legal mandate to consider interests other than the bottom line.

Second, and increasingly popular, is the kind of law I worked on while serving as senior counsel for economic justice to New York State Attorney General Letitia James: banning particular functions. Her bill, sponsored by state Senator Andrew Gounardes and Assemblywoman Nily Rozic, is currently embroiled in the New York legislature’s budget negotiations. Governor Kathy Hochul strongly supports the law, but Big Tech is spending hand over fist to get it removed from the budget.

The bill would prohibit social media platforms from using “addictive feeds” without parental consent. Addictive feeds are the algorithms that unilaterally select what content children see, pushing whatever is likely to keep them online based on their intimate data. The bill would put teenagers back in charge of organizing their own pages. It also includes provisions limiting notifications (banning them at night, for example) and the monetization of children’s data. Other “function-based” approaches include bans on using any kind of geolocation data, and could extend to banning specific technical features, like infinite scroll or the “speed filter” that tells children how fast they are moving (a function that led to several teenage deaths and a significant lawsuit). The Florida age-limit bill, passed last week, is a form of function-based ban: It limits all social media functions for children under 14.

Third, there is the parental-empowerment approach, which effectively provides technical enhancements to the legal status quo. These laws range from requiring platforms to give parents access to children’s social media accounts upon request, to tools that let parents limit the number of hours children spend online, to requiring parental consent before any social media account can be created for a minor.

Finally, there is the approach advocated by Big Tech, the ACLU, and some civil liberties organizations: self-regulation. For these organizations, any law that limits how platforms order content will necessarily regulate speech. The Electronic Frontier Foundation, for instance, argues that minors have a First Amendment right to access social media. It recently filed an amicus brief in support of Snapchat, arguing that the company should be immune from lawsuits brought by parents of children who overdosed on fentanyl supplied by drug traffickers operating on the app.

Of all the approaches, the most dangerous is the last: the status quo. Right now, Big Tech companies often spend more time with children than their parents or schools do. The virtual spaces they control are not like playgrounds, parks, or libraries; Instagram is not providing a neutral space for play and exploration but actively engineering its users’ emotions, connections, and choices by selecting what they see, when, and how often. And no amount of social pressure will change the companies’ focus on the bottom line.

Our laws and society are very confused about teenagers, a social category invented in the postwar period with the spread of the car that has since grown into something different. Are they little adults or big children? Their cognitive capacity equals (if not exceeds) that of legal adults, but their brains are in their most active period of social development. And of all the entities to turn that development over to, it would be hard to come up with institutions less well suited to nurturing the development of adults in a democratic society than TikTok, YouTube, or Instagram.

Finally, we need to confront the peculiar way in which the focus on phones has been coded as conservative, if not outright Republican, and therefore not something that progressives and leftists should care about. Some of this comes from the libertarian streak within civil liberties organizations like the ACLU and the Electronic Frontier Foundation, which have been consistently skeptical of government regulation, even when it is regulation of big, monopolistic tech companies. Some may come from tribal tendencies and the old entanglement of Google with the Obama administration: a sense that while Google and other Big Tech companies may behave in problematic ways, they share our progressive values.

Whatever the reasons, this myth doesn’t bear much relation to reality. Progressives, once at the vanguard of protecting children from dangerous industrial labor conditions, should also be at the vanguard of protecting children from having their most intimate data exploited for profit. The end game of Big Tech’s relationship with children is radical isolation and addiction: the very opposite of the solidarity that defines a thriving, open, democratic system.

Zephyr Teachout, a Nation editorial board member, is a constitutional lawyer and law professor at Fordham University and the author of Break ’Em Up: Recovering Our Freedom From Big Ag, Big Tech, and Big Money.

