Social-media firms raked in $11 billion in US ad sales from minors in 2022 alone. Yet even that astronomical figure doesn’t capture the economic value that these giants derive from addicting children to their platforms. Meta, Google, TikTok, and Twitter are powerfully incentivized to block, delay, and weaken every effort to decommercialize children’s social media.
But now, that focus on exploiting teen emotions has come back to bite Big Tech. Parents—who for a decade blamed themselves in isolation for the rising depression, anxiety, and self-harm among teens—are now joining together to demand that social media, the central institution of their children’s lives, be made safe. Bipartisan parents’ groups are building a new movement; 2024 is shaping up to be the year we stop the gross and deadly exploitation of children’s emotions for profit.
Friday marked a major development in this battle. By a nearly unanimous bipartisan vote, the New York legislature passed landmark legislation that goes straight at the exploitation business model. The New York approach—a version of which has also been introduced in both California and Kentucky; talk about bipartisan!—is disarming in its simplicity. The laws prohibit three different design features for minors on social-media platforms: targeted feeds, notifications between 12 a.m. and 6 a.m., and commercialization of children’s data. (It’s a default measure: All of those design features can be deployed if there is verifiable parental consent.)
Unlike so much of the discourse around social media, which traffics in complication and confusion, and a sense that the hill is too thorny and steep to climb, the New York legislation treats social media like a basic child-protection and public-health problem.
Consider the provision on targeting, which forbids platforms from using personal data to decide what content a child sees. It’s Facebook 2009, instead of Instagram 2024: There is no “for you” or “recommended” or automatic pushing of content based on a child’s location, friends, likes, mood, voice pitch, or any of the trillions of data bits that are used to pinpoint content that will keep them online. YouTube, TikTok, Twitter, and Instagram will no longer be able to harvest children’s personal data and deliver the feed they believe will addict them to the platform. Instead, the feeds will be curated by the children and teenagers themselves, who will have the freedom to select the friends they want (rather than have contacts pushed on them by algorithm); to select the news sources they want (rather than have “lookalike” addictive content pushed on them); and to explore, without being directed and redirected and having their emotions micromanaged for maximum profit and preferred ideology by Mark Zuckerberg, Elon Musk, or the Chinese government. As such, the law is both a public-health measure and one that acknowledges the importance of the free and social development of the human spirit.
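For readers who think in code, a minimal sketch can make the distinction concrete. Everything below is hypothetical (no platform publishes its ranking code): the types and the dwell-time score are invented stand-ins, but they capture the difference between a feed ranked by predicted engagement and a feed the teen curates.

```typescript
// Hypothetical illustration: no platform's real ranking code is public, so
// the interfaces and the dwell-time score below are invented stand-ins.

interface Post {
  authorId: string;
  createdAt: Date;
  text: string;
}

interface BehavioralProfile {
  // Stand-in for the "trillions of data bits" harvested about a child.
  predictedDwellSeconds: (post: Post) => number;
}

// What the law forbids by default for minors: ranking content by predicted
// engagement, derived from the child's personal data.
function targetedFeed(posts: Post[], profile: BehavioralProfile): Post[] {
  return [...posts].sort(
    (a, b) => profile.predictedDwellSeconds(b) - profile.predictedDwellSeconds(a),
  );
}

// What the law makes the default: only accounts the teen chose to follow,
// in reverse-chronological order, with no behavioral scoring at all.
function curatedFeed(posts: Post[], followedIds: Set<string>): Post[] {
  return posts
    .filter((post) => followedIds.has(post.authorId))
    .sort((a, b) => b.createdAt.getTime() - a.createdAt.getTime());
}
```

The difference is not subtle: the first function cannot run without a surveillance profile; the second needs nothing but the teen’s own choices.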
The business model of algorithmic AI on social media involves isolating each of us, treating each person differently, destroying common ground by removing the site of shared experience, and replacing it with a top-down, isolated experience. As new research has shown, one of the key ways in which social media causes teen depression and anxiety is by driving teens away from genuinely social experiences, whether the arts, sports, clubs, or simply hanging out. It isolates and disempowers at the very moment in human development when the tools of collective joy and meaning, and of support in sorrow, are developed.
The addictive nature of social media also disrupts the vital processes of introspection and self-reflection. Addiction of any kind, be it gambling or the gamification of social life, crowds out silence, solitude, and contemplation—practices that are crucial for developing a profound understanding of oneself and one’s place in the world. Teenagers, bombarded with stimuli individually crafted to make it hard to drop the phone, are deprived of the opportunity to engage in meaningful self-exploration, stunting their development. The social-media targeted-content business model is a tool for the destruction of public space and human development.
Consider two techniques used by social-media companies to addict children that will have to be turned off under this approach. The first is the “endless scroll,” a design feature in which content continuously loads as the user scrolls down the page, creating a never-ending feed and eliminating the need to click or navigate to another page to see more. By continuously presenting new content, endless scroll can lead to prolonged use and make it difficult for users to stop, causing anxiety, stress, and feelings of inadequacy, as well as lost sleep. The second technique allows the platform to identify the video that a user most wants and to hold it back, delivering it only when the machine senses that the user is about to log off. This feature relies on targeting and is designed to directly weaken teens’ willpower.
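Here is roughly what the first technique looks like in practice, as a hedged sketch: a hypothetical browser feed with a sentinel element watched by an IntersectionObserver. The element IDs and the fetchPage stub are invented; real platform APIs are not public.

```typescript
// Hypothetical sketch of the "endless scroll" pattern. A "sentinel" element
// sits at the bottom of the feed; whenever it scrolls into view, another page
// of content is appended, so the feed never ends and there is no natural
// stopping point.

const feed = document.querySelector<HTMLElement>("#feed")!;
const sentinel = document.querySelector<HTMLElement>("#sentinel")!;

let page = 0;

// Stand-in for a platform's content API, which is not public.
async function fetchPage(pageNumber: number): Promise<string[]> {
  return Array.from({ length: 10 }, (_, i) => `Post ${pageNumber * 10 + i}`);
}

async function appendNextPage(): Promise<void> {
  const posts = await fetchPage(page++);
  for (const text of posts) {
    const item = document.createElement("article");
    item.textContent = text;
    feed.appendChild(item); // more content appears before the user hits bottom
  }
}

// Load more the moment the user nears the bottom: no click, no page turn,
// and no cue that it might be time to stop.
new IntersectionObserver((entries) => {
  if (entries.some((entry) => entry.isIntersecting)) {
    void appendNextPage();
  }
}).observe(sentinel);
```

Under the New York approach, a pattern like this could not be switched on for a minor’s feed without verifiable parental consent.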
So although the fight over these design features is about health, the ultimate problem is civic and even spiritual in nature. Few teenagers today can opt out of social media without destroying their social connections, so they are essentially forced to use unhealthy and controlling platforms that addict them. Free exploration and human connection are thus lost, and at a critical developmental time in a person’s life.
The revelation of the last 24 months has been that we don’t need to accept this business model. Parents who have lost their children to social-media design defects have emerged as the most powerful voices of opposition, telling stories of how their child was targeted with a choking challenge and accidentally died. Or how their child was targeted with self-harm content and ended up hospitalized for anorexia. Or how their child was targeted with suicide content and took her own life. We don’t need to accept other parts of the business model either, like the idea that notifications are always acceptable. We don’t need to accept that platforms are immune from liability for harm, even harm from content their algorithms aggressively push on children. We don’t need to accept small tweaks, and we can no longer accept the now-ludicrous argument of social-media executives that they should be trusted with ensuring the well-being of children, not after the revelations of the former Facebook insiders Frances Haugen and Arturo Bejar, who shared extensive documentation showing how Facebook repeatedly chose profits over child safety, even when faced with evidence of significant harm.
In the last two years alone, nearly two dozen children’s online-safety laws were enacted across 13 states. Last month, Maryland passed a bill requiring platforms to put the best interests of children ahead of profit in platform design, and Vermont is considering a similar bill. At the federal level, legislation that would impose a similar requirement is gaining steam.
Big Tech won’t take this sitting down, and the Silicon Valley oligarchs are predictably turning to First Amendment arguments, claiming that when YouTube sends children a suicide video they didn’t request, it should be treated as protected speech, just as a newspaper’s decision to publish gruesome war photographs is. These legal and moral arguments collapse in the face of common sense: For the last century, the First Amendment has coexisted with laws prohibiting dangerous product designs and with reasonable limitations on the sale of all sorts of content to children.
It isn’t going to work. This Supreme Court may be trigger-happy on the First Amendment rights of corporations, but the justices have signaled in many ways that they are skeptical of tech companies’ efforts to call themselves news organizations, and are open to content-neutral regulations for children and teens.
Social-media regulation has matured into something tough, powerful, bipartisan, and First Amendment-proof. We are finally starting to do our job as a society to protect the freedom, community, and mental health of children and teens.