Can Section 230 Protect Tech From Social Media Addiction Litigation?

Amid rising concern that social media is harming minors, veteran trial lawyer Matthew Bergman is leading a new fight to hold companies liable. Here’s how he plans to circumvent Section 230.
Nick Russo

A bunch of boy scouts. By the fall of 2021, Matthew Bergman was many things — a veteran trial lawyer for asbestos victims, a law professor, a philanthropist for female literacy in Kenya, a three-time delegate to the Democratic National Convention — but he was not a man many would’ve foreseen leading a legal movement to fundamentally transform the internet. Having graduated from law school in 1989 and litigated exactly zero internet cases thereafter, Bergman seemed at first glance like an odd candidate for the role. But by winter he’d founded the Social Media Victims Law Center, which is now leading the fight to hold tech giants liable for harms suffered by social media users.

Bergman’s career shift was catalyzed by Facebook whistleblower Frances Haugen, who in September 2021 leaked internal documents appearing to show — though not definitively proving — the company knew its products were harming adolescent mental health. Bergman’s bread and butter had been product liability law, at the very heart of which lies the concept of reasonably foreseeable harm. The “Facebook Files” showed the company had commissioned studies that suggested its apps were detrimental to the mental health of teenagers — especially girls — and that, having seen these study results, it continued to prioritize engagement over user safety. This, in the words of one expert, made the files “shark bait for trial lawyers” like Bergman, who built a lucrative career on proving in court that companies failed to prevent reasonably foreseeable harm stemming from the use of their products.

“This was the Sumner Simpson Papers all over again,” Bergman told an interviewer last September, referencing the files that proved a massive cover-up of medical consequences by the asbestos industry. Except, in his telling, today’s social media giants “make the asbestos industry look like a bunch of boy scouts,” because the harm they’re inflicting is even greater, and they’re inflicting it on children. In July 2022, he hosted a webinar organized by Perrin Conferences — a prominent forum for leading trial lawyers — titled “Will Social Media Litigation Become the Next Tobacco War?” In Bergman’s framing, then, social media companies belong in the pantheon of American corporate malfeasants, right alongside the manufacturers of products known to increase the risk of lung cancer five- to ten-fold. These kinds of inflammatory analogies have left a sour taste in the mouths of the tech giants he’s taking on; a source at Meta, who wished to remain anonymous, called Bergman an “ambulance chaser” running a “class action lawsuit factory masquerading as an advocacy group.”

Whether ambulance chaser or earnest crusader, Bergman has been shouting this provocative narrative from the highest of rooftops. He’s been on Good Morning America, Dr. Phil, and 60 Minutes. He often uses the phrase “social media addiction,” and argues that social media companies have used “operant conditioning, sophisticated psychological mechanisms, and artificial intelligence to hook kids,” who he says are extremely vulnerable to “algorithmic manipulation” because they have undeveloped prefrontal cortexes. The companies, in Bergman’s view, are “profiteers” who rake in lucrative revenues by purposely steering children toward harmful content to maximize user screen time and engagement. 

Bergman likes to use the example of a young girl interested in fitness who starts watching exercise videos and then is led, via recommendation algorithm, toward eating disorder content. Such girls, being addicted to social media and thus unable to resist the temptation to scroll endlessly, wind up in a rabbit hole that induces low self-esteem, depression, and self-harming behavior — a spiral that sometimes ends in suicide. Cyberbullying, he told me, can also contribute to this spiral, because whereas a child being bullied in school can simply walk away, a child being bullied on a platform they’re addicted to can make no such escape. Tragically, for such a child, suicide may seem like their only resort.

Indeed, Bergman is currently representing several families of children who’ve taken their own lives after periods of excessive social media use. He argues such suicides are a “direct consequence of the design decisions these companies have made.” In his 60 Minutes interview — during which the host asked Bergman whether social media giants have effectively armed children with a lethal weapon — he said “I don’t know how these companies sleep at night… [but] I’m coming at them with everything I’ve got.”

Everything he’s got. In less than two years, the SMVLC has — in part by running a television ad campaign — amassed a roster of over 1,200 clients. It has filed lawsuits against Meta, TikTok, Snap, and Discord. Its clients are parents of children who, after using these companies’ platforms, developed eating disorders, were sexually exploited by adults, died attempting a viral challenge or after purchasing fentanyl-laced pills, or took their own lives.

The lawsuits against Meta aim to pin liability on the company for harms allegedly stemming from heavy Instagram use. Cases filed on behalf of the parents of Selena Rodriguez, Liam Birchfield, and Christopher James Dawley all allege that the children suffered from social media addiction, which largely played out on Instagram, and that the addiction led directly to their suicides. The cases reference specific examples of harmful content that the kids viewed — material related to eating disorders or firearms — but the focus is more on the design of the platforms themselves.

“Meta has intentionally designed addictive products that exploit the incomplete brain development of youths and manipulate the brain reward system,” the SMVLC’s website claims. Their argument: Selena, Liam, and Christopher all grew addicted exactly as Meta intended, and that addiction resulted in sleep deprivation, depression, and other negative psychological effects. Those symptoms rendered the kids more vulnerable to anorexic, self-harming, and other dark content served up by recommendation algorithms. Unable to look away, the kids were sucked into a suicide-inducing rabbit hole. Meta knew about these risks beforehand, but failed to warn minor users and their parents. Meta’s products also lack adequate parental control features and in fact are “intentionally designed to thwart parental monitoring and restriction.” As such, the firm insists, Meta’s products are defective by design and negligently marketed.

Given that these cases are built in large part around the concept of social media addiction, and that the idea that you can be addicted to social media as you can to opioids would likely raise the eyebrows of the average American, I asked Bergman whether he thought courts would be hesitant to recognize it as a legitimate psychological condition. He responded that social media addiction “is not a concept devised by clever attorneys,” insisting on the contrary that it’s “basic science” backed by peer-reviewed research. This argument, for the most part, checks out — with one major caveat. Academic research on the topic now dates back over a decade, several groups have developed diagnostic frameworks, and there is an emerging consensus that excessive social media users exhibit symptoms characteristic of drug addiction. However, neither the DSM-5 nor the ICD-11 recognizes social media addiction as a mental disorder.

The SMVLC’s other lawsuits are similar in their focus on algorithm over content, on faulty age verification and parental control mechanisms, and on failure to adequately warn of the risks of social media use. The case against TikTok concerns two young girls who died after attempting the viral “blackout challenge.” But it’s not the mere existence of the blackout challenge on the platform that the SMVLC is focused on: “Under Section 230 of the Communications Decency Act, social media platforms cannot be held liable for the content others have posted on their platforms. However, this case against TikTok is not based on the actions of its users but rather on TikTok’s actions.” Specifically, the case alleges that TikTok’s algorithms addict young users and intentionally direct them to harmful content in an effort to maximize engagement.

The case against Snap, which I wrote about in more depth recently, is focused on the app’s disappearing messages and geolocation features, which are often exploited by drug dealers. The case against Discord involves an 11-year-old girl who attempted suicide multiple times after being sexually exploited by adult men; the app has a minimum user age of 13, but the SMVLC alleges its age verification protocol is inadequate and its safety settings misleading. The common thread between all these cases — aside from the tragedy of young lives shattered or lost altogether — is a subtle distinction between content and design.

The design-content distinction, and how to play the courts like a lottery. In one of his first media appearances after founding the SMVLC, Bergman told a local reporter that, while regulation is an important avenue for ensuring product safety, “ultimately it’s the prospect of explaining themselves to a jury that incentivizes companies to think proactively.” That, he said, is why the SMVLC is taking the product liability fight to Big Tech. But the comment was a sort of sleight of hand. 

All of the SMVLC’s cases hinge on the design-content distinction because social media companies are shielded from liability for harmful third-party content by Section 230. But some legal experts, like Santa Clara University’s Eric Goldman, call Bergman’s distinction “illusory,” arguing that, ultimately, the problem is the content being served up by the algorithms, which brings you right back to Section 230. Goldman’s retort may also be applicable to the concept of social media addiction, where researchers have “questioned whether people become addicted to the platform or to the content” — if it’s the content that’s addictive, then, again, we’re right back to Section 230. In any case, whether the distinction holds legal water is up to judges, who’ve thus far been hesitant to reinterpret Section 230 so as to expose tech companies to litigation. Bergman is banking, then, on persuading the courts to engage in judicial policymaking by permitting his arguments to be heard by a jury — i.e., on his lawsuits ushering in a new regulatory regime, a world where Section 230 isn’t an impenetrable legal shield for Big Tech.

After oral arguments in the first Supreme Court case to address Section 230, Gonzalez v. Google, many commentators are expecting a ruling in Google’s favor — further illustrating judicial hesitancy to walk back Section 230. That said, Bergman was well aware of the Gonzalez v. Google case when he launched his legal crusade, and he told me he does not think a ruling against the plaintiff would pose an insurmountable barrier to his cases, because his product liability strategy is distinct from that used by the Gonzalez lawyers.

Bergman knows that, even given the differences between his cases and Gonzalez v. Google, it’s far from guaranteed a judge will let his attorneys go to trial, given the sweeping implications of the decision — and he’s planned accordingly. The SMVLC has filed cases in many different jurisdictions. All they need to do is convince one judge in one jurisdiction that Section 230 is not a valid legal defense against their claims. “Basically,” Goldman told reporters, “it’s like a lottery. You only really need to win one in order to open up a very, very big door for future litigation.”

Unintended consequences. Whether or not Bergman — or any of the many other trial lawyers following in his footsteps — succeeds in holding social media giants liable for harm suffered by adolescent users, the reader is left to consider whether these companies should be held liable. If the courts acknowledge the reality of social media addiction, and tech companies are legally on the hook for consequences stemming from that addiction, the question becomes: what kinds of consequences, exactly, will be considered fair play for plaintiffs’ lawyers? How will this new wave of cases affect our technology sector and, more broadly, our culture?

For now, most of the cases in the pipeline concern adolescent suicide, eating disorders, drug overdoses, and sexual exploitation. But if these cases succeed, how long will it be before the rise of Big-Tech-turned-my-kid-trans lawsuits? How long before someone sues Facebook for getting their Capitol-storming dad locked up by recommending him QAnon content? Or before a climate doomer from r/collapsesupport commits suicide and his parents sue Reddit? Or a wife sues Twitter for getting her husband on the crypto hype train, only for their life savings to go up in smoke? Or the parent of a school shooter sues YouTube for precipitating their child’s attack?

The possibilities here are endless, as is anti-tech sentiment on both left and right. The culture wars are white hot. Matthew Bergman talks of himself as David taking on Goliath, but it’s not clear he’s considered what happens after Goliath falls. Or, if he has, he doesn’t seem to acknowledge unintended consequences as a serious threat, given he told me “I don’t think there’s a downside to social media companies being held to the same standard of responsibility as other companies.” This, of course, is a plaintiff’s lawyer’s framing. Perhaps a better frame is one that asks whether there’s a downside to opening up a new front in the culture wars — a front on which those with money to spend can use social media addiction litigation to influence what kinds of content our tech giants will risk allowing on their platforms, and thereby to redraw the contours of our fifth estate.

— Nick Russo
