Federal Lawsuit Against Social Media Companies

Case 3:23-cv-01774-SK Document 1 Filed 04/12/23 Page 2 of 107
Contents

I. INTRODUCTION 2
A. Defendants’ Role in the Youth Mental Health Crisis 2
II. JURISDICTION AND VENUE 5
III. PARTIES 5
A. Plaintiff 5
B. Facebook and Instagram Defendants 6
C. Snap Defendant 8
D. TikTok Defendants 8
E. YouTube Defendants (Alphabet Inc., XXVI Holdings, Google, and YouTube) 9
IV. FACTUAL ALLEGATIONS 10
A. Millions of Minors Have Become Addicted to Social Media 10
B. Research Has Confirmed Using Social Media Harms Minors 12
C. Defendants’ Platforms Have Caused America’s Minors to Face a Mental Health Crisis 15
D. Defendants Intentionally Market to, Design, and Operate Their Social Media Platforms for Users Who Are Minors 17
1. Meta Intentionally Marketed to and Designed Their Social Media Platforms for Minor Users, Substantially Contributing to the Mental Health Crisis 23
a. The Meta Platform 23
b. Meta Targets Minors 25
c. Meta Intentionally Maximizes the Time Users Spend on its Platforms 28
d. Meta’s Algorithms Are Manipulative and Harmful 29
e. Facebook’s and Instagram’s Harmful “Feeds” 31
f. Meta Is Aware That Its Platforms Are Harmful to Minors 33
2. Snapchat Intentionally Marketed to and Designed Its Social Media Platform for Minor Users and Has Substantially Contributed to the Youth Mental Health Crisis 35
a. Snap Designs and Markets Its Platform to Minors 37
b. Snap Intentionally Designs and Markets Exploitative Methods to Increase the Time Users Spend on its Platform 38
c. Snapchat’s Algorithms Are Manipulative and Harmful 40

Complaint
i
d. Snap’s Conduct in Designing and Operating Its Platform Has Harmed Youth Mental Health 41
3. TikTok Intentionally Marketed to and Designed Its Social Media Platform for Minor Users and Has Substantially Contributed to the Youth Mental Health Crisis 43
a. TikTok’s Platform 43
b. TikTok Markets Its Platform to Minors 44
c. TikTok Intentionally Maximizes the Time Users Spend on its Platform 47
d. TikTok’s Algorithms Are Manipulative 48
e. TikTok’s Conduct in Designing and Operating its Platform Has Harmed the Mental Health of Minors 50
4. YouTube Intentionally Marketed to and Designed Its Social Media Platform for Minor Users, Substantially Contributing to the Mental Health Crisis 57
a. The YouTube Platform 57
b. YouTube Markets Its Platform to Minors 58
c. YouTube Intentionally Maximizes the Time Users Spend on its Platform 60
d. YouTube’s Algorithms Are Harmful and Manipulative 61
TABLE OF AUTHORITIES

STATUTES

18 U.S.C. § 1341 ......................................................... 100, 102, 105
28 U.S.C. § 1391 ......................................................................... 8

CASES

Barnes v. Yahoo!, Inc., 570 F.3d 1096, 1100–01 (9th Cir. 2009) ............. 76
Lemmon v. Snap, Inc., 995 F.3d 1085, 1091 (9th Cir. 2021) .................. 76
Malwarebytes Inc. v. Enigma Software Grp. USA, LLC, 141 S. Ct. 13 (2020) ... 77

OTHER

Restatement (Second) of Torts § 581 (Am. Law Inst. 1977) ................... 77
I. INTRODUCTION

A. Defendants’ Role in the Youth Mental Health Crisis

1. American youth are facing what may be the most severe mental health crisis in history. The rise of technology has brought benefits, but it has also brought serious consequences. The major social media platforms, including Facebook, Snapchat, Instagram, TikTok, and YouTube, have spent millions to develop and market their products to minors, keeping them coming back for more and significantly contributing to this mental health crisis.

2. Meta Platforms, Inc., Facebook Holdings, LLC, Facebook Operations, LLC, Meta Payments Inc., Meta Platforms Technologies, LLC, Instagram, LLC, Siculus, Inc., Snap Inc., TikTok Inc., ByteDance Inc., Alphabet Inc., Google LLC, XXVI Holdings Inc., Whatsapp, Inc., and YouTube, LLC (hereinafter, “Defendants”) design, market, promote, and operate social media platforms. Over the past decade, each has grown its respective platforms exponentially, from millions to billions of users. Defendants have not only increased their user counts but also the frequency with which users engage their platforms. Across the country, including in Plaintiff’s district, the youth mental health crisis has sharply worsened due to excessive use of Defendants’ platforms. More minors are struggling with mental health than ever before, and suicide is now the second leading cause of death for American minors.

3. Defendants have engaged in the aforementioned acts for profit. Their business models are based on advertisements. The more time Defendants’ users spend on Defendants’ platforms, the more advertisements Defendants can sell, increasing their profits exponentially.

4. Minors are central to Defendants’ business models. Minors are more likely than not to have access to a cell phone, through which they gain access to social media. As one Defendant put it, “los[ing] the teen foothold in the U.S.[,]” would mean “los[ing] the pipeline” for growth.1

5. Defendants have maximized the time users—particularly minors—spend on their

1 Sheera Frenkel et al., Instagram Struggles with Fears of Losing Its ‘Pipeline’: Young Users, N.Y. Times (Oct. 16, 2021), https://www.nytimes.com/2021/10/16/technology/instagram-teens.html.
platforms by purposely designing, refining, and operating them to exploit the neurophysiology of the brain’s reward systems, ensuring users come back frequently and remain on the respective platforms for as much time as possible.

6. Minors are particularly susceptible to Defendants’ manipulative conduct because their brains are not fully developed. Consequently, minors lack the emotional maturity, impulse control, and psychological resiliency of adult users.

7. Defendants have successfully exploited the vulnerable brains of minors, causing millions of students across the United States, including in Plaintiff’s district, to become addicted to and excessively use Defendants’ social media platforms. Furthermore, the content Defendants direct to minors is often harmful and exploitative (e.g., instigating vandalism, promoting eating disorders, or encouraging self-harm).

8. Defendants’ misconduct is a substantial factor in the youth mental health crisis, which has been marked by ever-higher proportions of minors struggling with anxiety, depression, thoughts of self-harm, and suicidal ideation.

9. The state of children’s mental health led the American Academy of Pediatrics, the American Academy of Child and Adolescent Psychiatry, and the Children’s Hospital Association to declare a national emergency, and the U.S. Surgeon General to issue an advisory “to highlight the urgent need to address the nation’s youth mental health crisis.”2

10. The Centers for Disease Control and Prevention (“CDC”) highlighted this crisis in its most recent biennial Youth Risk Behavior Survey report, which documents a steady and then accelerated increase in almost every category of risk between 2011 and 2021. The survey found increased use and popularity of the YouTube, TikTok, and Snap Defendants’ platforms during the same time period. The report’s findings show that although the pandemic added isolation-related stressors for depression, the crisis pre-existed the pandemic.

2 AAP-AACAP-CHA Declaration of a National Emergency in Child and Adolescent Mental Health, Am. Acad. Pediatrics (Oct. 19, 2021), https://www.aap.org/en/advocacy/child-and-adolescent-healthy-mental-development/aap-aacap-cha-declaration-of-a-national-emergency-in-child-and-adolescent-mental-health/; U.S. Surgeon General Issues Advisory on Youth Mental Health Crisis Further Exposed by COVID-19 Pandemic, U.S. Dep’t Health & Hum. Servs. (Dec. 6, 2021), https://www.hhs.gov/sites/default/files/surgeon-general-youth-mental-health-advisory.pdf.
Use of the YouTube, TikTok, and Snap Defendants’ platforms also increased during the pandemic. Dr. Victor Fornari, the vice chair of child and adolescent psychiatry at Northwell Health, New York’s largest health system, argues that there is “no question” of an association between the use of social media and the dramatic increase in suicidal behavior and depressive mood.3

11. Defendants’ algorithms perpetuate content that is harmful to children. “Kids are now vulnerable to cyberbullying and critical comments, like ‘I hate you’, ‘Nobody likes you’… It’s like harpoons to their heart every time.”4

12. Defendants’ social media platforms place an even greater burden on an already strained system as the youth mental health crisis continues to worsen and Defendants continue to profit. “We don’t have enough therapists to care for all these kids.”5 In fact, the number of teens and adolescents waiting in emergency rooms nationwide for mental health treatment for suicide tripled from 2019 to 2021.6

13. President Joe Biden has also called attention to the harm social media has wrought on youth and implored all to “hold social media platforms accountable for the national experiment they’re conducting on our children for profit.”7

14. Minors in Plaintiff’s school district, like others, are experiencing a health crisis similar to that observed nationally.

15. Students who experience anxiety, depression, and other mental health issues perform worse in school, are less likely to attend school, and are more likely to engage in substance abuse and to act out, all of which directly affects Plaintiff’s ability to function.

16. That is why 96 percent of school districts, including Plaintiff, provide mental health

3 Azeen Ghorayshi & Roni Caryn Rabin, Teen Girls Report Record Levels of Sadness, C.D.C. Finds, N.Y. Times (Feb. 13, 2023), https://www.nytimes.com/2023/02/13/health/teen-girls-sadness-suicide-violence.html (last visited Mar. 20, 2023).
4 Id.
5 Id.
6 Stephen Stock, Children Languish in Emergency Rooms Awaiting Mental Health Care, CBS News (Feb. 27,
services to their students. But Plaintiff requires a comprehensive, long-term plan, along with funding, to drive a sustained reduction and abatement of the mental health crisis its minors are experiencing, caused by Defendants.

II. JURISDICTION AND VENUE

17. The Court has subject-matter jurisdiction over this case under 18 U.S.C. § 1964 and 28 U.S.C. § 1331 because the amount in controversy exceeds $75,000, and because this action arises, in part, under the Racketeer Influenced and Corrupt Organizations Act (“RICO”).

18. The Court has personal jurisdiction over Defendants because they engage in business in the Northern District of California and have sufficient minimum contacts with the District. Defendants intentionally availed themselves of the markets of this State through the deceptive and misleading promotion, marketing, and operation of the platforms at issue in this lawsuit in California, and by retaining significant profits and proceeds from these activities, rendering the exercise of jurisdiction by this Court permissible under California law and the United States Constitution.

19. Venue is appropriate in the Northern District of California pursuant to 28 U.S.C. § 1391 because Defendants have engaged in substantial business operations and marketing in this District, and because Defendants entered into relevant transactions and received substantial ill-gotten gains and profits from consumers who reside in this District. In addition, Plaintiff resides in and was harmed by Defendants’ conduct in this District, and a substantial part of the events, acts, and omissions giving rise to this action occurred in this District.

III. PARTIES

A. Plaintiff

20. Plaintiff HANNA PUBLIC SCHOOLS (“Plaintiff” or “HANNA”) is a school district serving 60 students across 2 schools spanning grades PK–12: 1 PK–8 school and 1 high school. Hanna Public Schools is located in McIntosh County, Oklahoma, which has a population of approximately 18,500 residents.
21. Plaintiff alleges that Defendants’ design, advertising, marketing, and operation of their social media platforms and products target minors, and that these platforms are intentionally and deliberately designed to exploit minors and cause them to become addicted, which has caused the harm to Plaintiff alleged herein.

B. Facebook and Instagram Defendants

22. Defendant Meta Platforms, Inc. (“Meta”), formerly known as Facebook, Inc., is a Delaware corporation with its principal place of business in Menlo Park, California.

23. Defendant Meta develops and maintains social media platforms, communication platforms, and electronic devices that are widely available to users throughout the United States. The platforms developed and maintained by Meta include Facebook (including its self-titled app, Marketplace, and Workplace), Messenger (including Messenger Kids), Instagram, Whatsapp, and a line of electronic virtual reality devices and services called Meta Quest (collectively, the “Meta platforms”).

24. Meta transacts or has transacted business in this District and throughout the United States. At all times material to this Complaint, acting alone or in concert with its subsidiaries (identified below), Meta has advertised, marketed, and distributed the Meta platforms to consumers throughout the United States. At all times material to this Complaint, Meta formulated, directed, controlled, had the authority to control, or participated in the acts and practices set forth in this Complaint.

25. Meta’s subsidiaries include Facebook Holdings, LLC; Facebook Operations, LLC; Meta Payments Inc.; Meta Platforms Technologies, LLC; Instagram, LLC; Whatsapp, Inc.; and Siculus, Inc.

26. Defendant Facebook Holdings, LLC (“Facebook Holdings”) was organized under the laws of the state of Delaware on March 11, 2020, and is a wholly owned subsidiary of Meta Platforms, Inc. Facebook Holdings is primarily a holding company for entities involved in Meta’s supporting and international endeavors, and its principal place of business is in Menlo Park, California. Defendant Meta is the sole member of Facebook Holdings.
transacts or has transacted business in this District and throughout the United States. At all times material to this Complaint, acting alone or in concert with others, ByteDance has advertised, marketed, and distributed the TikTok social media platform to consumers throughout the United States. At all times material to this Complaint, acting alone or in concert with TikTok Inc., ByteDance formulated, directed, controlled, had the authority to control, or participated in the acts and practices set forth in this Complaint.

E. YouTube Defendants (Alphabet Inc., XXVI Holdings, Google, and YouTube)

36. Defendant Alphabet Inc. is a Delaware corporation with its principal place of business in Mountain View, California. Alphabet Inc. is the sole stockholder of XXVI Holdings Inc.

37. Defendant XXVI Holdings Inc. is a Delaware corporation with its principal place of business in Mountain View, California. XXVI Holdings Inc. is a wholly owned subsidiary of Alphabet Inc. and the managing member of Google LLC (“Google”).

38. Defendant Google is a limited liability company organized under the laws of the state of Delaware, and its principal place of business is in Mountain View, California. Google LLC is a wholly owned subsidiary of XXVI Holdings Inc. and the managing member of YouTube, LLC. Google LLC transacts or has transacted business in this District and throughout the United States. At all times material to this Complaint, acting alone or in concert with others, Google LLC has advertised, marketed, and distributed its YouTube video sharing platform to consumers throughout the United States. At all times material to this Complaint, acting alone or in concert with YouTube, LLC, Google LLC formulated, directed, controlled, had the authority to control, or participated in the acts and practices set forth in this Complaint.

39. Defendant YouTube, LLC is a limited liability company organized under the laws of the state of Delaware, and its principal place of business is in San Bruno, California. YouTube, LLC is a wholly owned subsidiary of Google LLC. YouTube, LLC transacts or has transacted business in this District and throughout the United States. At all times material to this Complaint, acting alone or in concert with Defendant Google LLC, YouTube, LLC has
advertised, marketed, and distributed its YouTube social media platform to consumers throughout the United States. At all times material to this Complaint, acting alone or in concert with Google LLC, YouTube, LLC formulated, directed, controlled, had the authority to control, or participated in the acts and practices set forth in this Complaint.

IV. FACTUAL ALLEGATIONS

A. Millions of Minors Have Become Addicted to Social Media

40. According to a Harvard University study, social media exploits “the same neural circuitry” as “gambling and recreational drugs to keep consumers using their products as much as possible.”8

41. As described at length herein, each Defendant designed and marketed its exploitative social media platform to be extremely popular among minors. Ninety percent of children ages 13–17 use social media.9 Younger children also regularly use social media. One study reported that 38 percent of children ages 8–12 used social media in 2021.10 Other studies reveal numbers as high as 49 percent of children ages 10–12 using social media, and 32 percent of children ages 7–9.11

42. The most popular of these platforms is YouTube. A vast majority, 95 percent, of children ages 13–17 have used YouTube.12

43. As of July 2020, “TikTok classified more than a third of its 49 million daily users in the United States as being 14 years old or younger[,]” a figure that likely undercounts both children under 14 and older teenagers (i.e., those between 15 and 18 years old) because TikTok

8 Social Media Addiction, Addiction Center, https://www.addictioncenter.com/drugs/social-media-addiction/ (last visited Mar. 30, 2022).
9 Social Media and Teens, Am. Acad. Child & Adolescent Psychiatry (Mar. 2018), https://www.aacap.org/AACAP/Families_and_Youth/Facts_for_Families/FFF-Guide/Social-Media-and-Teens-100.aspx (last visited Mar. 30, 2023).
10 Victoria Rideout et al., The Common Sense Census: Media Use by Tweens and Teens, 2021 at 5, Common Sense Media (2022), https://www.commonsensemedia.org/sites/default/files/research/report/8-18-census-integrated-report-final-web_0.pdf (last visited Mar. 30, 2023).
11 Sharing Too Soon? Children and Social Media Apps, C.S. Mott Child.’s Hosp. Univ. Mich. Health (Oct. 18, 2021), https://mottpoll.org/reports/sharing-too-soon-children-and-social-media-apps.
12 Emily Vogels et al., Teens, Social Media and Technology 2022, Pew Rsch. Ctr. (Aug. 10, 2022), https://www.pewresearch.org/internet/2022/08/10/teens-social-media-and-technology-2022/.
claims not to know how old a third of its daily users are.13 TikTok is now the second most popular social media platform, with over 67 percent of children ages 13–17 having used the app.14

44. Instagram is also wildly popular among minors, with 62 percent of children ages 13–17 reporting they have used the platform.15

45. Snapchat is also popular with minors, with 59 percent of children ages 13–17 reporting they have used the platform.16

46. Facebook is among the five most popular social media platforms, with 32 percent of children ages 13–17 reporting they have used the Facebook platform.17

47. Teenagers who use these social media platforms are also likely to use them every day. One study estimates that 62 percent of children ages 13–18 use social media every day.18 An increasing number of younger children also use social media daily, with 18 percent of children ages 8–12 reporting using a social media site at least once a day.19

48. In fact, another study found that some teenagers never stop looking at social media.20

49. Almost 20 percent of teens use YouTube almost constantly.21 TikTok and Snapchat are close behind, with near-constant use rates among teens of 16 percent and 15 percent, respectively.22 Meanwhile, 10 percent of teens use Instagram almost constantly.23 And two percent of teens report using Facebook almost constantly.24

13 Raymond Zhong & Sheera Frenkel, A Third of TikTok’s U.S. Users May Be 14 or Under, Raising Safety Questions, N.Y. Times (Sept. 17, 2020), https://www.nytimes.com/2020/08/14/technology/tiktok-underage-users-ftc.html.
14 Emily Vogels et al., Teens, Social Media and Technology 2022, Pew Rsch. Ctr. (Aug. 10, 2022), https://www.pewresearch.org/internet/2022/08/10/teens-social-media-and-technology-2022/.
15 Id.
16 Id.
17 Id.
18 Victoria Rideout et al., The Common Sense Census: Media Use by Tweens and Teens, 2021 at 4, Common Sense Media (2022), https://www.commonsensemedia.org/sites/default/files/research/report/8-18-census-integrated-report-final-web_0.pdf.
19 Id. at 5.
20 Emily Vogels et al., Teens, Social Media and Technology 2022, Pew Rsch. Ctr. (Aug. 10, 2022), https://www.pewresearch.org/internet/2022/08/10/teens-social-media-and-technology-2022/.
21 Id.
22 Id.
23 Id.
24 Id.
50. Teenagers are aware of the hold social media has on their lives, but they cannot stop using it. Thirty-six percent of teenagers admit they spend too much time on social media.25 Over half of teens say that giving up social media would be somewhat hard, and nearly one-in-five teens say giving up social media would be very hard.26 Of the subgroup of teenagers who use at least one platform “almost constantly,” 71 percent said giving up social media would be hard, with 32 percent saying it would be very hard.27

51. Through excessive overuse of Defendants’ platforms, minors become accustomed to, and even addicted to, checking them. Teenagers who characterize themselves as spending too much time on social media are almost twice as likely as teens who see their usage as about right to say that giving up social media would be hard.28

52. Another study shows that among teenagers who regularly use social media, 32 percent “wouldn’t want to live without” YouTube.29 Twenty percent of teenagers said the same about Snapchat; 13 percent said the same about both TikTok and Instagram; and 6 percent said the same about Facebook.30

53. Despite using social media frequently, most minors do not enjoy it. In a study conducted using data collected since 2015, only 27 percent of boys and 42 percent of girls ages 8–18 reported enjoying social media “a lot” in 2021.31

54. A University of Michigan Mott Poll conducted in October 2021 suggested that schools should be an integral part of addressing overuse and unsafe use of social media—placing a financial burden on schools.32

B. Research Has Confirmed Using Social Media Harms Minors

55. Social media use—especially excessive use—has severe and wide-ranging effects

25 Id.
26 Id.
27 Id.
28 Id.
29 Victoria Rideout et al., The Common Sense Census: Media Use by Tweens and Teens, 2021 at 31, Common Sense Media (2022).
30 Id.
31 Id. at 34.
32 Sharing Too Soon? Children and Social Media Apps, C.S. Mott Child.’s Hosp. Univ. Mich. Health (Oct. 18, 2021), https://mottpoll.org/reports/sharing-too-soon-children-and-social-media-apps (last visited Mar. 21, 2023).
on youth mental health. Social media use is linked to increases in mental, emotional, developmental, and behavioral disorders. Defendants are aware of this, as independent research and internal data from Defendants’ platforms show social media has a direct negative impact on teenagers’ mental health on several fronts.

56. In general, electronic screen use causes lower psychological well-being.33 This link is especially evident among adolescents: those with high screen time are twice as likely to receive diagnoses of depression or anxiety, or to need treatment for mental or behavioral health conditions, compared to low screen time users.34

57. Increased social media use increases depressive symptoms, suicide-related outcomes, and suicide rates among adolescents.35 One reason is that it encourages unhealthy social comparison and feedback-seeking behaviors.36 Because adolescents spend a majority of their time on social media looking at other users’ profiles and photos, they are likely to engage in negative comparisons with their peers.37 Specifically, adolescents are likely to engage in harmful upward comparisons with others they perceive to be more popular.38

58. Clinicians have also observed a clear relationship between social media use

33 Jean M. Twenge & W. Keith Campbell, Associations between screen time and lower psychological well-being among children and adolescents: Evidence from a population-based study, 12 Prev. Med. Rep. 271–83 (2018), https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6214874/; Ariel Shensa et al., Social Media Use and Depression and Anxiety Symptoms: A Cluster Analysis, 42(2) Am. J. Health Behav. 116–28 (2018), https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5904786/.
34 Jean M. Twenge & W. Keith Campbell, Associations between screen time and lower psychological well-being among children and adolescents: Evidence from a population-based study, 12 Prev. Med. Rep. 271–83 (2018), https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6214874/.
35 Jean M. Twenge et al., Increases in Depressive Symptoms, Suicide-Related Outcomes, and Suicide Rates Among U.S. Adolescents After 2010 and Links to Increased New Media Screen Time, 6 Clinical Psych. Sci. 3–17 (2017), https://doi.org/10.1177/2167702617723376.
36 Jacqueline Nesi & Mitchell J. Prinstein, Using Social Media for Social Comparison and Feedback-Seeking: Gender and Popularity Moderate Associations with Depressive Symptoms, 43 J. Abnormal Child Psych. 1427–38 (2015), https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5985443/.
37 Id.; see also Nino Gugushvili et al., Facebook use intensity and depressive symptoms: a moderated mediation model of problematic Facebook use, age, neuroticism, and extraversion at 3, BMC Psych. 10, 279 (2022), https://doi.org/10.1186/s40359-022-00990-7 (explaining that youth are particularly vulnerable because they “use social networking sites for construing their identity, developing a sense of belonging, and for comparison with others”).
38 Id.
and disordered eating behavior in minors.39 The more social media accounts an adolescent has, the greater the disordered eating behaviors they exhibit. Additionally, research shows that the more time young girls spend on social media platforms such as Instagram and Snapchat, the more likely they are to develop disordered eating behaviors.40

59. Social media use has also caused an increase in cyberbullying. The more time individuals, especially males, spend on social media, the more likely they are to commit acts of cyberbullying.41 Cyberbullying is now so common that most American teens, 59 percent, have experienced some form of it.42 This number includes 42 percent of teens experiencing name-calling; 32 percent being subject to false rumors; 25 percent receiving an unsolicited explicit image; 21 percent being subject to online stalking; 16 percent receiving physical threats online; and 7 percent having had explicit images of them shared without their consent.43

60. Social media use also contributes to sleep deprivation. Young adults who spend excessive time on social media during the day or check it frequently throughout the week are more likely to suffer sleep disturbances than their peers who use social media infrequently.44 In turn, disturbed and insufficient sleep is associated with poor health outcomes.45

61. Defendants exacerbate the disruption of sleep by sending push notifications and emails either at night, when children should be sleeping, or during school hours, when they should

39 Simon M. Wilksch et al., The relationship between social media use and disordered eating in young adolescents, 53 Int’l J. Eating Disorders 96–106 (2020), https://pubmed.ncbi.nlm.nih.gov/31797420/.
40 Id.
41 Amanda Giordano et al., Understanding Adolescent Cyberbullies: Exploring Social Media Addiction and Psychological Factors, 7(1) J. Child & Adolescent Counseling 42–55 (2021), https://www.tandfonline.com/doi/abs/10.1080/23727810.2020.1835420?journalCode=ucac20.
42 Monica Anderson, A Majority of Teens Have Experienced Some Form of Cyberbullying, Pew Rsch. Ctr. (Sept.
problematic Facebook use, age, neuroticism, and extraversion at 3, BMC Psych. 10, 279 (2022), https://doi.org/10.1186/s40359-022-00990-7.
48 Id.
49 Id.
50 U.S. Surgeon General Issues Advisory on Youth Mental Health Crisis Further Exposed by COVID-19 Pandemic,
64. On December 7, 2021, the United States Surgeon General issued an advisory on the youth mental health crisis.51 In issuing the advisory, the Surgeon General noted, “[m]ental health challenges in children, adolescents, and young adults are real and widespread. Even before the pandemic, an alarming number of young people struggled with feelings of helplessness, depression, and thoughts of suicide — and rates have increased over the past decade.”52

65. While the report highlights ways in which the COVID-19 pandemic has exacerbated mental health issues for American youth, it also highlights the mental health challenges youth faced before the pandemic. Specifically, the report notes that before the pandemic, “mental health challenges were the leading cause of disability and poor life outcomes in young people.”53

66. Before the pandemic, one in five children ages 3–17 in the United States had a mental, emotional, developmental, or behavioral disorder.54

67. From 2009 to 2019, the rate of high school students who reported persistent feelings of sadness or hopelessness increased by 40 percent (to one out of every three kids).55 The share of kids seriously considering attempting suicide increased by 36 percent, and the share creating a suicide plan increased by 44 percent.56

68. From 2007 to 2019, suicide rates among youth ages 10–24 in the United States increased by 57 percent.57 By 2018, suicide was the second leading cause of death for youth ages 10–24.58

69. From 2007 to 2016, emergency room visits for youth ages 5–17 rose 117 percent
U.S. Dep’t Health & Hum. Servs. (Dec. 6, 2021), https://www.hhs.gov/sites/default/files/surgeon-general-youth-mental-health-advisory.pdf.
51 Id.
52 Id.
53 Id.
54 Id.
55 Protecting Youth Mental Health: The U.S. Surgeon General’s Advisory at 8, U.S. Dep’t Health & Hum. Servs. (Dec. 7, 2021), https://www.hhs.gov/sites/default/files/surgeon-general-youth-mental-health-advisory.pdf.
56 Id.
57 Id.
58 AAP-AACAP-CHA Declaration of a National Emergency in Child and Adolescent Mental Health, Am. Acad. Pediatrics (Oct. 19, 2021), https://www.aap.org/en/advocacy/child-and-adolescent-healthy-mental-development/aap-aacap-cha-declaration-of-a-national-emergency-in-child-and-adolescent-mental-health/.
for anxiety disorders, 44 percent for mood disorders, and 40 percent for attention disorders.59

70. This and other data led the American Academy of Pediatrics, the American Academy of Child and Adolescent Psychiatry, and the Children’s Hospital Association to join the Surgeon General and declare a national emergency in child and adolescent mental health.60

71. President Biden also addressed the mental health crisis Defendants’ platforms have caused to minors in his State of the Union address in 2022.61 In that address, he noted that children were struggling from the harms of social media—even before the pandemic—and called on all Americans to “hold social media platforms accountable for the national experiment they’re conducting on our children for profit.”62
D. Defendants Intentionally Market to, Design, and Operate Their Social Media Platforms for Users Who Are Minors

72. The mental health crisis minors are facing today is the direct result of Defendants’ deliberate choices and affirmative actions to design and market their social media platforms to attract minors.

73. Defendants each run and operate social media platforms. The interactive features Defendants provide on their platforms are similar in many respects. For example, Facebook, Instagram, Snapchat, TikTok, and YouTube all offer tailored “feeds” of content governed by algorithms (also designed by Defendants) intended to learn the user’s interests; ways to publicly express affirmation for such personalized content through “likes,” comments, and sharing or reposting the content; and each is known to copy the designs and features of one another.63
59 Matt Richtel, A Teen’s Journey Into the Internet’s Darkness and Back Again, N.Y. Times (Aug. 22, 2022), https://www.nytimes.com/2022/08/22/health/adolescents-mental-health-technology.html.
60 AAP-AACAP-CHA Declaration of a National Emergency in Child and Adolescent Mental Health, Am. Acad.
74. Defendants use a tried-and-true method of profiting from their social media platforms: selling to advertisers. Defendants collect data on their users’ viewing habits and behaviors, and they use that data to sell advertisers the ability to promote their products. Advertisers pay to target advertisements to specific categories of users, including minors.

75. Defendants view young, adolescent, and even pre-adolescent users as one of their most valuable commodities, since they are the consumers of their advertisements. Young users are integral to Defendants’ business model and advertising revenue, as children are more likely than adults to use social media.

76. Defendants’ tactics are working: 95 percent of children ages 13–17 have cellphones,64 90 percent use social media,65 and 28 percent buy products and services through social media.66

77. To profit from minors, Defendants intentionally market their platforms to youths and adolescents. For children under 13, the Children’s Online Privacy Protection Act (“COPPA”)67 regulates the conditions under which platforms like Defendants’ can collect and use their information.

78. COPPA requires platforms that either target children under age 13 or have actual knowledge of users under age 13 to obtain “verifiable parental consent” prior to collecting and using information about children under age 13.68 Defendants have violated COPPA by leaving users to self-report their age and having no safeguards in place to verify it. Defendants doubled down on profiting from pre-adolescent audiences by offering “kid versions” of their platforms that, while not collecting and using their information, are “designed to fuel [kids’] interest in the
64 Emily Vogels et al., Teens, Social Media and Technology 2022, Pew Rsch. Ctr. (Aug. 10, 2022), https://www.pewresearch.org/internet/2022/08/10/teens-social-media-and-technology-2022/.
65 Social Media and Teens, Am. Acad. Child & Adolescent Psychiatry (Mar. 2018), https://www.aacap.org/AACAP/Families_and_Youth/Facts_for_Families/FFF-Guide/Social-Media-and-Teens-100.aspx.
66 Erinn Murphy et al., Taking Stock with Teens, Fall 2021 at 13, Piper Sandler (2021), tinyurl.com/89ct4p88.
67 See 15 U.S.C. §§ 6501-6506.
68 Id.
https://www.spiegel.de/international/zeitgeist/smartphone-addiction-is-part-of-the-design-a-1104237.html.
71 Id.
72 Gino Gugushvili et al., Facebook use intensity and depressive symptoms: a moderated mediation model of problematic Facebook use, age, neuroticism, and extraversion at 3, BMC Psych. 10, 279 (2022), https://doi.org/10.1186/s40359-022-00990-7.
73 Ernst Fehr & Simon Gächter, Fairness and Retaliation: The Economics of Reciprocity, 14(3) J. Econ. Persps.
82. Reciprocity is a known and widely used tactic of Defendants. It is why Facebook and Snapchat automatically tell a “sender when you ‘saw’ their message, instead of letting you avoid disclosing whether you read it. As a consequence, you feel more obligated to respond[,]” immediately.74 That keeps users on the platform, or—through push notifications, another dangerous tool—makes users feel psychologically compelled to return to and use the platform.

83. Additionally, Defendants manipulate users to keep using or returning to their platforms through the use of intermittent variable rewards (“IVR”). A commonly known example of IVR is the slot machine.75 Slot machines, like Defendants’ platforms, are designed to provide an intermittent reward that varies in value. IVR produces a dopamine response in the brain of the consumer, which in turn develops anticipation and craving. IVR is the fundamental way that gambling creates addiction, and Defendants use it to promote the same desire in a consumer’s brain to use their platforms.

84. Defendants use IVR heavily in the design and operation of their platforms by “link[ing] a user’s action (like pulling a lever) with a variable reward.”76 For example, when “we swipe down our finger to scroll the Instagram feed, we’re playing a slot machine to see what photo comes next.”77 The platform also delays the time it takes to load the feed. “This is because without that three-second delay, Instagram wouldn’t feel variable.”78 Without that delay, there would be no time for users’ anticipation to build. In slot machine terms, there would be “no sense of will I win? because you’d know instantly. So the delay isn’t the app loading. It’s the cogs spinning on the slot machine.”79 Each of the Defendants’ platforms exploits this biochemical reaction among its users, typically using “likes,” “hearts,” a thumbs up, or other forms of approval that serve as the reward.
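The reward schedule described in paragraphs 83 and 84 can be illustrated with a short, purely hypothetical Python sketch. This is illustrative only and is not any Defendant’s actual code; the function name, probabilities, and delay value are invented for the example. A fixed user action triggers an artificial delay, followed by a reward that arrives only sometimes and varies in size.

```python
import random
import time

def refresh_feed(delay_seconds: float = 3.0) -> int:
    """One 'pull of the lever': the user swipes down to refresh the feed."""
    time.sleep(delay_seconds)          # artificial pause while anticipation builds
    if random.random() < 0.5:          # the reward arrives only intermittently...
        return random.randint(1, 20)   # ...and varies in value (e.g., new likes)
    return 0                           # no reward on this swipe

# The action is identical every time, but the payoff is unpredictable in both
# timing and size: a variable-ratio schedule, as with a slot machine.
new_likes = refresh_feed(delay_seconds=0.0)  # zero delay here, for demonstration
print(new_likes)
```

The sketch shows only the alleged mechanism: an identical action, an artificial pause, and an intermittent, variable payoff.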
74 Von Tristan Harris, The Slot Machine in Your Pocket, Spiegel Int’l (July 27, 2016), https://www.spiegel.de/international/zeitgeist/smartphone-addiction-is-part-of-the-design-a-1104237.html.
75 See, e.g., Julian Morgans, The Secret Ways Social Media is Built for Addiction, Vice (May 17, 2017), https://www.vice.com/en/article/vv5jkb/the-secret-ways-social-media-is-built-for-addiction.
76 Von Tristan Harris, The Slot Machine in Your Pocket, Spiegel Int’l (July 27, 2016), https://www.spiegel.de/international/zeitgeist/smartphone-addiction-is-part-of-the-design-a-1104237.html.
77 Id.
78 Julian Morgans, The Secret Ways Social Media is Built for Addiction, Vice (May 17, 2017), https://www.vice.com/en/article/vv5jkb/the-secret-ways-social-media-is-built-for-addiction.
79 Id.
regular conversation, you don’t know if the other person liked it, or if anyone else liked it[.]”87 On Defendants’ platforms, kids, their friends, and even complete strangers can publicly deliver or withhold social rewards in the form of likes, comments, views, and follows.88

90. These social rewards release dopamine and oxytocin in the brains of children and adults alike, but there are two key differences, as Chief Science Officer Prinstein explained: “First, adults tend to have a fixed sense of self that relies less on feedback from peers. Second, adults have a more mature prefrontal cortex, an area that can help regulate emotional responses to social rewards.”89

91. Minors, by contrast, are in a “period of personal and social identity formation,” much of which “is now reliant on social media.”90 “Due to their limited capacity for self-regulation and their vulnerability to peer pressure,” adolescents “are at greater risk of developing mental disorder.”91

92. Together, Defendants have designed, promoted, marketed, and operated their social media platforms to maximize the number of minors who use their platforms and the time they spend on those platforms. Despite knowing that social media inflicts harms on children, Defendants have continued to create more advanced and adapted versions of their platforms with features curated to keep users engaged and maximize the amount of time they spend using their platforms. As a result of Defendants’ conduct in designing and marketing exploitative and manipulative platforms, minors spend excessive amounts of time on Defendants’ platforms.
87 Id.
88 Id.
89 Id.
90 Betul Keles et al., A systematic review: the influence of social media on depression, anxiety and psychological distress in adolescents, 25:1 Int’l J. Adolescence & Youth 79–93 (Mar. 3, 2019), https://www.researchgate.net/publication/331947590_A_systematic_review_the_influence_of_social_media_on_depression_anxiety_and_psychological_distress_in_adolescents/fulltext/5c94432345851506d7223822/A-systematic-review-the-influence-of-social-media-on-depression-anxiety-and-psychological-distress-in-adolescents.pdf.
91 Id.
93. Defendants’ campaigns and design were wildly successful. Most teenagers use the same five social media platforms: YouTube, TikTok, Instagram, Snapchat, and Facebook.92 Each of these platforms individually represents that it has a high number of teenage users.

1. Meta Intentionally Marketed to and Designed Their Social Media Platforms for Minor Users, Substantially Contributing to the Mental Health Crisis
a. The Meta Platform

94. The Meta platforms, including Facebook and Instagram, are among the most popular social networking platforms in the world, with more than 3.6 billion users worldwide.93

(i) Facebook

96. Since its release in 2004, Facebook has become the largest social network in the world. As of October 2021, Facebook had approximately 2.9 billion monthly active users,

97. When Facebook was first released, it was not widely used initially. Only students at certain colleges and universities could use the social media platform, and verification of college
universities around the world, after which Meta launched a high school version of Facebook that also required an invitation to join. In the early stages, exclusivity rather than profit was the focus.

99. Meta slowly expanded eligibility for Facebook to add additional users to its network.

100. In September 2006, Facebook became available to all internet users. Meta initially claimed that it was open only to persons aged 13 and older with a valid email address; however, on information and belief, Meta did not require any verification of a user’s age or identity, and
92 Emily Vogels et al., Teens, Social Media and Technology 2022, Pew Rsch. Ctr. (Aug. 10, 2022), https://www.pewresearch.org/internet/2022/08/10/teens-social-media-and-technology-2022/.
93 Felix Richter, Meta Reaches 3.6 Billion People Each Month, Statista (Oct. 29, 2021), https://www.statista.com/chart/2183/facebooks-mobile-users/.
94 See id.; S. Dixon, Number of Daily Active Facebook Users Worldwide as of 3rd Quarter 2022 (in Millions),
did not actually verify users’ email addresses, so underage users could easily register an account with and access Facebook.

101. Facebook then underwent a series of changes aimed at increasing user engagement, profits, and platform growth, without regard to user safety, including the following:

(a) In 2009, Facebook launched the “like” button;

(b) In 2011, Facebook launched Messenger, its direct messaging service, and started allowing people to subscribe to non-friends. In essence, Facebook Messenger can be used just like texting;

(c) In 2012, advertisements appeared on the Facebook news feed, and a real-time bidding system was launched through which advertisers could bid on users based on their visits to third-party websites;

(d) In 2014, Facebook’s facial recognition algorithm (DeepFace) reached precise accuracy in identifying faces;

(e) In 2015, Facebook made significant changes to its news feed algorithm to determine what content to show users, and launched its live-streaming service;

(f) In 2016, Facebook launched games for its social media platform, so that users could play games without having to leave the platform; and

(g) In 2017, Facebook launched Facebook Creator, an app for mobile video posts, which assists with content creation.

(ii) Instagram

102. Instagram is a social media platform that launched in 2010 and that Meta acquired for $1 billion in April 2012.

103. Instagram allows users to share photos and videos with other users, and to view other users’ photos and videos. These photos and videos appear on users’ Instagram “feeds,” which are endless.

104. After being acquired by Meta, Instagram experienced huge user growth, from approximately 10 million monthly active users in September 2012 to more than one billion
monthly active users worldwide today, including approximately 160 million users in the United States.95

105. The number of Instagram users continues to climb, and it has been projected to reach nearly one-third of the world’s internet users by 2025. “In 2021, there were 1.21 billion monthly active users of Meta’s Instagram, making up over 28 percent of the world’s internet users. By 2025, it has been forecast that there will be 1.44 billion monthly active users of the social media platform, which would account for 31.2 percent of global internet users.”96

106. Instagram’s user growth was driven by design and development changes to the Instagram platform that increased engagement at the expense of the health and well-being of Instagram’s users—especially the children using the platform.

107. Instagram continued its addictive methods when, in August 2020, it began promoting and recommending short videos to users, called Reels.97 Like TikTok, Instagram allows users to view an endless feed of Reels that are recommended and curated for users by Instagram’s algorithm.

108. Instagram has become the most popular photo-sharing social media platform among children in the United States. According to a Pew research study from 2021, approximately 72 percent of children aged 13–17 in the United States use Instagram.98

b. Meta Targets Minors

109. To maximize the revenue generated from advertisers, Meta has expended significant effort to attract minors, including teens and preteens, to its platforms by designing features that appeal to them. Meta also views teenagers as a way to attract other potential users, for example by using teenagers to recruit parents who want to participate in their children’s lives
95 S. Dixon, Number of Instagram Users Worldwide from 2020 to 2025 (in Billions), Statista (May 23, 2022), https://www.statista.com/statistics/183585/instagram-number-of-global-users/.
96 S. Dixon, Number of Instagram Users Worldwide from 2020 to 2025 (in Billions), Statista (Feb. 15, 2023), https://www.statista.com/statistics/183585/instagram-number-of-global-users/.
97 Introducing Instagram Reels, Instagram (Aug. 5, 2020), https://about.instagram.com/blog/announcements/introducing-instagram-reels-announcement.
98 Katherine Schaeffer, 7 Facts About Americans and Instagram, Pew Rsch. Ctr. (Oct. 7, 2021), https://www.pewresearch.org/fact-tank/2021/10/07/7-facts-about-americans-and-instagram/.
as well as younger siblings who look to older siblings as models for which social media platforms to use and how to use them.99

109. Meta explicitly targets minors. An internal Instagram marketing plan shows that Meta is aware “[i]f we lose the teen foothold in the U.S. we lose the pipeline” for growth.100 To ensure that did not happen, Meta’s Instagram devoted almost all of its $390 million annual marketing budget for 2018 to targeting teenagers.101

110. Meta also views preteens as a “valuable but untapped audience,” even contemplating “exploring playdates as a growth lever.”102 Meta formed a team to study preteens, designed more products for them, and focused its strategy on the “business opportunities” created.103

111. The Meta platforms are designed to be used by children and are actively marketed to children throughout the United States. Internal Meta documents establish that Meta spends hundreds of millions of dollars researching, analyzing, and marketing to children to find ways to make its platforms more appealing to these age groups and to maximize the time children spend on its platforms, as these age groups are seen as essential to Meta’s long-term profitability and market dominance.104 For instance, after Instagram’s founders left Meta in September 2018, “Facebook went all out to turn Instagram into a main attraction for young audiences,” and “began concentrating on the ‘teen time spent’ data point,” in order to “drive up the amount of time that teenagers were on the app with features including Instagram Live, a broadcasting tool, and Instagram TV, where people upload videos that run as long as an hour.”105
99 Sheera Frenkel et al., Instagram Struggles with Fears of Losing Its ‘Pipeline’: Young Users, N.Y. Times (Oct. 26, 2021), https://www.nytimes.com/2021/10/16/technology/instagram-teens.html.
100 Id.
101 Id.
102 Id.
103 Georgia Wells & Jeff Horwitz, Facebook’s Effort to Attract Preteens Goes Beyond Instagram Kids, Documents Show; It has investigated how to engage young users in response to competition from Snapchat, TikTok; ‘Exploring playdates as a growth lever,’ Wall St. J. (Sept. 28, 2021), https://www.wsj.com/articles/facebook-instagram-kids-tweens-attract-11632849667 (last visited Mar. 21, 2023).
104 Id.
105 Sheera Frenkel et al., Instagram Struggles with Fears of Losing Its ‘Pipeline’: Young Users, N.Y. Times (Oct.
112. Similarly, Instagram’s popularity among young people is the direct result of Meta’s deliberate efforts to target children—which in turn is driven by the desire of advertisers and marketers to target children on Meta’s platforms. In fact, Meta’s acquisition of Instagram was primarily motivated by its desire to make up for declines in the use of Facebook by children, and Meta views Instagram as central to its ability to attract and retain young audiences. A 2018 internal Meta marketing report exposes this, bemoaning the loss of teenage users to competitors’ platforms as “an existential threat.”106 In contrast, a Meta presentation from 2019 indicated that “Instagram is well positioned to resonate and win with young people,” and “[t]here is a path to growth if Instagram can continue their trajectory.”107

113. With respect to preteens, Meta’s policy is that they cannot register an account, but it knowingly disregards and fails to enforce this policy. Since at least 2011, Meta has known that its age-verification protocols are largely inadequate, estimating at that time that it removed 20,000 children under age 13 from Facebook every day.108 In 2021, Adam Mosseri, the Meta executive in charge of Instagram, acknowledged users under 13 can still “lie about [their] age now” to register an account.109
114. Meta has yet to implement protocols to verify a user’s age, likely because it has strong business incentives not to. Meta also has agreements with cell phone manufacturers and/or providers and/or retailers, who often pre-install its platforms on mobile devices prior to sale and without regard to the age of the intended user of each such device. That is, even though Meta is prohibited from providing the Meta platforms to users under the age of 13, Meta knowingly and actively promotes and provides underage users access to its platforms by
21 knowingly and actively promotes and provides underage users access to its platforms by
22 106 Id.
107 Georgia Wells et al., Facebook Knows Instagram Is Toxic for Teen Girls, Company Documents Show; Its own
23 in-depth research shows a significant teen mental-health issue that Facebook plays down in public, Wall St. J.
(Sept. 14, 2021), https://www.wsj.com/articles/facebook-knows-instagram-is-toxic-for-teen-girls-company-
24 documents-show-11631620739.
108 Austin Carr, Facebook Booting “20,000” Underage Users Per Day: Reaction to Growing Privacy Concerns?,
25 Fast Co. (Mar. 22, 2011), https://www.fastcompany.com/1741875/facebook-booting-20000-underage-users-day-
reaction-growing-privacy-concerns.
26 109 Georgia Wells & Jeff Horwitz, Facebook’s Effort to Attract Preteens Goes Beyond Instagram Kids, Documents
27 Show; It has investigated how to engage young users in response to competition from Snapchat, TikTok;
'Exploring playdates as a growth lever, Wall St. J. (Sept. 28, 2021), https://www.wsj.com/articles/facebook-
28 instagram-kids-tweens-attract-11632849667.
Complaint
27
Case 3:23-cv-01774-SK Document 1 Filed 04/12/23 Page 31 of 107
encouraging and allowing cell phone manufacturers to pre-install the platforms on mobile devices indiscriminately. Consequently, approximately 11 percent of United States children between the ages of 9 and 11 used Instagram in 2020,110 despite Meta claiming to remove approximately 600,000 underage users per quarter.111

115. Meta’s efforts to attract young users have been successful.

c. Meta Intentionally Maximizes the Time Users Spend on its Platforms
116. The Meta platforms are designed to maximize time spent on the platform, utilizing features that exploit the natural human need for social interaction and the neurophysiology of the brain’s reward systems to keep users endlessly scrolling, posting, “liking,” commenting, and returning to the app to check engagement on their posts. Minors’ brains, which are in developmental stages, are especially vulnerable to such misuse.

117. One of the ways in which Meta employs IVR is through its push notifications, which promote habitual use and are designed to prompt users to open the app and be exposed to content selected to maximize the use of Meta’s platforms. In particular, Meta purposefully delays notifications of likes and comments, delivering them in several bursts rather than in real time, so as to create dopamine responses that leave users craving more and that promote addiction. Meta’s push notifications are specifically designed to manipulate users into reengaging with Meta’s platforms to increase user engagement regardless of a user’s age.

118. Meta also exploits IVR to manipulate users with one of its most defining features: the “Like” button. Meta is aware that “Likes” are a source of social comparison harm for many users, as detailed below. Several former Meta employees involved in creating the Like button
110 Brooke Auxier et al., Parenting Children in the Age of Screens, Pew Rsch. Ctr. (July 28, 2020), https://www.pewresearch.org/internet/2020/07/28/childrens-engagement-with-digital-devices-screen-time/.
111 Georgia Wells & Jeff Horwitz, Facebook’s Effort to Attract Preteens Goes Beyond Instagram Kids, Documents Show; It has investigated how to engage young users in response to competition from Snapchat, TikTok; ‘Exploring playdates as a growth lever,’ Wall St. J. (Sept. 28, 2021), https://www.wsj.com/articles/facebook-instagram-kids-tweens-attract-11632849667.
have spoken publicly about the manipulative nature of Meta’s platforms and the harm they cause users.112

119. Additionally, Meta designed other features of its platforms on IVR principles, such as posts, comments, tagging, and the “pull to refresh” feature, which enables a user to scroll endlessly through content.

120. Other design decisions were motivated by social reciprocity, such as the use of visual cues to reflect that someone is currently typing a response to a message, which keeps the user on the platform longer awaiting the response, and the provision of read receipts for messages.

121. The Meta platforms are designed to encourage users to post content and to interact with other users’ posts. Each new post that appears on a user’s feed functions as a dopamine-producing social interaction in the user’s brain. Similarly, likes, comments, and other interactions with a user’s posts function as an even stronger dopamine-producing stimulus than does seeing new posts from other users. Thus, users are motivated to post content they expect will encourage interaction. Meta has purposefully designed its platforms to essentially trap users (especially minors) in endless cycles of “little dopamine loops.”113
d. Meta’s Algorithms Are Manipulative and Harmful

122. Meta designs and employs advanced algorithms and artificial intelligence to keep its platforms as engaging and habit-forming as possible. For example, the Meta platforms display curated content and recommendations that are customized to each user by using sophisticated algorithms. The proprietary services developed through such algorithms include: News Feed (a feed of stories and posts published on the platform, some of which are posted by connections/friends and others of which are picked by Meta’s algorithms), People You May Know (algorithm-based suggestions of persons or accounts), Suggested for You, and Discover (algorithm-based recommendations). Such algorithm-based content and
112 See, e.g., Paul Lewis, “‘Our minds can be hijacked’: the tech insiders who fear a smartphone dystopia”, The Guardian (Oct. 6, 2017), https://www.theguardian.com/technology/2017/oct/05/smartphone-addiction-silicon-valley-dystopia (last visited Mar. 23, 2023).
113 Allison Slater Tate, Facebook whistleblower Frances Haugen says parents make 1 big mistake with social media, Today (Feb. 7, 2022), https://www.today.com/parents/teens/facebook-whistleblower-frances-haugen-rcna15256 (last visited Mar. 30, 2023).
recommendations are presented to each user while the user is on the platform, and through notifications sent to the user’s smartphone and email address when the user is not on the platform.

123. Meta’s algorithms are not based exclusively on user requests, or even user inputs. Meta’s algorithms combine information entered or posted by the user on the platform with the user’s demographics and other data points collected and analyzed by Meta, make assumptions about that user’s interests and preferences, make predictions about what else might appeal to the user, and then make very specific recommendations of posts and pages to view and groups to visit and join, based on rankings that will optimize Meta’s key performance indicators. Meta’s design dictates the way content is presented, such as its ranking and prioritization.114
124. Meta’s current use of algorithms in its platforms is driven by and designed to maximize user engagement. Recently, Meta has transitioned away from chronological ranking, which organized the interface according to when content was posted or sent, to prioritize Meaningful Social Interactions (“MSI”), which emphasizes users’ connections and interactions such as likes and comments and gives greater significance to the interactions of connections that appear to be the closest to users. Meta developed and employed an “amplification algorithm” to execute engagement-based ranking, which considers a post’s likes, shares, and comments, as well as a user’s past interactions with similar content, and shows the post in the user’s newsfeed if it otherwise meets certain benchmarks.
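The engagement-based ranking alleged in paragraph 124 can be sketched in purely illustrative Python. The field names, scoring weights, and benchmark below are invented for this example and are not Meta’s actual algorithm; the sketch only shows the alleged mechanism: score each post from its likes, shares, and comments plus the user’s past interactions with similar content, drop posts below a benchmark, and order the rest by score instead of by time.

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    topic: str
    likes: int
    shares: int
    comments: int

def rank_feed(posts, past_interactions, benchmark=10.0):
    """Return posts meeting `benchmark`, highest engagement score first."""
    def score(p: Post) -> float:
        affinity = past_interactions.get(p.topic, 0)  # user's prior engagement
        return p.likes + 2 * p.shares + 3 * p.comments + 5 * affinity
    eligible = [p for p in posts if score(p) >= benchmark]
    return sorted(eligible, key=score, reverse=True)

posts = [
    Post("a", "sports",  likes=1, shares=0, comments=0),  # score 1: filtered out
    Post("b", "fitness", likes=4, shares=1, comments=2),  # score 4+2+6+10 = 22
    Post("c", "fitness", likes=0, shares=0, comments=1),  # score 3+10 = 13
]
ranked = rank_feed(posts, past_interactions={"fitness": 2})
print([p.author for p in ranked])  # prints ['b', 'c']: ordered by score, not time
```

Under such a scheme, a post’s recency is irrelevant; whatever provokes the most reactions surfaces first, which is the dynamic the complaint alleges.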
125. Meta’s algorithms secretly operate on the principle that intense reactions compel attention. Because these algorithms measure reactions and contemporaneously immerse users in the most reactive content, they effectively steer users toward the most negative content, because negative content routinely elicits passionate reactions.

126. Due to its focus on user engagement, Meta’s algorithms promote content that is objectionable and harmful to many users, including minors. Meta was well aware of the harmful content that it was promoting, yet failed to change its algorithms, because the inflammatory
114 See, e.g., Adam Mosseri, Shedding More Light on How Instagram Works, Instagram (June 8, 2021), https://about.instagram.com/blog/announcements/shedding-more-light-on-how-instagram-works.
1 content that its algorithms were feeding to users fueled their return to the platforms and led to
2 more engagement—which in turn helped Meta sell more advertisements, and more profit.
3 Meta’s algorithms promote harmful content because such content increases user engagement,
4 which thereby increases its appeal to advertisers and increases its overall value and profitability.
127. Meta’s shift from chronological ranking to algorithm-driven content and recommendations has changed the Meta platforms in ways that are dangerous and harmful to children, whose psychological susceptibility to habit-forming platforms puts them at greater risk of harm from the Meta platforms’ exploitive and harmful features. In this regard, the algorithms used by Meta’s platforms exploit child users’ diminished decision-making capacity, impulse control, emotional maturity, and psychological resiliency caused by users’ incomplete brain development, and Meta specifically designs its platforms with these vulnerabilities in mind.
e. Facebook’s and Instagram’s Harmful “Feeds”
128. A primary feature of Facebook and Instagram is the promotion to each user of a “feed” generated by an algorithm for that user, which consists of a series of photos, videos, and text posts posted by accounts that the user follows, along with advertising and content specifically selected by algorithms and promoted by Meta.
129. These feeds are endless lists of content that encourage users to scroll continuously without any natural end point, making it less likely that the user will leave the app. In this regard, “[u]nlike a magazine, television show, or video game,” the Meta platforms only rarely prompt their users to take a break by using “stopping cues.”115 Meta’s “bottomless scrolling” feature is designed to encourage users to use its platforms for unlimited periods of time.
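The contrast between media with “stopping cues” and a bottomless feed can be sketched as follows. This is an illustrative sketch, not actual platform code; the content source and batch size are hypothetical.

```python
import itertools

# Illustrative contrast (not actual platform code): a magazine has a
# fixed number of pages and ends -- a built-in stopping cue. A
# "bottomless" feed refills itself every time the user reaches the end,
# so iteration never stops on its own.

def magazine(pages):
    # Finite: iteration ends after the last page.
    for page in pages:
        yield page

def bottomless_feed(fetch_batch):
    # Infinite: whenever one batch is exhausted, fetch another.
    for batch_number in itertools.count():
        for item in fetch_batch(batch_number):
            yield item

# A hypothetical content source that always has another batch ready.
def fetch_batch(n):
    return [f"post-{n}-{i}" for i in range(3)]

finite = list(magazine(["p1", "p2", "p3"]))  # stops after 3 items
# The endless feed must be cut off externally; here we take only 10.
endless = list(itertools.islice(bottomless_feed(fetch_batch), 10))
```

The magazine iterator terminates by itself; the feed iterator terminates only because the caller imposes a limit, which is the structural difference the quoted “stopping cues” point describes.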
130. Meta also controls a user’s feed through certain ranking mechanisms, escalation loops, and promotion of advertising and content specifically selected and promoted by Meta based on, among other things, its ongoing planning, assessment, and prioritization of the types of information most likely to increase user engagement.
115 See Zara Abrams, How Can We Minimize Instagram’s Harmful Effects?, Am. Psych. Ass’n (Dec. 2, 2021), https://www.apa.org/monitor/2022/03/feature-minimize-instagram-effects.
131. As described above, the algorithms generating a user’s feed encourage excessive use and promote harmful content, particularly where the algorithm is designed to prioritize the number of interactions rather than the quality of interactions.
132. Meta utilizes private information of its child users to “precisely target [them] with content and recommendations, assessing what will provoke a reaction,” including encouragement of “destructive and dangerous behaviors,” which is how Meta “can push teens into darker and darker places.”116 As such, Meta’s “amplification algorithms, things like engagement based ranking . . . can lead children . . . all the way from just something innocent like healthy recipes to anorexia promoting content over a very short period of time.”117 Meta thus specifically selects and pushes this harmful content on its platforms, for which it is then paid, and does so both for direct profit and also to increase user engagement, resulting in additional profits down the road.
133. Meta’s Instagram platform features a feed of “Stories,” which are short-lived photo or video posts that are accessible for only 24 hours. This feature encourages constant, repeated, and compulsive use of Instagram, so that users do not miss out on content before it disappears. As with other feeds, the presentation of content in a user’s Stories is generated by an algorithm designed by Meta to maximize the amount of time a user spends on the app.
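The 24-hour expiry that drives this fear of missing out reduces to a simple visibility check. The sketch below is illustrative only; the field names and timestamps are hypothetical, not drawn from any platform’s code.

```python
# Illustrative sketch of 24-hour ephemeral content (hypothetical field
# names, not actual platform code): a story is visible only within 24
# hours of posting, so a user who waits too long misses it permanently.

DAY_SECONDS = 24 * 60 * 60

def visible_stories(stories, now):
    # Keep only stories posted within the last 24 hours; everything
    # older has silently disappeared.
    return [s for s in stories if now - s["posted_at"] < DAY_SECONDS]

stories = [
    {"id": "fresh", "posted_at": 90_000},   # 10,000 s old at now=100,000
    {"id": "stale", "posted_at": 10_000},   # 90,000 s old -- expired
]
current = visible_stories(stories, now=100_000)
```

Because expiry is a hard cutoff rather than an archive, the only way for a user to guarantee seeing content is to check the app at least once a day, which is the compulsion the paragraph above alleges.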
134. Instagram also features a feed called “Explore,” which displays content posted by users not previously “followed.” The content in Explore is selected and presented by an algorithm designed by Meta to maximize the amount of time a user spends on the app. As with other feeds, the Explore feature may be scrolled endlessly, and its algorithm will continually generate new recommendations, encouraging users to use the app for unlimited periods of time.
135. Instagram features a feed called “Reels,” which presents short video posts by users not previously followed. These videos play automatically, without input from the user, encouraging the user to stay on the app for indefinite periods of time. As with other feeds, Reels content is selected and presented by an algorithm designed by Meta to maximize the amount of time a user spends on the app.
116 See Facebook Whistleblower Frances Haugen Testifies on Children & Social Media Use: Full Senate Hearing Transcript at 09:02, Rev (Oct. 5, 2021), https://www.rev.com/blog/transcripts/facebook-whistleblower-frances-haugen-testifies-on-children-social-media-use-full-senate-hearing-transcript.
117 Id. at 37:34 (statement of Ms. Frances Haugen).
researchers have repeatedly found that Instagram is harmful for a sizable percentage of teens that use the platform.122
140. In particular, the researchers found that “[s]ocial comparison,” or people’s assessment of their own value relative to that of others, is “worse on Instagram” for teens than on other social media platforms.123 One in five teens reported that Instagram “makes them feel worse about themselves.”124 Roughly two in five teen users reported feeling “unattractive,” while one in ten teen users reporting suicidal thoughts traced those thoughts to Instagram.125 Teens “consistently” and without prompting blamed Instagram “for increases in the rate of anxiety and depression.”126 And although teenagers identify Instagram as a source of psychological harm, they often lack the self-control to use Instagram less. According to Meta’s own researchers, young users are not capable of controlling their Instagram use to protect their own health.127 Such users “often feel ‘addicted’ and know that what they’re seeing is bad for their mental health but feel unable to stop themselves.”128
141. Similarly, in a March 2020 presentation posted to Meta’s internal message board, researchers found that “[t]hirty-two percent of teen girls said that when they felt bad about their bodies, Instagram made them feel worse.”129 Sixty-six percent of teen girls and 40 percent of teen boys have experienced negative social comparison harms on Instagram.130 Further, approximately 13 percent of teen-girl Instagram users say the platform makes thoughts of “suicide and self harm” worse, and 17 percent of teen-girl Instagram users say the platform makes “[e]ating issues” worse.131 Meta’s researchers also acknowledged that “[m]ental health outcomes” related to the use of Instagram “can be severe,” including “Body Dissatisfaction,” “Body Dysmorphia,” “Eating Disorders,” “Loneliness,” and “Depression.”132

122 Id.
123 Id.
124 Id.
125 Id.
126 Id.
127 Id.
128 Id.
129 Id.; see also Teen Girls Body Image and Social Comparison on Instagram—An Exploratory Study in the U.S., Wall St. J. (Sept. 29, 2021), https://s.wsj.net/public/resources/documents/teen-girls-body-image-and-social-comparison-on-instagram.pdf; Hard Life Moments-Mental Health Deep Dive at 14, Facebook (Nov. 2019), https://about.fb.com/wp-content/uploads/2021/09/Instagram-Teen-Annotated-Research-Deck-1.pdf; Paul Marsden, The ‘Facebook Files’ on Instagram harms – all leaked slides on a single page at slide 14, Dig. Wellbeing (Oct. 20, 2021), https://digitalwellbeing.org/the-facebook-files-on-instagram-harms-all-leaked-slides-on-a-single-page (hard life moments – mental health deep dive).
130 Teen Girls Body Image and Social Comparison on Instagram—An Exploratory Study in the U.S. at 9, Wall St. J. (Sept. 29, 2021), https://s.wsj.net/public/resources/documents/teen-girls-body-image-and-social-comparison-on-instagram.pdf.
142. The leaked documents show that Meta is aware of the harmful nature of its platforms and of the specific design features that lead to excessive use and harm to children. For instance, Meta knows that Instagram’s Explore, Feed, and Stories features contribute to social comparison harms “in different ways.” Moreover, specific “[a]spects of Instagram exacerbate each other to create a perfect storm” of harm to users,133 and the “social comparison sweet spot” (a place of considerable harm to users, particularly teenagers and teen girls) lies at the center of Meta’s model and its platforms’ features.134 In this regard, Meta’s researchers wrote that “[s]ocial comparison and perfectionism are nothing new, but Instagram is ‘the reason’ why there are higher levels of anxiety and depression in young people.”135
2. Snapchat Intentionally Marketed to and Designed Its Social Media Platform for Minor Users and Has Substantially Contributed to the Youth Mental Health Crisis
143. Snapchat was created in 2011 by Stanford University students Evan Spiegel and Bobby Murphy, who serve as Snap Inc.’s CEO and CTO, respectively.136
144. Snapchat started as a photo-sharing platform that allows users to form groups and share photos, known as “snaps,” that disappear after being viewed by the recipients. Snapchat became well known for this self-destructing content feature. But Snapchat quickly evolved beyond functioning only as a photo-sharing app, as Snap made design changes and rapidly developed new features targeting teens, which ultimately increased Snapchat’s popularity among minors.

131 Hard Life Moments-Mental Health Deep Dive at 14, Facebook (Nov. 2019), https://about.fb.com/wp-content/uploads/2021/09/Instagram-Teen-Annotated-Research-Deck-1.pdf; Paul Marsden, The ‘Facebook Files’ on Instagram harms – all leaked slides on a single page at slide 14, Dig. Wellbeing (Oct. 20, 2021), https://digitalwellbeing.org/the-facebook-files-on-instagram-harms-all-leaked-slides-on-a-single-page.
132 Teen Girls Body Image and Social Comparison on Instagram—An Exploratory Study in the U.S. at 34, Wall St. J. (Sept. 29, 2021), https://s.wsj.net/public/resources/documents/teen-girls-body-image-and-social-comparison-on-instagram.pdf.
133 Id. at 31.
134 Id. at 31.
135 See Hard Life Moments-Mental Health Deep Dive at 53, Facebook (Nov. 2019), https://about.fb.com/wp-content/uploads/2021/09/Instagram-Teen-Annotated-Research-Deck-1.pdf.
136 Katie Benner, How Snapchat is Shaping Social Media, N.Y. Times (Nov. 30, 2016), https://www.nytimes.com/2016/11/30/technology/how-snapchat-works.html.
145. In 2012, Snap added video-sharing capabilities, pushing the number of “snaps” to 50 million per day.137 A year later, Snap added the “Stories” function, which allows users to upload a rolling compilation of snaps that the user’s connections can view for 24 hours.138 The following year, Snap added a feature that enabled users to communicate with one another in real time “via text or video.”139 It also added the “Our Story” feature, expanding on the original Stories function by allowing users in the same location to add their photos and videos to a single publicly viewable content stream.140 Snap also gave users the capability to add filters and graphic stickers onto photos showing a user’s location, through a feature it refers to as “Geofilters.”141
146. In 2015, Snap added a “Discover” feature that promotes videos from news outlets and other content creators.142 Users can view content by scrolling through the Discover feed. After a selected video ends, Snapchat automatically plays other video content in a never-ending stream until the user manually exits the stream.
147. In 2020, Snap added the “Spotlight” feature, through which it serves users “an endless feed of user-generated content” Snap curates from the 300 million daily Snapchat users.143
148. Today Snapchat is one of the largest social media platforms in the world. By its own estimates, Snapchat has 363 million daily users, including 100 million daily users in North America.144 Snapchat reaches 90 percent of people ages 13–24 in over twenty countries and reaches nearly half of all smartphone users in the United States.145

137 J.J. Colao, Snapchat Adds Video, Now Seeing 50 Million Photos A Day, Forbes (Dec. 14, 2012), https://www.forbes.com/sites/jjcolao/2012/12/14/snapchat-adds-video-now-seeing-50-million-photos-a-day/?sh=55425197631b.
138 Ellis Hamburger, Snapchat’s Next Big Thing: ‘Stories’ That Don’t Just Disappear, Verge (Oct. 3, 2013), https://www.theverge.com/2013/10/3/4791934/snapchats-next-big-thing-stories-that-dont-just-disappear.
139 Romain Dillet, Snapchat Adds Ephemeral Text Chat and Video Calls, TechCrunch (May 1, 2014), https://techcrunch.com/2014/05/01/snapchat-adds-text-chat-and-video-calls/.
140 Laura Stampler, Snapchat Just Unveiled a New Feature, Time (June 17, 2014), https://time.com/2890073/snapchat-new-feature/.
141 Angela Moscaritolo, Snapchat Adds ‘Geofilters’ in LA, New York, PC Mag. (July 15, 2014), https://www.pcmag.com/news/snapchat-adds-geofilters-in-la-new-york.
142 Steven Tweedie, How to Use Snapchat’s New ‘Discover’ Feature, Bus. Insider (Jan. 27, 2015), https://www.businessinsider.com/how-to-use-snapchat-discover-feature-2015-1.
143 Salvador Rodriguez, Snap is launching a competitor to TikTok and Instagram Reels, CNBC (Nov. 23, 2020), https://www.cnbc.com/2020/11/23/snap-launching-a-competitor-to-tiktok-and-instagram-reels.html.
a. Snap Designs and Markets Its Platform to Minors
149. Snapchat’s commercial success is due to its advertising. In 2014, Snap began running advertisements on Snapchat.146 Since then, Snapchat’s business model has revolved around its advertising revenue, which has boomed. Snap expects to generate $4.86 billion in Snapchat advertising revenue for 2022.147

150. Snap specifically markets Snapchat to children ages 13–17 because they are a key demographic for Snap’s advertising business. Internal documents describe users between the ages of 13 and 34 as “critical” to Snap’s advertising success because of the common milestones achieved within that age range.148
151. In addition to its marketing, Snap has targeted a younger audience by designing Snapchat in a manner that older individuals find hard to use.149 The effect of this design is that Snapchat is a platform whose young users are insulated from older users, including their teachers and their parents. Snap is well aware of this effect, as Snap’s CEO has boasted, “[w]e’ve made it very hard for parents to embarrass their children[.]”150

152. Snap also designed Snapchat in a way that enables minor users to hide content from their parents by ensuring that photos, videos, and chat messages quickly disappear. This design further insulates children from adult oversight.
144 October 2022 Investor Presentation at 5, Snap Inc. (Oct. 20, 2022), https://investor.snap.com/events-and-presentations/presentations/default.aspx.
145 Id. at 6–7.
146 Sara Fischer, A timeline of Snap’s advertising, from launch to IPO, Axios (Feb. 3, 2017), https://www.axios.com/2017/12/15/a-timeline-of-snaps-advertising-from-launch-to-ipo-1513300279.
147 Bhanvi Staija, TikTok's ad revenue to surpass Twitter and Snapchat combined in 2022, Reuters (Apr. 11, 2022), https://www.reuters.com/technology/tiktoks-ad-revenue-surpass-twitter-snapchat-combined-2022-report-2022-04-11/.
148 October 2022 Investor Presentation at 27, Snap Inc. (Oct. 20, 2022), https://investor.snap.com/events-and-presentations/presentations/default.aspx.
149 See Hannah Kuchler & Tim Bradshaw, Snapchat’s Youth Appeal Puts Pressure on Facebook, Fin. Times (Aug.
https://www.bloomberg.com/features/2016-how-snapchat-built-a-business/.
153. Moreover, Snap added as a feature the ability for users to create cartoon avatars modeled after themselves.151 By using an art form generally associated with and directed at younger audiences, Snap further designed Snapchat to attract teenagers and younger children.
154. In 2013, Snap also marketed Snapchat specifically to kids under 13 through a feature it named “SnapKidz.”152 This feature, part of the Snapchat platform, allowed children under 13 to take photos, draw on them, and save them locally on the device.153 Kids could also send these images to others or upload them to other social media sites.154
155. Although the SnapKidz feature was later discontinued and Snap claims to now prohibit users under the age of 13, its executives have admitted that its age verification “is effectively useless in stopping underage users from signing up to the Snapchat app.”155 Snap’s purported safeguards are nothing more than a façade.
156. Snap’s efforts to attract young users have been successful. Teenagers consistently name Snapchat as a favorite social media platform. The latest figures show that 13 percent of children ages 8–12 used Snapchat in 2021,156 and almost 60 percent of children ages 13–17 use Snapchat.157
b. Snap Intentionally Designs and Markets Exploitative Methods to Increase the Time Users Spend on its Platform
158. Snap promotes excessive use of its platform through design features and manipulative algorithms intended to maximize users’ screen time.
151 Kif Leswing, Snapchat just introduced a feature it paid more than $100 million for, Bus. Insider (July 19, 2016), https://www.businessinsider.com/snapchat-just-introduced-a-feature-it-paid-more-than-100-million-for-2016-7.
152 Larry Magid, Snapchat Creates SnapKidz – A Sandbox for Kids Under 13, Forbes (June 23, 2013), https://www.forbes.com/sites/larrymagid/2013/06/23/snapchat-creates-snapkidz-a-sandbox-for-kids-under-13/?sh=7c682a555e5a.
153 Id.
154 Id.
155 Isobel Asher Hamilton, Snapchat admits its age verification safeguards are effectively useless, Bus. Insider
https://www.pewresearch.org/internet/2022/08/10/teens-social-media-and-technology-2022/.
159. Snap has implemented inherently and intentionally exploitive features into Snapchat, including “Snapstreaks” (tracking and displaying how many consecutive days two users reply to each other), various trophies and reward systems, quickly disappearing messages, and filters. Snap designed these features, along with others, to maximize the amount of time users spend on its platform.
160. Snaps are intended to manipulate users by activating the rule of reciprocation.158 Whenever a user receives a snap, they feel obligated to send a return snap. Snapchat tells users each time they receive a snap by pushing a notification to the recipient’s cellphone. These notifications are designed to motivate users to open Snapchat and view content, increasing the amount of time users spend on Snapchat. Further, because snaps disappear within ten seconds of being viewed, users feel compelled to reply immediately. This disappearing nature of snaps is a defining characteristic of Snapchat and is intended to keep users on the platform.
161. Snap also keeps users coming back to the Snapchat platform through the “Snapstreaks” feature.159 A “streak” is a counter within Snapchat that tracks how many consecutive days two users have sent each other snaps. If a user fails to snap the other user within 24 hours, the streak ends. Snap adds extra urgency by putting an hourglass emoji next to a friend’s name if a Snapstreak is about to end.160 This design implements a system where a user must “check constantly or risk missing out.”161 And this feature is particularly effective on teenage and minor users. “For teens in particular, streaks are a vital part of using the app, and of their social lives as a whole.”162 Some children become so obsessed with maintaining a Snapstreak that they give their friends access to their accounts when they may be away from their phone for a day or more, such as on vacation.163

158 Nir Eyal, The Secret Psychology of Snapchat, Nir & Far (Apr. 14, 2015), https://www.nirandfar.com/psychology-of-snapchat/.
159 See Avery Hartmans, These are the sneaky ways apps like Instagram, Facebook, Tinder lure you in and get you ‘addicted’, Bus. Insider (Feb. 17, 2018), https://www.businessinsider.com/how-app-developers-keep-us-addicted-to-our-smartphones-2018-1#snapchat-uses-snapstreaks-to-keep-you-hooked-13; see generally Virginia Smart & Tyana Grundig, ‘We’re designing minds’: Industry insider reveals secrets of addictive app trade, CBC (Nov. 3, 2017), https://www.cbc.ca/news/science/marketplace-phones-1.4384876; Julian Morgans, The Secret Ways Social Media is Built for Addiction, Vice (May 17, 2017), https://www.vice.com/en/article/vv5jkb/the-secret-ways-social-media-is-built-for-addiction.
160 Lizette Chapman, Inside the Mind of a Snapchat Streaker, Bloomberg (Jan. 30, 2017), https://www.bloomberg.com/news/features/2017-01-30/inside-the-mind-of-a-snapchat-streaker.
161 Id.
162 Avery Hartmans, These are the sneaky ways apps like Instagram, Facebook, Tinder lure you in and get you
162. Snap also designed features that operate on IVR principles to maximize the time users spend on its platform. The “rewards” come in the form of a user’s “Snapscore” (which increases with each snap a user sends and receives) and other signals of recognition similar to the “likes” used on other platforms. The increases in Snapscore and the other trophies and charms users can earn by using the app operate on variable reward patterns. Like Snapstreaks, these features are designed to incentivize sending snaps and increase the amount of time users spend on Snapchat.
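The streak and variable-reward mechanics alleged in the two preceding paragraphs can be illustrated with a short sketch. The thresholds, reward odds, and field names below are hypothetical, not Snap’s actual implementation; the point is only to show how a 24-hour streak deadline and an unpredictable reward schedule each pull a user back to the app.

```python
import random

DAY = 24 * 60 * 60            # seconds in a day
HOURGLASS_WINDOW = 4 * 3600   # hypothetical: warn in the streak's last 4 hours

def streak_status(streak_days, last_mutual_snap, now):
    """Return (streak_days, hourglass_shown) after the 24-hour check."""
    elapsed = now - last_mutual_snap
    if elapsed >= DAY:
        return 0, False                      # streak lost: counter resets
    return streak_days, DAY - elapsed <= HOURGLASS_WINDOW

def send_snap(state, rng):
    """Every snap raises the score; bonus rewards arrive unpredictably."""
    state["snapscore"] += 1                  # predictable increment per snap
    if rng.random() < 0.1:                   # variable-ratio schedule:
        state["trophies"] += 1               # roughly 1 in 10 snaps, at random
    return state

# A streak checked 23 hours after the last mutual snap survives, with the
# hourglass warning showing; at 25 hours it has reset to zero.
alive = streak_status(50, last_mutual_snap=0, now=23 * 3600)
lost = streak_status(50, last_mutual_snap=0, now=25 * 3600)

rng = random.Random(0)                       # seeded for reproducibility
state = {"snapscore": 0, "trophies": 0}
for _ in range(100):
    send_snap(state, rng)
```

The streak deadline imposes a daily obligation, while the randomized bonus mirrors the variable-ratio reinforcement pattern the complaint describes; neither reward is tied to the content of any snap.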
163. Snap also designs photo and video filters and lenses, which are central to Snapchat’s function, in a way that further maximizes the amount of time users spend on Snapchat. One way Snap uses its filters to hook minor users is by creating temporary filters that impose a sense of urgency to use them before they disappear. Another way Snap designed its filters to increase screen use is through gamification: many filters include games,164 creating competition between users by having them send each other snaps with scores. Snap also tracks data on the most commonly used filters and develops new filters based on this data.165 Snap personalizes filters to further entice individuals to use Snapchat more.166 Snap designs and modifies these filters to maximize the amount of time users spend on Snapchat.
c. Snapchat’s Algorithms Are Manipulative and Harmful
164. Snap also uses complex algorithms to suggest friends to users and recommend content in order to keep users on Snapchat.
‘addicted’, Bus. Insider (Feb. 17, 2018), https://www.businessinsider.com/how-app-developers-keep-us-addicted-to-our-smartphones-2018-1#snapchat-uses-snapstreaks-to-keep-you-hooked-13; see generally Cathy Becker, Experts warn parents how Snapchat can hook in teens with streaks, ABC News (July 27, 2017), https://abcnews.go.com/Lifestyle/experts-warn-parents-snapchat-hook-teens-streaks/story?id=48778296.
163 Caroline Knorr, How to resist technology addiction, CNN (Nov. 9, 2017), https://www.cnn.com/2017/11/09/health/science-of-tech-obsession-partner/index.html; Jon Brooks, 7 Specific Tactics Social Media Companies Use to Keep You Hooked, KQED (June 9, 2017), https://www.kqed.org/futureofyou/397018/7-specific-ways-social-media-companies-have-you-hooked.
164 Josh Constine, Now Snapchat Has ‘Filter Games’, TechCrunch (Dec. 23, 2016), https://techcrunch.com/2016/12/23/snapchat-games/.
165 How We Use Your Information, Snap Inc., https://snap.com/en-US/privacy/your-information (last visited Dec. 8, 2022).
166 Id.
165. Snap suggests new friends to users based on an equation Snap uses to determine whether to recommend that one user add another as a friend on Snapchat. This feature is known as “Quick Add.” By using an algorithm to suggest friends, Snapchat increases the odds that users will add additional friends, send additional snaps, and spend more time on Snapchat.
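A friend-suggestion heuristic of the kind described can be sketched as follows. This is not Snap’s actual “Quick Add” equation, which is not public; the sketch uses mutual-connection counts, a common proxy signal in friend-recommendation systems, and the function and variable names are hypothetical.

```python
# Illustrative sketch of a friend-suggestion heuristic (hypothetical,
# not Snap's actual "Quick Add" equation): candidates are ranked by how
# many of the user's existing friends they are connected to.

def quick_add_suggestions(user, graph, limit=3):
    # graph maps each account to the set of accounts it is friends with.
    friends = graph.get(user, set())
    scores = {}
    for friend in friends:
        for candidate in graph.get(friend, set()):
            # Skip the user themselves and people they already know.
            if candidate != user and candidate not in friends:
                scores[candidate] = scores.get(candidate, 0) + 1
    # Rank by mutual-friend count, breaking ties alphabetically.
    return sorted(scores, key=lambda c: (-scores[c], c))[:limit]

graph = {
    "me": {"a", "b"},
    "a":  {"me", "c", "d"},
    "b":  {"me", "c"},
}
suggestions = quick_add_suggestions("me", graph)
```

Here “c” is suggested ahead of “d” because two of the user’s friends know “c” but only one knows “d”; each accepted suggestion enlarges the graph and generates new suggestions, the growth loop the paragraph above alleges.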
166. Snapchat also utilizes its “Discover” and “Spotlight” features, which use algorithms to suggest content to users. The Discover feature includes content from news and other media outlets.167 A user’s Discover content is populated by an algorithm and constantly changes depending on how the user interacts with the content.168 Similarly, the Spotlight feature promotes popular videos from other Snapchat users and is based on an algorithm that determines whether a user has positively or negatively engaged with similar content.169 Snap programs its algorithms to push content to users that will keep them engaged on Snapchat and thereby increase the amount of time users spend on Snapchat, worsening their mental health.
d. Snap’s Conduct in Designing and Operating Its Platform Has Harmed Youth Mental Health
167. The way in which Snap has designed and operated Snapchat has caused minors to suffer increased anxiety, depression, disordered eating, cyberbullying, and sleep deprivation.
168. Snap is aware that Snapchat is harming children because, as alleged above, Snap intentionally designed Snapchat to maximize engagement by preying on the psychology of children through its use of algorithms and other features, including Snapstreaks, various trophies and reward systems, quickly disappearing messages, filters, and games.
169. Snap reasonably should know that its conduct has negatively affected youth. Snap’s conduct has been the subject of inquiries by the United States Senate regarding Snapchat’s use “to promote bullying, worsen eating disorders, and help teenagers buy dangerous drugs or engage in reckless behavior.”170 Further, Senators from across the ideological spectrum have introduced bills that would ban many of the features Snapchat uses, including badges and other awards recognizing a user’s level of engagement with the platform.171 Despite these calls for oversight from Congress, Snap has failed to limit or stop its use of streaks, badges, and other awards that recognize and promote users’ level of engagement with Snapchat.

167 Steven Tweedie, How to Use Snapchat’s New ‘Discover’ Feature, Bus. Insider (Jan. 27, 2015), https://www.businessinsider.com/how-to-use-snapchat-discover-feature-2015-1.
168 How We Use Your Information, Snap Inc., https://snap.com/en-US/privacy/your-information (last visited Dec. 8, 2022).
169 Sara Fischer, Snapchat launches Spotlight, a TikTok competitor, Axios (Nov. 23, 2020), https://www.axios.com/2020/11/23/snapchat-launches-spotlight-tiktok-competitor; How We Use Your Information, Snap Inc., https://snap.com/en-US/privacy/your-information.
170. Snap also knows or should know of Snapchat’s other negative effects on minors because of widely available published research findings. For instance, the Journal of the American Medical Association has recognized that Snapchat’s effect on how young people view themselves is so severe that it named a new disorder, “Snapchat dysmorphia,” after the platform.172 This disorder describes people, usually young women, seeking plastic surgery to make themselves look the way they do through Snapchat filters.173 The rationale underlying this disorder is that beauty filters on social media, like Snapchat’s, create a “sense of unattainable perfection” that is alienating and damaging to a person’s self-esteem.174 One social psychologist summed up the effect: “the pressure to present a certain filtered image on social media can certainly play into [depression and anxiety] for younger people who are just developing their identities.”175
171. Despite knowing Snapchat harms its young users, Snap continues to update and add features intentionally designed to maximize the amount of time users spend on Snapchat. Snap continues its harmful conduct because its advertising revenue relies on Snapchat’s users
170 Bobby Allyn, 4 Takeaways from the Senate child safety hearing with YouTube, Snapchat and TikTok, Nat’l Pub. Radio (Oct. 26, 2021), https://www.npr.org/2021/10/26/1049267501/snapchat-tiktok-youtube-congress-child-safety-hearing.
171 See Abigal Clukey, Lawmaker Aims To Curb Social Media Addiction With New Bill, Nat’l Pub. Radio (Aug. 3, 2019), https://www.npr.org/2019/08/03/747086462/lawmaker-aims-to-curb-social-media-addiction-with-new-bill; Social Media Addiction Reduction Technology Act, S. 2314, 116th Cong. (2019); Kids Internet Design and Safety Act, S. 2918, 117th Cong. (2021).
172 ‘Snapchat Dysmorphia’: When People Get Plastic Surgery To Look Like A Social Media Filter, WBUR (Aug. 29, 2018), https://www.wbur.org/hereandnow/2018/08/29/snapchat-dysmorphia-plastic-surgery.
173 Id.
174 Nathan Smith & Allie Yang, What happens when lines blur between real and virtual beauty through filters,
a. TikTok’s Platform
172. TikTok is a social media platform that describes itself as “the leading destination for short-form mobile video.”176 According to TikTok, it is primarily a platform where users could create and share 15-second videos of themselves lip-syncing or dancing to their favorite music.178
also enabled users to create and share short lip-syncing videos that it called TikTok.179
175. That same year, ByteDance acquired Musical.ly to leverage its young user base in the United States.180
176. Months later, the apps were merged under the TikTok brand.181
177. Since then, TikTok has expanded the maximum length of videos from 15 seconds to up to 10 minutes;182 created a fund that was expected to grow to over $1 billion within three years to incentivize users to create videos that even more people will watch;183 and had users debut their own songs, share comedy skits,184 and “challenge” others to perform an activity.185

176 About: Our Mission, TikTok, https://www.tiktok.com/about (last visited Dec. 8, 2022).
177 Protecting Kids Online: Snapchat, TikTok, and YouTube: Hearing Before the Subcomm. on Consumer Protection, Product Safety, and Data Security, 117th Cong. (2021) (statement of Michael Beckerman, VP and Head of Public Policy, Americas, TikTok).
178 Biz Carson, How a failed education startup turned into Musical.ly, the most popular app you’ve probably never
https://www.reuters.com/article/us-bytedance-musically/chinas-bytedance-scrubs-musical-ly-brand-in-favor-of-tiktok-idUSKBN1KN0BW.
180 Liza Lin & Rolfe Winkler, Social-Media App Musical.ly Is Acquired for as Much as $1 Billion; With 60 million monthly users, startup sells to Chinese maker of news app Toutiao, Wall St. J. (Nov. 10, 2017), https://www.wsj.com/articles/lip-syncing-app-musical-ly-is-acquired-for-as-much-as-1-billion-1510278123.
181 Paresh Dave, China’s ByteDance scrubs Musical.ly brand in favor of TikTok, Reuters (Aug. 1, 2018), https://www.reuters.com/article/us-bytedance-musically/chinas-bytedance-scrubs-musical-ly-brand-in-favor-of-tiktok-idUSKBN1KN0BW.
182 Andrew Hutchinson, TikTok Confirms that 10 Minute Video Uploads are Coming to All Users, SocialMediaToday (Feb. 28, 2022), https://www.socialmediatoday.com/news/tiktok-confirms-that-10-minute-video-uploads-are-coming-to-all-users/619535/.
178. TikTok marketed and designed its platform to enable endless scrolling.

179. "[O]ne of the defining features of the TikTok platform" is its "For You" feed.186 There, users are served with an unending stream of videos TikTok curates for them based on complex, machine-learning algorithms intended to keep users on its platform. TikTok itself describes the feed as "central to the TikTok experience and where most of our users spend their time."187 The New York Times described it like this:

It's an algorithmic feed based on videos you've interacted with, or even just watched. It never runs out of material. It is not, unless you train it to be, full of people you know, or things you've explicitly told it you want to see. It's full of things that you seem to have demonstrated you want to watch, no matter what you actually say you want to watch.188
180. The "For You" feed has quickly garnered TikTok hundreds of millions of users. Since 2018, TikTok has grown from 271 million global users to more than 1 billion global monthly users as of September 2021.189
b. TikTok Markets Its Platform to Minors
183 Vanessa Pappas, Introducing the $200M TikTok Creator Fund, TikTok (July 29, 2021), https://newsroom.tiktok.com/en-us/introducing-the-200-million-tiktok-creator-fund.
184 Joseph Steinberg, Meet Musical.ly, the Video Social Network Quickly Capturing the Tween and Teen Markets,
188 John Herrman, How TikTok Is Rewriting the World, N.Y. Times (Mar. 10, 2019), https://www.nytimes.com/2019/03/10/style/what-is-tik-tok.html.
189 Jessica Bursztynsky, TikTok says 1 billion people use the app each month, CNBC (Sept. 27, 2021), https://www.cnbc.com/2021/09/27/tiktok-reaches-1-billion-monthly-users.html.
181. TikTok has built its business plan around advertising revenue, which has flourished. In 2022, TikTok is projected to receive $11 billion in advertising revenue, over half of which is expected to come from the United States.190

182. TikTok, since its beginning as Musical.ly, has been designed and developed with minors in mind.
183. Alex Zhu and Louis Yang, the co-founders of Musical.ly, raised $250,000 to build an app that experts could use to create short, three- to five-minute videos explaining a subject.191 The day they released the app, Zhu said they knew "'[i]t was doomed to be a failure,'" because "[i]t wasn't entertaining, and it didn't attract teens."192

184. According to Zhu, he stumbled upon the idea that would come to be TikTok while observing teens on a train, half of whom were listening to music while the other half took selfies or videos and shared the results with friends.193 "That's when Zhu realized he could combine music, videos, and a social network to attract the early-teen demographic."194
185. Zhu and Yang thereafter developed the short-form video app that is now known as TikTok.
186. TikTok was marketed to minors in its design and content. For example, the Federal Trade Commission ("FTC") alleged that the app initially centered around a child-oriented activity (i.e., lip syncing); featured music by celebrities who then appealed primarily to teens and tweens, such as Selena Gomez and Ariana Grande; labelled folders with names meant to appeal to youth, such as "Disney" and "school"; and included songs in such folders related to Disney television shows and movies, such as "Can You Feel the Love Tonight" from the movie "The Lion King" and "You've Got a Friend in Me" from the movie "Toy Story," and songs covering school-related subjects or school-themed television shows and movies.195

190 Bhanvi Satija, TikTok's ad revenue to surpass Twitter and Snapchat combined in 2022, Reuters (Apr. 11, 2022), https://www.reuters.com/technology/tiktoks-ad-revenue-surpass-twitter-snapchat-combined-2022-report-2022-04-11/.
191 Biz Carson, How a failed education startup turned into Musical.ly, the most popular app you've probably never heard of, Bus. Insider (May 28, 2016), https://www.businessinsider.com/what-is-musically-2016-5.
192 Id.
193 Id.
194 Id.
187. The target demographic was also reflected in the sign-up process. In 2016, the birthdate for those signing up for the app defaulted to the year 2000 (i.e., 16 years old).196
188. TikTok also cultivated a younger demographic in unmistakable, covert ways. In 2020, The Intercept reported on a document TikTok prepared for its moderators instructing the moderators that videos of "senior people with too many wrinkles" are disqualified from the "For You" feed because that would make "the video . . . much less attractive [and] not worth[] . . . recommend[ing.]"197
189. In December 2016, Zhu confirmed the company had actual knowledge that "a lot of the top users are under 13."198
190. The FTC alleged that despite the company's knowledge of these and a "significant percentage" of other users who were under 13, the company failed to comply with COPPA.199
191. TikTok settled those claims in 2019 by agreeing to pay what was then the largest ever civil penalty under COPPA and to several forms of injunctive relief.200
192. In an attempt to come into compliance with the consent decree and COPPA, TikTok made available to users under 13 what it describes as a "limited, separate app experience."201 The child version of TikTok restricts users from posting videos through the app. Children can still, however, record and watch videos on TikTok.202 For that reason, experts fear the app is "designed to fuel [children's] interest in the grown-up version."203

195 Complaint for Civil Penalties, Permanent Injunction, and Other Equitable Relief ("Musical.ly Complaint") at p. 8, ¶¶ 26–27, United States v. Musical.ly, 2:19-cv-01439-ODW-RAO (C.D. Cal. Feb. 27, 2019), Dkt. # 1.
196 Melia Robinson, How to use Musical.ly, the app with 150 million users that teens are obsessed with, Bus. Insider
200 Largest FTC COPPA settlement requires Musical.ly to change its tune, FTC Bus. Blog (Feb. 27, 2019), https://www.ftc.gov/business-guidance/blog/2019/02/largest-ftc-coppa-settlement-requires-musically-change-its-tune.
201 Dami Lee, TikTok stops young users from uploading videos after FTC settlement, Verge (Feb. 27, 2019),
193. The aforementioned ways TikTok marketed to and obtained a young user base are manifestations of Zhu's views about the importance of user engagement to growing TikTok. Zhu explained the target demographic to The New York Times: "[T]eenage culture doesn't exist" in China because "teens are super busy in school studying for tests, so they don't have the time and luxury to play social media apps."204 By contrast, Zhu describes "[t]eenagers in the U.S. [as] a golden audience."205

194. TikTok's efforts to attract young users have been successful. That is why 67% of children ages 13–17 report having used TikTok, and 16% say they use it almost constantly.206
c. TikTok Intentionally Maximizes the Time Users Spend on its Platform
195. TikTok developed and marketed features that exploit the brains of minors, such as IVRs and reciprocity, to maximize the time users spend on TikTok.

196. TikTok employs design elements and complex algorithms to simulate variable reward patterns in a flow-inducing stream of short-form videos intended to captivate its users' attention for as long as possible.
196. TikTok drives habitual use of its platform using design elements that operate on principles of IVR. For example, TikTok designed its platform to allow users to like and reshare videos. Those features serve as rewards for users who create content on the platform. Receiving a like or reshare indicates that others approve of that user's content and satisfies their natural desire for acceptance.207 Studies have shown that "likes" activate the reward region of the brain.208 The release of dopamine in response to likes creates a positive feedback loop.209 Users will repeatedly return to TikTok in hope of another pleasurable experience.210

https://www.theverge.com/2019/2/27/18243510/tiktok-age-young-user-videos-ftc-settlement-13-childrens-privacy-law
202 Id.
203 Leonard Sax, Is TikTok Dangerous for Teens?, Inst. Fam. Stud. (Mar. 29, 2022), https://ifstudies.org/blog/is-tiktok-dangerous-for-teens-.
204 Paul Mozur, Chinese Tech Firms Forced to Choose Market: Home or Everywhere Else, N.Y. Times (Aug. 9, 2016), https://www.nytimes.com/2016/08/10/technology/china-homegrown-internet-companies-rest-of-the-world.html.
205 Id.
206 Emily Vogels et al., Teens, Social Media and Technology 2022, Pew Rsch. Ctr. (Aug. 10, 2022), https://www.pewresearch.org/internet/2022/08/10/teens-social-media-and-technology-2022/.
197. TikTok also uses reciprocity to manipulate users into using the platform. TikTok invokes reciprocity through features like "Duet," which allows users to post a video side-by-side with a video from another TikTok user. Duet functions as a way for users to post reactions to the videos of TikTok content creators. The response is intended to provoke a reciprocal response from the creator of the original video.
198. TikTok offers video filters, lenses, and music, which are intended to keep users on its platform. TikTok has also gamified its platform through "challenges." These challenges are essentially campaigns in which users compete to perform a specific task. By fostering competition, TikTok incentivizes users to use its platform more frequently.
199. TikTok's defining feature, the "For You" feed, is a curated, endless stream of short-form videos intended to keep users on its platform longer. In that way, TikTok feeds users beyond the point they are satiated. The ability to scroll ad infinitum, coupled with the variable reward pattern of TikTok, induces a flow-like state for users that distorts their sense of time.211 The "For You" feed is yet another way TikTok increases the time users spend on its platform.
19 d. TikTok’s Algorithms are Manipulative
200. The first thing a user sees when they open TikTok is the "For You" feed, even if

207 See, e.g., Lauren E. Sherman et al., The Power of the Like in Adolescence: Effects of Peer Influence on Neural and Behavioral Responses to Social Media, 27(7) Psych. Sci. 1027–35 (July 2016), https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5387999/.
208 Id.
209 Rasan Burhan & Jalal Moradzadeh, Neurotransmitter Dopamine (DA) and its Role in the Development of Social Media Addiction, 11(7) J. Neurology & Neurophysiology 507 (2020), https://www.iomcworld.org/open-access/neurotransmitter-dopamine-da-and-its-role-in-the-development-of-social-media-addiction.pdf.
210 Id.
211 Christian Montag et al., Addictive Features of Social Media/Messenger Platforms and Freemium Games against the Background of Psychological and Economic Theories, 16(14) Int'l J. Env't Rsch. & Pub. Health 2612 (July 23, 2019), https://doi.org/10.3390/ijerph16142612.
1 the company’s ‘ultimate goal’ of adding daily active users, it has chosen to optimize for two
2 closely related metrics in the stream of videos it serves: ‘retention’ — that is, whether a user
3 comes back and ‘time spent.’”219
4 208. “This system means that watch time is key.”220 Guillaume Chaslot, founder of
5 Algo Transparency, who reviewed the document at the request of the New York Times,
6 explained that “rather than giving [people] what they really want,” TikTok’s “algorithm tries to
7 get people addicted[.]”221
209. The algorithm, along with the design elements, conditions users through reward-based learning processes to facilitate the formation of habit loops that encourage excessive use.

210. The end result is that TikTok uses "a machine-learning system that analyzes each video and tracks user behavior so that it can serve up a continually refined, never-ending stream of TikToks optimized to hold [user's] attention."222
e. TikTok’s Conduct in Designing and Operating its Platform Has Harmed The
13
Mental Health of Minors
14 211. TikTok’s decision to design, market and program its algorithm to prioritize user
15 engagement causes harmful and exploitive content to be amplified to the youth market it has
16 cultivated.
212. According to the Integrity Institute, a nonprofit of engineers, product managers, data scientists, and others, prioritizing user engagement amplifies misinformation on TikTok (and other platforms).223 That pattern, the Integrity Institute notes, is "true for a broad range of harms," such as hate speech and self-harm content, in addition to misinformation.224

219 Ben Smith, How TikTok Reads Your Mind, N.Y. Times (Dec. 5, 2021), https://www.nytimes.com/2021/12/05/business/media/tiktok-algorithm.html.
220 Id.
221 Id.
222 Jia Tolentino, How TikTok Holds Our Attention, New Yorker (Sept. 30, 2019), https://www.newyorker.com/magazine/2019/09/30/how-tiktok-holds-our-attention.
223 Misinformation Amplification Analysis and Tracking Dashboard, Integrity Inst. (Oct. 13, 2022), https://integrityinstitute.org/our-ideas/hear-from-our-fellows/misinformation-amplification-tracking-dashboard; see also Steven Lee Myers, How Social Media Amplifies Misinformation More Than Information, N.Y. Times (Oct. 13, 2022), https://www.nytimes.com/2022/10/13/technology/misinformation-integrity-institute-report.html.
224 Misinformation Amplification Analysis and Tracking Dashboard, Integrity Inst. (Oct. 13, 2022), https://integrityinstitute.org/our-ideas/hear-from-our-fellows/misinformation-amplification-tracking-dashboard.
that TikTok on average amplified misinformation 29 times more than other content.230

220. A separate investigation by NewsGuard found TikTok's search algorithm similarly amplified misinformation. TikTok's search engine, like its "For You" feed, is a favorite among youth, with 40 percent preferring it (and Instagram) over Google.231 Unfortunately, NewsGuard found that 1 in 5 of the top 20 TikTok search results on prominent news topics, such as school shootings and COVID vaccines, contained misinformation.232
221. Misinformation is just one type of harmful content TikTok amplifies to its young users. Investigations by The Wall Street Journal found TikTok inundated young users with videos about depression, self-harm, drugs, and extreme diets, among other harmful content.
222. In one investigation, The Wall Street Journal found TikTok's algorithm quickly pushed users down rabbit holes where they were more likely to encounter harmful content. The Wall Street Journal investigated how TikTok's algorithm chose what content to promote to users by having 100 bots scroll through the "For You" feed.233 Each bot was programmed with interests, such as extreme sports, forestry, dance, astrology, and animals.234 Those interests were not disclosed in the process of registering their accounts.235 Rather, the bots revealed their interests through their behaviors, specifically the time they spent watching the videos TikTok recommended to them. Consistent with TikTok's internal "Algo 101" document, The Wall Street Journal found the time spent watching videos to be "the most impactful data on [what] TikTok serves you."236
223. Over the course of 36 minutes, one bot watched 224 videos, lingering over videos with hashtags for "depression" or "sad."237 From then on, 93 percent of the videos TikTok

230 Id.
231 Wanda Pogue, Move Over Google. TikTok is the Go-To Search Engine for Gen Z, Adweek (Aug. 4, 2022), https://www.adweek.com/social-marketing/move-over-google-tiktok-is-the-go-to-search-engine-for-gen-z/.
232 Jack Brewster et al., Misinformation Monitor, NewsGuard (Sept. 2022), https://www.newsguardtech.com/misinformation-monitor/september-2022/.
233 Inside TikTok's Algorithm: A WSJ Video Investigation, Wall St. J. (July 21, 2021), https://www.wsj.com/articles/tiktok-algorithm-video-investigation-11626877477.
234 Id.
235 Id.
236 Id.
237 Id.
244 Rob Barry et al., How TikTok Serves Up Sex and Drug Videos to Minors, Wall St. J. (Sept. 8, 2021), https://www.wsj.com/articles/tiktok-algorithm-sex-drugs-minors-11631052944?st=e92pu5734lvc7ta&reflink=desktopwebshare_permalink.
245 Id.
without pausing.246 The bots lingered on videos that matched any of their programmed interests.247

230. Every second the bot hesitated or re-watched a video proved key to what TikTok recommended to the accounts, which The Wall Street Journal found was used to "drive users of any age deep into rabbit holes of content[.]"248
231. For example, one bot was programmed to pause on videos referencing drugs, among other topics. The first day on the platform, the "account lingered on a video of a young woman walking through the woods with a caption" referencing "stoner girls."249 The following day, the bot viewed a video of a "marijuana-themed cake."250 The "majority of the next thousand videos" TikTok directed at the teenage account "tout[ed] drugs and drug use, including marijuana, psychedelics and prescription medication."251
232. TikTok similarly zeroed in on and narrowed the videos it showed accounts, whether the bot was programmed to express interest in drugs, sexual imagery, or a multitude of interests. In the first couple of days, TikTok showed the bots a "high proportion of popular videos."252 "But after three days, TikTok began serving a high number of obscure videos."253
233. For example, a bot registered as a 13-year-old was shown a series of popular videos upon signing up.254 The bot, which was programmed to demonstrate interest in sexual text and imagery, also watched sexualized videos. Later, "[i]t experienced one of the most extreme rabbit holes among The Wall Street Journal's accounts. Many videos described how to tie knots for sex, recover from violent sex acts and discussed fantasies about rape."255 At one point, "more than 90 percent of [one] account's video feed was about bondage and sex."256
246 Id.
247 Id.
248 Id.
249 Id.
250 Id.
251 Id.
252 Id.
253 Id.
254 Id.
255 Id.
256 Id.
234. At least 2,800 of the sexualized videos that were shown to The Wall Street Journal's bots were labeled as being for adults only.257 Yet, TikTok directed these videos to the minor accounts because, as TikTok told The Wall Street Journal, it does not "differentiate between videos it serves to adults and minors."258
235. TikTok also directed a concentrated stream of videos at accounts programmed to express interest in a variety of topics. One such account was programmed to linger over hundreds of Japanese film and television cartoons. "In one streak of 150 videos, all but four" of the videos TikTok directed at the account "featured Japanese animation—many with sexual themes."259
236. The relentless stream of content intended to keep users engaged "can be especially problematic for young people," because they may lack the capability to stop watching, says David Anderson, a clinical psychologist at the nonprofit mental health care provider The Child Mind Institute.260
237. In a similar investigation, The Wall Street Journal found TikTok "flood[ed] teen users with videos of rapid-weight-loss competitions and ways to purge food that health professionals say contribute to a wave of eating-disorder cases spreading across the country."261
238. In this investigation, The Wall Street Journal analyzed the tens of thousands of videos TikTok recommended to a dozen bots registered as 13-year-olds. As before, the bots were given interests. Bots scrolled quickly through videos that did not match their interests and lingered on videos that did.262 The accounts registered as 13-year-olds were programmed at different times to display interests in weight loss, gambling, and alcohol.263
239. "TikTok's algorithm quickly gave users the content they'll watch, for as long as they'll watch it."264 For example, TikTok streamed gambling videos to a bot registered to a 13-year-old after it first searched for and favorited several such videos.265 When the bot began demonstrating interest in weight-loss videos, the algorithm adapted quickly.266

257 Id.
258 Id.
259 Id.
260 Id.
261 Tawnell D. Hobbs et al., The Corpse Bride Diet: How TikTok Inundates Teens with Eating-Disorder Videos, Wall St. J. (Dec. 17, 2021),
240. After the change in programming, weight-loss videos accounted for well over 40 percent of the content TikTok's algorithm recommended to the user.267
241. The other accounts were also flooded with weight-loss videos. Over the course of about 45 days, TikTok inundated the accounts with more than 32,000 such videos, "many promoting fasting, offering tips for quickly burning belly fat and pushing weight-loss detox programs and participation in extreme weight-loss competitions."268 Some encouraged purging, eating less than 300 calories a day, consuming nothing but water some days, and other hazardous diets.269
242. According to Alyssa Moukheiber, a treatment center dietitian, TikTok's powerful algorithm and the harmful streams of content it directs at young users can tip them into unhealthy behaviors or trigger a relapse.270
243. Sadly, the TikTok algorithm had its intended effect on the several teenage girls interviewed by The Wall Street Journal (and, upon information and belief, many others), who reported developing eating disorders or relapsing after being influenced by the extreme diet videos TikTok promoted to them.271
244. Katie Bell, a co-founder of the Healthy Teen Project, "said the majority of her 17 teenage residential patients told her TikTok played a role in their eating disorders."272
245. Others, like Stephanie Zerwas, an associate professor of psychiatry at the University of North Carolina at Chapel Hill, could not recount how many of her young patients told her that "I've started falling down this rabbit hole, or I got really into this or that influencer on TikTok, and then it started to feel like eating-disorder behavior was normal, that everybody was doing that."273

264 Id.
265 Id.
266 Id.
267 Id.
268 Id.
269 Id.
270 Id.
271 Id.
272 Id.
246. This trend extends nationwide. The National Association of Anorexia Nervosa and Associated Disorders has fielded 50 percent more calls to its hotline since the pandemic began, most of which it says are from young people or parents on their behalf.274

247. Despite the ample evidence that TikTok's design and operation of its platform harm the tens of millions of minors who use it, TikTok continues to manipulate them into returning to the platform again and again so that it may serve them ads in between the exploitive content it amplifies.
4. YouTube Intentionally Marketed to and Designed Its Social Media Platform for Minor Users, Substantially Contributing to the Mental Health Crisis

248. YouTube is a platform where users can post, share, view, and comment on videos related to a vast range of topics. The platform became available publicly in December 2005, and

249. YouTube reports that today it has over 2 billion monthly logged-in users.275 Even more people use YouTube each month because consumers do not have to register an account to view a video on YouTube. As a result, anyone can view most content on YouTube—regardless of age.

250. Users, whether logged in or not, watch billions of hours of videos every day.276

251. Users with accounts can post their own videos, comment on others, and

252. Beginning in 2008 and through today, YouTube has recommended videos to users.278 Early on, the videos YouTube recommended to users were the most popular videos across the platform.279 YouTube admits "[n]ot a lot of people watched those videos[,]" at least not based on its recommendation.280

273 Id.
274 Id.
275 YouTube for Press, YouTube, https://blog.youtube/press/ (last visited Dec. 8, 2022).
276 Id.
277 Josh Lowensohn, YouTube's big redesign goes live to everyone, CNET (Mar. 31, 2010), https://www.cnet.com/culture/youtubes-big-redesign-goes-live-to-everyone/.
253. Since then, YouTube has designed and refined its recommendation system using machine-learning algorithms that today take into account a user's "likes," time spent watching a video, and other behaviors to tailor its recommendations to each user.281
254. YouTube automatically plays those recommendations for a user after they finish watching a video. This feature, known as "autoplay," was implemented in 2015. YouTube turns the feature on by default, which means videos automatically and continuously play for users unless they turn it off.282
255. YouTube purports to disable its autoplay feature by default for users aged 13–17.283 But, as mentioned above, YouTube does not require users to log in or even have an account to watch videos. For them, or anyone who does not self-report an age between 13 and 17, YouTube defaults to automatically playing the videos its algorithm recommends to the user.
b. YouTube Markets Its Platform to Minors
256. The primary way YouTube profits is through advertising. YouTube made $19 billion in ad revenue in 2021 alone.284
257. "In 2012, YouTube concluded that the more people watched, the more ads it could run[.]"285 "So YouTube . . . set a company-wide objective to reach one billion hours of viewing a day[.]"286

278 Cristos Goodrow, On YouTube's recommendation system, YouTube (Sept. 15, 2021), https://blog.youtube/inside-youtube/on-youtubes-recommendation-system/.
279 Id.
280 Id.
281 Id.
282 Autoplay videos, YouTube Help, https://support.google.com/youtube/answer/6327615?hl=en#:~:text=For%20users%20aged%2013%2D17,turned%20off%20Autoplay%20for%20you (last visited Dec. 8, 2022).
283 Id.
284 Alphabet Inc., Annual Report, Form 10-K at 60 (2021), https://www.sec.gov/ix?doc=/Archives/edgar/data/1652044/000165204422000019/goog-20211231.htm.
285 Mark Bergen, YouTube Executive Ignores Warnings, Letting Toxic Videos Run Rampant, Bloomberg (Apr. 2, 2019), https://www.bloomberg.com/news/features/2019-04-02/youtube-executives-ignored-warnings-letting-toxic-videos-run-rampant?leadSource=uverify%20wall.
258. "[T]he best way to keep eyes on the site," YouTube realized, was "recommending videos, alongside a clip or after one was finished."287 That is what led to the development of its recommendation algorithm and autoplay feature.
259. YouTube has long known that minors use its platform in greater proportion than older demographics.
260. Still, YouTube has not implemented even rudimentary protocols to verify the age of users. Anyone can watch a video on YouTube without registering an account or reporting their age.
261. Instead, YouTube leveraged its popularity among youth to increase its revenue from advertisements by marketing its platform to popular brands of children's products. For example, Google pitched Mattel, the maker of Barbie and other popular kids' toys, by telling its executives that "YouTube is today's leader in reaching children age 6–11 against top TV channels."288 When presenting to Hasbro, the maker of Play-Doh, My Little Pony, and other kids' toys, Google boasted that "YouTube was unanimously voted as the favorite website for kids 2-12," and that "93% of tweens visit YouTube to watch videos."289 In a different presentation to Hasbro, Google referenced YouTube as "[t]he new 'Saturday Morning Cartoons'" and claimed that YouTube was the "#1 website regularly visited by kids" and "the #1 source where children discover new toys + games."290
262. In addition to turning a blind eye towards underage users of its platform, YouTube developed and marketed a version of YouTube specifically for children under the age of 13.

288 Complaint for Permanent Injunction, Civil Penalties, and Other Equitable Relief, Exhibit A, FTC v. Google
263. YouTube's efforts to attract young users have been successful. A vast majority, 95 percent, of children ages 13–17 have used YouTube.291
c. YouTube Intentionally Maximizes the Time Users Spend on its Platform
264. Google designed YouTube to maximize user engagement, predominantly through the amount of time users spend watching videos on the platform. To that end, Google employs design elements and complex algorithms to create a never-ending stream of videos intended to grip users' attention.
265. Like the other Defendants, Google developed features that exploit psychological phenomena such as IVR to maximize the time users spend on YouTube.
266. YouTube uses design elements that operate on principles of IVR to drive both YouTube content creators and YouTube viewers into habitual, excessive use. Google designed YouTube to allow users to like, comment on, and share videos and to subscribe to content creators' channels. These features serve as rewards for users who create and upload videos to YouTube. As described above, receiving a like indicates others' approval and activates the reward region of the brain.292 The use of likes, therefore, encourages users to use YouTube over and over, seeking future pleasurable experiences.
267. YouTube also uses IVR to encourage users to view others' content. One of the ways Google builds IVR into YouTube's design is through subscriber push notifications and emails, which are designed to prompt users to watch YouTube content and encourage excessive use of the platform. When a user "subscribes" to another user's channel, they receive notifications every time that user uploads new content, prompting them to open YouTube and watch the video.293
291 Id.
292 See, e.g., Lauren E. Sherman et al., The Power of the Like in Adolescence: Effects of Peer Influence on Neural and Behavioral Responses to Social Media, 27(7) Psych. Sci. 1027–35 (July 2016), https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5387999/.
293 Manage YouTube Notifications, YouTube, https://support.google.com/youtube/answer/3382248?hl=en&co=GENIE.Platform%3DDesktop (last visited Dec. 8, 2022).
site for as long as possible to maximize "watch time."297 Chaslot further stated that "[i]ncreasing users' watch time is good for YouTube's business model" because the more people watch videos, the more ads they see and YouTube's advertising revenue increases.298
4 272. Early on, one of the primary metrics behind YouTube’s recommendation
5 algorithm was clicks. As YouTube describes, “[c]licking on a video provides a strong indication
6 that you will also find it satisfying.”299 But as YouTube learned, clicking on a video does not
7 mean a user actually watched it. Thus, in 2012, YouTube also started tracking watch time—the
8 amount of time a user spends watching a video.300 YouTube made this switch to keep people
9 watching for as long as possible.301 In YouTube’s own words, this switch was successful. “These
10 changes have so far proved very positive -- primarily less clicking, more watching. We saw the
11 amount of time viewers spend watching videos across the site increase immediately[.]”302And in
12 2016, YouTube started measuring “valued watchtime” via user surveys to ensure that viewers are
13 satisfied with their time spent watching videos on YouTube.303 All of these changes to
14 YouTube’s algorithms were made to ensure that users spend more time watching videos and ads.
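The metric shift described above can be illustrated with a brief sketch. This is a hypothetical toy example, not YouTube’s code or data: the video names and numbers are invented solely to show how ranking by raw clicks and ranking by accumulated watch time can favor different videos.

```python
# Illustrative only: a toy comparison of a click-based metric versus a
# watch-time metric. All names and numbers are hypothetical.

videos = [
    # (title, clicks, total watch seconds)
    ("clickbait-thumbnail", 1000, 5_000),   # many clicks, quickly abandoned
    ("long-form-video",      200, 60_000),  # fewer clicks, watched at length
]

# Ranking by clicks favors the quickly-abandoned video...
by_clicks = max(videos, key=lambda v: v[1])[0]

# ...while ranking by watch time favors the video that keeps viewers watching.
by_watch_time = max(videos, key=lambda v: v[2])[0]

print(by_clicks)      # → clickbait-thumbnail
print(by_watch_time)  # → long-form-video
```

Under a watch-time objective, as the allegations describe, the system’s incentive is to surface whatever keeps the viewer on the site longest rather than whatever merely attracts a click.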
273. YouTube’s current recommendation algorithm is based on deep-learning neural networks that retune their recommendations based on the data fed into them.304 While this algorithm is incredibly complex, its process can be broken down into two general steps. First, the algorithm
297 William Turton, How YouTube’s Algorithm Prioritizes Conspiracy Theories, Vice (Mar. 5, 2018), https://www.vice.com/en/article/d3w9ja/how-youtubes-algorithm-prioritizes-conspiracy-theories.
298 Jesselyn Cook & Sebastian Murdock, YouTube Is a Pedophile’s Paradise, Huffington Post (Mar. 20, 2020), https://www.huffpost.com/entry/youtube-pedophile-paradise_n_5e5d79d1c5b6732f50e6b4db.
299 Cristos Goodrow, On YouTube’s Recommendation System, YouTube (Sept. 15, 2021), https://blog.youtube/inside-youtube/on-youtubes-recommendation-system/.
300 Id.
301 Dave Davies, How YouTube Became One of the Planet’s Most Influential Media Businesses, NPR (Sept. 8, 2022), https://www.npr.org/2022/09/08/1121703368/how-youtube-became-one-of-the-planets-most-influential-media-businesses.
302 Eric Meyerson, YouTube Now: Why We Focus on Watch Time, YouTube (Aug. 10, 2012), https://blog.youtube/news-and-events/youtube-now-why-we-focus-on-watch-time/.
303 Cristos Goodrow, On YouTube’s Recommendation System, YouTube (Sept. 15, 2021), https://blog.youtube/inside-youtube/on-youtubes-recommendation-system/.
304 Alexis C. Madrigal, How YouTube’s Algorithm Really Works, Atl. (Nov. 8, 2018), https://www.theatlantic.com/technology/archive/2018/11/how-youtubes-algorithm-really-works/575212/; Paul Covington et al., Deep Neural Networks for YouTube Recommendations, Google (2016), https://storage.googleapis.com/pub-tools-public-publication-data/pdf/45530.pdf.
compiles a shortlist of several hundred videos by finding videos that match the topic and other features of the video a user is currently watching.305 Then the algorithm ranks the list according to the user’s preferences, which the algorithm learns by tracking the user’s clicks, likes, and other interactions.306 In short, the algorithm tracks and measures a user’s previous viewing habits and then finds and recommends other videos it thinks will hold the consumer’s attention.
274. YouTube’s recommendation system “is constantly evolving, learning every day from over 80 billion pieces of information.”307 The information the recommendation algorithm relies on to deliver recommended videos to users includes users’ watch and search history, channel subscriptions, clicks, watch time, survey responses, shares, likes, dislikes, users’ location (country), and the time of day.308
275. The recommendation algorithm can determine which “signals” or factors are more important to individual users.309 For example, if a user shares every video they watch, including videos the user gives a low rating, the algorithm learns not to heavily factor the user’s shares when recommending content.310 Thus, the recommendation algorithm “develops dynamically” to fit individual users’ viewing habits and makes highly specific recommendations to keep individual users watching videos.311
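The two-step process alleged above can be sketched in simplified form. The following is a hypothetical illustration of a retrieve-then-rank pattern with a per-user signal weight; the catalog, signal names, and weights are invented for illustration and do not represent YouTube’s actual system.

```python
# Illustrative only: "candidate generation then ranking" with per-user
# signal weights. All data, signal names, and weights are hypothetical.

CATALOG = [
    {"id": "v1", "topic": "fitness", "likes": 0.9, "shares": 0.1},
    {"id": "v2", "topic": "fitness", "likes": 0.2, "shares": 0.9},
    {"id": "v3", "topic": "cooking", "likes": 0.8, "shares": 0.8},
]

def candidates(current_topic):
    # Step 1: shortlist videos matching the currently watched video's topic.
    return [v for v in CATALOG if v["topic"] == current_topic]

def rank(shortlist, weights):
    # Step 2: order the shortlist by a per-user weighted score over signals.
    def score(v):
        return weights["likes"] * v["likes"] + weights["shares"] * v["shares"]
    return sorted(shortlist, key=score, reverse=True)

# A user whose shares prove uninformative gets that signal down-weighted,
# mirroring the "dynamic" signal weighting described in the text.
user_weights = {"likes": 1.0, "shares": 0.1}
ordered = rank(candidates("fitness"), user_weights)
print([v["id"] for v in ordered])  # → ['v1', 'v2']
```

In this sketch, the off-topic video never reaches the ranking step, and among the on-topic candidates the ordering depends on which signals the system has learned to trust for that particular user.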
276. In addition to the algorithm’s self-learning, Google engineers consistently update YouTube’s recommendation and ranking algorithms, making several updates every month, according to YouTube Chief Product Officer Neal Mohan.312 The end goal is to increase the
305 Karen Hao, YouTube Is Experimenting with Ways to Make Its Algorithm Even More Addictive, MIT Tech. Rev. (Sept. 27, 2019), https://www.technologyreview.com/2019/09/27/132829/youtube-algorithm-gets-more-addictive/; Paul Covington et al., Deep Neural Networks for YouTube Recommendations, Google (2016), https://storage.googleapis.com/pub-tools-public-publication-data/pdf/45530.pdf.
306 Id.
307 Cristos Goodrow, On YouTube’s Recommendation System, YouTube (Sept. 15, 2021), https://blog.youtube/inside-youtube/on-youtubes-recommendation-system/.
308 Recommended Videos, YouTube, https://www.youtube.com/howyoutubeworks/product-
https://www.nytimes.com/2022/04/21/technology/youtube-rabbit-hole.html.
316 Alex Hern, YouTube Kids Shows Videos Promoting Drug Culture and Firearms to Toddlers, Guardian (May 5, 2022), https://www.theguardian.com/technology/2022/may/05/youtube-kids-shows-videos-promoting-drug-culture-firearms-toddlers.
how to conceal a gun, guides on how to bleach one’s face at home, and workout videos emphasizing the importance of burning calories and telling kids to “[w]iggle your jiggle.”317 This research shows that YouTube Kids not only lets inappropriate content slip through its algorithmic filters but actively directed that content to kids through its recommendation engine.
283. Amanda Kloer, a campaign director with the child safety group ParentsTogether, spent an hour on her child’s YouTube Kids profile and found videos “encouraging kids how to make their shirts sexier, a video in which a little boy pranks a girl over her weight, and a video in which an animated dog pulls objects out of an unconscious animated hippo’s butt.”318 Another parent recounted that YouTube Kids’ autoplay function led her 6-year-old daughter to an animated video that encouraged suicide.319
284. Other youth are fed content by YouTube’s algorithms that encourages self-harm. As reported by PBS NewsHour, a middle-schooler named Olivia compulsively watched YouTube videos every day after she came home from school.320 Over time she became depressed and started searching for videos on how to commit suicide. Similar videos then gave her the idea of overdosing. Weeks later she was in the hospital after “downing a bottle of Tylenol.”321 Ultimately, she was admitted into rehab for digital addiction because of her compulsive YouTube watching.322
285. According to the Pew Research Center, 46 percent of parents say their child has encountered inappropriate videos on YouTube.323 And children are not encountering these videos of their own volition. Rather, they are being fed harmful and inappropriate videos through
317 Guns, Drugs, and Skin Bleaching: YouTube Kids Poses Risks to Children, Tech Transparency Project (May 5, 2022), https://www.techtransparencyproject.org/articles/guns-drugs-and-skin-bleaching-youtube-kids-still-poses-risks-children.
318 Rebecca Heilweil, YouTube’s Kids App Has a Rabbit Hole Problem, Vox (May 12, 2021), https://www.vox.com/recode/22412232/youtube-kids-autoplay.
319 Id.
320 Lesley McClurg, After Compulsively Watching YouTube, Teenage Girl Lands in Rehab for ‘Digital Addiction’, PBS (May 16, 2017), https://www.pbs.org/newshour/health/compulsively-watching-youtube-teenage-girl-lands-rehab-digital-addiction.
321 Id.
322 Id.
323 Brooke Auxier et al., Parenting Children in the Age of Screens, Pew Rsch. Ctr. (July 28, 2020), https://www.pewresearch.org/internet/2020/07/28/parental-views-about-youtube/.
and over.330 This dopaminergic response is in addition to the reward stimulus YouTube provides users through IVR.
290. Mental health professionals across the country have seen an increase in children experiencing mental health issues because of YouTube. Natasha Daniels, a child psychotherapist in Arizona, has said she has seen a rise in cases of children suffering from anxiety because of videos they watched on YouTube.331 Because of their anxiety, these children “exhibit loss of appetite, sleeplessness, crying fits, and fear.”332
291. In addition to causing anxiety, watching YouTube is also associated with insufficient sleep.333 In one study on the effect of app use on sleep, YouTube was the only app consistently associated with negative sleep outcomes.334 For every 15 minutes teens spent watching YouTube, they had a 24 percent greater chance of getting fewer than seven hours of sleep.335 YouTube is particularly problematic on this front because its recommendation and autoplay features make it “so easy to finish one video” and watch the next, said Dr. Alon Avidan, director of the UCLA Sleep Disorders Center.336 In turn, insufficient sleep is associated with poor health outcomes.337 Thus, YouTube exacerbates an array of youth mental health issues by contributing to sleep deprivation.
292. Despite the vast evidence that YouTube’s design and algorithms harm millions of youths, Google continues to manipulate them into staying on the platform and watching more and more videos so that it can increase its ad revenue.
E. The Effect of Social Media Use on School Districts
330 Id.
331 Id.
332 Id.
333 Meg Pillion et al., What’s ‘App’-ning to Adolescent Sleep? Links Between Device, App Use, and Sleep Outcomes,
293. School districts are uniquely harmed by the current youth mental health crisis. This is because schools are one of the main providers of mental health services for school-aged children.338 Indeed, over 3.1 million children ages 12–17 received mental health services in an education setting in 2020, more than in any other non-specialty mental health service setting.339
294. Most schools offer mental health services to students. In the 2021–22 school year, 96 percent of public schools reported offering at least one type of mental health service to their students.340 But 88 percent of public schools did not strongly agree that they could effectively provide mental health services to all students in need.341 The most common barriers to providing effective mental health services are (1) an insufficient number of mental health professionals; (2) inadequate access to licensed mental health professionals; and (3) inadequate funding.342 Student opinions also reflect that schools are unable to provide adequate mental health services. Less than a quarter of students in grades 6–12 report accessing counseling or psychological services when they are upset, stressed, or having a problem.343 And of the students who access mental health services, only 41 percent of middle schoolers and 36 percent of high schoolers are satisfied with the services they receive.344
295. In part, schools are struggling to provide adequate mental health services because of the increase in students seeking these services. More than two-thirds of public schools reported an increase in the percentage of students seeking mental health services from school since the start of the pandemic.345
338 National Survey on Drug Use and Health, SAMHSA (2019 & 1st & 4th Qs. 2020), https://www.samhsa.gov/data/report/2020-nsduh-detailed-tables.
339 Id.
340 Roughly Half of Public Schools Report That They Can Effectively Provide Mental Health Services to All Students in Need, Nat’l Ctr. Educ. Stat. (May 31, 2022), https://nces.ed.gov/whatsnew/press_releases/05_31_2022_2.asp.
341 Id.
342 Id.
343 Insights from the Student Experience, Part I: Emotional and Mental Health at 2, YouthTruth (2022), https://youthtruthsurvey.org/wp-content/uploads/2022/10/YouthTruth_EMH_102622.pdf.
344 Id.
345 Roughly Half of Public Schools Report That They Can Effectively Provide Mental Health Services to All Students in Need, Nat’l Ctr. Educ. Stat. (May 31, 2022), https://nces.ed.gov/whatsnew/press_releases/05_31_2022_2.asp.
296. During this same period, adolescents increased their social media use, also raising levels of excessive and problematic use of digital media.346 And these higher rates of social media use are related to higher “ill-being.”347 Thus, the increase in adolescent social media use during the pandemic has caused an increase in adolescents experiencing mental health problems.
297. That relationship is reflected in reports from public schools. Over 75 percent of public schools reported an increase in staff expressing concerns about student depression, anxiety, and other disturbances since the start of the pandemic.348 Students receiving mental health services in educational settings predominantly do so because they “[f]elt depressed,” “[t]hought about killing [themselves] or tried to,” or “[f]elt very afraid and tense.”349
298. Anxiety disorders are also up, affecting 31.9 percent of adolescents between 13 and 18 years old.350 “Research shows that untreated teenagers with anxiety disorders are at higher risk to perform poorly in school, miss out on important social experiences, and engage in substance abuse.”351
299. Schools are struggling not only to provide students with mental health services but also to deliver an adequate education because of the youth mental health crisis. Students in grades 6–12 identify depression, stress, and anxiety as the most prevalent obstacles to learning.352 Most middle school and high school students also fail to get enough sleep on school nights, which contributes to poor academic performance.353 These negative mental health
346 Laura Marciano et al., Digital Media Use and Adolescents’ Mental Health During the Covid-19 Pandemic: A Systematic Review and Meta-Analysis, Frontiers Pub. Health (Feb. 2022), https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8848548/.
347 Id.
348 Roughly Half of Public Schools Report That They Can Effectively Provide Mental Health Services to All Students in Need, Nat’l Ctr. Educ. Stat. (May 31, 2022), https://nces.ed.gov/whatsnew/press_releases/05_31_2022_2.asp.
349 Rachel N. Lipari et al., Adolescent Mental Health Service Use and Reasons for Using Services in Specialty,
outcomes are also the most common symptoms of excessive social media use.
300. The youth mental health crisis has also caused a wide range of other behavioral issues among students that interfere with schools’ ability to teach. In 2022, 61 percent of public schools saw an increase in classroom disruptions from student misconduct compared to school years before the pandemic.354 Fifty-eight percent of public schools also saw an increase in rowdiness outside of the classroom, 68 percent saw increases in tardiness, 27 percent saw increases in students skipping classes, 55 percent saw increases in the use of electronic devices when not permitted, 37 percent saw an increase in bullying, 39 percent saw an increase in physical fights between students, and 46 percent saw an increase in threats of fights between students.355
301. Further exacerbating schools’ struggle to teach is the fact that students are not showing up to school. Indeed, student absenteeism has greatly increased. In the 2021–22 school year, 39 percent of public schools experienced an increase in chronic student absenteeism compared to the 2020–21 school year, and 72 percent of public schools saw increased chronic student absenteeism compared to school years before the pandemic.356 Vandalism has likewise increased: in 2022, 36 percent of public schools reported increased acts of student vandalism on school property.357
302. School districts have borne increased costs and expenses in response to the youth mental health crisis. These costs include:
(a) hiring additional mental health personnel (41 percent of public schools added staff to focus on student mental health);358
(b) developing additional mental health resources (46 percent of public schools created or expanded mental health programs for students, 27 percent added
2015, 67(3) Morbidity & Mortality Wkly. Rpt. 85–90 (Jan. 26, 2018), http://dx.doi.org/10.15585/mmwr.mm6703a1.
354 2022 School Pulse Panel, U.S. Dep’t Educ., Inst. Educ. Sci. (2022), https://ies.ed.gov/schoolsurvey/spp/.
355 Id.
356 Id.
357 Id.
358 Id.
student classes on social, emotional, and mental health and 25 percent offered guest speakers for students on mental health);359
(c) training teachers to help students with their mental health (56 percent of public schools offered professional development to teachers on helping students with mental health);360
(d) increasing disciplinary services and hiring additional personnel for disciplinary services in response to increased bullying and harassment over social media;
(e) addressing property damaged as a result of students acting out because of the mental, social, and emotional problems Defendants’ conduct caused;
(f) diverting time and resources from instruction activities to notify parents and guardians of students’ behavioral issues and attendance;
(g) investigating and responding to threats made against schools and students over social media;
(h) updating its student handbook to address use of Defendants’ platforms; and
(i) updating school policies to address use of Defendants’ platforms.
F. Impact of Social Media Use on Plaintiff
304. HANNA has been directly impacted by the mental health crisis among youth in its community.
305. The increased use of and dependency on social media has led to an increase in the number of Plaintiff’s students in crisis, acting out, vandalizing school property, and in need of mental health services.
306. In an attempt to address the decline in students’ mental, emotional, and social health, Plaintiff has been forced to divert resources and expend additional resources to:
a. hire additional personnel, including counselors and medical professionals, to address mental, emotional, and social health issues;
b. develop additional resources to address mental, emotional, and social health
359 Id.
360 Id.
issues;
c. increase training for teachers and staff to identify students exhibiting symptoms affecting their mental, emotional, and social health;
d. train teachers, staff, and members of the community about the harms caused by Defendants’ wrongful conduct;
e. develop lesson plans to teach students about the dangers of using Defendants’ platforms;
f. educate students about the dangers of using Defendants’ platforms;
g. update its student handbook to address use of Defendants’ platforms; and
h. update school policies to address use of Defendants’ platforms.
315. Additionally, more students have been acting out as a result of the decline Defendants caused in students’ mental, emotional, and social health. As a result, Plaintiff has been forced to divert resources and expend additional resources to:
a. repair property damaged as a result of the exploitive and harmful content Defendants directed to students;
b. increase disciplinary services and time spent addressing bullying, harassment, and threats;
c. confiscate devices that students, compelled by Defendants’ conduct, used while in class or on school campuses to access Defendants’ platforms;
d. meet with students and the parents of students caught using Defendants’ platforms at school;
e. divert time and resources from instruction activities to notify parents and guardians of students’ behavioral issues and attendance; and
f. investigate and respond to threats made against schools and students over social media.
316. As a result, the rest of Plaintiff’s staff must work in overdrive to help students with mental health concerns.
317. Plaintiff requires significantly greater and long-term funding to address the nuisance Defendants have created. It is time, as President Biden declared, to get “all Americans the mental health services they need.”361
V. THE COMMUNICATIONS DECENCY ACT, 47 U.S.C. § 230(c), EXPRESSLY ALLOWS INTERACTIVE COMPUTER SERVICE COMPANIES LIKE DEFENDANTS TO LIMIT HARMFUL CONTENT, AND THERE IS NO IMMUNITY FOR DEFENDANTS’ CONDUCT
318. Plaintiff anticipates that Defendants will raise section 230 of the Communications Decency Act, 47 U.S.C. § 230(c)(1), as a shield for their conduct. But section 230 is no shield for Defendants’ own acts in designing, marketing, and operating social media platforms that are harmful to youth.
319. Section 230 was enacted by Congress to address the harms associated with certain content and drafted to limit liability for “Good Samaritans” seeking to restrict such harmful content. It is entitled “Protection for ‘Good Samaritan’ blocking and screening of offensive material.”
320. Section 230 provides immunity from liability only to “(1) a provider or user of an interactive computer service (2) whom a plaintiff seeks to treat, under a state law cause of action, as a publisher or speaker (3) of information provided by another information content provider.” Barnes v. Yahoo!, Inc., 570 F.3d 1096, 1100–01 (9th Cir. 2009), as amended (Sept. 28, 2009).
322. Publication generally involves traditional editorial functions, such as reviewing, editing, and deciding whether to publish or to withdraw from publication third-party content. Lemmon v. Snap, Inc., 995 F.3d 1085, 1091 (9th Cir. 2021).
323. Publication does not, however, include duties related to designing and marketing a social media platform. See id. at 1092–93.
324. Plaintiff expressly disavows any claims or allegations that attempt to hold Defendants liable as the publisher or speaker of any information provided by third parties.
325. Section 230 does not immunize Defendants’ conduct because, among other
361 President Biden, State of the Union Address (Mar. 1, 2022) (transcript available at https://www.whitehouse.gov/state-of-the-union-2022/).
considerations: (1) Defendants are liable for their own affirmative conduct in recommending and promoting harmful content to youth; (2) Defendants are liable for their own actions in designing and marketing their social media platforms in a way that causes harm; (3) Defendants are liable for the content they create that causes harm; and (4) Defendants are liable for distributing, delivering, and/or transmitting material that they know or have reason to know is harmful, unlawful, and/or tortious.
326. First, Plaintiff is not alleging that Defendants are liable for what third parties have said on Defendants’ platforms but, rather, for Defendants’ own conduct. As described above, Defendants affirmatively recommend and promote harmful content to minors, such as pro-anorexia and eating disorder content and Snapchat filters that promote plastic surgery or body dysmorphia. Recommendation and promotion of damaging material is not a traditional editorial function, and seeking to hold Defendants liable for these actions is not seeking to hold them liable as a publisher or speaker of third-party content.
327. Second, Plaintiff’s claims arise from Defendants’ status as designers and marketers of dangerous social media platforms that have injured the health, comfort, and repose of its community. The nature of Defendants’ platforms centers around Defendants’ use of algorithms and other design features that encourage users to spend the maximum amount of time on their platforms—not on particular third-party content. The algorithms Defendants designed and employ adapt to the social media activity of individual users to promote whatever content will trigger a particular user’s attention and maximize their use of the platform for as long as possible. Defendants’ algorithms are user-focused rather than content-based and are indifferent to the nature and type of content they promote to users, provided that such content increases the time users spend on their platforms. In that respect, they are content neutral.
328. Third, Defendants are liable for the content they create. In addition to content such as Snapchat filters that promote body dysmorphia, Defendants send emails and notifications to youth that include material they create, which often promotes certain harmful content.
329. Fourth, Plaintiff does not seek to hold Defendants liable as publishers or speakers of information provided by other content providers; instead, Plaintiff seeks to hold them liable for distributing material they know or should know is harmful or unlawful. See Malwarebytes, Inc. v. Enigma Software Grp. USA, LLC, 141 S. Ct. 13 (2020) (statement of Thomas, J., respecting denial of certiorari, discussing the distinction between distributor and publisher liability); cf. Restatement (Second) of Torts § 581 (Am. Law Inst. 1977) (“[O]ne who only delivers or transmits defamatory matter published by a third person is subject to liability if, but only if, he knows or has reason to know of its defamatory character.”).
330. Plaintiff’s claim is based upon Defendants’ own conduct, which has resulted in the exacerbation of the public health crisis from which Plaintiff’s students are suffering.
VI. CAUSES OF ACTION
COUNT ONE – VIOLATIONS OF PUBLIC NUISANCE LAW
331. Plaintiff incorporates each preceding paragraph as though set forth fully herein.
333. Defendants have created and maintained a public nuisance which proximately caused injury to Plaintiff.
334. Plaintiff, in the operation of its schools, has a right to be free from conduct that endangers its health and safety and the health and safety of its employees and students. Yet Defendants have engaged in conduct and omissions that unreasonably and injuriously interfered with the public health and safety in Plaintiff’s community and created substantial and unreasonable annoyance, inconvenience, and injury to the public through their production, promotion, design, distribution, and marketing of their social media platforms for use by youth in Plaintiff’s schools. Defendants’ actions and omissions have substantially, unreasonably, and injuriously interfered with Plaintiff’s functions and operations and affected the public health, safety, and welfare of Plaintiff’s community.
335. Each Defendant has created or assisted in the creation of a condition that is injurious to the health and safety of Plaintiff and its students and interferes with the comfortable enjoyment of life and property in Plaintiff’s community.
336. Defendants’ conduct has directly caused a severe disruption of public health, order, and safety. Defendants’ conduct is ongoing and continues to produce permanent and long-lasting damage.
337. This harm to Plaintiff and the public is substantial, unreasonable, widespread, and ongoing.
338. Because of the mental health crisis caused by Defendants, Plaintiff’s schools can no longer operate, use, or enjoy their property free from injury or interference.
339. Defendants’ design, manufacture, production, marketing, and promotion of such addictive, manipulative, and harmful social media platforms, when such actions were taken with the intent to market to, and in fact were marketed to, youth through targeted campaigns, unreasonably interfered with a public right in that the results of Defendants’ actions created and maintained a condition dangerous to the public’s health, offensive to community moral standards, or unlawfully obstructing the public in the free use of public property. Defendants intentionally created and maintained a public nuisance by, among other acts:
a. Designing social media platforms that were uniquely youth oriented;
b. Designing a product that was meant to facilitate use by minors, both generally and in a way that was branded youth friendly;
c. Failing to sufficiently study and conduct necessary tests to determine whether their platforms were safe for children/minor users;
d. Creating, producing, maintaining, distributing, managing, marketing, promoting, and delivering their platforms to the general public and to Plaintiff’s students without thorough and adequate pre- and post-market testing;
e. Designing and using algorithms which promote harmful, destructive content to be consumed by the user, regardless of age;
f. Failing to act on data, reports, analysis, opinions, or information known, or that should have been known in the exercise of reasonable diligence, pertaining to
Defendants’ platforms and the risks and hazards posed to children, adolescents, and minors;
g. Designing their social media platforms to encourage the excessive amounts of time that users spend on the platforms, causing mental and emotional harm, particularly to children/minors, using manipulative technology such as algorithm-based feeds, IVR, and social reciprocity;
h. Failing to employ adequate safeguards in the creation, maintenance, and operation of their platforms to ensure they would not encourage excessive and harmful use;
i. Failing to take reasonably adequate steps to prevent their platforms from being promoted, distributed, and used by minors under the age of 13;
j. Designing, engineering, developing, and maintaining their platforms to appeal to children, adolescents, and teens, where such minors lack the same cognitive development as adults and are particularly vulnerable to social reward-based manipulative tactics such as algorithm-based feeds and social reciprocity;
k. Failing to disclose to or warn Plaintiff, users, consumers, and the general public of the negative mental and emotional health consequences associated with their platforms and social media generally, especially for children/minors;
l. Failing to provide reasonably adequate warnings to youth users or the parents or guardians of such minors, where Defendants could reasonably foresee such minors would use their platforms;
m. Failing to disclose to Plaintiff, Plaintiff’s students, and the general public that Defendants’ platforms are designed to maximize the time children, adolescents, and teens spend on Defendants’ platforms and that such platforms cause negative mental, emotional, and social health consequences, particularly among minors;
n. Failing to warn Plaintiff, Plaintiff’s students, and the public of the true risks of using Defendants’ platforms;
342. Defendants' conduct has affected and continues to affect a substantial number of people within Plaintiffs' school district and is likely to continue causing significant harm.
343. But for Defendants' actions, Plaintiffs' students would not use social media platforms as often or for as long as they do, would not be deluged with exploitative and harmful content to the same degree, and the public health crisis that currently exists as a result of Defendants' conduct would have been averted.
344. Defendants knew or should have known that their conduct would create a public nuisance. Defendants knew or should have known that their acts and omissions involved in the development, promotion, and use of their platforms would cause students to excessively use their platforms. Defendants knew, or reasonably should have known, that their tactics to encourage user engagement with their platforms were designed to appeal to minors, and that their acts and omissions intended to increase minors' use of their platforms were causing harm to minors in Plaintiff's school district and to Plaintiff itself.
345. Thus, the public nuisance caused by Defendants was reasonably foreseeable, including the financial and economic losses incurred by Plaintiffs.
346. Alternatively, each Defendant's conduct was a substantial factor in bringing about the public nuisance as described herein. By designing, marketing, promoting, and operating their platforms in a manner intended to maximize the time youth spend on their respective platforms—despite knowledge of the harms to youth from their wrongful conduct—Defendants directly accelerated and enabled the widespread, excessive, and habitual use of their platforms and the public nuisance affecting HANNA. By seeking to capitalize on their success by refining their platforms to increase the time youth spend on their platforms, Defendants directly contributed to the public health crisis and the public nuisance affecting Plaintiffs.
347. Defendants' conduct is especially injurious to HANNA because, as a direct and proximate result of Defendants' conduct creating or assisting in the creation of a public nuisance, Plaintiff and its students and employees have sustained and will continue to sustain substantial injuries.
348. Defendants, and each of them, facilitated and permitted the conditions to exist that caused the harms herein mentioned.
349. Defendants directly facilitated the rise of the youth suicide crisis.
350. Plaintiffs have attempted to mitigate the harm and disruption caused by Defendants' conduct, including the following:
a. Hiring additional personnel to address mental, emotional, and social health crises and security;
b. Developing and spending additional resources to address mental, emotional, and social health issues;
c. Increasing training for teachers and staff to identify students exhibiting symptoms impacting their mental, emotional, and social health;
d. Implementing additional training for teachers, staff, and members of the community about the harms caused by Defendants' wrongful conduct;
e. Developing and altering lesson plans to teach students about the dangers of using Defendants' platforms;
f. Educating students, staff, and parents about the dangers of using Defendants' platforms;
g. Remediating property damaged as a result of students acting out because of mental, social, and emotional problems caused by Defendants' conduct;
h. Increasing time and resources spent addressing bullying, harassment, and threats;
i. Confiscating electronic devices on which students use Defendants' platforms while in class or on Plaintiffs' campus;
j. Meeting with students, and the parents of students, caught using Defendants' platforms at school, and addressing other disciplinary matters related to students' use of Defendants' platforms;
k. Diverting time and resources from instructional activities to notify parents and guardians of students' behavioral issues and attendance issues caused by use of Defendants' platforms;
l. Investigating and responding to threats made against Plaintiff's schools and students originating on or because of social media;
m. Updating student handbook(s) to address use of Defendants' platforms;
n. Modifying school policies to address use of Defendants' platforms; and
o. Addressing increased incidence of vandalism, property damage, crime, and increased need for discipline of students caused by use of Defendants' platforms.
353. Fully abating the youth mental health crisis resulting from Defendants' conduct will require much more than these steps.
354. As detailed herein, Plaintiff has suffered special injury, different in kind from that suffered by the general public, including but not limited to injury arising from the following: discipline and suspensions related to incidents of social media use in Plaintiff's schools have increased at alarming rates; Plaintiff has suffered property damage due to "challenges" and other content promoted and spread by Defendants' platforms; Plaintiff has had to closely monitor and stop the use of electronic devices in Plaintiff's schools to prevent social media from being used during a time when students should be learning; Plaintiff has had to divert resources toward the mental health crisis caused by Defendants; Plaintiff has had to change lesson plans and educational courses to mitigate the youth bullying, eating disorder, and suicide content promulgated on Defendants' platforms; Plaintiff has had to devote and divert staff resources to conduct staff training on the dangers of social media use; and Plaintiff has had to hire additional school counselors and staff to address the youth mental health crisis caused by widespread use by minors of Defendants' platforms.
355. Plaintiff therefore requests all the relief to which it is entitled in its own right and relating to the special damage or injury it has suffered, and not in any representative or parens patriae capacity on behalf of students, including damages in an amount to be determined at trial and an order providing for the abatement of the public nuisance that Defendants have created or
assisted in the creation of, and enjoining Defendants from future conduct contributing to the public nuisance described above.
356. Defendants engaged in conduct, as described above, that constituted malice, oppression, or fraud, with intent to cause injury and/or with willful and knowing disregard of the rights or safety of another, being fully aware of the probable dangerous consequences of the conduct and deliberately failing to avoid those consequences.
357. Defendants regularly risk the lives and health of consumers and users of their platforms with full knowledge of the dangers of their platforms. Defendants made conscious decisions not to redesign or alter the platforms, or to warn or inform the unsuspecting public, including Plaintiff's students or Plaintiff. Defendants' willful, knowing, and reckless conduct, constituting malice, oppression, or fraud, therefore warrants an award of aggravated or punitive damages.
COUNT II
NEGLIGENCE
358. Plaintiffs incorporate by reference all preceding paragraphs as though fully set forth herein.
359. Defendants owed Plaintiffs a duty to not expose Plaintiffs to an unreasonable risk of harm, and to act with reasonable care as a reasonably careful person and/or company would act under the circumstances.
360. At all times relevant herein, Defendants owed a duty to consumers and the general public, including Plaintiffs, to exercise reasonable care in the creation, production, maintenance, distribution, management, marketing, promotion, and delivery of Defendants' social media platforms, including the duty to take all reasonable steps necessary to design, research, market, advertise, promote, operate, and distribute their platforms in a way that is not unreasonably dangerous to users, including minors.
361. At all times relevant herein, Defendants knew or, in the exercise of reasonable care, should have known of the hazards and dangers of their respective social media platforms and,
specifically, the health hazards their platforms posed to youth in particular, especially from prolonged use of such platforms where exposure to harmful content was reasonably foreseeable.
362. At all times relevant herein, Defendants knew or, in the exercise of reasonable care, should have known that use of Defendants' social media platforms by minors would create a dangerous and unreasonable risk of injury to Plaintiffs.
363. Defendants also knew, or in the exercise of reasonable care should have known, that users and consumers of Defendants' social media platforms were unaware of the risks associated with the use of Defendants' platforms, or the magnitude of such risks. These risks include, but are not limited to, the risks of excessive social media use, the probability that algorithm-based recommendations would expose minor users to content that is violent, sexual, or encourages self-harm, among other types of harmful content, and the risk that mental and emotional illness could result.
364. Defendants, by actions, inactions, representations, and omissions, breached their duty of reasonable care, failed to exercise ordinary care, and failed to act as a reasonably careful person and/or company would act under the circumstances in the creation, production, maintenance, distribution, management, marketing, promotion, and delivery of their social media platforms, in that Defendants created, produced, maintained, distributed, managed, marketed, promoted, and delivered social media platforms that Defendants knew or had reason to know would negatively impact the mental health of consumers, particularly minors, and the schools they attend, and failed to prevent or adequately warn of these serious risks and injuries.
365. Despite their knowledge, opportunity, ability, and means to investigate, study, and test their social media platforms, and to provide adequate warnings, Defendants failed to take these actions. Defendants have wrongfully concealed information and have made false and/or misleading statements concerning the safety and use of Defendants' social media platforms.
366. Defendants breached their duty in the following ways, including but not limited to:
j. Failing to provide reasonably adequate warnings to child and minor users or the parents of such minors, where Defendants could reasonably foresee such minors would use their platforms;
k. Failing to disclose to Plaintiffs, users, consumers, and the public that Defendants' platforms are designed to maximize the time users, particularly minors, spend on Defendants' platforms and that such platforms cause negative mental, emotional, and social health consequences;
l. Failing to warn users and the general public, including Plaintiffs and students of Plaintiffs, of the true risks of using Defendants' platforms;
m. Advertising, marketing, and promoting Defendants' platforms while concealing and failing to disclose or warn of the dangers known by Defendants to be associated with, or caused by, minors' use of Defendants' platforms;
n. Continuing the creation, production, maintenance, distribution, management, marketing, promotion, and delivery of Defendants' platforms with knowledge that Defendants' platforms are unreasonably unsafe, addictive, and dangerous to children's mental and emotional health;
o. Failing to change Defendants' algorithms, which are used to suggest content to users, in a manner that would no longer concentrate on maximizing the amount of time users spend on Defendants' platforms, notwithstanding the reasonably foreseeable mental and emotional safety risks this posed to Defendants' minor/child users;
p. Failing to adequately limit Defendants' algorithm-based recommendations to filter out content that exposes child and minor users to material that is violent, sexual, or encourages self-harm, among other types of harmful content;
q. Representing that Defendants' platforms were safe for child, adolescent, and teen users when, in fact, Defendants knew or should have known that the
platforms presented a clear and present danger to youth's mental and emotional health; and
r. Additional failures, acts, or omissions as set forth at length herein.
367. Defendants knew or should have known that it was foreseeable that Plaintiffs would suffer injuries as a result of Defendants' failure to exercise reasonable care in creating, producing, maintaining, distributing, managing, marketing, promoting, and delivering Defendants' platforms, particularly when Defendants' platforms were intentionally and deliberately designed, maintained, and marketed to maximize the time children/minors spend on Defendants' platforms.
368. The full extent and scale of the injuries caused by the intended usage of Defendants' social media platforms could not be known to Plaintiffs.
369. Defendants' negligence was the legal and proximate cause of the injuries, harm, and economic losses that Plaintiffs suffered and will continue to suffer. But for Defendants' negligence as described herein, such injuries, harm, and economic losses would not have occurred.
370. The mental health crisis caused by Defendants has caused a major disruptive behavioral crisis in Plaintiffs' schools, and Plaintiffs have had to take steps to mitigate the harm and disruption caused by Defendants' conduct, including but not limited to:
a. Providing training to teachers and staff to recognize and build awareness of Defendants' harmful platforms and the consequences of use of same;
b. Hiring additional teachers and staff to alleviate the youth mental health crisis, including mental, emotional, and social harm caused to students and members of the community;
platforms. Defendants consciously decided not to redesign their platforms, or to warn or inform the unsuspecting public, including Plaintiffs and Plaintiffs' students. Defendants' willful, knowing, and reckless conduct therefore warrants, and Plaintiffs seek, an award of aggravated or punitive damages.
COUNT THREE
GROSS NEGLIGENCE
372. Plaintiffs incorporate by reference all preceding paragraphs as though fully set forth herein.
373. Defendants were grossly negligent in designing, manufacturing, supplying, distributing, inspecting (or not inspecting), testing (or not testing), marketing, promoting, advertising, packaging, and/or labeling Defendants' platforms.
374. Defendants owed Plaintiffs a duty to not expose Plaintiffs to an unreasonable risk of harm, and to act with reasonable care as a reasonably careful person and/or company would act under the circumstances.
375. At all times relevant herein, Defendants owed a duty to consumers and the general public, including Plaintiffs, to exercise reasonable care in the creation, production, maintenance, distribution, management, marketing, promotion, and delivery of Defendants' social media platforms, including the duty to take all reasonable steps necessary to design, research, market, advertise, promote, operate, and distribute their platforms in a way that is not unreasonably dangerous to consumers and users, including youth.
376. At all times relevant herein, Defendants owed a duty to consumers and the general public, including Plaintiffs, to exercise reasonable care in the creation, production, maintenance, distribution, management, marketing, promotion, and delivery of their social media platforms. This included a duty to provide accurate, true, and correct information about the harms and risks of using Defendants' platforms, including the harms and risks to youth. This also included a duty to give accurate and complete warnings about the potential adverse effects of extended use of Defendants' platforms by children/minors, and the dangers and risks from the features of their platforms, such as algorithm-driven harmful content recommendations.
377. At all times relevant herein, Defendants knew or, in the exercise of reasonable care, should have known of the hazards and dangers of their respective social media platforms and, specifically, the health hazards their platforms posed to youth in particular, especially from prolonged use of such platforms where exposure to harmful content was likely.
378. Therefore, Defendants, by action and inaction, representation and omission, breached their duty of reasonable care, failed to exercise ordinary care, and failed to act as a reasonably careful person and/or company would act under the circumstances in the creation, production, maintenance, distribution, management, marketing, promotion, and delivery of their social media platforms, in that Defendants created, produced, maintained, distributed, managed, marketed, promoted, and delivered social media platforms that Defendants knew or had reason to know would negatively impact the mental health of consumers, particularly youth, and failed to prevent or adequately warn of these risks and injuries.
379. Despite their opportunity, ability, and means to investigate, study, and test their social media platforms and to provide adequate warnings, Defendants failed to take these actions. Defendants have wrongfully concealed information and have made false and/or misleading statements concerning the safety and use of Defendants' platforms.
380. Defendants engaged in willful and/or wanton conduct that lacked any care and amounted to an extreme departure from what a reasonably careful person would do in the same situation to prevent harm to others. Defendants' willful and wanton conduct caused Plaintiffs to suffer harm.
381. The willful and wanton conduct of Defendants includes, but is not limited to:
a. Creating, producing, maintaining, distributing, managing, marketing, promoting, and delivering their platforms to the general public and Plaintiffs' students without thorough and adequate pre- and post-market testing;
b. Failing to sufficiently study and conduct necessary tests to determine whether or not their platforms were safe for child/minor users;
k. Failing to disclose to Plaintiffs, Plaintiffs' students, and the general public that Defendants' platforms are designed to maximize the time children, adolescents, and teens spend on Defendants' platforms and that such platforms cause negative mental, emotional, and social health consequences, particularly among minors;
l. Failing to warn Plaintiffs, Plaintiffs' students, and the public of the true risks of using Defendants' platforms;
m. Advertising, marketing, and recommending Defendants' platforms while concealing and failing to disclose or warn of the dangers known by Defendants to be associated with, or caused by, minors' use of Defendants' platforms;
n. Continuing the creation, production, maintenance, distribution, management, marketing, promotion, and delivery of Defendants' platforms with knowledge that Defendants' platforms are unreasonably unsafe, addictive, and dangerous to minors, and otherwise harmful to minors' mental and emotional health;
o. Failing to change Defendants' algorithms, which are used to recommend content to users, in a manner that would no longer concentrate on maximizing the amount of time users spend on Defendants' platforms, notwithstanding the reasonably foreseeable mental and emotional safety risks this posed to Defendants' minor users;
p. Failing to adequately limit Defendants' algorithm-based recommendations to filter out content that exposes youth users to material that is violent, sexual, or encourages self-harm, among other types of harmful content; and
q. Representing that Defendants' platforms were safe for child, adolescent, and teen users when, in fact, Defendants knew or should have known that the platforms created a clear and present danger for the mental and emotional health of minors.
382. Defendants knew or should have known that it was foreseeable that Plaintiffs would suffer injuries as a result of Defendants' failure to exercise reasonable care in designing, researching, developing, testing, marketing, supplying, promoting, advertising, operating, and
h. Addressing and mitigating safety risks caused by bullying, threats, and other harmful behaviors proximately caused by Defendants' platforms;
i. Repurposing existing staff and resources to address and mitigate the mental, emotional, and social issues caused by use of Defendants' platforms;
j. Amending school policy and handbook(s) to address the dangers and disruptions caused by use of Defendants' platforms on campus; and
k. Addressing the increased incidence of vandalism, property damage, crime, and need for student discipline, including detention, truancy, and increased security, proximately caused by Defendants' platforms.
386. Defendants breached the duties they owed to Plaintiffs and, in doing so, were wholly unreasonable. Defendants' conduct, as described above, was intended to serve their own interests despite having reason to know, and consciously disregarding, a substantial risk that their conduct was likely to significantly injure the rights of others, including Plaintiffs. Defendants consciously pursued a course of conduct knowing that it created a substantial risk of significant harm to others, including Plaintiffs and Plaintiffs' students.
387. As a foreseeable consequence of Defendants' breaches of their duties, Plaintiffs have suffered and will continue to suffer direct and consequential economic and other injuries as a result of dealing with the youth mental health crisis in Plaintiffs' schools, as described herein, including but not limited to expending, diverting, and increasing resources to address this crisis.
388. Defendants engaged in conduct, as described above, that constitutes malice and oppression, with intent to cause injury and/or with willful and knowing disregard of the rights or safety of Plaintiffs, being fully aware of the probable dangerous consequences of the conduct and deliberately failing to avoid those consequences.
389. Defendants' conduct constituting malice and oppression was committed by one or more officers, directors, or managing agents of Defendants, who acted on behalf of Defendants; was authorized by one or more officers, directors, or managing agents of Defendants; and
was adopted or approved by one or more of such officers, directors, or managing agents after the conduct occurred.
390. Defendants regularly risk the lives and health of minors and other users of their platforms with full knowledge of the dangers of their platforms. Defendants made conscious decisions not to redesign, re-label, warn, or inform the unsuspecting public, including Plaintiffs and Plaintiffs' students. Defendants' willful, knowing, and reckless conduct therefore warrants, and Plaintiffs seek, an award of aggravated or punitive damages.
COUNT FOUR
VIOLATIONS OF THE RACKETEER INFLUENCED AND CORRUPT ORGANIZATIONS ACT ("RICO")
391. Plaintiffs incorporate by reference all preceding paragraphs as if fully set forth herein.
392. This claim is brought by Plaintiffs against all Defendants (the "RICO Defendants" for purposes of this Count III and Count IV) for actual damages, treble damages, and equitable relief under 18 U.S.C. § 1964, for violations of RICO, 18 U.S.C. § 1961 et seq.
393. Pursuant to RICO, it is "unlawful for any person employed by or associated with any enterprise engaged in, or the activities of which affect, interstate or foreign commerce, to conduct or participate, directly or indirectly, in the conduct of such enterprise's affairs through a pattern of racketeering activity." 18 U.S.C. § 1962(c).
394. At all times relevant herein, each RICO Defendant is and has been a "person" within the meaning of 18 U.S.C. § 1961(3), because each is capable of holding, and does hold, "a legal or beneficial interest in property."
395. Each RICO Defendant conducted the affairs of an enterprise through a pattern of racketeering activity, in violation of 18 U.S.C. § 1962(c), as described herein.
396. Each Plaintiff is a "person" within the meaning of 18 U.S.C. § 1961(3), and has standing to sue under 18 U.S.C. § 1964(c) because it was and is injured in its business and/or property "by reason of" the RICO violations described herein.
397. Plaintiffs demand all applicable relief as set forth in Plaintiffs' Prayer for Relief.
The Enterprise:
398. Section 1961(4) defines an enterprise as "any individual, partnership, corporation, association, or other legal entity, and any union or group of individuals associated in fact although not a legal entity." 18 U.S.C. § 1961(4).
399. RICO Defendants form an association-in-fact for the common and continuing purpose described herein and constitute an enterprise within the meaning of 18 U.S.C. § 1961(4), engaged in the conduct of their affairs through a continuing pattern of racketeering activity. The members of the enterprise functioned as a continuing unit with an ascertainable structure separate and distinct from that of the conduct of the pattern of racketeering activity. There may also be other members of the enterprise who are currently unknown to Plaintiffs.
400. Alternatively, each of the RICO Defendants is a corporation, company, or other legal entity, and therefore an enterprise within the meaning of 18 U.S.C. § 1961(4).
401. The enterprise functioned as a continuing unit to achieve shared goals through unlawful means, including the following: (1) to preserve and enhance the market for its social media platforms and RICO Defendants' own profits, regardless of the truth, the law, or the health consequences to the American people, including Plaintiffs' students and the communities Plaintiffs serve; (2) to deceive consumers, especially children, adolescents, and teenagers, into using their platforms by falsely maintaining that there is doubt as to whether their platforms are responsible for the apparent mental or emotional health consequences to children, adolescents, and teenagers, even though RICO Defendants knew otherwise; (3) to deceive consumers, especially children, adolescents, and teenagers, into using their platforms by falsely maintaining that RICO Defendants could mitigate the mental or emotional health consequences to children, adolescents, and teenagers, even though RICO Defendants knew that these negative consequences were inherent to their platforms' features and technology; (4) to deceive consumers, especially children, adolescents, and teenagers, into becoming or staying addicted to their platforms by maintaining that their platforms were not addictive or that any addictive consequences could be
mitigated, despite the fact that RICO Defendants knew that their platforms were inherently addictive by design; (5) to deceive consumers, particularly children, adolescents, and teenagers, by claiming that they did not market to children, adolescents, and teenagers, while engaging in marketing and manipulation of their platform algorithms with the intent of causing children, adolescents, and teenagers to engage in excessive use of their platforms, regardless of the health or safety concerns; and (6) to deceive consumers about the mental and emotional health risks to children, adolescents, and teenagers associated with RICO Defendants' platforms, including that their platforms were intentionally and deliberately designed to target children, adolescents, and teenagers and to encourage excessive and harmful behavior, that RICO Defendants had the ability to manipulate and did manipulate their platforms to be highly addictive, and that RICO Defendants targeted children, adolescents, and teenagers specifically.
402. The enterprise has pursued a course of conduct of deceit and misrepresentation, and conspiracy to make misrepresentations to the public, to withhold from the public facts material to the decision to use or permit children, adolescents, and teenagers to use RICO Defendants' platforms, to promote and maintain sales from RICO Defendants' platforms, and the profits derived therefrom, as well as to shield themselves from public, judicial, and governmental scrutiny.
403. Each enterprise has engaged in, and its activities have affected, foreign and interstate commerce.
Pattern of Racketeering Activity:
404. RICO Defendants, each of whom is a person associated with, or employed by, the enterprise, did knowingly, willfully, and unlawfully conduct or participate, directly or indirectly, in the affairs of the enterprise through a pattern of racketeering activity within the meaning of 18 U.S.C. §§ 1961(1), 1961(5), and 1962(c). The racketeering activity was made possible by each RICO Defendant's regular and repeated use of the facilities and services of the enterprise. Each RICO Defendant had the specific intent to engage in the substantive RICO violations alleged herein.
405. RICO Defendants controlled the resources and instrumentalities of the enterprise and used that control to perpetrate numerous misleading schemes involving the use of mail and wires. Foremost, separate and apart from their regular business dealings, RICO Defendants misled and continue to mislead the public about the mental health dangers their platforms pose to youth.
406. RICO Defendants had the common purpose of preserving and enhancing the market for their platforms and for youth as consumers, for RICO Defendants' own profits, regardless of the truth, the law, or the health consequences to the American people, including Plaintiffs' students and the communities Plaintiffs serve.
407. RICO Defendants deceived consumers into using RICO Defendants' platforms while concealing and/or suppressing the relevant findings and research. RICO Defendants deceived consumers, particularly parents and children, adolescents, and teenagers, by claiming that they did not market to children, adolescents, and teenagers, while engaging in marketing and manipulation of their platform algorithms with the intent of causing children, adolescents, and teenagers to engage in excessive use of their platforms, regardless of the health or safety concerns.
408. RICO Defendants achieved their common purpose through co-conspirators' actions in deceiving consumers, regulators, and the general public about the dangerous nature of their platforms. Through the enterprise, RICO Defendants engaged in a pattern of racketeering activity consisting of numerous acts of racketeering in the Northern District of California and elsewhere, including mail fraud and wire fraud, indictable offenses under 18 U.S.C. §§ 1341 and 1343.
409. RICO Defendants are each an enterprise that is engaged in and affects interstate commerce because the companies sold and continue to sell products across the United States, as alleged herein.
Predicate Acts: Use of Mail and Wires to Mislead the Public in Violation of 18 U.S.C. §§ 1341, 1343
410. From a time unknown and continuing until the time of filing of this Complaint, in the Northern District of California and elsewhere, RICO Defendants and others, known and unknown, did knowingly and intentionally devise, and intend to devise, a scheme and artifice to mislead, and obtain money and property from, members of the public by means of materially false and misleading pretenses, representations, and promises, and omissions of material facts, knowing that the pretenses, representations, and promises were false when made.

411. It was part of said scheme and artifice that the RICO Defendants would represent that their platforms posed no substantial risk of mental or emotional health concern to children, adolescents, and teenagers, and were not addictive, when in fact their platforms did pose such risks, and their platforms were intentionally and deliberately designed to target children, adolescents, and teenagers and to encourage excessive and harmful behavior.

412. It was further part of said scheme and artifice that the RICO Defendants and their co-conspirators would and did maintain sales and profits of their platforms by concealing and suppressing material information regarding the mental and emotional health risks to children, adolescents, and teenagers associated with their usage, including that their platforms were intentionally and deliberately designed to target children, adolescents, and teenagers and to encourage excessive and harmful behavior; that they had the ability to manipulate, and did manipulate, their platforms to be highly addictive; and that RICO Defendants targeted children, adolescents, and teenagers specifically.

413. It was also part of said scheme and artifice that, in order to conceal the health risks of their platforms, RICO Defendants and their co-conspirators would and did make false representations and misleading statements to the public; would and did falsely represent that Defendants would fund and conduct objective, scientific research, and disclose the results of such research, to resolve concerns about mental and emotional health issues related to youth; would and did falsely represent that Defendants did not target children, adolescents, and teenagers; did suppress and hide adverse research results; did misrepresent and fail to disclose their ability to manipulate, and their manipulation of, their platforms and their addictive qualities; and would and did misrepresent their actions to government personnel and others.

414. It was further a part of said scheme and artifice that RICO Defendants and their co-conspirators would and did misrepresent, conceal, and hide, and cause to be misrepresented, concealed, and hidden, the purpose of, and acts done in furtherance of, the scheme.

415. It was a further part of said scheme and artifice, and in furtherance thereof, that RICO Defendants would and did communicate with each other and with their co-conspirators and others, in person, by mail, and by telephone and other interstate and foreign wire facilities, regarding the true nature of their platforms and the mental and emotional health risks to children, adolescents, and teenagers.

416. It was further part of said scheme and artifice that RICO Defendants made communications directed toward government officials and to the public in furtherance of their conspiracy to deceive the public, by means of telephone, mail, internet, wire transmissions, and other forms of interstate commerce and communications.

417. For purposes of executing and attempting to execute that scheme and artifice, RICO Defendants and their co-conspirators would and did knowingly transmit, and cause to be transmitted, in interstate and foreign commerce, by means of wire, radio, and television communication, writings, signs, signals, pictures, and sounds (collectively, "transmissions"), in violation of 18 U.S.C. § 1343.

418. For the purpose of executing and attempting to execute the scheme and artifice described herein, RICO Defendants and their co-conspirators would and did: knowingly place, and cause to be placed, in any post office or authorized depository for mail matter, matters and things to be sent and delivered by the United States Postal Service (and its predecessor, the United States Post Office Department); took and received therefrom such matters and things; and knowingly caused to be delivered by mail, according to the direction thereon, and at the place at which it is directed to be delivered by the person to whom it is addressed, any such matter and thing, in violation of 18 U.S.C. § 1341.
COUNT FIVE

CONSPIRACY TO CONDUCT THE AFFAIRS OF THE ENTERPRISE THROUGH A PATTERN OF RACKETEERING ACTIVITY

(18 U.S.C. § 1962)

419. Plaintiffs incorporate the allegations set forth above as if fully set forth herein.

420. From a time unknown and continuing until the time of filing of this Complaint, in the Northern District of California and elsewhere, RICO Defendants and others, known and unknown, did unlawfully, knowingly, and intentionally combine, conspire, confederate, and agree together with each other, and with others whose names are both known and unknown, to conduct and participate, directly and indirectly, in the conduct of the affairs of the aforementioned enterprise, which was engaged in activities that affected interstate and foreign commerce, through a pattern of activity consisting of multiple acts indictable under 18 U.S.C. §§ 1341 and 1343, in violation of 18 U.S.C. § 1962(d).

421. Each of the RICO Defendants agreed that at least two acts of racketeering activity would be committed by a member of the conspiracy in furtherance of the conduct of the enterprise. It was part of the conspiracy that RICO Defendants and their co-conspirators would commit numerous acts of racketeering activity in the conduct of the affairs of the enterprise, including, but not limited to, the acts of racketeering set forth below.

422. From a time unknown and continuing until the time of filing of this Complaint, in the Northern District of California and elsewhere, RICO Defendants and others, known and unknown, did knowingly and intentionally devise, and intend to devise, a scheme and artifice to mislead, and obtain money and property from, members of the public by means of materially false and misleading pretenses, representations, and promises, and omissions of material facts, knowing that the pretenses, representations, and promises were false when made.

423. It was part of said scheme and artifice that the RICO Defendants would represent that their platforms posed no substantial risk of mental or emotional health concern to children, adolescents, and teenagers, and were not addictive, when in fact their platforms did pose such risks, and their platforms were intentionally and deliberately designed to target children, adolescents, and teenagers and to encourage excessive and harmful behavior.

424. It was further part of said scheme and artifice that RICO Defendants and their co-conspirators would and did maintain sales and profits of their platforms by concealing and suppressing material information regarding the mental and emotional health risks to children, adolescents, and teenagers associated with their usage, including that their platforms were intentionally and deliberately designed to target children, adolescents, and teenagers and to encourage excessive and harmful behavior; that they had the ability to manipulate, and did manipulate, their platforms to be highly addictive; and that RICO Defendants targeted children, adolescents, and teenagers specifically.
425. It was further part of said scheme and artifice that, in order to conceal the health risks of their platforms, RICO Defendants and their co-conspirators would and did make false representations and misleading statements to the public; would and did falsely represent that Defendants would fund and conduct objective, scientific research, and disclose the results of such research, to resolve concerns about mental and emotional health issues related to youth; would and did falsely represent that Defendants did not target children, adolescents, and teenagers; would and did suppress and hide adverse research results; would and did misrepresent and fail to disclose their ability to manipulate, and their manipulation of, their platforms and their addictive qualities; and would and did misrepresent their actions to government personnel and others.

426. It was a further part of said scheme and artifice that RICO Defendants and their co-conspirators would and did misrepresent, conceal, and hide, and cause to be misrepresented, concealed, and hidden, the purpose of, and acts done in furtherance of, the scheme.

427. It was a further part of said scheme and artifice, and in furtherance thereof, that RICO Defendants would and did communicate with each other and with their co-conspirators and others, in person, by mail, and by telephone and other interstate and foreign wire facilities, regarding the true nature of their platforms and the mental and emotional health risks to children, adolescents, and teenagers.

428. It was further part of said scheme and artifice that RICO Defendants made communications directed toward government officials and to the public in furtherance of their conspiracy to deceive the public, by means of telephone, mail, internet, wire transmissions, and other forms of interstate commerce and communications.

429. For purposes of executing and attempting to execute that scheme and artifice, RICO Defendants and their co-conspirators would and did knowingly transmit, and cause to be transmitted, in interstate and foreign commerce, by means of wire, radio, and television communication, writings, signs, signals, pictures, and sounds (hereinafter, "transmissions"), in violation of 18 U.S.C. § 1343.

430. For the purpose of executing and attempting to execute the scheme and artifice herein described, RICO Defendants and their co-conspirators would and did: knowingly place, and cause to be placed, in any post office or authorized depository for mail matter, matters and things to be sent and delivered by the United States Postal Service (and its predecessor, the United States Post Office Department); took and received therefrom such matters and things; and knowingly caused to be delivered by mail, according to the direction thereon, and at the place at which it is directed to be delivered by the person to whom it is addressed, any such matter and thing, in violation of 18 U.S.C. § 1341.
VII. PRAYER FOR RELIEF

WHEREFORE, Plaintiff prays for judgment as follows:

431. Entering an Order finding that the conduct alleged herein constitutes a public nuisance under California law;

432. Entering an Order holding that Defendants are jointly and severally liable;

433. Entering an Order requiring Defendants to abate the public nuisance described herein and to deter and/or prevent the resumption of such nuisance;
VIII. JURY TRIAL DEMAND

Plaintiff hereby demands a trial by jury.

DATED: April 12, 2023

Respectfully Submitted,

/s/ James Frantz, Esq.
CA Bar # 87492
[email protected]

/s/ William B. Shinoff, Esq.
CA Bar # 280020
[email protected]

FRANTZ LAW GROUP, APLC
402 W. Broadway, Ste. 860
San Diego, CA 92101
P: (619) 233-5945
F: (619) 525-7672

Attorneys for Plaintiff
HANNA PUBLIC SCHOOLS