LAURA MARQUEZ-GARRETT, SBN 221542
laura@socialmediavictims.org
SYDNEY LOTTES, SBN 345387
sydney@socialmediavictims.org

SOCIAL MEDIA VICTIMS LAW CENTER
600 1st Ave, Ste 102 - PMB 2383
Seattle, WA 98104
Telephone: 206-741-4862
Facsimile: 206-957-9549

Attorneys for Plaintiffs
SUPERIOR COURT OF THE STATE OF CALIFORNIA
COUNTY OF LOS ANGELES, CENTRAL DISTRICT

COORDINATION PROCEEDING SPECIAL TITLE [RULE 3.400]

SOCIAL MEDIA CASES

THIS DOCUMENT RELATES TO:

AVIANNAH-LEIGH SALTERS, individually; F.R. on behalf of C.R.; J.K. on behalf of A.K.; T.J. individually, and A.J., individually and on behalf of K.J.; L.M. on behalf of F.M.; SHANYA RAY, individually; M.M. on behalf of S.M.; S.R. on behalf of S.H.; JENNIFER BUTA, individually; JOHN DEMAY, individually and as successor-in-interest to JORDAN DEMAY; K.K. on behalf of A.L.; J.B. on behalf of P.B.; K.W. individually and on behalf of K.A.; T.L. and L.T., individually; M.S. on behalf of C.L.; R.S. on behalf of A.D.; G.B. on behalf of I.B.; I.V. on behalf of V.V.; C.L., individually; L.B. on behalf of A.G.; N.F. on behalf of A.F.; T.S. on behalf of A.S.; H.J. individually and on behalf of H.C.; and M.P. on behalf of N.L.,

Plaintiff(s),

v.

META PLATFORMS, INC.; INSTAGRAM, LLC; FACEBOOK PAYMENTS, INC.; SICULUS, INC.; FACEBOOK OPERATIONS, LLC; SNAP, INC.; BYTEDANCE, LTD.; BYTEDANCE, INC.; TIKTOK, LTD.; TIKTOK, LLC; TIKTOK, INC.; GOOGLE LLC; YOUTUBE, LLC; DISCORD, INC.,

Defendant(s).

JUDICIAL COUNCIL COORDINATION PROCEEDING NO. 5255

Lead Case No. 22STCV21355 (for filing purposes)

[This document relates to Case No. 24SMCV00732]

Judge: Hon. Carolyn B. Kuhl
SSC-12

NOTICE OF POTENTIAL ADD-ON CASE, AND REQUEST FOR COORDINATION

Action Filed: 01/31/2024
Trial Date: None

All Cases

(Christina Arlington Smith, et al., v. TikTok Inc., et al., Case No. 22STCV21355)
TO THE HONORABLE CAROLYN B. KUHL, COORDINATION TRIAL JUDGE, THE PARTIES TO THE ACTIONS, AND THEIR COUNSEL OF RECORD:

PLEASE TAKE NOTICE that, pursuant to California Rules of Court, rule 3.531, Plaintiffs-Petitioners write to notice the below action as a potential add-on case to the JCCP 5255 Coordinated Proceeding.
Case Name: Aviannah-Leigh Salters, et al. v. Meta Platforms, Inc., et al.
Court: Superior Court of California, Los Angeles County
Case No.: 24SMCV00732
Date Filed: January 31, 2024
A true and correct copy of the complaint filed in the above action is attached hereto as Exhibit A.

Dated: February 20, 2024.
SOCIAL MEDIA VICTIMS LAW CENTER

By: __________________________
Laura Marquez-Garrett, SBN 221542
laura@socialmediavictims.org
Sydney Lottes, SBN 345387
sydney@socialmediavictims.org
Matthew P. Bergman (pro hac vice anticipated)
matt@socialmediavictims.org
Glenn S. Draper (pro hac vice anticipated)
glen@socialmediavictims.org

SOCIAL MEDIA VICTIMS LAW CENTER
600 1st Ave, Ste 102 - PMB 2383
Seattle, WA 98104
Telephone: 206-741-4862
Facsimile: 206-957-9549

Attorneys for Plaintiffs
NOTICE OF POTENTIAL ADD-ON CASE AND REQUEST FOR COORDINATION
PROOF OF SERVICE

Re: Social Media Cases
Case No. JCCP 5255

I am employed by Social Media Victims Law Center, PLLC, 600 1st Ave, Ste 102 - PMB 2383, Seattle, Washington, 98104. I am over the age of 18 years and am not a party to this action.

On February 20, 2024, I served a copy of the following document(s):

NOTICE OF POTENTIAL ADD-ON CASE, AND REQUEST FOR COORDINATION

on the interested parties in this action pursuant to the most recent Omnibus Service List by submitting an electronic version of the document(s) via file transfer protocol (FTP) to Case Anywhere through the upload feature at www.caseanywhere.com.

I declare under penalty of perjury under the laws of the State of California that the foregoing is true and correct.

Executed on February 20, 2024, at Seattle, Washington.

__________________________
Julie Sojot, Legal Assistant
Exhibit A
Laura Marquez-Garrett, SBN 221542
laura@socialmediavictims.org
Sydney Lottes, SBN 345387
sydney@socialmediavictims.org
SOCIAL MEDIA VICTIMS LAW CENTER
600 1st Avenue, Suite 102-PMB 2383
Seattle, WA 98104
Ph: 206-741-4862

Attorneys for Plaintiffs

[Additional counsel appear on signature page.]
IN THE SUPERIOR COURT OF CALIFORNIA
COUNTY OF LOS ANGELES

AVIANNAH-LEIGH SALTERS, individually; F.R. on behalf of C.R.; J.K. on behalf of A.K.; T.J. individually, and A.J., individually and on behalf of K.J.; L.M. on behalf of F.M.; SHANYA RAY, individually; M.M. on behalf of S.M.; S.R. on behalf of S.H.; JENNIFER BUTA, individually; JOHN DEMAY, individually and as successor-in-interest to JORDAN DEMAY; K.K. on behalf of A.L.; J.B. on behalf of P.B.; K.W. individually and on behalf of K.A.; T.L. and L.T., individually; M.S. on behalf of C.L.; R.S. on behalf of A.D.; G.B. on behalf of I.B.; I.V. on behalf of V.V.; C.L., individually; L.B. on behalf of A.G.; N.F. on behalf of A.F.; T.S. on behalf of A.S.; H.J. individually and on behalf of H.C.; and M.P. on behalf of N.L.,

Plaintiff(s),

v.

META PLATFORMS, INC.; INSTAGRAM, LLC; FACEBOOK PAYMENTS, INC.; SICULUS, INC.; FACEBOOK OPERATIONS, LLC; SNAP, INC.; BYTEDANCE, LTD.; BYTEDANCE, INC.; TIKTOK, LTD.; TIKTOK, LLC; TIKTOK, INC.; GOOGLE LLC; YOUTUBE, LLC; DISCORD, INC.,

Defendant(s).

CIVIL ACTION NO.

COMPLAINT

JURY DEMAND
I. INTRODUCTION

1. American children are suffering an unprecedented mental health crisis fueled by Defendants’ addictive and dangerous social media products.

2. In the past decade, Americans’ engagement with social media grew exponentially, nowhere more dramatically than among our country’s youth. That explosion in usage is no accident. It is the result of studied efforts by Defendants META PLATFORMS, INC., INSTAGRAM, LLC, FACEBOOK PAYMENTS, INC., SICULUS, INC., and FACEBOOK OPERATIONS, LLC (collectively, “Meta”); SNAP, INC. (“Snap”); BYTEDANCE, LTD., BYTEDANCE, INC., TIKTOK, LTD., TIKTOK, LLC, and TIKTOK, INC. (collectively, “ByteDance” or “TikTok”); ALPHABET INC., GOOGLE LLC, and YOUTUBE, LLC (collectively, “Google” or “YouTube”); and DISCORD, INC. (“Discord”) to induce young people to compulsively use their platforms and products. Borrowing heavily from the behavioral and neurobiological techniques used by slot machines and exploited by the cigarette industry, Defendants deliberately embedded in their products an array of design features aimed at maximizing youth engagement to drive advertising revenue. Defendants know children are in a developmental stage that leaves them particularly vulnerable to the addictive effects of these features. Defendants target them anyway, in pursuit of additional profit.

3. The defects in Defendants’ products vary by platform, but all exploit children and adolescents. They include but are not limited to an algorithmically generated, endless feed to keep users scrolling in an induced “flow state”; “intermittent variable rewards” that manipulate dopamine delivery to intensify use; “trophies” to reward extreme usage; metrics and graphics to exploit social comparison; incessant notifications that encourage repetitive account checking by manufacturing insecurity; inadequate, essentially illusory age verification protocols; and deficient tools for parents that create the illusion of control.

4. The resulting ubiquity of Defendants’ products in the lives and palms of our kids, and the ensuing harm to them, is hard to overstate. Today, over a third of 13- to 17-year-old kids report using at least one of Defendants’ apps “almost constantly” and admit this is “too much.” Yet more than half of these kids report that they would struggle to cut back on their social media use. Instead of feeding coins into machines, kids are feeding Defendants’ platforms with an endless supply of attention, time, and data.
5. Defendants’ choices have generated extraordinary corporate profits—and yielded immense tragedy. Suicide rates for youth are up an alarming 57%. Emergency room visits for anxiety disorders are up 117%. In the decade leading up to 2020, there was a 40% increase in high school students reporting persistent sadness and hopelessness, and a 36% increase in those who attempted to take their own lives. In 2019, one in five high school girls had made a suicide plan. In 2021, one in three girls seriously considered attempting suicide. Children and their parents and guardians across the country have struggled to cope with the severe, lasting damage visited on their families by anxiety, depression, addiction, eating disorders, self-harm, suicidality, and the loss of outliving one’s child.

6. This lawsuit follows on a growing body of scientific research, including Defendants’ own internal (previously concealed) studies, that draws a direct line between Defendants’ conscious, intentional design choices and the youth mental health crisis gripping our nation. Defendants and their products have rewired how our kids think, feel, and behave. Disconnected “Likes” have replaced the intimacy of adolescent friendships. Mindless scrolling has displaced the creativity of play and sport. While presented as “social,” Defendants’ products have in myriad ways promoted disconnection, disassociation, and a legion of resulting mental and physical harms.

7. The U.S. Surgeon General recently explained that children versus Big Tech is “just not a fair fight.”1 “You have some of the best designers and product developers in the world who have designed these products to make sure people are maximizing the amount of time they spend on these platforms. And if we tell a child, use the force of your willpower to control how much time you’re spending, you’re pitting a child against the world’s greatest product designers.”

8. Over the past decade, Defendants have relentlessly pursued a strategy of growth-at-all-costs, recklessly ignoring the impact of their products on children’s mental and physical health and well-being. In a race to corner the “valuable but untapped” market of tween and teen users, each Defendant designed product features to promote repetitive, uncontrollable use by kids.2

1 Allison Gordon & Pamela Brown, Surgeon General says 13 is ‘too early’ to join social media, CNN (Jan. 29, 2023), https://www.cnn.com/2023/01/29/health/surgeon-general-social-media/index.html. Exhibits and referenced materials are incorporated in this Master Complaint as if fully stated herein.
9. Adolescents and children are central to the Defendants’ business models. These age groups are highly connected to the Internet, more likely to have social media accounts, and more likely to devote their downtime to social media usage. Additionally, youth influence the behavior of their parents and younger siblings. As one Defendant put it, “los[ing] the teen foothold in the U.S.” would mean “los[ing] the pipeline” for growth.3

10. Recognizing the power of engaging young users, Defendants deliberately tweaked the design and operation of their apps to exploit the psychology and neurophysiology of kids. Because children’s and adolescents’ brains are not fully developed, they lack the same emotional maturity, impulse control, and psychological resiliency as adults. As a result, they are uniquely susceptible to addictive features in digital products and highly vulnerable to the consequent harms. Knowing this, Defendants wrote code designed to manipulate dopamine release in children’s developing brains and, in doing so, create compulsive use of their apps.

11. Defendants’ strategy paid off. Users of their products now number in the billions, and the frequency and time spent by these users has grown exponentially. This has allowed Defendants to harvest a vast amount of personal user data—from the school you attend, to the sneakers you covet, to the places you’ve been and the people you’ve met. This, in turn, has allowed Defendants to mint a fortune, by selling to others the ability to micro-target advertisements to incredibly narrow slices of the public.4

12. Defendants’ growth has come at the expense of their most vulnerable users: children around the world, including Plaintiffs, whom they cultivated and exploited. Plaintiffs are not merely the collateral damage of Defendants’ products. They are the direct victims of the intentional product design choices made by each Defendant. They are the intended targets of the harmful features that pushed them into self-destructive feedback loops.

2 Georgia Wells & Jeff Horwitz, Facebook’s Effort to Attract Preteens Goes Beyond Instagram Kids, Documents Show, Wall St. J. (Sept. 28, 2021), https://www.wsj.com/articles/facebook-instagram-kids-tweens-attract-11632849667; see also Haugen_00022339.
3 Sheera Frenkel et al., Instagram Struggles with Fears of Losing Its ‘Pipeline’: Young Users, N.Y. Times (Oct. 26, 2021), available at https://www.nytimes.com/2021/10/16/technology/instagram-teens.html.
4 See Snap, Inc., 2022 Annual Report (Form 10-K) at 15 (Jan. 31, 2023) (“[W]e rely heavily on our ability to collect and disclose data[] and metrics to our advertisers so we can attract new advertisers and retain existing advertisers. Any restriction or inability, whether by law, regulation, policy, or other reason, to collect and disclose data and metrics which our advertisers find useful would impede our ability to attract and retain advertisers.”).
13. As a direct result of Defendants’ successful promotion of their defective products, the rates of mental health issues among children have climbed steadily since 2010. By 2018, suicide was the second leading cause of death for youth.5

14. Weeks later, the U.S. Surgeon General issued an advisory “to highlight the urgent need to address the nation’s youth mental health crisis.”6 In a scathing rebuke of the assault on our children, the Surgeon General recognized the dangerous designs in Defendants’ products and Defendants’ abdication of responsibility for the resulting harms:

    In these digital public spaces, which are privately owned and tend to be run for profit, there can be tension between what’s best for the technology company and what’s best for the individual user or for society. Business models are often built around maximizing user engagement as opposed to safeguarding users’ health and ensuring that users engage with one another in safe and healthy ways . . . . [T]echnology companies must step up and take responsibility for creating a safe digital environment for children and youth. Today, most companies are not transparent about the impact of their products, which prevents parents and young people from making informed decisions and researchers from identifying problems and solutions.7

15. The Surgeon General’s comments have since been echoed by President Biden himself. In both his 2022 and 2023 State of the Union Addresses, the President urged the nation to “hold social media platforms accountable for the national experiment they’re conducting on our children for profit.”8 In a January 11, 2023 op-ed, President Biden amplified this point: “The risks Big Tech poses for ordinary Americans are clear. Big Tech companies collect huge amounts of data on the things we buy, on the websites we visit, on the places we go and, most troubling of all, on our children.”9 The President observed that “millions of young people are struggling with bullying, violence, trauma and mental health” as a result of Defendants’ conduct and products, and again stated that “[w]e must hold social-media companies accountable” for their role in this crisis.10

5 CDC, Deaths: Leading Causes for 2018, 70(4) National Vital Statistics Reports at 10 (May 17, 2021), https://www.cdc.gov/nchs/data/nvsr/nvsr70/nvsr70-04-508.pdf.
6 Press Release, U.S. Dep’t Health & Hum. Servs., U.S. Surgeon General Issues Advisory on Youth Mental Health Crisis Further Exposed by COVID-19 Pandemic (Dec. 7, 2021), https://www.hhs.gov/about/news/2021/12/07/us-surgeon-general-issues-advisory-on-youth-mental-health-crisis-further-exposed-by-covid-19-pandemic.html.
7 U.S. Surgeon General’s Advisory, Protecting Youth Mental Health (Dec. 7, 2021), https://www.hhs.gov/sites/default/files/surgeon-general-youth-mental-health-advisory.pdf (emphasis in original).
8 The White House, President Biden’s State of the Union Address (Mar. 1, 2022), https://www.whitehouse.gov/state-of-the-union-2022/; see also The White House, President Biden’s State of the Union Address (Feb. 7, 2023), https://www.whitehouse.gov/state-of-the-union-2023/.
16. These statements by President Biden and the Surgeon General are in line with a substantial body of peer-reviewed scientific literature documenting the harmful impact that Defendants’ apps have on our children, including the various injuries suffered by Plaintiffs. This body of research demonstrates that Defendants’ defectively designed products can cause the harms Plaintiffs suffer: addiction, compulsive use, anxiety, depression, eating disorders, body dysmorphia, self-harm, sexual exploitation, suicidal ideations, other serious diseases and injuries, and suicide itself. Overall rates of these disorders have increased greatly because of widespread consumption of Defendants’ products by children in this country and across the world.

17. Defendants knew or should have known about the risks of such addiction—which at least one Defendant euphemistically calls “problematic use.” They could have changed their products to avoid the harm. They could have warned the public and Plaintiffs about the danger. Instead, Defendants placed growth first.

18. Plaintiffs AVIANNAH-LEIGH SALTERS, individually, F.R. on behalf of C.R., J.K. on behalf of A.K., T.J. individually, and A.J., individually and on behalf of K.J., L.M. on behalf of F.M., SHANYA RAY, M.M. on behalf of S.M., S.R. on behalf of S.H., JENNIFER BUTA, individually, JOHN DEMAY, individually and as successor-in-interest to JORDAN DEMAY, K.K. on behalf of A.L., J.B. on behalf of P.B., K.W. individually and on behalf of K.A., T.L. and L.T., M.S. on behalf of C.L., R.S. on behalf of A.D., G.B. on behalf of I.B., I.V. on behalf of V.V., C.L., L.B. on behalf of A.G., N.F. on behalf of A.F., T.S. on behalf of A.S., H.J. individually and on behalf of H.C., and M.P. on behalf of N.L., bring this action for personal injuries and, where applicable, wrongful death, against Defendants for harms caused because of use of Defendants’ platforms and wrongful conduct, including: (a) designing defective products that caused serious injuries to users; (b) failing to provide adequate warnings about serious and reasonably foreseeable health risks from use of the products; (c) failing to utilize reasonable care in, among other things, developing, designing, managing, operating, testing, producing, labeling, marketing, advertising, promoting, controlling, selling, supplying, and distributing their products; and (d) engaging in the deliberate concealment, misrepresentation, and obstruction of public awareness of serious health risks to users of their products.

9 Joe Biden, Republicans and Democrats, Unite Against Big Tech Abuses, Wall St. J. (Jan. 11, 2023), https://www.wsj.com/articles/unite-against-big-tech-abuses-social-media-privacy-competition-antitrust-children-algorithm-11673439411.
10 Joe Biden, Republicans and Democrats, Unite Against Big Tech Abuses, Wall St. J. (Jan. 11, 2023), https://www.wsj.com/articles/unite-against-big-tech-abuses-social-media-privacy-competition-antitrust-children-algorithm-11673439411.
II. THE PARTIES

A. PLAINTIFFS

19. This Complaint is filed by and on behalf of children who suffered personal injuries due to their use of Defendants’ products and, where applicable, their parents, guardians, spouses, children, siblings, and close family members, who suffered loss of society and other injuries as a consequence of the harms to Plaintiffs (collectively, “Plaintiffs”).

20. Plaintiffs have suffered various personal injuries because of their use of Defendants’ products. Plaintiffs have been harmed as a direct and proximate result of Defendants’ wrongful conduct. These harms include pain, suffering, disability, impairment, disfigurement, death, an increased risk of injury and other serious illnesses, loss of enjoyment of life, loss of society, aggravation or activation of preexisting conditions, scarring, inconvenience, incurred costs for medical care and treatment, loss of wages and wage-earning capacity, and other economic and non-economic damages, as set forth herein. These losses are often permanent and continuing in nature.

21. Plaintiffs expressly disaffirm any contract they may have made with Defendants, or that Defendants may claim they made with them, before reaching the age of majority, as they lacked capacity to contract.

22. Plaintiffs also expressly disaffirm any contract they may have made with any of the Defendants, or that Defendants may claim they made with them, after reaching the age of majority, because Plaintiffs’ continued use of Defendants’ products was compulsive and due to addiction, not an affirmation of any contract.
B. DEFENDANTS

23. The defendants identified in this section are collectively referred to as “Defendants” throughout this Complaint.

1. Meta

24. Defendant Meta Platforms, Inc. (“Meta Platforms”) is a Delaware corporation and multinational technology conglomerate. Its principal place of business is in Menlo Park, CA.

25. Meta Platforms’ subsidiaries include, but may not be limited to, the entities identified in this section, as well as a dozen others whose identity or involvement is presently unclear.

26. Defendant Facebook Payments, Inc. (“Facebook 1”) is a wholly owned subsidiary of Meta Platforms that was incorporated in Florida on December 10, 2010. Facebook 1 manages, secures, and processes payments made through Meta Platforms, among other activities. Its principal place of business is in Menlo Park, CA.

27. Defendant Siculus, Inc. (“Siculus”) is a wholly owned subsidiary of Meta Platforms that was incorporated in Delaware on October 19, 2011. Siculus constructs data facilities to support Meta Platforms’ products. Its principal place of business is in Menlo Park, CA.

28. Defendant Facebook Operations, LLC (“Facebook 2”) is a wholly owned subsidiary of Meta Platforms that was incorporated in Delaware on January 8, 2012. Facebook 2 is likely a managing entity for Meta Platforms’ other subsidiaries. Meta Platforms is the sole member of this LLC, whose principal place of business is in Menlo Park, CA.

29. Defendant Instagram, LLC (“Instagram, LLC”) launched an app called Instagram in October 2010. On or around April 7, 2012, Meta Platforms purchased Instagram, LLC for over one billion dollars and reincorporated the company in Delaware. Meta Platforms is the sole member of this LLC, whose principal place of business is in Menlo Park, CA.

30. Meta Platforms, Instagram, Siculus, Facebook 1, and Facebook 2 are referred to jointly as “Meta.”

31. Meta owns, operates, controls, produces, designs, maintains, manages, develops, tests, labels, markets, advertises, promotes, supplies, and distributes digital products available through mobile- and web-based applications (“apps”), including Instagram and Facebook (together, “Meta products”); Messenger; and Messenger Kids, as well as the virtual reality (VR) headset, Oculus. Meta’s apps and devices are widely distributed to consumers throughout the United States.
2. Snap

32. Defendant Snap Inc. (“Snap”) is a Delaware corporation. Its principal place of business is in Santa Monica, CA.

33. Snap owns, operates, controls, produces, designs, maintains, manages, develops, tests, labels, markets, advertises, promotes, supplies, and distributes the app Snapchat. Snapchat is widely available to consumers throughout the United States.

3. ByteDance

34. Defendant ByteDance Ltd. is a global company incorporated in the Cayman Islands. Its principal place of business is in Beijing, China. ByteDance Ltd. also maintains offices in the United States, Singapore, India, and the United Kingdom, among other locations.

35. ByteDance Ltd. wholly owns its subsidiary Defendant ByteDance Inc., a Delaware corporation whose principal place of business is in Mountain View, CA.

36. ByteDance Ltd.’s key Chinese subsidiary is Beijing Douyin Information Service Limited, f/k/a Beijing ByteDance Technology Co. Ltd. (“Beijing ByteDance”).11 Beijing ByteDance owns, operates, and holds key licenses to Douyin, the Chinese version of TikTok. On or around April 30, 2021, the Chinese government took a 1% stake in, and received one of three seats on the board of directors of, Beijing ByteDance.12 Specifically, 1% of Beijing ByteDance is now owned by WangTouZhongWen (Beijing) Technology, which in turn is owned by China Internet Investment Fund (China’s top Internet regulator and censor), China Media Group (China’s national broadcaster, controlled by the Chinese Communist Party’s propaganda department), and the Beijing municipal government’s investment arm.

11 See Sophie Webster, ByteDance Changes Names of Subsidiaries to Douyin, Speculated to be Mulling an IPO, Tech Times (May 8, 2022), available at https://www.techtimes.com/articles/275188/20220508/bytedance-changes-names-subsidiaries-douyin-speculated-mulling-ipo.htm.
12 See Juro Osawa & Shai Oster, Beijing Tightens Grip on ByteDance by Quietly Taking Stake, China Board Seat, The Information (Aug. 16, 2021), available at https://www.theinformation.com/articles/beijing-tightens-grip-on-bytedance-by-quietly-taking-stake-china-board-seat?rc=ubpjcg.
37. ByteDance Ltd. wholly owns its subsidiary Defendant TikTok, Ltd., a Cayman Islands corporation with its principal place of business in Shanghai, China.

38. TikTok, Ltd. wholly owns its subsidiary Defendant TikTok, LLC, which is, and at all relevant times was, a Delaware limited liability company.

39. TikTok, LLC wholly owns its subsidiary Defendant TikTok, Inc. f/k/a Musical.ly, Inc. (“TikTok, Inc.”), a California corporation with its principal place of business in Culver City, CA.

40. Defendants TikTok, Ltd.; TikTok, LLC; TikTok, Inc.; ByteDance Ltd.; and ByteDance Inc. are referred to jointly as “ByteDance.”

41. ByteDance owns, operates, controls, produces, designs, maintains, manages, develops, tests, labels, markets, advertises, promotes, supplies, and distributes the app TikTok. TikTok is widely available to consumers throughout the United States.

4. Google

42. Google Inc. was incorporated in California in September 1998 and reincorporated in Delaware in August 2003. In or around 2017, Google Inc. converted to a Delaware limited liability company, Defendant Google, LLC (together with its predecessor-in-interest Google Inc., “Google”). Google’s principal place of business is in Mountain View, CA.

43. Since 2006, Google has operated, done business as, and wholly owned as its subsidiary Defendant YouTube, LLC (“YouTube, LLC”). YouTube, LLC is a Delaware limited liability company with its principal place of business in San Bruno, CA. YouTube is widely available to consumers throughout the United States.13

44. On October 2, 2015, Google reorganized and became a wholly owned subsidiary of a new holding company, Alphabet Inc., a Delaware corporation with its principal place of business in Mountain View, CA.

13 See, e.g., Alphabet Inc., Form 10-Q, Oct. 25, 2022, at 4 (defining Alphabet as “Alphabet Inc. and its subsidiaries.”), available at https://www.sec.gov/Archives/edgar/data/1652044/000165204422000090/goog-20220930.htm.
45. Google, LLC and YouTube, LLC (together, “Google”) are alter egos of one another: together and in concert they own, operate, control, produce, design, maintain, manage, develop, test, label, market, advertise, promote, supply, and distribute the app YouTube.

5. Discord

46. Defendant Discord Inc. (“Discord”) is a Delaware corporation. Its principal place of business is in San Francisco, CA.

47. Discord owns, operates, controls, produces, designs, maintains, manages, develops, tests, labels, markets, advertises, promotes, supplies, and distributes the app Discord.

III. JURISDICTION AND VENUE

48. This Court has personal jurisdiction over Defendants because they are incorporated in and have their principal places of business in California, and because they have contacts with California that are so continuous and systematic that they are essentially at home in this state. All Defendants regularly conduct and solicit business in California, provide products and/or services by or to persons here, and derive substantial revenue from the same. All Defendants affirmatively and extensively engage with a significant percentage of this State’s residents through messages, notifications, recommendations, and other communications.

49. There is no federal jurisdiction in this case. All claims are brought pursuant to California state law. There are no federal causes of action, and Plaintiffs expressly disclaim any federal causes of action.

50. Venue is proper in Los Angeles County because one or more defendants are headquartered here and/or one or more Plaintiffs reside here; in addition, Plaintiffs will be relating this case to and filing a Short Form Complaint in Judicial Council Coordination Proceeding No. 5255 (“JCCP 5255”), which proceeding is pending in Los Angeles County.
IV. FACTUAL ALLEGATIONS SPECIFIC TO EACH DEFENDANT

A. GENERAL FACTUAL ALLEGATIONS APPLICABLE TO ALL DEFENDANTS

51. On May 15, 2023, a Master Complaint (“Master Complaint”) was filed in JCCP 5255 in Los Angeles County Superior Court. Plaintiffs hereby incorporate and adopt Sections IV.A.1 through IV.A.7 of the Master Complaint as though set forth in full herein.
B. FACTUAL ALLEGATIONS AS TO META
52. Plaintiffs hereby incorporate and adopt Section IV.B (and all applicable subsections) of the Master Complaint as though set forth in full herein.
53. Plaintiffs make the following, additional allegations relating to Meta’s Oculus product.
54. In June 2012, Palmer Luckey formed the company Oculus. Mr. Luckey had been interested in virtual reality (“VR”) since the age of 15 and began his company with the goal of building a better VR headset than what was then available to consumers.
55. In August 2012, Oculus raised $2.5 million on Kickstarter and, in March 2014, Meta acquired Oculus for $2 billion.14 Since that purchase, Meta has made numerous VR acquisitions, and Meta founder and CEO Mark Zuckerberg intends for VR to become a central part of “sports and education and health care.”15
56. At all times relevant, Meta advertised and sold its Oculus product as a gaming headset, while failing to reasonably disclose that it also could be used to access social media and worked in some respects like social media, including defective and/or inherently dangerous features such as affirmatively recommending and connecting minors to predatory adults, in a situation where Meta’s product design and operational decisions prevent young users from being able to look away.
57. At all times relevant, users of the Oculus headset could also use that product to access Meta’s social media products, Facebook and Instagram, a fact Meta did not call attention to or reasonably disclose in product advertising or packaging.
58. Linking use of the Oculus to Meta’s social media products served no countervailing benefit to consumers and was not necessary to operation of the virtual reality games Meta advertised and offered via the Oculus headset.
14 According to industry insiders, Meta’s quest for the metaverse began in 2014, with its purchase of Oculus. See https://qz.com/2086381/what-facebooks-vr-acquisitions-tell-us-about-metas-future/.
15 https://www.vox.com/2016/3/24/11587234/two-years-later-facebooks-oculus-acquisition-has-changed-virtual; see also https://www.cnet.com/tech/computing/mark-zuckerberg-sees-the-future-of-ar-inside-vr-like-oculus-quest/.
C. FACTUAL ALLEGATIONS AS TO SNAP
59. Plaintiffs hereby incorporate and adopt Section IV.C (and all applicable subsections) of the Master Complaint as though set forth in full herein.
D. FACTUAL ALLEGATIONS AS TO TIKTOK
60. Plaintiffs hereby incorporate and adopt Section IV.D (and all applicable subsections) of the Master Complaint as though set forth in full herein.
E. FACTUAL ALLEGATIONS AS TO GOOGLE
61. Plaintiffs hereby incorporate and adopt Section IV.E (and all applicable subsections) of the Master Complaint as though set forth in full herein.
F. FACTUAL ALLEGATIONS AS TO DISCORD
62. Discord is an on-line social media product that was launched in 2015. The product includes a number of features that allow users to set up accounts and communicate with other product users through group and private means, including video calls, text messaging, and exchange of photos and videos. https://en.wikipedia.org/wiki/Discord.
63. In theory, Discord’s terms of use prohibit users under 13. However, Discord does not verify user age or identity. Moreover, Discord has long been on notice that children 13 and younger use the product—and that adults preying on children do too.16 Discord is aware, and it is commonly known and understood, that nobody follows that rule and that Discord allows children 13 and younger to use its social media product. Children as young as eight years old currently are using the Discord social media product. See Kellen Browning, 5 Ways Young People Are Using Discord, N.Y. Times, Dec. 29, 2021.
64. Discord could enforce its supposed policy of prohibiting children 13 or younger if it wanted to. Among other things, minor users often state their real age in their bio and/or tell other users their real age in group and private chats using one or more Discord product features. Likewise, minor users post photos that often reflect their actual age.
65. Discord’s product features also create an unreasonable opportunity for and risk of sexual exploitation of kids. By default, all product users—including users under 18—can receive friend invitations from anyone in the same server, which opens the ability for them to send and receive private messages from strangers.17
16 See, e.g., Nellie Bowles and Michael Keller, Video Games and Online Chats Are ‘Hunting Grounds’ for Sexual Predators, N.Y. Times, Dec. 7, 2019, available at https://www.nytimes.com/interactive/2019/12/07/us/video-games-child-sex-abuse.html.
66. Minor users lack the cognitive ability and life experience to identify online grooming behavior by prurient adults and the psychosocial maturity to decline invitations to exchange salacious material and mass-messaging capabilities. At all times relevant, Discord allowed direct messaging with and by minors without parental notification.
67. Discord has designed and operates its product in such a way that it also allows people to chat using fake names, and the task of enforcing community standards is largely delegated to the organizers of individual Discord “servers.”18 “Server” is the term used to describe a key feature of the Discord product that allows users (including, as described above, underage users) to connect, exchange photo and video files, and use Discord’s audio and video communication features.
68. The Discord product generates revenue for Discord through subscription fees that give users access to features like custom emojis for $5 or $10 per month. Discord also began experimenting in December 2021 with a new feature that allows some users to charge for access to their server, up to $100 a month, of which the company takes a 10 percent cut.19
69. Discord is well aware that users can and do use Discord in ways that pose risks of harm, abuse, and exploitation, including using Discord to initiate and engage in explicit sexual conduct. Discord creates false assurances that users (including minor users) can turn on a feature to “keep [them] safe.” On information and belief, at times relevant, the Discord support page represented its “Safe Direct Messaging” feature as follows:
Keep me safe: The safest option. This will have Discord scan any image sent in all DMs, regardless of whether you’ve added the user on your friend list, or the user is DMing you just by sharing a mutual server.
My friends are nice: The medium-est option! This will let Discord know to scan any images sent in DMs from users that aren’t on your friends list, but also to trust your previously-added friends and not worry about any images they send.
17 Samantha Murphy Kelly, The dark side of Discord for teens, CNN Business, Mar. 22, 2022, https://www.cnn.com/2022/03/22/tech/discord-teens/index.html.
18 Kellen Browning, How Discord, Born From an Obscure Game, Became a Social Hub for Young People, N.Y. Times, Dec. 29, 2021, available at https://www.nytimes.com/2021/12/29/business/discord-server-social-media.html.
19 Kellen Browning, How Discord, Born From an Obscure Game, Became a Social Hub for Young People, N.Y. Times, Dec. 29, 2021, available at https://www.nytimes.com/2021/12/29/business/discord-server-social-media.html.
Do not scan: The self-confident option. Enabling this option will completely disable Discord’s image scanning process, and leave you for a walk on the wild side for any and all DMs you receive. Careful, it’s a jungle out there!20
The representations that Discord will “scan” images and provide “Safe Messaging!” reasonably would lead minor users to believe that Discord will scan for and block sexually explicit materials, including materials used to groom and/or blackmail users, and otherwise “keep [the user] safe.”
70. As a threshold matter, Discord’s direct messaging and “Safe Direct Messaging” product features are inherently defective and/or dangerous for minor users because Discord should not be presenting less-safe product options (“My friends are nice” and “Do not scan”) to users under 18 without their parents’ knowledge and consent.
71. Moreover, Discord downplays the serious dangers its product presents to minor users by “jokingly” referring to unsolicited sexual overtures, grooming, displays of explicit sexual materials, abuse, exploitation, and blackmail as “the wild side” and “a jungle.” Presenting these less-safe options to minors, without verification of parental consent, is a product defect and/or inherently dangerous because Discord is attempting to contract with minors without warning to minors’ parents and in a manner that poses significant risk to a significant number of its minor users.
72. Discord’s Safe Direct Messaging product also is defective and/or does not operate as promised and as reasonably understood by Discord’s minor (and even adult) users, and it creates a false sense of safety that results in direct harm to a significant number of Discord’s minor users.
73. A reasonable person reviewing Discord’s description of its “Safe Direct Messaging” feature (including but not limited to children like S.U.) reasonably would understand that Discord was representing it would keep them “safe” from harmful interactions and predatory users, including by monitoring and saving conversations had via Discord’s social media product, as long as the user selects the “Keep me safe” option. Discord’s product feature does nothing of the sort, however. On information and belief, at best, Discord scans image files for malware and similar issues. As such, users (especially minor users) are lulled into a false and dangerous sense of security while using the product.
20 Discord Safety: Safe Messaging!, https://support.discord.com/hc/en-us/articles/115000068672-Discord-Safety-Safe-Messaging-
74. Discord’s platform is defective because its “Safe Direct Messaging” feature does not actually keep minor Discord users safe, and it is deceptive, misleading, and inherently dangerous for minor Discord users because it creates a false sense of security and encourages them to use the Discord product under the false belief that it is safe for children.
75. Discord’s platform also is defective and unreasonably unsafe for minor users because it has no effective parental controls. Children are able to and do routinely use Discord despite being 13 or younger and despite having no parental consent to use the product, let alone to be placed at risk of trauma, abuse, and exploitation through use of the product exactly in the manner and for the purpose for which Discord designed the product.
76. Discord’s platform also is defective for failure to provide any effective warnings to minor users, let alone their guardians, of the known risk of harm from abuse and sexual exploitation when children use the product.
77. Discord’s platform also is defective for failure to provide guardians with any means to effectively report or otherwise stop Discord from providing their minor children with access to its platform.
V. PLAINTIFF-SPECIFIC FACTUAL ALLEGATIONS
A. Aviannah-Leigh Salters
78. Plaintiff Aviannah-Leigh Salters asserts claims against the Meta, Snap, and TikTok Defendants, in connection with the Instagram, Snapchat, and TikTok social media platforms and platform features.
79. Aviannah-Leigh currently is 18 and is a resident of Massachusetts.
80. Aviannah-Leigh got her first phone at age 12 and began using the Instagram, Snapchat, and TikTok applications almost immediately. She opened and began using those applications without her parents’ knowledge or consent. Meta, Snap, and TikTok further marketed their platforms as fun and safe for children, such that her parents had no means to suspect that these platforms were defective and/or inherently dangerous even once Aviannah-Leigh’s use became known.
81. Aviannah-Leigh’s Instagram, Snapchat, and TikTok use coincided