LAURA MARQUEZ-GARRETT, ESQ., CA Bar No. 221542
laura@socialmediavictims.org
SOCIAL MEDIA VICTIMS LAW CENTER
821 Second Avenue, Suite 2100
Seattle, WA 98104
Telephone: (206) 741-4862
Facsimile: (206) 957-9549

KEVIN M. LOEW, ESQ., CA Bar No. 238080
kloew@waterskraus.com
WATERS, KRAUS & PAUL
222 N. Pacific Coast Hwy, Suite 1900
El Segundo, CA 90245
Tel: 310-414-8146
Fax: 310-414-8156

Attorneys for Plaintiffs
IN THE SUPERIOR COURT OF CALIFORNIA
COUNTY OF LOS ANGELES

Electronically Received 08/12/2022 09:21 AM

CHRISTINA ARLINGTON SMITH, individually and as successor-in-interest to LALANI WALTON, Deceased, HERIBERTO ARROYO, individually and as successor-in-interest to ARRIANI JAILEEN ARROYO, Deceased, and CHRISTAL ARROYO, Individually, JESSICA WILLIAMS, individually and as successor-in-interest to ZAIDEN BALDWIN, Deceased,

Plaintiffs,

vs.

TIKTOK INC.; BYTEDANCE INC.; and DOES 1-100, INCLUSIVE,

Defendants.

Case No.: 22STCV21355

FIRST AMENDED COMPLAINT FOR WRONGFUL DEATH AND SURVIVAL ACTION (STRICT LIABILITY; NEGLIGENCE; AND VIOLATION OF THE CALIFORNIA CONSUMER LEGAL REMEDIES ACT, CAL. CIV. § 1750, et seq.)

JURY DEMAND
FIRST AMENDED COMPLAINT 1
COME NOW PLAINTIFFS CHRISTINA ARLINGTON SMITH, HERIBERTO ARROYO, CHRISTAL ARROYO, and JESSICA WILLIAMS, who allege as follows:
In these digital public spaces, which are privately owned and tend to be run for profit, there can be tension between what's best for the technology company and what's best for the individual user or for society. Business models are often built around maximizing user engagement as opposed to safeguarding users' health and ensuring that users engage with one another in safe and healthy ways. . . . Technology companies must step up and take responsibility for creating a safe digital environment for children and youth.

United States Surgeon General's Advisory, December 7, 2021
Plaintiffs Christina Arlington Smith, Heriberto Arroyo, Christal Arroyo, and Jessica Williams bring this action for wrongful death and survivorship against Defendants TikTok Inc. and ByteDance Inc. (collectively, "TikTok") for the deaths of eight-year-old Lalani Erika Walton, nine-year-old Arriani Jaileen Arroyo, and eleven-year-old Zaiden Baldwin.
I. INTRODUCTION
1. This product liability action seeks to hold TikTok responsible for causing the deaths of Lalani Erika Walton, Arriani Jaileen Arroyo, and Zaiden Baldwin, who all died of self-strangulation after being presented with and encouraged to take the "TikTok Blackout Challenge" on Defendants' social media product.
2. TikTok's algorithm served up to eight-year-old Lalani, nine-year-old Arriani, and eleven-year-old Zaiden the viral and deadly TikTok Blackout Challenge. According to TikTok, its proprietary algorithm is "a recommendation system that delivers content to each user that is likely to be of interest to that particular user . . . each person's feed is unique and tailored to that specific individual." In other words, TikTok has specifically curated and determined that these Blackout Challenge videos – videos featuring users who purposefully strangle themselves until losing consciousness – are appropriate and fitting for small children.
3. TikTok has invested billions of dollars to intentionally design and develop its product to encourage, enable, and push content to teens and children that Defendant knows to be problematic and highly detrimental to its minor users' mental health.
4. Plaintiffs bring claims of strict product liability based upon TikTok's defective design of its social media product, which renders the product addictive and not reasonably safe for ordinary consumers and minor users. It is technologically feasible for TikTok to design social media products that prevent young users from being affirmatively directed to highly dangerous content such as the Blackout Challenge, with a negligible increase in production cost. In fact, on information and belief, the Blackout Challenge currently cannot be found on TikTok's social media product or, in fact, anywhere online. It appears to have been removed from archiving providers, such as www.wayback.archive.org, as well.
5. Plaintiffs also bring claims for strict liability based on TikTok's failure to provide adequate warnings to minor users and their parents that TikTok is addictive and directs vulnerable users to highly dangerous and harmful challenges, including but not limited to the Blackout Challenge. The addictive quality of TikTok's product and its tendency to direct young users to highly dangerous challenges are unknown to minor users and their parents.
6. Plaintiffs also bring claims for common law negligence arising from TikTok's unreasonably dangerous social media product and its failure to warn of such dangers. TikTok knew, or in the exercise of ordinary care should have known, that its social media product is addictive to young users and directs them to highly dangerous content promoting self-harm, yet failed to re-design its product to ameliorate these harms or warn minor users and their parents of dangers arising out of the foreseeable use of the TikTok product.
II. PARTIES
7. Plaintiff Christina Arlington Smith is the mother of Lalani Erika Walton, who died on July 15, 2021, and is the successor-in-interest to her estate.
8. Christina Arlington has not entered into a User Agreement or other contractual relationship with TikTok herein in connection with Lalani Walton's use of Defendants' social media product. Plaintiff is not bound by any arbitration, forum selection, choice of law, or class action waiver set forth in said User Agreements. Additionally, as successor-in-interest to the Estate of Lalani Walton, Plaintiff expressly disaffirms any and all User Agreements with TikTok into which Lalani may have entered.
9. Plaintiff Heriberto Arroyo is the father of Arriani Jaileen Arroyo, who died on February 26, 2021, and is the successor-in-interest to her estate.
10. Heriberto Arroyo has not entered into a User Agreement or other contractual relationship with TikTok herein in connection with Arriani Jaileen Arroyo's use of Defendants' social media product. Plaintiff is not bound by any arbitration, forum selection, choice of law, or class action waiver set forth in said User Agreements. Additionally, as successor-in-interest to the Estate of Arriani Jaileen Arroyo, Plaintiff expressly disaffirms any and all User Agreements with TikTok into which Arriani may have entered.
11. Christal Arroyo is the mother of Arriani Jaileen Arroyo, who died on February 26, 2021.
12. Christal Arroyo has not entered into a User Agreement or other contractual relationship with TikTok herein in connection with Arriani Jaileen Arroyo's use of Defendants' social media product. Plaintiff is not bound by any arbitration, forum selection, choice of law, or class action waiver set forth in said User Agreements.
13. Jessica Williams is the grandmother and guardian of Zaiden Baldwin, who died on June 11, 2022.
14. Jessica Williams has not entered into a User Agreement or other contractual relationship with TikTok herein in connection with Zaiden Baldwin's use of Defendants' social media product. Plaintiff is not bound by any arbitration, forum selection, choice of law, or class action waiver set forth in said User Agreements.
15. Defendant TikTok Inc. is a California corporation with its principal place of business in Culver City, CA. Defendant TikTok owns and operates the TikTok social media platform, an application that is widely marketed by TikTok and available to users throughout the United States.
16. At all times relevant hereto, Defendant TikTok Inc. was acting by and through its employees, servants, agents, workmen, and/or staff, all of whom were acting within the course and scope of their employment, for and on behalf of TikTok Inc.
17. Defendant ByteDance Inc. is a Delaware corporation with its principal place of business in Mountain View, CA. Defendant ByteDance owns TikTok Inc. and owns and operates the TikTok social media platform.
18. At all times relevant hereto, Defendant ByteDance Inc. was acting by and through its employees, servants, agents, workmen, and/or staff, all of whom were acting within the course and scope of their employment, for and on behalf of ByteDance Inc.
19. TikTok is highly integrated with its Chinese parent, ByteDance. TikTok's engineering manager works on both TikTok and ByteDance's similar Chinese app, Douyin. TikTok's development processes are closely intertwined with Douyin's processes. TikTok employees are also deeply interwoven into ByteDance's ecosystem. They use a ByteDance product called Lark, a corporate internal communications system like Slack but with aggressive performance-management features aimed at forcing employees to use the system more.
III. JURISDICTION AND VENUE
20. This Court has general jurisdiction over Defendants because TikTok Inc. and ByteDance Inc. have their principal places of business in California and are "at home" in this State.
21. Venue is proper in Los Angeles County because TikTok is headquartered here.
IV. FACTUAL ALLEGATIONS
A. TikTok's Applications Are Products
22. TikTok is a video-sharing social media application where users create, share, and view short video clips. TikTok exclusively controls and operates the TikTok platform for profit, generating advertising revenue by maximizing the amount of time users spend on the platform and their level of engagement. The greater the amount of time that young users spend on TikTok, the greater the advertising revenue TikTok earns.
23. Users who open the TikTok application are automatically shown an endless stream of videos selected by an algorithm developed by TikTok to show content on each user's For You Page ("FYP") based upon each user's demographics, likes, and prior activity on the app. In addition, TikTok's algorithm uses individualized user data and demographic information gleaned from third-party sources and statistical data, as well as other data points collected by TikTok, in directing users to particular content.
24. TikTok is a social media product designed to be used by children and actively marketed to children across the United States, including in the State of California. Further, TikTok is aware that large numbers of children under the age of 13 use its product despite user terms or "community standards" that purport to restrict use to individuals who are 13 and older.
25. In fact, this product is designed to be used by minors and is actively marketed to minors across the United States. TikTok markets to minors through its own marketing efforts and design, and it also works with and actively encourages advertisers to create ads targeted at and appealing to teens, and even to children under the age of 13. TikTok spends millions researching, analyzing, and experimenting with young children to find ways to make its product more appealing and addictive to these age groups, as these age groups are seen as the key to TikTok's long-term profitability and market dominance.
26. TikTok is aware that large numbers of children under the age of 18 use its product without parental consent. It designs its product in a manner that allows and/or does not prevent such use to increase user engagement and, thereby, its own profits.
27. TikTok is likewise aware that large numbers of children under the age of 13 use its product despite user terms or "community standards" that purport to restrict use to individuals who are 13 and older. It has designed its product in a manner that allows and/or does not prevent such use to increase user engagement and, thereby, its own profits.
28. Moreover, even in instances where TikTok has actual and/or constructive knowledge of underage users opening accounts, posting, and otherwise using its social media product, TikTok fails to prevent and protect against such harmful and illegal use.
B. TikTok Designed its Product to be Addictive to Young Users
29. TikTok has designed its algorithms to addict users and cause them to spend as much time on the application as possible through advanced analytics that create a variable reward system tailored to each user's viewing habits and interests.
30. TikTok's algorithm has four main goals, which the company translates as "user value," "long-term user value," "creator value," and "platform value."
31. An internal TikTok document entitled "TikTok Algo 101" was created by TikTok's engineering team in Beijing and offers details about both the product's mathematical core and insight into the company's understanding of human nature. The document explains frankly that in the pursuit of the company's "ultimate goal" of adding daily active users, TikTok has chosen to optimize for two closely related metrics in the stream of videos it serves: "retention" (that is, whether a user comes back) and "time spent." The document offers a rough equation for how videos are scored, in which a prediction driven by machine learning and actual user behavior are summed up for each of three bits of data (likes, comments, and playtime) as well as an indication that the video has been played.
32. A recent Wall Street Journal report revealed how TikTok relies heavily on how much time users spend watching each video to steer them toward more videos that will keep them scrolling, and that process can sometimes lead young viewers down dangerous rabbit holes, in particular toward content that promotes suicide or self-harm.
33. TikTok purports to have a minimum age requirement of 13 years old but does little to verify users' ages or enforce its age limitations despite having actual knowledge that use by underage users is widespread. TikTok knows that hundreds of thousands of children as young as six years old are currently using its social media product but undertakes no attempt to identify such users and terminate their usage. On information and belief, the reason TikTok has not sought to limit usage of its social media product by young children is that doing so would diminish the advertising revenue TikTok earns through such users. TikTok also does not seek parental consent for underage users or provide any warnings or controls that would allow parents to monitor and limit the use of TikTok by their children, despite TikTok's own current Terms of Service claiming that users under the age of 18 require parental consent to use its product. TikTok could quickly and reasonably implement tools to verify the age and identity of its users but knows that doing so would result in the loss of millions of current TikTok users, due to some being under the age of 13 and others not having parental consent.
34. Until mid-2021, TikTok by default made all user profiles "public," meaning that strangers, often adults, could view and message underage users of the TikTok app. This is an inherently harmful product feature, particularly when combined with TikTok's failure to enforce legal and self-imposed age limitations, as it makes small children available to predatory TikTok users in a manner that actively interferes with parental oversight and involvement and puts those children in an inherently vulnerable and dangerous position.
35. TikTok does not seek parental consent for underage users or provide any warnings or controls that would allow parents to monitor and limit the use of TikTok by their children.
36. TikTok has developed images and memes for users to decorate and apply to the pictures or videos they post on TikTok. TikTok has also acquired publication rights to music that its users can incorporate in the pictures and videos they post on TikTok. When users incorporate images, memes, and music supplied by TikTok into their postings, TikTok becomes a co-publisher of such content. A TikTok user who incorporates images, memes, and musical content supplied by TikTok into their posts is functionally equivalent to a novelist who incorporates illustrations into her story. TikTok can no more characterize the images, memes, and musical content it supplies to its users as third-party content than the novelist can disclaim responsibility for the illustrations contained in her book.
37. TikTok has developed artificial intelligence technology that detects adult users of TikTok who send sexually explicit content to children and receive sexually explicit images from children. This technology furnishes TikTok with actual knowledge that a significant number of minor TikTok users are solicited to send and actually do send sexually explicit photos and videos of themselves to adult users in exchange for consideration in violation of 18 U.S.C. § 1591(a)(1)–(b).
C. TikTok's Business Model is Based on Maximizing User Screen Time
38. TikTok advertises its product as "free" because it does not charge users for downloading or using the product. What many users do not know is that, in fact, TikTok makes its astronomical profits by targeting advertisements and harmful content to young users and by finding unique and increasingly dangerous ways to keep those young users hooked on its social media product. TikTok receives revenue from advertisers who pay a premium to target advertisements to specific demographic groups of TikTok users including, and specifically, users in California under the age of 18. TikTok also receives revenue from selling its users' data, including data belonging to users under the age of 13, to third parties.
39. The amount of revenue TikTok receives is based upon the amount of time and user engagement on its platform, which directly correlates with the number of advertisements that can be shown to each user.
40. TikTok is designed around a series of design features that do not add to the communication utility of the application, but instead seek to exploit users' susceptibility to persuasive design and the unlimited accumulation of unpredictable and uncertain rewards, including "likes," "followers," and "views." In the hands of children, this design is unreasonably dangerous to the mental well-being of underage users' developing minds.
41. According to industry insiders, TikTok has employed thousands of engineers to help make the TikTok product maximally addicting. For example, TikTok's "pull to refresh" is based on how slot machines operate. It creates an endless feed, designed to manipulate brain chemistry and to prevent natural end points that would otherwise encourage users to move on to other activities.
42. TikTok does not warn users of the addictive design of the TikTok product. On the contrary, TikTok actively tries to conceal the dangerous and addictive nature of its product, lulling users and parents into a false sense of security. This includes consistently playing down its product's negative effects on teens in public statements and advertising, making false or materially misleading statements concerning product safety, marketing TikTok as a family application that is fun and safe for all ages, and refusing to make its research public or available to academics or lawmakers who have asked for it.
43. TikTok product managers and designers attend and even present at an annual conference held in Silicon Valley called the Habit Summit, the primary purpose of which is to learn how to make products more habit-forming.
44. TikTok engineers its social media product to keep users, and particularly young users, engaged longer and coming back for more. This is referred to as "engineered addiction," and examples include features like bottomless scrolling, tagging, notifications, and live stories.
D. TikTok Has Designed Complex Algorithms to Addict Young Users
45. TikTok has intentionally designed its product to maximize users' screen time, using complex algorithms designed to exploit human psychology and driven by the most advanced computer algorithms and artificial intelligence available to two of the largest technology companies in the world.
46. TikTok has designed and progressively modified its product to promote excessive use that it knows is indicative of addictive and problematic use.
47. One of these features present in TikTok is the use of complex algorithms to select and promote content that is provided to users in an unlimited and never-ending "feed." TikTok is well aware that algorithm-controlled feeds promote unlimited "scrolling," a type of use that studies have identified as detrimental to users' mental health; however, TikTok maintains this harmful product feature because it allows TikTok to display more advertisements and, thus, obtain more revenue.
48. TikTok has also designed its algorithm-controlled feeds to promote content most likely to increase user engagement, which often means content that TikTok knows to be harmful to its users. This is content that users might otherwise never see but for TikTok affirmatively pushing such content to their accounts.
49. The addictive nature of TikTok's product and the complex and psychologically manipulative design of its algorithms are unknown to ordinary users.
50. TikTok goes to significant lengths to prevent transparency, including posing as a "free" social media platform, burying advertisements in personalized content, and making public statements about the safety of the TikTok product that simply are not true.
51. TikTok has also developed unique product features designed to limit, and has in other ways limited, parents' ability to monitor and prevent problematic use by their children.
52. The algorithms that render TikTok's social media product addictive are designed to be content neutral. They adapt to the social media activity of individual users to promote whatever content will trigger a particular user's interest and maximize their screen time. TikTok's algorithm designs do not distinguish, rank, discriminate, or prioritize between particular types of content on its social media platform. If User One is triggered by elephants and User Two is triggered by moonbeams, TikTok's algorithm design will promote elephant content to User One and moonbeam content to User Two. TikTok's above-described algorithms are solely quantitative devices and make no qualitative distinctions between the nature and type of content they promote to users.
E. Young Users' Incomplete Brain Development Renders Them Particularly Susceptible to Manipulative Algorithms with Diminished Capacity to Eschew Self-Destructive Behaviors and Less Resiliency to Overcome Negative Social Media Influences
53. The human brain is still developing during adolescence in ways consistent with adolescents' demonstrated psychosocial immaturity. Specifically, adolescents' brains are not yet fully developed in regions related to risk evaluation, emotional regulation, and impulse control.
54. The frontal lobes, and in particular the prefrontal cortex, of the brain play an essential part in higher-order cognitive functions, impulse control, and executive decision-making. These regions of the brain are central to the process of planning and decision-making, including the evaluation of future consequences and the weighing of risk and reward. They are also essential to the ability to control emotions and inhibit impulses. MRI studies have shown that the prefrontal cortex is one of the last regions of the brain to mature.
55. During childhood and adolescence, the brain is maturing in at least two major ways. First, the brain undergoes myelination, the process through which the neural pathways connecting different parts of the brain become insulated with white fatty tissue called myelin. Second, during childhood and adolescence, the brain undergoes "pruning," the paring away of unused synapses, leading to more efficient neural connections. Through myelination and pruning, the brain's frontal lobes change to help the brain work faster and more efficiently, improving the "executive" functions of the frontal lobes, including impulse control and risk evaluation. This shift in the brain's composition continues throughout adolescence and into young adulthood.
56. In late adolescence, important aspects of brain maturation remain incomplete, particularly those involving the brain's executive functions and the coordinated activity of regions involved in emotion and cognition. As such, the part of the brain that is critical for control of impulses and emotions and for mature, considered decision-making is still developing during adolescence, consistent with the demonstrated behavioral and psychosocial immaturity of juveniles.
57. The algorithms in TikTok's social media product exploit minor users' diminished decision-making capacity, impulse control, emotional maturity, and psychological resiliency caused by users' incomplete brain development. TikTok knows, or in the exercise of reasonable care should know, that because its minor users' frontal lobes are not fully developed, such users are much more likely to sustain serious physical and psychological harm through their social media use than adult users. Nevertheless, TikTok has failed to design the TikTok product with any protections to account for and ameliorate the psychosocial immaturity of its minor users.
F. TikTok Misrepresents the Addictive Design and Effects of its Social Media Product
58. During the relevant time period, TikTok stated in public comments that the TikTok product is not addictive and was not designed to be addictive. TikTok knew or should have known that those statements were untrue.
59. TikTok did not warn users or their parents of the addictive and mentally harmful effects that the use of its product was known to cause amongst minor users, like Lalani Walton and Arriani Arroyo. On the contrary, TikTok has gone to significant lengths to conceal and/or avoid disclosure as to the true nature of the TikTok social media product.
G. TikTok Promotes "TikTok Challenges" to Young Users and Knowingly Directs Them to Dangerous Content
60. TikTok also features and promotes various "challenges" in which users film themselves engaging in behavior that mimics and "one-ups" other users posting videos related to a particular challenge. TikTok promotes users creating and posting videos of challenges identified by a system of hashtags that are promoted within TikTok's search feature.
61. At all times relevant, TikTok's algorithm was designed to promote "TikTok Challenges" to young users to increase their engagement and maximize TikTok's profits. TikTok "challenges" involve users filming themselves engaging in behavior that mimics and oftentimes "one-ups" other users posting videos performing the same or similar conduct. These TikTok "challenges" routinely involve dangerous or risky conduct. TikTok's algorithm presents these often-dangerous "challenges" to users on their FYP and encourages users to create, share, and participate in the "challenge."
62. There have been numerous dangerous TikTok challenges that TikTok's app and algorithm have caused to spread rapidly, which promote dangerous behavior, including:
• Fire Mirror Challenge – involves participants spraying shapes on their mirror with a flammable liquid and then setting fire to it.
• Orbeez Shooting Challenge – involves participants shooting random strangers with tiny water-absorbent polymer beads using gel blaster guns.
• Milk Crate Challenge – involves participants stacking a mountain of milk crates and attempting to ascend and descend the unstable structure without falling.
• Penny Challenge – involves sliding a penny behind a partially plugged-in phone charger.
• Benadryl Challenge – involves consuming a dangerous amount of Benadryl in order to achieve hallucinogenic effects.
• Skull Breaker Challenge – involves users jumping in the air while friends kick their feet out from underneath them, causing the users to flip in the air and fall back on their head.
• Cha-Cha Slide Challenge – involves users swerving their vehicles all over the road to the famous song of the same name.
• Dry Scoop Challenge – involves users ingesting a heaping scoop of undiluted supplemental energy powder.
• Nyquil Chicken Challenge – involves soaking chicken breast in cough medicine like Nyquil and cooking it, boiling off the water and alcohol and leaving the chicken saturated with a highly concentrated amount of drugs in the meat.
• Tooth Filing Challenge – involves users filing down their teeth with a nail file.
• Face Wax Challenge – involves users covering their entire face, including their eyes, with hot wax before ripping it off.
• Coronavirus Challenge – involves users licking random items and surfaces in public in the midst of the global COVID-19 pandemic.
• Scalp Popping Challenge – involves users twisting a piece of hair on the crown of someone's head around their fingers and pulling upward, creating a "popping" effect on their scalp.
• Nutmeg Challenge – involves users consuming dangerously large amounts of nutmeg with the aim of achieving an intoxicating high.
• Throw it in the Air Challenge – involves users standing in a circle looking down at a cellphone on the ground as someone throws an object into the air; the goal is to not flinch as you watch the object fall on one of the participants' heads.
• Corn Cob Challenge – involves users attaching a corn cob to a power drill and attempting to eat the corn as it spins.
• Gorilla Glue Challenge – involves users using a strong adhesive to stick objects to themselves.
• Kiki Challenge – involves users getting out of moving vehicles to dance alongside them in the roadway.
• Salt and Ice Challenge – involves users putting salt on their skin and then holding an ice cube on the spot for as long as possible, creating a chemical reaction that causes pain and can lead to burns.
• Snorting Challenge – involves users snorting an entire latex condom into their nose before pulling it out of their mouth.
• Hot Water Challenge – involves users pouring boiling hot water on someone else.
• Fire Challenge – involves users dousing themselves in a flammable liquid and then lighting themselves on fire.
H. TikTok Had Actual Knowledge that Children Were Dying From its Blackout Challenge Yet Failed to Redesign its Algorithm to Prevent Such Deaths
63. The deadliest "TikTok Challenge" being promoted by TikTok's algorithm is the "TikTok Blackout Challenge," which encourages users to choke themselves with belts, purse strings, or anything similar until passing out. Tragically, Lalani Walton, Arriani Jaileen Arroyo, and Zaiden Baldwin are just the latest in a growing list of children killed because of TikTok's algorithm and promotion of the Blackout Challenge to kids.
64. On January 21, 2021, a 10-year-old girl in Italy died after TikTok's app and algorithm recommended the Blackout Challenge to her via her FYP. According to Italian news reports, after the young girl saw the Blackout Challenge on her TikTok app, she tied a belt around her neck and choked herself, causing her to go into cardiac arrest. She was rushed to the hospital but was declared brain-dead upon arrival and ultimately died.
65. TikTok had knowledge of this death and its connection to TikTok’s promulgation of the Blackout Challenge sometime after the death but before the deaths of Lalani, Arriani, Zaiden, and several other children, and failed to take reasonable and appropriate steps to fix its social media product, including by verifying the age and identity of users or by removing the TikTok Blackout Challenge from content promoted or recommended by TikTok’s algorithms to its minor users.
66. On March 22, 2021, a 12-year-old boy, Joshua Haileyesus, died after attempting the Blackout Challenge that TikTok’s app and algorithm recommended to him through his FYP. Joshua was discovered breathless and unconscious by his twin brother and ultimately died after 19 days on life support. Joshua attempted the Blackout Challenge by choking himself with a shoelace.
67. On June 14, 2021, a 14-year-old boy died in Australia while attempting to take part in TikTok’s Blackout Challenge after TikTok’s app and algorithm presented the deadly challenge to him through his FYP.
68. In July 2021, a 12-year-old boy died in Oklahoma while attempting the Blackout Challenge after TikTok’s app and algorithm recommended the dangerous and deadly video to him through his FYP.
69. In December 2021, a 10-year-old girl, Nyla Anderson, died in Pennsylvania after attempting the Blackout Challenge that TikTok’s algorithm recommended to her through her FYP. Nyla attempted the Blackout Challenge by using a purse strap.
70. TikTok unquestionably knew that the deadly Blackout Challenge was spreading through its app and that its algorithm was specifically feeding the Blackout Challenge to children, including those who have died.
71. TikTok knew or should have known that failing to take immediate and significant action to extinguish the spread of the deadly Blackout Challenge would result in more injuries and deaths, especially among children, because of these young users attempting the viral challenge.
72. TikTok knew or should have known that its product was dangerously defective and in need of immediate and significant change to prevent users, especially children, from being directed to dangerous challenges that were known to have killed children and, even if not known, where such deaths were reasonably foreseeable based on the inherently dangerous and defective nature of TikTok’s product.
73. TikTok knew or should have known that a failure to take immediate and significant corrective action would result in an unreasonable and unacceptable risk that additional users, and additional children, would fall victim to the deadly Blackout Challenge.
74. Despite this knowledge, TikTok outrageously took no action, or took completely inadequate action, to extinguish and prevent the spread of the Blackout Challenge and, specifically, to prevent its algorithm from directing children to the Blackout Challenge, despite notice and/or foreseeability that such a failure would inevitably lead to more injuries and deaths, including those of children.
75. Despite this knowledge, TikTok outrageously failed to change, update, and/or correct its algorithm to prevent it from directing users, specifically children, to the dangerous and deadly Blackout Challenge despite knowing that such a failure would inevitably lead to more injuries and deaths, including those of children.
76. TikTok failed or refused to take the necessary corrective action to cure its defective algorithm because TikTok knew that such fixes would result in less user engagement and, thus, less profit.
77. TikTok prioritized greater corporate profits over the health and safety of its users and, specifically, over the health and safety of vulnerable children TikTok knew or should have known were actively using its social media product.
I. Plaintiffs Expressly Disclaim Any and All Claims Seeking to Hold TikTok Liable as the Publisher or Speaker of Any Content Provided, Posted or Created by Third Parties
78. Plaintiffs seek to hold TikTok accountable for their own alleged acts and omissions. Plaintiffs’ claims arise from TikTok’s status as designers and marketers of a dangerously defective social media product, as well as TikTok’s own statements and actions, and are not based on TikTok’s status as the speaker or publisher of third-party content.
79. TikTok also failed to warn minor users and their parents of known dangers arising from anticipated use of its social media platform in general and the Blackout Challenge in particular. These dangers, which are unknown to ordinary consumers, do not arise from third-party content contained on the TikTok social media product but, rather, from TikTok’s algorithm designs that 1) addict minor users to the TikTok product; 2) affirmatively select and promote harmful content to vulnerable users based on their individualized demographic data and social media activity; and 3) put minor users in contact with dangerous adult predators.
80. TikTok’s product is addictive on a content-neutral basis. For example, TikTok designs and operates its algorithms in a manner intended to, and that does, change behavior and addict users, including through a natural selection process that does not depend on or require any specific type of third-party content.
81. TikTok’s product features are designed to be and are addictive and harmful in themselves, without regard to any content that may exist on TikTok’s platform, for example, TikTok’s “like” feature.
82. TikTok has designed other product features for the purpose of encouraging and assisting children in evasion of parental oversight, protection, and consent, which features are wholly unnecessary to the operation of TikTok’s product.
83. TikTok has information and knowledge from which it can determine with reasonable certainty each user’s age, habits, and other personal information, regardless of what information the user provides at the time of account setup. In other words, TikTok knows when a user claims to be 21 but is really 12 and, likewise, it knows when a user claims to be 13 but is really 31.
84. In short, none of Plaintiffs’ claims rely on treating TikTok as the publisher or speaker of any third party’s words or content. Plaintiffs’ claims seek to hold TikTok accountable for TikTok’s own allegedly wrongful acts and omissions, not for the speech of others or for TikTok’s good faith attempts to restrict access to objectionable content.
85. Plaintiffs are not alleging that TikTok is liable for what third parties said or did, but for what TikTok did or did not do.
86. None of Plaintiffs’ claims set forth herein treat TikTok as the speaker or publisher of content posted by third parties. Rather, Plaintiffs seek to hold TikTok liable for its own speech and its own silence in failing to warn of foreseeable dangers arising from anticipated use of its social media product. TikTok could manifestly fulfill its legal duty to design a reasonably safe social media product and furnish adequate warnings of foreseeable dangers arising out of the use of TikTok’s product without altering, deleting, or modifying the content of a single third-party post or communication.
V. PLAINTIFF-SPECIFIC ALLEGATIONS
Lalani Erika Renee Walton (2013-2021)
87. Lalani Erika Renee Walton was born on April 23, 2013. Lalani had a large, blended family with many siblings.
88. Lalani was extremely sweet and outgoing. She loved dressing up as a princess and playing with makeup. She enjoyed being the center of attention and didn’t shy away from the spotlight. When she grew up, she wanted to be a famous rapper, like Cardi B.
89. Lalani got her first cellphone on her 8th birthday, April 23, 2021. Shortly thereafter she downloaded TikTok. Parental controls were installed on Lalani’s TikTok account by Lalani’s stepmother, Rashika Walton.
90. Lalani quickly became addicted to watching TikTok videos and posted many TikTok videos of herself singing and dancing in the hopes of becoming TikTok famous.
91. In 2020, Lalani was involved in a car accident in which one of her stepbrothers died and in which Lalani was seriously injured. Following the accident, Lalani’s stepmother, Rashika, struggled with the loss of her son, so Lalani asked to spend a year living with Rashika. Plaintiff agreed and allowed Lalani to live with Rashika for a one-year period, but maintained constant contact with her. Often they would talk several times each day.
92. Unbeknownst to either Plaintiff or Rashika Walton, sometime in July of 2021, TikTok’s algorithm directed Lalani to the “TikTok Blackout Challenge.” On or about July 13, 2021, Lalani had some bruises on her neck but explained them away to her family, saying she had fallen and bumped herself on her bedframe. Neither Rashika nor Lalani’s siblings attributed those bruises to self-harmful behavior. Likewise, upon information and belief and as was told to Rashika after Lalani’s death, the daughter of one of Rashika’s neighbors was sent the “TikTok Blackout Challenge” sometime in July of 2021. Luckily, in that instance, the mother found her daughter in the act of performing the TikTok Blackout Challenge and made her stop immediately.
93. Lalani, Rashika, and Plaintiff Christina Arlington Smith were not so fortunate.
94. From July 14 to July 15, 2021, Lalani was with Rashika Walton and two of her stepsiblings. Rashika was taking two of her children to stay with their grandparents. During the 20-hour round trip, Lalani sat in the backseat watching TikTok videos. For most of that time, Rashika was driving the car and could not see what Lalani was watching on TikTok but, even on the few occasions where they pulled over and/or Rashika asked, Lalani appeared to be watching age-appropriate videos. Plaintiff subsequently learned that Lalani had been watching the “TikTok Blackout Challenge” during some, if not most, of that 20-hour drive.
95. When Rashika and Lalani returned to their home, Rashika told Lalani to clean up her room and that they would then go swimming. Rashika was tired from the long trip and took a short nap. When she awoke approximately an hour later, she walked upstairs to Lalani’s room and was surprised to find the door closed. She walked in and found Lalani hanging from her bed with a rope around her neck, and still warm to the touch. Rashika called a neighbor wh