PANISH | SHEA | BOYLE | RAVIPUDI LLP
BRIAN J. PANISH, State Bar No. 116060
bpanish@psbr.law
RAHUL RAVIPUDI, State Bar No. 204519
rravipudi@psbr.law
JESSE CREED, State Bar No. 272595
jcreed@psbr.law
11111 Santa Monica Boulevard, Suite 700
Los Angeles, CA 90025
Telephone: 310.477.1700
Facsimile: 310.477.1699

MORGAN & MORGAN
EMILY C. JEFFCOTT (admitted pro hac vice)
ejeffcott@forthepeople.com
633 West Fifth Street, Suite 2652
Los Angeles, CA 90071
Tel: (213) 787-8590
Fax: (213) 418-3983

BEASLEY ALLEN
JOSEPH VANZANDT (admitted pro hac vice)
joseph.vanzandt@beasleyallen.com
234 Commerce Street
Montgomery, AL 36103
Tel: (334) 269-2343

Co-Lead and Co-Liaison Counsel for Plaintiffs
SUPERIOR COURT OF THE STATE OF CALIFORNIA
FOR THE COUNTY OF LOS ANGELES

COORDINATION PROCEEDING SPECIAL TITLE [RULE 3.400]
SOCIAL MEDIA CASES

THIS DOCUMENT RELATES TO:
(Christina Arlington Smith, et al., v. TikTok Inc., et al., Case No. 22STCV21355)
(A.S. et al. v. Meta Platforms, Inc. et al., Case No. 22STCV28202)
(Glenn-Mills v. Meta Platforms, Inc. et al., Case No. 23SMCV03371)
(J.S. et al. v. Meta Platforms, Inc. et al., Case No. CV2022-1472)
(K.K. et al. v. Meta Platforms, Inc. et al., Case No. 23SMCV03371)
(K.L. et al. v. Meta Platforms, Inc. et al., Case No. CIVSB2218921)
(N.S. et al. v. Snap Inc., Case No. 22CV019089)
(P.F. et al. v. Meta Platforms, Inc. et al., Case No. 23SMCV03371)

JUDICIAL COUNCIL COORDINATION PROCEEDING NO. 5255
For Filing Purposes: 22STCV21355

Judge: Hon. Carolyn B. Kuhl
Dept.: SSC-12

PLAINTIFFS' SUPPLEMENTAL BRIEF IN OPPOSITION TO DEFENDANTS' MOTION TO STRIKE THIRD-PARTY MISCONDUCT AND ONLINE CHALLENGE ALLEGATIONS FROM IDENTIFIED SHORT-FORM COMPLAINTS

Date: April 24, 2024
Time: 9:00 a.m.
Dept.: SSC-12
TABLE OF CONTENTS

Page

I. PLAINTIFFS' CLAIMS TARGET DEFENDANTS' AFFIRMATIVE CONDUCT, NOT THEIR ROLE AS PUBLISHERS OF OTHERS' CONTENT .......................................................1

A. Defendants' conduct increased children's risk of sexual exploitation and CSAM .............1

B. Defendants intentionally designed their platforms to exploit adolescents' social insecurities ...........................................................................................................................6

C. Defendants' conduct increased children's risk of harm from dangerous challenges...........9

II. DOE II V. MYSPACE DOES NOT REQUIRE A CONTRARY RESULT .....................................9

III. CONCLUSION ..............................................................................................................................10
I. PLAINTIFFS' CLAIMS TARGET DEFENDANTS' AFFIRMATIVE CONDUCT, NOT THEIR ROLE AS PUBLISHERS OF OTHERS' CONTENT

When assessing whether Section 230 precludes liability for a Defendant's actions, courts must "consider the gravamen of the cause of action brought against the provider" because "Section 230 bars liability only if the cause of action seeks to impose liability for the provider's publication decisions regarding third party content—for example, whether or not to publish and whether or not to depublish." (Dem. Order at p. 19.) In holding that Section 230 did not bar Plaintiffs' negligence claims, the Court reasoned that claims targeting "features allegedly negligently crafted or implemented by Defendants" did not treat them as publishers, but sought liability for Defendants' "own actions." (Id. at pp. 60-61.) This Court should take the same approach as to Defendants' role in developing tools and features that increase the risk of children's exploitation, the proliferation of CSAM, and harm from challenges. At a minimum, a reasonable factfinder could view Plaintiffs' targeted allegations as supporting liability for Defendants' own conduct. (Id. at p. 66.) Because Defendants' "motion to strike is so broad as to include relevant matters, the motion should be denied in its entirety." (Hill v. Wrather (1958) 158 Cal.App.2d 818, 823.)

For Plaintiffs to prevail at the pleading stage, "there has to be an act that creates greater harm, that Defendants do themselves, and that act has to not be a publication decision about third-party content." (Hearing Tr., 3/20/24, at 51:8-12.) This brief therefore highlights allegations that focus on Defendants' conduct, not on publication of others' content, illustrating how Defendants' non-publishing decisions greatly increased the risk of sexual exploitation of children and facilitated dangerous challenges. These acts differentiate this case from Doe II v. MySpace (2009) 175 Cal.App.4th 561.
A. Defendants' conduct increased children's risk of sexual exploitation and CSAM

Plaintiffs do not seek to hold Defendants liable for publishing sexual messages or CSAM. They seek to hold Defendants responsible for creating several tools that they know facilitate predation and increase the risk that children will be harmed.1 Defendants rely on the no-duty-to-protect rule, but this rule does not apply where a claim rests on "an affirmative act of defendant which created an undue risk of harm[,]" (Weirum v. RKO (1975) 15 Cal.3d 40, 41), or "increased the risk of harm to the plaintiff." (Brown v. USA Taekwondo (2021) 11 Cal.5th 204, n.7.) Each feature described below increased that risk for child users. Even though the harms Plaintiffs suffered due to Defendants' negligent design and implementation of these features sometimes involved third party misconduct, Defendants' liability turns on their negligence, not their role as publishers. (See Hassell v. Bird (2018) 5 Cal.5th 522, 542-43; accord Dem. Order at pp. 63-64.) And as the Court previously noted, the "[d]uty to warn of harm allegedly flowing from interactive features allegedly known to risk harm to minors is not barred by Section 230." (Dem. Order at p. 87.) Because Plaintiffs' claims of negligence, failure to warn, and fraudulent concealment all target harms that "flow" from Defendants' negligently designed and implemented features, Section 230 gives no cause to strike allegations that support finding Defendants liable for their own conduct.2

1 See, e.g., MC ¶¶ 156, 366-70, 377-78, 390-91, 472, 481, 494-98, 503, 510-13, 666-78, 684, 762, 774, 784-87, 799.
Location:3 Defendants designed and implemented features that increase the risk that predators will locate children. Meta's geotag feature allows users to indicate a location for a post or photo, MC ¶ 380, and TikTok encourages users to tag their location. MC ¶ 668. Similarly, Snap's Snap Map posts children's location to the public by default. MC ¶¶ 478, 511. Defendants created these location tools, and the tools themselves increase the risk that predators will find and exploit children. MC ¶¶ 381, 511, 669. Indeed, predators can use Meta's search-by-location tool to find victims, MC ¶¶ 380-81, who are even more vulnerable because Defendants set children's profiles to public by default. MC ¶¶ 373-75, 555-56. Defendants created these tools, and the tools themselves create the risk of harm. (See Lemmon v. Snap (9th Cir. 2021) 995 F.3d 1085, 1093 [finding Section 230 did not immunize Snap from its "creation of the Speed Filter" and the ensuing harms from users' engagement with the tool] [emphasis in original].) To the extent that this Court deems these location tools to touch on others' content, Defendants are too involved in soliciting and producing user locations to claim immunity: Defendants created the tools, Snap set its tool to default, TikTok encourages children to use its tool, and Meta created a location search. (See Wozniak v. YouTube, 2024 WL 1151750, at *20 (Cal. App. Mar. 15, 2024); see also PH II v. Superior Ct. (1995) 33 Cal.App.4th 1680, 1683 ["[S]uch use of the motion to strike should be cautious and sparing."].)

2 Plaintiffs' special relationship argument is set forth in their briefing at pp. 5-7 and is another basis to deny Defendants' motion.

3 This section focuses on tools by Meta, Snap, and TikTok because the Short Form Plaintiffs A.S., K.L., N.S., Glenn-Mills, P.F., and K.K. allege harms from exploitation or sexual abuse against Defendants Meta, Snap, and TikTok. See Am. Short Form Compl. (SFC), A.S. v. Meta, 22STCV28202 (Jan. 5, 2024) at 5; Am. SFC, K.L. v. Meta, CIV SB 2218921 (Jan. 5, 2024) at 5; Am. SFC, Glenn-Mills v. Meta, 23SMCV03371 (Jan. 5, 2024) at 5; Am. SFC, N.S. v. Snap, 22CV019089 (Jan. 5, 2024) at 2, 5; Am. SFC, P.F. v. Meta, 23SMCV03371 (Jan. 5, 2024) at 2, 5; 2d Am. SFC, K.K. v. Meta, 23SMCV03371 (Jan. 17, 2024) at 5.
Concealment: Defendants increased the risk of harm to children by implementing tools that conceal activity from parents, including features that encourage and permit minor users to block their parents' accounts or restrict parental oversight, as through Instagram's Close Friends Only, Facebook's "restricted list," Snap's My Eyes Only, and TikTok's Friends Only features. MC ¶¶ 263-64, 476, 494-97, 553. Meta permits adults to send encrypted messages to children. MC ¶¶ 377, 385. Defendants also created tools to make private messages, photos, and videos disappear. Instagram's "photo bomb" and Meta's Vanish Mode allow users to send disappearing images or videos. MC ¶ 376. Snaps are, by their nature, disappearing audiovisual messages, and Snap prevents parents from seeing these messages—even with parental controls enabled. MC ¶¶ 412, 415-16, 471-72, 475. Snap further created a self-destruct tool for My Eyes Only content if someone (including a parent) attempts to access it. MC ¶ 477. TikTok's direct messaging tool also allows messages with virtually no evidence of their content. MC ¶ 678. Defendants' concealment features directly make parental supervision more difficult while making children less apprehensive about sharing CSAM content with predators, increasing the overall risk of unchecked exploitation. MC ¶¶ 385, 412, 472, 476, 494, 496-98, 501-02. Worse, Defendants intentionally worked to create spaces where children and their parents operate separately. MC ¶¶ 263-64 ("If Mom starts using an app all the time, the app can lose a 'cool' factor, if we're not conscious of separation."). Defendants' actions carry the cost of making children more vulnerable to predation. MC ¶¶ 377, 678. Defendants' liability stems from creating tools for concealed messaging, and from making the concealment tools available to minors—actions that do not require publishing or depublishing any particular content. (Accord Dem. Order at p. 19.)

Meta's choice to conceal these risks further increased the risk of harm to children. For example, Meta told the public, including parents, that its products were safe for children and designed to limit interactions between children and stranger adults, while hiding its knowledge that its features exacerbated the risk of harm to child users. MC ¶¶ 343-54, 358, 361-63, 370. Snap, likewise, told the public that Snap makes it difficult for adults to find minors and does not facilitate connections with strangers, when the opposite is true. MC ¶ 511. These concealments increase the risk that parents will let their children use Defendants' platforms or more loosely monitor their activities under a false sense of security. Moreover, Snap leads children to believe incorrectly that their images and videos are lost forever—even though recipients can save the images or videos. MC ¶¶ 473-74. This concealment increases the risk that children will engage in communications that they otherwise would not have. Such claims relate to Defendants' conduct, not Defendants' publication of others' content.
Money and gifts: Defendants' features that allow predators to send money and gifts to children increase the risk of sexual exploitation and solicitation of CSAM. These features have nothing to do with Defendants' role as publishers of others' content, yet facilitate payments from adults to children who are creating livestream content without their parents' knowledge. MC ¶¶ 178, 203, 501, 603 (Facebook Live, Instagram Live, Snap audio or video calls, and TikTok LIVE). On TikTok LIVE, a predator can award "LIVE Gifts" and "Diamonds" to the child, which the minor can convert to money or virtual items. MC ¶¶ 676-77. Similarly, from 2014 to 2018, Snapcash allowed adults to send cash to minors. MC ¶¶ 499-500. These tools greatly increase the risk of sexual exploitation and creation of CSAM. MC ¶¶ 499-501, 677. Again, Plaintiffs do not seek to hold Defendants liable as the publisher of others' content. Defendants could have avoided liability by not creating a feature that allows users to send and receive money or gifts, or at minimum by not making that feature available to minors. Even if this Court determines that others' content is involved, Defendants' conduct (including creating TikTok "Diamonds") is too intertwined to determine—at least at this stage—that Defendants are immune.
Recommendations: Defendants increase the risk of harm to children by recommending that minors and adult strangers connect.4 Worse, Facebook does not let users disable its recommendations, without which the recommended minors would be nearly impossible for predators to find. MC ¶ 172. An estimated 80% of "violating adult/minor connections" on Facebook result from its recommendations. MC ¶¶ 172, 372. Snap increases children's Snapscore when they accept recommendations, incentivizing children to connect with strangers. MC ¶¶ 480-82. Plaintiffs again do not seek to hold Defendants liable for publishing the content of any user; rather, Defendants' own recommendations that connect children with adult users increase the risk of harm. Defendants' liability arises from their platforms' recommendation that the accounts connect (as opposed to a friend request from the adult).

4 See MC ¶¶ 172, 372, 391 (Facebook's People You May Know, Instagram's Suggested for You, Instagram's Because You Watched); 481-83, 494, 511 (Snap's Quick Add); 555 (TikTok's Find Friends and People You May Know).
Predator sharing of CSAM: Defendants created tools that make it easier for predators to share CSAM with other predators. This includes the concealment tools discussed above. In addition, TikTok's Private Videos (also called Post-in-Private) allows a user to store videos that are only accessible to the user. MC ¶ 670. Predators use this tool to store and share CSAM. MC ¶¶ 670-71, 673. Worse, if a user follows a small number of Post-in-Private accounts, TikTok will recommend other Post-in-Private accounts to the user—making it easy for predators to find each other to view and share more CSAM. MC ¶ 670. Plaintiffs again do not seek to hold Defendants liable for their publication decisions as to others' content. Defendants could avoid liability by refraining from creating concealment tools, and from recommending Post-in-Private accounts to users that frequently use the tool.
Failure to Protect: Defendants' failure to take protective measures is negligent because—as explained above—Defendants have increased the risk of harm through their affirmative conduct. Defendants did not design adequate parental controls,5 methods to report exploitation or CSAM,6 or age-verification measures.7 Inadequate age verification is particularly dangerous because predators can pretend to be children. MC ¶ 438. For parental controls, Defendants do not require parental consent to open accounts, connect with adults, or post location; do not link a child's account with their parent's account; and do not allow parents to limit usage.8 Defendants also do not inform parents when CSAM is found or of connections and interactions with adults. MC ¶¶ 261, 554. Nor do they warn parents of the risk of predation.9 Although in August 2022, Snap's Family Center finally allowed parents to see who their child communicates with, parents cannot view the content of any message. MC ¶ 493. TikTok added Family Pairing in April 2020, but children can skirt parental oversight by using a web browser. MC ¶¶ 551-52. Moreover, parents can only use such features if they already know about their child's account, and children can conceal their activity by creating a second account. MC ¶¶ 493, 552. Plaintiffs, again, do not seek to hold Defendants liable for publishing the content of any user; rather, Plaintiffs allege that Defendants' own conduct included creating inadequate features to mitigate the risks that Defendants created.

Ultimately, Defendants have engaged in numerous affirmative acts that increased the risk of sexual exploitation on their platforms, and as such Plaintiffs' claims are not barred. (Dem. Order at pp. 61-62 [noting congressional intent when enacting Section 230 to "maximize user control over what information is received," and "empower parents to restrict their children's access to objectionable or inappropriate online material"].)

5 MC ¶¶ 258-67, 379 (Meta); 475, 491-93 (Snap); 540, 659 (TikTok); 688, 775 (YouTube).

6 MC ¶¶ 267, 382-84, 386-89, 394-96 (Meta); 506-07 (Snap); 679 (TikTok); 791-94, 801-02 (YouTube).

7 MC ¶¶ 238-57, 373 (Meta); 433, 435-36 (Snap); 540-50, 576, 659, 675 (TikTok); 688, 701, 714-22, 726, 783, 796 (YouTube).

8 MC ¶¶ 259, 261, 401 (Meta); 491-94 (Snap); 540, 576, 659 (TikTok); 775 (YouTube).
B. Defendants intentionally designed their platforms to exploit adolescents' social insecurities

Many features of Defendants' platforms increase the risks of child sexual exploitation because Defendants' platforms are designed to exploit children's vulnerabilities. But Plaintiffs' relevant allegations on that issue are not limited to the allegations that Defendants seek to strike. Under California law, courts inquire "whether the defendant's 'entire conduct created a risk of harm.'" (Kuciemba v. Victory Woodworks (2023) 14 Cal.5th 993, 1017 [quoting Brown, supra, 11 Cal.5th at n.6].) The proper approach to this holistic inquiry requires the Court to consider the totality of Defendants' allegedly negligent conduct, which only further underscores the impropriety of Defendants' motion.

The Court should consider Plaintiffs' allegations in context and alongside their allegations that Defendants' affirmative conduct exploited adolescents' developing brains, desire for social validation, and fear of feeling socially inadequate and alone.10 Adolescents "are especially vulnerable to developing harmful behaviors because their prefrontal cortex is not fully developed," and have "less impulse control and less ability to evaluate risks, regulate emotions and regulate their responses to social rewards...." MC ¶¶ 64-65. When they step away from Defendants' platforms for hours or days, youth may experience anxiety, loneliness, depression, dysphoria, irritability, fatigue, or a greater reduction in impulse control.11 Defendants knew this and capitalized on it, designing their platforms with features that present social rewards in a way that leads young users to compulsively seek out more stimulation, feel negative withdrawal symptoms when denied stimulation, and reduce impulse control and emotional regulation. MC ¶¶ 78, 128, 283. By creating the social pressure for minors to accept predator communications and then recommending predators to minors, Defendants increased the risk of harm. The Court should reject Defendants' attempt to isolate certain factual allegations from Defendants' whole course of conduct.

Social approval features: Defendants' social approval features encourage children to connect and engage with predators. These features (such as likes, reactions, hearts, view counts, comments, follows, shares, or reposts)12 function as a social measuring stick. MC ¶ 173. By creating an environment where minors feel socially inadequate if they do not accept Defendants' recommendation to connect with an adult stranger, Defendants increased the risk of connections between predators and children.

Notifications: Defendants use notifications to exploit adolescents' cravings for more dopamine and their need for "reciprocity"—the psychological desire to respond to an initial gesture.13 This increases the risk that children will respond to a message from a predator. For example, TikTok sends notifications (TikTok Now) that manufacture a sense of urgency by making users create private posts within three minutes or lose the ability to see their friends' posts made within that timeframe. MC ¶ 604. Similarly, because a Snap disappears within 10 seconds of being viewed, users feel the urge to respond immediately. MC ¶ 441. And Meta intentionally limits the information in notifications to increase curiosity. MC ¶ 279. Defendants' calculated use of notifications, by design, enhances children's desire for social acceptance when they open a message from another user. Children using Defendants' platforms are thus more likely to feel a psychological need to open and respond to a message from a predator.

9 MC ¶¶ 403 (Meta); 514, 518 (Snap); 652, 662 (TikTok); 803, 809 (YouTube).

10 See, e.g., MC ¶¶ 2, 12, 66, 77, 80, 91, 194, 235-36, 411-12, 415, 439-440, 443-45, 447, 450-52, 454-56, 461, 490, 529, 533, 535, 557, 601-06, 659, 686-87, 700, 713, 723, 725, 732, 735-36, 767, 769. Plaintiffs also extensively detail Defendants' knowledge that children were addicted to their platforms. See, e.g., MC ¶¶ 2, 12, 69, 73-74, 84-85, 105, 120, 156, 159, 265, 269, 272, 285-86, 288-89, 292, 311, 314, 339-40, 412-13, 440, 458, 480, 529, 599-600, 684.

11 See, e.g., MC ¶¶ 70, 72-73, 76-77, 79, 87, 102, 110, 113, 121-22, 124, 126, 128, 288, 412, 440, 458, 479, 605, 763-65.

12 MC ¶¶ 71, 80, 86, 92-93, 128, 179, 202, 235, 271-74, 316, 325, 598, 606, 724, 726, 735, 767.

13 MC ¶¶ 3, 81-83, 86, 89-90, 107, 109, 236, 268, 272, 278, 280, 440, 460, 554, 604-05, 607, 726.
Snap rewards: Snap's highly addictive features reward users for using Snapchat and punish them for time away from the platform. MC ¶¶ 412, 440, 446. These score-keeping features incentivize minors to connect with adult strangers and to keep the conversation going. Snapscores, for example, give users a point score that is visible to other users, based on their number of messages sent and stories posted. MC ¶¶ 440, 442-44. Young users are more likely to associate their worth with their Snapscore. MC ¶¶ 444-45. Snap's trophies rewarded users for increasing their Snapscores, sending creative Snaps, or posting live stories. Other users could view a user's trophies in their "trophy box." MC ¶¶ 412, 447. In 2020, Snap replaced trophies with charms, which reward users for hitting milestones in their relationships with other users. MC ¶ 448. Not only does Snap send positive charms like "Best Friends Forever" when two users communicate extensively, but it also sends negative charms, such as "It's Been Forever" or "It's Been a Minute," to encourage users to reengage. Further, Snap Streak is a highly addictive feature designed to encourage users to interact every day. MC ¶¶ 92, 412, 414, 440, 452, 454, 456. Starting with the fire emoji to reward users for interacting three consecutive days, users earn additional emojis as their Streak grows, such as the 100 emoji for 100 consecutive days. MC ¶ 452. To manufacture a sense of urgency, Snap sends notifications to users when their Streak is about to expire in order to keep the Streak alive. MC ¶ 455. Snap has even made a special form where users who lost their Streak can petition to get it back, showing how significant the Streak is to users. MC ¶ 456. Taken together, these features increase the risk of lengthy back-and-forth communications between predators and children.
Barriers to account deletion: Defendants intentionally designed cumbersome processes to delete accounts while simultaneously dissuading users from leaving. These unnecessary hurdles make it more difficult for children to escape harmful relationships with adults. For instance, Defendants remind users of the friends they will leave behind, force users to explain why they are leaving, and then keep user accounts open for a month anyway in the hopes that users will change their minds.14 By pressuring children to maintain their accounts and making the deletion process convoluted, Defendants increase the risk that children will abandon deletion attempts and remain on the social network during harmful relationships.

14 See MC ¶¶ 332-37 (Meta); 461-62 (Snap); 641-51 (TikTok); 770-73 (YouTube).
C. Defendants' conduct increased children's risk of harm from dangerous challenges

Plaintiffs' challenge-related allegations focus narrowly on certain actions of Defendants—their marketing and platform design to foster addiction. These acts—all to drive unhealthy levels of engagement—are themselves harmful, regardless of the content shown. For example, TikTok built challenges into its "architecture and user interface" to further attract and addict children and adolescent users by fostering competition among users seeking to gain more social validation through likes and views. MC ¶ 608; see also MC ¶ 767 (YouTube). And Defendants such as TikTok "encourage[] business to create challenges as a form of marketing" because challenges drive engagement and revenue. MC ¶ 611. Because of TikTok's "engagement-maximization design," TikTok automatically promotes whatever draws attention, regardless of its substance. MC ¶ 612. But Plaintiffs even expressly connect the risks from challenges to Defendants' design choices. MC ¶ 626 (lack of age verification, designing the algorithm to push challenges to young children, and lack of warnings). These allegations target Defendants' negligent design and marketing to drive harmful engagement, not their decisions to publish or depublish particular content. MC ¶¶ 608, 612-26 (TikTok); 767 (YouTube); 128 (both). Section 230 therefore gives no shelter.
II. DOE II V. MYSPACE DOES NOT REQUIRE A CONTRARY RESULT

Doe II does not support Defendants' motion. The Doe II court believed it was "undeniable that [the plaintiffs] seek to hold MySpace responsible for the communications between the Julie Does and their assailants." (Doe II, supra, 175 Cal.App.4th at 565.) The allegations here are decidedly different—and thus, Doe II is not binding regarding the specific duties discussed above. Indeed, the Court of Appeal in Bolger described the claims in Doe II as "based on a website's decision 'to restrict or make available certain material,'" and noted that such claims are different from those that seek to hold a website liable for its own non-publishing conduct. (Bolger v. Amazon.com, LLC (2020) 53 Cal.App.5th 431, 465-66.) Defendants cite Doe II as supporting broad immunity whenever content is in the causal chain, but California law does not support a but-for test. (See id. ["The fact that some content provided by Lenoge was posted on the Amazon website does not automatically immunize Amazon for its own choices and activities unrelated to that content."]; accord Hassell, supra, 5 Cal.5th at 542-43 [plurality]; id. at 559 [Kruger, J., concurring]; Dem. Order at 19-20, 63; Lee v. Amazon.com (2022) 76 Cal.App.5th 200, 256 [permitting claim where issue was whether plaintiffs were warned about mercury]; Demetriades v. Yelp (2014) 228 Cal.App.4th 294, 313 [permitting claim for misrepresenting accuracy of filter]; see also HomeAway.com v. City of Santa Monica, 918 F.3d 676, 682 (9th Cir. 2019); Barnes v. Yahoo (9th Cir. 2009) 570 F.3d 1096, 1107 [harm stemmed from posts, but permitted promissory estoppel claim].) Courts in California instead ask whether the duty at issue treats the defendant as the publisher of another's content. (Dem. Order at pp. 57-59 [citing Lee, supra, 76 Cal.App.5th at p. 256 and Lemmon, supra, 995 F.3d at p. 1092]; Hassell, supra, 5 Cal.5th at pp. 542-543 ["[N]ot all legal duties owed by Internet intermediaries necessarily treat them as the publishers of third-party content, even when these obligations are in some way associated with their publication of this material"] [plurality, emphasis added]; id. at 559 [Kruger, J., concurring (quoting this sentence from plurality with approval)].)

For these reasons, nothing in Doe II requires striking the very different allegations in this case.
III. CONCLUSION

For these reasons, this Court should deny the motion to strike.
DATED: April 3, 2024

PANISH | SHEA | BOYLE | RAVIPUDI LLP
Brian J. Panish
Rahul Ravipudi
Jesse Creed
11111 Santa Monica Boulevard, Suite 700
Los Angeles, CA 90025
Tel.: (310) 477-1700
panish@psbr.law
rravipudi@psbr.law
jcreed@psbr.law

Emily Jeffcott
MORGAN & MORGAN
633 West Fifth Street, Suite 2652
Los Angeles, CA 90071
Tel.: 213-787-8590
Fax: 213-418-3983
ejeffcott@forthepeople.com

Joseph G. VanZandt
BEASLEY ALLEN CROW METHVIN PORTIS & MILES, LLC
234 Commerce Street
Montgomery, AL 36103
Tel.: 334-269-2343
Joseph.VanZandt@BeasleyAllen.com

Paul R. Kiesel
Mariana A. McConnell
Cherisse H. Cleofe
KIESEL LAW LLP
8648 Wilshire Boulevard
Beverly Hills, CA 90211
Tel.: 310-854-4444
Fax: 310-854-0812
kiesel@kiesel.law
mcconnell@kiesel.law
cleofe@kiesel.law

Christopher L. Ayers
SEEGER WEISS LLP
55 Challenger Road
Ridgefield Park, NJ 07660
Tel.: 973-639-9100
Fax: 973-679-8656
cayers@seegerweiss.com

Matthew Bergman
Laura Marquez-Garrett
SOCIAL MEDIA VICTIMS LAW CENTER
1390 Market Street, Suite 200
San Francisco, CA 94102
Tel.: 206-741-4862
matt@socialmediavictims.org
laura@socialmediavictims.org

Brooks Cutter
CUTTER LAW P.C.
401 Watt Avenue
Sacramento, CA 95864
Tel.: 916-290-9400
Fax: 916-588-9330
bcutter@cutterlaw.com

Thomas P. Cartmell
WAGSTAFF & CARTMELL LLP
4740 Grand Avenue, Suite 300
Kansas City, MO 64112
Tel.: 816-701-1100
tcartmell@wcllp.com

Amy Eskin
SCHNEIDER WALLACE COTTRELL KONECKY LLP
2000 Powell Street, Suite 1400
Emeryville, CA 94608
Tel.: 415-421-7100
Fax: 415-421-7105
aeskin@schneiderwallace.com

Kirk Goza
GOZA & HONNOLD, LLC
9500 Nall Avenue, Suite 400
Overland Park, KS 66207
Tel.: 913-386-3547
Fax: 913-839-0567
kgoza@gohonlaw.com

Rachel Lanier
THE LANIER LAW FIRM, P.C.
2829 Townsgate Road, Suite 100
Westlake Village, CA 91361
Tel.: 713-659-5200
Rachel.Lanier@LanierLawFirm.com

Sin-Ting Mary Liu
AYLSTOCK, WITKIN, KREIS & OVERHOLTZ, PLLC
17 E Main St #200
Pensacola, FL 32502
Tel.: 850-202-1010
mliu@awkolaw.com

Marc J. Mandich
SOUTHERN MED LAW
2762 B M Montgomery Street, Suite 101
Homewood, AL 35209
Tel.: 205-564-2741
Fax: 205-649-6346
marc@southernmedlaw.com

Kelly McNabb
LIEFF CABRASER HEIMANN & BERNSTEIN, LLP
275 Battery Street, 29th Floor
San Francisco, CA 94111-3339
Tel.: 415-956-1000
kmcnabb@lchb.com

Jonathan D. Orent
MOTLEY RICE LLC
40 Westminster St., 5th Fl.
Providence, RI 02903
Tel.: 401-457-7723
Fax: 401-457-7708
jorent@motleyrice.com

Ruth Rizkalla
THE CARLSON LAW FIRM, PC
1500 Rosecrans Avenue, Suite 500
Manhattan Beach, CA 90266
Tel.: 254-526-5688
Fax: 254-526-8204
rrizkalla@carlsonattorneys.com

Frederick Schenk
CASEY GERRY SCHENK FRANCAVILLA BLATT & PENFIELD, LLP
110 Laurel Street
San Diego, CA 92101-1486
Tel.: 619-238-1811
Fax: 619-544-9232
Fschenk@cglaw.com

Dean Kawamoto
KELLER ROHRBACK L.L.P.
1201 Third Ave., Ste. 3200
Seattle, WA 98101
Tel.: 206-623-1900
Fax: 206-623-3384
dkawamoto@kellerrohrback.com

James P. Frantz
FRANTZ LAW GROUP
402 West Broadway, Suite 860
San Diego, CA 92102
Tel: 619-831-8966
Fax: 619-525-7672
jpf@frantzlawgroup.com

Co-Lead, Co-Liaison, and Leadership Counsel for Plaintiffs
PROOF OF SERVICE

STATE OF CALIFORNIA, COUNTY OF LOS ANGELES

At the time of service, I was over 18 years of age and not a party to this action. I am employed in the County of Los Angeles, State of California. My business address is 8648 Wilshire Boulevard, Beverly Hills, CA 90211-2910.

On April 3, 2024, I served true copies of the following document(s) described as PLAINTIFFS' SUPPLEMENTAL BRIEF IN OPPOSITION TO DEFENDANTS' MOTION TO STRIKE THIRD-PARTY MISCONDUCT AND ONLINE CHALLENGE ALLEGATIONS FROM IDENTIFIED SHORT-FORM COMPLAINTS on the interested parties in this action as follows:

BY ELECTRONIC SERVICE VIA CASE ANYWHERE: In accordance with the Court's Order Authorizing Electronic Service requiring all documents to be served upon interested parties via the Case Anywhere System.

I declare under penalty of perjury under the laws of the State of California that the foregoing is true and correct.

Executed on April 3, 2024, at Beverly Hills, California.

Sylvia Mendoza