PANISH | SHEA | BOYLE | RAVIPUDI LLP
BRIAN J. PANISH, State Bar No. 116060
bpanish@psbr.law
RAHUL RAVIPUDI, State Bar No. 204519
rravipudi@psbr.law
JESSE CREED, State Bar No. 272595
jcreed@psbr.law
11111 Santa Monica Boulevard, Suite 700
Los Angeles, California 90025
Telephone: 310.477.1700
Facsimile: 310.477.1699

MORGAN & MORGAN
EMILY C. JEFFCOTT (admitted pro hac vice)
ejeffcott@forthepeople.com
633 West Fifth Street, Suite 2652
Los Angeles, CA 90071
Tel: (213) 787-8590
Fax: (213) 418-3983

BEASLEY ALLEN
JOSEPH VANZANDT (admitted pro hac vice)
joseph.vanzandt@beasleyallen.com
234 Commerce Street
Montgomery, AL 36103
Tel: (334) 269-2343

Co-Lead and Co-Liaison Counsel for Plaintiffs
SUPERIOR COURT OF THE STATE OF CALIFORNIA

FOR THE COUNTY OF LOS ANGELES

COORDINATION PROCEEDING
SPECIAL TITLE [RULE 3.400]

SOCIAL MEDIA CASES

THIS DOCUMENT RELATES TO:

All Cases

(Christina Arlington Smith, et al., v. TikTok Inc., et al., Case No. 22STCV21355)

JUDICIAL COUNCIL COORDINATION PROCEEDING NO. 5255

For Filing Purposes: 22STCV21355

NOTICE OF FILING PLAINTIFFS’ AMENDED SUPPLEMENTAL BRIEF IN OPPOSITION TO DEFENDANTS’ MOTION TO STRIKE THIRD-PARTY MISCONDUCT AND ONLINE CHALLENGE ALLEGATIONS FROM IDENTIFIED SHORT-FORM COMPLAINTS

Judge: Hon. Carolyn B. Kuhl
Dept.: SSC-12
NOTICE OF FILING PLAINTIFFS’ AMENDED SUPPLEMENTAL BRIEF IN OPPOSITION TO DEFENDANTS’
MOTION TO STRIKE THIRD-PARTY MISCONDUCT AND ONLINE CHALLENGE ALLEGATIONS FROM
IDENTIFIED SHORT-FORM COMPLAINTS
TO ALL PARTIES AND THEIR ATTORNEYS OF RECORD:

PLEASE TAKE NOTICE THAT Plaintiffs filed an Amended Supplemental Brief in Opposition to Defendants’ Motion to Strike Third-Party Misconduct and Online Challenge Allegations from Identified Short-Form Complaints (“Amended Brief”). As stated in Footnote 1 on page 1 of the Amended Brief, Plaintiffs filed the corrected pleading to bring the allegations in the Brief in line with the allegations against Snap in the Complaint. While Plaintiffs’ counsel regret this error, the corrections do not alter the Section 230 argument or analysis. The Amended Brief is attached as Exhibit “1,” and a redlined version indicating the changes from the original pleading is attached as Exhibit “2.” Plaintiffs apologize to the Court and all Parties for any confusion or inconvenience.
DATED: April 8, 2024

MORGAN & MORGAN

/s/ Emily Jeffcott
Emily Jeffcott
633 West Fifth Street, Suite 2652
Los Angeles, CA 90071
Tel.: 213-787-8590
Fax: 213-418-3983
ejeffcott@forthepeople.com

Brian J. Panish
Rahul Ravipudi
Jesse Creed
PANISH | SHEA | BOYLE | RAVIPUDI LLP
11111 Santa Monica Boulevard, Suite 700
Los Angeles, CA 90025
Tel.: (310) 477-1700
bpanish@psbr.law
rravipudi@psbr.law
jcreed@psbr.law

Joseph G. VanZandt
BEASLEY ALLEN CROW METHVIN PORTIS & MILES, LLC
234 Commerce Street
Montgomery, AL 36103
Tel.: 334-269-2343
Joseph.VanZandt@BeasleyAllen.com
Paul R. Kiesel
Mariana A. McConnell
Cherisse H. Cleofe
KIESEL LAW LLP
8648 Wilshire Boulevard
Beverly Hills, CA 90211
Tel.: 310-854-4444
Fax: 310-854-0812
kiesel@kiesel.law
mcconnell@kiesel.law
cleofe@kiesel.law

Christopher L. Ayers
SEEGER WEISS LLP
55 Challenger Road
Ridgefield Park, NJ 07660
Tel.: 973-639-9100
Fax: 973-679-8656
cayers@seegerweiss.com

Matthew Bergman
Laura Marquez-Garrett
SOCIAL MEDIA VICTIMS LAW CENTER
1390 Market Street, Suite 200
San Francisco, CA 94102
Tel.: 206-741-4862
matt@socialmediavictims.org
laura@socialmediavictims.org

Brooks Cutter
CUTTER LAW P.C.
401 Watt Avenue
Sacramento, CA 95864
Tel.: 916-290-9400
Fax: 916-588-9330
bcutter@cutterlaw.com

Thomas P. Cartmell
WAGSTAFF & CARTMELL LLP
4740 Grand Avenue, Suite 300
Kansas City, MO 64112
Tel.: 816-701-1100
tcartmell@wcllp.com

Amy Eskin
SCHNEIDER WALLACE COTTRELL KONECKY LLP
2000 Powell Street, Suite 1400
Emeryville, CA 94608
Tel.: 415-421-7100
Fax: 415-421-7105
aeskin@schneiderwallace.com

Kirk Goza
GOZA & HONNOLD, LLC
9500 Nall Avenue, Suite 400
Overland Park, KS 66207
Tel.: 913-386-3547
Fax: 913-839-0567
kgoza@gohonlaw.com

Rachel Lanier
THE LANIER LAW FIRM, P.C.
2829 Townsgate Road, Suite 100
Westlake Village, CA 91361
Tel.: 713-659-5200
Rachel.Lanier@LanierLawFirm.com

Sin-Ting Mary Liu
AYLSTOCK, WITKIN, KREIS & OVERHOLTZ, PLLC
17 E Main St #200
Pensacola, FL 32502
Tel.: 850-202-1010
mliu@awkolaw.com

Marc J. Mandich
SOUTHERN MED LAW
2762 B M Montgomery Street, Suite 101
Homewood, AL 35209
Tel.: 205-564-2741
Fax: 205-649-6346
marc@southernmedlaw.com

Kelly McNabb
LIEFF CABRASER HEIMANN & BERNSTEIN, LLP
275 Battery Street, 29th Floor
San Francisco, CA 94111-3339
Tel.: 415-956-1000
kmcnabb@lchb.com

Jonathan D. Orent
MOTLEY RICE LLC
40 Westminster St., 5th Fl.
Providence, RI 02903
Tel.: 401-457-7723
Fax: 401-457-7708
jorent@motleyrice.com

Ruth Rizkalla
THE CARLSON LAW FIRM, PC
1500 Rosecrans Avenue, Suite 500
Manhattan Beach, CA 90266
Tel.: 254-526-5688
Fax: 254-526-8204
rrizkalla@carlsonattorneys.com

Frederick Schenk
CASEY GERRY SCHENK FRANCAVILLA BLATT & PENFIELD, LLP
110 Laurel Street
San Diego, CA 92101-1486
Tel.: 619-238-1811
Fax: 619-544-9232
Fschenk@cglaw.com

Dean Kawamoto
KELLER ROHRBACK L.L.P.
1201 Third Ave., Ste. 3200
Seattle, WA 98101
Tel.: 206-623-1900
Fax: 206-623-3384
dkawamoto@kellerrohrback.com

James P. Frantz
FRANTZ LAW GROUP
402 West Broadway, Suite 860
San Diego, CA 92102
Tel: 619-831-8966
Fax: 619-525-7672
jpf@frantzlawgroup.com

Co-Lead, Co-Liaison, and Leadership Counsel for Plaintiffs
EXHIBIT 1
PANISH | SHEA | BOYLE | RAVIPUDI LLP
BRIAN J. PANISH, State Bar No. 116060
bpanish@psbr.law
RAHUL RAVIPUDI, State Bar No. 204519
rravipudi@psbr.law
JESSE CREED, State Bar No. 272595
jcreed@psbr.law
11111 Santa Monica Boulevard, Suite 700
Los Angeles, CA 90025
Telephone: 310.477.1700
Facsimile: 310.477.1699

MORGAN & MORGAN
EMILY C. JEFFCOTT (admitted pro hac vice)
ejeffcott@forthepeople.com
633 West Fifth Street, Suite 2652
Los Angeles, CA 90071
Tel: (213) 787-8590
Fax: (213) 418-3983

BEASLEY ALLEN
JOSEPH VANZANDT (admitted pro hac vice)
joseph.vanzandt@beasleyallen.com
234 Commerce Street
Montgomery, AL 36103
Tel: (334) 269-2343

Co-Lead and Co-Liaison Counsel for Plaintiffs

SUPERIOR COURT OF THE STATE OF CALIFORNIA

FOR THE COUNTY OF LOS ANGELES

COORDINATION PROCEEDING
SPECIAL TITLE [RULE 3.400]

SOCIAL MEDIA CASES

THIS DOCUMENT RELATES TO:

(Christina Arlington Smith, et al., v. TikTok Inc., et al., Case No. 22STCV21355)

(A.S. et al. v. Meta Platforms, Inc. et al., Case No. 22STCV28202)

(Glenn-Mills v. Meta Platforms, Inc. et al., Case No. 23SMCV03371)

JUDICIAL COUNCIL COORDINATION PROCEEDING NO. 5255

For Filing Purposes: 22STCV21355

Judge: Hon. Carolyn B. Kuhl
Dept.: SSC-12

PLAINTIFFS’ AMENDED SUPPLEMENTAL BRIEF IN OPPOSITION TO DEFENDANTS’ MOTION TO STRIKE THIRD-PARTY MISCONDUCT AND ONLINE CHALLENGE ALLEGATIONS FROM IDENTIFIED SHORT-FORM COMPLAINTS
PLAINTIFFS’ AMENDED SUPPLEMENTAL BRIEF IN OPPOSITION TO DEFENDANTS’ MOTION TO STRIKE
THIRD-PARTY MISCONDUCT AND ONLINE CHALLENGE ALLEGATIONS FROM IDENTIFIED SHORT-FORM
COMPLAINTS
(J.S. et al. v. Meta Platforms, Inc. et al., Case No. CV2022-1472)

(K.K. et al. v. Meta Platforms, Inc. et al., Case No. 23SMCV03371)

(K.L. et al. v. Meta Platforms, Inc. et al., Case No. CIVSB2218921)

(N.S. et al. v. Snap Inc., Case No. 22CV019089)

(P.F. et al. v. Meta Platforms, Inc. et al., Case No. 23SMCV03371)

Date: April 24, 2024
Time: 9:00 a.m.
Dept.: SSC-12
TABLE OF CONTENTS

Page

I. PLAINTIFFS’ CLAIMS TARGET DEFENDANTS’ AFFIRMATIVE CONDUCT, NOT THEIR ROLE AS PUBLISHERS OF OTHERS’ CONTENT .......................................................1

A. Defendants’ conduct increased children’s risk of sexual exploitation and CSAM .............1

B. Defendants intentionally designed their platforms to exploit adolescents’ social insecurities ...........................................................................................................................6

C. Defendants’ conduct increased children’s risk of harm from dangerous challenges...........9

II. DOE II V. MYSPACE DOES NOT REQUIRE A CONTRARY RESULT .....................................9

III. CONCLUSION ..............................................................................................................................10
I. PLAINTIFFS’ CLAIMS TARGET DEFENDANTS’ AFFIRMATIVE CONDUCT, NOT THEIR ROLE AS PUBLISHERS OF OTHERS’ CONTENT1

When assessing whether Section 230 precludes liability for a Defendant’s actions, courts must “consider the gravamen of the cause of action brought against the provider” because “Section 230 bars liability only if the cause of action seeks to impose liability for the provider’s publication decisions regarding third party content—for example, whether or not to publish and whether or not to depublish.” (Dem. Order at p. 19.) In holding that Section 230 did not bar Plaintiffs’ negligence claims, the Court reasoned that claims targeting “features allegedly negligently crafted or implemented by Defendants” did not treat them as publishers, but sought liability for Defendants’ “own actions.” (Id. at pp. 60-61.) This Court should take the same approach as to Defendants’ role in developing tools and features that increase the risk of children’s exploitation, the proliferation of CSAM, and harm from challenges. At a minimum, a reasonable factfinder could view Plaintiffs’ targeted allegations as supporting liability for Defendants’ own conduct. (Id. at p. 66.) Because Defendants’ “motion to strike is so broad as to include relevant matters, the motion should be denied in its entirety.” (Hill v. Wrather (1958) 158 Cal.App.2d 818, 823.)

For Plaintiffs to prevail at the pleading stage, “there has to be an act that creates greater harm, that Defendants do themselves, and that act has to not be a publication decision about third-party content.” (Hearing Tr., 3/20/24, at 51:8-12.) This brief therefore highlights allegations that focus on Defendants’ conduct, not on publication of others’ content, illustrating how Defendants’ non-publishing decisions greatly increased the risk of sexual exploitation of children and facilitated dangerous challenges. These acts differentiate this case from Doe II v. MySpace (2009) 175 Cal.App.4th 561.

A. Defendants’ conduct increased children’s risk of sexual exploitation and CSAM

Plaintiffs do not seek to hold Defendants liable for publishing sexual messages or CSAM. They seek to hold Defendants responsible for creating several tools that they know facilitate predation and

1 Plaintiffs file this corrected brief to bring the allegations in the brief in line with the Complaint’s allegations against Snap. While Plaintiffs’ counsel regret this error, the corrections do not alter the Section 230 argument or analysis.
increase the risk that children will be harmed.2 Defendants rely on the no-duty-to-protect rule, but this rule does not apply where a claim rests on “an affirmative act of defendant which created an undue risk of harm[,]” (Weirum v. RKO (1975) 15 Cal.3d 40, 41), or “increased the risk of harm to the plaintiff.” (Brown v. USA Taekwondo (2021) 11 Cal.5th 204, n.7.) Each feature described below increased that risk for child users. Even though the harms Plaintiffs suffered due to Defendants’ negligent design and implementation of these features sometimes involved third party misconduct, Defendants’ liability turns on their negligence, not their role as publishers. (See Hassell v. Bird (2018) 5 Cal.5th 522, 542-43; accord Dem. Order at pp. 63-64.) And as the Court previously noted, the “[d]uty to warn of harm allegedly flowing from interactive features allegedly known to risk harm to minors is not barred by Section 230.” (Dem. Order at p. 87.) Because Plaintiffs’ claims of negligence, failure to warn, and fraudulent concealment all target harms that “flow” from Defendants’ negligently designed and implemented features, Section 230 gives no cause to strike allegations that support finding Defendants liable for their own conduct.3

Location:4 Defendants designed and implemented features that increase the risk that predators will locate children. Meta’s geotag feature allows users to indicate a location for a post or photo, MC ¶ 380, and TikTok encourages users to tag their location. MC ¶ 668. Similarly, Snap’s Snap Map is designed to allow children to post their location to the public. MC ¶¶ 478, 511. Defendants created these location tools, and the tools themselves increase the risk that predators will find and exploit children. MC ¶¶ 381, 511, 669. Indeed, predators can use Meta’s search by location tool to find victims, MC ¶¶ 380-81, who are even more vulnerable because Defendants Meta and TikTok set children’s profiles to public by default. MC ¶¶ 373-75, 555-56. Defendants created these tools, and the tools themselves create the risk of harm.

2 See, e.g., MC ¶¶ 156, 366-70, 377-78, 390-91, 472, 481, 494-98, 503, 510-13, 666-78, 684, 762, 774, 784-87, 799.

3 Plaintiffs’ special relationship argument is set forth in their briefing at pp. 5-7 and is another basis to deny Defendants’ motion.

4 This section focuses on tools by Meta, Snap, and TikTok because the Short Form Plaintiffs A.S., K.L., N.S., Glenn-Mills, P.F., and K.K. allege harms from exploitation or sexual abuse against Defendants Meta, Snap, and TikTok. See Am. Short Form Compl. (SFC), A.S. v. Meta, 22STCV28202 (Jan. 5, 2024) at 5; Am. SFC, K.L. v. Meta, CIV SB 2218921 (Jan. 5, 2024) at 5; Am. SFC, Glenn-Mills v. Meta, 23SMCV03371 (Jan. 5, 2024) at 5; Am. SFC, N.S. v. Snap, 22CV019089 (Jan. 5, 2024) at 2, 5; Am. SFC, P.F. v. Meta, 23SMCV03371 (Jan. 5, 2024) at 2, 5; 2d Am. SFC, K.K. v. Meta, 23SMCV03371 (Jan. 17, 2024) at 5.
(See Lemmon v. Snap (9th Cir. 2021) 995 F.3d 1085, 1093 [finding Section 230 did not immunize Snap from its “creation of the Speed Filter” and the ensuing harms from users’ engagement with the tool] [emphasis in original].) To the extent that this Court deems these location tools to touch on others’ content, Defendants are too involved in soliciting and producing user locations to claim immunity, as Defendants created the tools, Snap set its tool to default, TikTok encourages children to use its tool, and Meta created a location search. (See Wozniak v. YouTube, 2024 WL 1151750, at *20 (Cal. App. Mar. 15, 2024); see also PH II v. Superior Ct. (1995) 33 Cal.App.4th 1680, 1683 [“[S]uch use of the motion to strike should be cautious and sparing.”].)

Concealment: Defendants increased the risk of harm to children by implementing tools that conceal activity from parents, including features that encourage and permit minor users to block their parents’ accounts or restrict parental oversight, as through Instagram’s Close Friends Only, Facebook’s “restricted list,” Snap’s My Eyes Only, and TikTok’s Friends Only features. MC ¶¶ 263-64, 476, 494-97, 553. Meta permits adults to send encrypted messages to children. MC ¶¶ 377, 385. Defendants also created tools to make private messages, photos, and videos disappear. Instagram’s “photo bomb” and Meta’s Vanish Mode allow users to send disappearing images or videos. MC ¶ 376. Snaps are, by their nature, disappearing audiovisual messages, and Snap prevents parents from seeing these messages—even with parental controls enabled. MC ¶¶ 412, 415-16, 471-72, 475. Snap further created a self-destruct tool for My Eyes Only content if someone (including a parent) attempts to access it. MC ¶ 477. TikTok’s direct messaging tool also allows messages with virtually no evidence of their content. MC ¶ 678.

Defendants’ concealment features directly make parental supervision more difficult while making children less apprehensive about sharing CSAM content with predators, increasing the overall risk of unchecked exploitation. MC ¶¶ 385, 412, 472, 476, 494, 496-98, 501-02. Worse, Defendants intentionally worked to create spaces where children and their parents operate separately. MC ¶¶ 263-64 (“If Mom starts using an app all the time, the app can lose a ‘cool’ factor, if we’re not conscious of separation.”). Defendants’ actions carry the cost of making children more vulnerable to predation. MC ¶¶ 377, 678. Defendants’ liability stems from creating tools for concealed messaging, and from making the concealment tools available to minors—actions that do not require publishing or depublishing any particular content. (Accord Dem. Order at p. 19.)
Meta’s choice to conceal the risks to children further increased the risk of harm to children. For example, Meta told the public, including parents, that its products were safe for children and designed to limit interactions between children and stranger adults, while hiding its knowledge that its features exacerbated the risk of harm to child users. MC ¶¶ 343-54, 358, 361-63, 370. Snap, likewise, told the public that Snap makes it difficult for adults to find minors and does not facilitate connections with strangers, when the opposite is true. MC ¶ 511. These concealments increase the risk that parents will let their children use Defendants’ platforms or more loosely monitor their activities under a false sense of security. Moreover, Snap leads children to believe incorrectly that their images and videos are lost forever—even though recipients can save the images or videos. MC ¶¶ 473-74. This concealment increases the risk that children will engage in communications that they otherwise would not have. Such claims relate to Defendants’ conduct, not Defendants’ publication of others’ content.

Money and gifts: Defendants’ features that allow predators to send money and gifts to children increase the risk of sexual exploitation and solicitation of CSAM. These features have nothing to do with Defendants’ role as publishers of others’ content, yet facilitate payments from adults to children who are creating livestream content without their parents’ knowledge. MC ¶¶ 178, 203, 501, 603 (Facebook Live, Instagram Live, Snap audio or video calls, and TikTok LIVE). On TikTok LIVE, a predator can award “LIVE Gifts” and “Diamonds” to the child, which the minor can convert to money or virtual items. MC ¶¶ 676-77. Similarly, from 2014-2018, Snapcash allowed adults to send cash to minors. MC ¶¶ 499-500. These tools greatly increase the risk of sexual exploitation and creation of CSAM. MC ¶¶ 499-501, 677. Again, Plaintiffs do not seek to hold Defendants liable as the publisher of others’ content. Defendants could have avoided liability by not creating a feature that allows users to send and receive money or gifts, or at minimum by not making that feature available to minors. Even if this Court determines that others’ content is involved, Defendants’ conduct (including creating TikTok “Diamonds”) is too intertwined to determine—at least at this stage—that Defendants are immune.

Recommendations: Defendants increase the risk of harm to children by recommending that minors
and adult strangers connect.5 Worse, Facebook does not let users disable its recommendations, without which the recommended minors would otherwise be nearly impossible for predators to find. MC ¶ 172. An estimated 80% of “violating adult/minor connections” on Facebook result from its recommendations. MC ¶¶ 172, 372. Snap increases children’s Snapscore when they accept recommendations, incentivizing children to connect with strangers. MC ¶¶ 480-82. Plaintiffs again do not seek to hold Defendants liable for publishing the content of any user; rather, Defendants’ own recommendations that connect children with adult users increase the risk of harm. Defendants’ liability arises from their platforms’ recommendation that the accounts connect (as opposed to a friend request from the adult).

Predator sharing of CSAM: Defendants created tools that make it easier for predators to share CSAM with other predators. This includes the concealment tools discussed above. In addition, TikTok’s Private Videos (also called Post-in-Private) allows a user to store videos that are only accessible to the user. MC ¶ 670. Predators use this tool to store and share CSAM. MC ¶¶ 670-71, 673. Worse, if a user follows a small number of Post-in-Private accounts, TikTok will recommend other Post-in-Private accounts to the user—making it easy for predators to find each other to view and share more CSAM. MC ¶ 670. Plaintiffs again do not seek to hold Defendants liable for their publication decisions as to others’ content. Defendants could avoid liability by refraining from creating concealment tools, and from recommending Post-in-Private accounts to users that frequently use the tool.

Failure to Protect: Defendants’ failure to take protective measures is negligent because—as explained above—Defendants have increased the risk of harm through their affirmative conduct. Defendants did not design adequate parental controls,6 methods to report exploitation or CSAM,7 or age-verification measures.8 Inadequate age verification is particularly dangerous because predators can pretend to be children. MC ¶ 438. For parental controls, Defendants do not require parental consent to

5 See MC ¶¶ 172, 372, 391 (Facebook’s People You May Know, Instagram’s Suggested for You, Instagram’s Because You Watched); 481-83, 494, 511 (Snap’s Quick Add); 555 (TikTok’s Find Friends and People You May Know).

6 MC ¶¶ 258-67, 379 (Meta); 475, 491-93 (Snap); 540, 659 (TikTok); 688, 775 (YouTube).

7 MC ¶¶ 267, 382-84, 386-89, 394-96 (Meta); 506-07 (Snap); 679 (TikTok); 791-94, 801-02 (YouTube).

8 MC ¶¶ 238-57, 373 (Meta); 433, 435-36 (Snap); 540-50, 576, 659, 675 (TikTok); 688, 701, 714-22, 726, 783, 796 (YouTube).
open accounts, connect with adults, or post location; do not link a child’s account with their parent’s account; and do not allow parents to limit usage.9 Defendants also do not inform parents when CSAM is found or of connections and interactions with adults. MC ¶¶ 261, 554. Nor do they warn parents of the risk of predation.10 Although in August 2022, Snap’s Family Center finally allowed parents to see who their child communicates with, parents cannot view the content of any message. MC ¶ 493. TikTok added Family Pairing in April 2020, but children can skirt parental oversight by using a web browser. MC ¶¶ 551-52. Moreover, parents can only use such features if they already know about their child’s account, and children can conceal their activity by creating a second account. MC ¶¶ 493, 552. Plaintiffs, again, do not seek to hold Defendants liable for publishing the content of any user; rather, Plaintiffs allege that Defendants’ own conduct created inadequate features to mitigate the risks that Defendants created.

Ultimately, Defendants have engaged in numerous affirmative acts that increased the risk of sexual exploitation on their platforms, and as such Plaintiffs’ claims are not barred. (Dem. Order at pp. 61-62 [noting congressional intent when enacting Section 230 to “maximize user control over what information is received,” and “empower parents to restrict their children’s access to objectionable or inappropriate online material”].)

B. Defendants intentionally designed their platforms to exploit adolescents’ social insecurities

Many features of Defendants’ platforms increase the risks of child sexual exploitation because Defendants’ platforms are designed to exploit children’s vulnerabilities. But Plaintiffs’ relevant allegations on that issue are not limited to the allegations that Defendants seek to strike. Under California law, courts inquire “whether the defendant’s ‘entire conduct created a risk of harm.’” (Kuciemba v. Victory Woodworks (2023) 14 Cal.5th 993, 1017 [quoting Brown, supra, 11 Cal.5th at n.6].) The proper approach to this holistic inquiry requires the Court to consider the totality of Defendants’ allegedly negligent conduct, which only further underscores the impropriety of Defendants’ motion.

The Court should consider Plaintiffs’ allegations in context and alongside their allegations that

9 MC ¶¶ 259, 261, 401 (Meta); 491-94 (Snap); 540, 576, 659 (TikTok); 775 (YouTube).

10 MC ¶¶ 403 (Meta); 514, 518 (Snap); 652, 662 (TikTok); 803, 809 (YouTube).
Defendants’ affirmative conduct exploited adolescents’ developing brains, desire for social validation, and fear of feeling socially inadequate and alone.11 Adolescents “are especially vulnerable to developing harmful behaviors because their prefrontal cortex is not fully developed,” and have “less impulse control and less ability to evaluate risks, regulate emotions and regulate their responses to social rewards....” MC ¶¶ 64-65. When they step away from Defendants’ platforms for hours or days, youth may experience anxiety, loneliness, depression, dysphoria, irritability, fatigue, or a greater reduction in impulse control.12 Defendants knew this and capitalized on it, designing their platforms with features that present social rewards in a way that leads young users to compulsively seek out more stimulation, feel negative withdrawal symptoms when denied stimulation, and reduce impulse control and emotional regulation. MC ¶¶ 78, 128, 283. By creating the social pressure for minors to accept predator communications and then recommending predators to minors, Defendants increased the risk of harm. The Court should reject Defendants’ attempt to isolate certain factual allegations from Defendants’ whole course of conduct.

Social approval features: Defendants’ social approval features encourage children to connect and engage with predators. These features (such as likes, reactions, hearts, view counts, comments, follows, shares or reposts)13 function as a social measuring stick. MC ¶ 173. By creating an environment where minors feel socially inadequate if they do not accept Defendants’ recommendation to connect with an adult stranger, Defendants increased the risk of connections between predators and children.

Notifications: Defendants use notifications to exploit adolescents’ cravings for more dopamine and their need for “reciprocity”—the psychological desire to respond to an initial gesture.14 This increases the risk that children will respond to a message from a predator. For example, TikTok sends notifications

11 See, e.g., MC ¶¶ 2, 12, 66, 77, 80, 91, 194, 235-36, 411-12, 415, 439-440, 443-45, 447, 450-52, 454-56, 461, 490, 529, 533, 535, 557, 601-06, 659, 686-87, 700, 713, 723, 725, 732, 735-36, 767, 769. Plaintiffs also extensively detail Defendants’ knowledge that children were addicted to their platforms. See, e.g., MC ¶¶ 2, 12, 69, 73-74, 84-85, 105, 120, 156, 159, 265, 269, 272, 285-86, 288-89, 292, 311, 314, 339-40, 412-13, 440, 458, 480, 529, 599-600, 684.

12 See, e.g., MC ¶¶ 70, 72-73, 76-77, 79, 87, 102, 110, 113, 121-22, 124, 126, 128, 288, 412, 440, 458, 479, 605, 763-65.

13 MC ¶¶ 71, 80, 86, 92-93, 128, 179, 202, 235, 271-74, 316, 325, 598, 606, 724, 726, 735, 767.

14 MC ¶¶ 3, 81-83, 86, 89-90, 107, 109, 236, 268, 272, 278, 280, 440, 460, 554, 604-05, 607, 726.
1 (TikTok Now) that manufacture a sense of urgency by making users create private posts within three
2 minutes or lose the ability to see their friends’ posts made within that timeframe. MC ¶ 604. Similarly,
3 because a Snap disappears within 10 seconds of being viewed, users feel the urge to respond immediately.
4 MC ¶ 441. And Meta intentionally limits the information in notifications to increase curiosity. MC ¶ 279.
5 Defendants’ calculated use of notifications enhances children’s desire for social acceptance when they
6 open a message from another user, by Defendants’ design. Children using Defendants’ platforms are thus
7 more likely to feel a psychological need to open and respond to a message from a predator.
Snap rewards: Snap's highly addictive features reward users for using Snapchat and punish them for time away from the platform. MC ¶¶ 412, 440, 446. These score-keeping features incentivize minors to connect with adult strangers and to keep the conversation going. Snapscores, for example, give users a point score that is visible to other users, based on their number of messages sent and stories posted. MC ¶¶ 440, 442-44. Young users are more likely to associate their worth with their Snapscore. MC ¶¶ 444-45. Snap's trophies rewarded users for increasing their Snapscores, sending creative Snaps, or posting live stories. Other users could view a user's trophies in their "trophy box." MC ¶¶ 412, 447. In 2020, Snap replaced trophies with charms, which reward users for hitting milestones in their relationships with other users. MC ¶ 448. Not only does Snap send positive charms like "Best Friends Forever" when two users communicate extensively, but it also sends negative charms, such as "It's Been Forever" or "It's Been a Minute," to encourage users to reengage. Further, Snap Streak is a highly addictive feature designed to encourage users to interact every day. MC ¶¶ 92, 412, 414, 440, 452, 454, 456. Starting with the fire emoji, which rewards users for interacting on three consecutive days, users earn additional emojis as their Streak grows, such as the 100 emoji for 100 consecutive days. MC ¶ 452. To manufacture a sense of urgency, Snap sends notifications to users when their Streak is about to expire in order to keep the Streak alive. MC ¶ 455. Snap has even created a special form through which users who lost their Streak can petition to get it back, showing how significant the Streak is to users. MC ¶ 456. Taken together, these features increase the risk of lengthy back-and-forth communications between predators and children.
Barriers to account deletion: Defendants intentionally designed cumbersome processes for deleting accounts while simultaneously dissuading users from leaving. These unnecessary hurdles make it more difficult for children to escape harmful relationships with adults. For instance, Defendants remind users of
the friends they will leave behind, force users to explain why they are leaving, and then keep user accounts open for a month anyway in the hopes that users will change their minds.15 By pressuring children to maintain their accounts and making the deletion process convoluted, Defendants increase the risk that children will abandon deletion attempts and remain on the social network during harmful relationships.
C. Defendants' conduct increased children's risk of harm from dangerous challenges
Plaintiffs' challenge-related allegations focus narrowly on certain actions of Defendants: their marketing and platform design to foster addiction. These acts, all aimed at driving unhealthy levels of engagement, are themselves harmful, regardless of the content shown. For example, TikTok built challenges into its "architecture and user interface" to further attract and addict children and adolescent users by fostering competition among users seeking to gain more social validation through likes and views. MC ¶ 608; see also MC ¶ 767 (YouTube). And Defendants such as TikTok "encourage[] business to create challenges as a form of marketing" because challenges drive engagement and revenue. MC ¶ 611. Because of TikTok's "engagement-maximization design," TikTok automatically promotes whatever draws attention, regardless of its substance. MC ¶ 612. Indeed, Plaintiffs expressly connect the risks from challenges to Defendants' design choices. MC ¶ 626 (lack of age verification, designing the algorithm to push challenges to young children, and lack of warnings). These allegations target Defendants' negligent design and marketing to drive harmful engagement, not their decisions to publish or depublish particular content. MC ¶¶ 608, 612-26 (TikTok); 767 (YouTube); 128 (both). Section 230 therefore gives no shelter.
II. DOE II V. MYSPACE DOES NOT REQUIRE A CONTRARY RESULT
Doe II does not support Defendants' motion. The Doe II court believed it was "undeniable that [the plaintiffs] seek to hold MySpace responsible for the communications between the Julie Does and their assailants." (Doe II, supra, 175 Cal.App.4th at 565.) The allegations here are decidedly different, and thus Doe II is not binding regarding the specific duties discussed above. Indeed, the Court of Appeal in Bolger described the claims in Doe II as "based on a website's decision 'to restrict or make available certain material,'" and noted that such claims are different from those that seek to hold a website liable for its own non-publishing conduct. (Bolger v. Amazon.com, LLC (2020) 53 Cal.App.5th 431, 465-66.)
15 See MC ¶¶ 332-37 (Meta); 461-62 (Snap); 641-51 (TikTok); 770-73 (YouTube).
Defendants cite Doe II as supporting broad immunity whenever content is in the causal chain, but California law does not support a but-for test. (See id. ["The fact that some content provided by Lenoge was posted on the Amazon website does not automatically immunize Amazon for its own choices and activities unrelated to that content."]; accord Hassell, supra, 5 Cal.5th at 542-43 [plurality]; id. at 559 [Kruger, J., concurring]; Dem. Order at 19-20, 63; Lee v. Amazon.com (2022) 76 Cal.App.5th 200, 256 [permitting claim where issue was whether plaintiffs were warned about mercury]; Demetriades v. Yelp (2014) 228 Cal.App.4th 294, 313 [permitting claim for misrepresenting accuracy of filter]; see also HomeAway.com v. City of Santa Monica (9th Cir. 2019) 918 F.3d 676, 682; Barnes v. Yahoo (9th Cir. 2009) 570 F.3d 1096, 1107 [harm stemmed from posts, but permitted promissory estoppel claim].) Courts in California instead ask whether the duty at issue treats the defendant as the publisher of another's content. (Dem. Order at pp. 57-59 [citing Lee, supra, 76 Cal.App.5th at p. 256 and Lemmon, supra, 995 F.3d at p. 1092]; Hassell, supra, 5 Cal.5th at pp. 542-543 ["[N]ot all legal duties owed by Internet intermediaries necessarily treat them as the publishers of third-party content, even when these obligations are in some way associated with their publication of this material"] [plurality, emphasis added]; id. at 559 [Kruger, J., concurring {quoting this sentence from the plurality with approval}].)
Accordingly, nothing in Doe II requires striking the very different allegations in this case.
III. CONCLUSION
For these reasons, this Court should deny the motion to strike.
/ / /
/ / /
/ / /
DATED: April 8, 2024

MORGAN & MORGAN

/s/ Emily Jeffcott
Emily Jeffcott
633 West Fifth Street, Suite 2652
Los Angeles, CA 90071
Tel.: (213) 787-8590
Fax: (213) 418-3983
ejeffcott@forthepeople.com

Brian J. Panish
Rahul Ravipudi
Jesse Creed
PANISH | SHEA | BOYLE | RAVIPUDI LLP
11111 Santa Monica Boulevard, Suite 700
Los Angeles, CA 90025
Tel.: (310) 477-1700
bpanish@psbr.law
rravipudi@psbr.law