CHRISTINA ARLINGTON SMITH, INDIVIDUALLY AND AS SUCCESSOR-IN-INTEREST TO LALANI WALTON, DECEASED, ET AL. VS. TIKTOK INC., ET AL.
Other Personal Injury/Property Damage/Wrongful Death (General Jurisdiction)
						
                                


1 COVINGTON & BURLING LLP Ashley M. Simonsen, SBN 275203 2 asimonsen@cov.com 3 1999 Avenue of the Stars Los Angeles, CA 90067 4 Tel.: 424-332-4800 5 Attorneys for Defendants Meta Platforms, Inc. f/k/a Facebook, Inc.; Facebook Holdings, LLC; 6 Facebook Operations, LLC; Facebook Payments, 7 Inc.; Facebook Technologies, LLC; Instagram, LLC; and Siculus, Inc. 8 [Additional Counsel on Signature Page] 9 10 SUPERIOR COURT OF THE STATE OF CALIFORNIA FOR THE COUNTY OF LOS ANGELES 11 12 COORDINATION PROCEEDING SPECIAL JUDICIAL COUNCIL COORDINATION 13 TITLE [RULE 3.400] PROCEEDING NO. 5255 14 SOCIAL MEDIA CASES For Filing Purposes: 22STCV21355 15 THIS DOCUMENT RELATES TO: Judge: Hon. Carolyn B. Kuhl SSC-12 16 (Arlington Smith et al. v. TikTok Inc. et al., Case No. 22STCV21355) DEFENDANTS’ NOTICE OF MOTION 17 (A.S. et al. v. Meta Platforms, Inc. et al., AND MOTION TO STRIKE THIRD- 18 Case No. 22STCV28202) PARTY MISCONDUCT AND ONLINE CHALLENGE ALLEGATIONS FROM (Glenn-Mills v. Meta Platforms, Inc. et al., 19 IDENTIFIED SHORT-FORM Case No. 23SMCV03371) COMPLAINTS; MEMORANDUM OF 20 (J.S. et al. v. Meta Platforms, Inc. et al., POINTS AND AUTHORITIES Case No. CV2022-1472) 21 (K.K. et al. v. Meta Platforms, Inc. et al., [Filed concurrently with Declaration of 22 Case No. 23SMCV03371) Alexander L. Schultz] 23 (K.L. et al. v. Meta Platforms, Inc. et al., Date: March 20, 2024 Case No. CIV SB 2218921) Time: 1:45 p.m. 24 Dept.: SSC-12 (N.S. et al. v. Snap Inc., Case No. 22CV019089) 25 (P.F. et al. v. Meta Platforms, Inc. et al., Case No. 23SMCV03371) 26 27 28 1 DEFENDANTS’ MOTION TO STRIKE THIRD-PARTY MISCONDUCT AND ONLINE CHALLENGE ALLEGATIONS FROM IDENTIFIED SHORT-FORM COMPLAINTS; Case No. 
22STCV21355 1 NOTICE OF MOTION AND MOTION TO STRIKE 2 TO PLAINTIFFS AND THEIR ATTORNEYS OF RECORD: 3 PLEASE TAKE NOTICE THAT on March 20, 2024, at 1:45 p.m., or as soon thereafter as the 4 matter may be heard, in Department 12 of the Superior Court for the County of Los Angeles, located at 5 312 North Spring Street, Los Angeles, California 90012, Defendants Meta Platforms, Inc. f/k/a Facebook, 6 Inc.; Facebook Holdings, LLC; Facebook Operations, LLC; Facebook Payments, Inc.; Facebook 7 Technologies, LLC; Instagram, LLC; and Siculus, Inc. (collectively, “Meta” or the “Meta Defendants”); 8 Defendant Snap Inc. (“Snap”); Defendants ByteDance Ltd.; ByteDance Inc.; TikTok Ltd.; TikTok LLC; 9 and TikTok Inc. (collectively, “TikTok” or the “TikTok Defendants”); and Defendants Google LLC and 10 YouTube LLC (collectively, “YouTube” or the “YouTube Defendants”) (and, together with all 11 Defendants, “Defendants”) will and hereby do, pursuant to Code of Civil Procedure sections 435–437 and 12 California Rule of Court 3.1322(a), move to strike portions of Plaintiffs’ below-identified Short-Form 13 Complaints (“SFCs”) and Master Complaint (Personal Injury) (“MC”) (collectively, the “Operative 14 Pleadings”). 15 This Motion is made on the grounds that the identified portions of the Operative Pleadings are not 16 drawn or filed in conformity with the laws of this state, as they are legally insufficient to support a viable 17 cause of action under Section 230 of the Communications Decency Act, 47 U.S.C. § 230 (“Section 230”); 18 the First Amendment to the U.S. Constitution and the Liberty of Speech Clause in the California 19 Constitution (the “First Amendment”); and California tort law. The identified allegations are therefore 20 irrelevant or improper. See Civ. Proc. Code § 436. 21 Defendants move to strike the following portions of the Operative Pleadings (collectively, the 22 “Identified Allegations”): 23 1. 
First Amended Short Form Complaint For Damages And Demand For Jury Trial, A.S. on 24 behalf of E.S. v. Meta Platforms, Inc. et al., Case No. 22STCV28202 (L.A. Super. Ct. filed 25 Jan. 5, 2024) (“A.S. SFC”) — Page 5: “(As against Defendant Snap) Exploitation and/or 26 sexual abuse related harms.” 27 2. First Amended Short Form Complaint For Damages And Demand For Jury Trial, Glenn- 28 Mills v. Meta Platforms, Inc. et al., Case No. 23SMCV03371 (L.A. Super. Ct. filed Jan. 5, 2 DEFENDANTS’ MOTION TO STRIKE THIRD-PARTY MISCONDUCT AND ONLINE CHALLENGE ALLEGATIONS FROM IDENTIFIED SHORT-FORM COMPLAINTS; Case No. 22STCV21355 1 2024) (“Glenn-Mills SFC”) — Page 5: “(As against Defendants Meta, Snapchat, and 2 TikTok) Exploitation and/or sexual abuse related harms.” 3 3. First Amended Short Form Complaint For Damages And Demand For Jury Trial, K.L. on 4 behalf of S.S. v. Meta Platforms, Inc. et al., Case No. CIV SB 2218921 (L.A. Super. Ct. 5 filed Jan. 5, 2024) (“K.L. SFC”) — Page 5: “(As against Defendant Snap) Exploitation 6 and/or sexual abuse related harms.” 7 4. First Amended Short Form Complaint For Damages And Demand For Jury Trial, N.S. on 8 behalf of Z.H. v. Snap Inc., Case No. 22CV019089 (L.A. Super. Ct. filed Jan. 5, 2024) 9 (“N.S. SFC”) — Page 5: “Exploitation and/or sexual abuse related harms.” 10 5. First Amended Short Form Complaint For Damages And Demand For Jury Trial, P.F. on 11 behalf of A.F. v. Meta Platforms, Inc. et al., Case No. 23SMCV03371 (L.A. Super. Ct. 12 filed Jan. 5, 2024) (“P.F. SFC”) — Page 5: “Exploitation and/or sexual abuse related 13 harms.” 14 6. Second Amended Short Form Complaint For Damages And Demand For Jury Trial, J.S. 15 and D.S. on behalf of L.H.S. v. Meta Platforms, Inc. et al., Case No. CV2022-1472 (L.A. 16 Super. Ct. filed Jan. 9, 2024) (“J.S. SFC”) — Page 5: “(As against Meta and TikTok 17 defendants) Harms resulting from being the victim of viral challenges engaged in by other 18 minors.” 19 7. 
Second Amended Short Form Complaint For Damages And Demand For Jury Trial, K.K. 20 on behalf of S.K. v. Meta Platforms, Inc. et al., Case No. 23SMCV03371 (L.A. Super. Ct. 21 filed Jan. 17, 2024) (“K.K. SFC”) — Page 5: “(As to Meta only) Exploitation and/or sexual 22 abuse related harms.” 23 8. In their entirety, Master Complaint (Personal Injury) (“MC”) ¶¶ 137–138, 172, 365–370, 24 372–374, 377–378, 380–385, 387–392, 394–400, 472–473, 476–477, 481–483, 494–495, 25 497–510, 512–513, 518, 550, 555–556, 608–626, 662, 666–667, 669–670, 673–674, 677– 26 681, 767, 774, 781–782, 784–799, 801–802, and 809. 27 9. MC ¶ 18: “. . . sexual exploitation . . . .” 28 3 DEFENDANTS’ MOTION TO STRIKE THIRD-PARTY MISCONDUCT AND ONLINE CHALLENGE ALLEGATIONS FROM IDENTIFIED SHORT-FORM COMPLAINTS; Case No. 22STCV21355 1 10. MC ¶ 156: “Further, Meta It [sic] also understands that these problems can be so extreme 2 as to include encounters between adults and minors—with such ‘sex-talk’ 32x more 3 prevalent on Instagram than on Facebook.” 4 11. MC ¶ 157: “. . . bullying & harassment, connections . . . .” 5 12. MC ¶ 261: “. . . (d) notify parents about interactions with accounts associated with adults; 6 (e) notify parents when CSAM is found on a minor’s account . . . .” 7 13. MC ¶ 344(b): “‘We make it so that adults can’t contact minors who they—they aren’t 8 already friends with.’” 9 14. MC ¶ 359: “The third ad informs the viewer that they ‘look lonely’ and encourages them 10 to ‘[f]ind your partner now to make a love connection.’” 11 15. MC ¶ 371: “Meta also fails to enforce its own policies regarding adolescent users, and 12 does not incorporate simple, cost-effective technologies into the design of its products that 13 would help reduce the prevalence of CSAM.” 14 16. MC ¶ 402: “. . . and they enable predators to identify, connect to, and exploit children.” 15 17. MC ¶ 403: “. . . sexual exploitation from adult users . . . .” 16 18. 
MC ¶ 411: “Snap also dangerously encourages adolescents to increase engagement on the 17 app indiscriminately, pushing tools to share sensitive material with an ever-expanding 18 group of friends and strangers.” 19 19. MC ¶ 413: “. . . ‘to promote bullying . . . and help teenagers buy dangerous drugs or engage 20 in reckless behavior.’” 21 20. MC ¶ 416: “That feature foreseeably and quickly drove users to exchange sexually explicit 22 ‘Snaps,’ sometimes called ‘sexts’ even though they are photos. Because of its brand 23 identity among millennials as the original ephemeral-messaging app, Snapchat almost 24 immediately became known as the ‘sexting’ app—a fact that Snap was or should have been 25 on notice of from public sources.” 26 21. MC ¶ 496: “This feature also contributes to a sense of impunity for many users, 27 encouraging and fomenting exploitation and predatory behavior, which has been observed 28 in multiple empirical studies. According to these studies, Snapchat users believe their 4 DEFENDANTS’ MOTION TO STRIKE THIRD-PARTY MISCONDUCT AND ONLINE CHALLENGE ALLEGATIONS FROM IDENTIFIED SHORT-FORM COMPLAINTS; Case No. 22STCV21355 1 conduct is hidden and accordingly feel empowered to engage in criminal behavior through 2 the product without fear of getting caught.” 3 22. MC ¶ 511: “Stout claimed that Snap makes it harder for strangers to find minors when, in 4 fact, Snapchat’s ‘Quick Add’ feature is responsible for introducing minors to complete 5 strangers, and its ‘Snap Map’ feature has enabled threats, exploitation, and location of 6 minors by complete strangers. Likewise, Snap’s Head of Global Platform Safety, 7 Jacqueline Beauchere, represented to the public that ‘Snapchat is designed for 8 communications between and among real friends; it doesn’t facilitate connections with 9 unfamiliar people like some social media platforms.’ But again, this is not true and/or 10 historically was not the case.” 11 23. MC ¶ 514: “. . . 
sexual exploitation by adult users . . . .” 12 24. MC ¶ 533: “. . . dangerous viral challenges . . . .” 13 25. MC ¶ 652: “. . . sexual exploitation from adult users . . . .” 14 26. MC ¶ 682: “. . . and ByteDance continues to benefit financially from predators who 15 commit sexual abuse against children and/or share CSAM using ByteDance’s product.” 16 27. MC ¶ 684: “. . . dangerous and deadly TikTok challenges; sexual exploitation of minor 17 users; and the exchange of CSAM on TikTok.” 18 28. MC ¶ 779: “. . . including users that violate laws prohibiting the sexual exploitation of 19 children.” 20 29. MC ¶ 803: “. . . sexual exploitation from adult users . . . .” 21 30. MC ¶ 861(h): “Sexual predators use Defendants’ respective products to produce and 22 distribute CSAM.” 23 31. MC ¶ 861(i): “Adult predators target young children for sexual exploitation, sextortion, 24 and CSAM on Defendants’ respective products, with alarming frequency.” 25 32. MC ¶ 861(j): “Usage of Defendants’ respective products can increase the risk that children 26 are targeted and sexually exploited by adult predators.” 27 28 5 DEFENDANTS’ MOTION TO STRIKE THIRD-PARTY MISCONDUCT AND ONLINE CHALLENGE ALLEGATIONS FROM IDENTIFIED SHORT-FORM COMPLAINTS; Case No. 22STCV21355 1 33. MC ¶ 861(k): “Usage of Defendants’ respective products can increase risky and 2 uninhibited behavior in children, making them easier targets to adult predators for sexual 3 exploitation, sextortion, and CSAM.” 4 34. MC ¶ 861(l): “End-to-end encryption and/or the ephemeral nature of Direct Messaging on 5 the Meta, ByteDance, and Snap products prevent the reporting of CSAM.” 6 35. MC ¶ 922: “. . . exposure to predators, sexual exploitation . . . .” 7 36. 
MC ¶ 929(c): “Including features and algorithms in their respective platforms that, as 8 described above, are currently structured and operated in a manner that unreasonably 9 exposes youth users to sexual predators and sexual exploitation, including features that 10 recommend or encourage youth users to connect with adult strangers on or through the 11 platform.” 12 37. MC ¶ 930(g): “Failing to set up, monitor, and modify the algorithms used on their 13 platforms to prevent the platforms from actively driving youth users into unsafe, distorted, 14 and unhealthy online experiences, including highly sexualized, violent, and predatory 15 environments and environments promoting eating disorders and suicide.” 16 38. MC ¶ 930(h): “Failing to implement reasonably available means to monitor for, report, 17 and prevent the use of their platforms by sexual predators to victimize, abuse, and exploit 18 youth users.” 19 39. MC ¶ 933: “. . . exposure to predators, sexual exploitation . . . .” 20 Defendants base this Motion on this Notice, the accompanying Memorandum of Points and 21 Authorities, the Declaration of Alexander L. Schultz, all other pleadings and papers on file in this action, 22 and such other arguments or evidence that may be submitted to the Court prior to or during the hearing on 23 the Motion. 24 25 26 27 28 6 DEFENDANTS’ MOTION TO STRIKE THIRD-PARTY MISCONDUCT AND ONLINE CHALLENGE ALLEGATIONS FROM IDENTIFIED SHORT-FORM COMPLAINTS; Case No. 22STCV21355 1 DATED: January 25, 2024 COVINGTON & BURLING LLP 2 3 By: /s/ Ashley M. Simonsen 4 Ashley M. Simonsen 5 ASHLEY M. SIMONSEN, SBN 275203 asimonsen@cov.com 6 COVINGTON & BURLING LLP 7 1999 Avenue of the Stars Los Angeles, CA 90067 8 Tel.: 424-332-4800 9 Attorneys for Defendants Meta Platforms, Inc. f/k/a Facebook, Inc.; Facebook Holdings, LLC; 10 Facebook Operations, LLC; Facebook Payments, 11 Inc.; Facebook Technologies, LLC; Instagram, LLC; and Siculus, Inc. 
12 [Additional Counsel on Signature Page] 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 7 DEFENDANTS’ MOTION TO STRIKE THIRD-PARTY MISCONDUCT AND ONLINE CHALLENGE ALLEGATIONS FROM IDENTIFIED SHORT-FORM COMPLAINTS; Case No. 22STCV21355 1 TABLE OF CONTENTS 2 MEMORANDUM OF POINTS AND AUTHORITIES .............................................................. 13 3 I. INTRODUCTION ............................................................................................................ 13 4 II. BACKGROUND .............................................................................................................. 13 5 III. LEGAL STANDARD ....................................................................................................... 15 6 IV. ARGUMENT .................................................................................................................... 15 7 A. Section 230 Bars Plaintiffs’ Allegations of Harm From Third-Party Misconduct, 8 CSAM, and Online Challenges. ............................................................................ 15 9 1. Plaintiffs’ Claims Based on the Misconduct of Sexual Predators and 10 CSAM Treat Defendants as Publishers of Harmful Third-Party 11 Content. ..................................................................................................... 15 12 2. Claims Based on Plaintiffs’ Exposure to Challenge Videos Treat 13 Defendants as Publishers of User Content. ............................................... 20 14 B. Plaintiffs’ Allegations of Injury from Online Challenges Are Independently 15 Barred by the First Amendment. ........................................................................... 21 16 C. Plaintiffs Fail to Plead that Defendants Have an Affirmative Duty to Prevent 17 Injury from Third-Party Actors. ............................................................................ 23 18 D. 
Plaintiffs Fail to Plead that Defendants Proximately Caused Their Alleged Harm 19 from Predatory Users, CSAM, and Online Challenges. ....................................... 25 20 V. CONCLUSION ................................................................................................................. 26 21 22 23 24 25 26 27 28 8 DEFENDANTS’ MOTION TO STRIKE THIRD-PARTY MISCONDUCT AND ONLINE CHALLENGE ALLEGATIONS FROM IDENTIFIED SHORT-FORM COMPLAINTS; Case No. 22STCV21355 1 TABLE OF AUTHORITIES 2 Page(s) 3 Cases 4 303 Creative LLC v. Elenis, 5 600 U.S. 570 (2023) .................................................................................................................21 6 Anderson v. TikTok, Inc., 637 F. Supp. 3d 276 (E.D. Pa. 2022) .......................................................................................21 7 Baral v. Schnitt, 8 1 Cal. 5th 376 (2016) ...............................................................................................................15 9 Barrett v. Rosenthal 10 40 Cal. 4th 33 (2006) ...............................................................................................................15 11 Bill v. Super. Ct., 137 Cal. App. 3d 1002 (1982) .................................................................................................23 12 Bride v. Snap Inc., 13 2023 WL 2016927 (C.D. Cal. Jan. 10, 2023) ..........................................................................18 14 Brown v. Ent. Merchs. Ass’n, 15 564 U.S. 786 (2011) .................................................................................................................21 16 Brown v. USA Taekwondo, 11 Cal. 5th 204 (2021) .......................................................................................................24, 25 17 Carafano v. Metrosplash.com, Inc., 18 339 F.3d 1119 (9th Cir. 2003) .................................................................................................18 19 Cohen v. Facebook, Inc., 252 F. 
Supp. 3d 140 (E.D.N.Y. 2017) .....................................................................................15 20 21 Crosby v. Twitter, Inc., 921 F.3d 617 (6th Cir. 2019) ...................................................................................................26 22 DeFilippo v. Nat’l Broad. Co., 23 446 A.2d 1036 (R.I. 1982) .......................................................................................................22 24 Delgado v. Trax Bar & Grill, 36 Cal. 4th 224 (2005) .............................................................................................................23 25 Doe #1 v. Twitter, Inc., 26 2023 WL 3220912 (9th Cir. May 3, 2023) ..............................................................................19 27 Doe II v. MySpace, Inc., 28 175 Cal. App. 4th 561 (2009) .......................................................................................... passim 9 DEFENDANTS’ MOTION TO STRIKE THIRD-PARTY MISCONDUCT AND ONLINE CHALLENGE ALLEGATIONS FROM IDENTIFIED SHORT-FORM COMPLAINTS; Case No. 22STCV21355 1 Doe v. Grindr Inc., 2023 WL 9066310 (C.D. Cal. Dec. 28, 2023) .............................................................17, 18, 20 2 Doe v. Internet Brands, Inc., 3 824 F.3d 846 (9th Cir. 2016) ...................................................................................................17 4 Doe v. MySpace, Inc., 5 528 F.3d 413 (5th Cir. 2008) .............................................................................................17, 21 6 Doe v. Twitter, Inc., 2023 WL 8568911 (N.D. Cal. Dec. 11, 2023) .........................................................................19 7 Doe v. Twitter, Inc., 8 555 F. Supp. 3d 889 (N.D. Cal. 2021) .....................................................................................19 9 Does 1‒6 v. Reddit, Inc., 10 51 F.4th 1137 (9th Cir. 2022) ..................................................................................................19 11 Dyroff v. 
Ultimate Software Grp., Inc., 2017 WL 5665670 (N.D. Cal. Nov. 26, 2017) ........................................................................24 12 Dyroff v. Ultimate Software Grp., Inc., 13 934 F.3d 1093 (9th Cir. 2019) ...............................................................................17, 20, 23, 25 14 In re Facebook, Inc., 15 625 S.W.3d 80 (Tex. 2021)................................................................................................17, 21 16 Fields v. Twitter, Inc., 881 F.3d 739 (9th Cir. 2018) ...................................................................................................26 17 Force v. Facebook, Inc., 18 934 F.3d 53 (2d Cir. 2019).................................................................................................18, 20 19 Frisby v. Schultz, 20 487 U.S. 474 (1988) .................................................................................................................22 21 Gavra v. Google Inc., 2013 WL 3788241 (N.D. Cal. July 17, 2013) ..........................................................................23 22 Grossman v. Rockaway Twp., 23 2019 WL 2649153 (N.J. Super. Ct. June 10, 2019) .................................................................18 24 Hassell v. Bird, 5 Cal. 5th 522 .....................................................................................................................16, 21 25 26 Herceg v. Hustler Mag., Inc., 814 F.2d 1017 (5th Cir. 1987) .................................................................................................22 27 Jane Doe No. 1 v. Uber Techs., Inc., 28 79 Cal. App. 5th 410 (2022) ....................................................................................................24 10 DEFENDANTS’ MOTION TO STRIKE THIRD-PARTY MISCONDUCT AND ONLINE CHALLENGE ALLEGATIONS FROM IDENTIFIED SHORT-FORM COMPLAINTS; Case No. 22STCV21355 1 Kimzey v. Yelp! Inc., 836 F.3d 1263 (9th Cir. 
2016) .................................................................................................18 2 L.W. v. Snap Inc., 3 2023 WL 3830365 (S.D. Cal. June 5, 2023)..........................................................17, 19, 20, 21 4 Lodi v. Lodi, 5 173 Cal. App. 3d 628 (1985) ...................................................................................................23 6 Marshall’s Locksmith Serv. Inc. v. Google, LLC, 925 F.3d 1263 (D.C. Cir. 2019) ...............................................................................................18 7 McCollum v. CBS, Inc., 8 202 Cal. App. 3d 989 (1988) .............................................................................................13, 22 9 Miami Herald Publ’g Co. v. Tornillo, 10 418 U.S. 241 (1974) ...........................................................................................................21, 22 11 Modisette v. Apple Inc., 30 Cal. App. 5th 136 (2018) ........................................................................................13, 25, 26 12 Moran v. Prime Healthcare Mgmt., Inc., 13 94 Cal. App. 5th 166 (2023) ....................................................................................................23 14 NetChoice, LLC v. Att’y Gen., Fla., 15 34 F.4th 1196 (11th Cir. 2022) ..........................................................................................21, 22 16 Noble v. L.A. Dodgers, Inc., 168 Cal. App. 3d 912 (1985) ...................................................................................................26 17 O’Handley v. Padilla, 18 579 F. Supp. 3d 1163 (N.D. Cal. 2022) .............................................................................21, 22 19 Olivia N. v. Nat’l Broad. Co., 20 126 Cal. App. 3d 488 (1981) ...................................................................................................22 21 PH II, Inc. v. Super. Ct., 33 Cal. App. 
4th 1680 (1995) ..................................................................................................15 22 Prager Univ. v. Google LLC, 23 85 Cal. App. 5th 1022 (2022) ............................................................................................15, 20 24 Regents of Univ. of Cal. v. Super. Ct., 4 Cal. 5th 607 (2018) ...............................................................................................................25 25 26 Snyder v. Phelps, 562 U.S. 443 (2011) .................................................................................................................22 27 In re Soc. Media Adolescent Addiction/Pers. Inj. Prods. Liab. Litig., 28 2023 WL 7524912 (N.D. Cal. Nov. 14, 2023) ................................................................ passim 11 DEFENDANTS’ MOTION TO STRIKE THIRD-PARTY MISCONDUCT AND ONLINE CHALLENGE ALLEGATIONS FROM IDENTIFIED SHORT-FORM COMPLAINTS; Case No. 22STCV21355 1 Sorrell v. IMS Health Inc., 564 U.S. 552 (2011) .................................................................................................................21 2 Twitter, Inc. v. Taamneh, 3 598 U.S. 471 (2023) .................................................................................................................25 4 Velez v. Smith, 5 142 Cal. App. 4th 1154 (2006) ................................................................................................15 6 Weirum v. RKO Gen., Inc., 15 Cal. 3d 40 (1975) ................................................................................................................24 7 Young v. Facebook, Inc., 8 2010 WL 4269304 (N.D. Cal. Oct. 25, 2010)..........................................................................25 9 Ziencik v. Snap, Inc., 10 2023 WL 2638314 (C.D. Cal. Feb. 3, 2023)............................................................................24 11 Other Authorities 12 47 U.S.C. 
§ 230 ...................................................................................................................... passim 13 Cal. Const. art. I, § 2(a) ......................................................................................................... passim 14 Cal. Civ. Proc. Code § 436 ............................................................................................................15 15 U.S. Const. amend. I .............................................................................................................. passim 16 17 18 19 20 21 22 23 24 25 26 27 28 12 DEFENDANTS’ MOTION TO STRIKE THIRD-PARTY MISCONDUCT AND ONLINE CHALLENGE ALLEGATIONS FROM IDENTIFIED SHORT-FORM COMPLAINTS; Case No. 22STCV21355 1 MEMORANDUM OF POINTS AND AUTHORITIES 2 I. INTRODUCTION 3 The Court should strike the Identified Allegations relating to third-party misconduct (e.g., by 4 sexual predators), child sexual abuse material (“CSAM”), and online “viral challenges” from the 5 Operative Pleadings. First, Section 230 bars all such allegations because they impermissibly treat 6 Defendants as the publishers or speakers of third-party content. See Doe II v. MySpace, Inc., 175 Cal. 7 App. 4th 561, 569–73 (2009); In re Soc. Media Adolescent Addiction/Pers. Inj. Prods. Liab. Litig. (“Soc. 8 Media”), __ F. Supp. 3d __, 2023 WL 7524912, at *14 (N.D. Cal. Nov. 14, 2023). Second, Plaintiffs’ 9 “challenge” video allegations are independently barred by the First Amendment because such videos 10 constitute protected speech over which Defendants have the right to exercise editorial control and 11 discretion. See, e.g., McCollum v. CBS, Inc., 202 Cal. App. 3d 989, 993–94 (1988). Third, the Identified 12 Allegations are all improper because (1) Defendants have no affirmative duty under California law to 13 protect users from harm allegedly caused by other users of their services, Soc. 
Media, 2023 WL 7524912, 14 at *35–39; and (2) Defendants were not the proximate cause of any such harm, see Modisette v. Apple 15 Inc., 30 Cal. App. 5th 136, 154–55 (2018). 16 II. BACKGROUND1 17 Defendants demurred to Plaintiffs’ MC and three identified SFCs on July 14, 2023, and the Court 18 ruled on the demurrer on October 13, 2023 (the “Order”). The Court sustained the demurrer as to 19 Plaintiffs’ product liability and various other claims (Counts 1–4, 6, 8, and 9) and overruled it as to 20 Plaintiffs’ general negligence and fraudulent concealment claims (Counts 5 and 7). The Order did not 21 address Defendants’ arguments regarding “injuries allegedly caused by child predators, by child sexual 22 abuse material, and by dangerous ‘challenges,’” Order at 11 n.1, and the Court subsequently permitted 23 Defendants to file the instant Motion to Strike as to “the types of injuries that were specified in footnote 24 1” of the Order, Tr. of Proceedings of Nov. 7, 2023 at 18–23; Tr. of Proceedings of Dec. 7, 2023 at 28. 25 26 27 1 Defendants included a detailed background of the claims and parties in their initial demurrer to the MC 28 (see Dem. at 28–36), which is incorporated herein by reference. 13 DEFENDANTS’ MOTION TO STRIKE THIRD-PARTY MISCONDUCT AND ONLINE CHALLENGE ALLEGATIONS FROM IDENTIFIED SHORT-FORM COMPLAINTS; Case No. 22STCV21355 1 The seven SFCs at issue here, filed on behalf of California residents, allege harm from predatory 2 third parties, CSAM, and/or online “challenges,” and incorporate the allegations in the MC by reference. 3 E.g., A.S. SFC at 1. Six of the seven SFCs summarily allege injury from “[e]xploitation and/or sexual 4 abuse related harms” by predatory users of certain Defendants’ services, or from CSAM specifically. In 5 particular, one SFC alleges these harms against Meta, Snap, and TikTok (Glenn-Mills SFC at 5); one 6 against Meta and TikTok (P.F. SFC at 5); three against Snap (A.S. SFC at 5, K.L. SFC at 5, N.S. 
SFC at 7 5); and one against Meta (K.K. SFC at 5). As the MC elaborates, Plaintiffs allege that some third parties 8 target minors for “sexual exploitation” and “sextortion” through Defendants’ services; that predatory 9 adults misuse Defendants’ services “to recruit and sexually exploit children for the production of CSAM 10 and its distribution” via those services; and that Defendants do not adequately monitor or remove CSAM 11 from the services. E.g., MC ¶¶ 137, 861(h)–(l); see also id. ¶¶ 372, 376–81, 396–400 (Meta); id. ¶¶ 476– 12 83 (Snap); id. ¶¶ 666–82 (TikTok); id. ¶¶ 784–802 (YouTube).2 Defendants’ terms of use prohibit 13 criminal “activity such as by third parties who caused harm to plaintiffs.” Soc. Media, 2023 WL 7524912, 14 at *38 n.72; see MC ¶¶ 215, 540, 640, 647, 671 n.748 (incorporating Defendants’ terms of use by 15 reference). 16 The seventh SFC asserts that the Plaintiff experienced “[h]arms” “from being the victim of viral 17 challenges engaged in by other minors” on Meta’s and TikTok’s services. J.S. SFC at 5.3 As alleged in 18 the MC, challenges “are campaigns that compel users to create and post . . . certain types of videos, such 19 as performing a dance routine or a dangerous prank.” MC ¶ 608. The MC acknowledges that challenge 20 videos are created and developed by third-party users. Id. ¶¶ 609‒26. 21 22 23 2 The MC similarly alleges that users of some Defendants’ services have been harmed by cyberbullying, another form of third-party misconduct. See id. ¶ 157 (Meta); id. ¶¶ 413, 472 (Snap). To Defendants’ 24 knowledge, no SFC in these proceedings alleges injury from cyberbullying. Defendants reserve the right 25 to move to strike any such allegations if asserted in future and/or amended SFCs. 3 Defendants are aware of three other SFCs in these proceedings that allege harm from online challenges, 26 all filed on June 14, 2023. See Arroyo v. TikTok, Inc. et al., Case No. 22STCV21355 (L.A. Super. Ct.) 
(plaintiff allegedly died “from participation in viral challenges selected for and sent to” them via TikTok); Arlington Smith v. TikTok, Inc. et al., Case No. 22STCV21355 (L.A. Super. Ct.) (same); Williams v. TikTok, Inc. et al., Case No. 22STCV21355 (L.A. Super. Ct.) (same).

III. LEGAL STANDARD

Motions to strike may be used to “[s]trike out any irrelevant, false, or improper matter inserted in any pleading,” as well as “any part of any pleading not drawn or filed in conformity with the laws of this state.” Civ. Proc. Code § 436; see Baral v. Schnitt, 1 Cal. 5th 376, 393–94 (2016). In cases where “a portion of a cause of action” is “substantively defective on the face of the complaint,” defendants “may attack that portion of the cause of action” through a motion to strike. PH II, Inc. v. Super. Ct., 33 Cal. App. 4th 1680, 1682–83 (1995); see also Velez v. Smith, 142 Cal. App. 4th 1154, 1161 (2006). A court may strike allegations in a pleading “at any time in its discretion.” Civ. Proc. Code § 436.

IV. ARGUMENT

A. Section 230 Bars Plaintiffs’ Allegations of Harm From Third-Party Misconduct, CSAM, and Online Challenges.

As multiple courts have held while addressing materially identical claims, Plaintiffs’ allegations of harm from third-party misuse of Defendants’ services—including through sexual exploitation or abuse and CSAM—would treat Defendants as publishers of third-party content and are barred by Section 230. The Court should therefore strike these allegations from the Operative Pleadings.
The Court should similarly strike allegations of harm from “viral challenges” on Defendants’ services because those allegations are expressly premised on the dissemination of “videos” that third parties “create and post” on Defendants’ services—activity at the heart of Section 230 immunity. MC ¶¶ 613, 608.

1. Plaintiffs’ Claims Based on the Misconduct of Sexual Predators and CSAM Treat Defendants as Publishers of Harmful Third-Party Content.

Section 230 bars Plaintiffs’ claims of injury from third-party predators’ misuse of Defendants’ services because they would treat Defendants “as the publisher[s] or speaker[s] of . . . information provided by another information content provider.” 47 U.S.C. § 230(c)(1); Prager Univ. v. Google LLC, 85 Cal. App. 5th 1022, 1032 (2022) (Section 230 bars allegations “seeking to hold a service provider liable for its exercise of a publisher’s traditional editorial functions” (quoting Barrett v. Rosenthal, 40 Cal. 4th 33, 43 (2006))). Section 230 applies not only to “claims that explicitly” rely on third-party content but also to claims that “implicitly require recourse to that content.” Cohen v. Facebook, Inc., 252 F. Supp. 3d 140, 156 (E.D.N.Y. 2017); see also Hassell v. Bird, 5 Cal. 5th 522, 542 (2018) (no amount of “creative pleading” allows plaintiffs to advance arguments that Section 230 otherwise bars (quotation omitted)).

Plaintiffs have conceded that Section 230 precludes claims against Defendants to the extent they would treat Defendants as “publisher[s] or speaker[s]” of third-party content. 47 U.S.C. § 230(c)(1); see Pls.’ Opp’n to Dem. at 24, 26 (acknowledging that Section 230’s other requirements are satisfied); see also Order at 16. The allegations that are the focus of this motion do just that.
Their gravamen is that Defendants are liable because their services allow minor users to “connect[]” and “communicat[e]” with other users, some of whom are alleged to be predatory, adult strangers or other bad actors who may harm minor users, including through “sexual exploitation, sextortion, and production and distribution of CSAM.” E.g., MC ¶¶ 494, 501‒02; see also, e.g., MC ¶¶ 372, 377, 380‒81, 478, 481, 668, 678, 785 (profile matching, messaging, and location features allegedly “help[] predators connect with underage users”). Numerous courts—including the California Court of Appeal, the federal MDL court presiding over parallel litigation, and more—have held that Section 230 bars near-identical allegations because they treat defendants as publishers of the content created or disseminated by other users.

Doe II v. MySpace, Inc., 175 Cal. App. 4th 561 (2009), is directly on point. There, the Court of Appeal held that Section 230 barred claims arising from the sexual assault of minor plaintiffs by adults they met through the social media service MySpace. Id. at 563–66. As here, plaintiffs argued that Section 230 was inapplicable because their harms were caused not by “harmful or offensive content” but by MySpace’s purported failure to “implement reasonable, basic safety precautions” to “protect[] young children from sexual predators,” such as “age-verification software” and making user accounts private by default. Id. at 569, 565. The Court of Appeal rejected this effort to plead around Section 230, explaining that plaintiffs fundamentally sought to impose liability “for the communications between [plaintiffs] and their assailants” and for MySpace’s purported failure “to regulate what appears on its Web site.” Id. at 573.
That is, plaintiffs “want[ed] MySpace to ensure that sexual predators [did] not gain access to (i.e., communicate with) minors on its Web site,” but “[t]hat type of activity—to restrict or make available certain material—is expressly covered by section 230.” Id.

Doe II is controlling and bars Plaintiffs’ equivalent attempt to hold Defendants liable for “exploitation” or “abuse” by predatory users who communicated with plaintiffs via Defendants’ services. E.g., A.S. SFC at 5; see also MC ¶¶ 365, 370, 503–13, 666‒82, 790‒802. No matter how their allegations are couched, Plaintiffs seek “to hold [Defendants] liable for . . . deciding whether to publish certain material,” and Plaintiffs cannot avoid Section 230 by repackaging such a claim as a challenge to a lack of controls or safety features. Doe II, 175 Cal. App. 4th at 573–74.

Doe II is in accord with a long line of similar cases applying Section 230 to bar claims arising from sexual predators’ use of online services to communicate with and exploit minors. Indeed, the Texas Supreme Court has observed that “[u]nder the view of section 230 adopted in every published decision of which we are aware,” Section 230 precludes claims that a defendant failed “to prevent adults from contacting minors.” In re Facebook, Inc., 625 S.W.3d 80, 93‒95 (Tex. 2021); see also, e.g., Doe v. MySpace, Inc., 528 F.3d 413, 420‒22 (5th Cir. 2008) (Section 230 barred claims that minors would not have been assaulted but for MySpace’s “failure to implement measures” “that would have prevented . . . communicat[ions]” between plaintiffs and their abusers); Doe v. Grindr Inc., __ F. Supp. 3d __, 2023 WL 9066310, at *4 (C.D. Cal. Dec. 28, 2023) (Section 230 barred claims where website’s alleged “defect” was to “ma[k]e it easier . . .
for other users to communicate” with plaintiff and commit offline sexual assaults (citing Herrick v. Grindr LLC, 765 F. App’x 586, 590 (2d Cir. 2019))); L.W. v. Snap Inc., __ F. Supp. 3d __, 2023 WL 3830365, at *4–5 (S.D. Cal. June 5, 2023) (Section 230 barred claim that Snapchat’s “Quick Add” feature recommended sexual predators to minors).⁴

⁴ In contrast, Doe v. Internet Brands, Inc., 824 F.3d 846 (9th Cir. 2016), addressed claims against a developer who learned from an outside source (unconnected to its website) that third parties might physically assault the plaintiff—not claims alleging that a website should have monitored online content. Id. at 848–49, 853.

The MDL court likewise held that Section 230 barred “allegations that [Defendants] recommend adult accounts to adolescents.” Soc. Media, 2023 WL 7524912, at *14; see, e.g., MC ¶¶ 372, 481–83 (similarly challenging recommendation features on Meta and Snap’s services). As that court explained, “recommending one user’s profile to another is publishing of third-party content, which is entitled to Section 230 immunity.” Soc. Media, 2023 WL 7524912, at *14; see also id. (“user accounts or profiles are third-party ‘content,’” and “recommendation function[s]” are the “means through which [Defendants] publish third-party content to users” (citing Dyroff v. Ultimate Software Grp., Inc., 934 F.3d 1093, 1096 (9th Cir. 2019))). Courts nationwide have similarly recognized that Section 230 bars allegations of harm from “friend suggestions” or “‘[c]onnections’ or ‘matches’ of information and individuals.” Force v. Facebook, Inc., 934 F.3d 53, 65–67 (2d Cir. 2019); see also, e.g., Carafano v. Metrosplash.com, Inc., 339 F.3d 1119, 1124–25 (9th Cir.
2003) (no user “profile has any content until a user actively creates it,” and Section 230 covers features “such as ‘matching’ profiles with similar characteristics or highly structured searches”). Indeed, as recently as December 2023, a court in the Central District of California held that Section 230 barred claims that plaintiff and “adult men were matched [in the Grindr dating app] based on their user profiles and geographical proximity,” “leading to . . . in-person meetings and sexual assaults.” Grindr, 2023 WL 9066310, at *3. The court explained that the app’s challenged features “were only relevant to [plaintiff’s] injury to the extent [the features] made it easier or more difficult for other users to communicate,” and thus the plaintiff’s claim sought “to hold Grindr liable for its failure to regulate third party content.” Id. at *4. The same is true here.

Section 230 likewise bars claims that Defendants are liable for allowing users to post or view location information, since “limiting publication of geolocation data provided by users . . . inherently targets the publishing of third-party content and would require defendants to refrain from publishing such content.” Soc. Media, 2023 WL 7524912, at *13 & n.18; see also Herrick, 765 F. App’x at 590 (challenge to “app’s geolocation function” barred because “location information was necessarily provided” by users); Grindr, 2023 WL 9066310, at *3, *5 (same); Marshall’s Locksmith Serv. Inc. v. Google, LLC, 925 F.3d 1263, 1270‒71 (D.C. Cir. 2019) (similar). Section 230 similarly precludes allegations predicated on user-to-user messaging, as such communications are “prototypical” user content. Kimzey v. Yelp! Inc., 836 F.3d 1263, 1266 (9th Cir. 2016) (cleaned up); see also Herrick, 765 F. App’x at 590 (similar).⁵

⁵ Insofar as Plaintiffs allege injury from cyberbullying, these allegations are also barred by Section 230.
Plaintiffs expressly attribute such harms to user-generated content and communications on Defendants’ services—including “photos depicting deviant behavior.” MC ¶ 472; see, e.g., Bride v. Snap Inc., 2023 WL 2016927, at *6 (C.D. Cal. Jan. 10, 2023) (Section 230 barred claims that messaging apps promoted “bullying and harassment” because “[d]efendants did not create or develop the harassing and explicit messages that led to the harm suffered by [p]laintiff; the sending users did”); Grossman v. Rockaway Twp., 2019 WL 2649153, at *13‒14 (N.J. Super. Ct. June 10, 2019) (similar); see also Doe II, 175 Cal. App. 4th at 573 (Section 230 bars liability for users’ “communications”).

To the extent Plaintiffs allege that Defendants are liable for allegedly failing to monitor, report, or remove CSA