DOCKET NO. X06-UWY-CV23-5032685-S

V.V. and E.Q., individually and as next friends to minor C.O.

v.

META PLATFORMS, INC., formerly known as FACEBOOK, INC.; SNAP, INC.; SNAPCHAT, LLC; REGINALD SHARP; and EDDIE RODRIGUEZ

SUPERIOR COURT
COMPLEX LITIGATION DOCKET AT WATERBURY

June 7, 2024

REPLY IN SUPPORT OF DEFENDANT SNAP INC.'S MOTION TO STRIKE SUBSTITUTED COMPLAINT

Plaintiffs’ Opposition confirms that their claims against Snap should be stricken. As an initial matter, Plaintiffs have not shown that the Substituted Complaint is “materially different” from the Original Complaint, so they have waived any challenge to this Court’s prior order holding that “the allegations of this case fall squarely within the ambit of the immunity afforded” by Section 230. V.V. v. Meta Platforms, 2024 WL 678248, at *11 (Conn. Super. Ct. Feb. 16, 2024); Parker v. Ginsburg Dev. Ct., LLC, 85 Conn. App. 777, 780–82 (2004). To the contrary, Plaintiffs devote multiple pages to relitigating the issues already decided in the Court’s prior order, including whether Quick Add involves publication of “third party content” (Opp. 3, 6–10, 21–25), and whether the existence of Bitmojis—cartoon avatars that users can choose to create—takes this case outside the scope of Section 230 (id. 18–25). “Reemphasiz[ing] existing allegations” cannot “correct [the] defects noted in the [prior] order[],” and Plaintiffs have therefore waived their right to challenge the Court’s prior order. St. Denis v. de Toledo, 90 Conn. App. 690, 694–95 (2005).

Even if the Substituted Complaint is considered on its merits, Plaintiffs’ claims against Snap remain squarely barred by Section 230. Plaintiffs allege the same conduct by Snap and the same theory of harm as the Original Complaint: that C.O. connected with third parties through Snapchat’s algorithmic recommendation feature and that those third parties harmed C.O. Sub. Compl. ¶¶ 199–201, 209. As the Court explained, Section 230 bars all claims based on this “factual background” no matter how they are pleaded because all of Snap’s alleged conduct—including providing “user recommendation technologies” and “fail[ing] to regulate content provided by third parties”—is publishing conduct covered by Section 230. V.V., 2024 WL 678248, at *10–11. Plaintiffs’ arguments to the contrary are inconsistent with the Court’s prior order, which Plaintiffs acknowledge is “law of the case” yet fail to meaningfully address. Opp. 16.

Plaintiffs’ Opposition places greater emphasis on their allegations that the Individual Defendants displayed Bitmojis on their Snapchat profiles. But Plaintiffs included similar allegations in their Original Complaint—even including an image of Sharp’s Bitmoji—and they were not enough to overcome Section 230. That result was unquestionably correct—Section 230 protects Snap’s Bitmoji creation tools so long as they are “not intrinsically offensive or unlawful,” are “available equally to bad actors and the app’s intended users,” and do not “require[]” unlawful content, which is plainly the case. Herrick v. Grindr, LLC, 306 F. Supp. 3d 579, 589 (S.D.N.Y. 2018), aff'd, 765 F. App'x 586 (2d Cir. 2019). As Plaintiffs’ allegations make clear, there is nothing intrinsically unlawful about allowing all users to create cartoon avatars, and the overwhelming majority of Snapchat’s hundreds of millions of users create Bitmojis for entirely lawful and innocuous reasons. It is telling that Plaintiffs cite only irrelevant cases in support of their position. For instance, in Anthony v. Yahoo Inc., 421 F. Supp. 2d 1257 (N.D. Cal. 2006), Yahoo itself allegedly created fake profiles for fictitious users. Id. 1262–63; Opp. 19. Here, there is no dispute that the Individual Defendants, not Snap, created and published their Bitmojis, using the same neutral tools available to all Snapchat users. Finally, even if these Bitmoji allegations were not barred, Plaintiffs still cannot state a claim without relying on allegations of harm stemming from the publication of third-party content that the Court already deemed barred by Section 230. In other words, Plaintiffs have not pled and cannot plead a viable theory of liability based on Bitmojis alone.

Beyond Section 230, Plaintiffs’ claims are also barred by the First Amendment and fail as a matter of state law. The Court should strike Plaintiffs’ claims.

THE SUBSTITUTED COMPLAINT IS NOT MATERIALLY DIFFERENT

Plaintiffs concede that they cannot avoid waiver unless they made changes in the Substituted Complaint that “are ‘responsive’ to the defects identified by the trial court.” Opp. 11 (quoting Lund v. Milford Hosp., 326 Conn. 846, 852 (2017)). But none of Plaintiffs’ changes remedies the core defect identified by the Court—that Section 230 bars claims that “arise out of” Snap’s alleged “fail[ure] to regulate content provided by third parties such as Sharp and Rodriguez when they were using [Snapchat].” V.V., 2024 WL 678248, at *11. Plaintiffs have provided “no … correction” that makes “the stricken count[s] legally sufficient.” Perugini v. Giuliano, 148 Conn. App. 861, 878–79 (2014). Plaintiffs simply rehash prior allegations and arguments, while remaining focused on the same core theory that the Court deemed barred by Section 230.

In all of Plaintiffs’ cited cases (Opp. 11–13), the plaintiffs materially revised their claims by either adding missing explanations of how they were harmed or removing the specific allegations that led the court to strike their prior complaint.1 By contrast here, all of Plaintiffs’ revisions continue to focus on the “same factual background” as their earlier stricken claims. V.V., 2024 WL 678248, at *11. Plaintiffs assert, for example, that the Substituted Complaint is materially different from the Original Complaint because it “explicitly [seeks] to hold Snap liable ‘as the publisher of malign content co-created by Snapchat.’” Opp. 12. But Plaintiffs ignore that—as noted in Snap’s Motion—the Original Complaint made that exact argument, Mot. 9; Orig. Compl. ¶ 54 (“Snapchat makes a material contribution to the creation and/or development of their Snapchat postings and becomes a co-publisher of [ ] content” created by “malign users”).2 Connecticut courts are unanimous that “simply reemphasiz[ing] existing allegations” is insufficient. St. Denis, 90 Conn. App. at 695. Plaintiffs’ “attempt to plead around any potential CDA immunity” by characterizing their claims is also a “legal conclusion” that is irrelevant on a motion to strike. V.V., 2024 WL 678248, at *9 n.10; Faulkner v. United Tech. Corp., 240 Conn. 576, 588 (1997).

1 See Parsons v. United Techs. Corp., Sikorsky Aircraft Div., 243 Conn. 66, 75 (1997) (court struck prior complaint because it “fail[ed] to specify a particular [unsafe] ‘workplace’”; amended complaint identified location); O’Donnell v. AXA Equitable Life Ins. Co., 210 Conn. App. 662, 671–72, 677–78 (2022) (court struck prior complaint because it relied “on what [a third party] may have done”; amended complaint explained how “defendant’s actions” harmed plaintiff “independent” of third party); Alexander v. Comm’r of Admin. Servs., 86 Conn. App. 677, 683 (2004) (plaintiffs “transformed their previous, generic equal protection claim into a colorable claim of selective enforcement” by adding specific allegations of unfair targeting); Lund, 326 Conn. at 856–58 (amended complaint removed all allegations that implicated the firefighter’s rule).

2 See id. (alleging that Snap offers its users tools to create content; that “in many cases, the only content in a user’s Snapchat post [is]… content supplied and created by Snapchat”).

Plaintiffs also focus on their allegations regarding Bitmojis (Opp. 5–8, 18–25), but Plaintiffs ignore that—as noted in Snap’s Motion—the Original Complaint likewise alleged Snapchat allowed adults like Sharp to “disguis[e] their identity and/or pos[e] as minors” to exploit minors, and that Sharp did this by misrepresenting himself using a Bitmoji. Mot. 7–8; Orig. Compl. ¶¶ 41, 90, 205, 232. The Original Complaint even included an image of Sharp’s Bitmoji, showing the cartoonish way that he would have appeared to C.O. on Snapchat. Mot. 7; Orig. Compl. ¶ 205. Because the Court already rejected these allegations and held that established caselaw bars claims based on “recommendation technologies and algorithms that operate to connect users together,” V.V., 2024 WL 678248, at *10, Plaintiffs cannot rely on them now to avoid waiver.

Plaintiffs also argue that the Substituted Complaint includes “106 new paragraphs.” Opp. 12. But they ignore Snap’s explanation of how these paragraphs (1) repeat, rephrase, or reorder prior allegations; (2) are irrelevant to the Court’s prior order (including because they concern alleged “experiments” conducted by Plaintiffs’ counsel that are not alleged to resemble C.O.’s experience in any way)3; or (3) are legal conclusions. Mot. 7–9. Plaintiffs nowhere explain how these additions “correct [the] defects noted in the order[],” St. Denis, 90 Conn. App. at 694–95, or are anything more than “attempts to bolster [their] allegations by adding [ ] factual details” where their “claim[s] remain[] the same basic one[s] that this court previously” rejected, Mettler v. Fortunati, 2013 WL 6989515, at *6 (Conn. Super. Ct. Dec. 18, 2013).

3 These allegations also appear to violate the Rules of Professional Conduct. Mot. 8 n.1.

PLAINTIFFS’ CLAIMS AGAINST SNAP ARE LEGALLY INSUFFICIENT

I. Plaintiffs’ Claims Are Barred by Section 230 of the Communications Decency Act.

The Court already held that Snap “qualif[ies] as a ‘provider … of an interactive computer service,’” and that “it is apparent that the harms alleged in this case arise out of communications that originated from third parties.” V.V., 2024 WL 678248, at *8 & n.9. The Court also held that Plaintiffs’ allegations “fall squarely within the ambit of” Section 230 because they “clearly allege that [Snap] failed to regulate content provided by third parties such as Sharp or Rodriguez while they were using” Snapchat. Id. at *11. Plaintiffs admit that the Court’s prior order is “law of the case” and do not ask the Court to reconsider any aspect of its rulings. Opp. 16; Wasko v. Manella, 87 Conn. App. 390, 395 (2005) (“[A] judge should hesitate to change his own rulings in a case and should be even more reluctant to overrule those of another judge.” (cleaned up)). Plaintiffs’ arguments, however, ignore or flatly contradict the Court’s prior order.
At the outset, the Court has already considered and rejected Plaintiffs’ argument that Section 230 does not bar their claims because one of Congress’s goals in enacting Section 230 was to protect minors. See V.V., 2024 WL 678248, at *7; Opp. 14. The Court explained that while Congress sought “to protect minors,” it “was also concerned with ... the continued development of the Internet,” and “made the legislative judgment to effectively immunize” internet platforms “from civil liability in tort with respect to material disseminated by them but created by others.” Id. (quoting Vazquez v. Buhl, 150 Conn. App. 117, 123 (2014)). And because Section 230 “protects websites not only from ultimate liability, but also from having to fight costly and protracted legal battles,” courts apply this immunity bar “at the earliest possible stage,” as the Court properly did by striking the Original Complaint. Word of God Fellowship, Inc. v. Vimeo, Inc., 166 N.Y.S.3d 3, 8 (App. Div. 2022) (cleaned up); see Vazquez, 150 Conn. App. at 123 (holding Section 230 barred Plaintiffs’ claims on motion to strike).

Plaintiffs also repeat their argument that Section 230’s immunity should be construed narrowly because there is a presumption against federal preemption. Opp. 14–16. The Court likewise correctly rejected this argument, explaining that because Section 230 “contains an express preemption clause,” the presumption against preemption does not apply. V.V., 2024 WL 678248, at *11 n.15 (quoting Buono v. Tyco Fire Products, LP, 78 F.4th 490, 495 (2d Cir. 2023)). Plaintiffs also once again incorrectly claim that Snap posits a “but-for test” for Section 230 immunity. Opp. 25–28. Plaintiffs made an identical argument in their prior opposition, and the Court nevertheless held that Section 230 barred their claims. See Dkt. 143 25–28. And Plaintiffs devote much of the Opposition to repeating allegations from the Original Complaint regarding Quick Add (Opp. 3–4, 6–10) and their prior arguments that Quick Add falls outside of Section 230 (id. 21–23).

Finally, Plaintiffs argue that the Substituted Complaint is not barred by Section 230 because it alleges that “irrespective of its status of a ‘publisher,’ Snap … co-creat[ed] content that materially contributed to C.O.’s abuse.” Opp. 16. Here, too, the Original Complaint made an identical allegation, which the Court necessarily rejected when it struck Plaintiffs’ claims. Supra at 3; Mot. 9. Plaintiffs cannot ignore the Court’s prior order and relitigate their claims anew.

A. Plaintiffs’ Claims Are Based on Snap’s Publication of Third-Party Content.

Plaintiffs’ specific arguments against Section 230 immunity equally lack merit. Plaintiffs do not meaningfully address the Court’s prior ruling that Section 230 bars claims based on: (1) harmful communications exchanged on Snapchat; (2) miscellaneous Snapchat features including rewards, content recommendations, unlimited scrolling features, and push notifications; (3) allegedly deficient age verification and parental controls; (4) allegedly deficient reporting processes; or (5) Snap’s alleged failure to warn minor users and their parents of the alleged dangers posed by Snapchat. Mot. 14–21; V.V., 2024 WL 678248, at *1–2, *9–11. Indeed, Plaintiffs double down on their reliance on third-party content, admitting that the Substituted Complaint is based on allegations of “how predatory adults use social media to groom minor victims.” Opp. 12.
Plaintiffs focus on arguing that Section 230 does not bar claims based on recommendation tools and Bitmoji creation tools, but both arguments are barred by the Court’s prior order and governing precedent.

User account recommendation technologies. The Court already held that Section 230 bars claims premised on “user recommendation technologies,” as this “does not change the computer service’s status as a publisher.” V.V., 2024 WL 678248, at *10. Plaintiffs do not explain how their repeated arguments could be consistent with this ruling or the cases the Court relied upon, including the Second Circuit’s decision in Force v. Facebook, 934 F.3d 53, 65–67 (2d Cir. 2019),4 holding that Section 230 protects features that generate suggestions to connect with other users. Mot. 12–13.5

Plaintiffs seek to evade the Court’s order by asserting that Quick Add recommendations “are communications generated by Snapchat.” Opp. 22. But the Original Complaint likewise alleged that Snapchat—not third parties—sent Quick Add recommendations. Orig. Compl. ¶¶ 22, 68, 185, 303. As the Court explained, Section 230 prevents Snap from being held liable based on those recommendations because they “implicate [Snap’s] role, broadly defined, in publishing [the] third party [c]ommunications” that harmed C.O.—those sent by the Individual Defendants. V.V., 2024 WL 678248, at *8 (quoting Cohen v. Facebook, Inc., 252 F. Supp. 3d 140, 156 (E.D.N.Y. 2017)). Section 230 also bars Plaintiffs’ claims because they “require recourse to that [third-party] content to establish liability.” Id. The same was true in Force, where the Second Circuit held that Section 230 protected Facebook’s “‘friend suggestions,’ based on analysis of users’ existing social connections on Facebook and other behavioral and demographic data,” and which allegedly facilitated terrorists connecting with and recruiting other terrorists. 934 F.3d at 58, 65. In its prior order, the Court held that Force prevents Plaintiffs from holding Snap liable for Quick Add and noted that “at oral argument, the plaintiffs’ counsel essentially admitted that the court would have to rule in favor of the defendants, at least with respect to the Quick Add feature, if it followed the reasoning of the Force case.” V.V., 2024 WL 678248, at *10 & n.12. That remains the case.6

4 Plaintiffs do not dispute that “it is well established that when Connecticut courts interpret federal statutes, the decisions of the Second Circuit Court of Appeals carry particularly persuasive weight.” V.V., 2024 WL 678248, at *10 n.11 (cleaned up).

5 For example, in L.W. v. Snap Inc., 675 F. Supp. 3d 1087 (S.D. Cal. 2023), the court held that Section 230 barred virtually identical claims against Snapchat based on Quick Add, applying the same standard from Fair Hous. Council of San Fernando Valley v. Roommates.Com, LLC, 521 F.3d 1157 (9th Cir. 2008) (en banc). Id. at 1097; see Dyroff v. Ultimate Software Grp., Inc., 934 F.3d 1093, 1095 (9th Cir. 2019) (Section 230 protects feature that “recommended groups”); Doe v. Grindr Inc., 2023 WL 9066310, at *3 (C.D. Cal. Dec. 28, 2023) (Section 230 protects “match function” that resulted in in-person assaults).

6 Courts continually affirm this conclusion. For example, most recently in Jane Doe v. Backpage.com, LLC, 2024 WL 2853969 (N.D. Cal. Mar. 20, 2024), the court held that Section 230 barred claims that Meta facilitated sex trafficking by third parties because Meta “[r]ecommend[ed] that vulnerable persons become Instagram ‘friends’ with sex traffickers.” Id. at *1–2. The court held that the plaintiff could not “plead around Section 230 immunity by framing [a] claim as one involving Meta’s connection algorithms rather than the user-generated content that those algorithms facilitate” because “the two are intertwined.” Id. at *2 (cleaned up).

Bitmoji Creation Tools. Although Plaintiffs’ Opposition places great weight on their Bitmoji allegations (Opp. 5–8, 18–25), Plaintiffs cannot skirt Section 230 by targeting Snap’s provision of neutral tools that allow all users to create and post Bitmojis. Such allegations do not render Snap a developer of unlawful content under the CDA, for several independent reasons.

First, Bitmoji creation tools are neutral tools that are protected by Section 230. As Plaintiffs acknowledge (Opp. 24), under settled law from the Second Circuit and elsewhere, Section 230 protects content-creation tools that are “available equally to bad actors and the app’s intended users,” Herrick, 765 F. App’x at 591, are “not intrinsically offensive or unlawful,” Herrick, 306 F. Supp. 3d at 589, and do not “require[]” unlawful content, id. See also Dyroff, 934 F.3d at 1099 (Section 230 protects “provid[ing] neutral tools that a user exploits” to unlawful ends). This includes tools, as Plaintiffs concede, that provide users with content that they can choose to adopt, edit, and publish. Herrick, 306 F. Supp. 3d at 589 (Section 230 protected “Grindr’s drop-down menu for ‘preferred sexual position’” that users can choose to publish); Opp. 24; infra 9–10.

Although the Opposition conclusorily asserts that Bitmojis are somehow “intrinsically offensive” (Opp. 24), Plaintiffs’ own allegations demonstrate otherwise. As Plaintiffs allege, Snap’s Bitmoji tools are available equally to “all users,” Sub. Compl. ¶ 17, and the overwhelming majority of Bitmojis are associated with accounts that do not harm anyone. Indeed, Plaintiffs allege that “[t]he use of Bitmojis to facilitate online communication makes message friendlier and fun.” Id. ¶ 220. That Bitmojis allegedly may be abused by a tiny subset of Snapchat’s hundreds of millions of users in a “sinister” manner, id. ¶ 219, does not convert Snap into a content provider.

Plaintiffs’ own cited cases—where the courts applied Section 230 to bar the asserted claims—confirm as much. See Opp. 17–19. As these courts held, a defendant must “materially contribut[e] to [the] alleged unlawfulness” of third-party content or “specifically encourage[] development of what is offensive about the content.” Vazquez, 150 Conn. App. at 136–37; see also Shiamili v. Real Est. Grp. of New York, Inc., 17 N.Y.3d 281, 292 (2011) (same). Vazquez held that though the website there allegedly “amplified,” “endorsed,” and “adopted” unlawful third-party statements, Section 230 barred the claim. 150 Conn. App. at 139. Similarly, in Shiamili, the court held that even though the website provided a “heading, subheading, and illustration that accompanied the [allegedly defamatory] reposting” of a third party, and even though some of the website’s own content was “offensive,” it did not “‘contribute[ ] materially to the alleged illegality’ of the third-party content within the meaning of the CDA” because the content contributions on their own were “not defamatory as a matter of law.” 17 N.Y.3d at 292–93. Likewise, Plaintiffs provide no argument or authority, nor could they, that Bitmojis are intrinsically unlawful or themselves actionable as a matter of law.
These cases underscore that neutral content-creation tools like the Bitmojis tools challenged here do not affect the applicability of Section 230 immunity.7 Second, Plaintiffs do not address the fact that the Bitmoji tools merely transform a “selfie provided by the user” into a different form. Mot. 17; Sub. Compl. ¶ 211; see id. ¶ 210 (noting that Snap’s Terms refer to “Bitmoji avatars that you assemble”). Plaintiffs fail to adequately explain how these allegations are distinguishable from the many cases cited by Snap that have deemed website features that merely convert user-provided content into a different form protected by Section 230. See Smith v. Airbnb, Inc., 316 Or. App. 378, 388 (2021) (Section 230 barred allegations that website created and “add[ed] icons” to listings posted by third parties based on information in listings); Kimzey v. Yelp! Inc., 836 F.3d 1263, 1269 (9th Cir. 2016) (Section 230 7 Nor is there or can there be any allegation that Snap itself encourages the unlawful use of Bitmojis. Indeed, Snap expressly forbids all the harmful and illegal conduct alleged against the Individual Defendants in Snapchat’s Terms of Service. 9 barred allegations that website “designed and created” star ratings based on third-party inputs).8 Third, Plaintiffs ignore that “users ultimately determine what content to post,” such as how to edit a Bitmoji or whether to publish a Bitmoji at all. Mot. 17; Sub. Compl. ¶¶ 211–213, 366; Goddard v. Google, Inc., 640 F. Supp. 2d 1193, 1197 (N.D. Cal. 2009) (citing Roommates, 521 F.3d at 1172). Plaintiffs have no adequate response to the fact that courts consistently have held that tools providing users with suggested content that they can choose to adopt, edit, or reject entirely fall squarely within the protection of Section 230, because the content ultimately chosen becomes the user’s content. Herrick 306 F. Supp. 3d at 589 (CDA protected “Grindr’s drop-down menu for ‘preferred sexual position’” that users could publish); Carafano v. Metrosplash.com, Inc., 339 F.3d 1119, 1121–24 (9th Cir. 2003) (content provided from drop-down menus were third party content as “no profile has any content until a user actively creates it”); Goddard, 640 F. Supp. 2d at 1197 (same). Indeed, Plaintiffs admit that “Sharp appears to have used Snapchat’s editing features to make his Bitmoji appear more youthful,” highlighting that what allegedly made his Bitmoji harmful was based on his own third-party input. Opp. 8; Sub. Compl. ¶ 293. Plaintiffs’ reliance (Opp. 19–20) on Anthony is misplaced because in that case, the plaintiffs alleged that Yahoo independently “create[d] false profiles.” 421 F. Supp. 2d at 1262– 63. Plaintiffs do not allege (nor could they) that Snap itself created the Individual Defendants’ Bitmojis, nor can Plaintiffs demonstrate that providing content creation tools like Bitmojis was itself unlawful. E.g., Dyroff v. Ultimate Software Grp., Inc., 2017 WL 5665670, at *9–10 (N.D. Cal. Nov. 26, 2017) (Anthony inapplicable where the defendant “did not create or use unlawful content and merely provided its neutral social-network functionalities” or where “it is the users’ voluntary inputs that create the [allegedly harmful] content”), aff’d, 934 F.3d 1093 (9th Cir. 2019); 8 Plaintiffs’ claim that Snap allegedly retains intellectual property rights in Bitmoji avatars is irrelevant to Section 230, Opp. 5 (quoting Sub. Compl. ¶ 210). See, e.g., Small Just. LLC v. Xcentric Ventures LLC, 2014 WL 1214828, at *6–7 (D. Mass. Mar. 
24, 2014) (website’s “acquisition of an exclusive license to the [third-party] content” does not “nullify CDA immunity”), aff’d, 873 F3d 313 (1st Cir. 2017); Joseph v. Amazon.com, Inc., 46 F. Supp. 3d 1095, 1106–07 (W.D. Wash. 2014) (where claims target “publishing content first created and posted by third parties,” that website retains “copyrights” in content is irrelevant to Section 230 analysis). 10 Baldino’s Lock & Key Serv., Inc. v. Google LLC, 285 F. Supp. 3d 276, 281 n.3 (D.D.C. 2018) (Anthony inapplicable where “it is the [third parties’] underlying [ ] claim that is allegedly false”), aff’d sub nom, Marshall’s Locksmith Serv. Inc. v. Google, LLC, 925 F.3d 1263 (D.C. Cir. 2019). Finally, even if Plaintiffs’ Bitmoji allegations somehow could survive Section 230, Plaintiffs do not and cannot advance a viable theory of liability based on Bitmojis alone, because their claims would still “require recourse to [third-party] content”—the Individual Defendants’ communications with C.O.—“to establish liability,” and such allegations are barred as the Court already has held. V.V., 2024 WL 678248, at *8. Indeed, in Shiamili (Opp. 18), the court held that even though an illustration provided by the defendant website accompanying a third-party defamatory posting was itself “offensive,” Section 230 still barred the plaintiff’s defamation lawsuit because the illustration could not “by itself support Shiamili’s claim.” 17 N.Y.3d at 282; Roommates, 521 F.3d at 1167–68 (“we interpret the term ‘development’ as referring not merely to augmenting the content generally, but to materially contributing to its alleged unlawfulness”). No different here, Plaintiffs do not and cannot predicate any claim on Bitmojis alone. B. Plaintiffs’ Remaining Arguments and Cited Cases Are Inapposite. Plaintiffs offer a laundry list of other arguments to avoid Section 230. The Court has already rejected Plaintiffs’ argument that Snap suggests a “but-for test” for Section 230 immunity. Opp. 25–28. The cases Plaintiffs cite in support, many of which the Court already has considered and rejected, are also inapposite. In several cases, the courts held that Section 230 barred the plaintiffs’ claims9 or did not mention Section 230 at all.10 Others involved harms caused by 9 See Hassell v. Bird, 5 Cal. 5th 522, 548 (2018); Cross v. Facebook, Inc., 14 Cal. App. 5th 190, 206–07 (2017). 10 Maynard v. Snapchat, Inc., 313 Ga. 533 (2022). The Court also explained that Maynard is “not helpful” because it “does not even analyze” the CDA. V.V., 2024 WL 678248, at *10. 11 physical products (which the Court has considered and rejected)11 or breach-of-contract claims.12 The remaining cases Plaintiffs cite involved harms that did not depend on third-party content. Plaintiffs cite Lemmon v. Snap, Inc., 995 F.3d 1085 (9th Cir. 2021), but Lemmon reinforced that Section 230 bars claims that “at bottom, depend[] on a third party’s content, without which no liability could have existed.” Id. at 1094. For this reason, the Court already correctly rejected Plaintiffs’ effort to invoke Lemmon. V.V., 2024 WL 678248, at *10. And other courts have repeatedly confirmed that Lemmon and the other cases on which Plaintiffs rely do not apply to claims that depend on “messages and photos sent by” third parties, Doe v. Snap Inc., 2022 WL 2528615, at *14 (S.D. Tex. July 7, 2022), “allegation[s] that the defendant’s website transmitted potentially harmful content,” Herrick, 765 F. 
App’x at 591, or “theor[ies] of liability” that are “inherently tied to content,” Fields v. Twitter, Inc., 217 F. Supp. 3d 1116, 1126 (N.D. Cal. 2016). Plaintiffs rely again on Doe v. Internet Brands, Inc., 824 F.3d 846 (9th Cir. 2016), but there, the victim alleged the website had a duty to warn users of information it learned from an outside source about a scheme to assault women; there was no relevant third-party content on the site and the claim did not depend on any such content or online messages. Id. at 848–49, 852. Finally, Plaintiffs’ suggestion that Snap violated Connecticut criminal laws forbidding child abuse is baseless and irrelevant. Opp. 20–21. The suggestion that Snap willfully violated the referenced criminal statutes is patently false and unsupported by any allegations. At any rate, Plaintiffs do not seek to hold Snap liable under the cited statutes and even if they did, Section 230 preempts inconsistent state criminal laws. 47 U.S.C. § 230(e)(1) (Section 230 does not preempt enforcement of “any … Federal criminal statute” (emphasis added)); Backpage.com, LLC v. 11 Webber v. Armslist LLC, 70 F.4th 945, 949, 965 (7th Cir. 2023) (decedents killed by firearms; court did not decide whether Section 230 applied); Erie Ins. Co. v. Amazon.com, Inc., 925 F.3d 135, 139–40 (4th Cir. 2019) (headlamp that caused fire); Lee v. Amazon.com, 76 Cal. App. 5th 200, 256 (2022) (tangible products); Bolger v. Amazon.com, 53 Cal. App. 5th 431, 465 (2020) (laptop battery that exploded). The Court already held that Bolger and Lee are irrelevant because “there is no allegation here that C.O. actually purchased any product from” Snap. V.V., 2024 WL 678248, at *11. 12 Barnes v. Yahoo!, Inc., 570 F.3d 1096, 1109 (9th Cir. 2009) (Section 230 barred all tort claims, leaving only claim for breach of contract). There is no breach of contract claim here. 12 McKenna, 881 F. Supp. 2d 1262, 1275 (W.D. Wash. 2012). The Court’s prior order and established caselaw demonstrate that Plaintiffs’ claims remain barred by Section 230. II. The First Amendment Also Bars Plaintiffs’ Claims. The First Amendment provides an independent basis for striking Plaintiffs’ claims against Snap. Plaintiffs are wrong that Snap’s First Amendment challenge cannot be raised in a motion to strike. Courts dismiss claims barred by the First Amendment as early as possible in litigation, including at the pleadings stage. See, e.g., Watters v. TSR, Inc., 715 F. Supp. 819, 822 (W.D. Ky. 1989), aff’d, 904 F.2d 378 (6th Cir. 1990). Sylvester v. Town of Greenwich, 2007 WL 1599744 (Conn. Super. Ct. May 18, 2007), is inapposite. There, the court declined to resolve a First Amendment challenge on a motion to strike because it “raise[d] additional facts … [that] were [not] alleged.” Id. at *3. Plaintiffs’ claims here are barred by the First Amendment as alleged.13 On the substance of Snap’s First Amendment argument, Plaintiffs first argue that the First Amendment does not apply to criminal speech. Opp. 28. But Snap does not argue that the First Amendment protects the criminal speech of the Individual Defendants. Snap argues that Plaintiffs’ claims threaten other users’ lawful speech and Snap’s role as a platform for such speech. 
To prevent the type of harm alleged here—based on the Individual Defendants’ communications with C.O.—Snap would be forced to moderate, edit, or restrict third-party speech across Snapchat, including broad swaths of protected speech.14 That requirement would plainly infringe on Snap’s “exercise of editorial control” over such content. Goodrich v. Waterbury Republican-Am., Inc., 13 Plaintiffs’ citations do not suggest otherwise. See Woodcock v. Journal Pub. Co., Inc., 230 Conn. 525, 559 n.6 (1994) (Berdon, J., concurring) (recognizing that “courts should resolve free speech litigation more expeditiously whenever possible,” with summary judgment as an example (cleaned up)); Snyder v. Phelps, 562 U.S. 443, 453 (2011) (stating that courts should “make sure that the judgment does not constitute a forbidden intrusion on the field of free expression”). 14 Even if Plaintiffs alleged that any Bitmojis were themselves a form of harmful content, Snap merely disseminated Bitmojis as created and posted by third parties. See supra at 8–10. 13 188 Conn. 107, 132 (1982); see Mot. 21–22.15 Plaintiffs also claim that the First Amendment does not apply to product defect claims. Opp. 29–30. But Plaintiffs have no persuasive response to the numerous cases holding that the First Amendment bars claims based on alleged “product defects” that depend on the allegedly harmful effects of speech. Mot. 22–23. Plaintiffs claim that the First Amendment does not apply to their allegations regarding “rewards” and other speech generated by Snapchat. But those allegations fault Snap for communicating factual information like notifications that users are on communication “streaks,” see, e.g., Sub. Compl. ¶¶ 139–145, allowing users to share their location, see id. ¶ 133, and telling users that certain other accounts are “people [they] may know,” see id. ¶¶ 169, 414. These factual statements do not fall within one of the “few limited areas” of unprotected speech.” Brown v. Ent. Merchs. Ass’n, 564 U.S. 786, 790–91 (2011); Mot. 22–23. Plaintiffs also offer no meaningful response to Snap’s many cited cases holding that the First Amendment bars their claims based on Snapchat’s alleged inadequate age verification and parental controls. Mot. 23–24.16 Plaintiffs attempt to minimize those cases because they involved statutes rather than tort claims. Opp. 32 n.5. But if anything, First Amendment concerns are more acute here given the risk of restricting speech through free-wheeling tort liability rather than a tailored law drafted by the political branches. James v. Meow Media, 300 F.3d 683, 697 (6th Cir. 2002); Mot 22. The foregoing analysis applies with equal force to Plaintiffs’ failure to warn claim, which would require Snap to warn about the dangers of speech on Snapchat. Courts have repeatedly rejected similar efforts to get around the First Amendment. See Watters, 715 F. Supp. at 822 n.1 (First Amendment barred claim that creators of role-playing game failed to warn “of possible 15 Weirum v. RKO Gen., Inc., 15 Cal. 3d 40 (1975), and Risenhoover v. England, 936 F. Supp. 392 (W.D. Tex. Apr. 2, 1996), are irrelevant because both cases involved speech by the defendants that harmed others. See McCollum v. CBS, Inc., 202 Cal. App. 3d 989, 1005 (1988) (noting that in Weirum, radio station “repeatedly encouraged listeners” to drive at unsafe speeds to win a contest); Risenhoover, 936 F. Supp. at 409–10 (media entities alerted cult members to law enforcement raid despite knowing they were heavily armed, causing shootout). 
16 See NetChoice, LLC v. Griffin, 2023 WL 5660155, at *1–2, *17 (W.D. Ark. Aug. 31, 2023); NetChoice, LLC v. Bonta, 2023 WL 6135551, at *10–18 (N.D. Cal. Sept. 18, 2023); NetChoice, LLC v. Yost, 2024 WL 555904, at *1–2, *14 (S.D. Ohio Feb. 12, 2024). 14 consequences of reading or playing the game materials”); Winter v. G.P. Putnam’s Sons, 938 F.2d 1033, 1037–38 (9th Cir. 1991) (refusing to “introduce a duty we have just rejected by renaming it a ‘mere’ warning label” where plaintiffs’ claims raised First Amendment concerns). Plaintiffs’ cited cases do not show otherwise, as they all concerned physical harms caused by products.17 III. Plaintiffs’ Tort Claims Fail to Allege Proximate Causation. Plaintiffs’ product liability and negligence claims fail for the independent reason that the Substituted Complaint does not sufficiently plead proximate causation. Contrary to Plaintiffs’ assertion (Opp. 33), Connecticut courts consistently have held that the “‘inextricably bound concepts of proximate causation and duty’ are ones that can be challenged by way of a motion to strike.” Muisener v. Saranchak, 2008 WL 853773, at *1 (Conn. Super. Ct. Mar. 13, 2008); see Ganim v. Smith & Wesson Corp., 258 Conn. 313, 364 (2001). Connecticut courts have affirmed dismissal on a motion to strike where, as here, unlawful acts of third parties broke the causal chain. Burns v. Gleason Plant Sec., Inc., 10 Conn. App. 480, 485–86 (1987). As to the substance of their causation theory, Plaintiffs offer only one argument: that C.O.’s harms were foreseeable, so the superseding unlawful acts of the Individual Defendants do not break the chain of causation. Opp. 34. This argument fails. Courts repeatedly have held that providers of online services like Snapchat, which have hundreds of millions of users, cannot foresee that a small fraction of those users may engage in specific unlawful misconduct, even if potential misuse in general may be foreseeable. As the Supreme Court has explained, “bad actors ... are able to use platforms like [Snapchat] for illegal—and sometimes terrible—ends. But the same could be said of cell phones, email, or the internet generally.” Twitter, Inc. v. Taamneh, 598 U.S. 471, 499 (2023). “Yet, we generally do not think that internet or cell service providers incur culpability merely for providing their services to the public writ large.” Id.; see Crosby v. Twitter, Inc., 921 F.3d 617, 625 (6th Cir. 2019) (“[Twitter] ... do[es] not proximately cause 17 See In re Factor VIII or IX Concentrate Blood Prods. Litig., 25 F. Supp. 2d 837, 839 (N.D. Ill. 1998) (infections contracted from blood products); Cty. of Annapolis, Maryland, 2022 WL 4548226, at *1 (D. Md. Sept. 29, 2022) (climate effects caused by fossil fuels); Allen v. Am. Cyanamid, 527 F. Supp. 3d 982, 985 (E.D. Wis. 2021) (lead poisoning from ingesting paint). 15 everything that an individual may do” while using its service).18 The same is true here. C.O.’s specific alleged harms were not foreseeable merely because Snap was allegedly aware that misconduct generally might occur. Plaintiffs do not allege that Snap had notice of the specific bad actor Individual Defendants or any ability to foresee the specific harms C.O. might suffer. In these circumstances, Plaintiffs have not adequately alleged causation. “To hold otherwise would be to convert the imperfect vision of reasonable foreseeability into the perfect vision of hindsight.” Burns, 10 Conn. App. at 485–86. IV. The Product Liability Claims Must Be Stricken. A. 
Snapchat Is Not a Product under the CPLA. Plaintiffs ignore almost all of Snap’s cited cases that have repeatedly held that Snapchat and other online communication services are “services” rather than “products” and thus are not subject to product liability law. Mot. 27–28. Plaintiffs also offer no substantive response to Ziencik v. Snap, Inc., 2023 WL 2638314 (C.D. Cal. Feb. 3, 2023), which squarely held that “Snapchat is more like a service than a product” because product liability does not apply to “non- tangible objects like apps.” Id. at *4; see Opp. 38–39. Plaintiffs instead rely on inapplicable out- of-state decisions. Plaintiffs rely, for example, on Maynard v. Snapchat, Inc., 870 S.E.2d 739 (Ga. 2022), but the parties there did not brief, and the court did not address, whether Snapchat is a “product” for product liability purposes, because the plaintiffs did not plead any product liability claims. Plaintiffs also point to Brookes v. Lyft Inc., 2022 WL 19799628 (Fla. Cir. Ct. Sept. 30, 2022), which involved a challenge to an application that “required [the] Defendant to take his eyes off the road” and caused a crash. But the claim there was “not based on expressions or ideas transmitted by the [ ] application”—it was based on the app directly leading to a crash. Id. at *1– 4. Plaintiffs’ claims are more closely analogous to the many cases that have specifically held that 18 Plaintiffs suggest that cases like Taamneh and Crosby are inapplicable because they involved claims under the Anti-Terrorism Act. Opp. 35. But both cases expressly relied on “common-law principles.” Taamneh, 598 U.S. at 497; see Crosby, 921 F.3d at 623. 16 Snapchat, Facebook, and other communications platforms are “services” rather than “products.”19 Plaintiffs also do not meaningfully address Connecticut courts’ consistent holdings that product liability claims are restricted to “tangible personal property,” Travelers Prop. & Cas. Ins. Corp. v. Yankee Gas Servs. Co., 2000 WL 775558, at *5 (Conn. Super. Ct. May 19, 2000), and “item[s], thing[s], [and] commodit[ies]” with a “physical existence,” Ferrucci v. Atlantic City Showboat, Inc., 51 F. Supp. 2d 129, 133 (D. Conn. 1999); Wilson v. Midway Games, Inc., 198 F. Supp. 2d 167, 174 (D. Conn. 2002) (video game cannot be a “product” because it is “intangible”). Plaintiffs’ sole authority pushing back on the tangibility requirement discuss electricity starting fires. But as the Restatement explains, electricity is a “force[],” the use of which may be “sufficiently analogous to the distribution and use of tangible personal property” to be treated as a product. Restatement (Third) of Torts: Prod. Liab. § 19. Plaintiffs point to no authorities that treat as “products” anything analogous to Snapchat—an intangible communication service.20 Finally, Plaintiffs’ discussion of “public policy factors” is irrelevant. Opp. 36. Plaintiffs offer no authority indicating that the Court can deem something a “product” based on public policy alone. The cases Plaintiffs cite in support of their public policy argument all involved physical injuries caused by tangible objects, and several nonetheless rejected product liability claims.21 B. Snap Is Not a “Product Seller” under the CPLA. Even if Snapchat were a “product,” Snap is not a “product seller.” Plaintiffs offer no explanation as to how their claims satisfy the rule—consistently affirmed by Connecticut courts— 19 Grossman v. Rockaway Twp., 2019 WL 2649153, at *15 (N.J. Super. Ct. June 10, 2019) (Snapchat); Doe. v. 
Facebook, Inc., No 2019-16262 (151st Dist. Ct., Harris Cnty., Tex. Oct. 4, 2019) (Facebook); Jacobs v. Meta Platforms, Inc., 2023 WL 2655586, at *4 (Cal. Super. Ct. Mar. 10, 2023) (Facebook); Jane Doe No. 1 v. Uber Techs., Inc., 79 Cal. App. 5th 410 (2022) (Uber). 20 Plaintiffs note that some courts have treated software as a “good” under the UCC (Opp. 37), but they cite no Connecticut decisions holding that software is a “product” under the CPLA. 21 Leahey v. Lawrence D. Coon & Sons, Inc., 2006 WL 2130438, at *1, *3–4 (Conn. Super. Ct. July 14, 2006) (shed); Stephens v. Sid’s Auto Sales & Used Parts Co., 2014 WL 2262703, at *1, *3 (Conn. Super. Ct. Apr. 29, 2014) (truck); Potter v. Chicago Pneumatic Tool Co., 241 Conn. 199, 204 (1997) (hand tools); Stanton v. Carlson Sales, Inc., 45 Conn. Supp. 531, 533 (1998) (punch press); Bifolck v. Philip Morris, Inc., 324 Conn. 402, 408–09 (2016) (cigarettes). 17 that product liability applies only “when a sale of a product is a principal part of the transaction,” Normandy v. Am. Med. Sys., Inc., 262 A.3d 698, 704 (Conn. 2021), and where the defendant was “‘engaged in the business of selling’ [allegedly defective] product[s],” Burkert v. Petrol Plus of Naugatuck, Inc., 216 Conn. 65, 72 (1990). Plaintiffs’ sole citation confirms that a plaintiff must show that “the thing which caused [the] harm was ... sold, leased, or bailed.” Saraceno v. Am. Ladders & Scaffolds, 2020 WL 3064470, at *2 (Conn. Super. Ct. Apr. 22, 2020). Snap has not “sold, leased, or bailed” Snapchat to anyone, nor do Plaintiffs allege otherwise. V. The Negligence Claims Must Be Stricken Because Snap Did Not Owe a Duty to C.O. Plaintiffs’ negligence claims independently fail because the Substituted Complaint does not allege a cognizable duty. The Opposition fails to distinguish the many cases that have refused to impose duties on communication services to prevent third-party misconduct. Mot. 29–32. The Opposition conclusorily asserts that “Plaintiffs allege harms not involving third-party conduct at all” (Opp. 42), but the Court has already recognized that “each of the plaintiffs’ causes of action arise out of” Snap’s alleged “fail[ure] to regulate content provided by third parties such as Sharp and Rodriguez when they were using [Snap’s] service.” V.V., 2024 WL 678248, at *11. Plaintiffs cannot overcome “the general prohibition against imposing upon an individual” a “duty to control the conduct of a third party.” Cannizzaro v. Marinyak, 312 Conn. 361, 368 (2014). Plaintiffs’ allegations that Snap knew or should have known that Snapchat was “addictive” and “facilitating communication between minor users and unknown adult users” (Opp. 42–43) are precisely the type of allegations that other courts have held “do[] not sufficiently allege misfeasance such that a duty [to prevent third-party harm] should attach.” In re Soc. Media Adolescent Addiction/Pers. Inj. Prods. Liab. Litig. (“Soc. Media”), 2023 WL 7524912 at *37–39 (N.D. Cal. Nov. 14, 2023) (allegations that online services “sought to increase minors’ use of their platforms while ‘knowing or having reason to know’ that [bad actors] also used the sites” were “insufficient” to trigger a duty under all fifty states’ laws); see Jane Doe No. 1, 79 Cal. App. 5th at 428; Ziencik, 2023 WL 2638314, at *5. The same is true here. Plaintiffs’ argument that Snap had a heightened duty to protect its minor users is similarly 18 unavailing. 
“[A]bsent a special relationship of custody or control, there is no duty to protect a third person from the conduct of another.” Fraser v. United States, 236 Conn. 625, 632 (1996). Plaintiffs cite no authority, nor is Snap aware of any, where a court has recognized such a duty arising solely out of a minor’s use of a defendant’s online communication services. See Opp. 45– 46. To the contrary, courts have made clear that websites have no special relationship with their users for duty purposes. See, e.g., Dyroff, 934 F.3d at 1101 (“[N]o special relationship [exists] between Facebook and its users.” (citing Klayman v. Zuckerberg, 753 F.3d 1354, 1359–60 (D.C. Cir. 2014)); Doe No. 14 v. Internet Brands, Inc., 2016 WL 11824793, at *5 (C.D. Cal. Nov. 14, 2016) (“[T]he Court easily concludes that Internet Brands did not have a special relationship with [third-party bad actors].”); Soc. Media, 2023 WL 7524912, at *35 (same). In any case, Plaintiffs’ duty theory fails because imposing affirmative duties of the type outlined by Plaintiffs would be contrary to public policy. Plaintiffs nowhere address, let alone distinguish, this case from the many others holding that imposing a duty of care on defendants who disseminate expressive content would offend public policy by stifling free expression. See Mot. 31–32. “No website could function if a duty of care was created when a website facilitates communication, in a content-neutral fashion, of its users’ content.” Dyroff, 934 F.3d at 1101. VI. The CUTPA Claim Must Be Stricken. Plaintiffs fail to rebut Snap’s cited authority showing that if the Court allows Plaintiffs’ product liability claim to proceed, it must strike their CUTPA claim. Mot. 32–33. The court in Gerrity v. R.J. Reynolds Tobacco Co., 263 Conn. 120 (2003), allowed a CUTPA claim to “be asserted in conjunction with [its] product liability act claim” o