S.L.B., ET AL. VS META PLATFORMS, INC., ET AL. Claims Involving Mass Tort (General Jurisdiction)
PANISH | SHEA | RAVIPUDI LLP
BRIAN J. PANISH, State Bar No. 116060
panish@panish.law
RAHUL RAVIPUDI, State Bar No. 204519
rravipudi@panish.law
JESSE CREED, State Bar No. 272595
jcreed@panish.law
11111 Santa Monica Boulevard, Suite 700
Los Angeles, California 90025
Telephone: 310.477.1700
Facsimile: 310.477.1699

MORGAN & MORGAN
EMILY JEFFCOTT (pro hac vice)*
ejeffcott@forthepeople.com
NARMEEN NKEITI (pro hac vice)*
nnkeiti@forthepeople.com
220 West Garden Street, 9th Floor
Pensacola, Florida 32502
Telephone: 850.316.9100

Attorneys for Plaintiffs

SUPERIOR COURT OF THE STATE OF CALIFORNIA
FOR THE COUNTY OF LOS ANGELES

S.L.B., a Minor, by and through her Guardian ad Litem, Donna Bookhart, and SHARON RUBLE, as Personal Representative to Deceased Plaintiff JADEN HALEY RUBLE, Plaintiffs,

v.

META PLATFORMS, INC., a Delaware Corporation; FACEBOOK HOLDINGS, LLC, a Delaware Limited Liability Company; FACEBOOK OPERATIONS, LLC, a Delaware Limited Liability Company; INSTAGRAM, LLC, a Delaware Limited Liability Company; SNAP INC., a Delaware Corporation; TIKTOK, INC., a California Corporation; BYTEDANCE, INC., a Delaware Corporation; and DOES 1 through 100, inclusive, Defendants.

CASE NO.
COMPLAINT FOR DAMAGES
DEMAND FOR JURY TRIAL

TABLE OF CONTENTS

I. INTRODUCTION
II. JURISDICTION AND VENUE
III. PARTIES
IV. GENERAL FACTUAL ALLEGATIONS
    A. Teenagers Are Particularly Vulnerable to the Perils of Excessive Social Media Use
    B. Unknown and Innumerable Product Defects
    C. Defendants Knowingly Exploit Teenage Vulnerabilities for Unjust Gain
    D. Defendants’ Business Models Encourage Problematic Use to Maximize Screen Time, Thereby Increasing Revenue
    E. Plaintiffs Expressly Disclaim Any and All Claims Seeking to Hold Defendants Liable as the Publisher or Speaker of Any Content Provided, Posted, or Created by Third Parties
V. PLAINTIFF-SPECIFIC ALLEGATIONS
    1. PLAINTIFF S.L.B.
VI. CAUSES OF ACTION
VII. TIMELINESS AND TOLLING OF STATUTES OF LIMITATIONS
VIII. PRAYER FOR RELIEF
IX. DEMAND FOR JURY TRIAL

Plaintiffs S.L.B., a Minor, by and through her Guardian ad Litem, Donna Bookhart, and Sharon Ruble, as Personal Representative to deceased Plaintiff Jaden Haley Ruble (“Plaintiffs”), allege upon personal knowledge and information and belief, based upon, inter alia, the investigation made by and through their attorneys as to all other matters, as follows:

I. INTRODUCTION

1. This matter arises from an egregious breach of the public trust by Defendants Meta Platforms, Inc. (hereinafter, “Meta Platforms”), Facebook Holdings, LLC (hereinafter, “Facebook 1”), Facebook Operations, LLC (hereinafter, “Facebook 2”), Instagram, LLC (hereinafter, “Instagram,” and collectively with Meta Platforms, Facebook 1, and Facebook 2, “Meta”), Snap Inc. (hereinafter, “Snap”), TikTok, Inc. (hereinafter, “TikTok”), ByteDance, Inc. (hereinafter, “ByteDance,” and collectively with Snap and TikTok, “Non-Meta”), and DOES 1-100 (collectively with “Non-Meta” and “Meta,” “Defendants”).

2. Over the last two decades, more and more of our lives have moved onto social media platforms and other digital public spaces. In this vast, still largely unregulated universe of digital public spaces, which are privately owned and primarily run for profit, there exists a tension between what is best for technology companies’ profit margins and what is best for the individual user (especially the predictable adolescent user) and society. Business models are often built around maximizing user engagement without regard to whether users engage with the platform and one another in safe and healthy ways. Technology companies focus on maximizing time spent, not time well spent. In recent years, there has been growing concern about the impact of digital technologies, particularly social media, on the mental health and wellbeing of adolescents.
Many researchers argue that Defendants’ social media products facilitate cyberbullying, contribute to obesity and eating disorders, instigate sleep deprivation to achieve around-the-clock platform engagement, encourage children to negatively compare themselves to others, and foster a broad discontentment with life. They have been connected to depression, anxiety, self-harm, and ultimately suicidal ideation, suicide attempts, and completed suicide.

3. Defendants have intentionally designed their products to maximize users’ screen time, using complex algorithms designed to exploit human psychology and driven by advanced computer algorithms and artificial intelligence available to two of the largest technology companies in the world. Defendants have progressively modified their products to promote problematic and excessive use that they know threatens the actuation of addictive and self-destructive behavioral patterns.

4. Excessive screen time is harmful to adolescents’ mental health, sleep patterns, and emotional well-being. Defendants’ products lack any warnings that foreseeable product use can disrupt healthy sleep patterns, or specific warnings to parents when their child’s product usage exceeds healthy levels or occurs during sleep hours, rendering the platforms unreasonably dangerous. Reasonable and responsible parents are not able to accurately monitor their child’s screen time because most adolescents own or can obtain access to mobile devices and engage in social media use outside their parents’ presence.

5. Defendants do not charge their users to use their platforms but, instead, receive money from advertisers who pay a premium to target advertisements to specific categories of people as studied and sorted by Defendants’ algorithms.
Thus, Defendants generate revenue based upon the total time spent on the application, which directly correlates with the number of advertisements that can be shown to each user.

6. Rather than making meaningful changes to safeguard the health and safety of their users, Defendants have consistently chosen to prioritize profit over safety by continuing to implement and require their users to submit to product components that increase the frequency and duration of users’ engagement, resulting in the pernicious harms described in greater detail below.

7. Plaintiffs bring claims of strict liability based upon Defendants’ defective design of their social media products, which renders such products not reasonably safe for ordinary consumers in general and minors in particular. It is technologically feasible to design social media products that substantially decrease the incidence and magnitude of harm to ordinary consumers and minors arising from their foreseeable use of Defendants’ products, with a negligible increase in production cost. It is also technologically feasible to design and implement effective age and identity verification protocols to ensure that only children of an appropriate age may have access to these products.

8. Plaintiffs also bring claims for strict liability based on Defendants’ failure to provide adequate warnings to minor users and their parents of the danger of the mental, physical, and emotional harms arising from the foreseeable use of their social media products.

9. Plaintiffs also bring claims for common law negligence arising from Defendants’ unreasonably dangerous social media products and their failure to warn of such dangers.
Defendants knew or, in the exercise of ordinary care, should have known that their social media products were harmful to a significant percentage of their minor users, and failed to re-design their products to ameliorate these harms or to warn minor users and their parents of dangers arising out of the foreseeable use of their products. Defendants intentionally created an attractive nuisance to children but simultaneously failed to provide adequate safeguards from the harmful effects they knew were occurring.

10. As is now generally known, in Fall 2021, Frances Haugen, a former Facebook employee turned whistleblower, came forward with internal documents showing that Meta was aware that its platforms and products cause significant harm to its users, especially our children. Non-Meta Defendants’ products—their social media platforms—have similar designs and mechanisms of action, resulting in similar addictive qualities and harmful outcomes to minor users. To this day, the addictive qualities of Defendants’ products and their harmful algorithms are not fully known or understood by minor users or their parents.

II. JURISDICTION AND VENUE

11. This Court has jurisdiction over this action pursuant to CAL. CODE CIV. PRO. §§ 395 and 410.10.

12. This Court has subject matter jurisdiction over all causes of action alleged in this complaint pursuant to CAL. CONST. art. VI, § 10, and this is a court of competent jurisdiction to grant the relief requested.

13. This Court has general personal jurisdiction over Defendants because each is headquartered and has its principal place of business in the State of California and has continuous and systematic operations within the State of California.

14. This Court also has specific personal jurisdiction over Defendants because they actively do business in Los Angeles County and the State of California.
Defendants have purposely availed themselves of the benefits, protections, and privileges of the laws of the State of California through the design, development, programming, manufacturing, promotion, marketing, and distribution of the products at issue, and have purposely directed their activities toward this state. Defendants have sufficient minimum contacts with this state to render the exercise of jurisdiction by this Court permissible.

15. Venue is proper in Los Angeles Superior Court pursuant to CAL. CODE CIV. PRO. §§ 395 and 395.5 because Defendants regularly conduct business in, and certain of Defendants’ liability arose in, Los Angeles County.

III. PARTIES

Plaintiffs

16. Plaintiff Donna Bookhart and S.L.B., a Minor, are residents of the City of Rock Hill, South Carolina. S.L.B. is the minor daughter of Donna Bookhart.

17. Plaintiff Sharon Ruble is a resident of Spartanburg, South Carolina. Sharon Ruble is the Personal Representative to deceased Plaintiff Jaden Haley Ruble.

Defendant Meta Platforms, Inc.

18. Meta Platforms is a multinational technology conglomerate, having its principal place of business in Menlo Park, California. Meta develops and maintains social media platforms, communication platforms, and electronic devices.1 Meta Platforms was originally incorporated in Delaware on July 29, 2004, as “TheFacebook, Inc.” On September 20, 2005, the company changed its name to “Facebook, Inc.” On October 28, 2021, the company assumed its current designation.
While Plaintiffs have attempted to identify the specific Meta Platforms subsidiary(ies) that committed each of the acts alleged in this Complaint, Plaintiffs were not always able to do so, in large part due to ambiguities in Meta Platforms’ and its subsidiaries’ own documents, public representations, and lack of public information. However, upon information and belief, Meta Platforms oversees the operations of its various platforms and subsidiaries, some of which have been identified and are listed below. For this reason, unless otherwise specified, the shorthand “Meta” contemplates the apparent control that Meta Platforms wields over its subsidiaries’ overall operations and, therefore, further refers to its various subsidiaries and predecessors. To the extent this assumption is incorrect, the knowledge of which Meta Platforms subsidiary, current or former, is responsible for specific conduct is knowledge solely within Meta’s possession, the details of which Plaintiffs should be permitted to elucidate during the discovery phase.

1 These platforms and products include Facebook (its self-titled app, Messenger, Messenger Kids, Marketplace, Workplace, etc.), Instagram (and its self-titled app), and a line of electronic virtual reality devices called Oculus Quest (soon to be renamed “Meta Quest”).

19. Meta Platforms’ subsidiaries include but may not be limited to: Facebook 1 (Delaware); Facebook 2 (Delaware); Facebook Payments, Inc. (Florida); Facebook Technologies, LLC (Delaware); FCL Tech Limited (Ireland); Instagram (Delaware); Novi Financial, Inc. (Delaware); Runways Information Services Limited (Ireland); Scout Development LLC (Delaware); Siculus (Delaware); and a dozen other entities whose identity or relevance is presently unclear.

Subsidiary Meta Defendants

20.
Facebook 1 was incorporated in Delaware on March 11, 2020, and is a wholly owned subsidiary of Meta Platforms. Facebook 1 is primarily a holding company for entities involved in Meta Platforms’ supporting and international endeavors, and its principal place of business is in Menlo Park, California. Meta Platforms is the sole member of this LLC Defendant.

21. Facebook 2 was incorporated in Delaware on January 8, 2012, and is a wholly owned subsidiary of Meta Platforms. Facebook 2 is likely a managing entity for Meta Platforms’ other subsidiaries, and its principal place of business is in Menlo Park, California. Meta Platforms is the sole member of this LLC Defendant.

22. Instagram was founded by Kevin Systrom and Mike Krieger in October 2010. In April 2012, Meta Platforms purchased the company for $1 billion (later statements from Meta Platforms have indicated the purchase price was closer to $2 billion). Meta Platforms reincorporated the company on April 7, 2012, in Delaware. Currently, the company’s principal place of business is in Menlo Park, California. Instagram is a social media platform tailored for photo and video sharing. Meta Platforms is the sole member of this LLC Defendant.

23. By admission, Facebook and Instagram are products (Meta’s Vice President of Messaging Products Loredana Crisan, Celebrating 10 Years of Messenger With New Features (August 25, 2021, last visited July 29, 2022, at 1:10 PM CST), https://about.fb.com/news/2021/08/messenger-10th-birthday/), the safety of which was not duly addressed prior to public distribution (Our Progress Addressing Challenges and Innovating Responsibly (September 21, 2021, last visited July 29, 2022, at 1:17 PM CST), https://about.fb.com/news/2021/09/our-progress-addressing-challenges-and-innovating-responsibly/).

24.
Meta knowingly exploited its most vulnerable users—children worldwide—to drive corporate profit. Meta operates the world’s largest family of social networks, enabling billions of users worldwide to connect, view, and share content through mobile devices, personal computers, and virtual reality headsets. A user does not have to pay to create an account. Instead of charging account holders to access the platform, Meta became one of the world’s most valuable companies from the sale of advertisement placements to marketers across its various platforms and applications. For example, upon information and belief, Meta generated $69.7 billion from advertising in 2019, more than 98% of its total revenue for the year. Meta can generate such revenues by marketing its user base to advertisers. Meta collects and analyzes data to assemble virtual dossiers on its users, covering hundreds if not thousands of user-specific data segments. This data collection and analysis allows advertisers to micro-target advertising and advertising dollars to very specific categories of users, who can be segregated into pools or lists using Meta’s data segments. Only a fraction of these data segments come from content that is explicitly designated by users for publication or explicitly provided by users in their account profiles. Many of these data segments are collected by Meta through surveillance of each user’s activity on and off the platform, including behavioral surveillance that users are not even aware of, like navigation paths, watch time, and hover time. At bottom, the larger Meta’s user base grows, the more time users spend on the platform, and the more detailed information Meta can extract from its users, the more money Meta makes.

25.
As of October 2021, Facebook had roughly 2.91 billion monthly active users, thus reaching 59% of the world’s social networking population, the only social media platform to reach over half of all social media users. Instagram has become the most popular photo-sharing social media platform amongst teenagers and young adults in the United States, with over 57 million users below the age of eighteen, meaning that 72 percent of America’s youth use Instagram. Eleven percent of parents in the U.S. know their child between the ages of 9 and 11 uses Instagram.2 Likewise, six percent of parents in the U.S. know their child between the ages of 9 and 11 uses Facebook.3

26. Two Meta products, the www.Facebook.com (“Facebook”) and www.Instagram.com (“Instagram”) websites and respective interrelated apps (collectively, “Meta 2”), rank among the most popular social networking products, with more than two billion combined users worldwide. It is estimated that nine out of ten teens use social media platforms, with the average teen using the platforms roughly three hours per day. Given the delicate, developing nature of the teenage brain and Meta’s creation of social media platforms designed to be addictive, it comes as no surprise that we are now grappling with the ramifications of Meta’s growth-at-any-cost approach, to wit, a generation of children physiologically entrapped by products whose effects collectively result in long-lasting adverse impact on their rapidly evolving and notoriously precarious mental health.

27. Meta, as originally conceived, ostensibly functioned like an enormous virtual bulletin board, where content was published by authors. But Meta has evolved over time with the addition of numerous features and products designed by Meta to engage users. The earliest of these—the search function and the “like” button—were primarily user-controlled features.
In more recent years, however, Meta has taken a more active role in shaping the user experience on the platform with more complex features and products. The most visible of these are curated recommendations, which are pushed to each user in a steady stream as the user navigates the website, and in notifications sent to the user’s smartphone and email addresses when the user is disengaged from the platform. These proprietary Meta products include News Feed (a newsfeed of stories and posts published on the platform, some of which are posted by your connections, and others that are suggested for you by Meta), People You May Know (introductions to persons with common connections or background), Suggested for You, Groups You Should Join, and Discover (recommendations for Meta groups to join). These curated and bundled recommendations are developed through sophisticated algorithms. As distinguished from the earliest search functions that were used to navigate websites during the Internet’s infancy, Meta’s algorithms are not based exclusively on user requests or even user inputs. Meta’s algorithms combine the user’s profile (e.g., the information posted by the user on the platform) and the user’s dossier (the data collected and synthesized by Meta, to which Meta assigns categorical designations), make assumptions about that user’s interests and preferences, make predictions about what else might appeal to the user, and then make very specific recommendations of posts and pages to view and groups to visit and join, based on rankings that will optimize Meta’s key performance indicators.

2 Katherine Schaeffer, 7 facts about Americans and Instagram, Pew Research Center (Oct. 7, 2021), https://www.pewresearch.org/fact-tank/2021/10/07/7-facts-about-americans-and-instagram/.
3 Id.
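For purposes of illustration only, the profile-plus-dossier recommendation mechanism described above can be sketched in simplified pseudocode-style Python. This is emphatically not Defendants’ actual code; every name, field, and weight below is invented, and the sketch shows only the general shape of the alleged mechanism: combine explicit profile signals with inferred dossier signals, predict appeal, and rank candidates.

```python
# Hypothetical sketch of dossier-driven recommendation ranking.
# All field names and weights are invented for illustration.

def score(candidate, profile, dossier):
    """Predict how strongly a candidate post or group might engage this user."""
    s = 0.0
    # Explicit signals: interests the user actually stated or posted.
    s += 1.0 * len(candidate["topics"] & profile["stated_interests"])
    # Inferred signals: categories assigned from behavioral surveillance
    # (navigation paths, watch time, hover time, and the like).
    s += 2.0 * len(candidate["topics"] & dossier["inferred_interests"])
    return s

def recommend(candidates, profile, dossier, k=3):
    """Return the top-k candidates, ranked by predicted engagement."""
    ranked = sorted(candidates, key=lambda c: score(c, profile, dossier), reverse=True)
    return ranked[:k]
```

Note that in this sketch the inferred (surveilled) signals are weighted more heavily than the user’s explicit choices, mirroring the allegation that only a fraction of the data segments come from content users knowingly provide.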
28. A user’s “feed” on both Facebook and Instagram is comprised of an endless series of photos, videos, text captions, and comments posted by accounts that the user follows, along with advertising and content specifically selected and promoted by Instagram and Facebook.

29. Instagram also features a “discover” page, where a user is shown an endless feed of content that is selected by an algorithm designed by Instagram based upon the user’s data profile: demographics, prior activity on the platform, and other data points. Meta has added similar features to Facebook in the app’s “menu” and “watch” sections.

30. Engineered to meet the evolving demands of the “attention economy”4 (a term used to describe the supply and demand of a person’s attention, a highly valuable commodity for internet websites), in February 2009 Meta introduced perhaps its most conspicuous effort to addict users through intermittent variable rewards (“IVR”): its “Like” button. Instagram launched that same year and came ready-made with a like function shaped as a heart. Additional features of Meta’s IVR include its delay-burst notification system, comments, posts, shares, and other dopamine-triggering content. Instagram’s notification algorithm delays notifications to deliver them in spaced-out, larger bursts. Facebook likely uses a similar feature. These designs take advantage of users’ dopamine-driven desire for social validation and optimize the balance of negative and positive feedback signals to addict users.

4 The business model is simple: the more attention a platform can pull from its users, the more effective its advertising space becomes, allowing it to charge advertisers more.
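The “delay-burst” delivery alleged above can likewise be sketched for illustration. This is a hypothetical simplification, not Defendants’ actual implementation; the class name, queueing logic, and burst threshold are all assumptions made solely to show the alleged pattern of withholding notifications and releasing them together.

```python
# Hypothetical sketch of delay-burst notification batching: events are
# queued instead of delivered in real time, then released together once a
# burst threshold is reached, creating the anticipation gap alleged above.

class DelayBurstNotifier:
    def __init__(self, burst_size=3):
        self.burst_size = burst_size
        self.queue = []

    def on_event(self, event):
        """Queue a like/comment/tag instead of notifying immediately."""
        self.queue.append(event)
        if len(self.queue) >= self.burst_size:
            return self.flush()  # deliver one larger, spaced-out burst
        return []                # gap: the user hears nothing yet

    def flush(self):
        """Release everything queued so far as a single burst."""
        burst, self.queue = self.queue, []
        return burst
```

Under this sketch, three separate events produce silence, silence, then one three-item burst, rather than three real-time notifications.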
31. IVR is a method used to addict a user to an activity by spacing out dopamine-triggering stimuli with dopamine gaps—a method that allows anticipation and craving to develop and strengthens the addiction with each payout. The easiest way to understand this term is by imagining a slot machine. You pull the lever (intermittent action) with the hope of winning a prize (variable reward). In the same way, you refresh Defendants’ feeds, endure the brief delay, and then learn whether anyone has tagged you in a photo, mentioned you in a post, sent you a message, or liked, commented on, or shared one of your posts. As explained below, Meta (and, upon information and belief, all Defendants) space out notifications into multiple bursts (dopamine gaps) rather than notifying users in real time, to maximize the platforms’ addictiveness.

32. Over the past decade or so, Meta has added features and promoted the use of auto-playing short videos and temporary posts on Facebook and Instagram, the former being referred to as “Reels” and the latter as Instagram “Stories.”

33. Facebook and Instagram notify users by text and email of activity that might be of interest, which is designed to and does prompt users to open Facebook and Instagram and be exposed to content selected by the platforms to maximize the length of time and amount of content viewed by the user. Facebook and Instagram include many other harm-causing features, as discussed below.

34. Equipped with ample information about the risks of social media, the ineffectiveness of its age-verification protocols, and the mental processes of teens, Meta has expended significant effort to attract preteens to its products, including substantial investments in designing products that would appeal to children ages 10 to 12.
Meta views pre-teens as a valuable, unharnessed commodity, so valuable that it has contemplated whether there is a way to engage children during play dates.5 Meta’s unabashed willingness to target children—in the face of its conscious, long-standing, plainly deficient age-verification protocols—demonstrates the depths to which Meta is willing to reach to maintain and increase its profit margin.

5 Georgia Wells and Jeff Horwitz, Facebook’s Effort to Attract Preteens Goes Beyond Instagram Kids, Documents Show (2021), https://www.wsj.com/articles/facebook-instagram-kids-tweens-attract-11632849667.

35. Faced with the potential for a reduction in value due to its declining number of users, in or around early 2018, Meta (and likely Meta 2) revamped its interface to transition away from chronological ranking, which organized the interface according to when content was posted or sent, and to prioritize Meaningful Social Interactions, or “MSI,” which emphasizes interactions (e.g., likes and comments) by a user’s connections and gives greater significance to the interactions of connections that appear to be closest to the user. To effectuate this objective, Facebook developed and employed an “amplification algorithm” to execute engagement-based ranking, which considers a post’s likes, shares, and comments, as well as a respective user’s past interactions with similar content, and exhibits the post in the user’s newsfeed if it otherwise meets certain benchmarks. The algorithm covertly operates on the proposition that intense reactions invariably compel attention. Because it measures reactions and contemporaneously immerses users in the most reactive content, and negative content routinely elicits passionate reactions, the algorithm effectively works to steer users toward the most negative content.
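For illustration only, the engagement-based ranking alleged in the preceding paragraph can be sketched as follows. This is a hypothetical simplification, not Defendants’ actual algorithm; the reaction weights and closeness map are invented, chosen solely to show how scoring posts by the intensity of reactions, with closer connections weighted more heavily, can systematically surface the most reactive (often the most negative) content.

```python
# Hypothetical sketch of engagement-based ("MSI"-style) ranking: each post
# is scored from its reactions, weighted by how close the reacting
# connection is to the user; high-reaction posts displace chronological order.

def msi_score(post, closeness):
    """Score a post from its reactions; `closeness` maps user id -> weight."""
    weights = {"like": 1, "comment": 5, "share": 10, "angry": 5}  # invented
    total = 0.0
    for user_id, reaction in post["reactions"]:
        # Unknown connections get a small default weight.
        total += weights.get(reaction, 0) * closeness.get(user_id, 0.1)
    return total

def rank_feed(posts, closeness):
    """Order the feed by engagement score instead of recency."""
    return sorted(posts, key=lambda p: msi_score(p, closeness), reverse=True)
```

In this sketch, a post that provokes shares and heated comments outranks a quietly liked post regardless of when either was published, which is the mechanism the complaint alleges steers users toward the most reactive content.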
36. Meta CEO Zuckerberg publicly recognized this in a 2018 post, in which he demonstrated the correlation between engagement and sensational content so extreme that it impinges upon Meta’s own ethical limits, with the following chart:6

[Chart not reproduced in this text version.]

6 Mark Zuckerberg, A Blueprint for Content Governance and Enforcement, FACEBOOK, https://www.facebook.com/notes/751449002072082/ (last visited January 8, 2022).

37. The algorithm controls what appears in each user’s News Feed and promotes content that is objectionable and harmful to many users. In one internal report, Meta concluded that “[o]ur approach has had unhealthy side effects on important slices of public content, such as politics and news,” with one data scientist noting that “[t]his is an increasing liability.” In other internal memos, Meta concluded that because of the new algorithm, “[m]isinformation, toxicity, and violent content are inordinately prevalent.” Other documents show that Meta employees also discussed Meta’s motive for changing its algorithm—namely, that users began to interact less with the platform, which became a worrisome trend for Meta’s bottom line. Meta found that the inflammatory content that the new algorithm was feeding to users fueled their return to the platform and led to more engagement, which, in turn, helped Meta sell more of the digital ads that generate most of its revenue. All told, Meta’s algorithm optimizes for angry, divisive, and polarizing content because such content increases its number of users and the time users stay on the platform per viewing session, which increases its appeal to advertisers, thereby increasing its overall value and profitability.

38.
Upon information and belief, at least as far back as 2019, Meta initiated, inter alia, a Proactive Incident Response experiment, which began researching the effect of Meta on the mental health of today’s youth.7 Meta’s own in-depth analyses show significant mental-health issues stemming from the use of Instagram among teenage girls, many of whom linked suicidal thoughts and eating disorders to their experiences on the app.8 Meta’s researchers have repeatedly found that Instagram is harmful for a sizable percentage of the teens who use the platform. In an internal presentation from 2019, Meta researchers concluded that “[w]e make body issues worse for one in three teen girls,” and “[t]eens blame Instagram for increases in the rate of anxiety and depression.” Similarly, in a March 2020 presentation posted to Meta’s internal message board, researchers found that “[t]hirty-two percent of teen girls said that when they feel bad about their bodies, Instagram made them feel worse.” Sixty-six percent of teen girls and forty-six percent of teen boys have experienced negative social comparisons on Instagram. Thirteen-and-one-half percent of teen-girl Instagram users say the platform makes thoughts of “suicide and self-injury” worse. Seventeen percent of teen-girl Instagram users say the platform makes “[e]ating issues” worse.

7 See Protecting Kids Online: Testimony from a Facebook Whistleblower, United States Senate Committee on Commerce, Science, & Transportation, Sub-Committee on Consumer Protection, Product Safety, and Data Security, https://www.c-span.org/video/?515042-1/whistleblower-frances-haugen-calls-congress-regulate-facebook.
8 See Wall Street Journal Staff, The Facebook Files, WSJ (2021), https://www.wsj.com/articles/the-facebook-files-11631713039?mod=bigtop-breadcrumb.
Instagram users are twice as likely to develop an eating disorder as those who do not use social media.

39. Meta is aware that teens often lack the ability to self-regulate. Meta is further aware that, despite the platforms’ adverse impact on teenage users’ well-being, the absence of impulse control often renders teens powerless to resist the platforms’ allure. Meta is conscious of the fact that the platform dramatically exacerbates bullying and other difficulties prevalent within the high school experience, as the reach of the same now affects users within the otherwise ideally safe confines of the home. The advent of social media largely occurred after today’s parents became adults, the consequence being a large swath of parents who lack the context needed to appreciate the contemporary perils of Meta and Instagram and who are likewise ill-equipped to offer advice sufficient to effectively mitigate against them.

40. The shift from chronological ranking to the algorithm modified the social networking environment in such a way that it created a new iteration of the Meta experience, one that is profoundly more negative, one that exploits some of the known psychological vulnerabilities of Facebook’s most susceptible patronage, to wit, juveniles, resulting in a markedly enlarged threat to the cohort’s mental health and the related frequency of suicidal ideation.

41. Meta professes to have implemented protective measures to counteract the well-established dangers of its sites’ customized, doggedly harmful content; however, its protocols apply only to content conveyed in English and remove only three-to-five percent of harmful content. Meta knows its quality-control and age-verification protocols are woefully ineffective but is either unwilling or incapable of properly managing its platforms.
This is consistent with its established pattern of recognizing and subsequently ignoring the needs of its underage users and its obligation to create a suitable environment accessible only by its age-appropriate users, all in the interest of reaping obscene profit.

42. Instead of providing warnings at sign-up or during use, Meta provides no warning at all. Rather, the most accessible and complete information regarding the mental and physical health risks of Meta’s platforms comes from third parties. Meta has a “Youth Portal” website that does not appear to be widely promoted by Meta or even recommended to teen users on its platforms.9 Although the website claims to be comprehensive in its coverage of safety information for the platforms, it fails to directly address any of the features or health risks listed above. The website states, “Welcome to our Youth Portal. Consider this your guide to all things Facebook: general tips, insider tricks, privacy and safety information, and everything else you need to have a great experience on Facebook. It’s also a space for you to hear from people your age, in their own voices, about the issues that matter to them online. Take a look around — these resources were made specifically for you, your friends, and your real-life experiences online and off.”10 The website merely provides instructional guides regarding mental health in general—it does not identify, warn, or take responsibility for the impact of the platform and its features on users’ mental health. Instead, it shifts blame to other factors, such as third parties posting “suicide challenges,” the general societal issue of substance abuse, and the COVID-19 pandemic.

43.
The only content on the website bearing even a semblance of a warning about the issues listed above is a link to a “Family Digital Wellness Guide” created by the Boston Children’s Hospital Digital Wellness Lab. Buried in this guide is a mention that screens should not be used an hour before bed, because “[u]sing screens before bedtime or naptime can excite kids and keep them from falling asleep. The ‘blue light’ that comes from TVs and other screen devices can disrupt your child’s natural sleep cycle, making it harder for them to fall asleep and wake up naturally. . . . [Late screen use can] result[] in your child getting less sleep and struggling to wake up on time. On average, school-age children need 9-12 hrs of sleep each night.”

44. The “Family Digital Wellness Guide” only alludes to the platforms’ manipulation, addictiveness, behavioral control, and data tracking of users: “Advertisers target children with lots of commercials, everything from sneakers and toys to unhealthy foods and snacks high in fat, sugar, and calories. Your children may also start becoming familiar with online influencers, who are also often paid to advertise different products and services on social media. Helping your child think critically about how advertising tries to change behaviors, helps your child understand the purpose of ads, and empowers them to make informed decisions.” The guide also briefly discusses cyberbullying.

9 Safety Center, Facebook, https://www.facebook.com/safety/youth (last visited Sept. 20, 2022).
10 Id.

45. The guide mentions the body-image harms social media inflicts but solely blames influencers as the cause rather than the platforms’ algorithms and features, and asserts that the burden to remedy the issue is on parents rather than social media companies.
“Science says: Tweens are often exposed to a lot of information online and through other media, both true and false, about how bodies ‘should’ look and what they can do to ‘improve’ their appearance. Certain body types are often idolized, when in reality bodies are incredibly diverse. There are many online accounts, websites, and influencers that make youth feel inadequate by encouraging them to lose weight or build up muscle, harming both their mental and physical health. . . . Protip: Actively listen and show that you care about how your child is feeling about puberty and how their body is changing. Talk with them about images on social and other media as these often set unrealistic ideals and help them understand that these images are often digitally altered or filtered so that people look more ‘beautiful’ than they really are.” No similar warning is offered to young users in Meta’s advertisements, at signup, or anywhere on the platform. Instead, Meta unreasonably and defectively leaves each user to rely on his or her own research to learn about the key dangers of its platforms.

46. This informational report is from a third party, not Meta. Meta merely links to this information on a “Youth Portal” website in a location that is difficult and time-consuming to find. The guide does not mention the strong role that Facebook’s and Instagram’s individual or collective algorithm(s) and features play in each of these harms. Furthermore, it is uncertain how long even this limited information has been made available by Meta.

47.
On another Meta-created website that purports to “help young people become empowered in a digital world,” its “Wellness” subpage lists five activities: “mindful breathing,” “finding support,” “building resilience: finding silver linings,” “a moment for me,” and “taking a break.”11 Nowhere does the website mention the mental health risks posed by Facebook and Instagram as a result of the product features listed above.

Defendant Snap Inc.

48. Snap is a Delaware corporation with its principal place of business in Santa Monica, California. Snap owns and operates the Snapchat social media platform, an application that is widely marketed by Snap and available to users throughout the United States. Snapchat is a platform for engaging in text, picture, and video communication, as well as for editing and disseminating content. The app contains a discovery page and a TikTok-like short video feed that algorithmically presents endless content to users. The primary objective of the platform is to maximize the frequency and length of each user’s viewing sessions. Indeed, 59 percent of teenagers in the U.S. actively use Snapchat,12 and 22 percent of parents in the U.S. know their children between the ages of 9 and 11 use Snapchat.13

49. Snapchat was founded in 2011 by Reggie Brown, Evan Spiegel, and Bobby Murphy, three Stanford college students. It began as a simple application designed to allow a user to send a picture to a friend that would later disappear. Having gained only 127 users a few months after its launch, Snapchat began to market to high school students. Within the following year, Snapchat grew to more than 100,000 users.

50. Snapchat became well-known for the ephemeral nature of its content, which, in effect, removes all accountability for sent content.
Specifically, Snapchat allows users to form groups and share posts or “Snaps” that disappear after being viewed by the recipients. However, Snapchat’s social media product quickly evolved from there, as its leadership made design changes and rapidly developed new product features intended to, and that successfully did, increase Snapchat’s popularity among minors.

11 Wellness, Meta, https://www.facebook.com/fbgetdigital/youth/wellness (last visited Sept. 20, 2022).
12 Vogels, et al., supra note 13.
13 Schaeffer, supra note 2.

51. In 2012, Snapchat added video capabilities to its product, pushing the number of Snaps to 50 million per day. It then added the “Stories” and “Chat” features in 2013; live video chat capabilities, text conversations, “Our Story,” Geofilters, and Snapcash in 2014; Discovery, QR code incorporation, and facial recognition software in 2015; and Memories and Snapchat Groups in 2016.

52. Upon knowledge, information, and belief, formed after a reasonable inquiry under the circumstances, by 2015, advertisements were pervasive on Snapchat, and by 2018, 99% of Snapchat’s total revenue came from advertising. Like Meta and Defendants in general, Snapchat decided to monetize its userbase and changed its product in ways that made it more harmful for users yet resulted in increased engagement and profits for Snapchat. By 2015, Snapchat had over 75 million active users and was the most popular social media application among American teenagers in terms of the number of users and time spent using the product. To further expand its userbase, Snapchat incorporates several product features that serve no purpose other than to create dependency on Snapchat’s social media product. These features, in turn, result in sleep deprivation, anxiety, depression, shame, interpersonal conflicts, and other serious mental and physical harms.
Snapchat knows, or should know, that its product is harmful to adolescents, but, as with Defendants in general, it consistently opts for increased profit at the expense of the well-being of its clientele. Defendants’ products are used by millions of children every day, children who have become addicted to these products because of their design and product features, to the point that parents cannot remove all access to the products without minor users adamantly protesting, often engaging in self-harm, threatening hunger strikes and/or suicide, and exhibiting other foreseeable consequences of withdrawal from these products, where such cessation would require professional intervention.

53. In addition to the types of features discussed above, Snapchat’s defective, addictive, harm-causing features include (1) Snapchat streaks, (2) limited availability content, (3) Trophies, (4) Snapscore, (5) Snapmap, (6) image filters, (7) Spotlight, (8) the general user interface, and (9) many other design features.

54. Snapchat streaks provide a reward to users based on how many consecutive days they communicate with another user. In other words, the longer two users are able to maintain a streak by exchanging a communication (a “snap”) at least once a day, the more rewarded the users are. The reward comes in the form of a cartoon emoji appearing next to the conversation within Snapchat’s interface. The longer the streak is maintained, the more exciting the emoji. Eventually, the emoji will change to a flame, and the number of days the streak has lasted will be positioned next to the flame. If the streak is about to end, the emoji changes to an hourglass to add pressure on users to maintain the streak and reengage with the platform:14

[Image not reproduced in this text version.]

14 Lizette Chapman, Inside the Mind of a Snapchat Streaker, Bloomberg (Jan. 30, 2017 at 5:00 AM CST), https://www.bloomberg.com/news/features/2017-01-30/inside-the-mind-of-a-snapchat-streaker?leadSource=uverify%20wall.

55. This feature hijacks teens’ craving for social success and connectedness and causes teen users to feel pressure to use Snapchat daily or suffer social consequences. As some academics and mental health treatment providers have described, streaks “provide a validation for the relationship. . . . Attention to your streaks each day is a way of saying ‘we’re OK.’ . . . The makers built into the app a system, so you have to check constantly or risk missing out,” said Nancy Colier, a psychotherapist and author of The Power of Off. “It taps into the primal fear of exclusion, of being out of the tribe and not able to survive.”15 For teens, streaks can become a metric for self-worth and popularity. By design, the user’s mental wellbeing becomes connected to their performance in Snap’s product.16 Some teenagers even provide their log-in information to others to maintain their streaks for them when they know they will not be able to do so for a time.

56. Time-limited content also creates pressure to use the platform daily. Users can post stories that will only be available for 24 hours. Many teens feel an obligation to view all their contacts’ stories each day before the content disappears.

57. Trophies are awarded to users based on actions performed in the app, such as reaching streaks of certain milestone lengths or using different portions of the app. Each trophy is a unique badge to display on a user’s profile.

58. All users receive a “Snapscore” based on their total number of snaps sent and received. Users can see the scores of friends, causin