TLDR
- Meta drops 2.6% as New Mexico launches landmark child-safety trial
- High-stakes lawsuit puts Meta’s platform design and safety controls on trial
- Jury case challenges Meta’s handling of child safety and online protection
- Meta braces for legal battle as scrutiny of youth safety intensifies
- Trial tests Meta’s safeguards as regulators push for platform accountability
Meta Platforms (META) shares fell during the session as the company prepared for a high-profile child-safety trial in New Mexico. The stock closed at $718.84, down 2.64%, after steady intraday selling pressure. The court case now moves toward jury selection, and attention shifts to how the legal battle may influence platform oversight.
New Mexico Case Moves Toward Trial
New Mexico will open a landmark trial challenging Meta’s handling of child safety on its core platforms. The state alleges the company allowed harmful conduct to persist and enabled predators to reach underage users. Moreover, the complaint argues that Meta created conditions that encouraged harmful engagement patterns among children.
The legal action stems from an undercover operation run by the New Mexico Attorney General in 2023. Officials created accounts posing as children and documented unsolicited explicit messages and adult contact. Furthermore, the probe led to criminal charges that helped shape the state’s broader claims.
The trial is expected to run for seven to eight weeks, and the court will review multiple sets of internal evidence. The state seeks financial penalties along with mandated product-level reforms. Additionally, the case marks the first time such allegations against Meta have reached a jury.
Platform Design and Safety Features Under Scrutiny
The lawsuit asserts that Meta promoted features that kept minors engaged for extended periods despite known risks. The filing says infinite scroll and auto-play tools reinforced addictive tendencies that harmed mental health. The state argues that Meta ignored early warnings and withheld necessary design changes.
Internal documents cited in the complaint indicate awareness of harmful activity at various levels of the company. These documents reference long-standing concerns involving sexual exploitation and youth well-being. The state claims Meta failed to adopt age-verification tools or clear safeguards.
Meta denies wrongdoing and points to its safety policies and cooperation with law enforcement. The company maintains it deploys numerous systems to detect harmful content on its platforms. It argues the claims rely on selective documents and misinterpret standard product-design practices.
Legal Defenses and Broader Industry Context
Meta plans to defend the case using constitutional and statutory protections for online platforms. The company argues that the claims cannot be separated from user-generated content and platform-level algorithms. It cites Section 230 and First Amendment principles as core defenses.
The trial arrives as concerns grow over youth safety across digital platforms. Additional lawsuits challenge design choices that allegedly contributed to rising mental-health issues among young users. Some cases seek sweeping financial damages and demand design reforms across the sector.
Another trial involving similar allegations began this week in California and includes Meta and Google. Other platforms reached settlements, which narrowed the list of active defendants. The New Mexico case adds pressure as courts evaluate long-running safety debates.