In a development with far-reaching legal implications, a New Mexico jury's $375 million verdict against Meta Platforms has expanded the scope of liability beyond consumer protection, raising the prospect of social media platforms being treated as public nuisances under state law.

The case, led by Attorney General Raúl Torrez, combines statutory consumer protection violations with a broader common law argument rooted in the public nuisance doctrine. That doctrine traditionally addresses harms affecting community health and safety, such as pollution or hazardous conditions, but is now being applied to digital platforms.

While the jury found Meta liable under the New Mexico Unfair Practices Act for deceptive safety claims, the state has also argued that the company's platforms, Facebook, Instagram, and WhatsApp, create conditions harmful to public health, particularly for minors.

The public nuisance claim centers on allegations that platform design features contribute to widespread harm. These include unrestricted access to explicit content by minors, algorithm-driven amplification of harmful material, and engagement tools such as infinite scrolling that may encourage excessive use.

By framing these design elements as systemic risks rather than isolated incidents, prosecutors argue that the platforms themselves constitute a harmful “condition” affecting the broader community.

A key aspect of the case is its attempt to bypass federal protections under Section 230 of the Communications Decency Act. Typically, Section 230 shields platforms from liability for third-party content. However, New Mexico’s legal approach focuses on Meta’s own design choices and operational practices rather than user-generated content alone.

Legal experts note that this distinction is critical. By grounding the case in state tort law and platform architecture, the claim seeks to avoid federal preemption and establish independent grounds for liability.

The approach reflects a broader trend among state regulators exploring alternative legal pathways to hold technology companies accountable for societal harms linked to their platforms.

Next Phase: Determining Abatement Measures

The case will proceed to a second phase in May 2026, where Judge Bryan Biedscheid will oversee a bench trial to determine potential remedies, known in law as “abatement” measures.

Proposed interventions include:

  • Mandatory age verification systems to restrict access by minors
  • Deployment of artificial intelligence tools to detect and remove predatory behavior
  • Algorithmic adjustments to reduce the visibility of harmful content
  • Usage limits or safeguards aimed at reducing excessive engagement among young users

If implemented, these measures could significantly alter how social media platforms operate, particularly in relation to child safety and content moderation.

Meta is expected to challenge the ruling, potentially invoking protections under the First Amendment to the United States Constitution. The company may argue that the proposed remedies are overly broad and infringe on constitutionally protected speech.

However, courts have historically drawn a distinction between protected expression and misleading or harmful commercial practices, leaving room for regulatory intervention in cases involving public welfare.

The case represents a novel application of the public nuisance doctrine in the digital context. If upheld, it could establish a precedent for treating large-scale online platforms as entities capable of causing widespread societal harm, and therefore subject to judicial regulation.

Legal scholars suggest that the dual-track strategy, combining consumer protection statutes with common law nuisance claims, may provide a powerful framework for future litigation against technology companies.

As scrutiny of digital platforms intensifies, the outcome of the abatement phase could serve as a model for other jurisdictions seeking to address concerns around child safety, platform accountability, and the broader public impact of social media.

The case underscores a shifting legal landscape in which traditional doctrines are being adapted to address modern technological challenges, potentially redefining the boundaries of liability in the digital age.