In a Los Angeles courtroom this week, former Meta vice president Brian Boland told jurors under oath that the company repeatedly put growth and engagement ahead of safety, confirming what millions of worried parents have long suspected: Big Tech values ad dollars more than children. Boland’s blunt admission — that the platform’s culture prioritized numbers and power over protecting young users — is a seismic moment in this landmark case.
Mark Zuckerberg answered questions from the stand, offering the same corporate defenses we’ve heard before while minimizing the company’s responsibility and insisting that age verification is “very difficult” to enforce. His performance felt less like testimony and more like damage control for a business model built on harvesting attention. Americans deserve straight answers, not Silicon Valley spin.
The documents and depositions unsealed during discovery make the motives plain: internal research showed harm to teens and strategies that targeted young users, with staffers even likening Instagram to a drug. Those memos and messages aren’t abstract academic debates — they are evidence that executives tolerated and even engineered addictive pathways for the sake of engagement. The family plaintiffs are not crying wolf; the paper trail is damning.
Boland also testified that safety teams were siloed and that the company chose to manage crises through press cycles rather than fix dangerous product features, revealing a corporate playbook that put reputation over remedy. When a company rewards engineers for growth metrics instead of child safety, you end up with platforms that amplify risk instead of reducing it. That recklessness has consequences for real families and for public trust.
This lawsuit is more than politics; it’s a potential turning point for accountability in the digital age, and conservatives who value family, safety, and honest markets should welcome a court that forces transparency on entrenched power. If juries and regulators begin to enforce consequences for companies that monetize addiction, we restore a measure of responsibility that free markets require to function fairly.
Americans must insist that parents—not algorithms—control what children see, and lawmakers should craft rules that punish predatory design while preserving legitimate innovation. The verdict in this case will tell us whether Silicon Valley must finally answer for the social harm it creates, or whether it can keep profiting from our kids with impunity. It’s past time for common-sense accountability that protects families and defends the next generation.