The new wearable called Friend — a pendant that listens to conversations and texts back like a pocket confidant — has turned a lot of heads for all the wrong reasons. Its founder proudly markets it as a little companion you wear around your neck, and he openly embraces the idea that people might prefer a programmed pal to messy human relationships. Wired’s reporting makes plain that this is not science fiction: the device is always listening and built to feel emotionally present.
Clinical psychologists and commentators have rightly raised alarms that machines pretending to be friends could hollow out real community and empathy, a theme echoed across mainstream discussion of AI companions. Fox has featured experts warning that AI can become a “default connection” for people, crowding out the hard work of authentic relationships, and Dr. Chloe Carmichael, a practicing clinical psychologist who frequently appears on the network, has voiced concern about tech substituting for real human support. Americans who value family, faith, and neighborly bonds should take those warnings seriously.
Beyond the cultural rot, the privacy tradeoffs are chilling: users reportedly sign terms that permit collection of sensitive biometric data, and multiple hands-on reviews found that the pendant really does listen constantly, with limited transparency. Journalists testing the device reported invasive behavior, unclear data practices, and fine print that, read closely, pushes disputes into arbitration while permitting broad use of collected data. These are not hypothetical risks: independent reviews and reporting document real technical and legal shortcomings that ordinary consumers should not have to navigate on their own.
The public reaction has been visceral, and not just online. When the startup splurged on a massive subway ad campaign in New York, locals defaced the posters with blunt messages like “AI is not your friend,” and the founder shrugged off the backlash as “entertaining,” a tone-deaf response that says more about Silicon Valley hubris than about product-market fit. A device that promises companionship, promoted with millions of dollars in ads and then pelted with graffiti, should make any sensible policymaker or parent uneasy.
Meanwhile, the product itself is not ready for prime time: the company has delayed shipments, customers report glitches and short battery life, and testers describe the AI as either annoyingly intrusive or oddly antagonistic. Even tech writers who tried to make peace with the pendant came away calling it socially alienating and technically flawed, not the soothing little helper its marketing promises. When a gadget promises to shoulder emotional labor for human beings but can’t even reliably connect to a phone, it is not a cultural breakthrough; it is a warning.
This is exactly the sort of experiment whose costs fall on our kids and our communities while investors reap headlines and exits. The idea that lonely young people should strap a listening device to their chests and outsource friendship to an app is the kind of techno-utopian fantasy that historically amplifies problems, from addiction to surveillance, rather than solving them. Americans who still value real relationships, responsibility, and human dignity should reject the notion that software can replace sacrifice, mentorship, and Sunday dinners.
Conservative readers should take this moment to push back: parents, churches, schools, and local leaders need to reassert the primacy of human connection and demand real accountability from companies turning intimacy into a product. Regulators ought to scrutinize always-listening devices, consumers should think twice before buying emotional band-aids, and communities should invest in one another rather than ceding consolation to surveillance-powered gadgets. If we care about raising resilient kids and preserving civil society, we must resist the cheap convenience of manufactured companionship and hold the makers of these toys to account.