“Every AI Model Failed”: Sean Dadashi on Mental Health, Suicide Risk & the Future of Safe AI Podcast Por  arte de portada

“Every AI Model Failed”: Sean Dadashi on Mental Health, Suicide Risk & the Future of Safe AI

“Every AI Model Failed”: Sean Dadashi on Mental Health, Suicide Risk & the Future of Safe AI

Can AI replace therapy, or is it putting our mental health at risk? This week on The AI Download, we're diving into one of the most sensitive frontiers in technology: AI and mental health.

Shira Lazar is joined by Sean Dadashi, co-founder of Rosebud, an interactive AI-powered journaling app designed to help you build self-awareness, track emotional patterns, and become your best self. But this isn't just another feel-good AI tool: Rosebud is setting a new standard for ethics, safety, and intention in the wellness-tech space. What started as a personal passion project born from Sean's own healing journey is now a fast-growing platform with serious backing, including from Reddit co-founder Alexis Ohanian, and a user base seeking a better way to integrate tech into their mental health routines.

Shira and Sean get into the deep stuff:
→ How Rosebud's memory model works differently from ChatGPT's or Claude's
→ Why AI can't (and shouldn't) replace therapy but can be a helpful coach
→ What happens when someone journals about self-harm
→ What every major AI model got wrong when tested for crisis response

Yes, every model failed. That's why Sean and his team built the CARE Benchmark, a new open-source framework for testing how AI models respond to suicide risk, psychosis, and isolation. Spoiler alert: even the best models today still get it dangerously wrong.

They also talk privacy vs. intervention, addiction vs. support, and how Rosebud is intentionally limiting user engagement (even if it costs them revenue) in the name of long-term well-being. This is one of the most real conversations we've had yet about both the human cost and the potential of AI in our most vulnerable moments.

What You'll Learn:
• Why "pattern recognition" is Rosebud's superpower
• The dark side of agreeable AI: psychosis, bias, and feedback loops
• What happened in the case of Adam Raine, and how it changed everything
• How Sean is pushing for a third-party standard to test AI model safety
• The difference between coaching, therapy, and ethical AI boundaries
• What makes Rosebud different, from memory to usage caps to bedtime check-ins
• What's coming next: biofeedback integrations, CARE 2.0, and year-end Wrapped reports

About the Guest:
Sean Dadashi is the CEO and co-founder of Rosebud, a purpose-built AI journaling app focused on helping users develop self-awareness and emotional clarity. His background spans cognitive science, tech innovation, and a lifelong passion for meditation, psychology, and human potential. Prior to Rosebud, Sean worked across product development and personal growth spaces and now leads the company's mission to responsibly integrate AI with mental wellness.

Follow & Subscribe
📲 Download Rosebud: https://www.rosebud.app
💬 Connect with Sean on LinkedIn: https://www.linkedin.com/in/seandadashi
🧠 Learn more about the CARE Benchmark: https://www.rosebud.app/care
🎧 Subscribe for more conversations at the intersection of AI, ethics, and humanity.
📩 Stay in the loop: Subscribe to Shira's newsletter on Beehiiv → https://shiras-newsletter.beehiiv.com

Follow Shira: x.com/shiralazar | instagram.com/shiralazar | tiktok.com/@shiralazar | linkedin.com/in/shiralazar | youtube.com/shiralazar

🎬 Visit mussomedia.com for storytelling that connects.

Sponsors:
👉 HiveLighter – Your AI reading assistant for smart, personalized summaries. Explore our curated AI Download Collection: https://www.hivelighter.ai

Credits
This episode of The AI Download was hosted, created, and executive produced by Shira Lazar.
Executive Producer: Michele Musso. Creative Director: Nadia Giosia with Mint Labs. Music by PALA, Catalina Coastline (licensed under Boss Soundstripe Productions by BMI). Produced by Musso Media. © 2025 Shira Lazar. All rights reserved.