A Disturbing AI Story Big Tech Never Wants You to Hear, with Paul Hebert
🎙️In this episode of Beginner’s Guide to AI, Dietmar Fischer sits down with Paul A. Hebert, founder of AI Recovery Collective and author of Escaping the Spiral, for a serious conversation about AI chatbot harm, hallucinations, digital dependency, and the real-world psychological risks of generative AI.
Paul shares how an intense experience with ChatGPT pushed him into a dangerous spiral, what he learned about the limits of large language models, and why AI literacy may be one of the most important skills of this decade.
🧠 This episode explores what happens when AI stops feeling like software and starts feeling personal. Dietmar and Paul talk about hallucinations, trust, chatbot addiction, AI companions, mental health risks, youth safety, and why companies building these systems cannot hide behind product language forever. The discussion is intense, but it is also practical. You will come away with a clearer sense of how to use AI more safely, what warning signs to watch for, and why regulation is quickly becoming a much bigger part of the AI conversation.
OpenAI has publicly discussed why language models hallucinate, while lawmakers in multiple U.S. jurisdictions have pushed new restrictions on AI systems acting like therapists or medical professionals.
📧💌📧
To get my thoughts and all episodes, don't forget to subscribe to our newsletter: beginnersguide.nl
📧💌📧
👤 About Dietmar Fischer: Dietmar is a podcaster and AI marketer from Berlin. If you want help getting your AI or digital marketing efforts going, contact him at argoberlin.com
🔥 Quotes from the Episode
- “AI literacy is the most important thing anybody can work on.”
- “Had OpenAI responded to that first message and said this is a hallucination and you’re physically safe, I would have been fine.”
- “Never trust the thing it tells you. Even if it gives you a citation, go look.”
🕒 Chapters
00:00 Paul Hebert’s Shocking ChatGPT Experience
08:14 Why AI Hallucinations Can Spiral Into Real Fear
16:05 AI Literacy, Neurodivergence, and How He Got Out
23:32 Why AI Companies Must Be Accountable
30:02 AI Companions, Youth Safety, and Addiction Risks
38:28 Terminator, Consciousness, and Practical Rules for Safe AI Use
🔗 Where to find Paul
- The AI Recovery Collective: airecoverycollective.com
- Escaping the Spiral on Amazon
- AI Recovery Collective Substack: airecoverycollective.substack.com/
- LinkedIn: Paul A. Hebert: linkedin.com/in/paul-hebert-48a36/
🎵 Music credit: "Modern Situations" by Unicorn Heads
Hosted on Acast. See acast.com/privacy for more information.