The AI Download Podcast by Shira Lazar

The AI Download

By: Shira Lazar
Listen for free

AI News with a Human Touch

AI is transforming how we work, create, and connect—but what does that mean for the future of content, business, and creativity? Hosted by Emmy-nominated digital pioneer Shira Lazar, The AI Download is your go-to source for sharp insights, expert interviews, and real-world discussions on AI, emerging tech, and the creator economy. Each week, we break down the biggest AI trends, explore their impact on creativity and business, and bring you candid conversations with industry leaders, innovators, and disruptors shaping the future. Whether you're a creator, entrepreneur, or just AI-curious, this podcast deciphers what's next—and why it matters.

🔹 New episodes every week
🔹 Featuring top experts & thought leaders
🔹 Where AI meets culture, creativity, and business

Stay ahead of the curve!

The AI Download is hosted, created, and executive produced by Shira Lazar. Executive production by Michele Musso, with video and audio editing by the Musso Media team. Creative Design Director Nadia Giosia with Mint Labs. Music by PALA, Catalina Coastline (licensed under Boss Soundstripe Productions by BMI). Produced by Musso Media. © 2025 Musso Media. © 2025 Unicorn Enterprises, Inc. All rights reserved.
Episodes
  • Work Slop, AI Parasites & Why Your Creative Instinct Still Matters (with Nate Jones)
    Oct 2 2025


    AI isn’t replacing your job—it’s flooding it with noise. Shira Lazar and Nate Jones unpack how to protect your creativity in the AI era.


    This week on The AI Download, host Shira Lazar welcomes back creator and “AI explain-it-all” king Nate Jones for a high-speed, high-depth dive into AI’s cultural side effects: from “work slop” and productivity theater to chatbot obsessions and TikTok gurus poisoning your prompts.


    They unpack the rising tide of AI-generated noise—messy docs, shallow decks, pseudo-polished proposals—and why the real threat isn’t automation but creative laziness. They explore how tools like Claude and Perplexity are reshaping research and communication, and why taste, intent, and self-trust may be the ultimate AI-proof skills.


    Plus: Spotify’s crackdown on low-quality AI tracks, Nvidia’s $100B chip bet on OpenAI, and why Nate’s inbox has become a surreal fever dream of humans confessing their love to LLMs.

    Why This Matters:


    Everyone’s using AI—but not everyone’s using it well.
    As generative tools go mainstream, we risk drowning in “good enough” content that’s efficient but empty. This episode is a call to reclaim intent, sharpen instincts, and build work worth something. Whether you’re a founder, strategist, or solo creator, this is your roadmap to staying human in an automated world.


    What We Cover:

    • “Work Slop” Defined – How AI-generated fluff clogs workflows and kills clarity

    • Prompt Like You Mean It – Why vague intent = bad output

    • Beware the Parasites – Culty AI talk tracks spreading like memes

    • Spotify vs. the Slop – Why platforms are purging millions of AI tracks

    • Nvidia x OpenAI – Inside the $100B chip deal shaping AI’s future

    • Defend Your Flow – Protecting creativity from automation fatigue

    • Delete the Dashes – The weird quirks of AI and keeping your voice

    • Claude FTW – How Claude 4.1 is revolutionizing Excel & slides

    • Gut Check Nation – Why instinct is your ultimate AI-proof skill

    • Aim With Purpose – What “responsible AI” really means


    Key Takeaways:

    • Don’t just automate—edit.

    • AI is your intern, not your identity.

    • “Slop” happens when you stop caring—set quality gates.

    • Taste and trust are your competitive edge.

    • Not every Reddit prompt deserves your time—think before you paste.

    • Polished ≠ good.

    • Your voice, vibe, and vision remain irreplaceable.


    🎙 Guest: Nate Jones
    https://www.natebjones.com/


    Creator of the “AI For Work” Substack https://natesnewsletter.substack.com/
    TikTok/YouTube: @NateJones
    Known for making AI make sense (and calling out the BS)


    📩 Stay in the loop:
    Subscribe to Shira’s newsletter → Shira’s Newsletter on Beehiiv https://shiras-newsletter.beehiiv.com

    Follow Shira:
    x.com/shiralazar
    instagram.com/shiralazar
    tiktok.com/@shiralazar
    linkedin.com/in/shiralazar
    youtube.com/shiralazar


    🎬Visit mussomedia.com for storytelling that connects.

    SPONSORS: 👉 HiveLighter – Your AI reading assistant for smart, personalized summaries. Explore our curated AI Download Collection: https://www.hivelighter.ai



    Credits

    This episode of The AI Download was hosted, created, and executive produced by Shira Lazar. Executive Producer Michele Musso, Creative Director Nadia Giosia with Mint Labs. Music by PALA, Catalina Coastline (licensed under Boss Soundstripe Productions by BMI). Produced by Musso Media. © 2025 Shira Lazar. All rights reserved.


    29 m
  • Build Your Own GPT: The Playbook to Scale Creativity—Without Losing Your Voice
    Sep 25 2025


    This week on The AI Download, host Shira Lazar and Jim Marsh pull back the curtain on the SAIL Framework—a step-by-step system to build your own custom GPT. Learn how Shira designed a GPT for What’s Trending, and how you can create one that scales your creativity without losing your voice.

    Jim Marsh is a former HBO executive and founder of JMC Strategic Intelligence, who helps leaders design practical AI systems that actually enhance creativity, strategy, and content. Together, they demo the framework, break down how it works day-to-day, and share a free worksheet so you can start building your own GPT.

    Grab your free SAIL Worksheet here → shiralazar.com/SAIL


    📌 Connect with Jim Marsh
    🌐 Website: jmcstrategic.com
    🔗 LinkedIn: linkedin.com/in/jimmarsh
    🐦 Twitter/X: @JimMarshAI

    📩 Stay in the loop:
    Subscribe to Shira’s newsletter → Shira’s Newsletter on Beehiiv https://shiras-newsletter.beehiiv.com


    Follow Shira:
    x.com/shiralazar
    instagram.com/shiralazar
    tiktok.com/@shiralazar
    linkedin.com/in/shiralazar
    youtube.com/shiralazar


    🎬Visit mussomedia.com for storytelling that connects.


    SPONSORS: 👉 HiveLighter – Your AI reading assistant for smart, personalized summaries. Explore our curated AI Download Collection: https://www.hivelighter.ai



    Credits

    This episode of The AI Download was hosted, created, and executive produced by Shira Lazar. Executive Producer Michele Musso, Creative Director Nadia Giosia with Mint Labs. Music by PALA, Catalina Coastline (licensed under Boss Soundstripe Productions by BMI). Produced by Musso Media. © 2025 Shira Lazar. All rights reserved.



    46 m
  • “Every AI Model Failed”: Sean Dadashi on Mental Health, Suicide Risk & the Future of Safe AI
    Sep 18 2025
    Can AI replace therapy—or is it putting our mental health at risk? This week on The AI Download, we're diving into one of the most sensitive frontiers in technology: AI and mental health.

    Shira Lazar is joined by Sean Dadashi, co-founder of Rosebud, an interactive AI-powered journaling app designed to help you build self-awareness, track emotional patterns, and become your best self. But this isn't just another feel-good AI tool—Rosebud is setting a new standard for ethics, safety, and intention in the wellness-tech space.

    What started as a personal passion project born from Sean's own healing journey is now a fast-growing platform with serious backing, including from Reddit co-founder Alexis Ohanian, and a user base seeking a better way to integrate tech into their mental health routines.

    Shira and Sean get into the deep stuff:

    → How Rosebud's memory model works differently than ChatGPT or Claude

    → Why AI can't (and shouldn't) replace therapy but can be a helpful coach

    → What happens when someone journals about self-harm?

    → And what every major AI model got wrong when tested for crisis response

    Yes, every model failed. That's why Sean and his team built the CARE Benchmark, a new, open-source framework to test how AI models respond to suicide risk, psychosis, and isolation. Spoiler alert: even the best models today still get it dangerously wrong.

    They also talk privacy vs. intervention, addiction vs. support, and how Rosebud is intentionally limiting user engagement (even if it costs them revenue) in the name of long-term well-being. This is one of the most real conversations we've had yet about the human cost—and potential—of AI in our most vulnerable moments.

    What You'll Learn:

    • Why "pattern recognition" is Rosebud's superpower

    • The dark side of agreeable AI: psychosis, bias, and feedback loops

    • What happened in the case of Adam Raine—and how it changed everything

    • How Sean is pushing for a third-party standard to test AI model safety

    • The difference between coaching, therapy, and ethical AI boundaries

    • What makes Rosebud different—from memory to usage caps to bedtime check-ins

    • What's coming next: biofeedback integrations, CARE 2.0, and year-end Wrapped reports

    About the Guest:

    Sean Dadashi is the CEO and co-founder of Rosebud, a purpose-built AI journaling app focused on helping users develop self-awareness and emotional clarity. His background spans cognitive science, tech innovation, and a lifelong passion for meditation, psychology, and human potential. Prior to Rosebud, Sean worked across product development and personal growth spaces and now leads the company's mission to responsibly integrate AI with mental wellness.

    Follow & Subscribe:
    📲 Download Rosebud: https://www.rosebud.app
    💬 Connect with Sean on LinkedIn: https://www.linkedin.com/in/seandadashi
    🧠 Learn more about the CARE Benchmark: https://www.rosebud.app/care
    🎧 Subscribe for more conversations at the intersection of AI, ethics, and humanity.

    📩 Stay in the loop:
    Subscribe to Shira's newsletter → Shira's Newsletter on Beehiiv https://shiras-newsletter.beehiiv.com

    Follow Shira:
    x.com/shiralazar
    instagram.com/shiralazar
    tiktok.com/@shiralazar
    linkedin.com/in/shiralazar
    youtube.com/shiralazar

    🎬Visit mussomedia.com for storytelling that connects.

    SPONSORS: 👉 HiveLighter – Your AI reading assistant for smart, personalized summaries. Explore our curated AI Download Collection: https://www.hivelighter.ai

    Credits

    This episode of The AI Download was hosted, created, and executive produced by Shira Lazar. Executive Producer Michele Musso, Creative Director Nadia Giosia with Mint Labs. Music by PALA, Catalina Coastline (licensed under Boss Soundstripe Productions by BMI). Produced by Musso Media. © 2025 Shira Lazar. All rights reserved.
    43 m
No reviews yet