106 | AI Ethics and Security with Elizabeth Goede (Part 2)
Are you feeding your AI tools private info you’d never hand to a stranger?
If you’re dropping sensitive data into ChatGPT, Canva, or Notion without blinking, this episode is your wake-up call. In Part 2 of our eye-opening conversation with AI ethics strategist Elizabeth Goede, we delve into the practical aspects of AI use and how to safeguard your business, clients, and future.
This one isn’t about fear. It’s about founder-level responsibility and smart decision-making in a world where the tools are evolving faster than most policies.
Grab your ticket to the AI in Action Conference — March 19–20, 2026 in Grand Rapids, MI. You’ll get two days of hands-on AI application with 12 done-with-you business tools. This isn’t theory. It’s transformation.
In This Episode, You'll Learn:
- Why founders must have an AI policy (yes, even solopreneurs)
- The #1 AI tool Elizabeth would never trust with sensitive data
- How to vet the tools you already use (based on their founders, not just features)
- What "locking down your data" actually looks like
- A surprising leadership insight AI will reveal about your team
Resources & Links:
- AI in Action Conference – Registration
- Follow Elizabeth Goede on socials (LinkedIn, Instagram)
Related episode:
- Episode 104 | AI Ethics and Security (Part 1) with Elizabeth Goede
Want to increase revenue and impact? Listen to “She's That Founder” for insights on business strategy and female leadership to scale your business. Each episode offers advice on effective communication, team building, and management. Learn to master routines and systems to boost productivity and prevent burnout. Our delegation tips and business consulting will advance your executive leadership skills and presence.