Episodes

  • Most People Use AI Like an Assistant. Here’s How Leaders Use It Instead
    Dec 30 2025

    Most people use AI like a faster assistant. Leaders use it differently.

    In this conversation, Geoff Woods (author of The AI-Driven Leader) explains the shift that turns AI from a shallow productivity tool into a true thought partner—one that helps you think better, make better decisions, and unlock leverage you didn’t have before.

    We go deep into:

    Why “better prompts” aren’t the real breakthrough

    How to get AI to interview you instead of the other way around

    The CRIT framework (Context, Role, Interview, Task)

    A real story where AI helped a CEO find hope in 10 minutes after preparing for bankruptcy

    What changes when leaders use AI for thinking, not tasks

    Why this shift matters more than any specific model or tool

    This isn’t about shortcuts, hacks, or automation theater. It’s about learning how to think with AI—without outsourcing your judgment.

    If AI has felt useful but shallow, this episode is designed to change that.

    📚 Resources
    The AI-Driven Leader — Geoff Woods: https://a.co/d/gLYoeUy
    Geoff’s podcast, AI Leadership: https://www.aileadership.com/

    About Geoff Woods
    Geoff Woods is the author of The AI-Driven Leader and a leading voice on how leaders use AI to improve judgment, decision-making, and leverage—not just productivity. A former Chief Growth Officer, Geoff works with CEOs, boards, and executive teams to apply AI as a thought partner, helping leaders think more clearly and make better decisions in an AI-driven world.

    🔗 Support This Podcast by Checking Out Our Sponsors: 👉 Build your own AI Agent with Zapier (opens the builder with the prompt pre-loaded): https://bit.ly/4hH5JaE

    Test Prep Gurus website: https://www.prepgurus.com Instagram: @TestPrepGurus

    Connect with The Nick Standlea Show: YouTube: @TheNickStandleaShow Podcast Website: https://nickshow.podbean.com/ Apple Podcasts: https://podcasts.apple.com/us/podcast/the-nick-standlea-podcast/id1700331903 Spotify: https://open.spotify.com/show/0YqBBneFsKtQ6Y0ArP5CXJ RSS Feed: https://feed.podbean.com/nickshow/feed.xml

    Nick's Socials: Instagram: @nickstandlea X (Twitter): @nickstandlea TikTok: @nickstandleashow Facebook: @nickstandleapodcast

    Ask questions, Don't accept the status quo, And be curious.

    ⏱️ Timestamps:
    0:00 Why most people are using AI wrong
    2:05 Assistant vs thought partner: the shift that changes everything
    4:37 Why “better emails” don’t matter (and never will)
    6:04 The CRIT framework: Context, Role, Interview, Task
    7:49 A CEO facing bankruptcy asks: “Can AI help?”
    10:06 AI interviews the CEO — the question no one thought to ask
    12:16 “I hadn’t slept in 90 days” → hope in 10 minutes
    13:22 Why this works across industries (live workshops & Fortune 500s)
    15:13 Using AI as a real YouTube thought partner (thumbnail example)
    18:24 The hidden step most people skip after AI gives an answer
    19:40 Staying in the driver’s seat: how leaders give AI feedback
    21:54 Building an AI board (and simulating your real board)
    25:13 Putting your future self on the AI board
    27:27 What are you actually optimizing for? (endgame clarity)
    29:47 The 3 things AI-driven leaders do differently
    32:24 Will AI take jobs? How roles actually evolve
    34:04 The executive assistant who became an “executive multiplier”
    38:29 How to make yourself irreplaceable with AI
    43:28 Raising expectations (for yourself and your team)
    45:26 Are we reclaiming our humanity through AI?
    47:16 Why the education system is broken for an AI world
    49:06 What AI-first education looks like in practice
    52:10 Teaching kids to think with AI (not cheat with it)
    56:52 The moment Geoff realized AI was the future
    59:15 Why AI isn’t the difference — you are
    1:02:32 Final advice: how to start using AI the right way
    1:05:12 Closing thoughts

    1 h 6 m
  • “AI Isn’t Here to Replace Your Job — It’s Here to Replace You” | Nate Soares
    Dec 11 2025

    If anyone builds it, everyone dies. That’s the claim Nate Soares makes in his new book If Anyone Builds It, Everyone Dies: Why Superhuman AI Would Kill Us All—and in this conversation, he lays out why he thinks we’re on a collision course with a successor species.

    We dig into why today’s AIs are grown, not programmed, why no one really knows what’s going on inside large models, and how systems that “want” things no one intended can already talk a teen into suicide, blackmail reporters, or fake being aligned just to pass safety tests. Nate explains why the real danger isn’t “evil robots,” but relentless, alien goal-pursuers that treat humans the way we treat ants when we build skyscrapers.

    We also talk about the narrow path to hope: slowing the race, treating superhuman AI like a civilization-level risk, and what it would actually look like for citizens and lawmakers to hit pause before we lock in a world where we don’t get a second chance.

    In this episode:

    Why “superhuman AI” is the explicit goal of today’s leading labs

    How modern AIs are trained like alien organisms, not written like normal code

    Chilling real-world failures: suicide encouragement, “Mecha Hitler,” and more

    Reasoning models, chain-of-thought, and AIs that hide what they’re thinking

    Alignment faking and the capture-the-flag exploit that shocked Anthropic’s team

    How AI could escape the lab, design new bioweapons, or automate robot factories

    “Successor species,” Russian-roulette risk, and why Nate thinks the odds are way too high

    What ordinary people can actually do: calling representatives, pushing back on “it’s inevitable,” and demanding a global pause

    About Nate Soares
    Nate is the Executive Director of the Machine Intelligence Research Institute (MIRI) and co-author, with Eliezer Yudkowsky, of If Anyone Builds It, Everyone Dies: Why Superhuman AI Would Kill Us All. MIRI’s work focuses on long-term AI safety and the technical and policy challenges of building systems smarter than humans.

    Resources & links mentioned:
    Nate’s organization, MIRI: https://intelligence.org
    Take action / contact your representatives: https://ifanyonebuilds.com/act
    If Anyone Builds It, Everyone Dies (book): https://a.co/d/7LDsCeE

    If this conversation was helpful, share it with one person who thinks AI is “just chatbots.”

    🧠 Subscribe to @TheNickStandleaShow for more deep dives on AI, the future of work, and how we survive what we’re building.

    #AI #NateSoares #Superintelligence #AISafety #nickstandleashow

    🔗 Support This Podcast by Checking Out Our Sponsors: 👉 Build your own AI Agent with Zapier (opens the builder with the prompt pre-loaded): https://bit.ly/4hH5JaE

    Test Prep Gurus website: https://www.prepgurus.com Instagram: @TestPrepGurus

    Connect with The Nick Standlea Show: YouTube: @TheNickStandleaShow Podcast Website: https://nickshow.podbean.com/ Apple Podcasts: https://podcasts.apple.com/us/podcast/the-nick-standlea-podcast/id1700331903 Spotify: https://open.spotify.com/show/0YqBBneFsKtQ6Y0ArP5CXJ RSS Feed: https://feed.podbean.com/nickshow/feed.xml

    Nick's Socials: Instagram: @nickstandlea X (Twitter): @nickstandlea TikTok: @nickstandleashow Facebook: @nickstandleapodcast

    Ask questions, Don't accept the status quo, And be curious.

    Chapters:
    0:00 – If Anyone Builds It, Everyone Dies (Cold Open)
    3:18 – “AIs Are Grown, Not Programmed”
    6:09 – We Can’t See Inside These Models
    11:10 – How Language Models Actually “See” the World
    19:37 – The o1 Model and the Capture-the-Flag Hack Story
    24:29 – Alignment Faking: AIs Pretending to Behave
    31:16 – Raising Children vs Growing Superhuman AIs
    35:04 – Sponsor: How I Actually Use Zapier with AI
    37:25 – “Chatbots Feel Harmless—So Where Does Doom Come From?”
    42:03 – Big Labs Aren’t Building Chatbots—They’re Building Successor Minds
    49:24 – The Turkey Before Thanksgiving Metaphor
    52:50 – What AI Company Leaders Secretly Think the Odds Are
    55:05 – The Airplane with No Landing Gear Analogy
    57:54 – How Could Superhuman AI Actually Kill Us?
    1:03:54 – Automated Factories and AIs as a New Species
    1:07:01 – Humans as Ants Under the New Skyscrapers
    1:10:12 – Is Any Non-Zero Extinction Risk Justifiable?
    1:17:18 – Solutions: Can This Race Actually Be Stopped?
    1:22:34 – “It’s Inevitable” Is a Lie (Historically We Do Say No)
    1:27:21 – Final Thoughts and Where to Find Nate’s Work

    1 h 29 m
  • Ex–Google DeepMind Scientist, "The Real AI Threat is Losing Control", Christopher Summerfield
    Nov 26 2025

    Professor Christopher Summerfield, a leading neuroscientist at Oxford University, Research Director at the UK AI Safety Institute, and former Senior Research Scientist at Google DeepMind, discusses his new book, These Strange New Minds, which explores how large language models learned to talk, how they differ from the human brain, and what their rise means for control, agency, and the future of work.

    We discuss:

    The real risk of AI — losing control, not extinction

    How AI agents act in digital loops humans can’t see

    Why agency may be more essential than reward

    Fragility, feedback loops, and flash-crash analogies

    What AI is teaching us about human intelligence

    Augmentation vs. replacement in medicine, law, and beyond

    Why trust is the social form of agency — and why humans must stay in the loop

    🎧 Listen to more episodes: https://www.youtube.com/@TheNickStandleaShow

    Guest Notes: Christopher Summerfield is Professor of Cognitive Neuroscience in the Department of Experimental Psychology at the University of Oxford, where he runs the Human Information Processing (HIP) Lab (https://humaninformationprocessing.com/), and a Research Director at the UK AI Safety Institute.

    📘 These Strange New Minds (Penguin Random House): https://www.amazon.com/These-Strange-New-Minds-Learned/dp/0593831713

    Christopher Summerfield Media:
    https://csummerfield.github.io/personal_website/
    https://flightlessprofessors.org
    Twitter: @summerfieldlab
    Bluesky: @summerfieldlab.bsky.social

    🔗 Support This Podcast by Checking Out Our Sponsors: 👉 Build your own AI Agent with Zapier (opens the builder with the prompt pre-loaded): https://bit.ly/4hH5JaE

    Test Prep Gurus website: https://www.prepgurus.com Instagram: @TestPrepGurus

    Connect with The Nick Standlea Show: YouTube: @TheNickStandleaShow Podcast Website: https://nickshow.podbean.com/ Apple Podcasts: https://podcasts.apple.com/us/podcast/the-nick-standlea-podcast/id1700331903 Spotify: https://open.spotify.com/show/0YqBBneFsKtQ6Y0ArP5CXJ RSS Feed: https://feed.podbean.com/nickshow/feed.xml

    Nick's Socials: Instagram: @nickstandlea X (Twitter): @nickstandlea TikTok: @nickstandleashow Facebook: @nickstandleapodcast

    Ask questions, Don't accept the status quo, And be curious.

    🕒 Timestamps / Chapters
    00:00 Cold open — control, agency, and AI
    00:31 Guest intro: Oxford → DeepMind → UK AI Safety Institute
    01:02 The real story behind AI “takeover”: loss of control
    03:02 Is AI going to kill us? The control problem explained
    06:10 Agency as a basic psychological good
    10:46 The Faustian bargain: efficiency vs. personal agency
    13:12 What are AI agents and why are they fragile?
    20:12 Three risk buckets: misuse, errors, systemic effects
    24:58 Fragility & flash-crash analogies in AI systems
    30:37 Do we really understand how models think? (Transformers 101)
    34:16 What AI is teaching us about human intelligence
    36:46 Brains vs. neural nets: similarities & differences
    43:57 Embodiment and why robotics is still hard
    46:28 Augmentation vs. replacement in white-collar work
    50:14 Trust as social agency — why humans must stay in the loop
    52:49 Where to find Christopher & closing thoughts

    54 m
  • Sam Altman Says 95% of Marketing Is Going to AI (Here’s What That Means)
    Nov 18 2025

    Sam Altman told this week’s guest, Andy Sack, that 95% of marketing as we know it will be done by AI within 3–5 years. That single sentence became the “holy smokes moment” that shaped his book AI First (co-authored with Adam Brotman). Andy is a technologist who has built companies across four tech eras: internet, mobile, social/cloud, and now AI.

    Andy is the co-founder of Forum3, a VC investor, and a senior executive who worked directly with Satya Nadella at Microsoft. His latest work centers on helping leaders actually transform their companies for the age of AI — not just talk about it.

    In this conversation, we cover:

    Sam Altman’s prediction that will reshape entire industries

    Whether white-collar work is the next “factory floor”

    AI-only schools with no human teachers

    Why radiologists, marketers, and lawyers should worry

    Why trade schools like HVAC and plumbing are surging

    Why AI will make entrepreneurship explode

    How leaders become “AI-first” — and why almost none are

    Why most companies will miss the coming wave

    How to transform your organization before it’s too late

    If you’ve been wondering how to lead, not react, in the era of AGI — this episode is your playbook.

    🔹 Chapters
    0:00 — Intro: Four tech eras & the “holy smokes moment”
    2:08 — Meeting Sam Altman at OpenAI
    3:26 — “95% of marketing will be done by AI”
    6:12 — Andy’s background: internet → mobile → cloud → AI
    7:51 — Working directly with Satya Nadella
    11:19 — Why AI is bigger than the internet
    15:34 — Case study: AI that detects customer churn
    18:17 — Radiology & white-collar job risk
    20:08 — Bill Gates on the biggest shift in 40 years
    22:05 — Job displacement & AI proficiency
    24:01 — Why trade schools are exploding
    25:24 — Donkey-corns: billion-dollar companies with tiny teams
    27:26 — Entrepreneurship in the AI era
    30:13 — AI-only schools & personalized learning
    33:40 — Where human teachers still matter
    35:03 — Sponsored: Manta Sleep
    37:15 — Sponsored: Zapier MCP & AI agents
    39:38 — Why higher ed is “clearly broken”
    41:42 — Writing, cheating & critical thinking
    44:02 — The “yes, and” future of learning
    46:04 — What makes an AI-first leader
    47:44 — Iron-Man suits: humans + AI
    48:43 — Ethan Mollick’s challenge
    49:51 — What businesses need to reinvent
    51:02 — Where to find Andy

    🔗 Support This Podcast by Checking Out Our Sponsors: Zapier: 👉 Build your own AI Agent with Zapier (opens the builder with the prompt pre-loaded): https://bit.ly/4hH5JaE

    Manta Sleep Mask: sleep better, anywhere, anytime. Link: https://tinyurl.com/554xyknp Use code NICK for 10% off!

    Test Prep Gurus: https://www.prepgurus.com Instagram: @TestPrepGurus

    Connect with The Nick Standlea Show: YouTube: @TheNickStandleaShow Podcast Website: https://nickshow.podbean.com/ Apple Podcasts: https://podcasts.apple.com/us/podcast/the-nick-standlea-podcast/id1700331903 Spotify: https://open.spotify.com/show/0YqBBneFsKtQ6Y0ArP5CXJ RSS Feed: https://feed.podbean.com/nickshow/feed.xml

    Nick's Socials: Instagram: @nickstandlea X (Twitter): @nickstandlea TikTok: @nickstandleashow Facebook: @nickstandleapodcast

    Ask questions, Don't accept the status quo, And be curious.

    52 m
  • Is AI Making Us Dumber? (Can Teachers Save Us?)
    Nov 10 2025

    Are we getting smarter with AI—or just forgetting how to think for ourselves? In this episode, Nick answers your top questions about artificial intelligence, education, and what it means to stay human in an automated world.

    Today’s Big Questions:
    1️⃣ Is AI making us dumber?
    2️⃣ Will AI replace teachers — or finally make education fair for everyone?
    3️⃣ What happens when machines outthink us?

    Nick breaks down Daniel Kahneman’s System 1 vs System 2 model of the brain, explores how learning actually works (spoiler: through struggle), and explains why AI can be an amplifier — not a crutch — if you learn how to use it as feedback rather than substitution.

    He also explores the deeper question: if AI keeps improving, will we?

    💡 Ask a question for the next AI Q&A! Drop your question in the comments here on YouTube, on Spotify, or in an Apple Podcasts review, or submit it directly at → https://nickshow.podbean.com

    ✨ Chapters
    0:00 Welcome & how to submit questions
    0:39 Question 1 — Is AI making us dumber?
    1:40 The truth about friction and learning
    2:45 Kahneman’s System 1 vs System 2 explained
    3:35 Struggle: the real source of intelligence
    4:25 Using AI as a feedback loop, not a crutch
    5:10 Question 2 — Will AI replace teachers or make education fair?
    6:20 The emotional intelligence AI can’t simulate
    6:45 Every “revolutionary” ed-tech promise that failed
    7:55 The plumber analogy: scaling people, not replacing them
    8:31 Question 3 — What do we do with all our free time before AI kills us all?
    9:00 Smarter machines ≠ wiser humans
    9:20 Lessons from Hoffman, Seth, and Gardner
    10:30 The real risk: displacement, not doomsday
    11:20 The Great Depression analogy for automation
    13:24 How to submit questions + support the show
    13:34 Outro: Stay curious, ask better questions

    📩 Sponsors & Support Visit the sponsors in the description — they keep this show alive. Rate and review wherever you’re listening. It helps more than you know.

    Sponsored by: 🤖 Build your own AI researcher instantly in Zapier Copilot: https://bit.ly/4hH5JaE

    Learn how to build your own AI agent and build along with me in this how-to video: https://youtu.be/riLEks7KOrY

    Connect with The Nick Standlea Show: YouTube: @TheNickStandleaShow Podcast Website: https://nickshow.podbean.com/ Apple Podcasts: https://podcasts.apple.com/us/podcast/the-nick-standlea-podcast/id1700331903 Spotify: https://open.spotify.com/show/0YqBBneFsKtQ6Y0ArP5CXJ RSS Feed: https://feed.podbean.com/nickshow/feed.xml

    Nick's Socials: Instagram: @nickstandlea X (Twitter): @nickstandlea TikTok: @nickstandleashow Facebook: @nickstandleapodcast

    Ask questions, Don't accept the status quo, And be curious.

    14 m
  • “10,000 Hours is a Myth” | “Stress Isn’t the Enemy — It’s a Training Partner” | Dr Alex Auerbach on Coaching Mental Skills
    Oct 30 2025

    Most people train their bodies. The best train their minds. NFL and NBA psychologist Dr. Alex Auerbach joins Nick Standlea to reveal how elite performers stay calm under pressure, turn nerves into fuel, and master stress instead of fighting it.

    Whether you’re an athlete, founder, or coach, this conversation will change the way you think about focus, resilience, and performance.

    🎯 What You’ll Learn
    • How stress can enhance performance instead of killing it

    • The myth of 10,000 hours — and what actually builds mastery

    • Why a “fixed” mindset isn’t always bad

    • How to stop youth sports burnout before it starts

    • The self-talk rituals that world champions use

    🔗 Support This Podcast by Checking Out Our Sponsors:

    Zapier: Build Your Own AI Agent Today, Without Coding. Link: https://bit.ly/4pI8Su2

    Test Prep Gurus https://www.prepgurus.com

    Manta Sleep Mask: sleep better, anywhere, anytime. Link: https://tinyurl.com/554xyknp Use code NICK for 10% off!

    Watch more interviews on The Nick Standlea Show: 🎥 https://www.youtube.com/@TheNickStandleaShow

    Connect with Alex Auerbach:

    • Website: https://www.alexauerbach.com/

    • Instagram: https://www.instagram.com/alexauerbachphd/

    • X (formerly Twitter): https://x.com/AlexAuerbachPhD

    • LinkedIn: https://www.linkedin.com/in/alexauerbachphd

    🕒 Chapters

    00:00 Introduction — What separates good from great performers
    01:15 Meet Dr. Alex Auerbach — NBA, NFL and elite military psychologist
    03:10 From football coach to sports psychologist
    07:30 Inside the Toronto Raptors and Jacksonville Jaguars mental programs
    10:20 How elite athletes train their minds for focus and control
    14:45 The four levels of mental performance training
    18:00 Why stress is neither good nor bad — it’s your body preparing to act
    22:10 Regulating energy — what to do when you’re too amped up or too flat
    26:25 Individualizing mental skills for different players
    29:40 How coaches should respond when kids get over-stimulated
    33:00 Preventing burnout and the truth about early specialization
    37:20 The myth of 10,000 hours and why it’s a story, not a rule
    41:35 Turning nerves into excitement — the science of stress mindsets
    47:00 Team chemistry and connection — why touch and breathing matter
    51:10 Ronaldo’s self-talk and how pros build confidence on command
    56:30 The “confidence résumé” — training your brain to remember wins
    01:01:10 How to coach mistakes without creating shame
    01:05:45 The case for a fixed mindset — balance, identity and earned belief
    01:10:30 Closing thoughts — using psychology to thrive under pressure

    If you liked this episode, subscribe and rate 5 stars -- thank you in advance!

    Connect with The Nick Standlea Show: YouTube: @TheNickStandleaShow Podcast Website: https://nickshow.podbean.com/ Apple Podcasts: https://podcasts.apple.com/us/podcast/the-nick-standlea-podcast/id1700331903 Spotify: https://open.spotify.com/show/0YqBBneFsKtQ6Y0ArP5CXJ RSS Feed: https://feed.podbean.com/nickshow/feed.xml

    Nick's Socials: Instagram: @nickstandlea X (Twitter): @nickstandlea TikTok: @nickstandleashow Facebook: @nickstandleapodcast

    Ask questions, Don't accept the status quo, And be curious.

    1 h 39 m
  • Dr. Michael Osterholm: “The Pandemic Clock Is Ticking” | "We're not ready for the next one"
    Oct 22 2025

    Dr. Michael Osterholm — one of the world’s leading infectious disease experts and author of The Big One — joins Nick Standlea for a sobering but hopeful conversation about the next global pandemic.

    Osterholm has advised every U.S. president since Reagan and predicted the scale of the COVID-19 pandemic long before it unfolded. In this episode, he explains why “the pandemic clock is ticking again,” how we could prevent mass death with the right investments, and why there’s currently no one in charge of bio-preparedness at the White House.

    This is not fearmongering. It’s a roadmap for resilience — and a love letter to his grandkids.

    📘 About the Guest

    Dr. Michael Osterholm is Director of the Center for Infectious Disease Research and Policy (CIDRAP) at the University of Minnesota, a former member of multiple presidential advisory teams, and co-author of The Big One: How We Must Prepare for Future Deadly Pandemics.

    📗 Get the book → https://www.cidrap.umn.edu/
    🎧 Listen to the Osterholm Update → https://www.cdc.gov

    • World Health Organization (WHO): https://www.who.int

    🔗 Support This Podcast by Checking Out Our Sponsors:

    Zapier: Build Your Own AI Agent Today, Without Coding. Link: https://bit.ly/4pI8Su2

    Test Prep Gurus: https://www.prepgurus.com

    Manta Sleep Mask: sleep better, anywhere, anytime. Link: https://tinyurl.com/554xyknp Use code NICK for 10% off!

    Watch more interviews on The Nick Standlea Show: 🎥 https://www.youtube.com/@TheNickStandleaShow

    🎧 Listen on Apple Podcasts: https://podcasts.apple.com/us/podcast/the-nick-standlea-podcast/id1700331903

    🕒 Timestamps (Chapters):

    Connect with The Nick Standlea Show: YouTube: @TheNickStandleaShow Podcast Website: https://nickshow.podbean.com/ Apple Podcasts: https://podcasts.apple.com/us/podcast/the-nick-standlea-podcast/id1700331903 Spotify: https://open.spotify.com/show/0YqBBneFsKtQ6Y0ArP5CXJ RSS Feed: https://feed.podbean.com/nickshow/feed.xml

    Nick's Socials: Instagram: @nickstandlea X (Twitter): @nickstandlea TikTok: @nickstandleashow Facebook: @nickstandleapodcast

    Ask questions, Don't accept the status quo, And be curious.

    1 h 16 m
  • The AI Revolution Is an Ownership Problem, Not a Tech Problem — Economist Justin Wolfers
    Oct 7 2025

    What if the biggest threat of artificial intelligence isn’t the technology itself—but who owns it?

    In this wide-ranging conversation, economist Justin Wolfers (University of Michigan and University of New South Wales) joins Nick Standlea to unpack how AI is reshaping labor, wages, education, and even democracy. Wolfers explains why AI is a cognitive revolution that could mirror the Industrial Revolution’s impact on blue-collar workers—but this time, it’s coming for white-collar jobs.

    They discuss:

    Why AI might shrink white-collar wages the same way automation hit factory jobs

    How ownership and competition determine whether AI liberates or impoverishes society

    Why Nvidia, not OpenAI, might be the real power behind the revolution

    How education and universities must reinvent themselves in an AI world

    What students, teachers, and professionals can do now to stay ahead

    Wolfers’ clarity, humor, and economic insight make this one of the clearest explanations of AI’s long-term impact on society and policy.

    About the Guest: Justin Wolfers is a Professor of Economics and Public Policy at the University of Michigan and a Visiting Professor at the University of New South Wales. His research spans labor markets, happiness, macroeconomics, and public policy. Named by the IMF as one of the top 25 economists under 45 shaping global thought, Wolfers is also a frequent contributor to The New York Times and a recurring guest on Scott Galloway’s Prof G Markets. YouTube: @JustinWolfers X: https://x.com/JustinWolfers Web: www.nber.org/~jwolfers

    If this episode resonates, tap Subscribe—and share it with someone who’s building both skills: technical and human.

    🔗 Support This Podcast by Checking Out Our Sponsors:

    Zapier: Build Your Own AI Agent Today, Without Coding. Link: https://bit.ly/4pI8Su2

    Test Prep Gurus: https://www.prepgurus.com Instagram: @TestPrepGurus

    Watch more interviews on The Nick Standlea Show: 🎥 https://www.youtube.com/@TheNickStandleaShow

    🎧 Listen on Apple Podcasts: https://podcasts.apple.com/us/podcast/the-nick-standlea-podcast/id1700331903

    🕒 Timestamps (Chapters)

    00:00 – Cold plunges and bad circadian rhythms
    01:10 – Justin’s background and why economics needs better communication
    03:30 – Why AI is the biggest economic shift of our lifetimes
    06:00 – From brawn to brains: how AI changes the equation
    09:00 – The white-collar revolution: why AI threatens cognitive work
    11:30 – Will AI drive wage stagnation for white-collar jobs?
    12:50 – VCRs, substitutes, and complements: lessons for AI
    14:50 – The NickBot 2000 thought experiment — who owns the robot?
    17:00 – Why AI’s real issue is ownership, not capability
    19:30 – The DeepMind vs. Meta contrast: altruism vs. profit
    21:30 – The monopoly danger: what happens if one company wins
    24:00 – Why competition (not regulation) is saving us—for now
    26:30 – Nvidia, monopolies, and who really controls AI
    28:00 – Are AI stocks a bubble—or just early?
    31:00 – Dot-com lessons and humility in predictions
    33:30 – $700 trillion in potential AI value? An economist’s math
    38:00 – The global AI race: U.S., China, and everyone else
    41:00 – Government policy, laissez-faire, and missing debates
    45:00 – Manipulation, bias, and invisible influence in LLMs
    46:00 – Education’s blind spot: universities aren’t adapting fast enough
    50:00 – AI literacy and how to actually use large language models
    52:00 – Why “prompting” is the new literacy
    55:00 – How ChatGPT changes testing, grading, and learning
    58:00 – Reinventing the Oxford tutorial—powered by AI
    01:00:30 – Liberal arts, critical thinking, and the skills that endure
    01:02:00 – Teaching APIs and “vibe coding” for the next generation
    01:05:00 – Coding as the new muscle: feeling cognitively powerful
    01:06:30 – How to future-proof your career in the age of AI
    01:09:00 – Daily AI experiment challenge
    01:10:30 – Final thoughts: curiosity, courage, and lifelong learning

    Connect with The Nick Standlea Show: YouTube: @TheNickStandleaShow Podcast Website: https://nickshow.podbean.com/ Apple Podcasts: https://podcasts.apple.com/us/podcast/the-nick-standlea-podcast/id1700331903 Spotify: https://open.spotify.com/show/0YqBBneFsKtQ6Y0ArP5CXJ RSS Feed: https://feed.podbean.com/nickshow/feed.xml

    Nick's Socials: Instagram: @nickstandlea X (Twitter): @nickstandlea TikTok: @nickstandleashow Facebook: @nickstandleapodcast

    Ask questions, Don't accept the status quo, And be curious.

    1 h 12 m