Everyone Is Betting on Bigger LLMs. She's Betting They're Fundamentally Wrong. (Eve Bodnia, Founder & CEO of Logical Intelligence)

Eve Bodnia is the co-founder and CEO of Logical Intelligence, which is developing energy-based reasoning models (EBMs) as an alternative to large language models. She argues that LLMs, which operate by recognizing and recombining patterns within language space, are structurally incapable of genuine reasoning. Eve's alternative is Kona, an EBM that reasons in abstract latent space, learns rules about the world rather than surface patterns, and can interface with language models as one output channel among many. Eve traces the core ideas behind her architecture to decades of work in symmetry groups, condensed matter physics, and brain science, fields that share, as she explains, the same underlying mathematics. In a public demo, Kona solved a complex reasoning task for roughly $4 in compute, compared to an estimated $15,000 using frontier LLMs. With Yann LeCun serving as founding chair of its technical board, Logical Intelligence sits at the center of a small but growing effort to rethink AI beyond language-based models.

In our conversation, we explore:
- Why Eve believes LLMs can't truly extrapolate knowledge, even at larger scale
- What energy-based reasoning models are, and where the "energy" concept comes from
- The $4 vs. $15,000 benchmark, and what it tells us about the cost of guessing vs. knowing
- How Logical Intelligence showed spontaneous knowledge transfer at just 16M parameters
- Why systems like chip design, surgical robotics, and power grids need more than probabilistic AI
- What formally verified code generation means for the future of programming
- Why the math behind particle physics also explains how the brain filters signal from noise
- How meeting Grigori Perelman as a teenager shaped Eve's views on ego and ownership in science
- Why Eve believes humans must remain the constraint-setters in advanced AI
- How meditation, piano, and Eastern philosophy support her creative process

Thank you to the partners who make this possible:
- Granola: The app that might actually make you love meetings.
- Persona: Trusted identity verification for any use case.

Transcript: https://www.generalist.com/p/everyone-is-betting-on-bigger-llms

Timestamps:
(00:00) Introduction
(03:03) Eve's encounter with Grigori Perelman
(05:38) Why bizarre people are Eve's favorite people
(06:56) Her early obsession with math and physics
(09:02) The manifold hypothesis and language
(11:54) The Kekulé Problem
(14:05) Eve's upbringing and her CERN research in high school
(17:40) Eve's academic path
(20:36) Symmetry in nature
(22:58) Spirituality and creativity
(27:00) Theory vs. experiment
(29:03) Uncovering a critical gap in AI models
(33:45) What Logical Intelligence is building
(35:50) Logical Intelligence's use cases
(42:08) Energy-based models explained
(45:06) LLMs vs. EBMs
(48:01) AGI defined
(51:22) Kona's knowledge extrapolation
(53:20) The team behind Logical Intelligence
(58:09) Early investors in Logical Intelligence
(58:50) Feynman's influence on Eve's work
(1:01:15) How Eve sustains her creativity
(1:03:42) Final meditations

Follow Eve Bodnia
LinkedIn: https://www.linkedin.com/in/eve-bodnia-351b41355
X: https://x.com/evelovesolive
Website: https://logicalintelligence.com

Resources and episode mentions: https://www.generalist.com/p/everyone-is-betting-on-bigger-llms

Production and marketing by penname.co.
For inquiries about sponsoring the podcast, email jordan@penname.co.