The Hallucination Trap: How to Use AI in Legal Practice Without Losing $10,000
In the first half of their conversation with James Mixon, Managing Attorney at California's Second District Court of Appeal, Tim Kowal and Jeff Lewis ask what separates healthy AI use from unhealthy use. Using AI to help organize? Yes. Using it to replace judgment? No. Tip: when an attorney files a brief without reading the AI output first, expect sanctions.
James draws on his role on the judicial branch AI Task Force and his monthly Daily Journal AI column to provide a practical roadmap for responsible AI use—from crafting effective prompts to avoiding the automation bias that has led to attorney sanctions across the country.
Key points:
- Treat AI as an on-demand legal treatise, not a research tool: Mixon explains how AI excels at providing background information and organizing legal concepts into digestible narratives—making it ideal for learning complex areas quickly—but should never replace verified legal research or case citation.
- The "Daedalus Doctrine" framework offers a middle path: Drawing from Greek mythology, Mixon warns against flying too high (reckless AI adoption) or too low (ignoring AI entirely), urging lawyers to use AI thoughtfully while maintaining personal judgment and verification responsibilities.
- Effective prompting is critical: Never use open-ended commands like "enhance this brief"—instead, tell AI exactly what you want and ask it to flag changes in italics or bold so you can review selectively.
- Hallucinations remain the biggest risk: Recent sanctions cases show attorneys asking ChatGPT to verify its own fabricated cases—a fatal error that demonstrates why every citation must be independently confirmed.
- Courts aren't using AI for decision-making: Current California court policy prohibits AI use "in any way that would touch a decision" to preserve public confidence over efficiency gains.
- AI works best for background learning: Mixon describes using AI to create narratives and explanations that make legal concepts stick—transforming dry doctrine into memorable stories, like having a personalized treatise writer available on demand.
Tune in to learn how to harness AI's power for legal background and organization without falling into the traps that have cost other attorneys their credibility—and thousands in sanctions.