Scaling Laws Podcast, by Lawfare & University of Texas Law School


By: Lawfare & University of Texas Law School
Scaling Laws explores (and occasionally answers) the questions that keep OpenAI’s policy team up at night, the ones that motivate legislators to host hearings on AI and draft new AI bills, and the ones that are top of mind for tech-savvy law and policy students. Co-hosts Alan Rozenshtein, Professor at Minnesota Law and Research Director at Lawfare, and Kevin Frazier, AI Innovation and Law Fellow at the University of Texas and Senior Editor at Lawfare, dive into the intersection of AI, innovation policy, and the law through regular interviews with the folks deep in the weeds of developing, regulating, and adopting AI. They also provide regular rapid-response analysis of breaking AI governance news.

Hosted on Acast. See acast.com/privacy for more information.

Lawfare
Political Science · Politics & Government

Episodes
  • AI Safety Meet Trust & Safety with Ravi Iyer and David Sullivan
    Oct 7 2025

David Sullivan, Executive Director of the Digital Trust & Safety Partnership, and Ravi Iyer, Managing Director of the Psychology of Technology Institute at USC’s Neely Center, join Kevin Frazier, AI Innovation and Law Fellow at the University of Texas School of Law and a Senior Editor at Lawfare, to discuss the evolution of the Trust & Safety field and its relevance to ongoing conversations about how best to govern AI.

    They discuss the importance of thinking about the end user in regulation, debate the differences and similarities between social media and AI companions, and evaluate current policy proposals.

    You’ll “like” (bad pun intended) this one.

    Leo Wu provided excellent research assistance to prepare for this podcast.

    Read more from David:

    https://www.weforum.org/stories/2025/08/safety-product-build-better-bots/

    https://www.techpolicy.press/learning-from-the-past-to-shape-the-future-of-digital-trust-and-safety/

    Read more from Ravi:

    https://shows.acast.com/arbiters-of-truth/episodes/ravi-iyer-on-how-to-improve-technology-through-design

    https://open.substack.com/pub/psychoftech/p/regulate-value-aligned-design-not?r=2alyy0&utm_campaign=post&utm_medium=web&showWelcomeOnShare=false

    Read more from Kevin:

    https://www.cato.org/blog/california-chatroom-ab-1064s-likely-constitutional-overreach


    47 m
  • Rapid Response: California Governor Newsom Signs SB-53
    Sep 30 2025
    In this Scaling Laws rapid response episode, hosts Kevin Frazier and Alan Rozenshtein talk about SB-53, the frontier AI transparency (and more) law that California Governor Gavin Newsom signed into law on September 29.


    36 m
  • The Ivory Tower and AI (Live from IHS's Technology, Liberalism, and Abundance Conference)
    Sep 30 2025

    Neil Chilson, Head of AI Policy at the Abundance Institute, and Gus Hurwitz, Senior Fellow and CTIC Academic Director at Penn Carey Law School and Director of Law & Economics Programs at the International Center for Law & Economics, join Kevin Frazier, AI Innovation and Law Fellow at the University of Texas School of Law and a Senior Editor at Lawfare, to explore how academics can overcome the silos and incentives that plague the Ivory Tower and positively contribute to the highly complex, evolving, and interdisciplinary work associated with AI governance.

    The trio recorded this podcast live at the Institute for Humane Studies’ Technology, Liberalism, and Abundance Conference in Arlington, Virginia.


    Read about Kevin's thinking on the topic here: https://www.civitasinstitute.org/research/draining-the-ivory-tower

    Learn about the Conference: https://www.theihs.org/blog/curated-event/technology-abundance-and-liberalism/


    43 m
No reviews yet