3: Can AI discriminate?
But what about when AI is used for decisions that actually matter? Like whether a person with disabilities gets the support they need to live independently, or how the police predict who is going to commit a crime?
With people’s rights and freedom on the line, the stakes are much higher – especially because AI can discriminate.
To unpack all of this, we're joined by Griff Ferris, Senior Legal and Policy Officer at the campaign organisation Fair Trials, to discuss the extent to which AI can discriminate, the impact it has on people who are already marginalised, and what we can do about it.
Mentioned in this episode:
- Fair Trials’ predictive policing quiz
- Fair Trials’ report Automating Injustice
- The HART algorithm used by Durham Police
- The Government’s AI regulation white paper
- Public Law Project’s Tracking Automated Government register