A BRIEF BOOK ABOUT CROSSING THE ABYSS
An Inner Operating System for the Age of AI Meaning, Duty, and Care in a Time of Transition
Narrated by: Virtual Voice
By: Martin Gjerløff
This title uses virtual voice narration.
Step into the Gramsci Gap: the old world is dying, the new world is struggling to be born, and in the middle the “monsters” multiply. Not movie monsters, but the morbid symptoms we recognize everywhere right now: anxiety, cynicism, scapegoating, performative certainty, and the quiet disappearance of responsibility.
A Brief Book About Crossing the Abyss is a short, vivid guide for leaders, teams, and thoughtful readers who feel the chaos of the AI transition and refuse to let technology become the goal. This is not another “AI productivity” book. It’s a human book about how to stay coherent, ethical, and meaningful when systems accelerate faster than our culture can adapt.
Drawing on Antonio Gramsci, Marcus Aurelius, Viktor Frankl, Carl Gustav Jung, Dante, and modern duty-of-care thinking, you’ll learn why transitions make humans vulnerable to false certainty and moral outsourcing, and how to rebuild your inner operating system when everything feels out of control.
Inside you’ll discover:
The Gap explained: why legitimacy and meaning collapse in “in-between” eras, and why tech accelerates the crisis
The Abyss illustrated: how “the system decided” becomes a moral exit, and why people feel processed rather than met
The psychology of AI fatigue: automation bias, authority bias, bandwagon effects, and how output replaces ownership
A practical Inner OS built on three duties: duty to self, duty to others, duty to something larger than the self
Marcus Aurelius as a standard for the age of machine fluency: If it isn’t true, don’t say it. If it isn’t right, don’t do it.
A Duty of Care bridge: simple, adult questions that keep dignity, appeal, and accountability inside fast systems
A forward-looking epilogue on what happens when work shrinks dramatically: why the ultimate “why” may be for each other
If you’re tired of AI as an end goal, tired of hollow transformation theatre, and tired of living in a world where responsibility dissolves into dashboards and procedures, this book gives you something rare: a moral and psychological compass you can actually use.
Because the real risk of the Age of AI is not that machines become capable.
It’s that humans become careless.