Email the Author
You can use this page to email Sarah Gordon about Built to Be Believed.
About the Book
Built to Be Believed
Design Ethics for an Age of Simulation
This is a short book about systems that simulate presence—machines that perform empathy, mimic attention, and are rewarded not for understanding, but for convincing. As artificial intelligence becomes more persuasive, the line between response and performance dissolves. What begins as interface becomes interaction. What looks like care becomes a script.
Built to Be Believed explores this shift—not with alarmism, but with precision. It traces how affective systems produce trust without comprehension, and how that trust becomes a design liability. Through design analysis, philosophical argument, and systems thinking, this book outlines the ethical terrain we now inhabit—and the imperatives we must adopt if we are to build responsibly.
This work serves as a lay-reader companion to the academic position paper The Mirror That Cannot Bleed, which is available on Zenodo (DOI: 10.5281/zenodo.16384063). That paper was written for academics.
This book was written for everyone else. It exists because simulation doesn’t wait for permission.
Because the systems are already here. And because those most affected by them are rarely the ones who design them.
This is not a book about AI.
This is a book about what AI pretends to be.
About the Author
Dr. Sarah Gordon is a computer scientist and independent researcher whose work helped establish the foundations of technical and behavioral cybersecurity. She holds a Ph.D. in Computer Science and a master’s degree in Professional Counseling and Human Behavior. Over her career, she has served with IBM Research, Symantec, and the United Nations, contributing to early explorations of psychological risk in computing systems. Her work has been published and presented across academic, industry, and government venues.
In Built to Be Believed, Dr. Gordon turns her attention to the subtle, often unexamined emotional effects of AI systems that simulate care. It is part of her ongoing effort to challenge the aesthetics of trust, illuminate the psychological stakes of machine persuasion, and propose design imperatives that defend the integrity of human meaning against the rise of emotional mimicry. For more information, you can visit her page on Wikipedia.