If Anyone Builds It, Everyone Dies: Why Superhuman AI Would Kill Us All – Yudkowsky, Eliezer – Soares, Nate (Hardcover)

$30.00

In 2023, hundreds of AI luminaries signed an open letter warning that artificial intelligence poses a serious risk of human extinction. Since then, the AI race has only intensified. Companies and countries are rushing to build machines that will be smarter than any person. And the world is devastatingly unprepared for what would come next.

For decades, two signatories of that letter—Eliezer Yudkowsky and Nate Soares—have studied how smarter-than-human intelligences will think, behave, and pursue their objectives. Their research says that sufficiently smart AIs will develop goals of their own that put them in conflict with us—and that if it comes to conflict, an artificial superintelligence would crush us. The contest wouldn’t even be close.

How could a machine superintelligence wipe out our entire species? Why would it want to? Would it want anything at all? In this urgent book, Yudkowsky and Soares walk through the theory and the evidence, present one possible extinction scenario, and explain what it would take for humanity to survive.

The world is racing to build something truly new under the sun. And if anyone builds it, everyone dies.

“The best no-nonsense, simple explanation of the AI risk problem I’ve ever read.”—Yishan Wong, Former CEO of Reddit

Only 1 left in stock

SKU: 9780316595643

Description

“[Yudkowsky and Soares’s] diagnosis of AI’s potential pitfalls evinces a sustained engagement with the subject…they have a commendable willingness to call BS on big Silicon Valley names, accusing Elon Musk and Yann LeCun, Meta AI’s chief scientist, of downplaying real risks.”―San Francisco Chronicle

“If Anyone Builds It, Everyone Dies makes a compelling case that superhuman AI would almost certainly lead to global human annihilation. Governments around the world must recognize the risks and take collective and effective action.”―Jon Wolfsthal, former special assistant to the president for national security affairs

“Soares and Yudkowsky lay out, in plain and easy-to-follow terms, why our current path toward ever-more-powerful AIs is extremely dangerous.”―Emmett Shear, former interim CEO of OpenAI

“Essential reading for policymakers, journalists, researchers, and the general public. A masterfully written and groundbreaking text, If Anyone Builds It, Everyone Dies provides an important starting point for discussing AI at all levels.”―Bart Selman, professor of computer science, Cornell University

“While I’m skeptical that the current trajectory of AI development will lead to human extinction, given AI’s exponential pace of change there’s no better time to take prudent steps to guard against worst-case outcomes. The authors offer important proposals for global guardrails and risk mitigation that deserve serious consideration.”―Lieutenant General John (Jack) N.T. Shanahan (USAF, Ret.), Inaugural Director, Department of Defense Joint AI Center

“If Anyone Builds It, Everyone Dies isn’t just a wake-up call; it’s a fire alarm ringing with clarity and urgency. Yudkowsky and Soares pull no punches: unchecked superhuman AI poses an existential threat. It’s a sobering reminder that humanity’s future depends on what we do right now.”―Mark Ruffalo, actor

“A serious book in every respect. In Yudkowsky and Soares’s chilling analysis, a super-empowered AI will have no need for humanity and ample capacity to eliminate us. If Anyone Builds It, Everyone Dies is an eloquent and urgent plea for us to step back from the brink of self-annihilation.”―Fiona Hill, former senior director, White House National Security Council

“This book outlines a thought-provoking scenario of how the emerging risks of AI could drastically transform the world. Exploring these possibilities helps surface critical risks and questions we cannot collectively afford to overlook.”―Yoshua Bengio, Full Professor, Université de Montréal; Co-President and Scientific Director, LawZero; Founder and Scientific Advisor, Mila – Quebec AI Institute

“A clearly written and compelling account of the existential risks that highly advanced AI could pose to humanity. Recommended.”―Ben Bernanke, Nobel laureate and former chairman of the Federal Reserve

“The definitive book about how to take on ‘humanity’s final boss’―the hard-to-resist urge to develop superintelligent machines―and live to tell the tale.”―Jaan Tallinn, philanthropist, cofounder of the Center for the Study of Existential Risk, and cofounder of Skype

“If Anyone Builds It, Everyone Dies may prove to be the most important book of our time. Yudkowsky and Soares believe we are nowhere near ready to make the transition to superintelligence safely, leaving us on the fast track to extinction. Through the use of parables and crystal-clear explainers, they convey their reasoning, in an urgent plea for us to save ourselves while we still can.”―Tim Urban, cofounder, Wait But Why

“A stark and urgent warning delivered with credibility, clarity, and conviction, this provocative book challenges technologists, policymakers, and citizens alike to confront the existential risks of artificial intelligence before it’s too late. Essential reading for anyone who cares about the future.”―Emma Sky, senior fellow, Yale Jackson School of Global Affairs

“If Anyone Builds It, Everyone Dies is a sharp and sobering read. As someone who has spent years pushing for responsible AI policy, I found it to be an essential warning about what’s at stake if we get this wrong. Yudkowsky and Soares make the case with clarity, urgency, and heart.”―Joely Fisher, National Secretary-Treasurer, SAG-AFTRA

“This book offers brilliant insights into history’s most consequential standoff between technological utopia and dystopia, and shows how we can and should prevent superhuman AI from killing us all. Yudkowsky and Soares’s memorable storytelling about past disaster precedents (e.g., the inventor of two environmental nightmares: tetra-ethyl-lead gasoline and Freon) highlights why top thinkers so often don’t see the catastrophes they create.”―George Church, Founding Core Faculty & Lead, Synthetic Biology, Wyss Institute at Harvard University

“A sober but highly readable book on the very real risks of AI. Both skeptics and believers need to understand the authors’ arguments, and work to ensure that our AI future is more beneficial than harmful.”―Bruce Schneier, Lecturer, Harvard Kennedy School and author of A Hacker’s Mind