Eliezer Yudkowsky is Research Fellow and Director of the Singularity Institute for Artificial Intelligence, a non-profit research institute dedicated to increasing the likelihood of, and decreasing the time to, a maximally beneficial singularity. He is one of the world's foremost experts on the singularity and frequently speaks on artificial general intelligence (AGI), rationality, and the future.