Eliezer Yudkowsky

Yudkowsky at Stanford University in 2006

Born: Eliezer Shlomo (or Solomon) Yudkowsky, September 11, 1979
Organization: Machine Intelligence Research Institute
Known for: Coining the term friendly artificial intelligence; research on AI safety; rationality writing; founder of LessWrong
Website: www.yudkowsky.net

Eliezer Shlomo Yudkowsky (/ˌɛliˈɛzər jʌdˈkaʊski/ EL-ee-EH-zər yuud-KOW-skee; born September 11, 1979) is an American artificial intelligence researcher and writer on decision theory and ethics, known for popularizing ideas related to friendly artificial intelligence. He is the founder of and a research fellow at the Machine Intelligence Research Institute (MIRI), a private research nonprofit based in Berkeley, California. His work on the prospect of a runaway intelligence explosion influenced philosopher Nick Bostrom's 2014 book Superintelligence: Paths, Dangers, Strategies. He is best known for If Anyone Builds It, Everyone Dies: Why Superhuman AI Would Kill Us All, a New York Times Best Seller he co-authored with Nate Soares, as well as the Harry Potter fan fiction Harry Potter and the Methods of Rationality.