
Publications

Yudkowsky was, along with Robin Hanson, one of the principal contributors to the blog Overcoming Bias, sponsored by the Future of Humanity Institute of Oxford University. In early 2009 he helped to found Less Wrong, a "community blog devoted to refining the art of human rationality". The Sequences on Less Wrong, comprising over two years of blog posts on epistemology, artificial intelligence, and metaethics, constitute the largest body of Yudkowsky's writing.

He contributed two chapters to Global Catastrophic Risks, an edited volume by Oxford philosopher Nick Bostrom and Milan Ćirković, and the paper "Complex Value Systems are Required to Realize Valuable Futures" to the AGI-11 conference.

Yudkowsky is the author of the Singularity Institute publications "Creating Friendly AI" (2001), "Levels of Organization in General Intelligence" (2002), "Coherent Extrapolated Volition" (2004), and "Timeless Decision Theory" (2010).

Yudkowsky has also written several works of fiction, including science fiction. His Harry Potter fan fiction Harry Potter and the Methods of Rationality illustrates topics in cognitive science and rationality; The New Yorker described it as "a thousand-page online 'fanfic' text called 'Harry Potter and the Methods of Rationality', which recasts the original story in an attempt to explain Harry's wizardry through the scientific method". It has been favorably reviewed by the authors David Brin and Rachel Aaron, by Robin Hanson and Aaron Swartz, and by the programmer Eric S. Raymond.

