Coding a Transhuman AI *2.2* [New in 2000!] (358K)
Probably the most intelligent question I get asked about the Singularity is: "Just because we have all this computing power doesn't mean we know how to use it. Can we really program an Artificial Intelligence that's smarter than human?" This page explains how. CaTAI discusses how to build a general intelligence, plus the issues particular to self-modifying or "seed" AIs.
This is computer science, not philosophy, although other pages use it as support and background for futuristic and even philosophical issues.
The section on "Cognition" is in progress. The published sections "Paradigms" and "Mind" are complete and self-contained. There is a summary.
The Plan to Singularity [New in 2000!] (403K)
A vision of how to reach the Singularity (the creation of greater-than-human intelligence) using open-source Artificial Intelligence development to spark the creation of an AI industry. This page describes the technological timeline leading up to Singularity; discusses strategy, including developmental, organizational, and memetic issues; and estimates how much it will cost to get started. Of course, large sections of the page are already obsolete.
The address given above is for a multi-file version. A single-file version is also available (but is a 403K download).
Coding a Transhuman AI 1.0 (346K)
The previous version of CaTAI. Most of it is now obsolete, but it contains some information that isn't in the current version yet. (Do read the current version first, though!)
Staring into the Singularity (79K)
If computing power doubles every two years, what happens when computers are doing the research? This document contains more concentrated future shock than anything I have ever written.
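The arithmetic behind that question can be sketched as a toy model: if each doubling of computing power also halves the wall-clock time the next doubling takes (because the researchers now run twice as fast), the doubling times form a geometric series with a finite sum. This is only an illustrative sketch of the premise, not a claim from the essay's full argument:

```python
# Toy model: power doubles every `first_doubling` years at the start, but
# once computers do the research, each doubling halves the time the
# next doubling takes (the researchers themselves run twice as fast).
def years_until_singularity(first_doubling=2.0, cutoff=1e-9):
    """Sum the geometric series of ever-shorter doubling times."""
    total, step = 0.0, first_doubling
    while step > cutoff:
        total += step
        step /= 2.0  # twice the speed -> half the wall-clock time
    return total

# The series 2 + 1 + 0.5 + 0.25 + ... converges to twice the first term:
print(round(years_until_singularity(), 6))  # approaches 4.0
```

Under these (deliberately naive) assumptions, a two-year doubling time compresses the entire remaining timeline into about four years.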
The Singularitarian Principles 1.0 (14K)
The Principles we profess in our high quest to bring about the creation of greater-than-human intelligence: Singularity, Activism, Ultratechnology, Globalism; Apotheosis, Solidarity, Intelligence, Independence, Nonsuppression. Commentary may be found in the Extended Edition (58K).
Frequently Asked Questions about the Meaning of Life (238K)
This is Ask Jeeves's (and hence, Altavista's) answer to the question "What is the Meaning of Life?" I mean, I knew the answer, and the previous answer led to a sophomoric fake 404, so why not? This page is targeted at a non-transhumanist audience. SL3s and above may not find much to be shocked about.
Algernon's Law (95K)
Algernon's Law: Any simple enhancement of the human mind is a net evolutionary disadvantage.
This page is getting on in years...
Bookshelf (27K)
Books which you are not permitted to not read.
Thanks to Cole Kitchen, Lord Proofreader and Supreme Catcher-of-Errors, for many helpful suggestions. (On one occasion, he emailed me a list of all the mistakes in a newly published version of a page - not even a new page, but an update of an old page - 18 hours after I uploaded it and before I announced it anywhere. I can only assume that he's an AI, or that some type of magic is involved.)
The Singularitarian mailing list (20K)
This is the mailing list to keep people apprised of efforts to reach the Singularity - a gathering place for people interested in helping out. Restricted membership.
Eliezer, the person [New in 2000!] (104K)
You've been asking for this page for the last three years; well, here it is: my life history, my emotional landscape, my honest-to-goodness best shot at conveying what it's like to be me. And yes, by popular demand, the page includes details such as my favorite television show. But beware. It's not your usual about-the-author page. It's as far outside the ordinary scheme of things as anything in the Low Beyond. You have been warned.
The Ethics of Cognitive Engineering (44K)
A discussion of the issues pursuant to, view 1: Using people as experimental subjects for untested procedures whose effects would be horrifying even if they worked perfectly - or, view 2: Giving people the power to remake their own minds, change the future, make a difference, and choose who they will be and what talents they will have, instead of just accepting genetic fiat.
The SL4 mailing list (11K)
This is where Singularitarians and other people with extremely high levels of future shock hang out. (List is not currently very active.) Unrestricted membership.
Gaussian Humans (6K)
Neutral terminology for the next century. What can you call an "ordinary human" that won't sound condescending?
Future Shock Levels (8K)
Some handy terminology. Also useful for figuring out how much future shock you can safely expose your audience to.
Workarounds to the Laws of Physics (8K)
An old, old document that I wrote because I was annoyed at people who talked about the lightspeed limit or conservation of mass and energy or the "laws" of thermodynamics as if they were insuperable limits to growth. Modern physics explicitly specifies loopholes, making assertions of impossibility as foolish as old statements about powered flight or breaking the sound barrier.
PGP public key for Eliezer Yudkowsky (2K)
Send me secure email using public-key cryptography.
Help annoy the NSA.
Thanks to the Singularity Institute for Artificial Intelligence, Inc. for supporting my more recent work.
Thanks to kia.net for donating hosting to the Low Beyond.