Intelligence Explosion

The notion of an "intelligence explosion" was first described by Good (1965), who speculated on the effects of superhuman machines as follows:

Let an ultraintelligent machine be defined as a machine that can far surpass all the intellectual activities of any man however clever. Since the design of machines is one of these intellectual activities, an ultraintelligent machine could design even better machines; there would then unquestionably be an ‘intelligence explosion,’ and the intelligence of man would be left far behind. Thus the first ultraintelligent machine is the last invention that man need ever make.

Most proposed methods for creating superhuman or transhuman minds fall into one of two categories: intelligence amplification of human brains, and artificial intelligence. The means speculated to produce intelligence amplification are numerous, and include bioengineering, genetic engineering, nootropic drugs, AI assistants, direct brain-computer interfaces, and mind uploading. The existence of multiple paths to an intelligence explosion makes a singularity more likely; for a singularity not to occur, all of them would have to fail.

Hanson (1998) is skeptical of human intelligence augmentation, writing that once one has exhausted the "low-hanging fruit" of easy methods for increasing human intelligence, further improvements will become increasingly difficult to find. Despite the numerous speculated means for amplifying human intelligence, non-human artificial intelligence (specifically seed AI) is the most popular option for organizations trying to advance the singularity.

Whether an intelligence explosion occurs depends on three factors. The first, an accelerating factor, is that each improvement makes new intelligence enhancements possible. Conversely, as intelligences become more advanced, further advances become more and more difficult to achieve, possibly outweighing the advantage of increased intelligence. Each improvement must therefore beget at least one more improvement, on average, for the singularity to continue. Finally, there is the issue of a hard upper limit: absent quantum computing, the laws of physics will eventually prevent any further improvements.
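
The threshold condition here can be made concrete with a minimal Python sketch of the feedback loop. This is a toy model only; the names and parameter values (the returns factor, the hard limit) are illustrative assumptions, not figures from the literature.

    # Toy model: recursive self-improvement as a compounding feedback loop.
    # Every name and parameter value here is an illustrative assumption.
    def simulate(returns, hard_limit=1e6, steps=200):
        # `returns` is the average number of further improvements each
        # improvement begets: above 1.0 gains compound, below 1.0 they shrink.
        intelligence, gain = 1.0, 1.0
        for step in range(steps):
            intelligence += gain
            gain *= returns                 # the accelerating (or decelerating) factor
            if intelligence >= hard_limit:  # stand-in for a physical upper bound
                return step, intelligence
        return steps, intelligence

    print(simulate(returns=1.1))  # explosive case: reaches the hard limit
    print(simulate(returns=0.9))  # fizzle: total gain converges near 1 / (1 - 0.9)

With a returns factor above 1 the gains compound geometrically until the hard limit intervenes; with a factor below 1 the total added intelligence converges to a finite sum and the process fizzles out.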

There are two logically independent, but mutually reinforcing, accelerating effects: increases in the speed of computation, and improvements to the algorithms used. The former is predicted by Moore’s Law and forecast improvements in hardware, and is broadly similar to previous technological advances. By contrast, most AI researchers believe that software is more important than hardware.
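
As a rough illustration of how the two effects reinforce each other, the sketch below multiplies a Moore’s-Law-style hardware doubling by a hypothetical yearly algorithmic gain. Both rates are assumed round numbers chosen for the example, not measurements.

    # Hypothetical rates, chosen only to show the multiplicative effect;
    # neither number is a measured value.
    HW_DOUBLING_YEARS = 2.0  # assumed Moore's-Law-style doubling cadence
    SW_GAIN_PER_YEAR = 1.3   # assumed yearly algorithmic improvement factor

    def effective_speedup(years):
        # Faster hardware and better algorithms act on the same workload
        # at once, so their contributions multiply.
        hardware = 2 ** (years / HW_DOUBLING_YEARS)
        software = SW_GAIN_PER_YEAR ** years
        return hardware * software

    for years in (2, 5, 10):
        print(years, round(effective_speedup(years), 1))  # ~3.4, 21.0, 441.1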
