Wednesday, 18 January 2012

Technological singularity

Technological singularity refers to the hypothetical future emergence of greater-than-human intelligence through technological means, very probably resulting in explosive superintelligence.[1] Since the capabilities of such an intelligence would be difficult for an unaided human mind to comprehend, the occurrence of a technological singularity is seen as an intellectual event horizon beyond which the future becomes difficult to understand or predict. Proponents typically identify an "intelligence explosion",[2][3] in which superintelligences design successive generations of increasingly powerful minds, as the key mechanism of the singularity.
The term was coined by science fiction writer Vernor Vinge, who argues that artificial intelligence, human biological enhancement, or brain-computer interfaces could be possible causes of the singularity. The concept has been popularized by futurists such as Ray Kurzweil, and proponents expect it to occur sometime in the 21st century, although estimates vary.

Many of the most recognized writers on the singularity, such as Vernor Vinge and Ray Kurzweil, define the concept in terms of the technological creation of superintelligence, and argue that it is difficult or impossible for present-day humans to predict what a post-singularity world would be like, due to the difficulty of imagining the intentions and capabilities of superintelligent entities.[4][5][6] The term "technological singularity" was originally coined by Vinge, who made an analogy between the breakdown in our ability to predict what would happen after the development of superintelligence and the breakdown of the predictive ability of modern physics at the space-time singularity beyond the event horizon of a black hole.[6] Some writers use "the singularity" in a broader way to refer to any radical changes in our society brought about by new technologies such as molecular nanotechnology,[7][8][9] although Vinge and other prominent writers specifically state that without superintelligence, such changes would not qualify as a true singularity.[4] Many writers also tie the singularity to observations of exponential growth in various technologies (with Moore's Law being the most prominent example), using such observations as a basis for predicting that the singularity is likely to happen sometime within the 21st century.
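As a rough illustration of this extrapolation argument, the toy Python sketch below projects a quantity forward under the common statement of Moore's Law that transistor counts double roughly every two years. The starting value, horizon, and doubling period are illustrative assumptions, not figures drawn from the sources cited above.

```python
# Toy illustration of the exponential-extrapolation argument: if a capability
# doubles on a fixed schedule (Moore's Law is usually stated as transistor
# counts doubling roughly every two years), projecting it forward yields
# explosive absolute growth. All starting values here are illustrative only.

def extrapolate(initial: float, doubling_period_years: float, years: float) -> float:
    """Project a quantity forward assuming a constant doubling period."""
    return initial * 2 ** (years / doubling_period_years)

if __name__ == "__main__":
    transistors_now = 2.5e9  # illustrative chip-scale transistor count
    for horizon in (10, 20, 40):
        projected = extrapolate(transistors_now, doubling_period_years=2, years=horizon)
        print(f"+{horizon:>2} years: ~{projected:.2e} transistors "
              f"(x{projected / transistors_now:.0f})")
```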
A technological singularity includes the concept of an intelligence explosion, a term coined in 1965 by I. J. Good.[11] Although technological progress has been accelerating, it has been limited by the basic intelligence of the human brain, which, according to Paul R. Ehrlich, has not changed significantly for millennia.[12] With the increasing power of computers and other technologies, however, it might eventually be possible to build a machine that is more intelligent than humanity.[13] If a superhuman intelligence were invented, whether through the amplification of human intelligence or through artificial intelligence, it would bring to bear greater problem-solving and inventive skills than humans possess; it could then design an even more capable machine, or rewrite its own source code to become more intelligent. That more capable machine could in turn design a machine of still greater capability. These iterations could accelerate, leading to recursive self-improvement and potentially allowing enormous qualitative change before any upper limits imposed by the laws of physics or theoretical computation set in.
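The feedback loop described above can be made concrete with a deliberately simple toy model: a minimal Python sketch, assuming each generation improves its successor in proportion to its own capability until an arbitrary ceiling is reached. Every number here is invented for illustration and carries no predictive weight.

```python
# Toy model of the "intelligence explosion" loop: each generation of machine
# designs a successor, and the size of the improvement grows with the current
# generation's own capability, until an assumed physical ceiling is hit.
# This is only a sketch of the feedback dynamic, not a prediction.

def intelligence_explosion(start: float = 1.0,
                           improvement_rate: float = 0.1,
                           ceiling: float = 1e6,
                           max_generations: int = 100) -> list[float]:
    """Return the capability of each successive self-improving generation."""
    levels = [start]
    while len(levels) < max_generations and levels[-1] < ceiling:
        current = levels[-1]
        # Each generation improves its successor in proportion to its own ability,
        # so growth is faster than exponential once capability becomes large.
        levels.append(min(current * (1 + improvement_rate * current), ceiling))
    return levels

if __name__ == "__main__":
    for generation, level in enumerate(intelligence_explosion()):
        print(f"generation {generation:2d}: capability {level:12.2f}")
```

Running the sketch shows the qualitative point of the argument: capability creeps up slowly for a few generations and then diverges within a handful more, which is why the upper limit in the model (standing in for physical or computational limits) is the only thing that stops it.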
