Okay. Here is the definition provided by the Singularity Institute for Artificial Intelligence (SIAI):
"The Singularity is the technological creation of smarter-than-human intelligence. There are several technologies that are often mentioned as heading in this direction. The most commonly mentioned is probably Artificial Intelligence, but there are others: direct brain-computer interfaces, biological augmentation of the brain, genetic engineering, ultra-high-resolution scans of the brain followed by computer emulation. Some of these technologies seem likely to arrive much earlier than the others, but there are nonetheless several independent technologies all heading in the direction of the Singularity – several different technologies which, if they reached a threshold level of sophistication, would enable the creation of smarter-than-human intelligence."
The Institute is described on Wikipedia here. I emailed its founder, Eliezer Yudkowsky, to get his thoughts on what I am doing here. He hasn't responded yet.
For 'Singularitarians', the Singularity is a big transformative concept, one of those spectacular theories that will change human life as we know it. When we create intelligence greater than our own, we will solve problems that we cannot solve today. Furthermore, many of the technologies (and especially nanotechnology) that will be used to create artificial intelligence will be used by the artificial intelligence to help us with healthcare, energy (PDF), space travel and more. God help us if we create an artificial intelligence that wants to sit on the couch and watch the soaps.
Those who fear artificial intelligence, nanotechnology (and perhaps big transformative concepts generally) conjure a vision of some rough beast slouching towards Bethlehem, waiting to be born. From fears of grey goo (nanotechnology run wild) to the destruction of human ambition, AI is cast as a threat--perhaps the biggest outsourcing threat of all, especially if it is embodied in robots.