Singularitarians say that computing will advance to the stage where artificial intelligence outstrips human capabilities and will then happily begin designing even better artificial intelligence. I hope they're right.
But hand on heart, I honestly believe that if we had the magic box (or network of boxes) today that could compute as quickly and efficiently as computers are predicted to in 2035 (one of the dates proposed for arriving at the Singularity), we couldn't make it work. We couldn't program it to do what we expect AI to do at the time of the Singularity.
Human knowledge will have to increase in order to make the box work. I suspect (and am searching for evidence) that it will have to increase exponentially. (The alternative is finding a genius, which could happen at any time, but we can't count on it. That genius may exist, but may have died yesterday in Darfur or Baghdad.)
If human knowledge is not increasing exponentially (or if we don't find and train the right genius), I don't think the Singularity can happen. Ever.