Instapundit and Centerfield have blogged about the singularity. To save you the actual effort of clicking: the singularity is the point where advances in technology occur so fast that they exceed humans' ability to understand or predict them. The idea gained ground in 1993 with a paper by noted sci-fi author Vernor Vinge, who believes it is inevitable. In fact, according to singularity proponents, it could be happening right now.
Nonsense.
While I firmly believe that developments in technology and our understanding of the world are accelerating, I am incredibly unconvinced that the singularity will ever happen. If it does, I am in Glenn's camp: super intelligence does not mean a malevolent force, despite the premise of "Colossus: The Forbin Project" (and the new sci-fi action thriller "Stealth," which runs on the same premise). In fact, one could make a stronger argument that super intelligence would be a benevolent force.
My disbelief centers on two points: the nature of human thought and the capacity of humans to learn (a point made by Jon Kay at Centerfield).
Being smart, or super smart, does not make you human. The human thought process involves not only raw computing power (IBM could not even attempt to model human thought until Blue Gene's 22.8 teraflops, roughly 22,000 times your home PC), but emotion and intuition. So-called "fuzzy logic" was created because Boolean logic was not smart enough to operate a refrigerator. The point being that the "raw computing power" in the human brain is only partially involved in thought and decisions. I am not yet convinced that computers can be programmed for abstract thought or imagination; if they can be, I think it will take capability well beyond current computing. (What is beyond petaflops? Exaflops.) I am convinced that computers will continue to solve known problems very fast.
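The fuzzy-logic point is worth making concrete. Here is a minimal sketch in Python (the temperatures, threshold, and membership function are invented for illustration, not taken from any real appliance) of why Boolean logic falls short for something as mundane as a refrigerator:

```python
# Contrasting Boolean and fuzzy control for a refrigerator compressor.
# All numbers here are made up for illustration.

def boolean_compressor(temp_c: float) -> float:
    """Boolean logic: the compressor is either fully on or fully off,
    flipping at a single hard threshold."""
    return 1.0 if temp_c > 4.0 else 0.0

def fuzzy_compressor(temp_c: float) -> float:
    """Fuzzy logic: 'too warm' is a matter of degree. Output ramps
    smoothly from 0 (at 2 C or below) to 1 (at 6 C or above)."""
    warmth = (temp_c - 2.0) / (6.0 - 2.0)  # membership in "too warm"
    return min(1.0, max(0.0, warmth))

for t in (1.0, 3.0, 4.5, 7.0):
    print(f"{t:4.1f} C  boolean={boolean_compressor(t):.1f}  "
          f"fuzzy={fuzzy_compressor(t):.2f}")
```

The Boolean version slams the compressor fully on or off the instant the temperature crosses one threshold; the fuzzy version treats "too warm" as a graded judgment and responds proportionally. That kind of graded, intuition-like reasoning is exactly what plain true/false logic lacks.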
Second, I think an underlying assumption in Vinge's notion is that the human capacity to learn and grow is less than the computer's. This strikes me as particularly dubious. I see humans' capacity increasing at a pace that keeps us well ahead of our silicon creations. Our brains and our capacity to learn are THE reason we are still around and are the dominant creature on this planet. While I am sure that there will be individuals without the capacity to comprehend and use new technology, I am equally sure that there will be plenty who will. In addition, the flip side of learning is teaching. I may not grasp every advance mankind has made, but I trust that the rocket scientists who do will teach me. Some IBM computer may well be smarter than I am, but the singularity holds that ALL of mankind will be left in the dust. I don't think so. Mankind is a giant, distributed supercomputer: while I am pushing the boundaries of my expertise (and spreading the good to anyone who wants it), there are billions of others doing the same. I think I will continue to bet on that team.
Further, it is unclear that computer learning will maintain its initial pace. To use an analogy: watch how fast a small child learns. It is amazing. But should you extrapolate that rate of learning into adulthood? I think computing is in its small-child phase, and it is far from obvious that the present pace of learning can be sustained.
Of course, simply because I do not yet believe it does not mean that it will not happen. But I do not understand what is fundamentally different about a brain made of silicon that would allow it to perform so much better than, and so differently from, a brain made of gray matter. And billions of gray-matter brains at that.