What is the term that describes the point that technology had to reach before it could start making significant advances? I have had difficulty finding it.
I know it isn't Moore's Law, and while the "Law of Accelerating Returns" might be related, that law doesn't seem to describe a specific point in time.
Also, on a very loosely related note: given Moore's Law and the "Law of Accelerating Returns," do you think the technological singularity will ever be attained? As I understand it, the technological singularity is the point at which a created artificial intelligence becomes intelligent enough to correct its own errors and make changes to itself, so that it continually evolves on its own. How do you think that fits with the (currently) science-fiction scenario of machines or computers reaching a point where they either compete with humanity or destroy it? Do you think it is possible?
If you think it is possible, how probable do you think it is? Out of wishful thinking, I would like to believe that it isn't possible, or that if it is, it is highly improbable. However, wishful thinking isn't always accurate. On a side note, isn't it remarkable that humanity hasn't destroyed itself by now, especially given the introduction of nuclear weapons?