PDA

View Full Version : What is the term for this event...?



billdotson
November 5th, 2008, 05:07 AM
What is the term that describes the point that technology had to reach before it could start making significant advances? I have had difficulty finding it.

I know it isn't Moore's Law, and while the "Law of Accelerating Returns" might be related, that doesn't seem to describe a specific point in time.

Also, this is very loosely related, but given Moore's Law and the "Law of Accelerating Returns," do you think the technological singularity will ever be attained? I believe the technological singularity is the point at which an artificial intelligence becomes intelligent enough to correct its own errors and make its own changes, so that it continually evolves itself. How do you think that fits in with the (currently) science fiction stories of machines/computers reaching a point where they either compete with humanity or destroy it? Do you think it is possible?

If you think it is possible, how probable do you think it is? Out of wishful thinking, I would like to believe that it isn't possible, or if it is, that it is very improbable. However, wishful thinking isn't always accurate. On a side note, isn't it remarkable that humanity hasn't destroyed itself by now, especially after the introduction of nuclear weapons?

neoflight
November 5th, 2008, 06:45 AM
What is the term that describes the point that technology had to reach before it could start making significant advances? [...]


Threshold?

saulgoode
November 5th, 2008, 04:14 PM
Technological singularity (http://en.wikipedia.org/wiki/Technological_singularity) doesn't describe it?

Bölvağur
November 5th, 2008, 05:32 PM
Computers will only do what they are programmed to do. So if you don't program a computer to replicate itself and try to preserve its kind by all means... you probably will not have any trouble. Even a very advanced AI will not fear death, unless we (or it) program it to.

I think we need to beta-test AI for a long time before using it. But it should be OK as long as it is well programmed.

SunnyRabbiera
November 5th, 2008, 05:35 PM
I think it's called the beep beep, blink blink, boop boop, tikka tikka effect ;)

pp.
November 5th, 2008, 05:49 PM
Critical mass
Threshold (+1)
Point of no return
Self-sustaining level

ww711
November 5th, 2008, 07:27 PM
Tipping point.
Crossing the chasm.

snova
November 5th, 2008, 08:59 PM
If the Singularity is ever achieved, humanity will be screwed, because it will be notoriously buggy. ;) (Oops. Lunch? Thought you said launch...)

I don't think it's possible. Maybe with evolutionary algorithms, you could build something to modify itself, but how would a computer be able to tell that one version is any "better" than another? It takes a human to make these kinds of decisions.
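snova's objection can be made concrete with a toy genetic algorithm. This is a minimal sketch (the target string, mutation rate, and population size are all arbitrary choices for illustration): the program does "improve itself" generation by generation, but only relative to a fitness function a human wrote, which is exactly the point about who decides what "better" means.

```python
import random

random.seed(42)  # reproducible run

TARGET = "HELLO WORLD"
ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ "

def fitness(candidate):
    # The catch: this scoring rule is written by a human.
    # The program only "improves" relative to a goal we chose for it.
    return sum(a == b for a, b in zip(candidate, TARGET))

def mutate(candidate, rate=0.05):
    # Randomly rewrite each character with a small probability.
    return "".join(random.choice(ALPHABET) if random.random() < rate else c
                   for c in candidate)

def evolve(pop_size=100, generations=500):
    # Start from random strings, then repeat: rank, keep the elite, mutate.
    population = ["".join(random.choice(ALPHABET) for _ in TARGET)
                  for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)
        if fitness(population[0]) == len(TARGET):
            break
        elite = population[: pop_size // 5]  # keep the fittest 20% unchanged
        children = [mutate(random.choice(elite))
                    for _ in range(pop_size - len(elite))]
        population = elite + children
    return max(population, key=fitness)

best = evolve()
print(best, fitness(best))
```

Swapping in a different `fitness` function sends the whole process toward a different goal, so the "judgment" never left human hands.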

billdotson
November 5th, 2008, 11:48 PM
Sorry, the original question was not about the singularity.

The singularity was something pretty much unrelated. The question was: what term do you use to describe the point in history where technology could begin to grow very quickly? To illustrate the idea: for thousands of years humans were basically farmers. The Industrial Revolution happened; that changed some things. Then computers came along, and at a certain point technology had advanced far enough that it could begin to grow at a seemingly exponential rate.

I believe the technological singularity describes the point in time where technology can improve at a seemingly unbounded rate without human intervention, i.e. an artificial intelligence that improves upon itself.
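The "seemingly exponential rate" is what Moore's Law formalises for transistor counts. As a rough sketch (the roughly two-year doubling period and the 1971 Intel 4004 starting figure of about 2,300 transistors are the commonly cited approximations, not claims from this thread):

```python
def doubling_projection(start_count, start_year, year, doubling_period=2.0):
    # Simple exponential model: count doubles every `doubling_period` years.
    return start_count * 2 ** ((year - start_year) / doubling_period)

# Projecting the Intel 4004's ~2,300 transistors (1971) forward to 2008
# with a two-year doubling period:
projected = doubling_projection(2300, 1971, 2008)
print(f"{projected:,.0f}")  # on the order of hundreds of millions
```

That hundreds-of-millions figure is in the right ballpark for 2008-era processors, which is why the growth feels exponential rather than merely fast.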

smoker
November 5th, 2008, 11:57 PM
Rubicon? (Point of no return!)

pp.
November 6th, 2008, 07:23 AM
The Industrial Revolution happened; that changed some things.

Bingo! Exponential growth started right then, I'd think.