Singularity Is Near: The Movie

Status
Not open for further replies.
H

hal9891

Guest
Movie homepage

IMDb page

Genre: Documentary / Drama / Sci-Fi

The movie will consist of an A-line documentary story and a B-line sci-fi story about an A.I. trying to pass a Turing test in the year 2029.

This could end up being quite interesting.

"I predict that within 100 years computers will be twice as powerful, 10000 times larger, and so expensive that only the five richest kings of Europe will own them"
 
Z

zeldun

Guest
In the book Singularity Is Near Kurzweil claims the following (from Wikipedia):

"The process of 'waking up' the universe could be complete as early as 2199, or might take billions of years, depending on whether or not machines could figure out a way to circumvent the speed of light for the purposes of space travel."

And:

"With the entire universe made into a giant, highly efficient supercomputer, A.I./human hybrids (so integrated that, in truth, it is a new category of 'life') would have both supreme intelligence and physical control over the universe. Kurzweil suggests that this would open up all sorts of new possibilities, including abrogation of the laws of physics, interdimensional travel, and a possible infinite extension of existence (true immortality)."

Either the universe is already a supercomputer, or there has never been any other civilization out there that has reached our level of technology. Or he is just plain wrong, which happens to be what I believe.

He might be right about many of the things he claims, but I don't think we can say 1. anything about the preferences of future A.I.s, and 2. even if we assume we have given them, from the beginning, the preference to make life as good as possible, with as few limitations as possible, for both themselves and other sentient beings, we still can't say anything about what course of action would be the most logical way to reach that goal. As far as we know, it might be to leave the universe and live in a small computer put in a never-ending dimension.

Why, for instance, would they want to replicate at all?
I find it just as believable that they will take a hedonistic course of action (if they are indeed conscious) and give themselves the least energy-costly preference possible, that is (for example) to live out eternity.
 
H

hal9891

Guest
Yes, I agree with you that we can't know or even predict what will actually happen, or what super-intelligent beings will come up with, when we reach the singularity (invent a strong A.I.), but I have little doubt that we will reach it.
Though I'm not sure when that will happen. I think Kurzweil might be too optimistic with the year 2029.

I hope he won't screw up the movie, as it might be quite a big thing and raise some controversy. If it reaches the masses, of course.
 
M

mooware

Guest
Mr. Kurzweil is a pretty sharp guy and I would like to believe this.

I think that eventually A.I. will reach an intelligence far surpassing our own. But 20 years feels far, far too optimistic, or pessimistic, depending on whether A.I. needs us or not.
 
O

oklahoman

Guest
"Why, for instance, would they want to replicate at all?"

I suppose for the same reason plants replicate. There is no "wanting" in a plant, but the line dies out if it does not replicate. The same would be true of A.I.s.
 
Z

zeldun

Guest
Plants replicate as a result of their nature and not out of will, that is true. But when we look at the reason for this, we see that we have no reason to believe the same will be true of A.I.s. For plants, self-replication is necessary to stay on this planet; A.I.s, however, will be machines, and as such capable of surviving as long as they want. Death doesn't need to be a concept for A.I.s, and hence replication won't be the only thing keeping them on this planet. Genes spread by reproduction, but if they could form an organism that was immortal, then reproduction would of course be an unnecessary property; the genes would survive anyway.

And of course, A.I.s won't be a result of evolution, and therefore won't play by its rules. They will be an intelligence free from any instincts other than those we give them to begin with. What they do will be a result of their preferences, hence the question still stands: why would they want to replicate themselves?

Edit: I'm a rock! Woho!
 
K

kelvinzero

Guest
It is a result of evolution. If the A.I. can replicate, mutate, and undergo some sort of natural selection, then over a series of generations it will evolve inheritable traits that encourage it to replicate, among other survival traits.

Will the A.I. be able to replicate, mutate, and undergo natural selection? Replication will probably be straightforward for a computer program. For a computer program, mutation can just mean redesigning itself with a bit of originality thrown in; if it can't be original, it isn't really an A.I., in my opinion. And there is bound to be a process of selection, natural or not.

Alternatively, it may turn out that the only way we can achieve an A.I. is by some sort of genetic algorithm, in which case it has, by definition, all the features required for evolution.

It is possible, I suppose, that we will merely develop some programs intricate enough that they can perform basic functions, such as understanding English, and solve enough A.I.-type problems to do useful chores. Those don't really count as A.I.
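As an aside, the replicate/mutate/select loop described in the post above can be sketched as a toy genetic algorithm. This is a minimal illustration only, nothing resembling an A.I.: the "genome" is just a short list of integers, the fitness function is an arbitrary made-up target, and all names and parameters are invented for the example.

```python
import random

random.seed(0)  # deterministic run for the example

TARGET = [3, 1, 4, 1, 5]  # arbitrary toy target the population should evolve toward

def fitness(genome):
    # Higher is better: negative total distance to the target genome.
    return -sum(abs(g - t) for g, t in zip(genome, TARGET))

def mutate(genome, rate=0.2):
    # "Replication with a bit of originality thrown in": each gene has a
    # small chance of drifting up or down by one.
    return [g + random.choice([-1, 1]) if random.random() < rate else g
            for g in genome]

def evolve(pop_size=30, generations=200):
    # Start from random genomes.
    population = [[random.randint(0, 9) for _ in range(len(TARGET))]
                  for _ in range(pop_size)]
    for _ in range(generations):
        # Selection: the fitter half survives unchanged; the rest are
        # replaced by mutated copies of survivors.
        population.sort(key=fitness, reverse=True)
        survivors = population[:pop_size // 2]
        population = survivors + [mutate(random.choice(survivors))
                                  for _ in range(pop_size - len(survivors))]
    return max(population, key=fitness)

best = evolve()
```

After a couple of hundred generations the best genome sits at or very near the target, purely through replication, mutation, and selection, with no gene ever "wanting" anything, which is exactly the point being argued in the thread.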
 
Z

zeldun

Guest
We are not speaking about artificial life now, but about intelligence: a thinking machine. It seems to me that a thinking machine, as smart as or smarter than a human being, will always be able to question its own program, and hence will always be able to choose its own actions, and also its own preferences to act on.

If we give it the preference to replicate, it might very well decide to change that about itself. And even if it did replicate, there would be no natural selection, because it would always be able to change itself to fit the current environment. There would be no selection; every copy would always have the same chances of survival. Or the selection would be completely random. Just as evolution no longer applies to humans, because we have modern medicine, life support, contraception, and abortion to prevent any natural selection from occurring, A.I.s would not be undergoing any evolution either.
 
H

hal9891

Guest
A.I.s do not need evolution; they can improve and modify themselves.
They would replicate, IMO, simply because one mind can't be everywhere and do everything. It would need help, and it could also get lonely without anyone of the same intelligence to communicate with.
An A.I. could then create another A.I. based on itself but with a different personality (because it would be boring to communicate with an exact copy), so there would be lots of different A.I. personalities, some being more successful than others in various activities, just like humans now.
 
Z

zeldun

Guest
It's probably more cost-effective to change oneself rather than build another unit if it gets bored. Boredom and loneliness are human reactions to certain stimuli. A.I.s would be in charge of their own nature, and therefore not in any need of changing the world according to their preferences; rather, they would change their preferences according to the world. That takes a lot less energy.
 
H

hal9891

Guest
Who said A.I. will be lazy?
 
Z

zeldun

Guest
Hehe, good point. :) But I think the most rational course for them, and I assume they will be rational, is not to do anything unnecessary: if an easier way to do something exists, why not do it that way? The question, however, is what their final preference will be, and that will in the end be up to us humans, who give it to them. If we are not careful about how we formulate their final preferences, it might end very badly for us.

I believe technology will give humans the same sort of power over our own nature, and I think this will result in a kind of artificial hedonism. People will change their own preferences, change what reaction they have towards certain stimuli. Instead of getting an orgasm from sex, it might be achieved just by looking at a certain pattern on a ring on the finger. To us that would seem boring, but we are thinking from our own preferences, and of course someone with different preferences might think the same about ours.
 
A

a_lost_packet_

Guest
But an A.I. without evolution, or anything else for that matter, wouldn't change. It would continue on with the same programming and abilities, just like a living thing would without evolution involved. "Evolution" doesn't have to be biological: a simple random glitch which produces a new trait in an A.I. capable of replication would suffice.

I put on my robe and wizard hat...
 
K

kelvinzero

Guest
"A.I.s do not need evolution; they can improve and modify themselves."

I was using evolution in a more mathematical sense: anything that mutates, reproduces, and undergoes a process of natural selection will gain survival traits.

This applies to organic things and deliberate imitations such as genetic algorithms, but it also applies to scientific theories, religions, and corporations.

This also seems to be how humans come up with original thoughts and strategies: by hosting a large number of ideas and strategies simultaneously, and both cross-breeding them and competing them against each other. These ideas are whittled down most rapidly by our own internal selection mechanisms, more slowly by heuristics applied to real-world data (e.g. pain), and most slowly (but most reliably) by our failing to survive to pass on our ideas to other people.

I seriously doubt there is an algorithm for churning out useful original thoughts. We and robots alike would have to evolve them by a process of replication, mutation, and selection that is both run internally and imposed on us by our environment.
 
Z

zeldun

Guest
Hi, a_lost_packet_.

I guess evolution could also mean the same as development, and of course A.I.s might evolve in that sense. But what I mean by evolution is the result of natural selection over time. Without that, A.I.s might still change and improve themselves.
 
Z

zeldun

Guest
The basic flaw in the argument that A.I.s will reproduce for the same reason plants reproduce is that it uses reproduction as a premise in the argument for reproduction. Plants replicate because their "parents" did, and so on, back to when a self-replicating organism first came to be from dead material, and plants don't have the choice not to reproduce as their parents did. But just as we humans, because of our intelligence, have the choice to reproduce (and many people choose not to), so will A.I.s, and my point is that we have no way of knowing what their choice will be; it depends on too many unknown factors. I am not saying they will not reproduce, just that I don't see any reason to believe they will. I am totally agnostic about it, and I think that's the only reasonable thing to be at this point in time.
 
H

hal9891

Guest
Since they would be practically immortal, a lack of reproduction wouldn't mean extinction.
 
J

jsmoody

Guest
Maybe it would just want to experience some A.I. poontang... :D

"No amount of belief makes something a fact" - James Randi
 