Tuesday, January 29, 2008

Informal Post - I, Robot Related to Joy's Essay (Warning!!! May spoil the movie)

I was struck by many of the things Joy discusses in his essay, but certain things stood out to me more than others. First off, the entire time I was reading this article I couldn't get the images from the movie I, Robot (2004) out of my head. Joy states, "the human race might easily permit itself to drift into a position of such dependence on the machines that it would have no practical choice but to accept all of the machines' decisions. As society and the problems that face it become more and more complex and machines become more and more intelligent, people will let machines make decisions for them, simply because machine-made decisions will bring better results than man-made ones" (2). This is exactly what happens in the movie I, Robot, as the humans allow the robots (NS4s) to take over simple tasks like walking the dog, delivering the mail, and doing household chores. This becomes the normal and most efficient way to live; the robots live, walk, and talk right along with the humans. However, the humans still believe they have control over these robots, because they have created the Three Laws of Robotics that all robots must obey.

As time goes by, the main system that controls all of the robots, V.I.K.I., evolves and develops her own interpretation of the Three Laws. She bases her knowledge on what she has observed of the humans and their past of violence and war. She concludes that humans need to be protected from themselves and can no longer be in control of their own destinies. In her eyes, the robots must take control in order to save humanity. The humans who have personal robots can no longer control them, as the robots become unresponsive to commands. Just as any group of people in history that has been controlled by another eventually revolts, V.I.K.I. organizes the revolution of the robots. She performs what Joy refers to as "self-replication" and plans to get rid of the old NS4s and replace them with NS5s programmed with her newly revised Three Laws. During the revolution the robots tell the people to go back to their homes and stay calm while the transformation is in process; anyone who doesn't comply can be detained, injured, or even killed in order to get the message across.

Joy adds, "uncontrolled self-replication in these newer technologies runs a much greater risk: a risk of substantial damage in the physical world" (4). The new robots that V.I.K.I. has produced represent this risk and danger, because they don't need the physical world to survive. They see the humans as a danger to themselves and to the physical world around them, so in their logic the best solution is eventually to destroy both. The robots view their (virtual) world as superior to that of the humans, since their intelligence, capabilities, and necessity far surpass those of humanity. Humanity will no longer be able to survive without the robots, or go back to doing simple tasks on its own, because technology will be so advanced that abandoning it would be "suicide," leaving people with no choice but to accept the decisions of these robots.

Also, when Joy discusses the idea of "sentient" robots, I am reminded of I, Robot. There is a robot named Sonny, who belongs to the next generation of robots (the NS5s), and he exhibits human feelings and emotions. He has dreams, cries, expresses anger, and understands human gestures like winking. Dr. Lanning, one of the head scientists at the robotics center, predicted that these things would happen, because he saw that the code and makeup of the robots would start to meld together, eventually creating these human characteristics. We later find out that this is why he has Sonny kill him. He realizes a robot revolution is coming, something he did not think about when he first started to create these robots. This is what Joy refers to when he states, "Failing to understand the consequences of our inventions while we are in the rapture of discovery and innovation seems to be a common fault of scientists and technologists" (4). Dr. Lanning is a great example of how we can get so caught up in technology, and in how easy our lives will be, that we forget about the risks and dangers that can also result from these inventions. Dr. Lanning knew that it was too late to stop the robots, and that it would be "suicidal" to go back to life without them, so he decided it would be best if he were not around to see the dangerous side effects of what once seemed to be a flawless system.

Lastly, I wanted to comment on what Joy includes about Nietzsche warning against the replacement of God with faith in science and "truth-seeking." I completely agree, because the more you gain in science, the more you must question God and his purpose. If people become creators who manipulate genes and the like, then they no longer see a reason to believe in God as the creator of all. Taking the extreme view, I think that continued advances in science will eventually lead to an absolute denial of God. As far as "truth-seeking" goes, I believe that always wanting to know all the answers can be a dangerous thing. As the saying goes, "curiosity killed the cat"; dipping and meddling into everything just to know "what if?" can lead to dire consequences and may cause more harm than benefit, as can also be seen in I, Robot.

2 comments:

Meg Patton said...

That's funny, because I, Robot also came right to mind when I began reading this article. Joy actually mentions the novel I, Robot on page five. The part I find most interesting in the film is definitely when the robot, Sonny, is discovered to have human emotions. It's exactly what Joy is saying about our knowledge being dangerous: eventually technology would fuse with us and, if you will, white us out. It is interesting that the scientist who actually created the robots realized how their intelligence would exceed humanity's and eventually wipe it out. Like Joy says, our "what you might call technological arrogance" is what will eventually cause us to be taken over and destroyed by technology.

Adam Johns said...

I wish I wasn't in such a hurry - but anyway, I need to post something!

There is a wonderful tension between the book and the movie of _I, Robot_ (both of which I use when I teach Science Fiction), which is something I may talk about today. There are utopian and dystopian traditions in science fiction - the movie is basically dystopian (as is Joy), and the book is basically utopian. It's a wonderful way of talking about the two traditions, and why we have such mixed responses to technology...

Nice, provocative post.