Tuesday, September 9, 2008

Knowledge is Power

“‘Plants’ with ‘leaves’ no more efficient than today’s solar cells could out-compete real plants, crowding the biosphere with an inedible foliage. Tough omnivorous ‘bacteria’ could out-compete real bacteria: They could spread like blowing pollen, replicate swiftly, and reduce the biosphere to dust in a matter of days. Dangerous replicators could easily be too tough, small, and rapidly spreading to stop – at least if we make no preparation. We have trouble enough controlling viruses and fruit flies” (10). I chose this passage because Joy is warning that we could create technology we cannot control, a danger known as gray goo. I feel that this problem could be fixed by becoming more powerful and knowledgeable before we make the replicators, or by never making them in the first place.

First off, I feel that Bill Joy is a credible enough person to write this article. He is credited with co-designing three computer processing systems (7). I feel that computers today would not be the same without these systems, such as Java. He feels that learning is an exciting thing, especially mathematics (6). This for some reason makes me feel that he is credible, because he is one of the few who wants to learn, not one who has to learn. For this reason I feel that the problem he has raised is credible.

As the first passage states, we cannot even control viruses and fruit flies. If we as humans could become more knowledgeable about robots, and about science in general, then maybe we could become more capable of handling our technology. For example, in 1945, when we dropped the atomic bomb, many scientists were hesitant but still went forth. “I have felt it myself. The glitter of nuclear weapons. It is irresistible if you come to them as a scientist. To feel it’s there in your hands, to release this energy that fuels the stars, to let it do your bidding. . . gives people an illusion of illimitable power. . .” (13). If scientists could instead use their knowledge and not the power of making a robot, then maybe it could be done. I feel like if the power was put behind us and the use of knowledge was the purpose then they would be made for the right purpose. We need to know more about the world before we do something like that.

Another action we could propose is not to make some of these robots that could destroy the earth. There are certain technologies that I admit are essential for our existence. The gray goo threat in the first passage makes one thing perfectly clear: We cannot afford certain kinds of accidents with replicating assemblers (11). I cannot even imagine the earth becoming nothing because of an accident. It is not worth it to take a whole planet, with a history of billions of years and millions of organisms on it, and have it be gone because someone put the wrong gray goo in a vial. What is the point of greater technology if it is irrelevant to our well-being? I feel like robots are somewhat pointless. Maybe technology that can help the environment is more important. Most technologies these days just make us lazier. Even Joy states that warnings about GNR technologies have not been publicized because there is no profit in it (11). If there are warnings, it should not be worth pursuing yet. The Dalai Lama says that we must understand what it is that makes people really happy, and that it is neither material progress nor the pursuit of power (18). If happiness, the key to what most people want in this life, is what we want, then why do we need to dig into technology that could ultimately end our civilization?

Joy states the problem that replicators could end our world. I feel as though straying away from that technology would be a good idea, because then we would not even be dipping our hands into it. Maybe the power of countries, and the willingness to do anything to control the world, makes people want to build it, but I think that we should wait and do nothing about it until we know more. Knowledge before power is more crucial to the problem than anything.

2 comments:

KaraG said...

Okay, No one commented mine back, so I'm assuming no one was assigned to me..I guess I'll just leave it as it is unless I get a comment before Sat! :(

Adam Johns said...

Your introductory passage is interesting, although it isn't set off very well from the body of the paper - quotes should be in quotation marks, or else indented/set off in some clear way. Your initial formulation of your argument is rather vague - but, on the plus side, it *is* clearly an argument.

Your discussion of Joy's credibility is fine, but the actual language has some problems - you are wordy and maybe too informal. "First off" and "for some reason," for instance, are fine verbally, but don't work well in writing.

Now, let's look at the interesting material. "I feel like if the power was put behind us and the use of knowledge was the purpose then they would be made for the right purpose. We need to know more about the world before we do something like that." Your language here is vague and messy, but the idea is quite interesting. Here's what I take away from it: I think you're arguing that we need to push science a lot farther, at least in dangerous areas, before we think of turning those sciences into practical technologies.

It's a good idea - what you needed to do was introduce it earlier, and think it through in detail. Do things even work that way? How would our knowledge about nanotech advance, for instance, if we don't actually *build* the machines? Your idea is promising but not developed.

The next paragraph is a distraction - rather than developing your idea of "knowledge before power," you shift into another area entirely. This is strictly a tangent, toward duller material and away from more interesting material. Why? Presumably because developing interesting material is *hard* - which it is.

Short version: I love the idea, but you work *around* it instead of through it. Rather than explaining, using one of Joy's example technologies, how this idea would actually work, you go in various other directions. This was potentially excellent material, but you aren't exploring that potential.