“‘Plants’ with ‘leaves’ no more efficient than today’s solar cells could out-compete real plants, crowding the biosphere with an inedible foliage. Tough omnivorous ‘bacteria’ could out-compete real bacteria: They could spread like blowing pollen, replicate swiftly, and reduce the biosphere to dust in a matter of days. Dangerous replicators could easily be too tough, small, and rapidly spreading to stop – at least if we make no preparation. We have trouble enough controlling viruses and fruit flies” (10). I chose this passage because Joy is warning that we could create technology we cannot control, a scenario known as gray goo. I feel that this problem could be avoided by becoming more knowledgeable before we build such replicators, or by never building them in the first place.
First off, I feel that Bill Joy is a credible enough person to write this article. He is credited with co-designing three computer processing systems (7). I feel that computers today would not be the same without these systems, such as Java. He feels that learning is an exciting thing, especially mathematics (6). This makes me feel he is credible because he is one of the few who wants to learn rather than has to learn. For these reasons, I feel that the problem he has raised is credible.
As the first passage states, we cannot even control viruses and fruit flies. If we as humans could become more knowledgeable about robotics, and about science in general, then maybe we could become more capable of handling such technology. For example, in 1945 when we dropped the atomic bomb, many scientists were hesitant but still went forward. “I have felt it myself. The glitter of nuclear weapons. It is irresistible if you come to them as a scientist. To feel it’s there in your hands, to release this energy that fuels the stars, to let it do your bidding. . . gives people an illusion of illimitable power. . .” (13). If scientists could be driven by knowledge rather than by the power of making a robot, then maybe it could be done safely. I feel that if the pursuit of power were set aside and knowledge were the purpose, then these technologies would be made for the right reasons. We need to know more about the world before we do something like that.
Another action we could propose would be to not build some of these robots that could destroy the earth. There are certain technologies that I admit are essential for our existence, but the gray goo threat in the first passage makes one thing perfectly clear: We cannot afford certain kinds of accidents with replicating assemblers (11). I cannot even imagine the earth becoming nothing because of an accident. It is not worth it to take a whole planet, with a history of billions of years and millions of species on it, and have it be gone because someone put the wrong gray goo in a vial. What is the point of greater technology that is irrelevant? I feel that robots are somewhat pointless; maybe technology that can help the environment is more important. Most technologies these days only make us lazier. Joy even states that warnings about GNR technologies have not been publicized because there is no profit in doing so (11). If a technology comes with such warnings, it is not yet worth pursuing. The Dalai Lama says that we must understand what it is that makes people truly happy, and it is neither material possessions nor the pursuit of power (18). If happiness, the key to what most people want in this life, is what we seek, then why do we need to dig into technology that could ultimately end our civilization?
Joy states the problem that replicators could end our world. I feel that staying away from this technology would be a good idea, because then we would not even be dipping our hands into it. Maybe the ambition of countries and the willingness to do anything to control the world makes people want to build it, but I think we should wait, or do nothing at all, until we know more. Knowledge before power is more crucial to this problem than anything.