One thing I found interesting in Do Androids Dream of Electric Sheep? was that the people left on Earth used pets as a kind of status symbol. In today's society, having the latest gadgets serves a similar role. For example, someone who always gets the new iPhone probably has some disposable income. This contrasts with the story, where technology has progressed so far that everyone (if they emigrate) has access to an android, so something we might take for granted today, like a pet, becomes the new status symbol simply because of its rarity. Still on the topic of pets, I find it odd that something like an ostrich would be desirable as one. I think it would be very difficult to interact with, and I personally would rather have an animal like a dog or cat even if it wasn't seen as very impressive by my neighbors.
Response #3: Do Androids Dream of Electric Sheep? questions what it means to be human. Some of the androids even believe that they are human; if they believe it, should they be included? They also cannot live on Earth. Is that due to government regulation of jobs? I feel like it's supposed to correlate with the xenophobia found in certain areas. It's also interesting that the population on Earth is supposed to be android-free, and no one is supposed to know otherwise, yet the police, who are humans on Earth, do know. Marcuse is pretty interesting, especially when he states, "The gov't of advancing industrial societies can maintain and secure itself only when it succeeds in mobilizing, organizing, and exploiting the technology, science and mech productivity..." (Chapter 1). I feel like the government of the future doesn't have much control over the Nexus-6 unit; if one got into the world police organization, then they would really have a hard time controlling what was once just a military robot.
Starting out with Do Androids Dream of Electric Sheep?, I can see a clear connection between this story and a certain pattern going on today. In the novel, the police force has to find androids that have illegally immigrated and "retire" them, killing them once they have been determined to be robots. The only method they have for detecting a robot is the Voigt-Kampff test, and even then it is doubted whether the test is accurate. Although it works in the book, the test seems to be on the brink of becoming obsolete once newer robots are developed. This happened earlier in the book's history, when Kampff had to add to the Voigt Scale in order for it to work on higher-level robots. Before that, according to the book, the police didn't even have a proper test. This enforcement problem is sometimes known as "cultural lag" and is prevalent in the twenty-first century. With the popular usage of the internet, cybercrimes went unregulated for a long time until law enforcement was able to hire hackers of its own. Slips in the system can still be seen, such as the PlayStation Network outage of 2011 and even the University of Pittsburgh's bomb threats of last year, which a man from Ireland supposedly organized via e-mail. It is a problem that Philip K. Dick saw decades before these instances, and one that could become even more common as new technology is developed.

I also noticed that the "historical alternatives" presented by Marcuse reminded me of a sort of butterfly effect, just more historical in nature. He examines how history could have been different (within reason and the potential of society) and why our history prevailed. This brings me to a more philosophical question of why our society followed this path. I personally like to believe that this reality is the best path society could have taken, but it is far more likely the result of a combination of beneficial and harmful choices by society.
It reminds me of a computer game I've played called "Civilization V," in which you build your own society in a board-game-like manner. It's like a much more intricate Stratego as you try to gain victory over other civilizations. Although it is just a game, it is interesting to see the effects of the choices you make regarding how to utilize a nation's output and production. You can put all your resources behind technology and science, but then be conquered by a society that put all its resources into military training. However, putting all your resources into military production could leave your people unhappy, causing revolution and destroying your empire from the inside out (simultaneously showing that I make a really bad emperor). It really illustrates the idea of utilizing a society's production, and the point that Marcuse is making about national production at the time of the Cold War.
It's interesting that "Is X human?" is again an underlying question, as it was in the novel Frankenstein. In Frankenstein, however, we're left to determine what humanity is on our own. In Electric Sheep it's been determined for the reader by way of the empathy box and the Voigt-Kampff Empathy Test. But when machines like the Penfield mood organ exist that can make the user feel things like empathy on command, how effective is empathy as a measurement tool? And who's to say that the mood organ can't just give you something other than what you dialed, or make you crave using it like some kind of drug? Who's really in control here?

Another important symbol in the novel is the animals. Since the war, it's considered disgraceful not to have an animal to take care of. On the surface this is another way to gauge a person's empathy, but in reality, which type of animal you own and how well off it is has really become another type of class symbol. My question is, how do they deal with people who don't care about animals? Even in a world where they're so rare, there has to be someone who just isn't an animal person. Is their lack of empathy enough to void their humanity?
One thing that struck me about Do Androids Dream of Electric Sheep? is how people reacted when they heard that a living animal had died. They kept calling it a "waste," like what someone might say when they see someone throw out good food. I understand that in this universe animals are a rarity, and therefore another dead creature is a waste of life. But between that, the constant consulting of one's Sidney's, and the prevalence of electric animals, it almost seems like the idea of animals is what matters most now. People have managed to become detached from real animals while attempting to cling to the idea of them.

A quote from Marcuse that I liked is "... intellectual freedom would mean the restoration of individual thought now absorbed by mass communication and indoctrination..." (Chapter 1). This quote resonated with me because it has merit in today's society. With the hive-mind mentality of trends and consumerism, the lack of individual thought appears all too real much of the time.
For me, the most interesting moment of the first half of "Do Androids Dream of Electric Sheep?" connected with our discussion last week about what makes something human. As early as chapter three, the reader is introduced to Nexus-6 androids that have the ability to believably pretend to be human beings. These androids can mimic almost every human emotion very precisely, except empathy, which "existed only in the human community" (Dick 28). So I think it is a safe assumption that the author is making a statement that he believes the most "human" trait is the ability to have empathy for another being or creature. In this world, having empathy for animals seems to be an important aspect of the society, so it makes me wonder what the author's rhetorical mission with this book is, because any time you write a science-fiction novel about a war-torn, futuristic civilization, it is usually because you want to say something about the current culture.
Reading Do Androids Dream of Electric Sheep?, I am reminded, once again, more of "Why the Future Doesn't Need Us" than anything else. To me, it is equal parts an Orwellian dystopian future and Ray Kurzweil's theory of blended human and mechanical consciousness put into practice. The overwhelming presence of the state as a controlling entity, combined with a forced collectivist morality (Orwell and perhaps Dick's critique of Communism), is overshadowed only by the idea of a computerized approach to humanity. The way that moods can simply be dialed in and selected to suit what any of the characters "wants" to feel smacks eerily of Kurzweil's ideas gone horribly wrong. Indeed, it calls into question the very wisdom of such ideas existing in the first place. Interestingly, Philip K. Dick takes the idea a step further, forcing the reader to confront how technology alters reality. If one's mood can be chosen and does not occur organically, is it real? The feeling is real enough but, at least within the story, does not come from the brain. So is it real at all? Rick provides another example in the book's earliest pages when he introduces his electric sheep. Dick writes, "He wished to god he had a horse, in fact any animal. Owning and maintaining a fraud had a way of gradually demoralizing one" (Dick, 4). The sheep, as electronic beings, exist and therefore are real. However, they are imitation sheep. They're real but at the same time aren't at all.