One of the questions that came up from the first film we watched was whether we would have a moral obligation to artificial intelligence if it were to come into existence. The two films I have watched since then on the subject of A.I. that have made me truly feel something for their characters are the ones I watched this week: Her and A.I.: Artificial Intelligence. Both, I believe, do an amazing job of provoking emotion in the viewer toward the characters and their situations. Samantha and David have been my favorite characters so far, and they have really put me in a dilemma about how I feel toward my first answer to the moral obligation question.
What I think I can say about the issue is that if AI were to come into existence, my moral obligation to it would fall somewhere between the obligation I feel toward my pets and the obligation I feel toward other human beings. Since AI would have an easier time communicating with me through language than my pets do, I believe my sympathy would rise, but by my definition of human, I still do not see them as such.
The interesting and new thing from A.I.: Artificial Intelligence was the way the makers of David judged whether he was a successful human child: by his ability to long to become organic and to reunite with his mother. His creator called it a sort of test when David came to Manhattan in search of Pinocchio's blue fairy. His mother, Monica, was the ultimate goal of his journey, but he was so stuck on being "real" for her that it made me think about what was actually not real about him. The only thing I could come up with was the fact that he was man-made and not God-made. Thus he existed for thousands of years, still able to function, as the organic beings died.
The Turing test deems a bot artificial intelligence when it can trick enough people into thinking it is human, but in this film David was deemed intelligent when he formed a goal, longed for it, searched for it, and did everything he could to reach it. Those are two very different definitions of when AI is fully achieved. If I had to argue what "reaching" AI would mean, I think I would take more of the route of David's maker. One main part of our humanity is based on longing and what you do with those longings, not on whether you are convincing over a typed conversation.
A.I. does encourage the viewer to develop more sympathetic feelings toward its characters. The boy's longing for love was definitely human to me, but he could not and had no need to eat or grow, so that made him obviously not real to me. Good points, McKenzie!