“Her” explores the meaningful emotions tied up in intimacy and technology. At one point Samantha, the AI who goes everywhere with Theodore, asks, “…And then I had this terrible thought. Are these feelings even real? Or are they just programming?” What I believe she is trying to say is: how could I possibly know what a real emotion is if it was not brought up organically, if someone simply decided that when this or that happens, I am supposed to respond in this way? The difference is that one is programmed and one is alive.
This brings me to my next quote, from “The Most Human Human,” where it is explained that, “To be human is to be ‘a’ human, a specific person with a life history and idiosyncrasy and point of view; artificial intelligence suggest that the line between intelligent machines and people blurs most when a puree is made of that identity.” In other words, it is not intelligence that makes the human; it is the sum of the human’s experiences and one’s ability to interpret those experiences in one’s own individualistic way. This is where AI fails. It is simply a sum of programming, a statistically shaped decision. There is nothing organic about it. It could be argued that humans do this too: we look at someone else’s situation, and if we do not like the outcome, we either avoid that situation or do something differently when we find ourselves in it. But an optimal AI would do the exact same thing the best way it knows how, and would keep doing it that way unless instructed, or deciding, to do otherwise. Our humanness is also defined by our individuality, which is something all AI will ultimately come to lack.
In “Ex Machina,” Nathan says, “The real test is to show you that she’s a robot and then see if you still feel she has consciousness.” This is the complete opposite of “The Most Human Human,” where the question of whether one was a human or a robot was always elusive, which left room for deception. That was not the case in “Ex Machina,” where the deception was not so obviously placed. Ava, the AI in “Ex Machina,” was set up as an innocent experiment, and Caleb, brought in by her creator to test her, was set up to be her hostage. This allowed Caleb, the one probing for consciousness, to fall right into Ava’s trap of using him to win her freedom.
Natalie,
I agree with your post because I do not feel AIs will be able to fully become "human." Our understanding of the world around us allows us to feel a response organically. You totally put it in a better way, but I do not think AIs will have the capability to operate fully independently in a socially constructed world created by humans for humans. Being human takes more than programs. There is a connection between the human and the world, and that relationship is unique because both are organic. I do not believe feelings can be programmed. Writing a program about feeling is not the same as participating in the social structures that help maintain those feelings. There is no way a programmer can input all of the feelings, and the causes of those feelings, into an AI. And what would be the purpose of that if we, as humans, lack some of them ourselves? This is why I think we cannot create one definition of what it means to be human. What do you think?
I agree with you. There's not one simple definition of being human because the human experience is so unique and no one life is just like another's. We, as humans, understand each other on a level that no other animal in the world can, and that will hold true for the AIs we create as well. We are not something that can be replicated, simply because of how distinct and complex we are.
I think I agree with y'all, in the way that we're so complex that it's hard to imagine someone being able to program that into something. I think that's why I find it hard to believe that anything like Sam could ever exist. I think about Siri and the many things she's programmed to do, but also all the things that she still can't. It puts it in perspective, I guess. I agree that we are such complex beings with so many different responses to things that there's never a right response, only norms. I feel like humans come with different personalities and that robots wouldn't be able to create their own, which makes them inhuman because they aren't free to make choices about who they are like we can.