Friday, November 24, 2017

Human Avatars?

Humanity is one of those things that no one quite thinks about. We all know the difference between us and something like a turtle, but what is it that truly makes us human? In Brian Christian's book, The Most Human Human, he makes a few points about our humanity that I agreed with the most. The first one is how well we understand the syntax and meaning behind a sentence. How do you program that into a robot? There are endless ways to form one sentence, including the stressing of certain words. Our language might not be unique, but I would like to think the ability to rearrange a sentence can be. Even among ourselves, this still escapes us sometimes. How many times have you talked to someone from a foreign country and been absolutely confused? It isn't that they aren't speaking the same language; it's the slang and syntax. You can program language into a robot, but can you make it understand different accents and different sentence syntaxes depending on the country? I'm sure you could with a lot of hard work, but I would like to think this would cause a lot of problems.

Second, Brian Christian makes a point to include statelessness, which is defined as "each reply depends only on the current query, without any knowledge of the history of the conversation required to formulate the reply." Yet, not all of our conversations are like this. A.I. programmers would have to find a way to permanently store all the conversations the robot has with a human. In theory, this sounds ridiculously easy to do, but think about how many daily interactions we all have. Not all of them are remarkable enough to need remembering, but remembering them is expected of us. If someone is asked, "How are you doing?" and they respond, "Stressed out about my test," the next time you see them, it is courteous to ask how the test went and if the stress calmed down. Like I said, it sounds easy, but I am also thinking in terms of computers. I'm no computer science major, and I'm not going to act like it, but I do own a laptop. If I save too much on it, it starts to throw a fit and refuses to run properly. I think this is a huge flaw for potential A.I.s.
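Christian's idea of statelessness can actually be sketched in a few lines of code. This is purely illustrative (the function names, canned replies, and the "test" example are my own invention, not from the book): a stateless bot answers each query in isolation, while a bot with memory can do the courteous follow-up described above.

```python
# Illustrative sketch, not a real chatbot: contrasting a "stateless" reply,
# which depends only on the current query, with one that keeps a history
# of the conversation and can follow up on earlier turns.

def stateless_reply(query):
    # No memory: the same query always produces the same canned reply.
    if "how are you" in query.lower():
        return "Fine, thanks!"
    return "Interesting."

class StatefulBot:
    def __init__(self):
        self.history = []  # every past turn gets stored

    def reply(self, query):
        # Check memory for something worth following up on, the way a
        # courteous person would remember a friend's stressful test.
        if any("stressed out about my test" in past.lower()
               for past in self.history):
            response = "How did that test go? Less stressed now?"
        elif "how are you" in query.lower():
            response = "Fine, thanks!"
        else:
            response = "Interesting."
        self.history.append(query)
        return response

bot = StatefulBot()
bot.reply("I'm stressed out about my test.")
print(bot.reply("Hi again!"))  # the bot follows up on the earlier worry
```

Even this toy version hints at the storage problem: the `history` list only ever grows, which is exactly the "saving too much" issue a real A.I. would face at a vastly larger scale.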

Lastly, I want to talk about the human ability to empathize. In The Most Human Human, there is a specific subtitle called "Scaling Therapy." We think of therapy as personal, but apparently Richard Bandler used a different type. He never once got intimate with patients, yet his patients were successful anyway. While I like to think that is just a fluke, I can definitely see how it works for some people. Not everyone wants to sit in an office and have someone give them advice. It's pointless, especially if they already know in their hearts what they want to do. However, I want to point out that this isn't for everyone; we can't say one therapeutic technique is conclusive. It brings up an interesting question, but I would like to argue that this intimacy is what makes us unique. We go to therapy offices sometimes to feel that connection. We put all our secrets out on the table for someone to look at us and say, "I understand." It's not an intimate saying, but it's enough to make us feel valid and calm down. How do you program that into a robot? Can you teach it to love someone unconditionally beyond knowing what words to say and how to say them?

Eventually maybe, but until that happens, I'm sticking with the argument that syntax, statelessness, and empathy are some of our unique traits for humanity.

Now I want to briefly talk about Avatar. I would like to think that they are real people. They are able to think, feel, and understand the syntax of language. They might be strange creatures, but I refuse to believe the military guy when he asks, "What does it feel like to betray your humanity?" Because in my eyes, he didn't. The Avatar people had all the makings of a human; they were just stuck in a different-looking body. Also, if you haven't seen this movie, it's definitely worth the 3 hours you spend on it!

2 comments:

  1. "The Avatar people had all the makings of a human, but they were just stuck in a different-looking body." I'm curious about this. So by Avatar people you mean the Na'vi? Because the "Avatar people" in my eyes implies Jake. I feel like the implication of this sentence is that the goal is to be human, but I don't really know that it is. I would say the Na'vi are advanced creatures, as are humans, and have similar traits in terms of emotions and thinking. So if that is what you mean, then I understand. But I think Jake did betray a lot of the humans, because the Na'vi are not human. It was the right thing to do, though, because the humans were causing a lot of harm. Although he betrayed many of his own species, it showed his humanity because it demonstrated his capacity for compassion and his ability to distinguish right from wrong.

  2. Destiny, I really, really enjoyed your post! I definitely think the arrangement of sentences separates humans from artificial technology, as computers may not understand simple things that are part of everyday life for humans, such as the word "it." One thing Brian Christian says that I think really sums up the difference between humans and artificial technology is "real world experiences." There is no possible way for computers to have "real world experiences." They have databases of information, but they don't know what it means to fail or to succeed. They do not "experience" anything. I think the movie Avatar displays the importance of real-world experiences, as it is the Na'vi's faith and their connection with nature that saves them in the end, and those are human experiences.

