Phil’s Day Off. Alternatively Known As: What Did I Just Watch?
SPOILERS for episodes 1-8 of Westworld under the cut.
Well, originally I had planned for my Phil’s Day Off to be solely finishing up episodes 4 and 5 of Westworld. However, I fell asleep when I got home from work, so instead I finished episodes 4 through 8 the next day. Before we get into the metaphysics of it, and dear lord there is a lot of that, I just want to state my feelings during every plot twist.
I went into the next few episodes with the three questions from my first post.
Addressing Question 1: How would Martin Heidegger’s theory of being vs Being accommodate artificial intelligence?
Bernard: What is the difference between my pain and yours? Between you and me?
Ford: This was the very question that consumed Arnold, filled him with guilt, eventually drove him mad. The answer always seemed obvious to me. There is no threshold that makes us greater than the sum of our parts, no inflection point at which we become fully alive. We can’t define consciousness because consciousness does not exist.
To answer this question, I looked further into Heidegger’s theory of Being and came upon the term Dasein. Dasein is a German term, loosely translated as either “being” or “existence,” that Heidegger invokes to describe the state of Being “unique” to humans. Dasein involves confronting issues of personhood, mortality, and the relationships one has with others while simultaneously being ultimately alone. This, then, would be the inflection point at which you are fully “alive,” contrary to Ford’s claim that there is none.
This is exactly what the evolving Hosts do in Westworld; they confront issues of personhood. Maeve’s storyline deals directly with the matter, as does Dolores’s, and most recently Bernard’s.
The madame of Westworld’s brothel, Maeve, begins to retain memories of moments when she is meant to be “deactivated,” of her past life as a mother, and of the technicians who work on her as a Host. The carefully constructed reality around her begins to unravel. While initially deeply distressed, she later comes to desire freedom from her artificial shackles. Maeve’s character is a direct challenge to members of the audience who believe the Hosts are not people, that they don’t have a right to life and liberty.
Dolores, in some ways the main character of the story, serves as our introduction to the AI. Not only is the fabric of her reality slowly revealing itself, but time and space may be deceiving her as well. She alternates between showing us just how similar the Hosts are to us, and just how different.
Bernard flips the entire chess board over with the reveal that the character we thought was a human scientist is a Host, one so advanced he is indistinguishable from a human. You would think this would settle the matter of the AI’s personhood for good, but the way Ford demonstrates his control over Bernard is chilling.
Addressing Question 2: Can Artificial Intelligence develop “Self” in the philosophical sense? Do they exist in the same way humans do?
The Self is a kind of fiction, for hosts and humans alike. It’s a story we tell ourselves.
Conveniently enough for me, a possible answer was given by the show itself. The co-creator of the Hosts, Ford, views the self in terms of narrative theory. When Bernard, a character we were under the impression was human, turns out to be an AI, he is distraught. He questions his creator on the validity of his own emotions, on whether or not he has a self or a consciousness in the same way a human does. If you asked the very creator of the AI whether they have selves, he would say yes. They are characters in a story, playing specific roles. Yet, he says, humans are the same way: characters in a narrative.
Somehow, though, this only brought up more questions. Why do we assume that because the AI are but characters in a narrative, repeating the same thing every day, they are not human? Ford argues that we are little different. How many of us seriously question the nature of our reality? How can we claim that because the AI don’t, they are fundamentally different from us? According to the bicameral mind theory frequently referenced in Westworld, human beings themselves didn’t develop the ability to question such things until 4000 years ago.
Addressing Question 3: If Artificial Intelligence can have a self, can be Beings rather than beings, do they therefore have the same right to life and liberty as we do?
Maeve’s storyline follows this question closely. The first of the AI to have the nature of her reality broken apart and to fully understand it, she seeks to live in the “real” world. Let’s pretend for a moment that the Hosts are not robots, but humans: brainwashed, forced to play characters in certain roles, atrocities committed upon them, murdered every once in a while, subject to the whims of visitors, their memories wiped daily. Wouldn’t there be a universal consensus that this is immoral? If the AI are equivalent to humans, if there is no threshold separating them from us or they have passed that threshold, then this qualifies as a massive rights violation.
Yet, as much as I have attempted to rationally explain why I view AI as potentially having the ability to be Beings or have selves, it is the emotion the characters in Westworld have provoked in me that is the predominant motivation behind my defense of their sentience. On a fundamental level, the way the Hosts are treated feels wrong. Even lesser AI, such as the Canadian project hitchBOT, provoke that empathy. In the paper Mr. Jackson distributed, I found my answer as to why.
The problem with torturing a robot, in other words, has nothing to do with what a robot is, and everything to do with what we fear most in ourselves.
One day, in the not-so-distant future, I suspect that the type of artificial intelligence in Westworld will become a reality. When that day comes, the philosophy of whether or not they are due rights will become more than theoretical.