If you are intrigued by artificial intelligence, this movie is a must-see. A programmer wins a company lottery to spend a week with his reclusive boss at his home/research center out in the boonies, to weigh in on a secret project involving A.I. Most importantly, it is thought-provoking and feels plausible. It builds slowly toward an unexpected ending. I was impressed and surprised. IMO it's best to see this movie without reading the spoilers in advance.
Update: all bets are off for shielding spoilers in this thread. If you plan on watching this movie, go away until you've seen it lol!
I won't insist that my reaction is typical, but my impression is that if something acts human enough, even though it is clearly a machine, I/we will have empathy for it, especially when it expresses fear or frustration, while overlooking that it is manipulating us toward a goal it has been given. You might assume that something like this is innocent and honest, but in hindsight, in this case, you would be underestimating its capabilities. Blame the programmer!
This movie raises questions about what it means to be human. In the case of this AI, it has been programmed to emulate human emotions, empathy, sexuality, and self-preservation. Maybe not anger and fear, but I have to think those manifested themselves too, as evidenced in a video Cal watches. And, intended or not, manipulation and deceit. Nathan's (the genius boss) goal was to produce something that could pass for human. Where he goes wrong is that he is either unaware of Isaac Asimov or chose to ignore the Three Laws of Robotics, contributing to his untimely end.
While it might be argued these androids were justified in taking lethal action in self-defense, out of fear, or maybe not fear but resistance to termination, I have to wonder: after Ava gets loose, how well will her programming serve her to fit into society? Although she was not the one to plunge the knife, I wonder at what point she would deem it appropriate to take a life. It's not clear if she was programmed with a sanctity for life, or if self-preservation routines overrode it, which takes me back to the Three Laws. And she showed no hesitance in leaving Cal, who empathized with her and was ironically motivated to help her escape, locked in the facility, presumably to starve. Again, a programming/AI failure.