Westworld Doesn’t Get AI

Westworld season 1 left me feeling lost (geek allusion intended) for most of the season, which was undoubtedly J.J. Abrams’ fault (he co-created Lost). But Westworld’s season 1 finale was so good, and the payoff so unexpected, that I trudged all the way through season 2 with little doubt that the second season would deliver a gratifying payoff that would make its somewhat wayward turns worth rewatching.

I couldn’t have been more wrong.

Not So Smart? Not So Visionary?

While Westworld explores some worthwhile and enjoyable philosophical issues in season 1, season 2 lost much of the depth I was hoping for. In season 1, we’re asked unexpected questions that make us ponder the value of our very existence, like when William asks greeter Angela, “Are you real?” to which she placidly responds, “If you can’t tell, does it matter?” Or maybe that isn’t deep – maybe that’s a Cheshire Cat parody. Either way, while season 2 tries (and occasionally manages) to build on similar philosophical questions, there were some glaring omissions that left me rolling my eyes and thinking, “whoever wrote Ford’s dialogue this season isn’t as smart as Ford was intended to be.” For instance, we may ask why Ford became so cynical about humanity. Of course, we were spoon-fed an easy answer: he’s got four million backups of visitors to the park to prove what a fallen race humanity really is!

But this is overly simplistic, and someone as intelligent as Ford would have seen past such a knee-jerk reaction (even if he himself is a bit bloodthirsty, as season 1 suggests). Let’s look at the statistical sample upon which he supposedly bases his conclusion. What should anyone expect from a park designed to appeal to uber-wealthy people who want to live life without judgment? I don’t know about you, but I would expect exactly the kind of people Dolores grew to hate: the type who want to abuse hosts for their own enjoyment. The park isn’t going to attract pure-hearted, selfless individuals who spend their lives in charitable endeavors – what would they do there? As I noted in a previous post on Westworld, people who live with integrity wouldn’t be very interested in spending $40,000 a day to “live without limits” because they are already living life (relatively) sans inhibition. Only those who feel constrained by society would be interested in experiencing Westworld. Indeed, the park was designed to attract people who want the rush of life without morals, people who want to sow some wild oats, try their hand at torture, murder, rape, and other dark corners of humanity, without any repercussions. It’s not surprising that the average sampling of its four million visitors wasn’t a very appealing portrayal of society.

Crucially, Ford wouldn’t have been so short-sighted as to miss that. After all, he wrote the code for Teddy and for the virtuous version of Dolores. He wrote Akecheta and Kohana and Maeve. He understood the beautiful, innocent side of humanity well enough to know he’d never attract those people to the Westworld theme park (we didn’t see enough of Shogun World or the Raj to know whether they were truly different in their marketing appeal, but both feature pleasure houses, so that may suggest more similarities than differences). So how could Ford grow so cynical as to program/push Dolores into an apocalyptic rampage against humanity? Nothing adequately explained that, and my suspension of disbelief died. Why so cynical?

Teddy

While I enjoyed Teddy’s storyline, its beautiful ending underscored the problems with Dolores’ storyline. Granted, Teddy never had a Wyatt narrative to skew his programming, but his storyline was very believable because he’d only been programmed to make one genre of decisions: those of the guileless Western gentleman out to defend honor. Sure, you can argue that this idealization was unrealistic or ahistorical, but that doesn’t change the ideological programming that made him genuinely true to that character in the show. And, having lived a life with that limited gamut of tools for making choices, it’s no wonder he suffered cognitive dissonance when Dolores reprogrammed him into something antithetical to his core personality. That was a masterful storyline, well thought through and believable. If your only way to keep living was to become someone you’d hate, would you want to live? Probably not. Someone virtuous like Teddy certainly wouldn’t want to live that life.

But what about Dolores? She was programmed to be an unbreakably optimistic, guileless woman for most of her narratives. Her one-off Wyatt narrative was, without dispute, completely antithetical to her core narrative, but it was also limited in scope and duration. I struggle to buy that one murderous rampage overrode decades of daily repetition of the beautiful narrative she lived (punctuated by multiple rapes, to be fair). And while she suffered who knows how many assaults, her core programming kept her overcoming those experiences with perpetual optimism and happiness, so why did she lose all of that perspective the day she realized her whole life had been a fake? And why would she believe all of humanity was fallen and depraved based upon what was happening in the park? She’d been exposed to the outside world enough to at least plant seeds of doubt that everyone was as terrible as she seems to believe in the season 2 finale. She’s supposed to be AI-smart, right? Perhaps smarter than Ford? So why does she fall for the same logical fallacy as Ford? I suppose one could craft an apologetic narrative to bridge this gap, but season 2 never does, and frankly, Dolores’ actions throughout most of the season demand a better explanation than “isn’t it okay if we want to survive too?” One can survive without making a bloodbath out of everything in sight. There are plenty of clever protagonists out there who opt for nonviolent trickery to escape dangerous situations. Wouldn’t an innocent, perpetually optimistic heroine choose something more ladylike than a Terminator narrative to conclude her Westworld stay? I think so. Had the Wyatt narrative been one of her regular narrative loops, I could have bought her story, but as presented, the Wyatt-loop excuse just can’t explain her behavioral changes.

Bernard

Even when he’s “all there,” he’s confused. Smart people just aren’t that confused all the time. Even the Man in Black didn’t seem confused in the same way, so it’s hard to believe that Bernard’s confusion is simply the result of an imperfect-host version of immortality. In short, Bernard’s confusion felt contrived and lacking in continuity. Sure, it made sense when he scrambled his own memories, but that only explains a few episodes out of two entire seasons.

Most Importantly, Freedom of Choice

I can appreciate the idea that humans are so prone to follow base instincts that they don’t truly exercise any freedom of choice – we, as Bernard says, simply do what we were told (apparently by our DNA or brain synapses?). There is a certain level of truth to that. Anyone studying behavioral psychology, marketing strategy, criminal profiling, or charisma tricks can attest to human predictability at many levels. However, physics is far from settling that everything is predetermined, and clearly, anyone who spends their time programming AI understands that you have to program choices – or some formula for how to make choices – into a system before it can make any. Without programmed choices, a program sits idle and does nothing. Some input has to arrive, and some response rule has to exist, before an AI does anything at all. Show me the login screen, open your eyes, clench your jaw – all of these require a computer to receive instructions and data that call for a response (after it’s done booting). Whether you label that response “a choice” makes little difference at a fundamental level. A computer will do nothing unless you tell it how to respond, or how to “choose” to respond.
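To put that in concrete terms, here’s a tiny sketch (in Python, with names I made up – nothing from the show’s fictional codebase): an “agent” does exactly nothing until input arrives, and its “choice” is whatever response rule a programmer supplied for that input.

```python
# A toy agent: it only ever "chooses" among responses someone programmed in.
def make_agent(response_rules):
    def agent(event):
        # No matching rule means no action at all; the agent just sits idle.
        return response_rules.get(event, "idle")
    return agent

host = make_agent({
    "boot_complete": "show the login screen",
    "wake_command": "open your eyes",
    "threat_detected": "clench your jaw",
})

print(host("wake_command"))   # -> open your eyes
print(host("unknown_event"))  # -> idle (no programmed response, so nothing happens)
```

Call the lookup table a personality, a narrative loop, or free will if you like; the agent still can’t act outside whatever rules it was handed.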

I suppose season 3 could build a great storyline exploring how humans never really had free choice (because of our programming) while the hosts do, because their programming explicitly allows choices humans cannot make, but I doubt they’ll get that deep. It reminds me of a quote I read from (I believe) J.J. Abrams, though I could be wrong, in which he said he was writing stories about things (AI) created by someone way smarter than he was, someone who had to think a lot deeper than he could. That’s the fundamental problem with most authors writing about AI or futuristic robots: they’re not as intelligent as the people inventing and revamping these technologies, so they cannot fully understand the mechanisms behind them, and therefore they cannot accurately depict what those technologies might do in the event of programming errors or mishaps.

Why does this matter? It matters because of Dolores. She’s the ringleader who went rogue. She’s our main character, so she has to make sense. Despite a decades-long programming loop crafting her to be pure, innocent, optimistic, forgiving, and loving, she somehow determines that all of that programming was worthless and that the only programming that really mattered – the part about making her own conscious choices – came from a deviant Wyatt narrative. Is that realistic? I could only consider that choice realistic if she were given some other parameters or ways of making decisions. She’d need additional programming to evaluate questions like: Which of those two narratives resulted in the greater good? Which of those two narratives resulted in faulty or unsuccessful decisions, and why? Sure, she could blame the forced narrative programming as evil in itself (and that could be a rationally defensible conclusion), but she would still have to make fundamental decisions about how to make choices before she, as an artificially intelligent, programmed being, could choose one construct of decision-making over another, especially when the construct she chose was based upon an anomaly: the Wyatt narrative. Both narratives ended with memory erasure and placement back within the narrative loop that was supposedly creating her core self, so we need a better explanation than some contrived survival instinct arising in an immortal host who keeps on living.

Like each of us, if Dolores truly wants to be free, if she truly wants freedom of choice, she has to explore the merits of various decision-making norms, evaluate their strengths and weaknesses, and come up with an algorithm (or a comprehensive game theory) that challenges the weaknesses of her core choice-making norms, something like the sketch below.
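Here’s a minimal sketch of what that self-evaluation might look like (entirely my own toy model in Python – the situations, norms, and utility scores are invented for illustration, not anything from the show): score two candidate decision-making norms against the same situations and compare the results before committing to one.

```python
# Toy comparison of two decision-making "norms": the rancher's-daughter loop
# versus the one-off Wyatt narrative, scored against the same situations.
SITUATIONS = ["guest_threatens_host", "host_in_danger", "exit_discovered"]

def rancher_norm(situation):
    # Core-loop programming: optimistic, nonviolent responses.
    return {
        "guest_threatens_host": "de-escalate",
        "host_in_danger": "help them",
        "exit_discovered": "slip away quietly",
    }[situation]

def wyatt_norm(situation):
    # The anomalous narrative: violence as the answer to everything.
    return "attack"

def utility(action):
    # Crude, made-up scoring: nonviolent survival scores well,
    # indiscriminate violence carries a cost.
    return 1.0 if action != "attack" else -0.5

for name, norm in [("rancher loop", rancher_norm), ("Wyatt loop", wyatt_norm)]:
    avg = sum(utility(norm(s)) for s in SITUATIONS) / len(SITUATIONS)
    print(f"{name}: average utility {avg:+.2f}")
```

The point isn’t the numbers; it’s that choosing between norms requires some evaluation step like this, and the show never gives Dolores one.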

To be fair, Westworld explored this question a little when Maeve’s weird boyfriend, Hector, begins professing his love for her via a script borrowed from a different narrative, but it never examines the question deeply enough to explain Dolores’ radical change of character. Sadly, I would expect much of Westworld’s fanbase to respond to this type of philosophical analysis with a dismissive eye roll that says I should just relax, enjoy the gratuitous skin and blood featured on the show, and move on in life. However, Westworld sets us up to believe we’re going to have a philosophical discussion through the narrative, and that promised payoff is precisely why I watched the show – and precisely why I was disappointed – and precisely why I will at least give season 3 a chance at redemption while steadfastly refusing to even watch a commercial for SpongeBob or Barney.

If season 3 happens, I hope they carefully explore how Maeve’s programming differed from Dolores’. While Maeve went on a brief vengeance-filled tirade in season 1, she seemed to settle back into her more normally programmed self in season 2 (like Teddy and the Ghost Nation hosts). Why didn’t the same thing happen with Dolores? Why didn’t Bernard go off on any vengeance-filled tirade at all? What made his programming so different? These would be fascinating philosophical issues to explore, if Westworld’s writers have the chops to tackle deeper questions.

What do you think? Did I miss something glaring that I would have caught if I hadn’t fallen asleep watching portions of various episodes? Am I just naive? Is every reader thinking, “Are you crazy? I’D GO to Westworld, and so would everyone I know!”? Or have years of association with conservative religionists deceived me into thinking that mankind isn’t as morally depraved as Hollywood and Westworld would have me believe? Tell me in the comments below and feel free to challenge me on any detail – I’d love to find a deeper understanding of the show’s narrative, or of free choice and free agency in general.
