Brainy: My Newfound Obsession with Artificial Intelligence

A few years ago, I made the startling observation that I am a “hard sci-fi” film buff. Whenever I refer to myself in this way, I always raise eyebrows. What exactly is “hard sci-fi”? I’d taken for granted the meaning of this niche term for fiction grounded in actual science and technology. It is why I hated Prometheus (Ridley Scott, 2012). And as much as I still can’t whole-heartedly embrace Interstellar (Christopher Nolan, 2014), I find it endlessly fascinating. The science, and the implications of its use in manipulating the natural world, are among the reasons why I love Jurassic Park (Steven Spielberg, 1993) so much.

I’m not exactly sure what led me to seek out these thought-provoking narratives about life, history, and time. In short, the nature of existence. Is it because my father, a numbers and all-around science geek, would routinely tell us children that he believes in aliens and a multiverse? “Remember, in a parallel universe, you’re my mother, and I’m your son. In another, you are green, and in another blue. Anything and everything is possible.” In much the same way that people find comfort in believing in god, I find the notion of life on other planets, in other universes, so impossible to ignore or rule out that it is almost certainly true. For me, anyway. In any case, perhaps having this open mind and this desire to gaze up at the stars, to imagine different lives and circumstances, all but ensured my eventual identification with hard sci-fi. I may not understand everything, but my determination to make sense of these narratives defines my relationship to the genre. Hell, you could say that my lifelong obsession with cinema influenced this deep-seated belief that anything and everything is possible. For what is cinema if not the exploration of alternate realities defined by space and time? Cinema is still so young, and we’ve only scraped the surface of what is possible.

Icarus Mission psychologist Searle looks out at the nearby sun, contemplating his existence in Sunshine. Image courtesy of Fox Searchlight Pictures.

My hard sci-fi epiphany may have occurred when, in April 2007, I was one of only a handful of people taking in an afternoon showing of Sunshine (Danny Boyle, 2007) in Lancaster, England. Sitting in the darkened theater, thousands upon thousands of miles from home, and submitting to a film narrative that runs counter to our current fears about global warming, I had a visceral reaction to everything I watched on the big screen. I’d often thought about the time, billions of years from now, when the sun will die, and how its expansion will likely swallow up Earth. It was quite another thing to be confronted by a crew of astronauts charting a course to reignite the dying sun, to bring it back to life so that it may end the terrible Ice Age enveloping all of Earth. The physicist hero Capa can only fulfill his mission by delivering himself along with the payload, reviving the sun in his own death. It seems perfectly logical to me that the film’s screenwriter, Alex Garland, would go on to make one of the best hard sci-fi films about artificial intelligence. I fell hard for his directorial debut Ex Machina, which came out in April of 2015, and it cemented my new obsession with all things artificial intelligence.

Ava contemplates the nature of her existence in Ex Machina. Image courtesy of A24.

Like Garland (and Stanley Kubrick before him), I believe that the next step in human evolution is the moment when we reach the singularity, opening the door to a world where the reasoning of man-made machines supplants that of humankind. In Ex Machina, you root for the android Ava to escape her laboratory/modern home. She is a gothic heroine held captive by her megalomaniacal creator Nathan, and even though she cleverly manipulates and outwits her sympathetic suitor Caleb, leaving him to die on the compound after killing Nathan—even though she is a computer—you relate to and identify with her plight. Ava is the future, and her discovery of the outside world suggests that our future, when it is run by machines, will not be without wonderment. It may be a scary thought that our computers will be in control one day, but we’re already headed in that direction (after all, who among us doesn’t check her phone for messages whenever it dings, like Pavlov’s dog?), and by the time scientists reach the singularity, I will be long gone. That future doesn’t frighten me one byte.

On a high from Ex Machina, I devoured other cultural products about artificial intelligence last year. Chief among them were the novel Speak by Louisa Hall and The Brain with David Eagleman, a six-part documentary series that only touched on A.I. in its last hour. In the former, Hall weaves a compelling, intertwining narrative around five different people from disparate times and places, people directly or indirectly involved in the science of artificial intelligence. She presents one of them, Alan Turing, the inventor of the modern computer, through letters he writes to the mother of his childhood friend Christopher, whom he loved all of his short, tragic life. The Imitation Game (Morten Tyldum, 2014) touches on some of Hall’s themes, and I inevitably pictured Benedict Cumberbatch while reading Turing’s sections of the book, but that prestige picture paled in comparison to Hall’s thought-provoking and evocative language. Here is one of my favorite lines by Hall, writing as Turing, who’s reflecting on the theoretical experiments he was never able to perform with Christopher (Christopher died while they were still boys at school):

… I can only imagine that our brains must grow in similar patterns: one step backwards, added to the present term, resulting in a subsequent term that combines both. Past and present, contained in the future (191).

I thought of Steven Spielberg’s A.I. Artificial Intelligence (2001), too, when reading the book. Another voice in Speak belongs to an inventor of lifelike companion dolls for children that, upon extensive exposure, inadvertently and progressively transform the children into lifeless robots. Interspersed are the memoirs that the dolls’ creator, Chinn, writes from prison, as well as chat transcripts entered as proof that his programming did (or did not) intentionally harm children. Framing each section of the book is a first-person account from one of his dolls, on its way to die in the desert. The bleakness of its fate, its battery dying, its struggle to hold onto language, for that is what it thinks makes it humanlike, reminded me of David, the robot boy in A.I. When I grieve for a fictional humanoid robot—whether on screen or on the page—I must be subconsciously grieving my own mortality.

Kim Suozzi with her cat Mikey. Image courtesy of The New York Times.

That is why I found the story of budding neuroscientist Kim Suozzi so fascinating (not to mention that we bear an almost uncanny resemblance). Recognizing the impossibility of beating cancer (she was twenty-three when she died in 2013), Kim spent the remaining months of her life raising the funds to, essentially, donate her brain to the science of cryonics. She fought alongside her boyfriend to preserve her brain at extremely cold temperatures so that in the future, when the science has finally been developed, her consciousness can be plugged into a computer. In other words, she would reach the kind of singularity that Johnny Depp’s character does in Transcendence (Wally Pfister, 2014)—only without the ability to take over the highly connected digitized world. The New York Times profile of Kim by Amy Harmon is heartbreaking, but it asks a lot of questions—the right questions. When she died, Kim knew that she was taking a gamble. We still don’t know if we will ever be able to simulate our connectomes, the webs of connections in the brain that give each of us our unique consciousness. But isn’t it beautiful to dream of that possibility? I don’t see Kim’s wish as selfish (as in, why does she get to cheat death and become immortal through reviving her brain?). I think it’s inspiring that a young woman would devote her life—however short—to science, to figuring out the mystery of whether or not we can bring a person back to life.

In The Brain, neuroscientist David Eagleman visits the facility where Kim Suozzi’s brain is being preserved in order to highlight the controversial science guiding organizations like the Alcor Life Extension Foundation. (Ted Williams is also famously preserved there.) More so than his comments on artificial intelligence, I savored Eagleman’s distillation of complex concepts, such as identity and reality, and how these socially constructed notions first and foremost exist within the brain. They can get distorted there, too. The Brain also made an alternate reality all too real for me: what might I have become had I continued studying linguistics in college? (I checked out when phonology got too challenging.) Back in the day, I’d imagined being a sociolinguist—I still act like one, to an extent—but with my new fascination with the brain, I know for sure that I would have liked to be a neuroscientist who studies language, memory, and the brain.

In other words, The Brain confirmed what I already believe about life. We are who we are because of what we have in our brains and because of how our brains interact with each other, transcending time and space. That doesn’t mean our brains always work properly, or in the ways that we want them to. Memory is reliably unreliable. Words escape us from time to time. These are but two reasons why I attempt to document my every waking hour, why I write down what I have seen, why I used to write about everything I had seen. I know I cannot store all of that information in my brain. But my brain allows me to create the systems I use to remember, including a coded language. Even so, these records will always be incomplete. There are some things I forget to write down, some things I don’t want to commit to paper for fear that another’s eyes may read my words and know my secrets. I may be knowable through what I think, say, and write, but I will never be known. This is the beauty and cruelty of human consciousness. We’ll never be able to see the world exactly as someone else does. But of all the art forms, cinema comes the closest to achieving empathy.

Read the Montage Series, 2015: A Year in Reflection, from the beginning.


Why I’m Not Seeing The Dark Knight Rises This Weekend

This being a movie blog, I thought it necessary to address the mass shooting that took place Thursday night at a midnight screening of The Dark Knight Rises (Christopher Nolan, 2012) in suburban Denver, Colorado. A lone twenty-four-year-old gunman named James Holmes shot and killed twelve people and wounded at least fifty-eight others, including victims as young as a few months old. In the rush of news updates, these estimates are subject to change, and I suspect we’ll soon learn more about the movie-going victims.*

My nightly ritual consists of watching ABC World News with Diane Sawyer and NBC Nightly News with Brian Williams, and each news program dedicated last night’s episode to coverage of the horrific event and its aftermath. They were both hard to watch for a number of reasons, chief among them the featured amateur cellphone video of blood-soaked people exiting the multiplex and the repetition of terrifying eyewitness accounts. Tears welled up in my eyes, and sometimes I angrily shouted at the TV. Why did you bring your little children to a midnight movie screening? Why this movie in particular? I feel ashamed for so harshly judging people I don’t know personally, and I am thankful that Holmes’s attack didn’t produce even more casualties. I also couldn’t help but wonder: how could his mother, apparently a psychiatric nurse, reportedly say, upon first hearing that her son had been arrested, that the authorities indeed had the right man? Did she know he was capable of such an atrocity and never think to alert anyone that her son was a potential threat to society?

It has been widely reported that Holmes either dyed his hair red or wore a red wig to mimic Heath Ledger’s portrayal of The Joker in The Dark Knight (Nolan, 2008), and that he even announced—to the unassuming crowd watching the movie in Theater 9 before he started gunning people down, and/or to the arresting police officers—that he was The Joker. Again, we won’t know the truth behind these details until investigators have corroborated eyewitness accounts and processed exactly what happened. So it remains unclear what the relationship is between this hotly anticipated movie and Holmes’s intention to massacre people. I agree with Roger Ebert, who wrote yesterday in The New York Times that Holmes more likely perpetrated his deadly actions to garner fame, infamy, or some twisted recognition than to act out a movie-inspired fantasy. Seeing how the TV news media responded, devoting whole programs to “Tragedy in Colorado: Movie Theater Massacre,” makes me cringe, too. They’re just giving him what he wants, and they’re sensationalizing, I thought.

But I know one thing for sure, and it took me a while to make this realization: I won’t be going to see The Dark Knight Rises this weekend, and in fact, I’m not sure when I will feel comfortable going to the theater to do so.

I’m not a big fan of Christopher Nolan’s Batman trilogy; his movies are long, pretentious, and moralizing. However, I had thought I was going to see it because, as I have previously stated, I am interested in what people go to see. I do want to be part of a larger conversation. How could I justify standing on the sidelines, lambasting so-called mainstream audiences’ tastes in movies, if I don’t watch those movies, too, and form my own informed opinions? (The Dark Knight Rises’s first controversy this week involved the movie aggregator site Rotten Tomatoes having to shut down its comments section because fans who hadn’t even seen the film yet were bullying or threatening film critics who published negative reviews.) A dear friend of mine tried unsuccessfully to convince me to go to a midnight screening; I had no desire to see a 164-minute-long movie at that hour in a packed, claustrophobic theater. Besides, I told him, so many special screenings were sold out or nearly sold out, making it difficult to secure tickets. Despite the Colorado tragedy, the movie has grossed over $30 million from midnight screenings alone, though it remains to be seen how its grosses will be made public, since its distributor, Warner Bros., and other movie studios have pledged not to report the numbers out of respect for the victims and their families.

The main reason I’m not going to see the movie is that I think it will be too traumatic an experience. I cannot imagine what the people in Theater 9 went through, but I am certain that I won’t be able to concentrate on the film unspooling on-screen because I will be thinking about how all those innocent people eagerly attended a movie they’d been waiting months—maybe years—to see, at first perplexed that one cinemagoer seemed to be performing a movie-stunt tie-in at the front of the theater, until his true intentions became clear. I echo the film director’s sentiment, released as a public statement: “The movie theatre is my home, and the idea that someone would violate that innocent and hopeful place in such an unbearably savage way is devastating to me.” I have even written a short essay about my love of going to the movies, getting lost in the dark amid celluloid shadows and strangers. The piece is for a humor writing contest, and I have yet to submit it. I’m a little apprehensive to turn it in without mentioning what happened in Colorado, even if my memories of movie-going are overwhelmingly positive—funny, even—and have nothing to do with the violence at the theater.

Truth is, I don’t know when I will be ready to go to the theater to see any movie. It’s all still so raw.

Some have expressed concern that this will negatively impact The Movies (Rebecca Macatee of E! Online is already labeling the newest Batman sequel a “would-be blockbuster,” given what’s transpired). I don’t think most people who have really wanted to see The Dark Knight Rises will stay away. More power to them, I say. We cannot let one crazed man’s deadly attack deter us from doing the things we love. We cannot live our lives in fear, to paraphrase Barry Otto in Strictly Ballroom (Baz Luhrmann, 1992). We should pressure President Obama and his Republican challenger Mitt Romney to address the issue of gun control while on the campaign trail, as their tepid expressions of sorrow and compassion are not enough. (Their track records on the issue are not comforting, either, if we’re looking for change.)

When it comes to Hollywood and cinema more generally, I do hope that studios, producers, and filmmakers reflect on their storytelling practices and recognize that they could make some changes, too, beyond re-editing the trailer for The Dark Knight Rises and yanking TV spots and trailers for it and the upcoming Gangster Squad (Ruben Fleischer, 2012), which features a scene of men firing machine guns through a movie screen at the audience. I am not blaming anyone for what happened in Theater 9 other than James Holmes, but the fact that violence is so permissible in movies, often glamorized or sensationalized, is a cultural problem. Many of us have become anesthetized to graphic representations of violence, accustomed to watching people, buildings, cities, and even the world blow up on-screen. Ninety-nine-point-nine percent of us know that this is not real, but there are those who might fetishize these images and seek to replicate them in the real world, because the consequences of violence are hardly ever the subject of sustained cinematic inquiry. One recent example of this more desirable filmic exploration comes to mind, though: Lynne Ramsay’s stark, impressionistic portrait of a mother coming to terms with the attack her teenage son perpetrated at school in We Need to Talk About Kevin (2011). It is a challenging, beautiful movie, and I guarantee it will stick with you. If more films addressed the visceral reality and destructiveness of violence, perhaps they would remind us all that it is never cool, never something we should wish to emulate.

* The New York Times has just published (circa 10:30 pm on Saturday, July 21st) the names of the twelve victims as well as the first in-depth attempt to get a handle on James Holmes’s character.