Fan-Made: Case Studies Inside Film Cultures, from Tarantino to Point Break

Early on in my academic study of film history and theory, I realized that the best way to understand the impact that the Movies have on our lives, to investigate these “popular entertainments,” is to watch what is and once was popular. I’ve seen a great number of films that never interested me as a filmgoer (I’m looking at you, Spider-Man 3), but I feel a professional obligation to see them nevertheless. This doesn’t mean that I see everything. (Who has the time, anyway?) For instance, I draw the line at certain kinds of horror movies, like torture porn or possession flicks. Limits. We all have our limits.

But when I examine popular films (by which I mean unequivocal blockbusters or cult classics), whether I am a self-professed fan or not, I tap into another world. Or at least I try. I want to know all the angles: all the controversies, all the gripes, all the pleasures that audiences have and share with one another. I have to see what all the fuss is about.

There was a lot of fuss about the Movies in 2015. Even though comic book superhero movies, studio tentpoles based on YA literature, and reboots of long-dormant franchises still dominated the box office this year, as they almost always do, to paraphrase New York Times film critics Manohla Dargis and A.O. Scott, the major studios pulled off the unthinkable: they gave us stuff that we largely wanted and liked, and thank god their original flicks with mid-size budgets did well, too. Maybe this means that film isn’t dying.

Christian and Ana are no closer to a business accord than when they started in Fifty Shades of Grey. Image courtesy of Universal Pictures.

As I actively participated in the hullabaloo surrounding the likes of Fifty Shades of Grey (Sam Taylor-Johnson, 2015), Jurassic World (Colin Trevorrow, 2015), and even Star Wars: The Force Awakens (J.J. Abrams, 2015), the whole world of fandom was thrown into sharp relief. “Fan-made,” which generally connotes cultural products made by amateurs in the spirit of or in homage to well-known works, suddenly landed on a much bigger stage, with more money attached, in 2015. Now, I’m not suggesting that J.J. Abrams isn’t a professional—even if I do think he’s famous for aping Spielberg and for re-imagining other creators’ properties. He tried his best with Star Trek; he improved upon George Lucas. Nor do I mean to diminish Taylor-Johnson’s talents; she elevated her source material (the poorly imagined fan-fiction/erotica drivel written by E.L. James) by focusing on the ridiculousness of what ultimately amounted to no more than the protracted business negotiations of a sexual contract between a man and a woman. Unfortunately, Taylor-Johnson won’t be returning for the next installments, and Universal has allowed James, who objected to Taylor-Johnson’s choices, to hand control of the screenplay for Fifty Shades Darker (James Foley, 2017) to her husband, Niall Leonard.

However, while we’re on the topic of credibility, it is worth mentioning again that Trevorrow had only a low-budget romantic comedy to his name (Safety Not Guaranteed, 2012) before Spielberg handed him the keys to the Jurassic Park franchise and World’s estimated $150 million budget. He floundered a few times while promoting his monstrosity, unable to convince us that the relationship between onscreen leads Chris Pratt and Bryce Dallas Howard didn’t reek of sexism, or that the reason women are not called upon to direct giant studio tentpoles is that they simply don’t want to. And to top it all off, Trevorrow delivered a cynical, CGI-laden horror show without any of the thrills, wonder, or charm of the 1993 original. Can you tell that I am a huge Jurassic Park fan?

Can you believe Claire and Owen end up together in Jurassic World? Image courtesy of Universal Pictures.

It is well established that I am not a fan of Star Wars. But I knew that I was going to see the biggest film of the year in the theater. For a while, I just didn’t know when. Originally, I decided that I would wait a couple of weeks, allow the crowds to thin out. Then I decided that, in order to fully immerse myself in the fan culture, I had to see it opening weekend. I had my heart set on seeing filmgoers dressed as their favorite characters, maybe even turned away because they forgot to read the theater’s weapons policy and misguidedly brought that plastic lightsaber from home. I attended a sold-out show on the Sunday morning of its opening weekend. There were no Chewbaccas or Luke Skywalkers in the audience. Hell, we didn’t even have to stand in line before entering the auditorium. There were no hoots or hollers when the film franchise’s logo flashed across the screen. But I had tears in my eyes then, because I knew that I was sharing an experience with a larger, more enthusiastic community of film fans, even if they weren’t sitting in that darkened room with me. (They went at 8 pm Thursday night, right?) All in all, though, it was kind of like seeing any other movie.

Star Wars: The Force Awakens is predicated on the idea that familiarity will sell. That is, it will fill nostalgic hearts and minds and also sell a shit ton of toys. I also understand that most diehard fans of the epic space fantasy series resent mastermind George Lucas’s three prequels, not only for introducing the abomination that is Jar Jar Binks but also for boring audiences to tears. (Full disclosure: I’ve never seen the last two prequels, inelegantly referred to as Episodes II and III.) So it seems only logical that a Star Wars superfan like J.J. Abrams would be able to bring back for his fellow fans what I imagine is the wonder and excitement of the early films. As I like to say, “Star Wars: The Force Awakens is the best movie in the franchise, but that isn’t saying much.” It is a loving pastiche of the original trilogy, only it is well made. Its racially and sexually diverse cast is new and more than welcome, especially since the unequivocal lead is a resourceful young woman named Rey, who, throughout her (mis)adventures with former stormtrooper Finn, father figure Han Solo, and furry sidekick Chewbacca, gradually learns the source of all her magical abilities. (It’s the Force, duh.) Star Wars: The Force Awakens may be the ultimate fan movie experience that everyone wanted this year or any year, for that matter (just look at how its box-office haul continues to grow and grow, beating all kinds of records), but it left me with nothing more than a newfound interest in why it is so important and life-defining to so many people.

OK, I didn’t go Thursday night, but this is the kind of fan experience I would have liked to have had, even as a Star Wars anti-fan. Image courtesy of Orlando Business Journal.

Instead, I received the superfan experience that I had hoped to witness at Star Wars while attending a special presentation of The Hateful Eight in 70mm. Though I initially balked at the price for a ticket to the film’s limited Roadshow Edition ($20!), I reasoned that the promise of receiving a souvenir program and watching the shadow and light show of actual celluloid—and of a rare, large format, no less—unspooling through a projector had enough value for me. Well, that, and because I wanted to see Quentin Tarantino’s latest. His cinema of indulgence, as I like to think of it, is an acquired taste, but I love how he wears his ecstatic cinephilia on his sleeve. In the case of what is billed as “the eighth film by Quentin Tarantino,” this indulgence extends to amplifying the moviegoing experience for spectators to a new extreme, even for him.

Previously, Tarantino and best friend Robert Rodriguez put on a Grindhouse program in 2007, double billing their unapologetically trashy B-movies Death Proof and Planet Terror, respectively. Just as with Grindhouse, the auteur and his co-conspirators (historically, the Weinstein brothers Bob and Harvey) have injected a film history lesson about bygone exhibition practices back into contemporary pop culture, reminding today’s audiences that going to the Movies used to be a special, spectacular event. The limited Roadshow Edition of The Hateful Eight, complete with an instrumental overture, a twelve-minute intermission, a handful of minutes not included on the digital prints of the film, and an earlier release date, subverts current film presentation trends such as surcharging tickets for movies screened in 3D and IMAX formats. You could even make the argument that the real star of the picture was the tangible film itself. Theaters had to be retrofitted with the right technology to screen 70mm, and transporting the heavy reels of film also proved a herculean task (the film is three hours long, mind you). Just read Adam Witmer’s account of what it is like to run the unfamiliar platter system of the 70mm projector at movie theaters in Los Angeles, with Tarantino sitting in the audience, to boot. It is mighty thrilling stuff.

Two of The Hateful Eight, being… hateful. Image courtesy of The Weinstein Company.

I enjoyed The Hateful Eight as a film story, right up until the end, anyway. But what I will most appreciate about it is the experience that I had going to the Movies on this occasion. Unlike at Star Wars, I had to wait in line to be let into the theater showing The Hateful Eight. Before the presentation began, I watched loving couples, movie nerd guys, and families with teenage or young adult children snap photos of themselves holding up the souvenir program. It was clear that I was a part of something big, something made for fans of Tarantino and for fans of cinema. I was glad that I had plunked down the $20 to attend a kind of film event that hadn’t been staged in fifty years. Would I do it again? Well, not every film gets or deserves this treatment, so that’s a moot point.

The plethora of reboots, remakes, and re-imaginings of popular films—or, in the case of The Hateful Eight, the reconstruction of 1950s and ’60s film exhibition practices—not only allowed fan culture to come to the fore in 2015, it damn near took control of our moviegoing habits. They were everywhere, and more are coming to the small screen. Netflix will drop all episodes of its original series Fuller House next month, and fans of the family sitcom have already proposed new (sinister) ways of looking at the story of DJ Tanner raising her own kids with the help of friends and family in San Francisco. Twin Peaks is now definitely returning, this time to the cable channel Showtime. These TV shows help prove that “fan-made” doesn’t just mean a low-budget, quirky reinterpretation of known properties by pop culture consumers. It also means “for the fans.”

Returning to the realm of the Big Screen, I found myself going to movies this year that I never would have imagined wanting to see if not for the perception that they would be special opportunities for me to participate in fan culture. Star Wars: The Force Awakens was one of them, but so was Mad Max: Fury Road (George Miller, 2015). Now, maybe it was because I had begun to appreciate the action film in all of its tense glory through repeated and ecstatic viewings of Speed (Jan de Bont, 1994)—more on that in part four—that I had wanted to see what one of my favorite film critics, Bilge Ebiri, had dubbed “the Sistine Chapel of action filmmaking.”

If I had ever seen the three original films by George Miller—1979’s Mad Max, 1981’s Mad Max 2: The Road Warrior, and 1985’s Mad Max Beyond Thunderdome—I didn’t remember them. They mostly resonated with me through their influence on my childhood favorites Tank Girl (Rachel Talalay, 1995) and Demolition Man (Marco Brambilla, 1993). In any case, I loved Fury Road. All at once, it was a recycling bin filled with iconography from every corner of cinema, refashioning elements of the modern vampire myth and Tod Browning’s Freaks (1932) in the process, but it also felt so incredibly fresh. I had never seen a setting like that, simultaneously warm and inviting but also austere and unforgiving. Still, before its release, I never could have predicted that Fury Road, a frenetic road war movie with a preponderance of supposed practical effects and real stuntwork, would go on to top so many critics associations’ lists of the best films from 2015, including that of the National Board of Review. And a nomination for Best Picture? Who would have thunk it, indeed?

Donnie accepts that he’s a Creed, but he resembles a Balboa. Image courtesy of Warner Bros.

I’m not one for sports movies. I rented Southpaw (Antoine Fuqua, 2015) out of boredom and quickly lost interest. However, I rushed at the chance to see Creed (Ryan Coogler, 2015) in the theater, finding it my economic, political, and social obligation to support minority filmmakers. It didn’t matter that the only Rocky movie I had ever seen was the fourth installment in the franchise. On second thought, it probably helped that I had seen Drago bludgeon Apollo Creed to death in the ring in Rocky IV (Sylvester Stallone, 1985). For Creed is about a young black boxer’s coming to terms with his identity as the illegitimate son of the late world heavyweight champion. Aside from the stellar performances—especially by lead Michael B. Jordan—and an amazing single take that approximates what a real-life boxing match is like, I loved the call-backs to the original film, snippets that I recognized because I am a pop culture junkie and know Rocky iconography without ever having seen the movie. I loved the early scene where Adonis “Donnie” Johnson shadowboxes his father, taking Rocky’s place in one of their bouts, footage of which Donnie projects onto a wall, streaming the video from YouTube. Later, his running through the street while neighborhood kids on bikes roll alongside him reminded me of Rocky’s triumphal climb up the steps to the Philadelphia Museum of Art. What an exhilarating cinematic moment; it may have been like what film audiences experienced in 1976. I don’t remember how enthusiastic the crowd was when I saw Creed, but I couldn’t stop grinning from how well co-writer/director Coogler had rebooted, remade, and re-imagined a cultural touchstone that had run out of gas in recent years, how he had made it relevant to today’s audiences. With every day bringing us news of another unarmed African American being gunned down by excessive police force, Creed is a celebration of a strong black body, a multifaceted character with a complex inner life. 
In other words, it is a reminder that Black Lives Matter and are full of underestimated and untapped potential. Shame the Academy couldn’t see it.

But not everything produced with a strong fanbase in mind succeeded financially or critically. No one really cared to see Terminator Genisys (Alan Taylor, 2015), probably turned off by its confusing story. Is it a sequel, a prequel, or what? And the remake of Kathryn Bigelow’s 1991 cult classic about a group of bank robbers who spend most days catching some waves off the Los Angeles coast, Point Break, crashed and burned. Like many people who grew up loving the campy original, I was at first hostile to the idea that Warner Bros. was going to distribute a remake of my beloved romance between Johnny Utah and Bodhi. But I learned that it is possible to appreciate both versions. In fact, it is possible to watch them both at the same time.

Those were some good times: the original Point Break. Image courtesy of Twentieth Century Fox.

Having seen the trailer a couple of times, I was intrigued by how the filmmakers (including director/cinematographer Ericson Core and screenwriter Kurt Wimmer) had made a case for a new Point Break in 2015. It’s a Point Break set within the world of extreme sports, a picture about the forces of nature and economic inequality. In this version, Johnny Utah is an FBI cadet who, based on his previous experience as a poly-athlete (I’d never heard that term before!), hypothesizes that a series of crimes performed through gravity-defying stunts on separate continents is the work of the same daring team. They’re chasing what he calls the Ozaki 8, a series of physically demanding stunts that bring one closer to Nature. In other words, this legendary philosophy (its progenitor died while attempting the third challenge) is kind of like The Force: it is meant to do good. But the group, led by Bodhi (who else?), commits criminal acts in order to give back, including hijacking millions of dollars being transported by a plane. Releasing the bills miles high to the Mexican villagers below, they also accomplish their goal of strategically falling through the sky and opening their parachutes inside a cave, effectively going from above to below the earth’s surface in one fell swoop. This is not your childhood’s Point Break. In addition to highlighting what was wrong with the original (the surfer gang wasn’t a band of Robin Hoods), the film is a showcase for the striking photography of beautiful natural landscapes and the real stunts performed by professional athletes that litter the film.

C’mon, Bodhi, why don’t you take off your shirt, too? You know you wanna… Image courtesy of Warner Bros.

I watched the new Point Break with the original, so ingrained in my memory, playing at the same time in the back of my mind. I could giddily anticipate some gestures and exchanges, such as the moment when Utah fires his gun into the air in a blaze of bullets after just having it trained on Bodhi, thereby allowing his friend/object of desire to get away before the feds arrive. I was the only one in the theater who yelped when she saw James Le Gros cameo as an FBI director (Roach lives!). Despite these call-backs to the original, I can assure you that this Point Break is its own campy thing. It is less a remake and more a re-imagining. And I couldn’t help thinking that an early scene set in a dilapidated Parisian train station (if memory serves) is the closest either film comes to shooting a love scene between the men. Here, Bodhi and his gang hang out, fighting each other for no apparent reason. Although couched as a test of Utah’s character and mettle, the fisticuffs between him and Bodhi signal a love and brutality that bind them together. I just hope that in twenty-four years, if they even wait that long to remake Point Break, Bodhi and Utah consummate this desire to turn the other into himself. To fuck, as it were.

Minnesota state trooper Lou Solverson (center) confronts Gerhardt scion Dodd in Fargo. Image courtesy of FX Networks.

However, the most immersive and rewarding fan experience that I had in 2015—and which carried into 2016—didn’t even involve going to the movie theater. I became obsessed with the FX original series Fargo, created by Noah Hawley and inspired by the 1996 film of the same name written and directed by Joel and Ethan Coen. I initially eschewed the first season of the mock true crime anthology series because Billy Bob Thornton starred. I hold a grudge against the man for having won an Oscar for Best Adapted Screenplay the year that John Hodge’s script for Trainspotting (Danny Boyle, 1996) was nominated in the same category. (More on that film in part four.) However, I had read that the show was amazing, and when I spotted the first season on DVD at my public library, I seized the opportunity to see what all the fuss surrounding Fargo was about.

Set in 2006, it follows Molly Solverson (Allison Tolman in a stunning debut), a sheriff’s deputy in a small Minnesota town who is the only one who can see what is really going on: perennial schlemiel Lester Nygaard (Martin Freeman, putting on his best north Midwestern accent) is in cahoots with the mysterious assassin Lorne Malvo (a charismatic Thornton). Malvo’s not-quite-solicited murder of Lester’s high school bully sets off a dangerous and absurd chain of events, transforming Lester from a mild-mannered underachiever into a successful insurance salesman with a murderous streak. A suitcase buried in the snow even figures prominently in a second narrative thread concerning Malvo’s manipulation of a grocery store king (Oliver Platt) who hired him to find his blackmailer. That reminds me: I really ought to check out Kumiko, the Treasure Hunter (David Zellner, 2014).

Anyway, I fell under Fargo’s spell immediately. I devoured episodes, reveling in the show’s intricate plotting, nuanced performances, and references not just to the Coens’ film but their whole cinematic universe. I finished in time to watch the second season as it aired, but I waited until my DVR had recorded all ten episodes before diving in. I wanted to go at my own (delayed but faster) pace.

For the second outing, Noah Hawley and his new writers’ room set the story in 1979, during the so-called Sioux Falls Massacre, which Molly’s retired sheriff of a father (Keith Carradine) referenced on a regular basis throughout season one. Going in, I already knew that at least two characters would survive: Molly, now played as a young girl by Raven Stewart, and her father Lou (played as a young state trooper by Patrick Wilson). All bets were off regarding everyone and everything else. The second season is more ambitious in style, story, and setting, incorporating a Midwestern turf war between the Gerhardts, a German-American crime family in Fargo, and a bigger, more streamlined operation in Kansas City that wishes to absorb the former’s drug distribution business. Peggy Blumquist (Kirsten Dunst), a Minnesota beautician with a dream, accidentally runs over the youngest Gerhardt brother, Rye (Kieran Culkin, who knows a thing or two about family dynasties himself), as he flees the scene of his triple homicide inside a remote diner. Peggy enlists the help of her dim-witted but well-meaning husband, the apprentice butcher Ed (Jesse Plemons), to get rid of Rye’s body. A call-back to the memorable woodchipper scene in the film Fargo ensues, as Ed disposes of the body the only way he knows how: with a meat grinder.


Ed prepares Rye Gerhardt for the woodchipper (er, meat grinder) in Fargo. Image courtesy of FX Networks.

Although the characters and storylines are different between the film and each season of the TV show, a cottage industry exists in which viewers spot references to the film in the new series. Originally, this activity maddened Adam Sternbergh, novelist and contributing editor of New York magazine, whose favorite film is Fargo. Writing for Vulture, he recounts the process of coming to terms with the TV show, whose announcement in 2014 made him feel “something between doubt and existential despair,” by being “able to let go and watch the show in the spirit in which it perhaps was always meant to be watched.” The widening of the show’s scope in season two to include references to the larger Coen “mythology” has influenced Sternbergh to see Fargo as “the ultimate tribute” to the filmmakers, continuing:

The show accepts as a given that the Coens haven’t just created a distinctive visual style, or a stable of recognizable character types, or a set of consistent thematic concerns: They’ve created all those things, with such richness and abundance that their films now qualify as a genre unto themselves. The Coens may have started out making noirs, or Westerns, or comedies, but now they indisputably make Coen Brothers films. Their work has become a stand-alone genre that exists to be referenced, caricatured, borrowed, even shamelessly strip-mined. And it’s rich enough to inspire not just a spinoff, but an expertly executed ongoing televisual homage.

My favorite reference in season two to the Coen Brothers’ filmography comes at the end of the seventh episode. With the eldest Gerhardt brother in his possession, Ed Blumquist phones low-level KC mob enforcer Mike Milligan (a transfixing Bokeem Woodbine) to make a deal: he’ll give him Dodd (Jeffrey Donovan) in exchange for help in getting the Gerhardts off his back. The song “Just Dropped In (To See What Condition My Condition Was In),” made famous by the Dude’s dream sequence in The Big Lebowski (1998), plays out the scene before the end credits roll. But rather than lift the Kenny Rogers and the First Edition recording heard in the film (Mickey Newbury wrote the song in 1967), Hawley and Co. do something extra geeky: they put on an anachronistic, funky cover of the song by the band White Denim. I’d never heard of this musical group before, but I can only imagine that they probably first heard the song as I did in 1998: while watching The Big Lebowski. In this way, Hawley and his collaborators have taken their Coen fandom to new intertextual heights. Like White Denim, Hawley and his colleagues have taken a text (almost) exclusively associated with the Coen Brothers film genre, to use Sternbergh’s taxonomy, and created something new. Placing the cover of the song inside the playful homage that is Fargo the TV series emphasizes the fan culture from which both the cover song and the TV program were born and which they continue to stimulate.

Read the Montage Series, 2015: A Year in Reflection, from the beginning.


Brainy: My Newfound Obsession with Artificial Intelligence

A few years ago, I made the startling observation that I am a “hard sci-fi” film buff. Whenever I refer to myself in this way, I always raise eyebrows. What exactly is “hard sci-fi”? I’d taken for granted the meaning of this niche term for fiction grounded in actual science and technology. It is why I hated Prometheus (Ridley Scott, 2012). And as much as I still can’t whole-heartedly embrace Interstellar (Christopher Nolan, 2014), I find it endlessly fascinating. The science, and the implications of its use in manipulating the natural world, are among the reasons why I love Jurassic Park (Steven Spielberg, 1993) so much.

I’m not exactly sure what led me to seek out these thought-provoking narratives about life, history, and time. In short, the nature of existence. Is it because my father, a numbers and all-around science geek, would routinely tell us children that he believes in aliens and a multiverse? “Remember, in a parallel universe, you’re my mother, and I’m your son. In another, you are green, and in another blue. Anything and everything is possible.” In much the same way that people find comfort in believing in god, I find the notion of life on other planets, in other universes, so impossible to ignore or rule out that it is almost certainly true. For me, anyway. In any case, perhaps having this open mind and this desire to gaze up at the stars, to imagine different lives and circumstances, all but ensured my eventual identification with hard sci-fi. I may not understand everything, but my determination to make sense of these narratives defines my relationship to the genre. Hell, you could say that my lifelong obsession with cinema influenced this deep-seated belief that anything and everything is possible. For what is cinema if not the exploration of alternate realities defined by space and time? Cinema is still so young, and we’ve only scraped the surface of what is possible.

Icarus Mission psychologist Searle looks out at the nearby sun, contemplating his existence in Sunshine. Image courtesy of Fox Searchlight.

My hard sci-fi epiphany may have occurred when, in April 2007, I was one of only a handful of people taking in an afternoon showing of Sunshine (Danny Boyle, 2007) in Lancaster, England. Sitting in the darkened theater, thousands upon thousands of miles from home, and submitting to a film narrative that runs counter to our current fears about global warming, I had a visceral reaction to everything I watched on the big screen. I’d often thought about the time when the sun will die, over a billion years from now, and how its gaseous explosion will likely swallow up Earth. It was quite another thing to be confronted by a crew of astronauts charting a course to detonate a bomb inside the dying sun, to bring it back to life so that it may end the terrible Ice Age enveloping all of Earth. The physicist hero Capa can only successfully fulfill his mission by delivering himself with the payload, in the end reviving the sun in death. It seems perfectly logical to me that the film’s screenwriter, Alex Garland, would then go on to make one of the best hard sci-fi films about artificial intelligence. I fell hard for his directorial debut Ex Machina, which came out in April of 2015, and it cemented my new obsession with all things artificial intelligence.

Ava contemplates the nature of her existence in Ex Machina. Image courtesy of A24.

Like Garland (and Stanley Kubrick before him), I believe that the next step in human evolution is the moment when we reach the singularity, opening the door to a world where the reasoning of man-made machines supplants that of humankind. In Ex Machina, you root for the android Ava to escape her laboratory/modern home. She is a gothic heroine held captive by her megalomaniacal creator Nathan, and even though she cleverly manipulates and outwits her sympathetic suitor Caleb, leaving him to die on the compound after killing Nathan—even though she is a computer—you relate to and identify with her plight. Ava is the future, and her discovery of the outside world suggests that our future, when it is run by machines, will not be without wonderment. It may be a scary thought that our computers will be in control one day, but we’re already headed in that direction (after all, who checks her phone for messages whenever it dings, like Pavlov’s dog?), and by the time scientists reach the singularity, I will be long gone. That future doesn’t frighten me one bit (or byte).

On a high from Ex Machina, I devoured other cultural products about artificial intelligence last year. Chief among them were the novel Speak by Louisa Hall and The Brain with David Eagleman, a six-part documentary series that only touched on A.I. in its last hour. In the former, Hall weaves a compelling intertwining narrative around five different people from disparate times and places, people directly or indirectly involved in the science of artificial intelligence. She presents one of them, Alan Turing, the inventor of the modern computer, through letters he writes to the mother of his childhood friend Christopher, whom he loved all of his short, tragic life. The Imitation Game (Morten Tyldum, 2014) touches on some of Hall’s themes, and I inevitably pictured Benedict Cumberbatch while reading Turing’s sections of the book, but that prestige picture paled in comparison to Hall’s thought-provoking and evocative language. Here is one of my favorite lines by Hall, writing as Turing, who’s reflecting on the theoretical experiments he was never able to perform with Christopher (because he died while they were still boys at school):

… I can only imagine that our brains must grow in similar patterns: one step backwards, added to the present term, resulting in a subsequent term that combines both. Past and present, contained in the future (191)

I thought of Steven Spielberg’s A.I. Artificial Intelligence (2001), too, when reading the book. Another voice in Speak belongs to Chinn, an inventor of lifelike companion dolls for children; with extensive exposure, the dolls inadvertently and progressively transform the children into lifeless robots. Interspersed are the memoirs that Chinn writes from prison as well as chat transcripts entered as proof that his programming did (or did not) intentionally harm children. Framing each section of the book is a first-person account from one of his dolls, on its way to die in the desert. The bleakness of its fate, its battery dying, its struggle to hold onto language (for that is what it thinks makes it humanlike) reminded me of David, the robot boy in A.I. When I grieve for a fictional humanoid robot, whether on screen or on the page, I must be subconsciously grieving my own mortality.

Kim Suozzi with her cat Mikey. Image courtesy of The New York Times.

That is why I found the story of budding neuroscientist Kim Suozzi so fascinating (not to mention that we bear an almost uncanny resemblance to each other). Recognizing the impossibility of beating cancer (she was twenty-three when she died in 2013), Kim spent the remaining months of her life raising the funds to, essentially, donate her brain to the science of cryonics. She fought alongside her boyfriend to preserve her brain at extremely cold temperatures so that in the future, when the science has finally been developed, her consciousness can be plugged into a computer. In other words, she would reach the kind of singularity that Johnny Depp’s character does in Transcendence (Wally Pfister, 2014)—only without the ability to take over the highly connected, digitized world. The New York Times profile of Kim by Amy Harmon is heartbreaking, but it asks a lot of questions—the right questions. Kim knew that she was making a gamble. We still don’t know if we will ever be able to simulate our connectomes, the webs of connections in the brain that give each of us our unique consciousness. But isn’t it beautiful to dream of that possibility? I don’t see Kim’s wish as selfish (as in, why does she get to cheat death and become immortal through reviving her brain?). I think it’s inspiring that a young woman would devote her life—however short—to science, to figuring out the mystery of whether or not we can bring a person back to life.

In The Brain, neuroscientist David Eagleman happens to visit the facility where Kim Suozzi’s brain is being preserved in order to highlight the controversial science guiding organizations like the Alcor Life Extension Foundation. (Ted Williams is also, uniquely, interred there.) More than his comments on artificial intelligence, I savored Eagleman’s distillation of complex concepts, such as identity and reality, and how these socially constructed notions first and foremost exist within the brain. They can get distorted there, too. The Brain also made an alternate reality all too real for me: what might I have become had I continued studying linguistics in college? (I checked out when phonology got too challenging.) Back in the day, I’d imagined being a sociolinguist—I still act like one, to an extent—but with my new fascination with the brain, I know for sure that I would have liked to be a neuroscientist who studies language, memory, and the brain.

In other words, The Brain confirmed what I already believe about life. We are who we are because of what we have in our brains and because of how our brains interact with each other, transcending time and space. That doesn’t mean our brains always work properly, or in the ways that we want them to. Memory is reliably unreliable. Words escape us from time to time. These are but two reasons why I attempt to document my every waking hour, why I write down what I have seen, why I used to write about everything I have seen. I know I cannot store all of that information in my brain. But my brain allows me to create the systems I use to remember, including a coded language. Even so, these records will always be incomplete. There are some things I forget to write down, some things I don’t want to commit to paper for fear that another’s eyes may read my words and learn my secrets. I may be knowable through what I think, say, and write, but I will never be known. This is the beauty and the cruelty of human consciousness. We’ll never be able to see the world exactly as someone else does. But of all the art forms, cinema comes closest to achieving empathy.

Read the Montage Series, 2015: A Year in Reflection, from the beginning.

Man, “You Should Be Dancing”: Memorable “WTF” Movie Moments

There are shots and scenes in films that are designed to take your breath away. Sometimes it’s the gorgeous cinematography, dazzling special effects, or a character’s sweeping romantic gesture that does the trick. The filmmakers’ choices, when properly executed, generally advance the narrative and enhance the overall movie-going experience. Think: any scene from Darren Aronofsky’s The Fountain (2006), with Clint Mansell’s score pushing the spectator through the heavens, or the moment Drs. Alan Grant and Ellie Sattler first glimpse the Brachiosaurus on their way into Jurassic Park (Steven Spielberg, 1993). These scenes are memorable because they are beautiful, intense, imaginative, and poignant.

But what about those scenes that, seemingly out of the blue, disrupt a film’s serious tone? Whether driven by camp, satire, or irony, these scenes are usually shocking and hilarious. I bet each of us has our own collection of these filmic moments. I know that my dad, for one, enjoys it whenever a character is surprisingly killed in the middle of a scene, such as when a shark jumps out of the water and eats Samuel L. Jackson after he gives a rousing, survivalist speech to the members of his team in Deep Blue Sea (Renny Harlin, 1999). However, my collection of favorite “what the fuck?” movie moments revolves around, well, men dancing.

Before I share my top five, I need to clarify the criteria by which these dances make the cut. None is from a musical (that’s why the dancing is so jarring for the viewer), but a song, sung live or reproduced through the character’s sound system or radio, does play a part in each case. In all but two instances, the actor spontaneously dances by himself, and his body, clothed or unclothed, is on display. What I like most about these moments is how they individually and collectively represent a direct address to the female gaze. Some are more sexualized than others, and a few are downright horrific and disgusting. Since these dance scenes are generally the bright spots in a dark (or even frivolous) film, there is no Tobey Maguire strutting down the street in Spider-Man 3 (Sam Raimi, 2007). And as much as I enjoyed the whitewashing effect of the cast singing and dancing to the O’Jays at the end of The Voices (Marjane Satrapi, 2014), their “Sing a Happy Song” routine is too big a choreographed set piece to seem spontaneous.

Without further ado, I give you my five favorite scenes of men using the power of dance to lighten a deeply disturbing mood:

Number one, with a bullet, comes from Alex Garland’s much celebrated directorial debut Ex Machina (2015), which opened in wide release last Friday. This scene may receive pride of place on this list because of my crush on the actor Oscar Isaac, whose sinister artificial intelligence mastermind Nathan dances with a female android. However, the real reason it lands here is because Nathan turns something as joyful as disco-dancing into a physical threat directed at houseguest Caleb (Domhnall Gleeson), who disapproves of Nathan’s methods. Trust me, the commitment of the actors in this scene elevates it to high comedy, even when the scene is taken out of context from the whole picture.

Another classic. Christian Bale’s Patrick Bateman, the Resident Doofus of Mergers & Acquisitions, takes rival Paul Allen (the beautiful Jared Leto) back to his place in Mary Harron’s brilliant 2000 adaptation of Bret Easton Ellis’s novel, American Psycho. Before chopping his colleague to pieces, Bateman waxes philosophical about the misunderstood meaning behind Huey Lewis and the News’s “Hip to Be Square.” Apparently, it’s about the pleasures of conformity, something he knows a lot about. While Bateman doesn’t dance dance, per se, he does emphasize his point with a quick nerd-accented shake of the hips. You stop laughing as soon as he sinks an ax into Allen’s head.

This is not actually my choice! I couldn’t, for the life of me, find the clip from Charlie’s Angels (McG, 2000) wherein client-turned-villain Sam Rockwell dances to “Got to Give It Up” by Marvin Gaye. A relative unknown at the time, Rockwell burned his name into my memory with his sexy shimmying to the song, his way of announcing to Drew Barrymore’s Dylan, whom he has just bedded, that he is in fact the bad guy from whom she has been assigned to protect him. Yep, long before “Blurred Lines,” the Marvin Gaye classic had been associated with shameful sexual acts.

But it turns out that Sam Rockwell is a regular old Christopher Walken: he dances every chance he gets. Among the video treasures YouTube has of his moves is the above scene from Charlie’s Angels. The film never truly adopts a serious tone, and Rockwell’s Eric Knox lampoons earlier James Bond-type villains. He has a secret coastal hideaway and technology that goes BOOM! “Revenge is fun,” he says, because he likes to dance it out. Shame the above clip doesn’t run long enough to include his doing the splits.

Reluctant but hungry vampire Louis (Brad Pitt) has just swept young Claudia (Kirsten Dunst) into his arms and fed on her blood. At this turning point in Interview with the Vampire (Neil Jordan, 1994), Louis is disgusted with himself, whereas Lestat (Tom Cruise, electrifying) is elated that his protégé has finally taken the plunge. How does he celebrate what Louis would rather forget? Why, by dancing with the corpse of Claudia’s mother, of course! The jubilant dancing and operetta singing sharply contrast with the dark, spartan interior of Claudia’s home, where there hasn’t been much evidence of life in a while. That is why Lestat’s bemused exclamation, “There’s still life in the old lady yet!” is so hilarious. To an immortal, death is a joke, and for once, Lestat has made the audience laugh with him. But poor Louis and Claudia: forever doomed.

Finally, how about some levity? Love Actually (Richard Curtis, 2003) isn’t a serious movie, except maybe to some of its apologists. Hands down, the best scene in this syrupy concoction is when Prime Minister Hugh Grant dances around 10 Downing Street to the tune of “Jump (For My Love)” by the Pointer Sisters, celebrating a personal and professional victory. In Curtis’s rewrite of the concurrent War in Iraq, the PM refuses to toe the line set by the lecherous American President (Billy Bob Thornton, never better). All because the Prez hit on the Prime Minister’s assistant/crush (Martine McCutcheon). A moment the whole world can be proud of: Hugh Grant shaking his hips.

That’s it. What are some of your most cherished “what the fuck?” moments? Sound off in the comments section.

Long Take: Unraveling the Psychosexual Trauma of It Follows

Viewed March 28, 2015

It Follows movie poster
I had never heard of the indie horror film It Follows until I spotted its poster hanging in the hallway of a local art-house cinema here in Kansas City, announcing its imminent showcase. Using familiar iconography of American teenage rebellion, the minimalist printed advertisement poses two attractive young people in the midst of a backseat tryst in a classic car parked in the middle of the woods. A smoky footlight from within illuminates the car’s interior on this Lovers’ Lane. Without a tagline, the poster for It Follows relies on a quote from The Dissolve to entice spectators: “One of the Most Striking American Horror Films in Years,” which isn’t quite the same as saying it is the most striking American horror film in years. However, the beguiling film title is the viewer’s most helpful guide to interpreting the poster scene (and, by extension, the film it represents): what is “It”? Could “It” be me, as I look at this intimate moment between two people? The voyeuristic film poster perfectly encapsulates the dueling yet complementary senses of dread and yearning that the 2014 film, written and directed by David Robert Mitchell, instills in the audience.

Given the film’s strong word-of-mouth marketing campaign and sizable box office gross, distributor RADiUS-TWC scrapped plans to release the film via video-on-demand and instead rushed it to more theaters around the country. Even though I routinely eschew horror films (unless there are vampires; don’t ask why!), I determined that I had better see this thing, to judge for myself how “striking” it is. But I could no longer see It Follows at the Tivoli Cinemas in Westport; when the film’s distribution widened ahead of the theater’s advertised opening day, the Tivoli put on alternate programming instead. Wishing video-on-demand were still an option, I entered into an agreement at an AMC multiplex, voluntarily giving away control, allowing myself to be scared out of my mind. After all, if New York film critic and self-proclaimed “horror-movie freak” David Edelstein could barely handle It Follows, how was I ever going to walk away un-traumatized? Apparently, the film had frightened him so much that he left with a “so-upset-I-feel-sick kind of amorphous dread.” Yikes.

And here is where I must insert my common refrain: there be spoilers ahead. But if you are like me and avoid torture porn, slasher movies, and possession flicks, then you should know—if you’re even considering seeing It Follows—that the film is not scary! That’s right: to my pleasant surprise, It Follows isn’t scary in the least. That doesn’t mean there aren’t any jump scares (there are a few) or that the plotting doesn’t create and alleviate some violent tension. The sole “hideously gory image” that Edelstein can’t wipe from his memory arrives early, but it is no more traumatizing to look at than any of the so-called elegant murderous tableaux featured on the network drama Hannibal. (I admit: I watch that, but never before or after eating and only during daylight hours.)

By now, you probably know the premise of It Follows. A pretty but nice girl, Jay (Maika Monroe), has sex with a quiet but nice boy, Hugh (Jake Weary). Afterwards, he chloroforms her, and she wakes up in her underwear, bound to a wheelchair. In an abandoned parking garage somewhere in the suburbs of Detroit, Hugh explains that, through intercourse, he has just passed a “thing” onto her, a monster that only she can see and that will take different forms, usually of people she knows and loves. It will never stop haunting her, Hugh warns Jay (and, by extension, the audience). It will follow her everywhere on foot, but she mustn’t let it kill her. If she succumbs to its evil force, then it will come after him.

Hugh (Jake Weary) is on the lookout for what follows, just after sexually transmitting the haunting onto Jay (Maika Monroe). Image courtesy of RADiUS-TWC.

In the introduction to the interview she conducted with David Robert Mitchell, Flavorwire writer Alison Nastasi hints at the parallels between Jay’s newfound diagnosis and venereal disease (prognosis: not good!). She writes, “Jay’s sexually transmitted haunting evokes the film’s theme of the terror of interconnectivity and teenage anxiety” (emphasis mine). In other words, despite coming of age well after the initial HIV/AIDS epidemic of the 1980s and the subsequent “safe sex” movement of the 1990s, teenagers today approach sex with some trepidation regarding infection. But the teenagers in the film internalize not so much the idea that intercourse is more than physically and emotionally laying yourself bare as the idea that sex is a means of, to borrow Nastasi’s words, “surrendering a part of yourself to someone else” whose sexual history weighs heavily on your own. The “interconnectivity” inherent in the sexual act renders Jay vulnerable to the aftereffects of one of Hugh’s earlier assignations and winds up controlling her destiny. Despite the support that she receives from her sister and friends, the visions of stalkers are too much for Jay to bear, and the burden of having to pass on this “sexually transmitted haunting” constitutes not only cruel and unusual punishment for all involved but also her only means of escape from imminent death.

But what does it all add up to? While Mitchell is open to various interpretations of what It Follows means, in talking with Nastasi, he resolutely denies that viewers should walk out of the theater believing that Jay’s “sexually transmitted haunting,” to use that phrase again, is a result of her sexuality. He also disagrees with the notion that the film promotes a “sex-negative” message, specifically that women lose something during the act and should therefore be afraid of sex. Jay is not being punished for sleeping with someone, and casual sex isn’t something to fear. Watching the movie, I purposefully looked beyond the STI connection and about midway through decided that It Follows is about the collision of the real and the imaginary, and about how convincing someone of your truth binds you both together. I thought that what Jay suffers from isn’t a real physical threat but rather the psychological torment that Hugh first puts into her head. Can she trust him? Is this “thing” real? Will anyone believe that she is not crazy?

Young adults commonly struggle to make sense of the world and their place within it. What do I want to be when I grow up? Will my best friend and I always be close? How can I be cool? I don’t want to be anything like my parents! One of the elements I like best about It Follows is the companionship that Jay enjoys with Kelly (Jay’s sister, played by Lili Sepe), Yara (Olivia Luccardi), and Paul (Keir Gilchrist). Eventually, a classmate and former high school paramour, Greg (Daniel Zovatto, a young Johnny Depp lookalike), joins the crew. They all trust that Jay is seeing some pretty disturbing visions and that she isn’t crazy. Their teenage clique demonstrates that even if they cannot see the viral bullying that Jay endures, they can relate to it and want to put a stop to it. In this way, It Follows turns the average teenager’s general desire to fit in (and the complementary fear of rejection) on its head. Jay’s supernatural condition infects everyone around her, thereby producing a cohesive social unit that doesn’t label her an outcast or social pariah. This is the silver lining to “the terror of interconnectivity and teenage anxiety,” as Nastasi labels the film’s overarching theme. In the end, the group relies on Jay’s visions to extinguish the monster once and for all.

In trying to locate Hugh and learn more of his secrets, Jay and her friends snoop around an abandoned house that he’d given her as his address. Image courtesy of RADiUS-TWC.

But I’m getting ahead of myself. It Follows does showcase the interplay between Jay’s supernatural condition and the awkwardness of teenagehood in some poignant ways. Paul is in love with Jay, his best friend’s older sister. Years ago, they shared each other’s first kiss. When Paul finds out that Jay slept with Greg in order to pass on the haunting, he behaves somewhat like a petulant child. Why Greg? Paul wants to know. Because Jay thought that Greg could handle it. For days, Greg is unfazed, insisting that the monster has yet to visit him. Its absence leads him to believe that it is all in Jay’s mind. What happens next in the film proved wrong my observation about the slippage between what is real and what is imagined. One night, Jay watches a zombie-like stalker in the shape of Greg break into Greg’s house across the street. Jay realizes that her transmission, however successful, hasn’t freed her from the visions. So she runs across the street to warn him. Once in the house, the monster takes the form of Greg’s mother (Leisa Pulido), naked but for an untied silk robe, and pounds on Greg’s door. Jay yells for him to stay locked in his room. He doesn’t heed her warning, and for the first time the film audience catches a glimpse of what happens when the monster kills someone in the chain: it literally fucks the victim to death, their skin gelling together. That’s when it dawned on me that It Follows is simply about the fear of death—by incest.

In an interview with Vulture’s Kyle Buchanan, writer-director David Robert Mitchell discusses many of the film’s plot twists and turns. I think he’s being particularly cagey about why the monster takes the form of the victim’s loved ones, limiting his remarks to the following: “So why did I make it the mom, other than just saying it was one of the more fucked-up things that I could think of? [Laughs.] It’s also that within the film, we’re sort of avoiding the influence of the adult world, and so I thought it was interesting to only enter into that space through the trope of the monster.” Without revealing much, this quote pinpoints exactly what the film is about: the anxiety over growing up and becoming an adult. The story uses incest as a metaphor through which the teenagers confront their own mortality, by becoming “one” with their parents.

Whereas my sister sees the teenagers’ mission to extinguish the monster on their own (that is, without adult supervision) as evidence of their self-sufficiency, I view it as an expression of the horror genre’s conventions. In these films, parents often don’t interact much with their teenagers—unless they are part of the problem. Just look at The Wicker Man (Robin Hardy, 1973) or A Nightmare on Elm Street (Wes Craven, 1984), two seminal entries in the genre. In the former, orgiastic villagers use their children to lure a virginal fool to their Hebridean island in order to sacrifice him as part of their pagan May Day celebration. The titular nightmare(s) in the latter film also refer to a slippage between the real and the imaginary, and the adults are similarly of little use in solving the teens’ horrific ordeal. Indeed, the adults’ vigilante murder of Freddy Krueger is the very reason he stalks their children’s dreams today. It Follows even recalls the work of cult director David Lynch, specifically Blue Velvet (1986), Twin Peaks (1990-91), and Twin Peaks: Fire Walk with Me (1992). Writing for Slate, and without elaborating on the specifics, Mark Binelli claims that “Mitchell is clearly indebted to Blue Velvet,” probably because teenagers also investigate the psychosexual crimes of the adults in that voyeuristic picture. The highly influential Twin Peaks presents a more apt comparison: Leland Palmer killed his daughter but claims to have been possessed throughout his life by a demon, which led him to rape and murder her. The “influence of the adult world,” to use Mitchell’s phrase, is certainly something to avoid in these films and TV series, but since we only catch glimpses of the adults in It Follows, what makes them so sinister that copulation with your parent is deadly?

If the hallmark of teenage rebellion is not wanting to be anything like your parents, then, according to Freudian and Jungian psychoanalytic theory, the Oedipus and Electra complexes should be the death of you. It Follows realizes this destiny for those who contract the sexually transmitted haunting, but Mitchell goes one step further. Rather than merely promoting the fear of incest as the victim’s undoing, he combines these metaphors for psychosexual conflict with another Freudian theory: the interplay between the sexual and death instincts. While the former drives the individual to seek (sexual) pleasure and live life to the fullest, the latter binds the individual in a series of repeated traumas that subconsciously push the individual to seek solace in the space before his or her birth. In other words, life is about negotiating the libido and the death drive, which underpins the individual’s unconscious desire to cease to exist. In It Follows, the sexually transmitted haunting forces the infected to face his or her fear of death through violent confrontation with a monster that resembles the literal origin of the victim’s being. For example, Greg’s pursuit of pleasure (his sexual instinct, or libido) ends as his death drive brings him back to the place before he existed: his mother’s womb. If the chain of victims haunted by the monster represents a series of repeated traumas, then the monster’s killing one victim and then going after another illustrates our inability to escape the death drive.

In the film’s prologue, Annie (Bailey Spry) is on the run from the monster in her house. Her unseen dad (Loren Bass) seems somehow involved in her psychosexual trauma. Image courtesy of RADiUS-TWC.

Evidence that It Follows is an exploration of (neo-)Freudian psychoanalytic theory abounds throughout the picture. In fact, one of the most ominous aspects of the film’s prologue, shot in an extended long take, foreshadows what is to come. A young woman named Annie (Bailey Spry) runs out of her house dressed in a rather grown-up lingerie ensemble and high heels. Her father (Loren Bass) calls out to her as she circles the stationary camera several times, afraid of whatever is in the house. The audience never sees what haunts her, so her father is the sole person (or even thing) that we can associate with the house. If you didn’t know the premise of the film, you might think that there is a history of sexual abuse between Annie and her father. She eventually gets into a car and drives far away to a secluded lakefront. She calls home and, fearing that the monster will soon reach her, insists that she loves her dad. Although we do not know who or what defiled and contorted Annie’s body (hers is the gory image critic David Edelstein couldn’t erase from his memory), I think it is safe to assume that the monster took the form of her father in that unseen moment.

Jay's first vision appears while she's in the middle of a class at a community college. Find below the reverse shot of what Jay sees. Image courtesy of RADiUS-TWC.

Once Jay contracts the haunting, the fear of sexual misconduct with her loved ones becomes more obvious. In conversation with Kyle Buchanan, Mitchell comments on the first form that Jay’s iteration of the monster takes: “And I think what you’re saying is true, it’s about the contrast of this [old] woman in its location [the campus of Jay’s community college]. Instantly, you realize that something is not quite right. And people are not paying attention to her, although in any other situation, they would be.” I recognized the old woman in her nightgown as the woman from a photograph on the wall that Jay studies earlier in the film; the implication is that she is Jay’s grandmother. When Mitchell refers in the same interview to his method of keeping a distance between what the protagonists see from their perspective on the ground and what may be haunting Jay from any other vantage point, I believe he is referring specifically to the moment when Jay sees an old naked man standing atop her roof. Even though Mitchell and co. didn’t put on a longer lens to capture the menace’s visage in more detail, I still recognized that he is probably Jay’s grandfather, from the same photo.

The reverse shot captures the first vision Jay has, while in class at a community college. The old woman, I argue, is her dead grandmother. Image courtesy of RADiUS-TWC.

Although the monster assumes some unfamiliar personages, including a young peeping tom from next door and an extraordinarily tall man (whom I couldn’t place), the majority of its guises are familiar. When Jay and her friends track Hugh down and visit him at his real address, his mother answers the door, her wide smile in deep contrast to the emotionless face the monster wore when, naked and in her form, it came after Hugh (real name Jeff) in the parking garage following his successful transmission of the haunting onto Jay. Furthermore, the violent climax at an indoor community pool not only resembles the end of Let the Right One In (Tomas Alfredson, 2008), it further illustrates my argument. The group (minus Greg, of course) plugs in various household appliances, hoping to use Jay to lure the monster into the pool so that they may electrocute it. Mitchell admits to Buchanan that it is “the stupidest plan ever!” Paul has brought a gun and depends on Jay’s descriptions of where the monster stands in order to shoot it. Yara gets injured in the process, but the monster eventually falls into the pool, where it attempts to drown Jay. She gets away and watches as the pool fills up with blood. Crucially, however, throughout the whole scene the monster appears to Jay as her absent father, whom I also previously glimpsed in a family photograph. Beyond a desire to keep these familial connections as opaque as possible, I don’t understand why Jay, especially in this scene, never calls out whom the monster resembles. For that matter, after so many close calls, she never tells anyone whose appearance it adopts. Either I am completely off base as to why the monster threatens to kill the teenagers while in the form of their parents, or Mitchell is just one cagey guy.

Jay peers into the community pool to see if she and her friends successfully killed the monster. He's no longer there. Image courtesy of RADiUS-TWC.

Despite the bloodbath, Jay and her friends are not out of the woods. Inexplicably, Jay sleeps with Paul. It Follows is not a funny film, but Paul’s asking Jay if she feels any different after they have sex cleverly corrupts the trope of (teenage) virginity. Next, we see Paul trolling for a prostitute. Meanwhile, Jay curls up on her bed in a familiar fetal position, her mother (Debbie Williams, whose face is never in focus—if it is even in the frame—throughout the film) stroking her naked back. This may be the most haunting image of It Follows. It demonstrates that the threat of death, expressed through her parent’s sexual menace, is ever near and ever present. The final shot of Jay and Paul, holding hands while walking down the street, may hint that the person following them in the distance is also haunting them. In this way, “the terror of interconnectivity and teenage anxiety” is defeated: these friends have become lovers and can now face the monster/death together. Although the film ends on an ambiguous but still upbeat note (in spite of everything that has happened, anyway), the image of Jay’s mother suggestively touching her daughter is creepier and more foreboding than the couple’s maybe-stalker.


Below is the lyric video for “From the Night,” the first song off the 2014 album No One Is Lost by Stars. The Canadian band’s dreamy soundscapes complement the electronic synth score by Rich Vreeland (Disasterpeace), which borrows heavily from the horror-film themes of the 1970s and ’80s, such as Halloween (John Carpenter, 1978). More than that, the newest record by the prolific band, which I only recently got into thanks to a live studio session they performed on the CBC radio show Q, thematically and stylistically dovetails with one of It Follows’ themes: the average teenager’s desire to fit in and have fun. The cover of the album, which you can glimpse in the video, showcases a similar youthful yearning for connection and social acceptance that It Follows deconstructs. Most notably, listen to the lyrics after the bridge (at the 3:40 mark in the video). They could be telling Jay and Paul’s story.

Women Have Risen: Examining the Year’s Best Actress Nominees, Narrative Tropes, and the Human Experience

A few years ago, I published online an essay whose title encapsulated my frustration at the time with the apparent lack of compelling, universally humanistic film roles for women: “Can Female Film Characters Rise to Their Potential?” Inspired by a vision I had of a lone woman astronaut shuttling through space (Sally Ride had just died), I contemplated a future where women characters in film might “have interesting, fully realized inner lives that eschew all the narrative tropes that heretofore define women,” mainly being a wife and/or mother. The potential I see in women film characters, and women in general, is the narrative ability to illuminate the human condition for everyone.

On the eve of the 87th Academy Awards ceremony’s television broadcast, I habitually observe and reflect on the nominations. At this point, each of the four acting categories appears to offer no surprises when it comes time to announce the winners. Julianne Moore (Lead Actress, Still Alice), Patricia Arquette (Supporting Actress, Boyhood), Eddie Redmayne (Lead Actor, The Theory of Everything), and J.K. Simmons (Supporting Actor, Whiplash) have routinely won acting trophies for their respective film roles while competing on the awards circuit this season. With the outcome of these contests all but a certainty, I recognize that the most competitive category is that of Best Performance by an Actress in a Leading Role, and it collectively represents the fulfillment of my wish from over two years ago, with a few caveats. In other words, most performances in this category capture, for lack of a better turn of phrase, what it’s like to be human. If film is an art form that helps us make sense of our lives, we cannot take the woman’s experience for granted, as Academy voters have done. Of the five nominees for Best Performance by an Actor in a Leading Role, only one top-lines a film that is not nominated for Best Picture: Steve Carell in Foxcatcher. However, only one nominated female lead performance appears in a Best Picture contender: The Theory of Everything, as if to say that women-centered films are not prestigious (read: worthy) or capable of addressing everyone.

Rather than run through the list of nominees alphabetically, I want to discuss them in the chronological order that I first encountered them. Maybe it’s the simple passage of time or the workings of an unreliable memory, but every performance seemed to be better than the last one I saw. Fair warning: in my analysis, I give away many plot details of each film.

Gone Girl movie poster

At the beginning of October, Gone Girl kicked off the season of awards-friendly motion pictures, and I remember thinking throughout my viewing of it that Rosamund Pike, the titular “girl,” deserves a nomination for her portrait of a bona fide psychopath. As Amy Dunne, the dissatisfied wife of Ben Affleck’s mysterious charmer Nick Dunne, Pike both fakes her own kidnapping (and possible murder) and then frames her husband for it. It isn’t until halfway through that the viewer discovers that Amy, the subject of a statewide search, is in fact alive and on the run. Having set as her mission the complete and humiliating obliteration of Nick’s character as well as his eventual imprisonment, Amy watches from afar (using the national media circus surrounding their small Missouri town) as the forged artifacts and clues that she doctored to point towards Nick’s guilt gradually fall into place. The most lethal part of her scheme (killing a man in supposed self-defense in order to fake her abduction) ultimately reunites husband and wife. In the media spotlight she has helped orchestrate and direct, Amy uses the public court of opinion both to absolve Nick of any crime that the American public previously found him guilty of committing and to imprison him in an emotionally, mentally, and physically abusive marriage.

A snapshot of Amy Dunne’s fake journal in Gone Girl. Image courtesy of Twentieth Century Fox.

While Gone Girl and Amy’s role in it do not exactly conform to fulfilling my desire to see women in films who are unattached, undefined by their relationships to men and/or children, the David Fincher-directed thriller, which author Gillian Flynn adapted from her bestselling novel of the same name, at least deconstructs the sanctity of the institution of marriage. Keeping Amy’s machinations hidden until halfway through the picture, her perspective only relayed through fake found journals, not only shifts perspectives on the couple’s lives (from Nick’s to Amy’s), it also produces one helluva denouement. Amy’s cold and clinical calculations upend our previous idea of her, whether as flirtatious (the memory of their meet-cute), sacrificial (a longtime cosmopolitan, she left New York for suburban Missouri when Nick’s mother became terminally ill), or even physically abused (her fake journal embellishes an altercation with Nick in order to vilify him). More than this, Amy presents a pathologically sociopathic and misandrist response to patriarchy, going to libelous and murderous extremes to pervert the idea of a traditional marriage. As the primary breadwinner upon their transplant to the Midwest, Amy strikes back at Nick for his philandering ways and emotional neglect so that when he finds himself trapped in this controlling and harmful marriage (to say “loveless” would be an understatement), she is not defined by her relationship to him so much as he is defined by whatever she thinks or says about him. In this way, Gone Girl examines how relationships bind us and, in this process, redefines the rules of attachment. The opening and closing scenes, wherein Nick strokes his wife’s hair and, through voiceover narration, muses about how we really don’t know what goes on in the mind of our chosen companion, index our struggles with loneliness and desire to be free.

The Theory of Everything movie poster

A Best Picture contender, The Theory of Everything is ostensibly a handsome biopic of British theoretical physicist Stephen Hawking. Based upon Jane Wilde Hawking’s memoir of the thirty years she was married to him, Traveling to Infinity: My Life With Stephen Hawking, the film is mostly focalized through her experience. While Eddie Redmayne receives almost unanimous praise for his physical transformation as Stephen, who was diagnosed with motor neuron disease (ALS, or Lou Gehrig’s disease) in 1963 at age 21, around the same time that he met fellow Cambridge student Jane, it is actually Felicity Jones as the scientist’s first wife who does most of the emotional heavy lifting in the film. The Theory of Everything doesn’t propose a story about a woman undefined by her relationship to a man and their children. Just the opposite, but it is worth discussing very briefly, to correct the notion that the film is about a famous man and the people in his life. In fact, given the film’s source material, it is easy to argue that the film is about a woman and the famous man in her life. This does not mean that The Theory of Everything is a so-called “woman’s film,” but it is a family drama centered on the woman’s perspective.

In his review of the film, New York film critic David Edelstein writes that, “as the film’s focus drifts to [Jane], I found myself resenting the character—not for wanting more from her life, but for yanking the narrative away from by far the more fascinating figure.” I agree that the first part of the film focuses primarily on Stephen’s experience, combining his academic coming-of-age (meeting advisors’ expectations—or not—and choosing a dissertation topic) with his struggle to adjust to a rapidly degenerative disease as well as a nascent romance with Jane. She may have walked into his life at a party, but I argue that as soon as Jane determines that he should be a part of her life, she wrests the picture away from him, and that gesture does make her both fascinating and compelling. I still cannot shake the image of the couple’s pronounced declaration of togetherness (it’s been used in the film’s marketing campaign, to boot) wherein they hold hands and joyously spin around. Significantly, it is Jane who initiates their little ball of energy, pulling Stephen into her orbit. Young and in love, Jane doesn’t realize the kind of life she commits herself to when she refuses to forget Stephen. For he far outlives his life expectancy of two years, and as time marches on she becomes increasingly frustrated with her life. Taking care of Stephen and raising their children are two full-time jobs, and her own academic ambitions take a backseat to her husband’s. We witness the effect that choosing Stephen has on her life, and a romantic dalliance with a widowed choirmaster offers her some release. Jonathan (Charlie Cox) assists Jane with raising the kids and caring for Stephen, who condones their sexual relationship. Unable to face up to the rumors that Jane’s third child is his, Jonathan makes himself scarce.
After Stephen loses his ability to speak and acquires a computer that will serve as his voice box, Jane recognizes that she can no longer support Stephen the way that he needs and reunites with Jonathan. She is a fascinating character, because she is willing to change her life and seek the fulfillment of her desires.

Felicity Jones as Jane Wilde Hawking with husband Stephen (Eddie Redmayne) and baby. Image courtesy of Focus Features.

The Theory of Everything shines a light on one of the brightest minds of the twentieth and twenty-first centuries, but it also demonstrates that, to rephrase that old adage, “behind every great mind is a woman.” The title derives from Stephen’s quest to marry Einstein’s theory of general relativity with quantum mechanics, but it equally signifies that love is the answer to what binds people together for however long they can hold on. In this way, contextualizing Stephen Hawking’s life story and scientific and cultural contributions through his wife’s experience makes the case that they couldn’t have accomplished as much separately as they did together. If finding (self-)gratification is one of the tenets of the human condition, then The Theory of Everything demonstrates how our desires are constantly in flux.

Wild movie poster

Months later, and with memories of Rosamund Pike and Felicity Jones sloshing around in my head, I finally saw Wild, the adaptation of Cheryl Strayed’s bestselling memoir. I fell so hard for this film that I don’t understand why it wasn’t nominated for its screenplay (by novelist Nick Hornby), cinematography (Yves Bélanger), and direction (Jean-Marc Vallée). Hell, I think Wild is easily one of the best films of the year and deserved one of those coveted Best Picture spots, of which there can be as many as ten. Although I have never been a fan of Reese Witherspoon, I was in awe of the humanistic depth of her physical performance. It wasn’t so much a transformation—not like Eddie Redmayne’s, or Charlize Theron’s for her Oscar-winning role in Monster, in which she was completely unrecognizable. Instead, Witherspoon perfectly embodies a woman who has been too hard on herself, on her spirit and on her body. When her young mother (Laura Dern in an achingly small but beautiful performance) dies of cancer, Cheryl grieves in an unexpected way, one that leads her astray from her husband (Thomas Sadoski) and into the arms of heroin addiction. With a painful divorce and an extramarital abortion behind her, Cheryl continues on her path to recovery under the most extreme of conditions: hiking 1,100 miles of the Pacific Crest Trail alone. Along the way from the Mojave Desert to Portland, Oregon, she treks across a variety of terrain and climates (arid deserts, snow-capped mountains, Pacific Northwest rainforests) and encounters myriad threats, ranging from animal attacks and lost shoes to death by starvation or thirst and violent sexual assault.

Although Cheryl’s grief and infidelities may have instigated her pilgrimage, Wild isn’t about a woman defined by her relationship to her ex-husband Paul. The experience of losing him and herself in her grief even influences Cheryl to invent a new last name on the divorce papers: Strayed. In fact, Wild is a film about a woman’s self-programmed reinvention, or, as the memoir’s subtitle states, she goes “From Lost to Found on the Pacific Crest Trail.” Cheryl takes ownership of the mistakes that she has made and grapples with how she took her mother for granted (but thankfully, as in this year’s indie rom-com extraordinaire Obvious Child, she unapologetically chooses an abortion when she stumbles into an unwanted pregnancy). By letting go of her social attachments for three months (though she still calls on Paul and friends for support of the motivational and material kind), Cheryl learns to forgive and love herself again. For me, the most poignant aspect of the film is that Cheryl chooses her relationship with Bobbi as the one to define her, saying, “My mother was the love of my life.”

Cheryl (Reese Witherspoon, unpictured) remembers how much she wanted to be like her mother Bobbi (Laura Dern) with that irrepressible smile. Image courtesy of Fox Searchlight Pictures.

Moreover, Wild comes the closest of the Best Actress nominees so far in proposing a film about the human condition that just happens to be focalized through a woman’s experience. As I have already mentioned, the film is about self-programmed reinvention, love and regret, life and death. I imagine that we can all relate to a character who hurts the people who are closest to her, sometimes purposefully, sometimes without thinking at all. This doesn’t make the character a bad person, just someone who needs to learn to appreciate what life and love can offer. Crucially, it is too late for Cheryl to treat Bobbi as she deserved, but Cheryl’s arduous and somewhat ascetic pilgrimage brings this all into focus.

Cheryl (Reese Witherspoon) goes her own way. This obstacle course is the least of her problems. Image courtesy of Fox Searchlight Pictures.

Presenting a woman’s story as universally humanistic is feminist in its own right, but Wild also engages the philosophy in more pointed ways. For example, virtually everyone she meets on the trail is astonished at her abilities and takes umbrage at her insistence on hiking the trail without a male companion. She even locks horns with a reporter from The Hobo News who cannot comprehend her voluntary choice to drop out of society for a while and thus identifies her as a lost soul, a “hobo” with no job, home, or family. But most surprising of all, a group of three young men on the trail adopt Cheryl as their personal hero, having read her poetic entries in guest books, which quote feminist icons such as Emily Dickinson and Adrienne Rich. Believing feminism to be part and parcel of humanism, Wild makes clear, as bell hooks once wrote, that “feminism is for everybody.”

Two Days One Night movie poster

Just when I thought this year’s nominated lead performances by women couldn’t get any richer, I saw Marion Cotillard, de-glamorized, in Two Days, One Night. It is a much smaller film than the others, both in scale and, seemingly, in depth. Cotillard plays a working-class laborer who, given the weekend, must convince a majority of her co-workers to forgo their one-thousand-euro bonuses so that she can keep her job. Whether or not the solar panel factory can legally put her continued employment to a vote by its employees is never questioned, but almost everyone she confronts points out that the boss’s ultimatum is unfair. Shot in their characteristic social realist/fly-on-the-wall style, the latest film by Belgian brothers Luc and Jean-Pierre Dardenne plays out like a thriller built on a cruel joke: Will she get enough votes to keep her job? How many more times do we have to hear her plead with her co-workers to vote for her? Asking for anyone’s help is an ordeal in and of itself for Sandra, who, when the film begins, is on the brink of returning to work following a long absence (it gradually becomes clearer that she suffered a mental breakdown). Her pitiable decision to speak with people in person whenever possible is costly in terms of time (she zigzags all over town in order to track them down at their homes, on the street, or in corner groceries or laundromats) and an emotionally draining exercise in futility. Thankfully, no two encounters are exactly the same, even if those unwilling to help her always have the same reason: they need the money, whether to pay their child’s tuition, build an addition to their house, or cover the electric bill for six months.

Sandra (Marion Cotillard) confronts a co-worker who cannot see beyond himself. Image courtesy of Sundance Selects.

What makes Two Days, One Night so quietly impressive is its premise: to what lengths will someone go to keep her job? How will she convince human being after human being, with wants and needs not unlike her own, to sacrifice material gain in order to come to her aid? How will she react when, based on the number of votes pledged in her favor so far, her future looks bleak? Providing Sandra with a psychiatric disorder heightens the stakes—and the Dardennes do go to some dark places—but otherwise Two Days, One Night could be about anyone. In fact, there isn’t much character development in terms of Sandra’s familial role so as to make the part gender-specific. In other words, she spends so little time with her two children that her identity as mother does not define her. Even Sandra’s greatest champion, her husband Manu (Fabrizio Rongione), frames her ordeal as one about recovering her lost pride. Her humanity, and her repeated attempts to coax the more humane choice out of her peers, defines Sandra. Of course the couple needs her income to get by, but their situation is no more dire than that of most of her co-workers. In this way, the film is about overcoming adversity and preserving your own self-worth, arguably the most humanistic ideal. Come Monday morning, Sandra is one vote shy of keeping her job. Touched by the generosity of some of her colleagues, she refuses the boss’s offer to rehire her at the end of the season, because it would mean that one of her pledges would lose his contract with the company. Initially stunning, her decision to incur further economic hardship isn’t just about worker solidarity but also personal integrity. The final scene of Sandra’s bad-news phone call to Manu represents a revolution of sorts: walking away from the factory, smiling, Sandra is buoyant with every step, personally motivated by the support of Manu and her co-workers to find another job.
If she can get through this past weekend, she can approach any new challenge with enough courage and integrity to overcome it.

Still Alice movie poster

Rounding out the five nominees for Best Actress, Julianne Moore presents a deeply moving and sensitive portrayal of a woman diagnosed with early onset Alzheimer’s disease in Still Alice, an adaptation of Lisa Genova’s novel of the same name. Admittedly, I am not Moore’s biggest fan (she’s usually too showy for my tastes), and the negative reviews of the film colored my perception of it going in. Jason Bailey of Flavorwire wrote that the film “plays like a dusted-off, mid-’90s Movie of the Week.” However, not only was I pleasantly surprised by the quality of the film, I was also overcome by profound sadness and grief, unable to talk about what I had just seen without choking up. Who cares if Still Alice is emotionally manipulative? More than any of the other films nominated in this category, Still Alice examines what makes us who we are while confronting our own mortality.

A world-famous linguistics professor at Columbia University, Alice Howland is the first to recognize that “something is wrong with [her].” Sometimes she can’t find the right word, and at other times she gets disoriented on her aerobic runs around the neighborhood. Her husband, John (Alec Baldwin), writes off her worries as evidence that at 50, she’s simply getting older. Determined to find the root of her newfound problems (it feels like her brain is slipping farther and farther away from her), she sees a neurologist in secret and eventually receives the dreaded diagnosis. The effects of the disease would be difficult for anyone to cope with, but as her doctor explains, since Alice carries the familial gene for early onset Alzheimer’s and is extremely well-educated, she can expect to deteriorate more rapidly than if she didn’t have the gene and wasn’t so well-educated. She simply has much more to lose, and for a linguist whose life’s work has been the study of human communication systems, the thought of losing her ability to relate who she is with words is devastating. As it is for me, as it is for anyone.

As her mother’s primary caregiver, Lydia (Kristen Stewart) tries to comfort a sad and spacey Alice (Julianne Moore). Image courtesy of Sony Pictures Classics.

But Alice is intellectually resourceful, and for a time she manages to compensate for her incapacities. It takes a while for her to admit defeat and leave her tenured position (her meeting with the chair of her department is the most implausible scene in the whole picture, for it would never be up to her colleagues to dismiss her because she has a health issue). John and their three children try to look after Alice as best they can. Eventually, their youngest, the Los Angeles-set aspiring actress and free spirit Lydia (Kristen Stewart), agrees to move back to New York to serve as Alice’s primary caregiver when John accepts a position at the Mayo Clinic in Rochester, Minnesota. In the exploration of this mother-daughter relationship, Alice’s older children, the lawyer Anna (Kate Bosworth) and the medical student Tom (Hunter Parrish), suffer from a severe lack of character development. While Anna and Lydia sometimes butt heads as to what is best for their mother, Tom’s only real function is to accompany Alice to a talk she gives at the local chapter of the Alzheimer’s Association. Film critic Jason Bailey denigrated this speech as a “forced, false moment” by writer-director duo Richard Glatzer and Wash Westmoreland, thereby completely forgetting that the scene parallels an earlier keynote address she gave at a linguistics event where she spoke confidently on topics related to her main line of inquiry: why do humans talk, and how do they learn to communicate? In the later scene, the transformation that Alice has undergone throughout the film is palpable. Anxious and insecure, she must use a highlighter as she speaks at the podium so that she does not lose her place in the speech. Frustrated with her inability to write a persuasive argument using medical and linguistic jargon, she takes Lydia’s advice and writes about how it feels to lose her mind. There isn’t a dry eye in the house.
For, as it is made clear throughout the picture, we are who we are because we have made ourselves into whoever we want to be. For Alice, that has meant being an expert on language acquisition, an equal partner in a loving relationship with a man who confidently says she was the smartest person he’d ever met, and a dependable and accepting mother.

Still Alice also makes the case that we are who we are because of what we remember. As Alice grapples with her diagnosis, slipping farther and farther away from the people in her life, she returns to memories of her sister, whom she lost as a teenager. I initially dismissed the final scene of the film, failing to recognize that Alice’s imagining herself and her sister on a beach is her defiant stance against the havoc that Alzheimer’s wreaks on her mind. She clings to this memory as if to remind herself of who she is. This shot immediately follows the scene in which Lydia reads from the play Angels in America and asks her mother if she knows what the speech is about. Alice smiles and struggles to say, “Love.” Again, in his review of Still Alice, which he labels “desperate” and unoriginal, Bailey fails to see how the film’s ending illuminates something fundamental about the human experience: our appreciation and understanding of art and how it reflects our perception of the meaning of life. The Flavorwire film critic finds Glatzer and Westmoreland’s “desperation… particularly rancid at the end” because, “in lieu of saying anything moving or profound, they simply shoplift the ending of Angels in America.” In presumably one of Alice’s last moments of clarity, she demonstrates for Lydia that she is still present, that she can understand Tony Kushner’s complex speech, and that she loves her daughter and her long-lost sister. It doesn’t matter that these “moving and profound” words, to correct Bailey’s statement, are not Alice’s or Lydia’s. Not everything we say or do is original; the purpose of art is to draw connections between experiences, and the meaning of life is to see how art shapes us.

Contrary to what Russell Crowe thinks about roles for older women in Hollywood, the reality is that quality parts for women at any age are terribly lacking. While most Oscar prognosticators, critics, and cinephiles like myself watch the Academy Awards tonight and lament the inevitability of Julianne Moore’s, Patricia Arquette’s, Eddie Redmayne’s, and J.K. Simmons’s wins, I will remember that for the first time in a long while, it seems that every nominee in the Best Actress category was phenomenal. Rather than choose a winner, I wish we could simply celebrate these five actresses and many more, because they brought to life film characters whose experiences illuminated different facets of the human condition. I hope this trend in representing women with “interesting, fully realized inner lives” continues. And I don’t care if they are wives or mothers anymore. Restricting what kinds of parts women play in film and in society isn’t humane.