Writer of things that go bump in the night

Tag: Television (Page 1 of 4)

Under the Influence, Part 1:  On Artistic Inspiration, Finding One’s Voice, and Tarantino’s Formative Faves

Let’s play Ten for Ten!  To commemorate the ten-year anniversary of this blog, which launched on June 26, 2014, here’s an appreciation for ten of my formative cinematic influences—an examination of why these movies resonated with me when I first saw them, and how they permanently informed my aesthetic tastes and creative sensibilities.  This post is presented in three installments.

“Under the Influence, Part 1” informally ponders through personal example how an artist develops a singular style and voice all their own, and offers an analysis of Quentin Tarantino’s essay collection Cinema Speculation, the auteur’s critical look at the movies of the ’70s that inspired him.

In “Under the Influence, Part 2,” I spotlight five films from my ’80s childhood that shaped my artistic intuition when at its most malleable.

And in “Under the Influence, Part 3,” I round out the bill with five selections from my ’90s adolescence, the period during which many of the themes that preoccupy me crystallized.


It takes an unholy degree of time and stamina to write a book.  Consequently, it’s advisable to have a really good reason to take a given project on—then see it through to the finish line.  Before typing word one of a new manuscript, it behooves us to ask (and answer):  Why is this project worth the herculean effort required to bring it into existence?

I wrote my debut novel The Dogcatcher (2023) for the most elemental of motives:  I ached for the kind of bygone horror/comedies on which I’d come of age in the ’80s, an era that produced such motley and memorable movies as An American Werewolf in London (1981), The Evil Dead (1981), Gremlins (1984), Ghostbusters (1984), The Witches of Eastwick (1987), The Lost Boys (1987), The Monster Squad (1987), The ’Burbs (1989), and Tremors (1990).  Where have those kinds of movies gone? I wondered.

Hollywood, to be fair, hadn’t stopped making horror/comedies; it had simply long since stopped making them with any panache.  I have spent many a Saturday night over the past decade in a binge-scrolling malaise, surfing numbly through hundreds of viewing options on Netflix or Prime or Hulu or whatever, when suddenly my inner adolescent’s interest would be piqued—as though I were back at the old video store and had found a movie right up my alley.

I certainly sensed the stir of possibility in Vampires vs. the Bronx (2020), about a group of teenagers from my hometown battling undead gentrifiers.  Night Teeth (2021), featuring bloodsuckers in Boyle Heights, seemed equally promising.  And Werewolves Within (2021) is set in a snowbound Northeastern United States township already on edge over a proposed pipeline project when its residents find themselves under attack by a werewolf.

“Vampires vs. the Bronx” (2020) seemed like the perfect mix of Gen X–era throwback and Gen Z–era social commentary

All of a sudden, I felt like that sixteen-year-old kid who saw the one-sheet for Buffy the Vampire Slayer (1992) while riding the subway to work—“She knows a sucker when she sees one,” teased the tagline, depicting a cheerleader from the neck down with a wooden stake in her fist—and knew he was in for a good time at the cinema.

No such luck.  Vampires vs. the Bronx, in an act of creative criminality, pisses away a narratively and thematically fertile premise through flat, forgettable execution.

Night Teeth, meanwhile, answers the question:  How about a movie set in the same stomping ground as Blade (1998)—inner-city L.A., clandestine vampiric council calling the shots—only without any of its selling-point stylistics or visual inventiveness?

And Werewolves Within establishes an intriguing environmental justice subplot the screenwriter had absolutely no interest in or, it turns out, intention of developing—the oil pipeline isn’t so much a red herring as a dead herring—opting instead for a half-assed, who-cares-less whodunit beholden to all the standard-issue genre tropes.

Faced with one cinematic disappointment after another, I concluded the only way to sate my appetite for the kind of horror/comedy that spoke to me as a kid was to write my own.

On the subject of kids—specifically, stories about twelve-year-old boys—I haven’t seen one of those produced with any appreciable measure of emotional honesty or psychological nuance since Rob Reiner’s Stand by Me (1986), based on Stephen King’s 1982 novella The Body.  That was nearly forty years ago!

Storytellers know how to write credible children (E.T. the Extra-Terrestrial, Home Alone, Room), and they know how to write teenagers (The Outsiders, Ferris Bueller’s Day Off, Clueless), but preadolescent boys are almost invariably reduced to archetypal brushstrokes (The Goonies, The Sandlot, Stranger Things).  The preteen protagonists of such stories are seldom made to grapple with the singular emotional turbulence of having one foot in childhood—still watching cartoons and playing with action figures—and the other in adolescence—beginning to regard girls with special interest, coming to realize your parents are victims of generational trauma that’s already in the process of being passed unknowingly and inexorably down to you.

For all of popular culture’s millennia-long fixation on and aggrandizement of the heroic journey of (usually young) men, our commercial filmmakers and storytellers either can’t face or don’t know how to effectively dramatize the developmental fulcrum of male maturation.  George Lucas’ experimental adventure series The Young Indiana Jones Chronicles (1992–1996) sheds light on Indy’s youth from ages eight through ten (where he’s portrayed by Corey Carrier) and then sixteen through twenty-one (Sean Patrick Flanery); the complicated messiness of pubescence, however, is entirely bypassed.  Quite notably, those are the years in which Indy’s mother died and his emotionally distant father retreated into his work—formative traumas that shaped, for better and worse, the adult hero played by Harrison Ford in the feature films.

Lucas’ elision seems odd to me—certainly a missed creative opportunity1—given that twelve-going-on-thirteen is the period of many boys’ most memorable and meaningful adventures.  King and Reiner never forgot that, and neither did I, hence the collection of magical-realism novellas I’m currently writing that explore different facets of that transitory experience:  going from wide-eyed wonder to adolescent disillusionment as a result of life’s first major disappointment (Spex); being left to navigate puberty on your own in the wake of divorce (The Brigadier); struggling to understand when, how, and why you got socially sorted at school with the kids relegated to second-class citizenry (H.O.L.O.).

This single-volume trilogy, I should note, isn’t YA—these aren’t stories about preteens for preteens.  Rather, they are intended, like The Body/Stand by Me before them, as a retrocognitive exercise for adults who’ve either forgotten or never knew the experience of being a twelve-year-old boy to touch base with that metamorphic liminality in all of its psychoemotional complexity.  They’re very consciously stories about being twelve as reviewed from middle-aged eyes.

As I’ll demonstrate in “Part 2” and “Part 3,” both that WIP and The Dogcatcher take inspiration—narratively, thematically, aesthetically, referentially—from the stories of my youth, the books and movies that first kindled my imagination and catalyzed my artistic passions.

Continue reading

Highway to Hell:  Car Culture and Hollywood’s Hero-Worship of the Automobile

With road-trip season upon us once again, here’s an examination of how American car culture has been romanticized by the entertainment industry; how automobiles, far from enablers of freedom and individuality, are in fact “turbo-boosted engines of inequality”; and how Hollywood can help remedy an ecocultural crisis it’s played no small role in propagating.


In any given episode, the action reliably starts the same way:  a wide shot of the Batcave, Batmobile turning on its rotating platform to face the cavemouth, camera panning left as the Dynamic Duo descend the Batpoles.  Satin capes billowing, Batman and Robin hop into their modified 1955 Lincoln Futura, buckle up—decades before it was legally required, incidentally—and the engine whines to life as they run through their pre-launch checklist:

ROBIN:  Atomic batteries to power.  Turbines to speed.

BATMAN:  Roger.  Ready to move out.

A blast of flame from the car’s rear thruster—whoosh!—and off they race to save the day.

By the time the 1980s had rolled around, when I was first watching Batman (1966–1968) in syndicated reruns, every TV and movie hero worth his salt got around the city in a conspicuously slick set of wheels.  Muscle cars proved popular with working-class ’70s sleuths Jim Rockford (Pontiac Firebird) and Starsky and Hutch (Ford Gran Torino).  The neon-chic aesthetic of the Reagan era, however, called for something a bit sportier, like the Ferrari, the prestige ride of choice for Honolulu-based gumshoe Thomas Magnum (Magnum, P.I.) and buddy cops Crockett and Tubbs (Miami Vice).  The ’80s were nothing if not ostentatiously aspirational.

Even when cars were patently comical, they came off as cool despite themselves:  the Bluesmobile, the 1974 Dodge Monaco used in The Blues Brothers (1980); the Ectomobile, the 1959 Cadillac Miller-Meteor Sentinel in Ghostbusters (1984); the Wolfmobile, a refurbished bread truck that Michael J. Fox and his pal use for “urban surfing” in Teen Wolf (1985).

The DMC DeLorean time machine from Back to the Future is clearly meant to be absurd, designed in the same kitchen-sink spirit as the Wagon Queen Family Truckster from National Lampoon’s Vacation (1983), but what nine-year-old boy in 1985 didn’t want to be Michael J. Fox, sliding across the stainless-steel hood and yanking the gull-wing door shut behind him?  And like the characters themselves, the DeLorean evolved with each movie, going from nuclear-powered sports car (Part I) to cold-fusion flyer (Part II) to steampunk-retrofitted railcar (Part III).  “Maverick” Mitchell’s need for speed didn’t hold a candle to Marty McFly’s, whose very existence depended on the DeLorean’s capacity to reach 88 miles per hour.

Vehicles that carried teams of heroes offered their own vicarious pleasure.  Case in point:  the 1983 GMC Vandura, with its red stripe and rooftop spoiler, that served as the A-Team’s transpo and unofficial HQ—a place where they could bicker comically one minute then emerge through the sunroof the next to spray indiscriminate gunfire from their AK-47s.  The van even had a little “sibling”:  the Chevrolet Corvette (C4) that Faceman would occasionally drive, marked with the same diagonal stripe.  Did it make sense for wanted fugitives to cruise L.A. in such a distinct set of wheels?  Not really.  But it was cool as hell, so.

The Mystery Machine was the only recurring location, as it were, on Scooby-Doo, Where Are You! (1969), and the van’s groovy paint scheme provided contrast with the series’ gloomy visuals.  Speaking of animated adventures, when once-ascetic Vietnam vet John Rambo made the intuitive leap from R-rated action movies to after-school cartoon series (1986), he was furnished with Defender, a 6×6 assault jeep.  Not to be outdone, the most popular military-themed animated franchise of the ’80s, G.I. Joe:  A Real American Hero (1983–1986), featured over 250 discrete vehicles, and the characters that drove them were, for the most part, an afterthought:

With the debut of the 3 ¾” figures in 1982, Hasbro also offered a range of vehicles and playsets for use with them.  In actual fact, the 3 ¾” line was conceived as a way to primarily sell vehicles—the figures were only there to fill them out!

‘3 ¾” Vehicles,’ YoJoe!

But who needs drivers when the vehicles themselves are the characters?  The protagonists of The Transformers (1984–1987) were known as the Autobots, a race of ancient, sentient robots from a distant planet that conveniently shapeshifted into 1980s-specific cars like the Porsche 924 and Lamborghini Countach, among scores of others.  (The premise was so deliriously toyetic, it never occurred to us to question the logic of it.)  Offering the best of both G.I. Joe and The Transformers, the paramilitary task force of M.A.S.K. (1985–1986), whose base of operations was a mountainside gas station (what might be described as Blofeld’s volcano lair meets the Boar’s Nest), drove armored vehicles that transformed into… entirely different vehicles.

Many movies and shows not only featured cars as prominent narrative elements, but literally took place on the road:  Vacation.  Mad Max (1979).  Smokey and the Bandit (1977).  CHiPs (1977–1983).  Sometimes the car was so important it had a proper name:  General Lee from The Dukes of Hazzard (1979–1985).  Christ, sometimes it was the goddamn series costar:  KITT on Knight Rider (1982–1986).  Shit on David Hasselhoff’s acting ability all you want, but the man carried a hit TV show delivering the lion’s share of his dialogue to a dashboard.  Get fucked, Olivier.

1980s hero-car culture at a glance

As a rule, productions keep multiple replicas of key picture cars on hand, often for different purposes:  the vehicle utilized for dialogue scenes isn’t the one rigged for stunts, for instance.  It’s notable that the most detailed production model—the one featured in medium shots and closeups, in which the actors perform their scenes—is known as the “hero car.”  And why not?  Over the past half century, Hollywood has unquestionably programmed all of us to recognize the heroism of the automobile.

Continue reading

Into Each Generation a Slayer Is Born:  How the “Buffy” Franchise Demonstrates the Differences between Gen X and Millennials

A cultural blip, disowned and dismissed.  A cultural phenomenon, nurtured and celebrated.  Is there any doubt Kristy Swanson’s Buffy the Vampire Slayer is an Xer, and Sarah Michelle Gellar’s a Millennial?


Joss Whedon famously dislikes the movie made from his original screenplay for Buffy the Vampire Slayer (1992), directed by Fran Rubel Kuzui and starring Kristy Swanson.  Seems he’d envisioned a B-movie with a Shakespearean soul, whereas Kuzui saw pure juvenile camp—an empowerment tale for prepubescent girls.

Buffy arrived right before it became cool for teenagers to brood about real things like depression and the cost of Doc Martens.  But something about this particular movie was bewitching to a tweeny bopper with an alternative undertow.  It had gloss and edge—but more gloss than edge.  This was a pre-Clueless, Skittles-tinted ode to California ditz. . . .  The result was an unfussy pre–Spice Girls girl-power fantasy for a 12-year-old kid.

Soraya Roberts, “I’ll Always Love the Original Buffy the Vampire Slayer,” Atlantic, July 31, 2022

Only a modest success during its theatrical run, the cult horror/comedy found an appreciable audience on VHS.  Three years later, nascent netlet The WB saw an opportunity to bring the inspired concept of Valley girl–turned–vampire slayer to television—only this time under the auspices of the IP’s disgruntled creator:

Building on his original premise, he re-imagined the monsters as metaphors for the horrors of adolescence.  In one climactic scene, Buffy loses her virginity to a vampire who has been cursed with a soul; the next morning, his soul is gone and he’s lusting for blood.  Any young woman who had gone to bed with a seemingly nice guy only to wake up with an asshole could relate. . . .

In those early days of the internet, before nerd culture swallowed the world, fans flocked to a message board set up by the WB to analyze Buffy with the obsessive zeal of Talmudic scholars.  Whedon knew how to talk to these people—he was one of them.  He would visit the board at all hours to complain about his grueling schedule or to argue with fans about their interpretations of his work.  Back then, as he pointed out to me, the internet was “a friendly place,” and he, the quick-witted prince of nerds, “had the advantage of it.”

Lila Shapiro, “The Undoing of Joss Whedon,” Vulture, January 17, 2022

It is impossible to fully appreciate the monopolistic stranglehold geek interests have maintained on our culture over the first two decades of this millennium without acknowledging the pivotal role Buffy the Vampire Slayer (1997–2003) played in elevating such pulp ephemera to a place of mainstream legitimacy and critical respectability.  It was the right premise (Whedon pitched it as My So-Called Life meets The X-Files) on the right network (one willing to try new ideas and exercise patience as they found an audience) by the right creator (a card-carrying, self-professed geek) speaking to the right audience (impressionable Millennials) at the right time (the dawn of the Digital Age).  It all synthesized at exactly that moment.  Forget Booger—Buffy was our culture’s revenge of the nerds.

Sarah Michelle Gellar and Joss Whedon on the set of “Buffy the Vampire Slayer”

In what was surely a first for any geek or screenwriter, let alone a combo platter, a cult of hero worship coalesced around Whedon.  His genius was celebrated on message boards and at academic conferences, inked in books and on body parts.  “He was a celebrity showrunner before anyone cared who ran shows” (ibid.).

Master storyteller that he is, Whedon didn’t merely reset the narrative of Buffy; he reframed the narrative about it.  While serving as a loose sequel to the feature film, the television series wasn’t Buffy the Vampire Slayer 2 so much as Buffy the Vampire Slayer 2.0—a complete overhaul and upgrade.  This was Buffy as it was always intended to be, before Hollywood fucked up a great thing.  That the startup-network show emerged as a phoenix from the ashes of a major-studio feature only burnished Whedon’s geek-underdog credentials.  To utter the word “Buffy” was to be speaking unambiguously about the series, not the movie.

What movie?

In 1997, Whedon premiered his Buffy series on The WB and essentially wiped the film from the collective memory.

By that point, I had turned 17, and even though the show was more serious than the movie, even though its universe was cleverer and more cohesive, even though the silent episode “Hush” was probably one of the best things on television at the time it aired, Buffy was still a vampire show—to me, it was just kids’ play.  My adolescence adhered to a kind of Gen-X aimlessness, to indie films with lots of character and very little plot.  Whedon’s show seemed more like the kind of thing Reality Bites would make fun of—a juvenile, overly earnest studio product.

Roberts, “I’ll Always Love the Original Buffy the Vampire Slayer”

As a member of Ms. Roberts’ demographic cohort, four years her senior, I’ll second that appraisal.  Yet for the Millennials who came of age in a post-Whedon world, and who were introduced to Buffy through the series—who fell in love with her on TV—Whedon’s creative contextualization of the movie became the universally accepted, unchallenged, and perennially reinforced perception of it:

You actually can’t watch the Buffy the Vampire Slayer film online, and honestly, you might be better off.  Luckily, all seven seasons of the Whedon-helmed (and approved) masterpiece that is Buffy the Vampire Slayer the series is easily streamed.  25 years later, Buffy movie is proof that our heroine was always better off in the hands of her maker.

Jade Budowski, “The ‘Buffy’ Movie At 25:  A Rough, Rough Draft Of The Magic That Followed,” Decider, July 31, 2017

The simultaneous display of blind devotion, proprietary entitlement, and self-assured dismissiveness in a statement like that, far from the only likeminded Millennial assessment of Buffy, is the kind of thing we humble Xers have spent a lifetime swallowing and shrugging off, even—especially—when we know better.  Not that anyone much cares what we have to say:

Here’s a refresher on the measliness of Generation X:  Our parents were typically members of the Silent Generation, that cohort born between 1928 and 1945—people shaped by the Great Depression and World War II, people who didn’t get to choose what they were having for dinner and made sure their kids didn’t either.  The parents of Gen X believed in spanking and borderline benign neglect, in contrast to the boisterous boomers and their deluxe offspring, the millennial horde. . . .

. . . Baby boomers and millennials have always had a finely tuned sense of how important they are.  Gen Xers are under no such illusion.  Temperamentally prepared to be criticized and undermined at all times, we never entirely trusted the people in charge anyway.

Pamela Paul, “Gen X Is Kind of, Sort of, Not Really the Boss,” Opinion, New York Times, August 14, 2022

Whereas the Millennials who deified Whedon have in recent years had to square their enduring love for Buffy with the spate of damning accusations against him—marital infidelity, feminist hypocrisy, emotionally abusive treatment of subordinates—the geek god’s fall from grace is no skin off Gen X’s nose; Big Daddy disavowed our Buffy, to the extent we feel that strongly about it one way or the other, decades ago.  Lucky for us, as Ms. Paul observes, we never entirely trusted the people in charge anyway.  And since Whedon’s critique of the Buffy movie remains to this day the culturally enshrined view of it, perhaps that merits reconsideration, too?

For the past quarter century, the differences between the Buffy movie and TV series have been authoritatively chalked up to all the usual cinema-snobbery bullshit:  tone and aesthetics and emotional depth and worldbuilding breadth.  Wrong.  The tonal disparity between the two Buffys has from the outset been greatly overstated.  The gap between Swanson’s Buffy and Gellar’s is, at its heart, generational.

Continue reading

Book Review:  “Heat 2” by Michael Mann + Meg Gardiner

This article discusses plot details and scene specifics from Michael Mann’s film Heat (1995) and his novel Heat 2 (2022).


John Carpenter’s dystopian classic Escape from New York (1981), set in 1997, opens with an expository intertitle:  “1988—The Crime Rate in the United States Rises Four Hundred Percent.”  Though that grim prognostication amounted to an exaggeration, the issue itself had nonetheless become a big deal here in the real world by the early 1990s:

In 1993, the year President Clinton took office, violent crime struck nearly 11 million Americans, and an additional 32 million suffered thefts or burglaries.  These staggering numbers put millions more in fear.  They also choked the economic vitality out of entire neighborhoods.

Politically, crime had become one of the most divisive issues in the country.  Republicans called for an ever more punitive “war on drugs,” while many Democrats offered little beyond nebulous calls to eliminate the “root causes” of crime.

David Yassky, “Unlocking the Truth About the Clinton Crime Bill,” Opinion, New York Times, April 9, 2016

Clinton’s response was the measurably effective (if still controversial) Violent Crime Control and Law Enforcement Act of 1994, otherwise known as the 1994 Crime Bill, coauthored by Joe Biden, the provisions of which—and this is just a sampling—added fifty new federal offenses, expanded capital punishment, led to the establishment of state sex-offender registries, and included the Federal Assault Weapons Ban (since expired) and the Violence Against Women Act.

It was an attempt to address a big issue in America at the time:  Crime, particularly violent crime, had been rising for decades, starting in the 1960s but continuing, on and off, through the 1990s (in part due to the crack cocaine epidemic).

Politically, the legislation was also a chance for Democrats—including the recently elected president, Bill Clinton—to wrestle the issue of crime away from Republicans.  Polling suggested Americans were very concerned about high crime back then.  And especially after George H.W. Bush defeated Michael Dukakis in the 1988 presidential election in part by painting Dukakis as “soft on crime,” Democrats were acutely worried that Republicans were beating them on the issue.

German Lopez, “The controversial 1994 crime law that Joe Biden helped write, explained,” Vox, September 29, 2020

Given the sociopolitical conditions of the era, it stands to reason—hell, it seems so obvious in hindsight—the 1990s would be a golden age of neo-noir crime cinema.  The death of Michael Corleone, as it happens, signified a rebirth of the genre itself; Martin Scorsese countered the elegiac lethargy—that’s not a criticism—of Francis Ford Coppola’s The Godfather, Part III with the coke-fueled kineticism of Goodfellas (both 1990).  Henry Hill shared none of Michael’s nagging reluctance about life in the Italian Mafia; he always wanted to be a gangster!

Reasoning that the same fascination probably held true for audiences—as an author of horror stories, I certainly appreciate a healthy curiosity about the dark side—Hollywood offered vicarious trips into the criminal underworlds of Hell’s Kitchen, in Phil Joanou’s State of Grace (1990), and Harlem, in Mario Van Peebles’ New Jack City (1991), both of which feature undercover cops as major characters.  So does Bill Duke’s Deep Cover (1992), about a police officer (Laurence Fishburne) posing as an L.A. drug dealer as part of a broader West Coast sting operation.

The line between cop and criminal, so clearly drawn in the action-comedies of the previous decade (Lethal Weapon, Beverly Hills Cop, Stakeout, Running Scared), was becoming subject to greater ambiguity.  In no movie is that made more starkly apparent than in Abel Ferrara’s Bad Lieutenant (1992), about a corrupt, hedonistic, drug-addicted, gambling-indebted, intentionally nameless New York cop (Harvey Keitel) investigating the rape of a nun in the vain hope it will somehow redeem his pervasive rottenness.

And it wasn’t merely that new stories were being told; this is Hollywood, after all, so we have some remakes in the mix.  Classic crime thrillers were given contemporary makeovers, like Scorsese’s Cape Fear (1991), as well as Barbet Schroeder’s Kiss of Death (1995), which is mostly remembered, to the extent it’s remembered at all, as the beginning and end of David Caruso’s would-be movie career, but which is much better than its reputation, thanks in no small part to a sharp script by Richard Price (Clockers), full of memorably colorful Queens characters and his signature street-smart dialogue.

Creative experimentation was in full swing, too, as neo-noir films incorporated conventions of other genres, including erotic thriller (Paul Verhoeven’s Basic Instinct [1992]), black comedy (the Coen brothers’ Fargo [1996] and The Big Lebowski [1998]), period throwback (Carl Franklin’s Devil in a Blue Dress [1995]; Curtis Hanson’s L.A. Confidential [1997]), neo-Western (James Mangold’s Cop Land [1997]), and, well, total coffee-cup-shattering, head-in-a-box mindfuckery (Bryan Singer’s The Usual Suspects; David Fincher’s Seven [both 1995]).

Christ, at that point, Quentin Tarantino practically became a subgenre unto himself after the one-two punch of Reservoir Dogs (1992) and Pulp Fiction (1994), which in turn inspired an incessant succession of self-consciously “clever” knockoffs like John Herzfeld’s 2 Days in the Valley (1996) and Gary Fleder’s Things to Do in Denver When You’re Dead (1995).  By the mid-’90s, the crime rate, at least at the cinema, sure seemed like it had risen by 400%.

Tim Roth lies bleeding as Harvey Keitel comes to his aid in a scene from the film “Reservoir Dogs,” 1992 (photo by Miramax/Getty Images)

As different as they all are, those films can almost unanimously be viewed as a repudiation of the ethos of ’80s action movies, in which there were objectively good guys, like John McClane, in conflict with objectively bad guys, like Hans Gruber, in a zero-sum battle for justice, for victory.  It was all very simple and reassuring, in keeping with the archconservative, righteous-cowboy worldview of Ronald Reagan.  And while those kinds of movies continued to find a receptive audience—look no further than the Die Hard–industrial complex, which begat Under Siege (1992) and Cliffhanger (1993) and Speed (1994), among scores of others—filmmakers were increasingly opting for multilayered antiheroes over white hats versus black hats.

Which raised the question:  Given how blurred the lines had become between good guys and bad guys in crime cinema, could you ever go back to telling an earnest, old-school cops-and-robbers story—one with an unequivocally virtuous protagonist and nefarious antagonist—that nonetheless aspired to be something more dramatically credible, more psychologically nuanced, more thematically layered than a Steven Seagal star vehicle?

Enter Michael Mann’s Heat.

Continue reading

Sorting through the Clutter:  How “The Girl Before” Misrepresents Minimalism

The Girl Before depicts minimalism as an obsessive-compulsive symptom of emotional instability, in contrast with what I can attest it to be from years of committed practice:  a versatile set of tools/techniques to promote emotional balance—that is, to attain not merely a clutter-free home, but a clutter-free head.


In the BBC One/HBO Max thriller The Girl Before, created by JP Delaney (based on his novel), brilliant-but-troubled architect Edward Monkford (David Oyelowo)—ah, “brilliant but troubled,” Hollywood’s favorite compound adjective; it’s right up there with “grounded and elevated”—is designer and owner of a postmodern, polished-concrete, minimalist home in suburban London, One Folgate Street, which he rents out, with extreme selectivity, at an affordable rate to “people who live [t]here the way he intended.”  Prospective tenants are required to submit to an uncomfortably aloof interview with Edward, whose otherwise inscrutable mien lapses into occasional expressions of condescending disapproval, and then fill out an interminable questionnaire, which includes itemizing every personal possession the candidate considers “essential.”

The rarified few who meet with Edward’s approval must consent to the 200-odd rules that come with living in the house (no pictures; no ornaments; no carpets/rugs; no books; no children; no planting in the garden), enforced through contractual onsite inspections of the premises.  Meanwhile, One Folgate Street is openly monitored 24/7 by an AI automation system that tracks movements, polices violations of maximum-occupancy restrictions, regulates usage of water and electricity, sets time limits on tooth-brushing, and preselects “mood playlists”—just for that personal touch.  All of this is a reflection of Edward’s catholic minimalist philosophy:  “When you relentlessly eradicate everything unnecessary or imperfect, it’s surprising how little is left.”

“The Girl Before,” starring David Oyelowo, Gugu Mbatha-Raw, and Jessica Plummer

The Girl Before—and I’ve only seen the miniseries, not read the book—intercuts between two time periods, set three years apart, dramatizing the experiences of the current tenant, Jane Cavendish (Gugu Mbatha-Raw), grief-stricken over a recent stillbirth at 39 weeks, and the home’s previous occupant, Emma Matthews (Jessica Plummer), victim of a sexual assault during a home invasion at her flat.  (Emma, we soon learn, has since died at One Folgate Street under ambiguous circumstances that may or may not have something to do with Edward…?)  Edward’s minimalist dogma appeals to both women for the “blank slate” it offers—the opportunity to quite literally shed unwanted baggage.

This being a psychological thriller, it isn’t incidental that both Jane and Emma bear not merely uncanny physical resemblance to one another, but also to Edward’s late wife, who herself died at One Folgate Street along with their child, casualties of an accident that occurred during the construction of the home originally intended for the site before Edward scrapped those plans and went psychoneurotically minimalistic.  Everyone in The Girl Before is traumatized, and it is the imposition of or submission to minimalist living that provides an unhealthy coping mechanism for Edward, Jane, and Emma, each in their own way:

In this novel, [Delaney] wanted to explore the “weird and deeply obsessive” psychology of minimalism, evident in the fad for [Marie] Kondo and her KonMari system of organizing.  “On the face of it,” he wrote, “the KonMari trend is baffling—all that focus on folding and possessions.  But I think it speaks to something that runs deep in all of us:  the desire to live a more perfect, beautiful life, and the belief that a method, or a place, or even a diet, is going to help us achieve that.  I understand that impulse.  But my book is about what happens when people follow it too far.  As one of my characters says, you can tidy all you like, but you can’t run away from the mess in your own head.”

Gregory Cowles, “Behind the Best Sellers:  ‘Girl Before’ Author JP Delaney on Pseudonyms and the Limits of Marie Kondo,” New York Times, February 3, 2017

Indeed.  And if only The Girl Before had been a good-faith exploration of what minimalism, the psychology and practice of it, actually is.


Book Review:  “Blood, Sweat & Chrome” by Kyle Buchanan

Kyle Buchanan’s Blood, Sweat & Chrome, published by William Morrow in February, chronicles the not-to-be-believed making of George Miller’s Mad Max:  Fury Road (2015) from conception to release through interviews with its cast and crew, and celebrates the inspiring creative imagination of the filmmakers, who defied the odds to create a contemporary classic—a movie as singularly visceral as it is stunningly visual.

But much like the nonstop action in the movie itself, the adulation expressed in the book never pauses to interrogate Miller and company’s moral imagination.  Let’s fix that, shall we?


I abhor nostalgia, particularly for the 1980s and ’90s, but I’ve recently found myself revisiting many of the films and television shows of the latter decade, the period during which I first knew I wanted to be a cinematic storyteller, when earnest star-driven Oscar dramas like Forrest Gump (1994) coexisted with, and even prospered alongside, paradigm-shifting indies à la Pulp Fiction (also ’94).  Those days are gone and never coming back—the institution formerly known as Hollywood is now the superhero–industrial complex—but I’ve wondered whether some of those works, so immensely popular and influential then, have stood the test of time.

Yet my informal experiment has been about much more than seeing if some old favorites still hold up (and, by and large, they do); it’s about understanding why they worked in the first place—and what storytelling lessons might be learned from an era in which movies existed for their own sake, as complete narratives unto themselves rather than ephemeral extensions of some billion-dollar, corporately superintended brand.

In an entertainment landscape across which there is so much content, most of it deceptively devoid of coherence or meaning—a transmedia morass I’ve come to call the Multiverse of Madness—the secret to studying narrativity isn’t to watch more but rather less.  To consume fewer movies and TV shows, but to watch them more selectively and mindfully.  Pick a few classics and scrutinize them until you know them backwards and forwards.

In college, I spent an entire semester analyzing Citizen Kane (1941), from reading multiple drafts of its screenplay to watching it all the way through with the volume turned down just to appreciate its unconventional cinematography.  That’s how you learn how stories work:  Study one or two movies/novels per year… but study the shit out of them.  Watch less, but do it far more attentively.

Tom Hardy as Max Rockatansky in “Mad Max: Fury Road,” the subject of “Blood, Sweat & Chrome”

That is, admittedly, a counterintuitive mindset in our Digital Age of automatic and accelerating behaviors, whereby post-credit scenes preemptively gin up anticipation for the next movie (often through homework assignments) before we’ve had a chance to digest the current one, and the autoplay feature of most streaming services encourages and enables mindless TV binge-watching.

But the quarantine, unwelcome though it may have been, did offer a pause button of sorts, and we are only now beginning to see some of the ways in which folks exploited the rare opportunity to slow down, to go deep, that it offered.  One such project to emerge from that period of thoughtful reflection is entertainment journalist Kyle Buchanan’s recently published nonfiction book Blood, Sweat & Chrome:  The Wild and True Story of “Mad Max:  Fury Road”:

In April 2020, as the pandemic swept the planet and the movie-release calendar fell apart, I began writing an oral history of Mad Max:  Fury Road for the New York Times.  Without any new titles to cover, why not dive deeply into a modern classic on the verge of its fifth anniversary?

Every rewatch over those five years had confirmed to me that Fury Road is one of the all-time cinematic greats, an action movie with so much going on thematically that there’d be no shortage of things to talk about.  I had also heard incredible rumors about the film’s wild making, the sort of stories that you can only tell on the record once the dust has long settled.

Kyle Buchanan, Blood, Sweat & Chrome:  The Wild and True Story of “Mad Max:  Fury Road” (New York:  William Morrow, 2022), 337

A movie two decades in the making, Fury Road, the belated follow-up to writer/director George Miller’s dystopian action-film trilogy Mad Max (1979, 1981, 1985) starring a then-unknown Mel Gibson as a wanderer in the wasteland—the Road Warrior—began its long journey to the screen as a proposed television series in 1995 when Miller won back the rights to the franchise from Warner Bros. as part of a settlement from a breach-of-contract suit he’d filed over having been fired from Contact (1997).

Eventually inspired to do another feature instead—“What if there was a Mad Max movie that was one long chase,” Miller pondered, “and the MacGuffin was human?” (ibid., 31)—the ensuing production was plagued with one near-terminal roadblock after another.  The behind-the-scenes story told in Blood, Sweat & Chrome is as thrilling, in its own way, as that of Mad Max:  Fury Road itself.


“Young Indiana Jones” Turns 30:  Storytelling Lessons from George Lucas’ Other Prequel Series

A television series based on an immensely popular action-movie franchise shouldn’t have been a creative or commercial risk—quite the opposite.  But with The Young Indiana Jones Chronicles, which premiered on March 4, 1992, filmmaker George Lucas had no intention of producing a small-screen version of his big-screen blockbusters.  Here’s how Lucas provided a richly imaginative model for what a prequel can and should be—and why it would never be done that way again.


Though he more or less invented the contemporary blockbuster, George Lucas had intended—even yearned—to be an avant-garde filmmaker:

Lucas and his contemporaries came of age in the 1960s vowing to explode the complacency of the old Hollywood by abandoning traditional formulas for a new kind of filmmaking based on handheld cinematography and radically expressive use of graphics, animation, and sound.  But Lucas veered into commercial moviemaking, turning himself into the most financially successful director in history by marketing the ultimate popcorn fodder.

Steve Silberman, “Life After Darth,” Wired, May 1, 2005

After dropping the curtain on his two career- and era-defining action trilogies (Star Wars concluded in 1983, then Indiana Jones in ’89), then failing to launch a new franchise with Willow (his 1988 sword-and-sorcery fantasy fizzled at the box office, though even that would-be IP is getting a “legacy” successor later this year courtesy of the nostalgia–industrial complex), Lucas did in fact indulge his more experimental creative proclivities—through the unlikeliest of projects:  a pair of prequels to both Indiana Jones and Star Wars.  And while both arguably got made on the strength of the brands alone, the prequels themselves would, for better and worse, defy the sacrosanct conventions of blockbuster cinema—as well as the codified narrative patterns of Joseph Campbell’s “heroic journey”—that audiences had come to expect from Lucas.

A perfunctory scene in Return of the Jedi, in which Obi-Wan finally explains Darth Vader’s mysterious backstory to Luke (a piece of business that could’ve been easily handled in the first film, thereby sparing the hero considerable needless risk and disillusionment in The Empire Strikes Back, but whatever), served as the narrative foundation for Lucas’ Star Wars prequel trilogy (1999–2005), in which a precocious tyke (The Phantom Menace) matures into a sullen teenager (Attack of the Clones) before warping into a murderous tyrant (Revenge of the Sith).  Underpinning Anakin’s emo-fueled transformation to the dark side is a byzantine plotline about Palpatine’s Machiavellian takeover of the Republic.  Meanwhile, references to the original trilogy, from crucial plot points to fleeting sight gags, abound.

You’ve all seen the movies, so I’ll say no more other than to suggest the story arc—which is exactly what Obi-Wan summarized in Return of the Jedi, only (much) longer, appreciably harder to follow, and a tonally incongruous mix of gee-whiz dorkiness and somber political intrigue—is precisely the kind of creative approach to franchise filmmaking that would’ve been summarily nixed in any Hollywood pitch meeting, had Lucas been beholden to the corporate precepts of the studio system from which the colossal success of the original Star Wars afforded him his independence.

George Lucas on the set of the “Star Wars” prequels

Which is not to say Lucas’ artistic instincts were infallible.  Financially successful though the prequels were, audiences never really embraced his vision of an even longer time ago in a galaxy far, far away:  Gungans and midi-chlorians and trade disputes didn’t exactly inspire the wide-eyed amazement that Wookiees and lightsabers and the Death Star had.

Maybe by that point Star Wars was the wrong franchise with which to experiment creatively?  Perhaps it had become too culturally important, and audience expectations for new entries in the long-dormant saga were just too high?  In the intervening years, Star Wars had ceased to be the proprietary daydreams of its idiosyncratic creator; culturally if not legally, Star Wars kinda belonged to all of us on some level.  By explicitly starting the saga with Episode IV in 1977, Lucas had invited each of us to fill in the blanks; the backstory was arguably better off imagined than reified.

As an IP, however, Indiana Jones, popular as it was, carried far less expectation, as did the second-class medium of network television, which made Lucas’ intended brand extension more of an ancillary product in the franchise than a must-see cinematic event—more supplemental than compulsory, like a tie-in novel, or the Ewok telefilms of the mid-eighties.  The stakes of the project he envisioned were simply much lower, the spotlight on it comfortably dimmer.  In the event of its creative and/or commercial failure, Young Indiana Jones would be a franchise footnote in the inconsequential vein of the Star Wars Holiday Special, not an ill-conceived vanity project responsible for retroactively ruining the childhoods of millions of developmentally arrested Gen Xers.  Here Lucas expounds on the genesis of the series:


Patriarchal Propaganda: How Hollywood Stories Give Men Delusions of Heroism

Movies and TV shows—and this includes both your favorites and mine—mostly exist to remind us that ours is a man’s world.  Popular entertainment in general, regardless of medium or genre or even the noble intentions of the storytellers, is almost invariably patriarchal propaganda.  But it doesn’t have to be that way.


Since at least as far back as the adventures of Odysseus, men have used fantasy narratives to contextualize ourselves as the solitary heroic protagonist of the world around us—a world that would be appreciably better off if only our judgment weren’t questioned or our actions thwarted by those of inferior hearts and minds.  In the Book of Genesis, God creates man from the dust, gives him dominion over the Earth, then provides him with a “helper”—Eve—who proves considerably less than helpful when she defiantly feeds from the tree of the knowledge of good and evil and spoils Paradise for everyone.

Such are the stories we’ve been hearing for literally thousands of years, and the reality we live in today is very much shaped by the presumption of patriarchy they propagandize.  In 1949, this way of telling stories—the Hero’s Journey—was codified by comparative mythologist Joseph Campbell in The Hero with a Thousand Faces, and adopted by Hollywood as the blueprint for the blockbuster.  From our Westerns (Dances with Wolves) to our policiers (Dirty Harry) to our space operas (Star Wars) to our spy thrillers (James Bond) to our teen comedies (Ferris Bueller’s Day Off) to our superhero universes (Iron Man) to our mob dramas (The Sopranos) to our sitcoms (Two and a Half Men), it’s a man’s world, baby—with the rest of you there to either support us or (foolishly) stand in our way.

It’s not that there’s anything inherently wrong with escapist entertainment.  It isn’t fantasy the genre that’s the problem, or even the Hero’s Journey story model, but rather the near-universal underlying patterns and motifs in our popular fictions that have unconsciously supported—that have surreptitiously sold us—the fantasy of patriarchal hegemony.  As such, white men in particular have been conditioned by these cultural narratives to see ourselves as the central heroic figure in the Epic of Me—even our storytellers themselves:

While accepting the award for Outstanding Directing for a Limited or Anthology Series or Movie for his work on The Queen’s Gambit, Scott Frank brushed off the “get off the stage” music not once but three times, reading a prepared speech from two full-length pages he’d shoved into his pocket and blowing past his allotted 45 seconds to speak for three minutes and 28 seconds—more than four and a half times as long as he was supposed to.

Viewers couldn’t have asked for a more perfect embodiment of white male privilege and entitlement as a visibly annoyed Frank reacted to the orchestra’s attempts to play him off by saying, “Really?  No,” and making a dismissive hand gesture.  The second time they started playing, he said, “Seriously, stop the music,” again waving his hand as if he were shooing away a fly and pressing on.  The third time, he insisted, “I’m almost done.”  Each time, when he commanded them to stop playing the music, they actually stopped the music.  Who knew it was that easy?

Bonnie Stiernberg, “Those ‘Queen’s Gambit’ Emmy Speeches Epitomized Exactly What’s Wrong With Hollywood,” InsideHook, September 20, 2021

Whether we’re aware of them or not, men have certain expectations about how the world should work:  that it should work for us.  After all, God gave us, not you, dominion over all living things and natural resources on Earth.  But when reality conflicts with those birthrights, we grow frustrated, and rather than questioning the stories we’ve been told about our place in the world, we tell more of the same self-mythologizing horseshit—to assure ourselves, and others, of our God-given entitlements, of our singular heroism.  Consider, for example, the overwhelming popularity—ten entries and counting—of the testosterone-charged Fast & Furious franchise:

These films use the concrete landscape to assert individuality and a refusal to knuckle under to authority.  With the exception of Brian and perhaps Roman, these inner-city car racers don’t want to be reintegrated into society.  They race cars to gain status and money, to impress sexy women, and to defy the police—just like [celebrated American NASCAR driver] Junior Johnson and the Dukes of Hazzard.  But, like the conformists and suburbanites they reject, they act like everything in nature exists to be consumed and exploited.

robin, “The Fast and Furious Films and Mad Max Fury Road,” Ecocinema, Media, and the Environment [blog], September 20, 2019

Dominic Toretto (Vin Diesel) famously makes his gang say grace before they eat, an utterly meaningless gesture since, unlike obeying the law, it costs him nothing to do so, yet it nonetheless speaks volumes about his patriarchal values.  He and his crew aren’t enlightened antiheroes as they believe, merely entitled gearheads who proudly and explicitly live their lives “a quarter mile at a time,” because to think beyond that would require a sense of empathy for those outside their range of awareness, as well as compel a sober consideration for the long-term consequences of their, to put it generously, socially irresponsible behaviors.

Dom appropriates Christian iconography to assert his moral authority—pure patriarchal propaganda

(And if you’re inclined to dismiss Dom’s worldview as the patently absurd pseudophilosophy of a one-dimensional action-movie street racer—nothing worth taking seriously—it’s worth remembering that Facebook co-founder and CEO Mark Zuckerberg’s operating motto is “Move fast and break things,” which sounds like exactly the sort of iconoclastic rallying cry you’d expect to hear Dom growl… until you realize the thing Zuckerberg might’ve broken while becoming the fifth wealthiest person in the world is democracy itself.)


The Ted Lasso Way: An Appreciation

The Emmy-nominated comedy series Ted Lasso doesn’t merely repudiate the knee-jerk cynicism of our culture—it’s the vaccine for the self-reinforcing cynicism of our pop culture.  In a feat of inspiring commercial and moral imagination, Jason Sudeikis has given us a new kind of hero—in an old type of story.


As a boy coming of age in the eighties and early nineties, I had no shortage of Hollywood role models.  The movies offered smartass supercops John McClane and Martin Riggs, vengeful super-soldiers John Matrix and John Rambo, and scorched-earth survivalists Snake Plissken and Mad Max, to cite a select sampling.  Sure, each action-hero archetype differed somewhat in temperament—supercops liked to crack wise as they cracked skulls, whereas the soldiers and survivalists tended to be men of few words and infinite munitions—but they were, one and all, violent badasses of the first order:  gun-totin’, go-it-alone individualists who refused to play by society’s restrictive, namby-pamby rules.

Yippee ki-yay.

The small screen supplied no shortage of hero detectives in this mode, either—Sonny Crockett, Thomas Magnum, Rick Hunter, Dennis Booker—but owing to the content restrictions of broadcast television, they mostly just palm-slammed a magazine into the butt of a chrome Beretta and flashed a charismatic GQ grin in lieu of the clever-kill-and-quick-one-liner m.o. of their cinematic counterparts.  (The A-Team sure as hell expended a lot of ammo, but their aim was so good, or possibly so terrible, the copious machine-gun fire never actually made contact with human flesh.)  The opening-credits sequences—MTV-style neon-noir music videos set to power-chord-driven instrumentals—made each show’s gleaming cityscape look like a rebel gumshoe’s paradise of gunfights, hot babes, fast cars, and big explosions.

It might even be argued our TV heroes exerted appreciably greater influence on us than the movie-franchise sleuths that would often go years between sequels, because we invited the former into our home week after week, even day after day (in syndication).  And to be sure:  We looked to those guys as exemplars of how to carry ourselves.  How to dress.  How to be cool.  How to talk to the opposite sex.  How to casually disregard any and all institutional regulations that stood in the way of a given momentary impulse.  How to see ourselves as the solitary hero of a cultural narrative in which authority was inherently suspect and therefore should be proudly, garishly, and reflexively challenged at every opportunity.  The world was our playground, after all—with everyone else merely a supporting actor in the “great-man” epic of our own personal hero’s journey.

Oh, how I wish, in retrospect, we’d had a heroic role model like Jason Sudeikis’ Ted Lasso instead.

THE LAST BOY SCOUT

The premise of Ted Lasso, which recently commenced its second season, is that a can-do college-football coach from Kansas (Sudeikis) is inexplicably hired to manage an English Premier League team, despite that kind of football being an entirely different sport.  Ted, we learn, has been set up to fail by the embittered ex-wife of the club’s former owner (Hannah Waddingham), who, in a plot twist that owes no minor creative debt to David S. Ward’s baseball-comedy classic Major League—which the show tacitly acknowledges when Ted uncharacteristically invokes a key line of profane dialogue from the movie verbatim—inherited the team in a divorce and is now surreptitiously revenge-plotting its implosion.

Jason Sudeikis as Ted Lasso

But, boy oh boy, has Waddingham’s Rebecca Welton—a refreshingly dimensional and sympathetic character in her own right, it’s worth noting—seriously underestimated her handpicked patsy.  With his folksy enthusiasm and full Tom Selleck ’stache, Coach Ted Lasso unironically exemplifies big-heartedness, open-mindedness, kindness, courtesy, chivalry, civility, forgiveness, wisdom, teamwork, cultural sensitivity, and prosocial values—all with good humor, to boot.  His infectious optimism eventually converts even the most jaded characters on the show into true believers, and his innate goodness inspires everyone in his orbit—often despite themselves—to be a better person.  And if, like me, you watch the first season waiting for the show to at some point subject Ted’s heart-on-his-sleeve earnestness to postmodern mockery or ridicule—“spoiler alert”—it doesn’t.


Here Lies Buffy the Vampire Slayer: On Letting Go of a Fan Favorite—and Why We Should

Last month, actress Charisma Carpenter publicly confirmed a longstanding open secret in Hollywood:  Buffy the Vampire Slayer creator and Avengers writer/director Joss Whedon is an irredeemable asshole.

For years, fans of “Buffy the Vampire Slayer,” which aired on the WB and UPN from 1997 to 2003, have had to reconcile their adoration for a show about a teenage girl who slays monsters with the criticism that often swirled around her creator.

Mr. Whedon’s early reputation as a feminist storyteller was tarnished after his ex-wife, the producer Kai Cole, accused him of cheating on her and lying about it.  The actress Charisma Carpenter, a star of the “Buffy” spinoff “Angel,” hinted at a fan convention in 2009 that Mr. Whedon was not happy when she became pregnant.

In July, Ray Fisher, an actor who starred in Mr. Whedon’s 2017 film “Justice League,” accused him of “gross” and “abusive” treatment of the cast and crew. . . .

On Wednesday, Ms. Carpenter released a statement in support of Mr. Fisher, in which she said Mr. Whedon harassed her while she was pregnant and fired her after she gave birth in 2003. . . .

Over the past week, many of the actors who starred on “Buffy,” including Sarah Michelle Gellar, who played Buffy Summers, have expressed solidarity with Ms. Carpenter and distanced themselves from Mr. Whedon.  The actress Michelle Trachtenberg, who played Buffy’s younger sister, Dawn, alleged on Instagram on Thursday that Mr. Whedon was not allowed to be alone with her.

“I would like to validate what the women of ‘Buffy’ are saying and support them in telling their story,” Marti Noxon, one of the show’s producers and longtime writers, said on Twitter.  Jose Molina, a writer who worked on Mr. Whedon’s show “Firefly,” called him “casually cruel.”

Maria Cramer, “For ‘Buffy’ Fans, Another Reckoning With the Show’s Creator,” New York Times, February 15, 2021

If the copious fan-issued blog posts and video essays on this damning series of insider testimonials are an accurate barometer, Millennials have been particularly crestfallen over Whedon’s fall from grace.  It’s only over the last few years, really, that I’ve come to truly appreciate just how proprietary they feel about Buffy the Vampire Slayer.  That surprises me still, because I tend to think of Buffy as a Gen X artifact; after all, the modestly successful if long-derided (by even screenwriter Whedon himself) feature film was released five years before its TV sequel.  (If you don’t remember—and I’ll bet you don’t—the movie’s shockingly impressive cast includes no less than pre-stardom Xers Hilary Swank and Ben Affleck.)  I recall seeing this one-sheet on a subway platform during the summer between sophomore and junior years of high school—

Fran Rubel Kuzui’s “Buffy the Vampire Slayer” (1992)

—and thinking somebody had finally made a spiritual sequel to my formative influence:  Joel Schumacher’s Gen X cult classic The Lost Boys.  (Turned out, however, I was gonna have to do that myself.)  I was sold!  I marvel still at how the advertisement’s economical imagery conveys the movie’s entire premise and tone.  So, yes—I was the one who went to see Buffy the Vampire Slayer in theaters.  Guilty as charged.

But it was the TV series, I’ll concede, that took Buffy from creative misfire to cultural phenomenon, so it stands to reason it made such an indelible impression on Millennials.  I submit that more than any content creator of his cohort—more so than even celebrated pop-referential screenwriters Kevin Smith or Quentin Tarantino or Kevin Williamson—Whedon is preeminently responsible for the mainstreaming of geek culture at the dawn of the Digital Age.

Buffy not only coincided with the coming out of geeks from the dusty recesses of specialty shops, it helped facilitate that very cultural shift:  As John Hughes had done for Gen X a decade earlier, Whedon spoke directly to the socially and emotionally precarious experience of adolescent misfits, and his comic-book-informed sensibilities (before such influences were cool) endowed the Buffy series with a rich, sprawling mythology—and star-crossed romance (beautiful though it is, Christophe Beck’s Buffy/Angel love theme, “Close Your Eyes,” could hardly be described as optimistic)—over which fans could scrupulously obsess.

What’s more, all three cult serials Whedon sired were alienated underdogs in their own right:  Buffy the Vampire Slayer, a reboot of a campy B-movie on a fledgling, tween-centric “netlet” that no one took seriously; Angel, a second-class spinoff that was perennially on the brink of cancelation (and ultimately ended on an unresolved cliffhanger); and Firefly, his ambitious Star Wars–esque space opera that lasted exactly three months—or less than the average lifespan of an actual firefly.  That these shows struggled for mainstream respect/popular acceptance only burnished Whedon’s credentials as the bard of geek-outsider angst…


© 2024 Sean P Carlin
