Writer of things that go bump in the night


Under the Influence, Part 2:  The Top Five Formative Cinematic Muses from My ’80s Childhood

Let’s play Ten for Ten!  To commemorate the ten-year anniversary of this blog, which launched on June 26, 2014, here’s an appreciation for ten of my formative cinematic influences—an examination of why these movies resonated with me when I first saw them, and how they permanently informed my aesthetic tastes and creative sensibilities.  This post is presented in three installments.

“Under the Influence, Part 1” informally ponders through personal example how an artist develops a singular style and voice all their own, and offers an analysis of Quentin Tarantino’s essay collection Cinema Speculation, the auteur’s critical look at the movies of the ’70s that inspired him.

In “Under the Influence, Part 2,” I spotlight five films from my ’80s childhood that shaped my artistic intuition when at its most malleable.

And in “Under the Influence, Part 3,” I round out the bill with five selections from my ’90s adolescence, the period during which many of the themes that preoccupy me crystallized.


Given that my childhood coincided with what Quentin Tarantino terms “the miserable eighties”—that decade of “middle-of-the-road successful films”1 during which “likeability was everything”2—the following ten cinematic specimens that impressed so notably upon my nascent imagination, accordingly, span the years 1978 through 1993.

Before we dive in, let’s stipulate what this digest isn’t.  These are not what I consider the Best Movies Ever, or even the best movies of their era, neither of which I am particularly qualified to judge.

Furthermore, they are not necessarily even my favorite movies, merely the ones that made a meaningful, lasting, and demonstrable impression on me, and whose DNA has (repeatedly) found its way into my own work.

Nor does this cover my literary or musical influences, because, as Geddy Lee suggests, the project of tracing this stuff ain’t easy; it took a surprising amount of rumination to settle upon the ten selections studied here.  (None of them are particularly obscure; if you haven’t seen all ten, you’ve at least heard of them.)

I have excluded any films that may have once held sway over me, particularly ’80s action movies (from police thrillers to sci-fi dystopias to car-worshipping petro-propaganda), whose hypermasculine spirit and/or trashy cynicism I can no longer in good conscience abide.

It must also be noted I am uncomfortably aware of how, well, white all my chosen case studies are.  The filmmakers and screenwriters are nigh exclusively straight white men, with the known exceptions of Joel Schumacher, Leslie Newman, and Janice Fischer.

What’s more, every protagonist across the board is a straight white male, several of them either explicitly or implicitly Irish American, at that.  Boys like me were very well represented in popular media back then—still are—while there are precious few actors of color to be found in any of these productions, and, in those rare instances, always in small or supporting roles.

These cinematic influences are all unambiguously predicated on a heteronormative worldview and a white male perspective.  I acknowledge that.

But… as much as they (mostly) glorify white boys, they all (save one) speak to at least one of two themes that have fascinated me throughout my life, and which are the dominant subjects of my own fiction.

The first is the complicated dynamic between fathers/men and sons/boys.

The second:  the special bond of boyhood friendship, and how boys often look to each other for the emotional support they don’t get from their parents.

Men’s relationships with their fathers and their friends were a central theme of “Ted Lasso”

The stories I respond to and the stories I write are, for the most part, about straight white males.  But I consciously seek to eschew the reductive paradigms favored by Hollywood—notably the solitary antihero and middle-aged manchild—in favor of men who are competent but not superheroic, compassionate but not saintly, flawed but not cruel, and definitely not proudly antisocial, brazenly irresponsible, or comfortably violent.

In stark contrast with Tarantino’s reflexively defensive view that cinematic expressions of violence and hypermasculinity (to say nothing of the institutionalized misogyny that inspirits them) are harmless exercises in wish fulfillment, I believe commercial storytellers—particularly straight white cisgendered men—have a moral obligation to be a productive part of the cultural conversation initiated by the #MeToo movement and the George Floyd protests:

We have spent the past half-decade wrestling with ideas of gender and privilege, attempting to challenge the old stereotypes and power structures.  These conversations should have been an opportunity to throw out the old pressures and norms of manhood, and to help boys and men be more emotionally open and engaged.  But in many ways this environment has apparently had the opposite effect—it has shut them down even further. . . .

Perhaps it’s not surprising that in the grip of the culture wars, caring about boys has become subtly coded as a right-wing cause, a dog whistle for a kind of bad-faith politicking.  Men have had way more than their fair share of our concern already, the reasoning goes, and now it’s time for them to pipe down.  But for boys, privilege and harm intertwine in complex ways—male socialization is a strangely destructive blend of indulgence and neglect.  Under patriarchy, boys and men get everything, except the thing that’s most worth having:  human connection.

Silencing or demonizing boys in the name of progressive ideals is only reinforcing this problem, pushing them further into isolation and defensiveness.  The prescription for creating a generation of healthier, more socially and emotionally competent men is the same in the wider political discourse as it is in our own homes—to approach boys generously rather than punitively.  We need to acknowledge boys’ feelings, to talk with our sons in the same way we do our daughters, to hear them and empathize rather than dismiss or minimize, and engage with them as fully emotional beings.

Ruth Whippman, “Boys Get Everything, Except the Thing That’s Most Worth Having,” Opinion, New York Times, June 5, 2024

We storytellers could contribute to a meaningful shift of the cultural mindset if we summoned the moral imagination to refuse to further represent masculinity as a binary (and compulsory) choice between two equally oppressive and simplistic models of social posturing and self-identity—either he-man or Peter Pan—and dared to instead portray boys and men as human beings of nuanced emotion, as capable of expressing sympathy as they are deserving of receiving it.

Now more than ever, we need thoughtful, responsible fiction by men about men—stories that explore masculinity and manhood without invoking the same tired, narrow, noxious archetypes of tough-guy antiheroes who “play by their own rules” and stunted-adolescent slackers for whom the mere acknowledgment of rules, let alone adherence to them, is the stuff of “adulting,” and fuck that shit.  Such prosocial, aspirational fiction might very well be called helpful exercises in wish fulfillment.  That’s what I’ve called for, and what I strive to produce myself.

Now let’s look, in mostly linear order, at the films that shaped my tastes and style, starting with the first five (of ten) selections.

Continue reading

No, Virginia, “Die Hard” Is Not a Christmas Movie

Ah, it’s that magical time of year!  When the Hudson hawk nips at the nose, and the skyline over the New Jersey Palisades bruises by midafternoon.  When chimney smoke from private houses spices the air, and strings of colored lights adorn windows and fire escapes.  And, of course, when the Internet engages in its annual bullshit debate as to whether perennial holiday favorite Die Hard, currently celebrating its thirty-fifth anniversary, is a Christmas movie.  And since “bullshit debates” are my brand…


In fourth grade, I scored what was, by 1980s standards, the holy grail:  a best friend with HBO.  Over the following five years, I slept over at his house every weekend, where we watched R-rated action movies into the night.  Whatever HBO was showing that week, we delighted in it, no matter how idiotic (Action Jackson) or forgettable (Running Scared).  For a pair of preadolescent boys, that Saturday-night cinematic grab bag abounded with illicit wonders.

Much as we enjoyed those movies, though, they were for the most part—this isn’t a criticism—ephemeral crap.  We howled at their profane jokes and thrilled to their improbable set pieces, but seldom if ever revisited any of them (Beverly Hills Cop [1984] and its sequel [1987] being rare exceptions), and certainly none inspired us to playact their scenarios as we had with PG-rated adventures Ghostbusters (1984) and Back to the Future (1985).  They entertained us, sure, but didn’t exactly impress upon our imaginations in any lasting or meaningful way…

That is, not until an action thriller with the snarky guy from Moonlighting (1985–1989) and Blind Date (1987) came along.  I still remember seeing Die Hard (1988) for the first time, on a thirteen-inch television with side-mounted mono speaker at my friend’s Bronx apartment.  As a viewing experience, it was about as low-def as they come, but that didn’t diminish the white-knuckled hold the movie had on us; we watched it in astonished silence from beginning to end.  From that point on—and this was the year no less than Tim Burton’s Batman had seized the zeitgeist, and our longstanding favorites Ghostbusters and Back to the Future got their first sequels—Die Hard was almost all we could talk about.

At the time, Manhattan College was in the process of erecting a twelve-story student residence overlooking Van Cortlandt Park, and we would gather with our JHS pals at the construction site on weekends, running around the unfinished edifice with automatic squirt guns, playing out the movie’s gleefully violent plot.  Hell, at one point or another, every multistory building in the neighborhood with a labyrinthine basement and rooftop access became Nakatomi Plaza, the setting of a life-and-death battle staged and waged by a group of schoolboys, our imaginations captive to the elemental premise of Die Hard.

We obsessed over that fucking movie so exhaustively, we passed around this still-in-my-possession copy of the pulp-trash novel it was based on—Roderick Thorp’s Nothing Lasts Forever (1979)—until every one of us had had a chance to read it:

The now-battered copy of “Nothing Lasts Forever” I bought in 1989 at the long-gone Bronx bookstore Paperbacks Plus

The thirteen-year-old boys of the late ’80s were far from the only demographic taken with Die Hard.  The movie proved so hugely popular, it not only spawned an immediate sequel in 1990 (which we were first in line to see at an appallingly seedy theater on Valentine Avenue), but became its own subgenre throughout the rest of that decade.  Hollywood gave us Die Hard on a battleship (Under Siege), Die Hard on a plane (Passenger 57), Die Hard on a train (Under Siege 2:  Dark Territory), Die Hard on a mountain (Cliffhanger), Die Hard on a bus (Speed), Die Hard on a cruise ship (Speed 2:  Cruise Control), Die Hard in a hockey arena (Sudden Death), Die Hard on Rodeo Drive (The Taking of Beverly Hills), Die Hard at prep school (Toy Soldiers)…

Christ, things got so out of control, even Beverly Hills Cop, an established action franchise predating Die Hard, abandoned its own winning formula for the third outing (scripted by Steven E. de Souza, co-screenwriter of the first two Die Hards) in favor of a half-assed “Die Hard in an amusement park” scenario.  This actually happened:

Eddie Murphy returns as Axel Foley—sort of—in “Beverly Hills Cop III” (1994)

None of those films has had the staying power of the original Die Hard.  Mostly that’s owed to Die Hard being a superior specimen of filmmaking.  Director John McTiernan demonstrates uncommonly disciplined visual panache:  He expertly keeps the viewer spatially oriented in the movie’s confined setting, employing swish pans and sharp tilts to establish the positions of characters within a given scene, as well as imbue the cat-and-mouse of it all with breathless tension.

McTiernan consistently sends his hero scuttling to different locations within the building—stairwells, pump rooms, elevator shafts, air ducts, the rooftop helipad—evoking a rat-in-a-cage energy that leaves the viewer feeling trapped though never claustrophobic.  The narrative antithesis of the globetrotting exploits of Indiana Jones and James Bond, Die Hard is a locked-room thriller made with an ’80s action-movie sensibility.  It was and remains a masterclass in suspense storytelling—often imitated, as the old saying goes, never duplicated.

Perhaps another key reason for the movie’s durability, its sustained cultural relevance, is owed to its (conditional) status as a celebrated Christmas classic.  Like It’s a Wonderful Life (1946) and National Lampoon’s Christmas Vacation (1989) and Love Actually (2003), Die Hard is a feel-good film—albeit with a considerably higher body count—one is almost compelled to watch each December.  Yet whereas nobody questions any of the aforementioned movies’ culturally enshrined place in the holiday-movie canon—nor that of cartoonishly violent Home Alone (1990)—Die Hard’s eligibility seems perennially under review.

Why does the debate around Die Hard die hard… and is it, in fact, a Christmas movie?

Continue reading

Highway to Hell:  Car Culture and Hollywood’s Hero-Worship of the Automobile

With road-trip season upon us once again, here’s an examination of how American car culture has been romanticized by the entertainment industry; how automobiles, far from enablers of freedom and individuality, are in fact “turbo-boosted engines of inequality”; and how Hollywood can help remedy an ecocultural crisis it’s played no small role in propagating.


In any given episode, the action reliably starts the same way:  a wide shot of the Batcave, Batmobile turning on its rotating platform to face the cavemouth, camera panning left as the Dynamic Duo descend the Batpoles.  Satin capes billowing, Batman and Robin hop into their modified 1955 Lincoln Futura, buckle up—decades before it was legally required, incidentally—and the engine whines to life as they run through their pre-launch checklist:

ROBIN:  Atomic batteries to power.  Turbines to speed.

BATMAN:  Roger.  Ready to move out.

A blast of flame from the car’s rear thruster—whoosh!—and off they’d race to save the day.

By the time the 1980s had rolled around, when I was first watching Batman (1966–1968) in syndicated reruns, every TV and movie hero worth his salt got around the city in a conspicuously slick set of wheels.  Muscle cars proved popular with working-class ’70s sleuths Jim Rockford (Pontiac Firebird) and Starsky and Hutch (Ford Gran Torino).  The neon-chic aesthetic of the Reagan era, however, called for something a bit sportier, like the Ferrari, the prestige ride of choice for Honolulu-based gumshoe Thomas Magnum (Magnum, P.I.) and buddy cops Crockett and Tubbs (Miami Vice).  The ’80s were nothing if not ostentatiously aspirational.

Even when cars were patently comical, they came off as cool despite themselves:  the Bluesmobile, the 1974 Dodge Monaco used in The Blues Brothers (1980); the Ectomobile, the 1959 Cadillac Miller-Meteor Sentinel in Ghostbusters (1984); the Wolfmobile, a refurbished bread truck that Michael J. Fox and his pal use for “urban surfing” in Teen Wolf (1985).

The DMC DeLorean time machine from Back to the Future is clearly meant to be absurd, designed in the same kitchen-sink spirit as the Wagon Queen Family Truckster from National Lampoon’s Vacation (1983), but what nine-year-old boy in 1985 didn’t want to be Michael J. Fox, sliding across the stainless-steel hood and yanking the gull-wing door shut behind him?  And like the characters themselves, the DeLorean evolved with each movie, going from nuclear-powered sports car (Part I) to cold-fusion flyer (Part II) to steampunk-retrofitted railcar (Part III).  “Maverick” Mitchell’s need for speed didn’t hold a candle to Marty McFly’s, whose very existence depended on the DeLorean’s capacity to reach 88 miles per hour.

Vehicles that carried teams of heroes offered their own vicarious pleasure.  Case in point:  the 1983 GMC Vandura, with its red stripe and rooftop spoiler, that served as the A-Team’s transpo and unofficial HQ—a place where they could bicker comically one minute then emerge through the sunroof the next to spray indiscriminate gunfire from their AK-47s.  The van even had a little “sibling”:  the Chevrolet Corvette (C4) that Faceman would occasionally drive, marked with the same diagonal stripe.  Did it make sense for wanted fugitives to cruise L.A. in such a distinctive set of wheels?  Not really.  But it was cool as hell, so.

The Mystery Machine was the only recurring location, as it were, on Scooby-Doo, Where Are You! (1969), and the van’s groovy paint scheme provided contrast with the series’ gloomy visuals.  Speaking of animated adventures, when once-ascetic Vietnam vet John Rambo made the intuitive leap from R-rated action movies to after-school cartoon series (1986), he was furnished with Defender, a 6×6 assault jeep.  Not to be outdone, the most popular military-themed animated franchise of the ’80s, G.I. Joe:  A Real American Hero (1983–1986), featured over 250 discrete vehicles, and the characters that drove them were, for the most part, an afterthought:

With the debut of the 3 ¾” figures in 1982, Hasbro also offered a range of vehicles and playsets for use with them.  In actual fact, the 3 ¾” line was conceived as a way to primarily sell vehicles—the figures were only there to fill them out!

‘3 ¾” Vehicles,’ YoJoe!

But who needs drivers when the vehicles themselves are the characters?  The protagonists of The Transformers (1984–1987) were known as the Autobots, a race of ancient, sentient robots from a distant planet that conveniently shapeshifted into 1980s-specific cars like the Porsche 924 and Lamborghini Countach, among scores of others.  (The premise was so deliriously toyetic, it never occurred to us to question the logic of it.)  Offering the best of both G.I. Joe and The Transformers, the paramilitary task force of M.A.S.K. (1985–1986), whose base of operations was a mountainside gas station (what might be described as Blofeld’s volcano lair meets the Boar’s Nest), drove armored vehicles that transformed into… entirely different vehicles.

Many movies and shows not only featured cars as prominent narrative elements, but literally took place on the road:  Vacation.  Mad Max (1979).  Smokey and the Bandit (1977).  CHiPs (1977–1983).  Sometimes the car was so important it had a proper name:  General Lee from The Dukes of Hazzard (1979–1985).  Christ, sometimes it was the goddamn series costar:  KITT on Knight Rider (1982–1986).  Shit on David Hasselhoff’s acting ability all you want, but the man carried a hit TV show delivering the lion’s share of his dialogue to a dashboard.  Get fucked, Olivier.

1980s hero-car culture at a glance

As a rule, productions keep multiple replicas of key picture cars on hand, often for different purposes:  the vehicle utilized for dialogue scenes isn’t the one rigged for stunts, for instance.  It’s notable that the most detailed production model—the one featured in medium shots and closeups, in which the actors perform their scenes—is known as the “hero car.”  And why not?  Over the past half century, Hollywood has unquestionably programmed all of us to recognize the heroism of the automobile.

Continue reading

Into Each Generation a Slayer Is Born:  How the “Buffy” Franchise Demonstrates the Differences between Gen X and Millennials

A cultural blip, disowned and dismissed.  A cultural phenomenon, nurtured and celebrated.  Is there any doubt Kristy Swanson’s Buffy the Vampire Slayer is an Xer, and Sarah Michelle Gellar’s a Millennial?


Joss Whedon famously dislikes the movie made from his original screenplay for Buffy the Vampire Slayer (1992), directed by Fran Rubel Kuzui and starring Kristy Swanson.  Seems he’d envisioned a B-movie with a Shakespearean soul, whereas Kuzui saw pure juvenile camp—an empowerment tale for prepubescent girls.

Buffy arrived right before it became cool for teenagers to brood about real things like depression and the cost of Doc Martens.  But something about this particular movie was bewitching to a tweeny bopper with an alternative undertow.  It had gloss and edge—but more gloss than edge.  This was a pre-Clueless, Skittles-tinted ode to California ditz. . . .  The result was an unfussy pre–Spice Girls girl-power fantasy for a 12-year-old kid.

Soraya Roberts, “I’ll Always Love the Original Buffy the Vampire Slayer,” Atlantic, July 31, 2022

Only a modest success during its theatrical run, the cult horror/comedy found an appreciable audience on VHS.  Three years later, nascent netlet The WB saw an opportunity to bring the inspired concept of Valley girl–turned–vampire slayer to television—only this time under the auspices of the IP’s disgruntled creator:

Building on his original premise, he re-imagined the monsters as metaphors for the horrors of adolescence.  In one climactic scene, Buffy loses her virginity to a vampire who has been cursed with a soul; the next morning, his soul is gone and he’s lusting for blood.  Any young woman who had gone to bed with a seemingly nice guy only to wake up with an asshole could relate. . . .

In those early days of the internet, before nerd culture swallowed the world, fans flocked to a message board set up by the WB to analyze Buffy with the obsessive zeal of Talmudic scholars.  Whedon knew how to talk to these people—he was one of them.  He would visit the board at all hours to complain about his grueling schedule or to argue with fans about their interpretations of his work.  Back then, as he pointed out to me, the internet was “a friendly place,” and he, the quick-witted prince of nerds, “had the advantage of it.”

Lila Shapiro, “The Undoing of Joss Whedon,” Vulture, January 17, 2022

It is impossible to fully appreciate the monopolistic stranglehold geek interests have maintained on our culture over the first two decades of this millennium without acknowledging the pivotal role Buffy the Vampire Slayer (1997–2003) played in elevating such pulp ephemera to a place of mainstream legitimacy and critical respectability.  It was the right premise (Whedon pitched it as My So-Called Life meets The X-Files) on the right network (one willing to try new ideas and exercise patience as they found an audience) by the right creator (a card-carrying, self-professed geek) speaking to the right audience (impressionable Millennials) at the right time (the dawn of the Digital Age).  It all synthesized at exactly that moment.  Forget Booger—Buffy was our culture’s revenge of the nerds.

Sarah Michelle Gellar and Joss Whedon on the set of “Buffy the Vampire Slayer”

In what was surely a first for any geek or screenwriter, let alone a combo platter, a cult of hero worship coalesced around Whedon.  His genius was celebrated on message boards and at academic conferences, inked in books and on body parts.  “He was a celebrity showrunner before anyone cared who ran shows” (ibid.).

Master storyteller that he is, Whedon didn’t merely reset the narrative of Buffy; he reframed the narrative about it.  While serving as a loose sequel to the feature film, the television series wasn’t Buffy the Vampire Slayer 2 so much as Buffy the Vampire Slayer 2.0—a complete overhaul and upgrade.  This was Buffy as it was always intended to be, before Hollywood fucked up a great thing.  That the startup-network show emerged as a phoenix from the ashes of a major-studio feature only burnished Whedon’s geek-underdog credentials.  To utter the word “Buffy” was to be speaking unambiguously about the series, not the movie.

What movie?

In 1997, Whedon premiered his Buffy series on The WB and essentially wiped the film from the collective memory.

By that point, I had turned 17, and even though the show was more serious than the movie, even though its universe was cleverer and more cohesive, even though the silent episode “Hush” was probably one of the best things on television at the time it aired, Buffy was still a vampire show—to me, it was just kids’ play.  My adolescence adhered to a kind of Gen-X aimlessness, to indie films with lots of character and very little plot.  Whedon’s show seemed more like the kind of thing Reality Bites would make fun of—a juvenile, overly earnest studio product.

Roberts, “I’ll Always Love the Original Buffy the Vampire Slayer

As a member of Ms. Roberts’ demographic cohort, four years her senior, I’ll second that appraisal.  Yet for the Millennials who came of age in a post-Whedon world, and who were introduced to Buffy through the series—who fell in love with her on TV—Whedon’s creative contextualization of the movie became the universally accepted, unchallenged, and perennially reinforced perception of it:

You actually can’t watch the Buffy the Vampire Slayer film online, and honestly, you might be better off.  Luckily, all seven seasons of the Whedon-helmed (and approved) masterpiece that is Buffy the Vampire Slayer the series is easily streamed.  25 years later, Buffy movie is proof that our heroine was always better off in the hands of her maker.

Jade Budowski, “The ‘Buffy’ Movie At 25:  A Rough, Rough Draft Of The Magic That Followed,” Decider, July 31, 2017

The simultaneous display of blind devotion, proprietary entitlement, and self-assured dismissiveness in a statement like that, far from the only likeminded Millennial assessment of Buffy, is the kind of thing we humble Xers have spent a lifetime swallowing and shrugging off, even—especially—when we know better.  Not that anyone much cares what we have to say:

Here’s a refresher on the measliness of Generation X:  Our parents were typically members of the Silent Generation, that cohort born between 1928 and 1945—people shaped by the Great Depression and World War II, people who didn’t get to choose what they were having for dinner and made sure their kids didn’t either.  The parents of Gen X believed in spanking and borderline benign neglect, in contrast to the boisterous boomers and their deluxe offspring, the millennial horde. . . .

. . . Baby boomers and millennials have always had a finely tuned sense of how important they are.  Gen Xers are under no such illusion.  Temperamentally prepared to be criticized and undermined at all times, we never entirely trusted the people in charge anyway.

Pamela Paul, “Gen X Is Kind of, Sort of, Not Really the Boss,” Opinion, New York Times, August 14, 2022

Whereas the Millennials who deified Whedon have in recent years had to square their enduring love for Buffy with the spate of damning accusations against him—marital infidelity, feminist hypocrisy, emotionally abusive treatment of subordinates—the geek god’s fall from grace is no skin off Gen X’s nose; Big Daddy disavowed our Buffy, to the extent we feel that strongly about it one way or the other, decades ago.  Lucky for us, as Ms. Paul observes, we never entirely trusted the people in charge anyway.  And since Whedon’s critique of the Buffy movie remains to this day the culturally enshrined view of it, perhaps that merits reconsideration, too?

For the past quarter century, the differences between the Buffy movie and TV series have been authoritatively chalked up to all the usual cinema-snobbery bullshit:  tone and aesthetics and emotional depth and worldbuilding breadth.  Wrong.  The tonal disparity between the two Buffys has from the outset been greatly overstated.  The gap between Swanson’s Buffy and Gellar’s is, at its heart, generational.

Continue reading

“Superman IV” at 35:  How the “Worst Comic-Book Movie Ever” Epitomizes What We Refuse to Admit about Superhero Fiction

Superman IV:  The Quest for Peace, unanimously reviled for both its unconvincing visuals and cornball story, inadvertently accomplished the theretofore unrealized dream of scores of nefarious supervillains when it was released on this date in 1987:  It killed Superman.  (Or at least put the cinematic franchise into two-decade dormancy.)

But a closer examination of the film suggests its objectively subpar storytelling might in fact be far more faithful to the spirit of the source material than today’s fanboy culture would care to concede.


Thirty-five years ago today, my mother took me to see Superman IV:  The Quest for Peace (1987).  Afterwards, we met up with my father at Doubleday’s, a neighborhood bar and grill that was the last stop on Broadway before you’d officially crossed the city line into Westchester County.  The restaurant had a hot-oil popcorn machine in the far corner, and when I went to refill our basket, I spied a man seated at the bar, nose in a copy of USA Today, the back panel of which boasted a full-page color advertisement for Superman IV.

When he caught me studying the ad, he asked, “Gonna go see the new Superman?”

“I just did.”

“Yeah?  How was it?”

“It was amazing,” I said, and I absolutely meant it.  Sensing my sincerity, the gentleman pulled the ad from the bundle of folded pages and handed it to me as a souvenir.  When I got home, I taped it up on my bedroom wall.

The theatrical one-sheet for “Superman IV” looks like a textbook “Action Comics” cover from the ’80s

Sidney J. Furie’s Superman IV:  The Quest for Peace is not amazing.  It is, in fact, commonly regarded as one of the worst comic-book movies ever made—if not the worst—in eternal competition for last place with Batman & Robin (1997) and Catwoman (2004).  It suffered from a notoriously troubled production:  After the diminishing returns of Superman III (1983) and spin-off Supergirl (1984), series producers Alexander and Ilya Salkind sold their controlling interests in the IP to the Cannon Group, the schlockmeister studio responsible for the American Ninja, Missing in Action, Breakin’, and Death Wish franchises—not exactly the optimal custodians of a series that had started out, against all expectation, so magnificently.

Richard Donner’s Superman:  The Movie (1978) was and remains the finest specimen of superhero cinema ever presented, at once ambitiously epic and emotionally relatable.  It pulls off the impossible in so many ways, first and foremost that it absolutely made us believe a man could fly, which had never been credibly accomplished before.  Credit for that goes not only to the VFX team, which won the Academy Award for its efforts, but to Christopher Reeve, who delivered the movie’s most timeless special effect:  endowing a spandex-clad demigod with profound dignity and genuine vulnerability.  Even the lesser Superman films—and we’ll talk more about those soon enough—are elevated by Reeve’s extraordinary performance, which occupies a lofty position, right alongside Bela Lugosi’s Dracula, in the pantheon of defining interpretations of folkloric icons.

What’s also so remarkable about Superman is how many different tonal aesthetics it assimilates.  The opening sequences on Krypton with Marlon Brando feel downright Kubrickian; Donner somehow channels the cosmic splendor of 2001:  A Space Odyssey (1968), only to then transition us to Smallville, as warm and fertile as Krypton was cold and barren, which evokes the same spirit of sock-hop Americana George Lucas conjured to such success in American Graffiti (1973).

The remainder of the movie shifts fluidly from His Girl Friday–style newsroom comedy (the scenes at the Daily Planet) to urban action thriller à la The French Connection (the seedy streets of 1970s Metropolis) to Roger Moore–era 007 outing (Lex Luthor’s sub–Grand Central lair, complete with comically inept henchmen) to Irwin Allen disaster film (the missile that opens up the San Andreas Fault in the third act and sets off a chain reaction of devastation along the West Coast).

Somehow it coheres into a movie that feels like the best of all worlds rather than a derivative Frankenstein’s monster.  Up until that time, superhero features and television, hampered by juvenile subject matter and typically subpar production values, seemed inherently, inexorably campy.  The notion that a superhero movie could rise to the level of myth, or at least credibly dramatic science fiction, was unthinkable.  Superman is the proof-of-concept paradigm on which our contemporary superhero–industrial complex is predicated.

Continue reading

Book Review:  “Blood, Sweat & Chrome” by Kyle Buchanan

Kyle Buchanan’s Blood, Sweat & Chrome, published by William Morrow in February, chronicles the not-to-be-believed making of George Miller’s Mad Max:  Fury Road (2015) from conception to release through interviews with its cast and crew, and celebrates the inspiring creative imagination of the filmmakers, who defied the odds to create a contemporary classic—a movie as singularly visceral as it is stunningly visual.

But much like the nonstop action in the movie itself, the adulation expressed in the book never pauses to interrogate Miller and company’s moral imagination.  Let’s fix that, shall we?


I abhor nostalgia, particularly for the 1980s and ’90s, but I’ve recently found myself revisiting many of the films and television shows of the latter decade, the period during which I first knew I wanted to be a cinematic storyteller, when earnest star-driven Oscar dramas like Forrest Gump (1994) coexisted with, and even prospered alongside, paradigm-shifting indies à la Pulp Fiction (also ’94).  Those days are gone and never coming back—the institution formerly known as Hollywood is now the superhero–industrial complex—but I’ve wondered if some of those works, so immensely popular and influential then, have stood the test of time.

Yet my informal experiment has been about much more than seeing if some old favorites still hold up (and, by and large, they do); it’s about understanding why they worked in the first place—and what storytelling lessons might be learned from an era in which movies existed for their own sake, as complete narratives unto themselves rather than ephemeral extensions of some billion-dollar, corporately superintended brand.

In an entertainment landscape across which there is so much content, most of it deceptively devoid of coherence or meaning—a transmedia morass I’ve come to call the Multiverse of Madness—the secret to studying narrativity isn’t to watch more but rather less.  To consume fewer movies and TV shows, but to watch them more selectively and mindfully.  Pick a few classics and scrutinize them until you know them backwards and forwards.

In college, I spent an entire semester analyzing Citizen Kane (1941), from reading multiple drafts of its screenplay to watching it all the way through with the volume turned down just to appreciate its unconventional cinematography.  That’s how you learn how stories work:  Study one or two movies/novels per year… but study the shit out of them.  Watch less, but do it far more attentively.

Tom Hardy as Max Rockatansky in “Mad Max: Fury Road,” the subject of “Blood, Sweat & Chrome”

That is, admittedly, a counterintuitive mindset in our Digital Age of automatic and accelerating behaviors, whereby post-credit scenes preemptively gin up anticipation for the next movie (often through homework assignments) before we’ve had a chance to digest the current one, and the autoplay feature of most streaming services encourages and enables mindless TV binge-watching.

But the quarantine, unwelcome though it may have been, did offer a pause button of sorts, and we are only now beginning to see some of the ways in which folks exploited the rare opportunity to slow down, to go deep, that it offered.  One such project to emerge from that period of thoughtful reflection is entertainment journalist Kyle Buchanan’s recently published nonfiction book Blood, Sweat & Chrome:  The Wild and True Story of “Mad Max:  Fury Road”:

In April 2020, as the pandemic swept the planet and the movie-release calendar fell apart, I began writing an oral history of Mad Max:  Fury Road for the New York Times.  Without any new titles to cover, why not dive deeply into a modern classic on the verge of its fifth anniversary?

Every rewatch over those five years had confirmed to me that Fury Road is one of the all-time cinematic greats, an action movie with so much going on thematically that there’d be no shortage of things to talk about.  I had also heard incredible rumors about the film’s wild making, the sort of stories that you can only tell on the record once the dust has long settled.

Kyle Buchanan, Blood, Sweat & Chrome:  The Wild and True Story of “Mad Max:  Fury Road” (New York:  William Morrow, 2022), 337

A movie two decades in the making, Fury Road is the belated follow-up to writer/director George Miller’s dystopian action-film trilogy Mad Max (1979, 1981, 1985), which starred a then-unknown Mel Gibson as a wanderer in the wasteland—the Road Warrior.  The project began its long journey to the screen as a proposed television series in 1995, when Miller won back the rights to the franchise from Warner Bros. as part of a settlement from a breach-of-contract suit he’d filed over having been fired from Contact (1997).

Miller was eventually inspired to do another feature instead—“What if there was a Mad Max movie that was one long chase,” he pondered, “and the MacGuffin was human?” (ibid., 31)—and the ensuing production was plagued with one near-terminal roadblock after another.  The behind-the-scenes story told in Blood, Sweat & Chrome is as thrilling, in its own way, as that of Mad Max:  Fury Road itself.

Continue reading

“Young Indiana Jones” Turns 30:  Storytelling Lessons from George Lucas’ Other Prequel Series

A television series based on an immensely popular action-movie franchise shouldn’t have been a creative or commercial risk—quite the opposite.  But with The Young Indiana Jones Chronicles, which premiered on March 4, 1992, filmmaker George Lucas had no intention of producing a small-screen version of his big-screen blockbusters.  Here’s how Lucas provided a richly imaginative model for what a prequel can and should be—and why it would never be done that way again.


Though he more or less invented the contemporary blockbuster, George Lucas had intended—even yearned—to be an avant-garde filmmaker:

Lucas and his contemporaries came of age in the 1960s vowing to explode the complacency of the old Hollywood by abandoning traditional formulas for a new kind of filmmaking based on handheld cinematography and radically expressive use of graphics, animation, and sound.  But Lucas veered into commercial moviemaking, turning himself into the most financially successful director in history by marketing the ultimate popcorn fodder.

Steve Silberman, “Life After Darth,” Wired, May 1, 2005

After dropping the curtain on his two career- and era-defining action trilogies (Star Wars concluded in 1983, then Indiana Jones in ’89), then failing to launch a new franchise with Willow (his 1988 sword-and-sorcery fantasy fizzled at the box office, though even that would-be IP is getting a “legacy” successor later this year courtesy of the nostalgia–industrial complex), Lucas did in fact indulge his more experimental creative proclivities—through the unlikeliest of projects:  a pair of prequels to both Indiana Jones and Star Wars.  And while both arguably got made on the strength of the brands alone, the prequels themselves would, for better and worse, defy the sacrosanct conventions of blockbuster cinema—as well as the codified narrative patterns of Joseph Campbell’s “heroic journey”—that audiences had come to expect from Lucas.

A perfunctory scene in Return of the Jedi, in which Obi-Wan finally explains Darth Vader’s mysterious backstory to Luke (a piece of business that could’ve been easily handled in the first film, thereby sparing the hero needless risk and disillusionment in The Empire Strikes Back, but whatever), served as the narrative foundation for Lucas’ Star Wars prequel trilogy (1999–2005), in which a precocious tyke (The Phantom Menace) matures into a sullen teenager (Attack of the Clones) before warping into a murderous tyrant (Revenge of the Sith).  Underpinning Anakin’s emo-fueled transformation to the dark side is a byzantine plotline about Palpatine’s Machiavellian takeover of the Republic.  Meanwhile, references to the original trilogy, from crucial plot points to fleeting sight gags, abound.

You’ve all seen the movies, so I’ll say no more other than to suggest the story arc—which is exactly what Obi-Wan summarized in Return of the Jedi, only (much) longer, appreciably harder to follow, and a tonally incongruous mix of gee-whiz dorkiness and somber political intrigue—is precisely the kind of creative approach to franchise filmmaking that would’ve been summarily nixed in any Hollywood pitch meeting, had Lucas been beholden to the corporate precepts of the studio system from which the colossal success of the original Star Wars afforded him his independence.

George Lucas on the set of the “Star Wars” prequels

Which is not to say Lucas’ artistic instincts were infallible.  Financially successful though the prequels were, audiences never really embraced his vision of an even longer time ago in a galaxy far, far away:  Gungans and midi-chlorians and trade disputes didn’t exactly inspire the wide-eyed amazement that Wookiees and lightsabers and the Death Star had.

Maybe by that point Star Wars was the wrong franchise with which to experiment creatively?  Perhaps it had become too culturally important, and audience expectations for new entries in the long-dormant saga were just too high?  In the intervening years, Star Wars had ceased to be the proprietary daydreams of its idiosyncratic creator; culturally if not legally, Star Wars kinda belonged to all of us on some level.  By explicitly starting the saga with Episode IV in 1977, he’d invited each of us to fill in the blanks; the backstory was arguably better off imagined than reified.

As an IP, however, Indiana Jones, popular as it was, carried far less expectation, as did the second-class medium of network television, which made Lucas’ intended brand extension more of an ancillary product in the franchise than a must-see cinematic event—more supplemental than it was compulsory, like a tie-in novel, or the Ewok telefilms of the mid-eighties.  The stakes of the project he envisioned were simply much lower, the spotlight on it comfortably dimmer.  In the event of its creative and/or commercial failure, Young Indiana Jones would be a franchise footnote in the inconsequential vein of the Star Wars Holiday Special, not an ill-conceived vanity project responsible for retroactively ruining the childhoods of millions of developmentally arrested Gen Xers.  Here Lucas expounds on the genesis of the series:

Continue reading

“Scream” at 25: Storytelling Lessons from Wes Craven’s Slasher Classic

In honor of the twenty-fifth anniversary of Wes Craven’s Scream, released on this date in 1996, here’s how the movie revived a genre, previewed a defining characteristic of Generation X, dramatized the psychological toll of trauma with uncommon emotional honesty—and how it even offers a roadmap out of the prevailing narrative of our time:  extractive capitalism.


For all the decades we’ve been together, my wife and I have observed a particular protocol, probably owed to how many movies we used to see at the two-dollar cinema in Hell’s Kitchen when we were dirt-poor college students:  Upon exiting the theater, neither issues a comment on or reaction to the film we just saw.  Instead, we save the discussion for when we’re seated at a nearby restaurant, at which point one or the other invariably asks, “Do you want to go first?”  As far as I can recall, we’ve broken with that tradition but once.

“We just saw a classic,” she blurted as we staggered our way through the lobby moments after seeing Scream.  “They’ll still be talking about that in twenty years.”  (Such an estimate, in fairness, seemed like a glacially long time when you’re only as many years old.)

In fact, a full quarter century has now passed since the release of the late Wes Craven’s postmodern slasher masterpiece, and the movie has very much earned a fixed place in the cultural consciousness.  That opening sequence alone, so shocking at the time, hasn’t lost any of its power to frighten and disturb; an entire semester could be spent studying it, from the exquisite camerawork to the dramatic pacing to Drew Barrymore’s heart-wrenchingly credible performance as a young woman scared shitless—and this despite her having no one in the scene to act against save a voice on a phone.  Ten minutes into the movie, its marquee star is savagely disemboweled… and now you don’t know what the hell to expect next!

Drew Barrymore as Casey Becker in “Scream”

I really can’t say I’ve seen a horror film since that was at once so scary, clever, entertaining, influential, and of its moment the way Scream was.  With eerie prescience, Craven and screenwriter Kevin Williamson (born 1965) seemed to put their finger on an idiosyncratic attribute of Generation X that would, as Xers settled into adulthood and eventually middle age, come to define the entirety of the pop-cultural landscape over which we currently preside:  that rather than using fiction to reflect and better understand reality—viewing narrativity as “a coherent design that asks questions and provides opinions about how life should be lived,” per Christopher Vogler—we more or less gave up on understanding reality in favor of mastering the expansive, intricate storyworlds of Star Wars and Star Trek, DC and Marvel, Westworld and Game of Thrones.  And such figure-ground reversal started long before the Marvel–industrial complex capitalized on it.

In the early ’90s, as the first members of Gen X were becoming filmmakers, avant-garde auteurs like Quentin Tarantino (born 1963) and Kevin Smith (1970) not only devoted pages upon pages in their screenplays to amusingly philosophical conversations about contemporary pop culture; the characters across their various movies also existed in their own respective shared universes, referencing other characters and events from prior and sometimes even yet-to-be-produced films.  That kind of immersive cinematic crosspollination, inspired by the comic books Tarantino and Smith had read as kids, rewarded fans for following the directors’ entire oeuvres and mindfully noting all the trivial details—what later came to be known as “Easter eggs.”

What’s more, the trove of pop-cultural references embedded in their movies paid off years of devoted enrollment at Blockbuster Video.  Whereas previously, fictional characters seemed to exist in a reality devoid of any pop entertainment of their own—hence the reason, for instance, characters in zombie movies were always on such a steep learning curve—now they openly debated the politics of Star Wars (Clerks); they analyzed the subtext of Madonna lyrics (Reservoir Dogs); they waxed existential about Superman’s choice of alter ego (Kill Bill:  Volume 2); they even, when all was lost, sought the sagacious counsel of that wisest of twentieth-century gurus:  Marvel Comics’ Stan Lee (Mallrats).

For Gen X, our movies and TV shows and comics and videogames are more than merely common formative touchstones, the way, say, the Westerns of film (Rio Bravo, The Magnificent Seven) and television (Bonanza, Gunsmoke) had been for the boomers.  No, our pop culture became a language unto itself:  “May the Force be with you.”  “Money never sleeps.”  “Wax on, wax off.”  “Wolfman’s got nards!”  “I’m your density.”  “Be excellent to each other.”  “Do you still want his daytime number?”  “Just when you thought it was safe to go back in the water…”

Those are more than quotable slogans; they’re cultural shorthands.  They express a worldview that can only be known and appreciated by those of us encyclopedically literate in Reagan-era ephemera, like the stunted-adolescent slackers from Clerks and the nostalgic gamer-geeks of Ready Player One and, of course, the last-wave Xers in Scream:

Kevin Williamson, “Scream” (undated screenplay draft), 89

The characters from Scream had grown up watching—arguably even studying—Halloween and Friday the 13th and A Nightmare on Elm Street on home video and cable TV, so they had an advantage the teenage cannon fodder from their favorite horror movies did not:  They were savvy to the rules of the genre.  Don’t have sex.  Don’t drink or do drugs.  Never say “I’ll be right back.”

There was a demonstrably prescriptive formula for surviving a slasher movie—all you had to do was codify and observe it.  That single narrative innovation, the conceptual backbone of Scream, was revelatory:  Suddenly everything old was new again!  A creatively exhausted subgenre, long since moldered by its sequel-driven descent into high camp, could once again be truly terrifying.

Continue reading

There He Was… and in He Walked: Lessons on Mythic Storytelling from the Mariachi Trilogy

In belated observation of Día de los Muertos, here’s an appreciation for the idiosyncratic storytelling of Robert Rodriguez’s Mariachi trilogy, a neo-Western action series that emerged from the indie-cinema scene of the 1990s and can only be deemed, by current Hollywood standards, an anti-franchise.  The movies and the manner in which they were made have a lot to teach us about what it means to be creative—and how to best practice creativity.


Before the shared cinematic universe became the holy grail of Hollywood, the coup d’éclat for any aspiring franchise—and we can probably credit Star Wars for this—was the trilogy.

In contrast with serialized IPs (James Bond and Jason Voorhees, for instance), the trilogy came to be viewed, rightly or wrongly, as something “complete”—a story arc with a tidy three-act design—and, accordingly, many filmmakers have leaned into this assumption, exaggerating a given series’ creative development post factum with their All part of the grand plan! assurances.

This peculiar compulsion we’ve cultivated in recent decades—storytellers and audiences alike—to reverse-engineer a “unified whole” from a series of related narratives, each of which developed independently and organically, is antithetical to how creativity works, and even to what storytelling is about.

Nowhere is the fluidity of the creative process on greater, more glorious display than in the experimental trilogy—that is, when a low-budget indie attains such commercial success that it begets a studio-financed remake that simultaneously functions as a de facto sequel, only to then be followed by a creatively emboldened third film that completely breaks from the established formula in favor of presenting an ambitiously gonzo epic.  Trilogies in this mode—and, alas, it’s a pretty exclusive club—include Sam Raimi’s Evil Dead, George Miller’s Mad Max, and Robert Rodriguez’s El Mariachi.

Robert Rodriguez at the world premiere of “Alita: Battle Angel” on January 31, 2019 in London (Eamonn M. McCormack/Getty)

A film student at the University of Texas at Austin in the early nineties, Rodriguez self-financed El Mariachi with a few thousand dollars he’d earned as a medical lab rat; the project wasn’t meant to be much more than a modest trial run at directing a feature film that he’d hoped to perhaps sell to the then-burgeoning Spanish-language home-video market.  He reasoned that practical experience would be the best teacher, and if he could sell El Mariachi, it would give him the confidence and funds to produce yet more projects—increasingly ambitious and polished efforts—that would allow him to make a living doing what he loved.  He had no aspirations of power lunches at The Ivy or red-carpet premieres at Mann’s Chinese Theatre; he wanted only to pursue the art of cinematic storytelling—not necessarily Hollywood filmmaking, a different beast—to the fullest extent possible.

If you want to be a filmmaker and you can’t afford film school, know that you don’t really learn anything in film school anyway.  They can never teach you how to tell a story.  You don’t want to learn that from them anyway, or all you’ll do is tell stories like everyone else.  You learn to tell stories by telling stories.  And you want to discover your own way of doing things.

In school they also don’t teach you how to make a movie when you have no money and no crew.  They teach you how to make a big movie with a big crew so that when you graduate you can go to Hollywood and get a job pulling cables on someone else’s movie.

Robert Rodriguez, Rebel without a Crew, or, How a 23-Year-Old Filmmaker with $7,000 Became a Hollywood Player (New York:  Plume, 1996), xiii–xiv

They don’t teach a lot of things about Hollywood in film school, like how so many of the industry’s power brokers—from producers and studio execs to agents and managers—are altogether unqualified for their jobs.  These folks think they understand cinematic storytelling because they’ve watched movies their entire lives, but they’ve never seriously tried their hand at screenwriting or filmmaking.  Accordingly, the town’s power structure is designed to keep its screenwriters and filmmakers subordinate, to make sure the storytellers understand they take their creative marching orders from people who are themselves utterly mystified by the craft (not that they’d ever admit to that).

It’s the only field I know of in which the qualified authorities are entirely subservient to desk-jockey dilettanti, but I suppose that’s what happens when a subjective art form underpins a multibillion-dollar industry.  Regardless, that upside-down hierarchy comes from a place of deep insecurity on both ends of the totem pole, and is in no way conducive to creativity, hence the premium on tried-and-true brands over original stories, on blockbusters over groundbreakers.  As I discovered the hard way—more on that in a minute—Hollywood is arguably the last place any ambitiously imaginative storyteller ought to aspire to be.  Rodriguez seemed to understand that long before he ever set foot in L.A.:

Continue reading

Grounded and Elevated: Screenwriting Secrets for a Sure-Thing Hollywood Pitch

Despite everything, it seems I still have a few friends in Tinseltown.  A development exec I know, aware of my blog’s polemical crusade against late-twentieth-century nostalgia as well as creatively and morally bankrupt storytelling, recently forwarded several e-mails containing informal pitches (from agented writers) he’d solicited for “reboots” of three classic IPs straight from the Gen X archives.  They offer fascinating firsthand insight into the demoralizing vocation of Hollywood screenwriting.

It might surprise those outside the industry to learn only a small fraction of a given screenwriter’s time and effort is spent developing original stories, known as “spec scripts.”  Few of those screenplays ever sell (certainly nowadays), and fewer still are produced; mostly, such projects are mere “calling cards”—writing samples designed to establish a scribe’s commercial sensibilities and creative credentials so he or she might be given the opportunity to vie for “open assignments.”  In those instances, a prodco controls the film rights to an intellectual property (IP)—a novel, a comic book, an old TV series—and, accordingly, invites such candidates to come in and pitch a take on it.

For instance, my prison break–zombie outbreak mashup Escape from Rikers Island afforded me opportunities to pitch cinematic adaptations of the pseudo-documentary series Ancient Aliens and the Japanese manga MPD-Psycho, as well as a remake of the 1992 action thriller Trespass.  If you’re higher up on the food chain—in, say, J. J. Abrams territory—that’s when you might get a shot at a gold-plated franchise like Star Wars or Mission:  Impossible.

Nicolas Cage as an anxiety-riddled screenwriter struggling to adapt “The Orchid Thief” in “Adaptation”

Because for the most part, Hollywood isn’t looking for new ideas; they have enough branded IPs to keep them in business through infinity and beyond.  What they’re looking for are skilled stenographers—writers-for-hire who can take a preexisting property and, juggling input from a thousand different chefs in the kitchen, turn it into a viable script for which a movie studio will be persuaded to invest millions of dollars.  That’s the litmus test:  Can you take an established IP and from it write a script that will motivate the studio to write a check?

The following proposals provide an insider’s glimpse into that singular development process.  With my contact’s express permission, I have reproduced the relevant text from his e-mails verbatim, including all typographical errors and syntactical idiosyncrasies, but excluding the identities of the authors, their representation, the executive, and his production company.

Continue reading
