Sean P Carlin

Writer of things that go bump in the night

No, Virginia, “Die Hard” Is Not a Christmas Movie

Ah, it’s that magical time of year!  When the Hudson hawk nips at the nose, and the skyline over the New Jersey Palisades bruises by midafternoon.  When chimney smoke from private houses spices the air, and strings of colored lights adorn windows and fire escapes.  And, of course, when the Internet engages in its annual bullshit debate as to whether perennial holiday favorite Die Hard, currently celebrating its thirty-fifth anniversary, is a Christmas movie.  And since “bullshit debates” are my brand…


In fourth grade, I scored what was, by 1980s standards, the holy grail:  a best friend with HBO.  Over the following five years, I slept over at his house every weekend, where we watched R-rated action movies into the night.  Whatever HBO was showing that week, we delighted in it, no matter how idiotic (Action Jackson) or forgettable (Running Scared).  For a pair of preadolescent boys, that Saturday-night cinematic grab bag abounded with illicit wonders.

Much as we enjoyed those movies, though, they were for the most part—this isn’t a criticism—ephemeral crap.  We howled at their profane jokes and thrilled to their improbable set pieces, but seldom if ever revisited any of them (Beverly Hills Cop [1984] and its sequel [1987] being a rare exception), and certainly none inspired us to playact their scenarios as we had with PG-rated adventures Ghostbusters (1984) and Back to the Future (1985).  They entertained us, sure, but didn’t exactly impress upon our imaginations in any lasting or meaningful way…

That is, not until an action thriller with the snarky guy from Moonlighting (1985–1989) and Blind Date (1987) came along.  I still remember seeing Die Hard (1988) for the first time, on a thirteen-inch television with side-mounted mono speaker at my friend’s Bronx apartment.  As a viewing experience, it was about as low-def as they come, but that didn’t diminish the white-knuckled hold the movie had on us; we watched it in astonished silence from beginning to end.  From that point on—and this was the year no less than Tim Burton’s Batman had seized the zeitgeist, and our longstanding favorites Ghostbusters and Back to the Future got their first sequels—Die Hard was almost all we could talk about.

At the time, Manhattan College was in the process of erecting a twelve-story student residence overlooking Van Cortlandt Park, and we would gather with our JHS pals at the construction site on weekends, running around the unfinished edifice with automatic squirt guns, playing out the movie’s gleefully violent plot.  Hell, at one point or another, every multistory building in the neighborhood with a labyrinthine basement and rooftop access became Nakatomi Plaza, the setting of a life-and-death battle staged and waged by a group of schoolboys, our imaginations captive to the elemental premise of Die Hard.

We obsessed over that fucking movie so exhaustively, we passed around this still-in-my-possession copy of the pulp-trash novel it was based on—Roderick Thorp’s Nothing Lasts Forever (1979)—until every one of us had had a chance to read it:

The now-battered copy of “Nothing Lasts Forever” I bought in 1989 at the long-gone Bronx bookstore Paperbacks Plus

The thirteen-year-old boys of the late ’80s were far from the only demographic taken with Die Hard.  The movie proved so hugely popular, it not only spawned an immediate sequel in 1990 (which we were first in line to see at an appallingly seedy theater on Valentine Avenue), but became its own subgenre throughout the rest of that decade.  Hollywood gave us Die Hard on a battleship (Under Siege), Die Hard on a plane (Passenger 57), Die Hard on a train (Under Siege 2:  Dark Territory), Die Hard on a mountain (Cliffhanger), Die Hard on a bus (Speed), Die Hard on a cruise ship (Speed 2:  Cruise Control), Die Hard in a hockey arena (Sudden Death), Die Hard on Rodeo Drive (The Taking of Beverly Hills), Die Hard at prep school (Toy Soldiers)…

Christ, things got so out of control, even Beverly Hills Cop, an established action franchise predating Die Hard, abandoned its own winning formula for the third outing (scripted by Steven E. de Souza, co-screenwriter of the first two Die Hards) in favor of a half-assed “Die Hard in an amusement park” scenario.  This actually happened:

Eddie Murphy returns as Axel Foley—sort of—in “Beverly Hills Cop III” (1994)

None of those films has had the staying power of the original Die Hard.  Mostly that’s owed to Die Hard being a superior specimen of filmmaking.  Director John McTiernan demonstrates uncommonly disciplined visual panache:  He expertly keeps the viewer spatially oriented in the movie’s confined setting, employing swish pans and sharp tilts to establish the positions of characters within a given scene, as well as imbue the cat-and-mouse of it all with breathless tension.

McTiernan consistently sends his hero scuttling to different locations within the building—stairwells, pump rooms, elevator shafts, air ducts, the rooftop helipad—evoking a rat-in-a-cage energy that leaves the viewer feeling trapped though never claustrophobic.  The narrative antithesis of the globetrotting exploits of Indiana Jones and James Bond, Die Hard is a locked-room thriller made with an ’80s action-movie sensibility.  It was and remains a masterclass in suspense storytelling—often imitated, as the old saying goes, never duplicated.

Perhaps another key reason for the movie’s durability, its sustained cultural relevance, is owed to its (conditional) status as a celebrated Christmas classic.  Like It’s a Wonderful Life (1946) and National Lampoon’s Christmas Vacation (1989) and Love Actually (2003), Die Hard is a feel-good film—albeit with a considerably higher body count—one is almost compelled to watch each December.  Yet whereas nobody questions any of the aforementioned movies’ culturally enshrined place in the holiday-movie canon—nor that of cartoonishly violent Home Alone (1990)—Die Hard’s eligibility seems perennially under review.

Why does the debate around Die Hard die hard… and is it, in fact, a Christmas movie?

Continue reading

“The Dogcatcher” Unleashed:  The Story behind My Debut Novel

My first novel, The Dogcatcher, is now available from DarkWinter Press.  It’s an occult horror/dark comedy about a municipal animal-control officer whose Upstate New York community is being terrorized by a creature in the woods.  Here’s a (spoiler-free) behind-the-scenes account of the project’s creative inception and development; how it’s responsible for my being blackballed in Hollywood; how the coronavirus pandemic challenged and ultimately elevated the story’s thematic ambitions; and how these characters hounded my imagination—forgive the pun—for no fewer than fourteen years.

The Dogcatcher is on sale in paperback and Kindle formats via Amazon.


In the spring of 2007, I came home from L.A. for a week to attend my sister’s graduation at Cornell University.  It was my first occasion to sojourn in the Finger Lakes region, and I took the opportunity to stay in Downtown Ithaca, tour the Cornell campus, and visit Buttermilk Falls State Park.  I was completely taken with the area’s scenic beauty and thought it would make the perfect location for a screenplay.  Only trouble was, all I had was a setting in search of a story.

CUT TO:  TWO YEARS LATER

Binge-watching wasn’t yet an institutionalized practice, but DVD-by-mail was surging, and my wife and I were, as such, working our way through The X-Files (1993–2002) from the beginning.  Though I have ethical reservations about Chris Carter’s hugely popular sci-fi series, I admired the creative fecundity of its monster-of-the-week procedural format, which allowed for the protagonists, his-and-her FBI agents Mulder and Scully, to investigate purported attacks by mutants and shapeshifters in every corner of the United States, from bustling cities to backwater burgs:  the Jersey Devil in Atlantic City (“The Jersey Devil”); a wolf-creature in Browning, Montana (“Shapes”); a prehistoric plesiosaur in Millikan, Georgia (“Quagmire”); El Chupacabra in Fresno, California (“El Mundo Gira”); the Mothman in Leon County, Florida (“Detour”); a giant praying mantis in Oak Brook, Illinois (“Folie à Deux”); a human bat in Burley, Idaho (“Patience”).

Special Agents Fox Mulder (David Duchovny) and Dana Scully (Gillian Anderson) in “The X-Files”

But the very premise of The X-Files stipulated that merely two underfunded federal agents, out of approximately 35,000 at the Bureau, were assigned to investigate such anomalous urban legends.  I wondered:  If an average American town found itself bedeviled by a predatory cryptid—in real life, I mean—would the FBI really be the first responders?  Doubtful.  But who would?  The county police?  The National Guard?  If, say, a sasquatch went on a rampage, which regional public office would be the most well-equipped to deal with it…?

That’s when it occurred to me:  Animal Control.

And when I considered all the cultural associations we have with the word dogcatcher—“You couldn’t get elected dogcatcher in this town”—I knew I had my hero:  a civil servant who is the butt of everyone’s easy jokes, but whose specialized skills and tools and, ultimately, compassion are what save the day.

But it was, to be sure, a hell of a long road from that moment of inspiration to this:

When the basic concept was first devised, I wrote a 20-page story treatment for an early iteration of The Dogcatcher, dated August 25, 2009.  That same summer, I signed with new literary managers, who immediately wanted a summary of all the projects I’d been working on.  Among other synopses and screenplays, I sent them the Dogcatcher treatment.

They hated it.  They argued against the viability of mixing horror and humor, this despite a long precedent for such an incongruous tonal marriage in commercially successful and culturally influential movies the likes of An American Werewolf in London (1981), Ghostbusters (1984), Gremlins (1984), The Lost Boys (1987), Tremors (1990), Scream (1996), and Shaun of the Dead (2004), to say nothing of then–It Girl Megan Fox’s just-released succubus satire Jennifer’s Body (2009).  (I knew better than to cite seventy-year-old antecedents such as The Cat and the Canary and Hold That Ghost; Hollywood execs have little awareness of films that predate their own lifetimes.)  I was passionate about The Dogcatcher, but it was only one of several prospective projects I was ready to develop, so, on the advice of my new management, I put it in a drawer and moved on to other things.

Continue reading

Highway to Hell:  Car Culture and Hollywood’s Hero-Worship of the Automobile

With road-trip season upon us once again, here’s an examination of how American car culture has been romanticized by the entertainment industry; how automobiles, far from enablers of freedom and individuality, are in fact “turbo-boosted engines of inequality”; and how Hollywood can help remedy an ecocultural crisis it’s played no small role in propagating.


In any given episode, the action reliably starts the same way:  a wide shot of the Batcave, Batmobile turning on its rotating platform to face the cavemouth, camera panning left as the Dynamic Duo descend the Batpoles.  Satin capes billowing, Batman and Robin hop into their modified 1955 Lincoln Futura, buckle up—decades before it was legally required, incidentally—and the engine whines to life as they run through their pre-launch checklist:

ROBIN:  Atomic batteries to power.  Turbines to speed.

BATMAN:  Roger.  Ready to move out.

A blast of flame from the car’s rear thruster—whoosh!—and off they’d race to save the day.

By the time the 1980s had rolled around, when I was first watching Batman (1966–1968) in syndicated reruns, every TV and movie hero worth his salt got around the city in a conspicuously slick set of wheels.  Muscle cars proved popular with working-class ’70s sleuths Jim Rockford (Pontiac Firebird) and Starsky and Hutch (Ford Gran Torino).  The neon-chic aesthetic of the Reagan era, however, called for something a bit sportier, like the Ferrari, the prestige ride of choice for Honolulu-based gumshoe Thomas Magnum (Magnum, P.I.) and buddy cops Crockett and Tubbs (Miami Vice).  The ’80s were nothing if not ostentatiously aspirational.

Even when cars were patently comical, they came off as cool despite themselves:  the Bluesmobile, the 1974 Dodge Monaco used in The Blues Brothers (1980); the Ectomobile, the 1959 Cadillac Miller-Meteor Sentinel in Ghostbusters (1984); the Wolfmobile, a refurbished bread truck that Michael J. Fox and his pal use for “urban surfing” in Teen Wolf (1985).

The DMC DeLorean time machine from Back to the Future is clearly meant to be absurd, designed in the same kitchen-sink spirit as the Wagon Queen Family Truckster from National Lampoon’s Vacation (1983), but what nine-year-old boy in 1985 didn’t want to be Michael J. Fox, sliding across the stainless-steel hood and yanking the gull-wing door shut behind him?  And like the characters themselves, the DeLorean evolved with each movie, going from nuclear-powered sports car (Part I) to cold-fusion flyer (Part II) to steampunk-retrofitted railcar (Part III).  “Maverick” Mitchell’s need for speed didn’t hold a candle to Marty McFly’s, whose very existence depended on the DeLorean’s capacity to reach 88 miles per hour.

Vehicles that carried teams of heroes offered their own vicarious pleasure.  Case in point:  the 1983 GMC Vandura, with its red stripe and rooftop spoiler, that served as the A-Team’s transpo and unofficial HQ—a place where they could bicker comically one minute then emerge through the sunroof the next to spray indiscriminate gunfire from their AK-47s.  The van even had a little “sibling”:  the Chevrolet Corvette (C4) that Faceman would occasionally drive, marked with the same diagonal stripe.  Did it make sense for wanted fugitives to cruise L.A. in such a distinct set of wheels?  Not really.  But it was cool as hell, so.

The Mystery Machine was the only recurring location, as it were, on Scooby-Doo, Where Are You! (1969), and the van’s groovy paint scheme provided contrast with the series’ gloomy visuals.  Speaking of animated adventures, when once-ascetic Vietnam vet John Rambo made the counterintuitive leap from R-rated action movies to after-school cartoon series (1986), he was furnished with Defender, a 6×6 assault jeep.  Not to be outdone, the most popular military-themed animated franchise of the ’80s, G.I. Joe:  A Real American Hero (1983–1986), featured over 250 discrete vehicles, and the characters that drove them were, for the most part, an afterthought:

With the debut of the 3 ¾” figures in 1982, Hasbro also offered a range of vehicles and playsets for use with them.  In actual fact, the 3 ¾” line was conceived as a way to primarily sell vehicles—the figures were only there to fill them out!

‘3 ¾” Vehicles,’ YoJoe!

But who needs drivers when the vehicles themselves are the characters?  The protagonists of The Transformers (1984–1987) were known as the Autobots, a race of ancient, sentient robots from a distant planet that conveniently shapeshifted into 1980s-specific cars like the Porsche 924 and Lamborghini Countach, among scores of others.  (The premise was so deliriously toyetic, it never occurred to us to question the logic of it.)  Offering the best of both G.I. Joe and The Transformers, the paramilitary task force of M.A.S.K. (1985–1986), whose base of operations was a mountainside gas station (what might be described as Blofeld’s volcano lair meets the Boar’s Nest), drove armored vehicles that transformed into… entirely different vehicles.

Many movies and shows not only featured cars as prominent narrative elements, but literally took place on the road:  Vacation.  Mad Max (1979).  Smokey and the Bandit (1977).  CHiPs (1977–1983).  Sometimes the car was so important it had a proper name:  General Lee from The Dukes of Hazzard (1979–1985).  Christ, sometimes it was the goddamn series costar:  KITT on Knight Rider (1982–1986).  Shit on David Hasselhoff’s acting ability all you want, but the man carried a hit TV show delivering the lion’s share of his dialogue to a dashboard.  Get fucked, Olivier.

1980s hero-car culture at a glance

As a rule, productions keep multiple replicas of key picture cars on hand, often for different purposes:  the vehicle utilized for dialogue scenes isn’t the one rigged for stunts, for instance.  It’s notable that the most detailed production model—the one featured in medium shots and closeups, in which the actors perform their scenes—is known as the “hero car.”  And why not?  Over the past half century, Hollywood has unquestionably programmed all of us to recognize the heroism of the automobile.

Continue reading

Into Each Generation a Slayer Is Born:  How the “Buffy” Franchise Demonstrates the Differences between Gen X and Millennials

A cultural blip, disowned and dismissed.  A cultural phenomenon, nurtured and celebrated.  Is there any doubt Kristy Swanson’s Buffy the Vampire Slayer is an Xer, and Sarah Michelle Gellar’s a Millennial?


Joss Whedon famously dislikes the movie made from his original screenplay for Buffy the Vampire Slayer (1992), directed by Fran Rubel Kuzui and starring Kristy Swanson.  Seems he’d envisioned a B-movie with a Shakespearean soul, whereas Kuzui saw pure juvenile camp—an empowerment tale for prepubescent girls.

Buffy arrived right before it became cool for teenagers to brood about real things like depression and the cost of Doc Martens.  But something about this particular movie was bewitching to a tweeny bopper with an alternative undertow.  It had gloss and edge—but more gloss than edge.  This was a pre-Clueless, Skittles-tinted ode to California ditz. . . .  The result was an unfussy pre–Spice Girls girl-power fantasy for a 12-year-old kid.

Soraya Roberts, “I’ll Always Love the Original Buffy the Vampire Slayer,” Atlantic, July 31, 2022

Only a modest success during its theatrical run, the cult horror/comedy found an appreciable audience on VHS.  Three years later, nascent netlet The WB saw an opportunity to bring the inspired concept of Valley girl–turned–vampire slayer to television—only this time under the auspices of the IP’s disgruntled creator:

Building on his original premise, he re-imagined the monsters as metaphors for the horrors of adolescence.  In one climactic scene, Buffy loses her virginity to a vampire who has been cursed with a soul; the next morning, his soul is gone and he’s lusting for blood.  Any young woman who had gone to bed with a seemingly nice guy only to wake up with an asshole could relate. . . .

In those early days of the internet, before nerd culture swallowed the world, fans flocked to a message board set up by the WB to analyze Buffy with the obsessive zeal of Talmudic scholars.  Whedon knew how to talk to these people—he was one of them.  He would visit the board at all hours to complain about his grueling schedule or to argue with fans about their interpretations of his work.  Back then, as he pointed out to me, the internet was “a friendly place,” and he, the quick-witted prince of nerds, “had the advantage of it.”

Lila Shapiro, “The Undoing of Joss Whedon,” Vulture, January 17, 2022

It is impossible to fully appreciate the monopolistic stranglehold geek interests have maintained on our culture over the first two decades of this millennium without acknowledging the pivotal role Buffy the Vampire Slayer (1997–2003) played in elevating such pulp ephemera to a place of mainstream legitimacy and critical respectability.  It was the right premise (Whedon pitched it as My So-Called Life meets The X-Files) on the right network (one willing to try new ideas and exercise patience as they found an audience) by the right creator (a card-carrying, self-professed geek) speaking to the right audience (impressionable Millennials) at the right time (the dawn of the Digital Age).  It all synthesized at exactly that moment.  Forget Booger—Buffy was our culture’s revenge of the nerds.

Sarah Michelle Gellar and Joss Whedon on the set of “Buffy the Vampire Slayer”

In what was surely a first for any geek or screenwriter, let alone a combo platter, a cult of hero worship coalesced around Whedon.  His genius was celebrated on message boards and at academic conferences, inked in books and on body parts.  “He was a celebrity showrunner before anyone cared who ran shows” (ibid.).

Master storyteller that he is, Whedon didn’t merely reset the narrative of Buffy; he reframed the narrative about it.  While serving as a loose sequel to the feature film, the television series wasn’t Buffy the Vampire Slayer 2 so much as Buffy the Vampire Slayer 2.0—a complete overhaul and upgrade.  This was Buffy as it was always intended to be, before Hollywood fucked up a great thing.  That the startup-network show emerged as a phoenix from the ashes of a major-studio feature only burnished Whedon’s geek-underdog credentials.  To utter the word “Buffy” was to be speaking unambiguously about the series, not the movie.

What movie?

In 1997, Whedon premiered his Buffy series on The WB and essentially wiped the film from the collective memory.

By that point, I had turned 17, and even though the show was more serious than the movie, even though its universe was cleverer and more cohesive, even though the silent episode “Hush” was probably one of the best things on television at the time it aired, Buffy was still a vampire show—to me, it was just kids’ play.  My adolescence adhered to a kind of Gen-X aimlessness, to indie films with lots of character and very little plot.  Whedon’s show seemed more like the kind of thing Reality Bites would make fun of—a juvenile, overly earnest studio product.

Roberts, “I’ll Always Love the Original Buffy the Vampire Slayer

As a member of Ms. Roberts’ demographic cohort, four years her senior, I’ll second that appraisal.  Yet for the Millennials who came of age in a post-Whedon world, and who were introduced to Buffy through the series—who fell in love with her on TV—Whedon’s creative contextualization of the movie became the universally accepted, unchallenged, and perennially reinforced perception of it:

You actually can’t watch the Buffy the Vampire Slayer film online, and honestly, you might be better off.  Luckily, all seven seasons of the Whedon-helmed (and approved) masterpiece that is Buffy the Vampire Slayer the series is easily streamed.  25 years later, Buffy movie is proof that our heroine was always better off in the hands of her maker.

Jade Budowski, “The ‘Buffy’ Movie At 25:  A Rough, Rough Draft Of The Magic That Followed,” Decider, July 31, 2017

The simultaneous display of blind devotion, proprietary entitlement, and self-assured dismissiveness in a statement like that, far from the only likeminded Millennial assessment of Buffy, is the kind of thing we humble Xers have spent a lifetime swallowing and shrugging off, even—especially—when we know better.  Not that anyone much cares what we have to say:

Here’s a refresher on the measliness of Generation X:  Our parents were typically members of the Silent Generation, that cohort born between 1928 and 1945—people shaped by the Great Depression and World War II, people who didn’t get to choose what they were having for dinner and made sure their kids didn’t either.  The parents of Gen X believed in spanking and borderline benign neglect, in contrast to the boisterous boomers and their deluxe offspring, the millennial horde. . . .

. . . Baby boomers and millennials have always had a finely tuned sense of how important they are.  Gen Xers are under no such illusion.  Temperamentally prepared to be criticized and undermined at all times, we never entirely trusted the people in charge anyway.

Pamela Paul, “Gen X Is Kind of, Sort of, Not Really the Boss,” Opinion, New York Times, August 14, 2022

Whereas the Millennials who deified Whedon have in recent years had to square their enduring love for Buffy with the spate of damning accusations against him—marital infidelity, feminist hypocrisy, emotionally abusive treatment of subordinates—the geek god’s fall from grace is no skin off Gen X’s nose; Big Daddy disavowed our Buffy, to the extent we feel that strongly about it one way or the other, decades ago.  Lucky for us, as Ms. Paul observes, we never entirely trusted the people in charge anyway.  And since Whedon’s critique of the Buffy movie remains to this day the culturally enshrined view of it, perhaps that merits reconsideration, too?

For the past quarter century, the differences between the Buffy movie and TV series have been authoritatively chalked up to all the usual cinema-snobbery bullshit:  tone and aesthetics and emotional depth and worldbuilding breadth.  Wrong.  The tonal disparity between the two Buffys has from the outset been greatly overstated.  The gap between Swanson’s Buffy and Gellar’s is, at its heart, generational.

Continue reading

Book Review:  “Heat 2” by Michael Mann + Meg Gardiner

This article discusses plot details and scene specifics from Michael Mann’s film Heat (1995) and his novel Heat 2 (2022).


John Carpenter’s dystopian classic Escape from New York (1981), set in 1997, opens with an expository intertitle:  “1988—The Crime Rate in the United States Rises Four Hundred Percent.”  Though that grim prognostication amounted to an exaggeration, the issue itself had nonetheless become a big deal here in the real world by the early 1990s:

In 1993, the year President Clinton took office, violent crime struck nearly 11 million Americans, and an additional 32 million suffered thefts or burglaries.  These staggering numbers put millions more in fear.  They also choked the economic vitality out of entire neighborhoods.

Politically, crime had become one of the most divisive issues in the country.  Republicans called for an ever more punitive “war on drugs,” while many Democrats offered little beyond nebulous calls to eliminate the “root causes” of crime.

David Yassky, “Unlocking the Truth About the Clinton Crime Bill,” Opinion, New York Times, April 9, 2016

Clinton’s response was the measurably effective (if still controversial) Violent Crime Control and Law Enforcement Act of 1994, otherwise known as the 1994 Crime Bill, coauthored by Joe Biden, the provisions of which—and this is just a sampling—added fifty new federal offenses, expanded capital punishment, led to the establishment of state sex-offender registries, and included the Federal Assault Weapons Ban (since expired) and the Violence Against Women Act.

It was an attempt to address a big issue in America at the time:  Crime, particularly violent crime, had been rising for decades, starting in the 1960s but continuing, on and off, through the 1990s (in part due to the crack cocaine epidemic).

Politically, the legislation was also a chance for Democrats—including the recently elected president, Bill Clinton—to wrestle the issue of crime away from Republicans.  Polling suggested Americans were very concerned about high crime back then.  And especially after George H.W. Bush defeated Michael Dukakis in the 1988 presidential election in part by painting Dukakis as “soft on crime,” Democrats were acutely worried that Republicans were beating them on the issue.

German Lopez, “The controversial 1994 crime law that Joe Biden helped write, explained,” Vox, September 29, 2020

Given the sociopolitical conditions of the era, it stands to reason—hell, it seems so obvious in hindsight—the 1990s would be a golden age of neo-noir crime cinema.  The death of Michael Corleone, as it happens, signified a rebirth of the genre itself; Martin Scorsese countered the elegiac lethargy—that’s not a criticism—of Francis Ford Coppola’s The Godfather, Part III with the coke-fueled kineticism of Goodfellas (both 1990).  Henry Hill shared none of Michael’s nagging reluctance about life in the Italian Mafia; he always wanted to be a gangster!

Reasoning that was probably true of audiences, too—as an author of horror stories, I certainly appreciate a healthy curiosity for the dark side—Hollywood offered vicarious trips into the criminal underworlds of Hell’s Kitchen, in Phil Joanou’s State of Grace (1990), and Harlem, in Mario Van Peebles’ New Jack City (1991), both of which feature undercover cops as major characters.  So does Bill Duke’s Deep Cover (1992), about a police officer (Laurence Fishburne) posing as an L.A. drug dealer as part of a broader West Coast sting operation.

The line between cop and criminal, so clearly drawn in the action-comedies of the previous decade (Lethal Weapon, Beverly Hills Cop, Stakeout, Running Scared), was becoming subject to greater ambiguity.  In no movie is that made more starkly apparent than Abel Ferrara’s Bad Lieutenant (1992), about a corrupt, hedonistic, drug-addicted, gambling-indebted, intentionally nameless New York cop (Harvey Keitel) investigating the rape of a nun in the vain hope it will somehow redeem his pervasive rottenness.

And it wasn’t merely that new stories were being told; this is Hollywood, after all, so we have some remakes in the mix.  Classic crime thrillers were given contemporary makeovers, like Scorsese’s Cape Fear (1991), as well as Barbet Schroeder’s Kiss of Death (1995), which is mostly remembered, to the extent it’s remembered at all, as the beginning and end of David Caruso’s would-be movie career, but which is much better than its reputation, thanks in no small part to a sharp script by Richard Price (Clockers), full of memorably colorful Queens characters and his signature street-smart dialogue.

Creative experimentation was in full swing, too, as neo-noir films incorporated conventions of other genres, including erotic thriller (Paul Verhoeven’s Basic Instinct [1992]), black comedy (the Coen brothers’ Fargo [1996] and The Big Lebowski [1998]), period throwback (Carl Franklin’s Devil in a Blue Dress [1995]; Curtis Hanson’s L.A. Confidential [1997]), neo-Western (James Mangold’s Cop Land [1997]), and, well, total coffee-cup-shattering, head-in-a-box mindfuckery (Bryan Singer’s The Usual Suspects; David Fincher’s Seven [both 1995]).

Christ, at that point, Quentin Tarantino practically became a subgenre unto himself after the one-two punch of Reservoir Dogs (1992) and Pulp Fiction (1994), which in turn inspired an incessant succession of self-consciously “clever” knockoffs like John Herzfeld’s 2 Days in the Valley (1996) and Gary Fleder’s Things to Do in Denver When You’re Dead (1995).  By the mid-’90s, the crime rate, at least at the cinema, sure seemed like it had risen by 400%.

Tim Roth lies bleeding as Harvey Keitel comes to his aid in a scene from the film “Reservoir Dogs,” 1992 (photo by Miramax/Getty Images)

As different as they all are, those films can almost universally be viewed as a repudiation of the ethos of ’80s action movies, in which there were objectively good guys, like John McClane, in conflict with objectively bad guys, like Hans Gruber, in a zero-sum battle for justice, for victory.  It was all very simple and reassuring, in keeping with the archconservative, righteous-cowboy worldview of Ronald Reagan.  And while those kinds of movies continued to find a receptive audience—look no further than the Die Hard–industrial complex, which begat Under Siege (1992) and Cliffhanger (1993) and Speed (1994), among scores of others—filmmakers were increasingly opting for multilayered antiheroes over white hats versus black hats.

Which raised the question:  Given how blurred the lines had become between good guys and bad guys in crime cinema, could you ever go back to telling an earnest, old-school cops-and-robbers story—one with an unequivocally virtuous protagonist and nefarious antagonist—that nonetheless aspired to be something more dramatically credible, more psychologically nuanced, more thematically layered than a Steven Seagal star vehicle?

Enter Michael Mann’s Heat.


A History of the Blog (So Far)—and a Programming Update

Since launching this blog eight years ago, I have maintained a consistent publishing schedule of one new post per month.  However, given the ways in which this ongoing project has evolved, that level of output is no longer sustainable.  Here’s a brief chronicle of the blog’s creative progression—and a statement on what comes next.


From the time I signed with my first literary manager in 1998 through the ignominious end of my career in Hollywood in 2014, I was exclusively focused on one form of creative expression:  screenwriting.

Though ultimately unproduced, my scripts nonetheless earned praise from producers and development execs for their uncommon visual suggestiveness and sharp sense of pace, which I controlled through deliberate syntactic arrangement of the one element of a screenplay that never appears in the finished film for audiences to appreciate:  the stage description.

Screenwriters, if you’re unaware, are not by and large particularly skillful wordsmiths.  And, to be fair, it’s not required of them.  Plot structure, characterization, and dialogue are what the screenwriter is there to provide for a motion picture.  Why waste time and creative energy on pretty prose in a blueprint, which is all a screenplay really is?

A rarefied handful of pro screenwriters, Shane Black and James Cameron among them, paint immersive pictures with their words, imparting how the world of the story feels rather than merely reporting, in sequence, what happens.  Such is the dynamic mode of screenwriting for which I strove.

Most screenplays—and I’m talking about scripts to produced films, written by Hollywood’s A-list scribes—aren’t much more than utilitarian laundry lists of things we’ll see and hear onscreen, conveyed without any visceral impression of style or tempo, and are, accordingly, nigh unreadable.  The director, after all, is going to make the movie he sees in his head; the script is just a means to get all the above- and below-the-line talent quite literally on the same page.

Excerpted from “Indiana Jones and the Kingdom of the Crystal Skull” by David Koepp.  Mind-numbing, no?

I actually like words, however.  I like how they sound, and the infinite combinations of meaning that can be made from them.  Truth is, I never should’ve aspired to be a screenwriter.  It was the wrong medium for my talents and interests.  “Author” and “essayist” were always a better fit for my writerly sensibilities.  It took the implosion of my career to finally embrace that.

So, when I started this blog at the encouragement of my wife—one of her many good ideas—I didn’t know quite what to write about except screenwriting.  Accordingly, my first two dozen posts are almost entirely devoted to matters of narrative craft, from my customized Storytelling 101 curriculum to the violation of the Double Hocus Pocus principle in Ghostbusters II to character deconstructions of Jack Bauer and John Rambo and a comparative analysis of the Jack Nicholson and Heath Ledger interpretations of the Joker.

One year into this blogging project, all my notions about narrativity were challenged—perhaps even shattered—by a book I’d read called Present Shock:  When Everything Happens Now (2013) by Douglas Rushkoff, which argued that Joseph Campbell’s “heroic journey,” the dramatic schema that has served as the structural basis for nearly every story in the Western literary canon, had collapsed around the turn of the millennium, as evidenced by the fanatical popularity of “storyless” fiction like Lost, The X-Files, The Sopranos, CSI:  Crime Scene Investigation, The Walking Dead, and Game of Thrones.

Rushkoff’s premise inspired a yearslong scholarly investigation on my part, which began in earnest with a post called “Journey’s End:  Rushkoff and the Collapse of Narrative,” and turned the blog in a new, more complex direction.  This intellectual project would never be the same.


Sorting through the Clutter:  How “The Girl Before” Misrepresents Minimalism

The Girl Before depicts minimalism as an obsessive-compulsive symptom of emotional instability, in contrast with what I can attest it to be from years of committed practice:  a versatile set of tools and techniques to promote emotional balance—that is, to attain not merely a clutter-free home, but a clutter-free head.


In the BBC One/HBO Max thriller The Girl Before, created by JP Delaney (based on his novel), brilliant-but-troubled architect Edward Monkford (David Oyelowo)—ah, “brilliant but troubled,” Hollywood’s favorite compound adjective; it’s right up there with “grounded and elevated”—is designer and owner of a postmodern, polished-concrete, minimalist home in suburban London, One Folgate Street, which he rents out, with extreme selectivity, at an affordable rate to “people who live [t]here the way he intended.”  Prospective tenants are required to submit to an uncomfortably aloof interview with Edward, whose otherwise inscrutable mien lapses into occasional expressions of condescending disapproval, and then fill out an interminable questionnaire, which includes itemizing every personal possession the candidate considers “essential.”

The rarefied few who meet with Edward’s approval must consent to the 200-odd rules that come with living in the house (no pictures; no ornaments; no carpets/rugs; no books; no children; no planting in the garden), enforced through contractual onsite inspections of the premises.  Meanwhile, One Folgate Street is openly monitored 24/7 by an AI automation system that tracks movements, polices violations of maximum-occupancy restrictions, regulates usage of water and electricity, sets time limits on tooth-brushing, and preselects “mood playlists”—just for that personal touch.  All of this is a reflection of Edward’s catholic minimalist philosophy:  “When you relentlessly eradicate everything unnecessary or imperfect, it’s surprising how little is left.”

“The Girl Before,” starring David Oyelowo, Gugu Mbatha-Raw, and Jessica Plummer

The Girl Before—and I’ve only seen the miniseries, not read the book—intercuts between two time periods, set three years apart, dramatizing the experiences of the current tenant, Jane Cavendish (Gugu Mbatha-Raw), grief-stricken over a recent stillbirth at 39 weeks, and the home’s previous occupant, Emma Matthews (Jessica Plummer), victim of a sexual assault during a home invasion at her flat.  (Emma, we soon learn, has since died at One Folgate Street under ambiguous circumstances that may or may not have something to do with Edward…?)  Edward’s minimalist dogma appeals to both women for the “blank slate” it offers—the opportunity to quite literally shed unwanted baggage.

This being a psychological thriller, it isn’t incidental that Jane and Emma bear an uncanny physical resemblance not merely to one another, but also to Edward’s late wife, who herself died at One Folgate Street along with their child, casualties of an accident that occurred during the construction of the home originally intended for the site before Edward scrapped those plans and went psychoneurotically minimalistic.  Everyone in The Girl Before is traumatized, and it is the imposition of or submission to minimalist living that provides an unhealthy coping mechanism for Edward, Jane, and Emma, each in their own way:

In this novel, [Delaney] wanted to explore the “weird and deeply obsessive” psychology of minimalism, evident in the fad for [Marie] Kondo and her KonMari system of organizing.  “On the face of it,” he wrote, “the KonMari trend is baffling—all that focus on folding and possessions.  But I think it speaks to something that runs deep in all of us:  the desire to live a more perfect, beautiful life, and the belief that a method, or a place, or even a diet, is going to help us achieve that.  I understand that impulse.  But my book is about what happens when people follow it too far.  As one of my characters says, you can tidy all you like, but you can’t run away from the mess in your own head.”

Gregory Cowles, “Behind the Best Sellers:  ‘Girl Before’ Author JP Delaney on Pseudonyms and the Limits of Marie Kondo,” New York Times, February 3, 2017

Indeed.  And if only The Girl Before had been a good-faith exploration of what minimalism, the psychology and practice of it, actually is.


“Superman IV” at 35:  How the “Worst Comic-Book Movie Ever” Epitomizes What We Refuse to Admit about Superhero Fiction

Superman IV:  The Quest for Peace, unanimously reviled for both its unconvincing visuals and cornball story, inadvertently accomplished the theretofore unrealized dream of scores of nefarious supervillains when it was released on this date in 1987:  It killed Superman.  (Or at least put the cinematic franchise into a two-decade dormancy.)

But a closer examination of the film suggests its objectively subpar storytelling might in fact be far more faithful to the spirit of the source material than today’s fanboy culture would care to concede.


Thirty-five years ago today, my mother took me to see Superman IV:  The Quest for Peace (1987).  Afterwards, we met up with my father at Doubleday’s, a neighborhood bar and grill that was the last stop on Broadway before you’d officially crossed the city line into Westchester County.  The restaurant had a hot-oil popcorn machine in the far corner, and when I went to refill our basket, I spied a man seated at the bar, nose in a copy of USA Today, the back panel of which boasted a full-page color advertisement for Superman IV.

When he caught me studying the ad, he asked, “Gonna go see the new Superman?”

“I just did.”

“Yeah?  How was it?”

“It was amazing,” I said, and I absolutely meant it.  Sensing my sincerity, the gentleman pulled the ad from the bundle of folded pages and handed it to me as a souvenir.  When I got home, I taped it up on my bedroom wall.

The theatrical one-sheet for “Superman IV” looks like a textbook “Action Comics” cover from the ’80s

Sidney J. Furie’s Superman IV:  The Quest for Peace is not amazing.  It is, in fact, commonly regarded as one of the worst comic-book movies ever made—if not the worst—in eternal competition for last place with Batman & Robin (1997) and Catwoman (2004).  It suffered from a notoriously troubled production:  After the diminishing returns of Superman III (1983) and spin-off Supergirl (1984), series producers Alexander and Ilya Salkind sold their controlling interests in the IP to the Cannon Group, the schlockmeister studio responsible for the American Ninja, Missing in Action, Breakin’, and Death Wish franchises—not exactly the optimal custodians of a series that had started out, against all expectation, so magnificently.

Richard Donner’s Superman:  The Movie (1978) was and remains the finest specimen of superhero cinema ever presented, at once ambitiously epic and emotionally relatable.  It pulls off the impossible in so many ways, first and foremost that it absolutely made us believe a man could fly, which had never been credibly accomplished before.  Credit for that goes not only to the VFX team, which won the Academy Award for its efforts, but to Christopher Reeve, who delivered the movie’s most timeless special effect:  endowing a spandex-clad demigod with profound dignity and genuine vulnerability.  Even the lesser Superman films—and we’ll talk more about those soon enough—are elevated by Reeve’s extraordinary performance, which occupies a lofty position, right alongside Bela Lugosi’s Dracula, in the pantheon of defining interpretations of folkloric icons.

What’s also so remarkable about Superman is how many different tonal aesthetics it assimilates.  The opening sequences on Krypton with Marlon Brando feel downright Kubrickian; Donner somehow channels the cosmic splendor of 2001:  A Space Odyssey (1968), only to then transition us to Smallville, as warm and fertile as Krypton was cold and barren, which evokes the same spirit of sock-hop Americana George Lucas conjured to such success in American Graffiti (1973).

The remainder of the movie shifts fluidly from His Girl Friday–style newsroom comedy (the scenes at the Daily Planet) to urban action thriller à la The French Connection (the seedy streets of 1970s Metropolis) to Roger Moore–era 007 outing (Lex Luthor’s sub–Grand Central lair, complete with comically inept henchmen) to Irwin Allen disaster film (the missile that opens up the San Andreas Fault in the third act and sets off a chain reaction of devastation along the West Coast).

Somehow it coheres into a movie that feels like the best of all worlds rather than a derivative Frankenstein’s monster.  Up until that time, superhero features and television, hampered by juvenile subject matter and typically subpar production values, seemed inherently, inexorably campy.  The notion that a superhero movie could rise to the level of myth, or at least credibly dramatic science fiction, was unthinkable.  Superman is the proof-of-concept paradigm on which our contemporary superhero–industrial complex is predicated.


EXT. LOS ANGELES – ONE YEAR LATER

I thought I’d said everything I had to say about Los Angeles last winter.  Should’ve known Hollywood would demand a sequel.


Even at the height of its considerable cultural influence, I never much cared for Sex and the City—for a very simple reason:  I didn’t in any way recognize the New York it depicted.

To someone who’d grown up there, Sex played like a postfeminist fantasy of the city as a bastion of neoliberal materialism, conjured by a writer who’d never actually been to New York or knew so much as the first thing about it.  It certainly didn’t reflect the experience of any working-class New Yorkers I knew.

(It would seem the more things change, the more they stay the same:  The recent SATC revival series, And Just Like That…, is reported to be full of unintentionally cringe-inducing scenes of the gals apparently interacting with Black women for the first time in their lives.  Sounds on-brand.)

But this isn’t a retroactive reappraisal of a 1990s pop-cultural pacesetter—those have been exhaustively conducted elsewhere of late—merely an acknowledgment that the impression the series made on the generation of (largely) female Millennials who adored it is undeniable, legions of whom relocated to New York in early adulthood to have the full Sex and the City experience, and who, in turn, in many ways remade the city in Carrie Bradshaw’s image, for better or worse.

I can’t say as I blame those folks, really.  That they were sold a load of shit isn’t their fault.  Here in New York, we were just as susceptible to Hollywood’s greener-grass illusions of elsewhere.  As a student in the 1990s, the Los Angeles of Beverly Hills, 90210 (1990–2000) and Baywatch (1989–2001), of Buffy the Vampire Slayer (1992) and Clueless (1995), seemed like a fun-in-the-sun teenage paradise in stark contrast with the socially restrictive experience of my all-boys high school in the Bronx, where the only things that ever passed for excitement were the spontaneous gang beatings at the bus stop on Fordham Road.

The high-school experience depicted on “Beverly Hills, 90210” is one I think we can all relate to

The sunny schoolyards and neon-lit nighttime streets of L.A. carried the promise of good times, the kind that seemed altogether out of reach for me and my friends.  The appeal of what California had to offer was so intoxicating, in fact, my two best pals and I spent an entire summer in the mid-’90s trying to make the streets of the Bronx look like Santa Cruz—a place none of us had ever been—for an amateur sequel to The Lost Boys, the ’80s cult classic about a coven of adolescent vampires who’ve (wisely) opted to spend eternity on the boardwalk.  That notion unquestionably took hold of my impressionable imagination—it made me want to be a part of that culture, and tell those kinds of stories.

Accordingly, it’s fair to say it wasn’t merely the movie business that brought me to Los Angeles in my early twenties as an aspiring screenwriter, but arguably the romantic impressions of California itself imprinted upon my psyche by all those movies and TV series on which I came of age.  Yet for the two decades I lived there, the city I’d always imagined L.A. to be—a place full of golden possibilities, as low-key as New York was high-strung—wasn’t the one I experienced.  Not really.  Not until last month, anyway.


The Last Walking Infinity Throne Corrupts Infinitely:  How the Mega-Franchise Format Warps Creative Storytelling Goals

“As a medium, stories have proven themselves great as a way of storing information and values, and then passing them on to future generations”—Douglas Rushkoff, Present Shock:  When Everything Happens Now (New York:  Penguin Group, 2013), 16.

Traditionally, stories have been organized around universal dramatic principles first identified by Aristotle in Poetics, later codified by Joseph Campbell in The Hero with a Thousand Faces, and most recently customized for screenwriters in programs like Blake Snyder’s Save the Cat!  But in recent decades, narrativity has taken on a new, shapeless, very possibly endless permutation:  the transmedia “mega-franchise”—that is, the intertextual and ever-expanding storyworlds of Marvel, Star Wars, The Conjuring, Harry Potter’s Wizarding World, et al.

In this month’s guest post, friend of the blog Dave Lerner returns to delineate the five creative objectives of storytelling—and how those have mutated, along with narrativity itself, in this era of branded-IP entertainment.


From the first cave paintings to the Homeric epics to the Globe Theatre to the multicamera sitcom, storytellers across the ages have told stories for reasons so obvious they often go unstated and unacknowledged.

Let’s take a look at the five creative goals that guide storytellers in any medium, whether it be a movie, novel, TV episode, comic book, or otherwise.  Commercial considerations such as “profit” and “being hired to do so” are omitted here, as these are not creative goals.

Storytelling Goal #1:  Entertainment

Elementary!  The storyteller intends for their audience to have fun, to relax, to take their minds off their problems, to experience another world, another life, for a while.  Pure escapism.  While some may decry “mindless entertainment,” I would argue that it has a necessary place in life—and I’m not the only one who sees the virtues of escapist stories:

Hence the uneasiness which they arouse in those who, for whatever reason, wish to keep us wholly imprisoned in the immediate conflict.  That perhaps is why people are so ready with the charge of “escape.”  I never fully understood it till my friend Professor Tolkien asked me the very simple question, “What class of men would you expect to be most preoccupied with, and hostile to, the idea of escape?” and gave the obvious answer:  jailers.

C. S. Lewis, On Stories:  And Other Essays on Literature

Storytelling Goal #2:  Artistic Expression

Although the definition of “Art” has been and will be debated endlessly, for the purpose of this category I will use the second definition from Wiktionary:

The creative and emotional expression of mental imagery, such as visual, auditory, social, etc.

To further specify, art is more about the feelings the artist is expressing and the statement the artist is making than the emotions they are attempting to evoke in their audience.

Arguments about whether or not a given piece is “art,” or a given medium is “capable of creating art,” though valid in other contexts, will be disregarded here.  I’ll assume if you say your piece is art, then it’s art.  I am also setting aside judgments of quality, as in the honorific phrase “a work of art.”  By my definition, a movie can be as much a piece of art as a painting, sculpture, symphony, literary novel, etc., though when it is, it’s usually called a “film” and not a “movie.”

Storytelling Goal #3:  Education

The storyteller aspires to teach their audience something they did not know before.  While documentaries and lectures are obvious examples, many read historical novels or hard science fiction for much the same purpose.  When I was a child, I first learned from a Shazam! comic book that water expands when it freezes.  Of course, a person may forget most of what they’ve learned almost immediately afterwards, but the learning experience itself is enjoyable nonetheless.

“Young Indiana Jones,” recently studied here, incorporated biographical information about many early-20th-century historical figures, fulfilling the third of five storytelling goals

Even if the “facts” presented are deliberately inaccurate, as long as the intent is for people to believe them, this category applies.


© 2024 Sean P Carlin
