Writer of things that go bump in the night

Tag: horror

Under the Influence, Part 1:  On Artistic Inspiration, Finding One’s Voice, and Tarantino’s Formative Faves

Let’s play Ten for Ten!  To commemorate the ten-year anniversary of this blog, which launched on June 26, 2014, here’s an appreciation for ten of my formative cinematic influences—an examination of why these movies resonated with me when I first saw them, and how they permanently informed my aesthetic tastes and creative sensibilities.  This post is presented in three installments.

“Under the Influence, Part 1” informally ponders through personal example how an artist develops a singular style and voice all their own, and offers an analysis of Quentin Tarantino’s essay collection Cinema Speculation, the auteur’s critical look at the movies of the ’70s that inspired him.

In “Under the Influence, Part 2,” I spotlight five films from my ’80s childhood that shaped my artistic intuition when at its most malleable.

And in “Under the Influence, Part 3,” I round out the bill with five selections from my ’90s adolescence, the period during which many of the themes that preoccupy me crystallized.


It takes an unholy degree of time and stamina to write a book.  Consequently, it’s advisable to have a really good reason to take a given project on—then see it through to the finish line.  Before typing word one of a new manuscript, it behooves us to ask (and answer):  Why is this project worth the herculean effort required to bring it into existence?

I wrote my debut novel The Dogcatcher (2023) for the most elemental of motives:  I ached for the kind of bygone horror/comedies on which I’d come of age in the ’80s, an era that produced such motley and memorable movies as An American Werewolf in London (1981), The Evil Dead (1981), Gremlins (1984), Ghostbusters (1984), The Witches of Eastwick (1987), The Lost Boys (1987), The Monster Squad (1987), The ’Burbs (1989), and Tremors (1990).  Where have those kinds of movies gone? I wondered.

Hollywood, to be fair, hadn’t stopped making horror/comedies; it had simply long since stopped making them with any panache.  I have spent many a Saturday night over the past decade in a binge-scrolling malaise, surfing numbly through hundreds of viewing options on Netflix or Prime or Hulu or whatever, when suddenly my inner adolescent’s interest is piqued—as though I were back at the old video store and had found a movie right up my alley.

I certainly sensed the stir of possibility in Vampires vs. the Bronx (2020), about a group of teenagers from my hometown battling undead gentrifiers.  Night Teeth (2021), featuring bloodsuckers in Boyle Heights, seemed equally promising.  And Werewolves Within (2021) is set in a snowbound Northeastern United States township already on edge over a proposed pipeline project when its residents find themselves under attack by a werewolf.

“Vampires vs. the Bronx” (2020) seemed like the perfect mix of Gen X–era throwback and Gen Z–era social commentary

All of a sudden, I felt like that sixteen-year-old kid who saw the one-sheet for Buffy the Vampire Slayer (1992) while riding the subway to work—“She knows a sucker when she sees one,” teased the tagline, depicting a cheerleader from the neck down with a wooden stake in her fist—and knew he was in for a good time at the cinema.

No such luck.  Vampires vs. the Bronx, in an act of creative criminality, pisses away a narratively and thematically fertile premise through flat, forgettable execution.

Night Teeth, meanwhile, answers the question:  How about a movie set in the same stomping ground as Blade (1998)—inner-city L.A., clandestine vampiric council calling the shots—only without any of its selling-point stylistics or visual inventiveness?

And Werewolves Within establishes an intriguing environmental justice subplot the screenwriter had absolutely no interest in or, it turns out, intention of developing—the oil pipeline isn’t so much a red herring as a dead herring—opting instead for a half-assed, who-cares-less whodunit beholden to all the standard-issue genre tropes.

Faced with one cinematic disappointment after another, I concluded that the only way to sate my appetite for the kind of horror/comedy that spoke to me as a kid was to write my own.

On the subject of kids—specifically, stories about twelve-year-old boys—I haven’t seen one of those produced with any appreciable measure of emotional honesty or psychological nuance since Rob Reiner’s Stand by Me (1986), based on Stephen King’s 1982 novella The Body.  That was nearly forty years ago!

Storytellers know how to write credible children (E.T. the Extra-Terrestrial, Home Alone, Room), and they know how to write teenagers (The Outsiders, Ferris Bueller’s Day Off, Clueless), but preadolescent boys are almost invariably reduced to archetypal brushstrokes (The Goonies, The Sandlot, Stranger Things).  The preteen protagonists of such stories are seldom made to grapple with the singular emotional turbulence of having one foot in childhood—still watching cartoons and playing with action figures—and the other in adolescence—beginning to regard girls with special interest, coming to realize your parents are victims of generational trauma that’s already in the process of being passed unknowingly and inexorably down to you.

For all of popular culture’s millennia-long fixation on and aggrandizement of the heroic journey of (usually young) men, our commercial filmmakers and storytellers either can’t face or don’t know how to effectively dramatize the developmental fulcrum of male maturation.  George Lucas’ experimental adventure series The Young Indiana Jones Chronicles (1992–1996) sheds light on Indy’s youth from ages eight through ten (where he’s portrayed by Corey Carrier) and then sixteen through twenty-one (Sean Patrick Flanery); the complicated messiness of pubescence, however, is entirely bypassed.  Quite notably, those are the years in which Indy’s mother died and his emotionally distant father retreated into his work—formative traumas that shaped, for better and worse, the adult hero played by Harrison Ford in the feature films.

Lucas’ elision seems odd to me—certainly a missed creative opportunity—given that twelve-going-on-thirteen is the period of many boys’ most memorable and meaningful adventures.  King and Reiner never forgot that, and neither did I, hence the collection of magical-realism novellas I’m currently writing that explore different facets of that transitory experience:  going from wide-eyed wonder to adolescent disillusionment as a result of life’s first major disappointment (Spex); being left to navigate puberty on your own in the wake of divorce (The Brigadier); struggling to understand when, how, and why you got socially sorted at school with the kids relegated to second-class citizenry (H.O.L.O.).

This single-volume trilogy, I should note, isn’t YA—these aren’t stories about preteens for preteens.  Rather, they are intended, like The Body/Stand by Me before them, as a retrocognitive exercise for adults who’ve either forgotten or never knew the experience of being a twelve-year-old boy to touch base with that metamorphic liminality in all of its psychoemotional complexity.  They’re very consciously stories about being twelve as reviewed from middle-aged eyes.

As I’ll demonstrate in “Part 2” and “Part 3,” both that WIP and The Dogcatcher take inspiration—narratively, thematically, aesthetically, referentially—from the stories of my youth, the books and movies that first kindled my imagination and catalyzed my artistic passions.

Continue reading

“The Dogcatcher” Unleashed:  The Story behind My Debut Novel

My first novel, The Dogcatcher, is now available from DarkWinter Press.  It’s an occult horror/dark comedy about a municipal animal-control officer whose Upstate New York community is being terrorized by a creature in the woods.  Here’s a (spoiler-free) behind-the-scenes account of the project’s creative inception and development; how it’s responsible for my being blackballed in Hollywood; how the coronavirus pandemic challenged and ultimately elevated the story’s thematic ambitions; and how these characters hounded my imagination—forgive the pun—for no fewer than fourteen years.

The Dogcatcher is on sale in paperback and Kindle formats via Amazon.


In the spring of 2007, I came home from L.A. for a week to attend my sister’s graduation at Cornell University.  It was my first occasion to sojourn in the Finger Lakes region, and I took the opportunity to stay in Downtown Ithaca, tour the Cornell campus, and visit Buttermilk Falls State Park.  I was completely taken with the area’s scenic beauty and thought it would make the perfect location for a screenplay.  Only trouble was, all I had was a setting in search of a story.

CUT TO:  TWO YEARS LATER

Binge-watching wasn’t yet an institutionalized practice, but DVD-by-mail was surging, and my wife and I were, as such, working our way through The X-Files (1993–2002) from the beginning.  Though I have ethical reservations about Chris Carter’s hugely popular sci-fi series, I admired the creative fecundity of its monster-of-the-week procedural format, which allowed for the protagonists, his-and-her FBI agents Mulder and Scully, to investigate purported attacks by mutants and shapeshifters in every corner of the United States, from bustling cities to backwater burgs:  the Jersey Devil in Atlantic City (“The Jersey Devil”); a wolf-creature in Browning, Montana (“Shapes”); a prehistoric plesiosaur in Millikan, Georgia (“Quagmire”); El Chupacabra in Fresno, California (“El Mundo Gira”); the Mothman in Leon County, Florida (“Detour”); a giant praying mantis in Oak Brook, Illinois (“Folie à Deux”); a human bat in Burley, Idaho (“Patience”).

Special Agents Fox Mulder (David Duchovny) and Dana Scully (Gillian Anderson) in “The X-Files”

But the very premise of The X-Files stipulated that merely two underfunded federal agents, out of the Bureau’s approximately 35,000 employees, were assigned to investigate such anomalous urban legends.  I wondered:  If an average American town found itself bedeviled by a predatory cryptid—in real life, I mean—would the FBI really be the first responders?  Doubtful.  But who would?  The county police?  The National Guard?  If, say, a sasquatch went on a rampage, which regional public office would be best equipped to deal with it…?

That’s when it occurred to me:  Animal Control.

And when I considered all the cultural associations we have with the word dogcatcher—“You couldn’t get elected dogcatcher in this town”—I knew I had my hero:  a civil servant who is the butt of everyone’s easy jokes, but whose specialized skills and tools and, ultimately, compassion are what save the day.

But it was, to be sure, a hell of a long road from that moment of inspiration to this:

When the basic concept was first devised, I wrote a 20-page story treatment for an early iteration of The Dogcatcher, dated August 25, 2009.  That same summer, I signed with new literary managers, who immediately wanted a summary of all the projects I’d been working on.  Among other synopses and screenplays, I sent them the Dogcatcher treatment.

They hated it.  They argued against the viability of mixing horror and humor, this despite a long precedent for such an incongruous tonal marriage in commercially successful and culturally influential movies the likes of An American Werewolf in London (1981), Ghostbusters (1984), Gremlins (1984), The Lost Boys (1987), Tremors (1990), Scream (1996), and Shaun of the Dead (2004), to say nothing of then–It Girl Megan Fox’s just-released succubus satire Jennifer’s Body (2009).  (I knew better than to cite seventy-year-old antecedents such as The Cat and the Canary and Hold That Ghost; Hollywood execs have little awareness of films that predate their own lifetimes.)  I was passionate about The Dogcatcher, but it was only one of several prospective projects I was ready to develop, so, on the advice of my new management, I put it in a drawer and moved on to other things.

Continue reading

Into Each Generation a Slayer Is Born:  How the “Buffy” Franchise Demonstrates the Differences between Gen X and Millennials

A cultural blip, disowned and dismissed.  A cultural phenomenon, nurtured and celebrated.  Is there any doubt Kristy Swanson’s Buffy the Vampire Slayer is an Xer, and Sarah Michelle Gellar’s a Millennial?


Joss Whedon famously dislikes the movie made from his original screenplay for Buffy the Vampire Slayer (1992), directed by Fran Rubel Kuzui and starring Kristy Swanson.  Seems he’d envisioned a B-movie with a Shakespearean soul, whereas Kuzui saw pure juvenile camp—an empowerment tale for prepubescent girls.

Buffy arrived right before it became cool for teenagers to brood about real things like depression and the cost of Doc Martens.  But something about this particular movie was bewitching to a tweeny bopper with an alternative undertow.  It had gloss and edge—but more gloss than edge.  This was a pre-Clueless, Skittles-tinted ode to California ditz. . . .  The result was an unfussy pre–Spice Girls girl-power fantasy for a 12-year-old kid.

Soraya Roberts, “I’ll Always Love the Original Buffy the Vampire Slayer,” Atlantic, July 31, 2022

Only a modest success during its theatrical run, the cult horror/comedy found an appreciable audience on VHS.  Three years later, nascent netlet The WB saw an opportunity to bring the inspired concept of Valley girl–turned–vampire slayer to television—only this time under the auspices of the IP’s disgruntled creator:

Building on his original premise, he re-imagined the monsters as metaphors for the horrors of adolescence.  In one climactic scene, Buffy loses her virginity to a vampire who has been cursed with a soul; the next morning, his soul is gone and he’s lusting for blood.  Any young woman who had gone to bed with a seemingly nice guy only to wake up with an asshole could relate. . . .

In those early days of the internet, before nerd culture swallowed the world, fans flocked to a message board set up by the WB to analyze Buffy with the obsessive zeal of Talmudic scholars.  Whedon knew how to talk to these people—he was one of them.  He would visit the board at all hours to complain about his grueling schedule or to argue with fans about their interpretations of his work.  Back then, as he pointed out to me, the internet was “a friendly place,” and he, the quick-witted prince of nerds, “had the advantage of it.”

Lila Shapiro, “The Undoing of Joss Whedon,” Vulture, January 17, 2022

It is impossible to fully appreciate the monopolistic stranglehold geek interests have maintained on our culture over the first two decades of this millennium without acknowledging the pivotal role Buffy the Vampire Slayer (1997–2003) played in elevating such pulp ephemera to a place of mainstream legitimacy and critical respectability.  It was the right premise (Whedon pitched it as My So-Called Life meets The X-Files) on the right network (one willing to try new ideas and exercise patience as they found an audience) by the right creator (a card-carrying, self-professed geek) speaking to the right audience (impressionable Millennials) at the right time (the dawn of the Digital Age).  It all synthesized at exactly that moment.  Forget Booger—Buffy was our culture’s revenge of the nerds.

Sarah Michelle Gellar and Joss Whedon on the set of “Buffy the Vampire Slayer”

In what was surely a first for any geek or screenwriter, let alone a combo platter, a cult of hero worship coalesced around Whedon.  His genius was celebrated on message boards and at academic conferences, inked in books and on body parts.  “He was a celebrity showrunner before anyone cared who ran shows” (ibid.).

Master storyteller that he is, Whedon didn’t merely reset the narrative of Buffy; he reframed the narrative about it.  While serving as a loose sequel to the feature film, the television series wasn’t Buffy the Vampire Slayer 2 so much as Buffy the Vampire Slayer 2.0—a complete overhaul and upgrade.  This was Buffy as it was always intended to be, before Hollywood fucked up a great thing.  That the startup-network show emerged as a phoenix from the ashes of a major-studio feature only burnished Whedon’s geek-underdog credentials.  To utter the word “Buffy” was to be speaking unambiguously about the series, not the movie.

What movie?

In 1997, Whedon premiered his Buffy series on The WB and essentially wiped the film from the collective memory.

By that point, I had turned 17, and even though the show was more serious than the movie, even though its universe was cleverer and more cohesive, even though the silent episode “Hush” was probably one of the best things on television at the time it aired, Buffy was still a vampire show—to me, it was just kids’ play.  My adolescence adhered to a kind of Gen-X aimlessness, to indie films with lots of character and very little plot.  Whedon’s show seemed more like the kind of thing Reality Bites would make fun of—a juvenile, overly earnest studio product.

Roberts, “I’ll Always Love the Original Buffy the Vampire Slayer”

As a member of Ms. Roberts’ demographic cohort, four years her senior, I’ll second that appraisal.  Yet for the Millennials who came of age in a post-Whedon world, and who were introduced to Buffy through the series—who fell in love with her on TV—Whedon’s creative contextualization of the movie became the universally accepted, unchallenged, and perennially reinforced perception of it:

You actually can’t watch the Buffy the Vampire Slayer film online, and honestly, you might be better off.  Luckily, all seven seasons of the Whedon-helmed (and approved) masterpiece that is Buffy the Vampire Slayer the series is easily streamed.  25 years later, Buffy movie is proof that our heroine was always better off in the hands of her maker.

Jade Budowski, “The ‘Buffy’ Movie At 25:  A Rough, Rough Draft Of The Magic That Followed,” Decider, July 31, 2017

The simultaneous display of blind devotion, proprietary entitlement, and self-assured dismissiveness in a statement like that, far from the only likeminded Millennial assessment of Buffy, is the kind of thing we humble Xers have spent a lifetime swallowing and shrugging off, even—especially—when we know better.  Not that anyone much cares what we have to say:

Here’s a refresher on the measliness of Generation X:  Our parents were typically members of the Silent Generation, that cohort born between 1928 and 1945—people shaped by the Great Depression and World War II, people who didn’t get to choose what they were having for dinner and made sure their kids didn’t either.  The parents of Gen X believed in spanking and borderline benign neglect, in contrast to the boisterous boomers and their deluxe offspring, the millennial horde. . . .

. . . Baby boomers and millennials have always had a finely tuned sense of how important they are.  Gen Xers are under no such illusion.  Temperamentally prepared to be criticized and undermined at all times, we never entirely trusted the people in charge anyway.

Pamela Paul, “Gen X Is Kind of, Sort of, Not Really the Boss,” Opinion, New York Times, August 14, 2022

Whereas the Millennials who deified Whedon have in recent years had to square their enduring love for Buffy with the spate of damning accusations against him—marital infidelity, feminist hypocrisy, emotionally abusive treatment of subordinates—the geek god’s fall from grace is no skin off Gen X’s nose; Big Daddy disavowed our Buffy, to the extent we feel that strongly about it one way or the other, decades ago.  Lucky for us, as Ms. Paul observes, we never entirely trusted the people in charge anyway.  And since Whedon’s critique of the Buffy movie remains to this day the culturally enshrined view of it, perhaps that merits reconsideration, too?

For the past quarter century, the differences between the Buffy movie and TV series have been authoritatively chalked up to all the usual cinema-snobbery bullshit:  tone and aesthetics and emotional depth and worldbuilding breadth.  Wrong.  The tonal disparity between the two Buffys has from the outset been greatly overstated.  The gap between Swanson’s Buffy and Gellar’s is, at its heart, generational.

Continue reading

“Scream” at 25: Storytelling Lessons from Wes Craven’s Slasher Classic

In honor of the twenty-fifth anniversary of Wes Craven’s Scream, released on this date in 1996, here’s how the movie revived a genre, previewed a defining characteristic of Generation X, dramatized the psychological toll of trauma with uncommon emotional honesty—and how it even offers a roadmap out of the prevailing narrative of our time:  extractive capitalism.


For all the decades we’ve been together, my wife and I have observed a particular protocol, probably owed to how many movies we used to see at the two-dollar cinema in Hell’s Kitchen when we were dirt-poor college students:  Upon exiting the theater, neither issues a comment on or reaction to the film we just saw.  Instead, we save the discussion for when we’re seated at a nearby restaurant, at which point one or the other invariably asks, “Do you want to go first?”  As far as I can recall, we’ve broken with that tradition but once.

“We just saw a classic,” she blurted as we staggered our way through the lobby moments after seeing Scream.  “They’ll still be talking about that in twenty years.”  (Such an estimate, in fairness, seemed like a glacially long time when we were barely that many years old ourselves.)

In fact, a full quarter century has now passed since the release of the late Wes Craven’s postmodern slasher masterpiece, and the movie has very much earned a fixed place in the cultural consciousness.  That opening sequence alone, so shocking at the time, hasn’t lost any of its power to frighten and disturb; an entire semester could be spent studying it, from the exquisite camerawork to the dramatic pacing to Drew Barrymore’s heartwrenchingly credible performance as a young woman scared shitless—and this despite having no one in the scene to act against save a voice on a phone.  Ten minutes into the movie, its marquee star is savagely disemboweled… and now you don’t know what the hell to expect next!

Drew Barrymore as Casey Becker in “Scream”

I really can’t say I’ve seen a horror film since that was at once so scary, clever, entertaining, influential, and of its moment the way Scream was.  With eerie prescience, Craven and screenwriter Kevin Williamson (born 1965) seemed to put their finger on an idiopathic attribute of Generation X that would, as Xers settled into adulthood and eventually middle age, come to define the entirety of the pop-cultural landscape over which we currently preside:  that rather than using fiction to reflect and better understand reality—viewing narrativity as “a coherent design that asks questions and provides opinions about how life should be lived,” per Christopher Vogler—we more or less gave up on understanding reality in favor of mastering the expansive, intricate storyworlds of Star Wars and Star Trek, DC and Marvel, Westworld and Game of Thrones.  And such figure-ground reversal started long before the Marvel–industrial complex capitalized on it.

In the early ’90s, as the first members of Gen X were becoming filmmakers, avant-garde auteurs like Quentin Tarantino (born 1963) and Kevin Smith (1970) not only devoted pages upon pages in their screenplays to amusingly philosophical conversations about contemporary pop culture, but the characters across Tarantino and Smith’s various movies existed in their own respective shared universes, referencing other characters and events from prior and sometimes even yet-to-be-produced films.  That kind of immersive cinematic crosspollination, inspired by the comic books Tarantino and Smith had read as kids, rewarded fans for following the directors’ entire oeuvres and mindfully noting all the trivial details—what later came to be known as “Easter eggs.”

What’s more, the trove of pop-cultural references embedded in their movies paid off years of devoted enrollment at Blockbuster Video.  Whereas previously, fictional characters seemed to exist in a reality devoid of any pop entertainment of their own—hence the reason, for instance, characters in zombie movies were always on such a steep learning curve—now they openly debated the politics of Star Wars (Clerks); they analyzed the subtext of Madonna lyrics (Reservoir Dogs); they waxed existential about Superman’s choice of alter ego (Kill Bill:  Volume 2); they even, when all was lost, sought the sagacious counsel of that wisest of twentieth-century gurus:  Marvel Comics’ Stan Lee (Mallrats).

For Gen X, our movies and TV shows and comics and videogames are more than merely common formative touchstones, the way, say, the Westerns of film (Rio Bravo, The Magnificent Seven) and television (Bonanza, Gunsmoke) had been for the boomers.  No, our pop culture became a language unto itself:  “May the Force be with you.”  “Money never sleeps.”  “Wax on, wax off.”  “Wolfman’s got nards!”  “I’m your density.”  “Be excellent to each other.”  “Do you still want his daytime number?”  “Just when you thought it was safe to go back in the water…”

Those are more than quotable slogans; they’re cultural shorthands.  They express a worldview that can only be known and appreciated by those of us encyclopedically literate in Reagan-era ephemera, like the stunted-adolescence slackers from Clerks and nostalgic gamer-geeks of Ready Player One and, of course, the last-wave Xers in Scream:

Kevin Williamson, “Scream” (undated screenplay draft), 89

The characters from Scream had grown up watching—arguably even studying—Halloween and Friday the 13th and A Nightmare on Elm Street on home video and cable TV, so they had an advantage the teenage cannon fodder from their favorite horror movies did not:  They were savvy to the rules of the genre.  Don’t have sex.  Don’t drink or do drugs.  Never say “I’ll be right back.”

There was a demonstrably prescriptive formula for surviving a slasher movie—all you had to do was codify and observe it.  That single narrative innovation, the conceptual backbone of Scream, was revelatory:  Suddenly everything old was new again!  A creatively exhausted subgenre, long since moldered by its sequel-driven descent into high camp, could once again be truly terrifying.

Continue reading

Too Much Perspective: On Writing with Moral Imagination

Practicing morally imaginative storytelling means scrutinizing the values and messages encrypted in the fiction we produce—but it does not mean passing a “purity test.”


In Marty Di Bergi’s 1984 rockumentary This Is Spinal Tap, the titular British heavy-metal band, faced with ebbing popularity and flagging album sales, embarks on a disaster-prone tour of North America in support of its latest release, the critically savaged Smell the Glove.  During a stopover at Graceland to pay their respects to the King of Rock and Roll at his gravesite, lead vocalist David St. Hubbins comments, “Well, this is thoroughly depressing.”

To which bandmate and childhood best friend Nigel Tufnel responds, “It really puts perspective on things, though, doesn’t it?”

“Too much.  There’s too much fucking perspective now.”

It’s a sentiment to which we can all relate, collectively endowed as we’ve become with a migrainous case of “2020 vision.”  At the start of the pandemic, long before we had any sense of what we were in for, let alone any perspective on it, I, like many essayists, felt the urge or need or even the responsibility to say something about it, despite knowing I had no useful or meaningful insight.  I netted out with an acknowledgment that the months to come would present a rare Digital Age opportunity for quiet introspection and reflection—one in which we might expand our moral imagination of what’s possible, to invoke the exquisite wisdom of my mentor Al Gore, and perhaps envision a world on the other side appreciably more just, equitable, and sustainable than the one we had before the global shutdown.

Did we ever.  Here in the United States, we are now wrestling with issues of economic inequality, structural racism, police brutality, environmental justice, and fair access to affordable housing and healthcare with an awareness and an urgency not seen in generations, and President Joe Biden—responding to the social movements of his times like FDR and LBJ before him—has proposed a host of progressive legislation that matches the visionary, transformative ambition of the New Deal and the Great Society.


With heartening moral imagination (certainly more than this democratic eco-socialist expected from him), Biden is attempting to turn the page on the Randian, neoliberal narrative of the past forty years and write a new chapter in the American story—one founded on an ethos of sympathetic coexistence, not extractive exploitation.  With our continued grassroots support and, when necessary, pressure, he might even be the unlikely hero to pull it off, too—our Nixon in China.

As for me?  I spent most of the pandemic thinking about narrativity myself.  Doing nothing, after all, was a privilege of the privileged, with whom I am obliged to be counted.  So, I used the time in self-quarantine to think and to write about the stories we tell, and I arrived at the resolute conclusion that we—the storytellers—need to do a lot better.

Continue reading

Here Lies Buffy the Vampire Slayer: On Letting Go of a Fan Favorite—and Why We Should

Last month, actress Charisma Carpenter publicly confirmed a longstanding open secret in Hollywood:  Buffy the Vampire Slayer creator and Avengers writer/director Joss Whedon is an irredeemable asshole.

For years, fans of “Buffy the Vampire Slayer,” which aired on the WB and UPN from 1997 to 2003, have had to reconcile their adoration for a show about a teenage girl who slays monsters with the criticism that often swirled around her creator.

Mr. Whedon’s early reputation as a feminist storyteller was tarnished after his ex-wife, the producer Kai Cole, accused him of cheating on her and lying about it.  The actress Charisma Carpenter, a star of the “Buffy” spinoff “Angel,” hinted at a fan convention in 2009 that Mr. Whedon was not happy when she became pregnant.

In July, Ray Fisher, an actor who starred in Mr. Whedon’s 2017 film “Justice League,” accused him of “gross” and “abusive” treatment of the cast and crew. . . .

On Wednesday, Ms. Carpenter released a statement in support of Mr. Fisher, in which she said Mr. Whedon harassed her while she was pregnant and fired her after she gave birth in 2003. . . .

Over the past week, many of the actors who starred on “Buffy,” including Sarah Michelle Gellar, who played Buffy Summers, have expressed solidarity with Ms. Carpenter and distanced themselves from Mr. Whedon.  The actress Michelle Trachtenberg, who played Buffy’s younger sister, Dawn, alleged on Instagram on Thursday that Mr. Whedon was not allowed to be alone with her.

“I would like to validate what the women of ‘Buffy’ are saying and support them in telling their story,” Marti Noxon, one of the show’s producers and longtime writers, said on Twitter.  Jose Molina, a writer who worked on Mr. Whedon’s show “Firefly,” called him “casually cruel.”

Maria Cramer, “For ‘Buffy’ Fans, Another Reckoning With the Show’s Creator,” New York Times, February 15, 2021

If the copious fan-issued blog posts and video essays on this damning series of insider testimonials are an accurate barometer, Millennials have been particularly crestfallen over Whedon’s fall from grace.  It’s only over the last few years, really, that I’ve come to truly appreciate just how proprietary they feel about Buffy the Vampire Slayer.  That surprises me still, because I tend to think of Buffy as a Gen X artifact; after all, the modestly successful if long-derided (by even screenwriter Whedon himself) feature film was released five years before its TV sequel.  (If you don’t remember—and I’ll bet you don’t—the movie’s shockingly impressive cast includes no less than pre-stardom Xers Hilary Swank and Ben Affleck.)  I recall seeing this one-sheet on a subway platform during the summer between sophomore and junior years of high school—

Fran Rubel Kuzui’s “Buffy the Vampire Slayer” (1992)

—and thinking somebody had finally made a spiritual sequel to my formative influence:  Joel Schumacher’s Gen X cult classic The Lost Boys.  (Turned out, however, I was gonna have to do that myself.)  I was sold!  I marvel still at how the advertisement’s economical imagery conveys the movie’s entire premise and tone.  So, yes—I was the one who went to see Buffy the Vampire Slayer in theaters.  Guilty as charged.

But it was the TV series, I’ll concede, that took Buffy from creative misfire to cultural phenomenon, so it stands to reason it made such an indelible impression on Millennials.  I submit that more than any content creator of his cohort—more so than even celebrated pop-referential screenwriters Kevin Smith or Quentin Tarantino or Kevin Williamson—Whedon is preeminently responsible for the mainstreaming of geek culture at the dawn of the Digital Age.

Buffy not only coincided with the coming out of geeks from the dusty recesses of specialty shops; it helped facilitate that very cultural shift:  As John Hughes had done for Gen X a decade earlier, Whedon spoke directly to the socially and emotionally precarious experience of adolescent misfits, and his comic-book-informed sensibilities (before such influences were cool) endowed the Buffy series with a rich, sprawling mythology—and star-crossed romance (beautiful though it is, Christophe Beck’s Buffy/Angel love theme, “Close Your Eyes,” could hardly be described as optimistic)—over which fans could scrupulously obsess.

What’s more, all three cult serials Whedon sired were alienated underdogs in their own right:  Buffy the Vampire Slayer, a reboot of a campy B-movie on a fledgling, tween-centric “netlet” that no one took seriously; Angel, a second-class spinoff that was perennially on the brink of cancelation (and ultimately ended on an unresolved cliffhanger); and Firefly, his ambitious Star Wars–esque space opera that lasted exactly three months—or less than the average lifespan of an actual firefly.  That these shows struggled for mainstream respect/popular acceptance only burnished Whedon’s credentials as the bard of geek-outsider angst…

Continue reading

Grounded and Elevated: Screenwriting Secrets for a Sure-Thing Hollywood Pitch

Despite everything, it seems I still have a few friends in Tinseltown.  A development exec I know, aware of my blog’s polemical crusade against late-twentieth-century nostalgia as well as creatively and morally bankrupt storytelling, recently forwarded several e-mails containing informal pitches (from agented writers) he’d solicited for “reboots” of three classic IPs straight from the Gen X archives.  They offer fascinating firsthand insight into the demoralizing vocation of Hollywood screenwriting.

It might surprise those outside the industry to learn only a small fraction of a given screenwriter’s time and effort is spent developing original stories, known as “spec scripts.”  Few of those screenplays ever sell (certainly nowadays), and fewer still are produced; mostly, such projects are mere “calling cards”—writing samples designed to establish a scribe’s commercial sensibilities and creative credentials so he or she might be given the opportunity to vie for “open assignments.”  In those instances, a prodco controls the film rights to an intellectual property (IP)—a novel, a comic book, an old TV series—and, accordingly, invites such candidates to come in and pitch a take on it.

For instance, my prison break–zombie outbreak mashup Escape from Rikers Island afforded me opportunities to pitch cinematic adaptations of the pseudo-documentary series Ancient Aliens and the Japanese manga MPD-Psycho, as well as a remake of the 1992 action thriller Trespass.  If you’re higher up on the food chain—in, say, J. J. Abrams territory—that’s when you might get a shot at a gold-plated franchise like Star Wars or Mission:  Impossible.

Nicolas Cage as an anxiety-riddled screenwriter struggling to adapt “The Orchid Thief” in “Adaptation”

Because for the most part, Hollywood isn’t looking for new ideas; they have enough branded IPs to keep them in business through infinity and beyond.  What they’re looking for are skilled stenographers—writers-for-hire who can take a preexisting property and, juggling input from a thousand different chefs in the kitchen, turn it into a viable script for which a movie studio will be persuaded to invest millions of dollars.  That’s the litmus test:  Can you take an established IP and from it write a script that will motivate the studio to write a check?

The following proposals provide an insider’s glimpse into that singular development process.  With my contact’s express permission, I have reproduced the relevant text from his e-mails verbatim, including all typographical errors and syntactical idiosyncrasies, but excluding the identities of the authors, their representation, the executive, and his production company.

Continue reading

The Lost Boys of the Bronx: A Tribute to Joel Schumacher

Batman Forever and The Lost Boys director Joel Schumacher died on Monday, June 22, at the age of eighty after a yearlong battle with cancer.  In an industry where branding is sacrosanct, his brand, as it were, was his steadfast refusal to be artistically pigeonholed:  Hit-and-miss though his track record may be, he was a rare breed of filmmaker who worked in virtually every genre, from comedy (D.C. Cab; Bad Company) to drama (Cousins; Dying Young) to sci-fi/horror (Flatliners; Blood Creek) to crime thriller (Falling Down; 8mm) to legal thriller (The Client; A Time to Kill) to musical (The Phantom of the Opera).  His filmography is as winding and unconventional as was his path to commercial success:

Schumacher was born in New York City in 1939 and studied design at Parsons and the Fashion Institute of Technology. . . .

When Schumacher eventually left fashion for Hollywood, he put his original trade to good use, designing costumes for various films throughout the Seventies. . . .  He also started writing screenplays during this time, including the hit 1976 comedy Car Wash and the 1978 adaptation of the musical The Wiz.

In 1981, Schumacher made his directorial debut with The Incredible Shrinking Woman, a sci-fi comedy twist on Richard Matheson’s 1959 novel, The Shrinking Man, starring Lily Tomlin.  Fitting the pattern that would define his career, the film was a financial success but a flop with critics. . . .

Schumacher’s true breakout came a few years later in 1985, when he wrote and directed St. Elmo’s Fire, the classic post-grad flick with the Brat Pack cast, including Rob Lowe, Demi Moore and Judd Nelson.  Two years later, he wrote and directed The Lost Boys, a film about a group of teen vampires that marked the first film to star both Corey Feldman and Corey Haim, effectively launching the heartthrob duo known as “the Coreys.”

Jon Blistein, “Joel Schumacher, Director of ‘Batman & Robin,’ ‘St. Elmo’s Fire,’ Dead at 80,” Rolling Stone, June 22, 2020

Though Schumacher did not write The Lost Boys (1987), as the Rolling Stone piece erroneously asserts (the screenplay is credited to Janice Fischer & James Jeremias and Jeffrey Boam), neither his creative imprint on the project nor the cultural impact of the movie itself can in any way be overstated.  Sure, teenage vampires may be a dime-a-dozen cottage industry now, from Buffy the Vampire Slayer to Twilight to The Vampire Diaries, but if you happened to grow up on any of those Millennial staples, it’s worth knowing that pubescent bloodsuckers had never really been done prior to The Lost Boys—no, that celebrated iteration of the vampire’s pop-cultural evolution is entirely owed to the pioneering vision of Joel Schumacher.

Late filmmaker Joel Schumacher; photo by Gabriella Meros/Shutterstock, 2003

When Richard Donner left the project to direct Lethal Weapon instead, the script Schumacher inherited was essentially “The Goonies… with vampires.”  By aging up the characters from preteens to hormonal adolescents, Schumacher saw a creative opportunity to do something scarier—and sexier.  A cult classic was thusly born, and though The Lost Boys itself never became a franchise (save a pair of direct-to-video sequels two decades later, and the less said about them, the better), its fingerprints are all over the subgenre it begat.  We owe Schumacher a cultural debt for that.

Kiefer Sutherland’s David (second from left) leads a gang of teenage vampires in “The Lost Boys”

And I owe him a personal debt.  Over any other formative influence, The Lost Boys is directly and demonstrably responsible for my decision to study filmmaking in college and then to pursue a screenwriting career in Hollywood.  More than simply my professional trajectory, in point of fact, my very creative sensibilities were indelibly forged by that film:  The untold scripts and novels I’ve written over the past quarter century have almost exclusively been tales of the supernatural with a strong sense of both humor and setting—the very qualities The Lost Boys embodies so masterfully and memorably.  All of that can be traced to the summer of 1994.

Continue reading

Misery Sans Company: On the Opportunities and Epiphanies of Self-Isolation

March?  Please!  I’ve been in self-isolation since January.

No, I was not clairvoyantly alerted to the impending coronavirus pandemic; only our dear leader can claim that pansophic distinction.  Rather, my wife started a new job at the beginning of the year, necessitating a commute, thereby leaving me carless.  (Voluntarily carless, I should stipulate:  I refuse to be a two-vehicle household; as it is, this congenital city kid, certified tree-hugger, and avowed minimalist owns one car under protest.)

My obstinacy, however, comes at a cost:  I don’t live within convenient walking distance of anything save a Chevron station (the irony of which is only so amusing), so while the missus is at work, I’m effectively immobilized.  I got nowhere to go… save the home office opposite my bedroom.  Thusly, I made a conscious decision at the start of the year to embrace my newfound confinement as a creative opportunity—to spend the entirety of winter devoted all but exclusively to breaking the back of my new novel.  I kept my socializing and climate activism to a minimum during this period, submitting to the kind of regimented hourly schedule I haven’t known since my college days.

Johnny Depp in creative self-isolation in “Secret Window” (2004), from Stephen King’s novella

Before long, my period of self-imposed artistic isolation was yielding measurable results, and I’d been looking forward to emerging from social exile.  The week I’d earmarked for my “coming-out party”?  You guessed it:  The Ides of March.

I instead spent St. Paddy’s week mostly reeling, knocked sideways—as I imagine many were—by the speed and scale at which this crisis ballooned.  But in the days that followed, I resolved to compartmentalize—to get back to work.  I still had my codified daily routine, after all, which required a few adjustments and allowances under the new circumstances, and I had a project completely outlined and ready to “go to pages.”  So, that’s what I turned to.

And in short order, I’d produced the first two chapters, which, for me, are always the hardest to write, because I have no narrative momentum to work with as I do in later scenes.  You open a blank Scrivener document, and—BOOM!—all your careful planning and plotting, your meticulously considered character arcs and cerebral theme work?  It ain’t worth shit at that ex nihilo instant.  You may’ve built the world, but how do you get into it?  Writing that first sentence, that first paragraph, that first scene, that first chapter is like feeling your way around in the dark.  (Fittingly, my first chapter is literally about three guys finding their way through a forest path in the pitch black of night.)

“Going to pages” turned out to be just the intellectual occupation I needed to quell my anxiety, to give me a reprieve from our present reality.  And now that I’ve got story momentum, slipping into the world of my fiction every morning is as easy as flicking on the television.  For the three or four hours a day I withdraw to my personal paracosm, I’m not thinking about anything other than those characters and their problems.  As such, I’ve thus far sat out this crisis in my study, trafficking in my daydreams to pass the time; I’m not treating patients, or bagging groceries, or delivering packages, or working the supply chain, or performing any of the vital services upholding our fragile social order.  Instead, I’m playing make-believe.

Self-isolation didn’t serve Stephen King’s Jack Torrance particularly well in “The Shining”

It wasn’t long ago—Christmas, in fact—I’d issued an earnest, hopeful plea that in the year to come we might all forsake our comforting fictions, our private parallel dimensions, in favor of consciously reconnecting with our shared nonfictional universe.  And now here many of us find ourselves, banished from the streets, from the company of others, confined by ex officio decree to our own hermetic bubbles—as of this writing, some 97% of Americans are under stay-at-home orders—with nowhere to retreat but our escapist fantasies.  I’ve been reliant upon them, too—even grateful for them.

And that got me thinking about Stephen King’s Misery.  As masterful as Rob Reiner’s movie adaptation (working from a screenplay by William Goldman) is, and as faithful in plotting to King’s book, the theme—the entire point of the narrative—gets completely lost in translation.  This is a story about addiction, as only King could tell it:  It’s about how drugs (in this case, prescription-grade painkillers) help us cope with misery, but it’s also about how art can be an addictive—and redemptive—coping mechanism; how it can turn misery into a kind of beauty, especially for the artist himself.

Continue reading

It’s Alive! Return of the Universal Classic Monsters

Ah, the “shared cinematic universe”—the favored narrative model–cum–marketing campaign of the new millennium!  Pioneered by Marvel, it wasn’t long before every studio in town wanted a “mega-franchise” of its own, feverishly ransacking its IP archives for reliable brands to exploit anew.  By resurrecting the Universal Classic Monsters, Universal Studios saw an opportunity to create its own interconnected multimedia initiative… and the so-called “Dark Universe” was born.

Well, not born, exactly—more like announced.  When the first offering, Dracula Untold, took a critical beating and underperformed domestically, Universal promptly issued a retraction:  “Just kidding!  That wasn’t really the first Dark Universe movie!”  An all-star cast was hastily assembled:  Russell Crowe as Jekyll and Hyde!  Javier Bardem as Frankenstein’s monster!  Johnny Depp as the Invisible Man!  Angelina Jolie as the Bride of Frankenstein!  And first up would be Tom Cruise in The Mummy

Um… isn’t this precisely the kind of arrogant presumption most of the Universal Classic Monsters came to regret?

Except—whoops!The Mummy bombed, too… at which point the sun rather quietly went down on the Dark Universe project altogether.  Seems launching a shared fictional universe is considerably harder than Marvel made it look.  Imagine that.

The thing is, we already had a revival—arguably a cinematic renaissance—of the Universal Classic Monsters in the 1990s.  Dracula, Frankenstein, the Mummy, the Invisible Man, the Wolf Man, and Dr. Jekyll and Mr. Hyde were given gloriously Gothic reprisals in an (unrelated) series of studio features that starred some of the biggest names in Hollywood.  None of those projects were cooked up in a corporate think tank, but were instead the idiosyncratic visions of a diverse group of directors—the artists behind no less than The Godfather, The Graduate, The Crying Game, Dangerous Liaisons, and Basic Instinct, to name a few—employing horror’s most recognizable freaks to (for the most part) explore the anxiety of confronting the end of not merely a century, but a millennium.

If the respective creative efforts of these filmmakers were uncoordinated, their common agenda was entirely logical.  Many of their fiendish subjects, after all, first arrived on the cultural scene at the end of the previous century:  Strange Case of Dr Jekyll and Mr Hyde was published in 1886; both Dracula and The Invisible Man in 1897.  Furthermore, their stories tended to speak to either the hazards of zealous scientific ambition (Frankenstein, The Invisible Man, Dr Jekyll and Mr Hyde), or, in the case of Dracula and The Mummy, the limitations of it—of humankind’s attempts to tame the natural world through technology:  “And yet, unless my senses deceive me, the old centuries had, and have, powers of their own which mere ‘modernity’ cannot kill” (from Jonathan Harker’s journal, dated 15 May).

Even the Wolf Man serves as a metaphor for the primal instincts we’ve suppressed under our civilized veneer; far from our having learned to let our two halves coexist in harmony, those halves are instead at war within the modern man and woman.  These are existential issues that seem to weigh more heavily on us on the eve of a new epoch, which is arguably why the monstrous creations we use to examine them flourished in the literature of the 1890s and then again, a century later, through the cinema of the 1990s.  It goes to illustrate that sometimes fictional characters simply speak to their times in a very profound way that can’t be engineered or anticipated.  It’s just alchemical, much as Hollywood would prefer it to be mathematical.

With that in mind, let’s have a look at the unofficial “Universal Classic Monsters reprisal” of the nineties (and I’ve included a few other likeminded films from the movement) to better appreciate what worked and what sometimes didn’t.

Continue reading

© 2024 Sean P Carlin
