Writer of things that go bump in the night


EXT. LOS ANGELES – ONE YEAR LATER

I thought I’d said everything I had to say about Los Angeles last winter.  Should’ve known Hollywood would demand a sequel.


Even at the height of its considerable cultural influence, I never much cared for Sex and the City—for a very simple reason:  I didn’t in any way recognize the New York it depicted.

To someone who’d grown up there, Sex seemed like a postfeminist fantasy of the city as a bastion of neoliberal materialism, conjured by someone who’d never actually set foot in New York, let alone learned the first thing about it.  It certainly didn’t reflect the experience of any working-class New Yorkers I knew.

(It would seem the more things change, the more they stay the same:  The recent SATC revival series, And Just Like That…, is reported to be full of unintentionally cringe-inducing scenes of the gals apparently interacting with Black women for the first time in their lives.  Sounds on-brand.)

But this isn’t a retroactive reappraisal of a 1990s pop-cultural pacesetter—those have been exhaustively conducted elsewhere of late—merely an acknowledgment that the series made an undeniable impression on the generation of (largely) female Millennials who adored it, legions of whom relocated to New York in early adulthood to have the full Sex and the City experience and who, in turn, in many ways remade the city in Carrie Bradshaw’s image, for better or worse.

I can’t say as I blame those folks, really.  That they were sold a load of shit isn’t their fault.  Here in New York, we were just as susceptible to Hollywood’s greener-grass illusions of elsewhere.  When I was a student in the 1990s, the Los Angeles of Beverly Hills, 90210 (1990–2000) and Baywatch (1989–2001), of Buffy the Vampire Slayer (1992) and Clueless (1995), seemed like a fun-in-the-sun teenage paradise in stark contrast with the socially restrictive experience of my all-boys high school in the Bronx, where the only things that ever passed for excitement were spontaneous gang beatings at the bus stop on Fordham Road.

The high-school experience depicted on “Beverly Hills, 90210” is one I think we can all relate to

The sunny schoolyards and neon-lit nighttime streets of L.A. carried the promise of good times, the kind that seemed altogether out of reach for me and my friends.  The appeal of what California had to offer was so intoxicating, in fact, that my two best pals and I spent an entire summer in the mid-’90s trying to make the streets of the Bronx look like Santa Cruz—a place none of us had ever been—for an amateur sequel to The Lost Boys, the ’80s cult classic about a coven of adolescent vampires who’ve (wisely) opted to spend eternity on the boardwalk.  That fantasy unquestionably took hold of my impressionable imagination—it made me want to be a part of that culture, and to tell those kinds of stories.

Accordingly, it’s fair to say it wasn’t merely the movie business that brought me to Los Angeles in my early twenties as an aspiring screenwriter, but arguably the romantic impressions of California itself imprinted upon my psyche by all those movies and TV series on which I came of age.  Yet for the two decades I lived there, the city I’d always imagined L.A. to be—a place full of golden possibilities, as low-key as New York was high-strung—wasn’t the one I experienced.  Not really.  Not until last month, anyway.


Book Review:  “Blood, Sweat & Chrome” by Kyle Buchanan

Kyle Buchanan’s Blood, Sweat & Chrome, published by William Morrow in February, chronicles the not-to-be-believed making of George Miller’s Mad Max:  Fury Road (2015) from conception to release through interviews with its cast and crew, and celebrates the inspiring creative imagination of the filmmakers, who defied the odds to create a contemporary classic—a movie as singularly visceral as it is stunningly visual.

But much like the nonstop action in the movie itself, the adulation expressed in the book never pauses to interrogate Miller and company’s moral imagination.  Let’s fix that, shall we?


I abhor nostalgia, particularly for the 1980s and ’90s, but I’ve recently found myself revisiting many of the films and television shows of the latter decade, the period during which I first knew I wanted to be a cinematic storyteller, when earnest star-driven Oscar dramas like Forrest Gump (1994) coexisted with, and even prospered alongside, paradigm-shifting indies à la Pulp Fiction (also ’94).  Those days are gone and never coming back—the institution formerly known as Hollywood is now the superhero–industrial complex—but I’ve wondered whether some of those works, so immensely popular and influential then, have stood the test of time.

Yet my informal experiment has been about much more than seeing if some old favorites still hold up (and, by and large, they do); it’s about understanding why they worked in the first place—and what storytelling lessons might be learned from an era in which movies existed for their own sake, as complete narratives unto themselves rather than ephemeral extensions of some billion-dollar, corporately superintended brand.

In an entertainment landscape across which there is so much content, most of it deceptively devoid of coherence or meaning—a transmedia morass I’ve come to call the Multiverse of Madness—the secret to studying narrativity isn’t to watch more but rather less.  To consume fewer movies and TV shows, but to watch them more selectively and mindfully.  Pick a few classics and scrutinize them until you know them backwards and forwards.

In college, I spent an entire semester analyzing Citizen Kane (1941), from reading multiple drafts of its screenplay to watching it all the way through with the volume turned down just to appreciate its unconventional cinematography.  That’s how you learn how stories work:  Study one or two movies/novels per year… but study the shit out of them.  Watch less, but do it far more attentively.

Tom Hardy as Max Rockatansky in “Mad Max: Fury Road,” the subject of “Blood, Sweat & Chrome”

That is, admittedly, a counterintuitive mindset in our Digital Age of automatic and accelerating behaviors, in which post-credit scenes preemptively gin up anticipation for the next movie (often through homework assignments) before we’ve had a chance to digest the current one, and the autoplay feature of most streaming services encourages and enables mindless TV binge-watching.

But the quarantine, unwelcome though it may have been, did offer a pause button of sorts, and we are only now beginning to see some of the ways in which folks exploited the rare opportunity to slow down, to go deep, that it offered.  One such project to emerge from that period of thoughtful reflection is entertainment journalist Kyle Buchanan’s recently published nonfiction book Blood, Sweat & Chrome:  The Wild and True Story of “Mad Max:  Fury Road”:

In April 2020, as the pandemic swept the planet and the movie-release calendar fell apart, I began writing an oral history of Mad Max:  Fury Road for the New York Times.  Without any new titles to cover, why not dive deeply into a modern classic on the verge of its fifth anniversary?

Every rewatch over those five years had confirmed to me that Fury Road is one of the all-time cinematic greats, an action movie with so much going on thematically that there’d be no shortage of things to talk about.  I had also heard incredible rumors about the film’s wild making, the sort of stories that you can only tell on the record once the dust has long settled.

Kyle Buchanan, Blood, Sweat & Chrome:  The Wild and True Story of “Mad Max:  Fury Road” (New York:  William Morrow, 2022), 337

A movie two decades in the making, Fury Road is the belated follow-up to writer/director George Miller’s dystopian action-film trilogy Mad Max (1979, 1981, 1985), which starred a then-unknown Mel Gibson as a wanderer in the wasteland—the Road Warrior.  It began its long journey to the screen as a proposed television series in 1995, when Miller won back the rights to the franchise from Warner Bros. as part of a settlement of a breach-of-contract suit he’d filed over having been fired from Contact (1997).

Miller was eventually inspired to do another feature instead—“What if there was a Mad Max movie that was one long chase,” he pondered, “and the MacGuffin was human?” (ibid., 31)—but the ensuing production was plagued with one near-terminal roadblock after another.  The behind-the-scenes story told in Blood, Sweat & Chrome is as thrilling, in its own way, as that of Mad Max:  Fury Road itself.


“Young Indiana Jones” Turns 30:  Storytelling Lessons from George Lucas’ Other Prequel Series

A television series based on an immensely popular action-movie franchise shouldn’t have been a creative or commercial risk—quite the opposite.  But with The Young Indiana Jones Chronicles, which premiered on March 4, 1992, filmmaker George Lucas had no intention of producing a small-screen version of his big-screen blockbusters.  Here’s how Lucas provided a richly imaginative model for what a prequel can and should be—and why it would never be done that way again.


Though he more or less invented the contemporary blockbuster, George Lucas had intended—even yearned—to be an avant-garde filmmaker:

Lucas and his contemporaries came of age in the 1960s vowing to explode the complacency of the old Hollywood by abandoning traditional formulas for a new kind of filmmaking based on handheld cinematography and radically expressive use of graphics, animation, and sound.  But Lucas veered into commercial moviemaking, turning himself into the most financially successful director in history by marketing the ultimate popcorn fodder.

Steve Silberman, “Life After Darth,” Wired, May 1, 2005

After dropping the curtain on his two career- and era-defining action trilogies (Star Wars concluded in 1983, then Indiana Jones in ’89), then failing to launch a new franchise with Willow (his 1988 sword-and-sorcery fantasy fizzled at the box office, though even that would-be IP is getting a “legacy” successor later this year courtesy of the nostalgia–industrial complex), Lucas did in fact indulge his more experimental creative proclivities—through the unlikeliest of projects:  a pair of prequels to both Indiana Jones and Star Wars.  And while both arguably got made on the strength of the brands alone, the prequels themselves would, for better and worse, defy the sacrosanct conventions of blockbuster cinema—as well as the codified narrative patterns of Joseph Campbell’s “heroic journey”—that audiences had come to expect from Lucas.

A perfunctory scene in Return of the Jedi, in which Obi-Wan finally explains Darth Vader’s mysterious backstory to Luke (a piece of business that could’ve been easily handled in the first film, thereby sparing the hero considerable needless risk and disillusionment in The Empire Strikes Back, but whatever), served as the narrative foundation for Lucas’ Star Wars prequel trilogy (1999–2005), in which a precocious tyke (The Phantom Menace) matures into a sullen teenager (Attack of the Clones) before warping into a murderous tyrant (Revenge of the Sith).  Underpinning Anakin’s emo-fueled turn to the dark side is a byzantine plotline about Palpatine’s Machiavellian takeover of the Republic.  Meanwhile, references to the original trilogy, from crucial plot points to fleeting sight gags, abound.

You’ve all seen the movies, so I’ll say no more other than to suggest that the story arc—which is exactly what Obi-Wan summarized in Return of the Jedi, only (much) longer, appreciably harder to follow, and a tonally incongruous mix of gee-whiz dorkiness and somber political intrigue—is precisely the kind of creative approach to franchise filmmaking that would’ve been summarily nixed in any Hollywood pitch meeting, had Lucas been beholden to the corporate precepts of the studio system from which the colossal success of the original Star Wars afforded him his independence.

George Lucas on the set of the “Star Wars” prequels

Which is not to say Lucas’ artistic instincts were infallible.  Financially successful though the prequels were, audiences never really embraced his vision of an even longer time ago in a galaxy far, far away:  Gungans and midi-chlorians and trade disputes didn’t exactly inspire the wide-eyed amazement that Wookiees and lightsabers and the Death Star had.

Maybe by that point Star Wars was the wrong franchise with which to experiment creatively?  Perhaps it had become too culturally important, and audience expectations for new entries in the long-dormant saga were just too high?  In the intervening years, Star Wars had ceased to be the proprietary daydreams of its idiosyncratic creator; culturally if not legally, Star Wars kinda belonged to all of us on some level.  By explicitly starting the saga with Episode IV in 1977, Lucas had invited each of us to fill in the blanks; the backstory was arguably better off imagined than reified.

As an IP, however, Indiana Jones, popular as it was, carried far less expectation, as did the second-class medium of network television, which made Lucas’ intended brand extension more of an ancillary product in the franchise than a must-see cinematic event—more supplemental than compulsory, like a tie-in novel or the Ewok telefilms of the mid-eighties.  The stakes of the project he envisioned were simply much lower, the spotlight on it comfortably dimmer.  In the event of its creative and/or commercial failure, Young Indiana Jones would be a franchise footnote in the inconsequential vein of the Star Wars Holiday Special, not an ill-conceived vanity project responsible for retroactively ruining the childhoods of millions of developmentally arrested Gen Xers.  Here Lucas expounds on the genesis of the series:


A Hollywood Ending: Hopeful Reflections on a Failed Screenwriting Career

I’ve alluded to the irretrievable implosion of my screenwriting career in many a previous blog post.  I never felt ready to write about it at length before now.  So, since we were just recently discussing the artful revelation of backstory, here’s mine.


Given the long odds of a career in Hollywood, even under the most favorable of circumstances, the unexpressed question that looms ominously over every aspirant is:  How do I know when it’s time to call this quits?

My wife and I were having drinks at the S&P Oyster Co. in Mystic, Connecticut, when I knew I was done with Hollywood forever—that my ship wasn’t coming in.  That was September 24, 2014, during a visit to the East Coast for her aunt and uncle’s golden-anniversary party, exactly thirteen years to the day after we’d relocated from our hometown of New York City to L.A.

Right out of college, I’d landed representation as a screenwriter—though that management company folded a few months prior to my move, catalyzing, at least in part, my decision to try my luck in Tinseltown—and I had a reel full of TV spots and short films I’d cut while working as an audiovisual editor in SoHo, so I felt certain I’d land on my feet in Hollywood, this despite having no contacts there.

So, in the predawn hours of Tuesday, September 11, 2001, I left the Bronx, the only home I’d ever known, and met my wife, though we weren’t married at the time, at JFK Airport to embark on our new adventure together.  Perhaps the cosmic timing of our departure (which was delayed by two weeks) should’ve been taken as a sign that the road ahead would be bumpier than I’d naïvely anticipated?

It took a full year in L.A. before I could even get a call returned, but finally I got some opportunities to edit a few independent shorts and features, and began networking my way into the industry.  But it would be another seven years yet before I procured representation as a screenwriter again, during which time I can’t tell you how many contemporaries I watched pack up their shit and abandon their dreams to move back home.  They’d decided it wasn’t worth it, that life was too short.  I’m certain I’d have been one of them were it not for my wife, who remained steadfastly supportive, and for a few friends—notably my buddy Mike—who were also Hollywood hopefuls determined to keep at it, too, through bad times and, well, less bad.  We were going to be the ones who hung in there and made it.

By 2009, things were looking up—considerably.  At long last I’d found representation once again with a management company, this time off a spec I’d written called Leapman, and all manner of opportunities soon followed:  to turn Leapman into a comic-book series; to sign with a big-letter talent agency; to vie for open screenwriting assignments; to develop an undersea sci-fi thriller (in the vein of The Abyss and Sphere) with a red-hot producer.

From “The Abyss” (1989), a movie about deep-sea extraterrestrials akin to the one I was developing

Around this same time, I got friendly with another up-and-coming screenwriter—we were repped by the same management—and he and I formed a critique group, enthusiastically enlisting half a dozen fledgling screenwriters we barely knew.  In short order, we all became close friends, meeting every other Tuesday night at one watering hole or another around Hollywood to trade script notes and war stories.  All unknowns at the time, some of those scribes have since gone on to write for shows including The Handmaid’s Tale and Women of the Movement, as well as WandaVision and Ted Lasso.

I was also, during this period, developing a short film with Mike.  He and I had met in 2003 on the postproduction crew of an indie film; we were on location in the redwoods of Marin County, right down the road from Skywalker Ranch, cutting dailies in a ramshackle cabin that looked for all the world like Ewok Village Hall.  Under those circumstances, it didn’t take long to become fast friends:  We were the same age, came up on the same cinematic influences, and—most notably—shared the same irreverent sense of humor, turning our verbal knives on all of Hollywood’s sacred cows, delighting in making one another howl with one progressively outrageous remark after the next.

Also like me, Mike was married to his teenage sweetheart, sans children, so we were both in the same place:  free to pursue our Hollywood dreams with the support of the women we loved.  It was, and remains, the closest male friendship of my adult life.  As Mike continued to come into ever-more-promising editorial opportunities on studio features, my screenwriting career was kicking into high gear.  With aspirations to direct, he asked me if I wouldn’t mind taking one of my concepts—a horror/comedy I’d pitched him that reflected our mutual sensibilities—and scripting a short film for him to shoot.  So, there I was, developing a big-budget monster movie for a legit prodco by day, and a no-budget monster movie with my best friend by night.  After over a decade in Hollywood, everything had clicked into place.

And then came 2014.  Frustrated with the inexcusable lack of progress on the short—I’d written a script all of us were expressly happy with, and yet years had gone by and we were no closer to rolling camera—I put pressure on the project’s producer, Mike’s spouse, to do her part.  Consequently, for the first time in our decade-long association, our friendship grew strained, and once we both crossed the line and turned our caustic criticisms, the source of so many years of bonding and hilarity, on each other, our relationship eventually became irreversibly poisoned.  I’d lost my closest friend and ally in Hollywood, and that was only the beginning of my troubles.


“Scream” at 25: Storytelling Lessons from Wes Craven’s Slasher Classic

In honor of the twenty-fifth anniversary of Wes Craven’s Scream, released on this date in 1996, here’s how the movie revived a genre, previewed a defining characteristic of Generation X, dramatized the psychological toll of trauma with uncommon emotional honesty—and how it even offers a roadmap out of the prevailing narrative of our time:  extractive capitalism.


For all the decades we’ve been together, my wife and I have observed a particular protocol, probably owed to how many movies we used to see at the two-dollar cinema in Hell’s Kitchen when we were dirt-poor college students:  Upon exiting the theater, neither issues a comment on or reaction to the film we just saw.  Instead, we save the discussion for when we’re seated at a nearby restaurant, at which point one or the other invariably asks, “Do you want to go first?”  As far as I can recall, we’ve broken with that tradition but once.

“We just saw a classic,” she blurted as we staggered our way through the lobby moments after seeing Scream.  “They’ll still be talking about that in twenty years.”  (Such an estimate, in fairness, seemed like a glacially long time when you’re only as many years old.)

In fact, a full quarter century has now passed since the release of the late Wes Craven’s postmodern slasher masterpiece, and the movie has very much earned a fixed place in the cultural consciousness.  That opening sequence alone, so shocking at the time, hasn’t lost any of its power to frighten and disturb; an entire semester could be spent studying it, from the exquisite camerawork to the dramatic pacing to Drew Barrymore’s heart-wrenchingly credible performance as a young woman scared shitless—and this despite having no one in the scene to act against save a voice on a phone.  Ten minutes into the movie, its marquee star is savagely disemboweled… and now you don’t know what the hell to expect next!

Drew Barrymore as Casey Becker in “Scream”

I really can’t say I’ve seen a horror film since that was at once so scary, clever, entertaining, influential, and of its moment the way Scream was.  With eerie prescience, Craven and screenwriter Kevin Williamson (born 1965) seemed to put their finger on an idiopathic attribute of Generation X that would, as Xers settled into adulthood and eventually middle age, come to define the entirety of the pop-cultural landscape over which we currently preside:  that rather than using fiction to reflect and better understand reality—viewing narrativity as “a coherent design that asks questions and provides opinions about how life should be lived,” per Christopher Vogler—we more or less gave up on understanding reality in favor of mastering the expansive, intricate storyworlds of Star Wars and Star Trek, DC and Marvel, Westworld and Game of Thrones.  And such figure-ground reversal started long before the Marvel–industrial complex capitalized on it.

In the early ’90s, as the first members of Gen X were becoming filmmakers, avant-garde auteurs like Quentin Tarantino (born 1963) and Kevin Smith (1970) not only devoted pages upon pages in their screenplays to amusingly philosophical conversations about contemporary pop culture, but also set the characters of their various movies in shared universes of their own, referencing other characters and events from prior and sometimes even yet-to-be-produced films.  That kind of immersive cinematic cross-pollination, inspired by the comic books Tarantino and Smith had read as kids, rewarded fans for following the directors’ entire oeuvres and mindfully noting all the trivial details—what later came to be known as “Easter eggs.”

What’s more, the trove of pop-cultural references embedded in their movies paid off years of devoted membership at Blockbuster Video.  Whereas previously, fictional characters seemed to exist in a reality devoid of any pop entertainment of their own—hence the reason, for instance, characters in zombie movies were always on such a steep learning curve—now they openly debated the politics of Star Wars (Clerks); they analyzed the subtext of Madonna lyrics (Reservoir Dogs); they waxed existential about Superman’s choice of alter ego (Kill Bill:  Volume 2); they even, when all was lost, sought the sagacious counsel of that wisest of twentieth-century gurus:  Marvel Comics’ Stan Lee (Mallrats).

For Gen X, our movies and TV shows and comics and videogames are more than merely common formative touchstones, the way, say, the Westerns of film (Rio Bravo, The Magnificent Seven) and television (Bonanza, Gunsmoke) had been for the boomers.  No, our pop culture became a language unto itself:  “May the Force be with you.”  “Money never sleeps.”  “Wax on, wax off.”  “Wolfman’s got nards!”  “I’m your density.”  “Be excellent to each other.”  “Do you still want his daytime number?”  “Just when you thought it was safe to go back in the water…”

Those are more than quotable slogans; they’re cultural shorthands.  They express a worldview that can only be known and appreciated by those of us encyclopedically literate in Reagan-era ephemera, like the stunted-adolescence slackers from Clerks and nostalgic gamer-geeks of Ready Player One and, of course, the last-wave Xers in Scream:

Kevin Williamson, “Scream” (undated screenplay draft), 89

The characters from Scream had grown up watching—arguably even studying—Halloween and Friday the 13th and A Nightmare on Elm Street on home video and cable TV, so they had an advantage the teenage cannon fodder from their favorite horror movies did not:  They were savvy to the rules of the genre.  Don’t have sex.  Don’t drink or do drugs.  Never say “I’ll be right back.”

There was a demonstrably prescriptive formula for surviving a slasher movie—all you had to do was codify and observe it.  That single narrative innovation, the conceptual backbone of Scream, was revelatory:  Suddenly everything old was new again!  A creatively exhausted subgenre, long since moldered by its sequel-driven descent into high camp, could once again be truly terrifying.


There He Was… and in He Walked: Lessons on Mythic Storytelling from the Mariachi Trilogy

In belated observation of Día de los Muertos, here’s an appreciation for the idiosyncratic storytelling of Robert Rodriguez’s Mariachi trilogy, a neo-Western action series that emerged from the indie-cinema scene of the 1990s and can only be deemed, by current Hollywood standards, an anti-franchise.  The movies and the manner in which they were made have a lot to teach us about what it means to be creative—and how to best practice creativity.


Before the shared cinematic universe became the holy grail of Hollywood, the coup d’éclat for any aspiring franchise—and we can probably credit Star Wars for this—was the trilogy.

In contrast with serialized IPs (James Bond and Jason Voorhees, for instance), the trilogy came to be viewed, rightly or wrongly, as something “complete”—a story arc with a tidy three-act design—and, accordingly, many filmmakers have leaned into this assumption, exaggerating a given series’ creative development post factum with their All part of the grand plan! assurances.

This peculiar compulsion we’ve cultivated in recent decades—storytellers and audiences alike—to reverse-engineer a “unified whole” from a series of related narratives, each of which developed independently and organically, is antithetical to how creativity works, and even to what storytelling is about.

Nowhere is the fluidity of the creative process on greater, more glorious display than in the experimental trilogy—that is, when a low-budget indie attains such commercial success that it begets a studio-financed remake that simultaneously functions as a de facto sequel, only to then be followed by a creatively emboldened third film that completely breaks from the established formula in favor of presenting an ambitiously gonzo epic.  Trilogies in this mode—and, alas, it’s a pretty exclusive club—include Sam Raimi’s Evil Dead, George Miller’s Mad Max, and Robert Rodriguez’s El Mariachi.

Robert Rodriguez at the world premiere of “Alita: Battle Angel” on January 31, 2019 in London (Eamonn M. McCormack/Getty)

A film student at the University of Texas at Austin in the early nineties, Rodriguez self-financed El Mariachi with a few thousand dollars he’d earned as a medical lab rat; the project wasn’t meant to be much more than a modest trial run at directing a feature film that he’d hoped to perhaps sell to the then-burgeoning Spanish-language home-video market.  He reasoned that practical experience would be the best teacher, and if he could sell El Mariachi, it would give him the confidence and funds to produce yet more projects—increasingly ambitious and polished efforts—that would allow him to make a living doing what he loved.  He had no aspirations of power lunches at The Ivy or red-carpet premieres at Mann’s Chinese Theatre; he wanted only to pursue the art of cinematic storytelling—not necessarily Hollywood filmmaking, a different beast—to the fullest extent possible.

If you want to be a filmmaker and you can’t afford film school, know that you don’t really learn anything in film school anyway.  They can never teach you how to tell a story.  You don’t want to learn that from them anyway, or all you’ll do is tell stories like everyone else.  You learn to tell stories by telling stories.  And you want to discover your own way of doing things.

In school they also don’t teach you how to make a movie when you have no money and no crew.  They teach you how to make a big movie with a big crew so that when you graduate you can go to Hollywood and get a job pulling cables on someone else’s movie.

Robert Rodriguez, Rebel without a Crew, or, How a 23-Year-Old Filmmaker with $7,000 Became a Hollywood Player (New York:  Plume, 1996), xiii–xiv

They don’t teach a lot of things about Hollywood in film school, like how so many of the industry’s power brokers—from producers and studio execs to agents and managers—are altogether unqualified for their jobs.  These folks think they understand cinematic storytelling because they’ve watched movies their entire lives, but they’ve never seriously tried their hand at screenwriting or filmmaking.  Accordingly, the town’s power structure is designed to keep its screenwriters and filmmakers subordinate, to make sure the storytellers understand they take their creative marching orders from people who are themselves utterly mystified by the craft (not that they’d ever admit to that).

It’s the only field I know of in which the qualified authorities are entirely subservient to desk-jockey dilettanti, but I suppose that’s what happens when a subjective art form underpins a multibillion-dollar industry.  Regardless, that upside-down hierarchy comes from a place of deep insecurity on both ends of the totem pole, and is in no way conducive to creativity, hence the premium on tried-and-true brands over original stories, on blockbusters over groundbreakers.  As I discovered the hard way—more on that in a minute—Hollywood is arguably the last place any ambitiously imaginative storyteller ought to aspire to be.  Rodriguez seemed to understand that long before he ever set foot in L.A.:


Patriarchal Propaganda: How Hollywood Stories Give Men Delusions of Heroism

Movies and TV shows—and this includes both your favorites and mine—mostly exist to remind us that ours is a man’s world.  Popular entertainment in general, regardless of medium or genre or even the noble intentions of the storytellers, is almost invariably patriarchal propaganda.  But it doesn’t have to be that way.


Since at least as far back as the adventures of Odysseus, men have used fantasy narratives to contextualize ourselves as the solitary heroic protagonist of the world around us—a world that would be appreciably better off if only our judgment weren’t questioned or our actions thwarted by those of inferior hearts and minds.  In the Book of Genesis, God creates man from the dust, gives him dominion over the Earth, then provides him with a “helper”—Eve—who proves considerably less than helpful when she defiantly feeds from the tree of the knowledge of good and evil and spoils Paradise for everyone.

Such are the stories we’ve been hearing for literally thousands of years, and the reality we live in today is very much shaped by the presumption of patriarchy they propagandize.  In 1949, this way of telling stories—the Hero’s Journey—was codified by comparative mythologist Joseph Campbell in The Hero with a Thousand Faces, and adopted by Hollywood as the blueprint for the blockbuster.  From our Westerns (Dances with Wolves) to our policiers (Dirty Harry) to our space operas (Star Wars) to our spy thrillers (James Bond) to our teen comedies (Ferris Bueller’s Day Off) to our superhero universes (Iron Man) to our mob dramas (The Sopranos) to our sitcoms (Two and a Half Men), it’s a man’s world, baby—with the rest of you there to either support us or (foolishly) stand in our way.

It’s not that there’s anything inherently wrong with escapist entertainment.  It isn’t fantasy the genre that’s the problem, or even the Hero’s Journey story model, but rather the near-universal underlying patterns and motifs in our popular fictions that have unconsciously supported—that have surreptitiously sold us—the fantasy of patriarchal hegemony.  As such, white men in particular have been conditioned by these cultural narratives to see ourselves as the central heroic figure in the Epic of Me—even our storytellers themselves:

While accepting the award for Outstanding Directing for a Limited or Anthology Series or Movie for his work on The Queen’s Gambit, Scott Frank brushed off the “get off the stage” music not once but three times, reading a prepared speech from two full-length pages he’d shoved into his pocket and blowing past his allotted 45 seconds to speak for three minutes and 28 seconds—more than four and a half times as long as he was supposed to.

Viewers couldn’t have asked for a more perfect embodiment of white male privilege and entitlement as a visibly annoyed Frank reacted to the orchestra’s attempts to play him off by saying, “Really?  No,” and making a dismissive hand gesture.  The second time they started playing, he said, “Seriously, stop the music,” again waving his hand as if he were shooing away a fly and pressing on.  The third time, he insisted, “I’m almost done.”  Each time, when he commanded them to stop playing the music, they actually stopped the music.  Who knew it was that easy?

Bonnie Stiernberg, “Those ‘Queen’s Gambit’ Emmy Speeches Epitomized Exactly What’s Wrong With Hollywood,” InsideHook, September 20, 2021

Whether we’re aware of them or not, men have certain expectations about how the world should work:  that it should work for us.  After all, God gave us, not you, dominion over all living things and natural resources on Earth.  But when reality conflicts with those birthrights, we grow frustrated, and rather than questioning the stories we’ve been told about our place in the world, we tell more of the same self-mythologizing horseshit—to assure ourselves, and others, of our God-given entitlements, of our singular heroism.  Consider, for example, the overwhelming popularity—ten entries and counting—of the testosterone-charged Fast & Furious franchise:

These films use the concrete landscape to assert individuality and a refusal to knuckle under to authority.  With the exception of Brian and perhaps Roman, these inner-city car racers don’t want to be reintegrated into society.  They race cars to gain status and money, to impress sexy women, and to defy the police—just like [celebrated American NASCAR driver] Junior Johnson and the Dukes of Hazzard.  But, like the conformists and suburbanites they reject, they act like everything in nature exists to be consumed and exploited.

robin, “The Fast and Furious Films and Mad Max Fury Road,” Ecocinema, Media, and the Environment [blog], September 20, 2019

Dominic Toretto (Vin Diesel) famously makes his gang say grace before they eat, an utterly meaningless gesture since, unlike obeying the law, it costs him nothing to do so, yet it nonetheless speaks volumes about his patriarchal values.  He and his crew aren’t enlightened antiheroes as they believe, merely entitled gearheads who proudly and explicitly live their lives “a quarter mile at a time,” because to think beyond that would require a sense of empathy for those outside their range of awareness, as well as compel a sober consideration for the long-term consequences of their, to put it generously, socially irresponsible behaviors.

Dom appropriates Christian iconography to assert his moral authority—pure patriarchal propaganda

(And if you’re inclined to dismiss Dom’s worldview as the patently absurd pseudophilosophy of a one-dimensional action-movie street racer—nothing worth taking seriously—it’s worth remembering that Facebook co-founder and CEO Mark Zuckerberg’s operating motto is “Move fast and break things,” which sounds like exactly the sort of iconoclastic rallying cry you’d expect to hear Dom growl… until you realize the thing Zuckerberg might’ve broken while becoming the fifth wealthiest person in the world is democracy itself.)


Entre Nous

An old friend called recently for a commensurately old-fashioned reason:  just to say hi.  Turns out, Xers still do that.  Incorrigible habit we picked up in the analog age, I’m afraid.

We’d grown up together in the Bronx, though she’s lived in New England nearly as long as I’ve been in L.A., and we’ve seldom had occasion to cross paths in the old hometown over the past two decades.  Still, we’ve remained close; I regard her in every way as an older sister, indistinguishable from my actual older sisters.  She wanted to know how my wife and I were settling into our new home (more on that matter in a forthcoming post), and asked how my various writing projects were going, citing each by title.  Few of my friends ever inquire as to my writing (they’ve probably long since reasoned I’d be only too delighted to tell them, in exhaustive detail), and I’d buy the lot a round, with chasers, if even one could reference a single project by name.

This particular friend is a registered nurse who took a professional leave of absence to care for her terminally ailing mother after a prognosis gave the woman perhaps a few months to live.  That was well over two years ago.  My friend’s life and career, accordingly, remain on indefinite hold.  So, when I asked how she was doing, she sighed and blurted, “Not great.”  To be clear:  She wasn’t looking to complain, only to confide.  I think it helped her, however fleetingly, to have the ear of someone who knows and loves her family as if it were his own, but isn’t directly involved with or affected by its short- and long-term dramas.

In May of 2016, we had a rare chance to hang together at the Casino Ballroom in Hampton Beach, New Hampshire, to see old favorite Extreme perform (pictured: Gary Cherone and Nuno Bettencourt)

The entire conversation stood in stark contrast with an experience I’d had only a week earlier.  I was at a backyard barbecue in Jersey—there have been quite a number of those this past August, as it happens—with friends and relatives I hadn’t seen since well before the shutdown, folks I’ve known for at least a quarter century if not the entirety of my life.  We’d all just endured the collective trauma of pandemia, and I guess I had a notion in my head that being in each other’s company once again would provide a tangible sensation of catharsis—a renewed appreciation for our shared history; a deeper sense of trust in one another; a tighter grip on the ties that bind; a desire, for lack of a more erudite phrase, to be real.  To confide.

Heh.  My wife warned me years ago I’m a hopeless Romantic.  Well, she was right yet again, because while it was certainly nice to see them, we mostly just talked about the same old shit:  the Yankees’ midseason slump; the enduring mystery of why Millennials venerate The Office as the Greatest Sitcom Ever; etcetera, etcetera.  I wasn’t asked about my work—I can write about all this publicly with full confidence none of them will ever read it—and I’ve learned to stop asking about theirs; I never get an answer, anyway.  And Christ knows no one expressed a candid or unflattering word about how they were feeling.  No, everyone just put on a happy face—though many of them didn’t seem particularly happy to me—and a lot of perfectly polite if entirely superficial discourse ensued… just like the good old days.

The difference this time, I suppose, was how attuned I was to the skillful manner in which some of those folks—not all of them, to be perfectly fair—fluidly change the subject the instant a question trips the “too personal” wire.  Suddenly, I found myself flashing back on a zillion cocktail conversations over multiple decades and wondering whether a single piece of information had ever been exchanged that offered even so much as a cursory glimpse into their secret hearts.  I don’t think one has, and not for lack of trying on my part.  I make it easy for people to open up, if they choose.


The Ted Lasso Way: An Appreciation

The Emmy-nominated comedy series Ted Lasso doesn’t merely repudiate the knee-jerk cynicism of our culture—it’s the vaccine for the self-reinforcing cynicism of our pop culture.  In a feat of inspiring commercial and moral imagination, Jason Sudeikis has given us a new kind of hero—in an old type of story.


As a boy coming of age in the eighties and early nineties, I had no shortage of Hollywood role models.  The movies offered smartass supercops John McClane and Martin Riggs, vengeful super-soldiers John Matrix and John Rambo, and scorched-earth survivalists Snake Plissken and Mad Max, to cite a select sampling.  Sure, each action-hero archetype differed somewhat in temperament—supercops liked to crack wise as they cracked skulls, whereas the soldiers and survivalists tended to be men of few words and infinite munitions—but they were, one and all, violent badasses of the first order:  gun-totin’, go-it-alone individualists who refused to play by society’s restrictive, namby-pamby rules.

Yippee ki-yay.

The small screen supplied no shortage of hero detectives in this mode, either—Sonny Crockett, Thomas Magnum, Rick Hunter, Dennis Booker—but owed to the content restrictions of broadcast television, they mostly just palm-slammed a magazine into the butt of a chrome Beretta and flashed a charismatic GQ grin in lieu of the clever-kill-and-quick-one-liner m.o. of their cinematic counterparts.  (The A-Team sure as hell expended a lot of ammo, but their aim was so good, or possibly so terrible, the copious machine-gun fire never actually made contact with human flesh.)  The opening-credits sequences—MTV-style neon-noir music videos set to power-chord-driven instrumentals—made each show’s gleaming cityscape look like a rebel gumshoe’s paradise of gunfights, hot babes, fast cars, and big explosions.

It might even be argued our TV heroes exerted appreciably greater influence on us than the movie-franchise sleuths that would often go years between sequels, because we invited the former into our home week after week, even day after day (in syndication).  And to be sure:  We looked to those guys as exemplars of how to carry ourselves.  How to dress.  How to be cool.  How to talk to the opposite sex.  How to casually disregard any and all institutional regulations that stood in the way of a given momentary impulse.  How to see ourselves as the solitary hero of a cultural narrative in which authority was inherently suspect and therefore should be proudly, garishly, and reflexively challenged at every opportunity.  The world was our playground, after all—with everyone else merely a supporting actor in the “great-man” epic of our own personal hero’s journey.

Oh, how I wish, in retrospect, we’d had a heroic role model like Jason Sudeikis’ Ted Lasso instead.

THE LAST BOY SCOUT

The premise of Ted Lasso, which recently commenced its second season, is that a can-do college-football coach from Kansas (Sudeikis) is inexplicably hired to manage an English Premier League team, despite that kind of football being an entirely different sport.  Ted, we learn, has been set up to fail by the embittered ex-wife of the club’s former owner (Hannah Waddingham), who, in a plot twist that owes no minor creative debt to David S. Ward’s baseball-comedy classic Major League—which the show tacitly acknowledges when Ted uncharacteristically invokes a key line of profane dialogue from the movie verbatim—inherited the team in a divorce and is now surreptitiously revenge-plotting its implosion.

Jason Sudeikis as Ted Lasso

But, boy oh boy, has Waddingham’s Rebecca Welton—a refreshingly dimensional and sympathetic character in her own right, it’s worth noting—seriously underestimated her handpicked patsy.  With his folksy enthusiasm and full Tom Selleck ’stache, Coach Ted Lasso unironically exemplifies big-heartedness, open-mindedness, kindness, courtesy, chivalry, civility, forgiveness, wisdom, teamwork, cultural sensitivity, and prosocial values—all with good humor, to boot.  His infectious optimism eventually converts even the most jaded characters on the show into true believers, and his innate goodness inspires everyone in his orbit—often despite themselves—to be a better person.  And if, like me, you watch the first season waiting for the show to at some point subject Ted’s heart-on-his-sleeve earnestness to postmodern mockery or ridicule—“spoiler alert”—it doesn’t.


In the Multiverse of Madness: How Media Mega-Franchises Make Us Their Obedient Servants, Part 2

Editor’s note:  Owed to the length of “In the Multiverse of Madness,” I divided the essay into two posts.  If you haven’t already, I encourage you to read Part 1 first, and please feel welcome to offer feedback on that post, this one, or both in the comments section of Part 2 below.  Thank you.


Previously on “In the Multiverse of Madness,” we covered the three engagement strategies (and correlating tactics) transmedia mega-franchises deploy to keep us consuming each new offering in real time:  by leveraging FOMO via “spoilers”; by encouraging “forensic fandom” with Easter eggs and puzzle-boxing; and by reversing “figure and ground.”  Now let’s talk about why 1970s-born adults have been particularly susceptible to these narrative gimmicks—and what to do about it.

X Marks the Spot

Mega-franchises are dependent on a very particular demographic to invest in their elaborate and expanding multiverse continuities:  one that has a strong contextual foundation in the storied histories of the IPs—meaning, viewers who are intimately familiar with (and, ideally, passionately opinionated about) all the varied iterations of Batman and Spider-Man from the last thirty or so years—and that is also equipped with disposable income, as is typically the case in middle age, hence the reason Gen X has been the corporate multimedia initiative’s most loyal fan base.  Fortunately for them, we’d been groomed for this assignment from the time we learned to turn on the television.

Very quickly (if it isn’t already too late for that):  From 1946 through 1983, the FCC enforced stringent regulations limiting the commercial advertisements that could be run during or incorporated into children’s programming.  However:

Ronald W. Reagan did not much care for any regulations that unduly hindered business, and the selling of products to an entire nation of children was a big business indeed.  When Reagan appointed Mark S. Fowler as commissioner of the FCC on May 18, 1981, children’s television would change dramatically.  Fowler championed market forces as the determinant of broadcasting content, and thus oversaw the abolition of every advertising regulation that had served as a guide for broadcasters.  In Fowler’s estimation, the question of whether children had the ability to discriminate between the ads and the entertainment was a moot point; the free market, and not organizations such as [Action for Children’s Television], would decide the matter.

Martin Goodman, “Dr. Toon:  When Reagan Met Optimus Prime,” Animation World Network, October 12, 2010

In the wake of Fowler’s appointment, a host of extremely popular animated series—beginning with He-Man and the Masters of the Universe but also notably including The Transformers, G.I. Joe:  A Real American Hero, and M.A.S.K. for the boys, and Care Bears, My Little Pony, and Jem for young girls—flooded the syndicated market with 65-episode seasons that aired daily.  All of these series had accompanying action figures, vehicles, and playsets—and many of them, in fact, were explicitly based on preexisting toylines; meaning, in a flagrant instance of figure-and-ground reversal, the manufacturers often dictated narrative content:

“These shows are not thought up by people trying to create characters or a story,” [Peggy Charren, president of Action for Children’s Television] explained, terming them “program-length advertisements.”  “They are created to sell things,” she said.  “Accessories in the toy line must be part of the program.  It reverses the traditional creative process.  The children are getting a manufacturer’s catalogue instead of real programming content.”

Glenn Collins, “Controversy about Toys, TV Violence,” New York Times, December 12, 1985

This was all happening at the same time Kenner was supplying an endless line of 3.75” action figures based on Star Wars, both the movies and cartoon spinoffs Droids and Ewoks.  Even Hanna-Barbera’s Super Friends, which predated Fowler’s tenure as FCC commissioner by nearly a decade, rebranded as The Super Powers Team, complete with its own line of toys (also courtesy of Kenner) and tie-in comics (published by DC), thereby creating a feedback loop in which each product in the franchise advertised for the other.  Meanwhile, feature films like Ghostbusters and even the wantonly violent, R-rated Rambo and RoboCop movies were reverse-engineered into kid-friendly cartoons, each with—no surprise here—their own action-figure lines.

I grew up on all that stuff and obsessed over the toys; you’d be hard-pressed to find a late-stage Xer who didn’t.  We devoured the cartoons, studied the comics, and envied classmates who were lucky enough to own the Voltron III Deluxe Lion Set or USS Flagg aircraft carrier.  To our young minds, there was no differentiating between enjoying the storyworlds of those series and collecting all the ancillary products in the franchise.  To watch those shows invariably meant to covet the toys.  At our most impressionable, seventies-born members of Gen X learned to love being “hostage buyers.”  Such is the reason I was still purchasing those goddamn Batman comics on the downslope to middle age.
