Sean P Carlin

Writer of things that go bump in the night


The Nostalgist’s Guide to the Multiverse—and How We All Might Find Our Way Back Home

Gee, for someone who’s spent the past few years lecturing others on the hazards of living on Memory Lane—by way of curated collections of memorabilia, or the unconscionable expropriation of superheroes from children, or whatever your nostalgic opiate—I quite recently became starkly aware of my own crippling sentimental yearning for obsolete pleasures.  But I’ve also identified the precise agent of disorientation that’s led many of us down this dead-end path… and, with it, a way out.  First, some backstory.

I’ve had occasion this autumn to enjoy ample time back on the East Coast, both a season and region I can never get enough of.  I spent a weekend in Rehoboth Beach, Delaware, with a group of high-school friends, many of whom I hadn’t seen in a quarter century.  I visited my beautiful sister in Washington, D.C., where we took in a Nats game so I could get a firsthand look at the team my Dodgers were set to trounce in the playoffs.  I attended my closest cousin’s wedding (Bo to my Luke), and served as best man at my oldest friend’s—both in New Jersey.  I marched in Greta Thunberg’s #ClimateStrike rally at Battery Park, and took meetings with representatives from the Bronx and Manhattan borough presidents’ offices about bringing both districts into the County Climate Coalition.

(I also got chased out of Penn Station by a mutant rat, so it was about as complete a New York adventure as I could’ve hoped for.)

Wonderful and often productive as those experiences were, though—the subway run-in with Splinter from Teenage Mutant Ninja Turtles notwithstanding—my favorite moments were the ones where nothing so noteworthy occurred.  The pints at my favorite pubs.  The old faces I stopped to chat with “on the Avenue,” as we say back home.  The solitary strolls through the park amidst the holy silence of snowfall.

Brust Park in the Bronx, New York, on December 2, 2019 (photo credit: Sean P. Carlin)

More than any of that, though—the ballgames, the gatherings formal and informal, the walks down the street or into the woods—I did what I always do, regardless of site or circumstance:  entertained quixotic fantasies about moving back.

This has become, over the past half-decade, a personal pathological affliction, as my long-suffering friends and family can lamentably attest.  I mean, I left New York for Los Angeles eighteen years ago.  Eighteen years!  That’s years—not months.  Christ, Carlin, at what point does the former cease to feel like home in favor of the latter?

I can’t say what prompted my recent epiphany, but for the first time in all my exhausting, exhaustive ruminating on the matter, this simple, self-evident truth occurred to me:  I’ve never really left New York.


It’s Alive! Return of the Universal Classic Monsters

Ah, the “shared cinematic universe”—the favored narrative model–cum–marketing campaign of the new millennium!  Once Marvel pioneered the model, it wasn’t long before every studio in town wanted a “mega-franchise” of its own, feverishly ransacking its IP archives for reliable brands to exploit anew.  Universal Studios saw in its Classic Monsters an opportunity to create its own interconnected multimedia initiative by resurrecting them… and the so-called “Dark Universe” was born.

Well, not born, exactly—more like announced.  When the first offering, Dracula Untold, took a critical beating and underperformed domestically, Universal promptly issued a retraction:  “Just kidding!  That wasn’t really the first Dark Universe movie!”  An all-star cast was hastily assembled:  Russell Crowe as Jekyll and Hyde!  Javier Bardem as Frankenstein’s monster!  Johnny Depp as the Invisible Man!  Angelina Jolie as the Bride of Frankenstein!  And first up would be Tom Cruise in The Mummy…

Um… isn’t this precisely the kind of arrogant presumption most of the Universal Classic Monsters came to regret?

Except—whoops!  The Mummy bombed, too… at which point the sun rather quietly went down on the Dark Universe project altogether.  Seems launching a shared fictional universe is considerably harder than Marvel made it look.  Imagine that.

The thing is, we already had a revival—arguably a cinematic renaissance—of the Universal Classic Monsters in the 1990s.  Dracula, Frankenstein, the Mummy, the Invisible Man, the Wolf Man, and Dr. Jekyll and Mr. Hyde were given gloriously Gothic reprises in an (unrelated) series of studio features that starred some of the biggest names in Hollywood.  None of those projects were cooked up in a corporate think tank, but were instead the idiosyncratic visions of a diverse group of directors—the artists behind no less than The Godfather, The Graduate, The Crying Game, Dangerous Liaisons, and Basic Instinct, to name a few—employing horror’s most recognizable freaks to (for the most part) explore the anxiety of confronting the end of not merely a century, but a millennium.

If the respective creative efforts of these filmmakers were uncoordinated, their common agenda was entirely logical.  Many of their fiendish subjects, after all, first arrived on the cultural scene at the end of the previous century:  Strange Case of Dr Jekyll and Mr Hyde was published in 1886; both Dracula and The Invisible Man in 1897.  Furthermore, their stories tended to speak to either the hazards of zealous scientific ambition (Frankenstein, The Invisible Man, Dr Jekyll and Mr Hyde), or, in the case of Dracula and The Mummy, the limitations of it—of humankind’s attempts to tame the natural world through technology:  “And yet, unless my senses deceive me, the old centuries had, and have, powers of their own which mere ‘modernity’ cannot kill” (from Jonathan Harker’s journal, dated 15 May).

Even the Wolf Man serves as a metaphor for the primal instincts we’ve suppressed under our civilized veneer; far from coexisting in harmony, our two halves are instead at war within the modern man and woman.  These are existential issues that seem to weigh more heavily on us on the eve of a new epoch, which is arguably why the monstrous creations we use to examine them flourished in the literature of the 1890s and then again, a century later, through the cinema of the 1990s.  It goes to show that sometimes fictional characters simply speak to their times in a profound way that can’t be engineered or anticipated.  It’s just alchemical, much as Hollywood would prefer it to be mathematical.

With that in mind, let’s have a look at the unofficial “Universal Classic Monsters reprise” of the nineties (I’ve also included a few other like-minded films from the movement) to better appreciate what worked and what sometimes didn’t.


Naomi Klein’s “On Fire” (Book Review)

Since I trained under former vice president Al Gore to serve in his Climate Reality Leadership Corps just over a year ago—a period in which no fewer than eighty-five federal environmental regulations have been rolled back, greenhouse-gas emissions have spiked (after leveling off in years prior), polar-ice melt is outpacing predictive modeling, and the Intergovernmental Panel on Climate Change has strenuously warned us we have a mere decade to halve our current rate of carbon-burning if we hope to avoid the most catastrophic effects of climate change—there is one distinct emotional state that has been entirely absent from my life.

Despair.

I might, in fact, be happier and more optimistic than at any other point in my adult life.

Activism, I’ve discovered, is the antidote to despair, to doomism.  Over the past year, I’ve given public presentations on the Energy Innovation and Carbon Dividend Act, a bipartisan bill in Congress that would charge fossil-fuel extractors for the privilege of polluting—of treating the public commons of our atmosphere like an open sewer—a privilege they’ve thus far enjoyed free of charge.

This past March, my Climate Reality chapter was proud to enlist Los Angeles into the County Climate Coalition, an alliance of jurisdictions across the United States, formed by Santa Clara County Supervisor Dave Cortese, that have formally pledged to uphold the standards of the Paris Accord.  Less than six months later, we were in attendance as the L.A. County Board of Supervisors voted to adopt the OurCounty sustainability plan, one of the most ambitious green initiatives in the United States.

And just last month, I joined 300,000 activists in Lower Manhattan for the Global Climate Strike as we swarmed the streets around City Hall, marched down Broadway, and rallied at Battery Park—where none other than Greta Thunberg addressed the crowd.  None of that, as it happens, has left much time to actually worry about the climate breakdown.

Greta Thunberg at the Global Climate Strike in New York City on September 20, 2019 (photo credit: Sean P. Carlin)

But that level of activism, I acknowledge, isn’t something to which everyone can readily commit.  So, if you want to share my profound hopefulness about the solutions to the climate crisis—if you want to appreciate the world-changing opportunity humanity has been handed by history—do yourself a favor and read a book that might admittedly be outside your comfort zone:  Naomi Klein’s On Fire:  The (Burning) Case for a Green New Deal.

Naomi Klein’s “On Fire: The (Burning) Case for a Green New Deal”

I promise:  You won’t be inundated with scientific facts and figures; if you want to understand the basic science of global warming, Mr. Gore’s documentaries An Inconvenient Truth (2006) and An Inconvenient Sequel:  Truth to Power (2017) are both excellent primers.  Naomi Klein’s On Fire is a recently published collection of her essays and lectures from the past decade, bookended by all-new opening and closing statements on why a Global Green New Deal is the blueprint for an ecologically sustainable and socially equitable twenty-first century:

The idea is a simple one:  in the process of transforming the infrastructure of our societies at the speed and scale that scientists have called for, humanity has a once-in-a-century chance to fix an economic model that is failing the majority of people on multiple fronts.  Because the factors that are destroying our planet are also destroying people’s quality of life in many other ways, from wage stagnation to gaping inequalities to crumbling services to the breakdown of any semblance of social cohesion.  Challenging these underlying forces is an opportunity to solve several interlocking crises at once. . . .

. . . In scale if not specifics, the Green New Deal proposal takes its inspiration from Franklin Delano Roosevelt’s original New Deal, which responded to the misery and breakdown of the Great Depression with a flurry of policies and public investments, from introducing Social Security and minimum wage laws, to breaking up the banks, to electrifying rural America and building a wave of low-cost housing in cities, to planting more than two billion trees and launching soil protection programs in regions ravaged by the Dust Bowl.

Naomi Klein, On Fire:  The (Burning) Case for a Green New Deal (New York:  Simon & Schuster, 2019), 26

“It’s Over, Johnny”: The Thrill Is Gone in “Rambo: Last Blood”

The following article discusses story details of Rambo:  Last Blood.

In the lead-up to Creed (2015), the New Yorker published a fascinating analysis of the six Rocky movies, arguing that they can be viewed as a trilogy:  In Rocky (1976) and Rocky II (1979), the Italian Stallion goes from nobody to somebody; in III (1982) and IV (1985), he mutates once again, this time from hero to superhero; Sylvester Stallone then sought to extricate the champ from the excesses of Reagan’s America (the robot butler, anyone?), setting up Rocky’s ignoble return to the streets of Philly in Rocky V (1990), then credibly reestablishing him as an underdog in Rocky Balboa (2006).  It was this iteration of Rocky—the purest version—that Stallone reprised in Creed and Creed II (2018), in which an aging, widowed, streetwise Rocky acts (reluctantly at first) as mentor and trainer to a young protégé.

Sylvester Stallone in “Rambo: Last Blood” (2019)

Sly’s other signature role, troubled Vietnam vet John Rambo, has had no less of a winding road through the past five decades when it comes to his ever-evolving characterization:  The self-hating soldier of David Morrell’s 1972 novel First Blood was recast as a sympathetic hero in the 1982 movie of the same name, who in turn became the jingoistic superhero of Rambo:  First Blood, Part II (1985) and Rambo III (1988).  It was only in his belated fourth cinematic adventure, Rambo (2008), that his prototypal literary temperament atavistically asserted itself:

You know what you are, what you’re made of.  War is in your blood.  Don’t fight it.  You didn’t kill for your country—you killed for yourself.  God’s never gonna make that go away.  When you’re pushed, killing’s as easy as breathing.

Rambo’s inner monologue in Rambo (2008)

Upon ending the prolonged moratorium on both creatively depleted franchises in the aughts, Stallone didn’t “retcon” some of the lesser entries in the Rocky and Rambo series, but rather embraced them as part of both heroes’ long emotional arcs:  Just as Creed II redeems the hokey jingoism of Rocky IV, Rambo IV acknowledges that the previous sequels glorified violence—gleefully, even pornographically—and burdens the protagonist with the guilt of that indefensible carnage, refusing to let him off the hook for it.  The inconvenient mistakes of the past aren’t expunged from the hagiographies of either of these American icons for the sake of a cleaner narrative—an increasingly common (and inexcusably lazy) practice in franchise filmmaking, as evidenced by recent “do-over” sequels to Terminator and Halloween—but instead seed the conditions in which we find both Rocky and Rambo at the next stage of their ongoing sagas.

So, in Rambo:  Last Blood (2019), which sees the itinerant commando back home at his ranch in Arizona (per the coda of the last movie), the big question I had going into the film was this:  Which permutation of Rambo would we find in this story—the one about what happened after Johnny came marching home?  What might Rambo, who has always served as a cultural Rorschach—first as an expression of the political disillusionment of the seventies, then recruited in the eighties to serve as poster boy for the Reagan Doctrine—tell us about ourselves in the Trump era?


Mirror/Mirror: On Seeing Ourselves in Fictional Characters

Over the past few months, I’ve been helping plan an old friend’s bachelor party, the experience of which has made me starkly aware of just how conservative I’ve become in middle age.  Not politically, you understand—personally.  When I was a kid, I was like Leo Getz in Lethal Weapon 2 (I was seriously that annoying), yet I fancied myself Martin Riggs; somewhere along the way, though, I grew up to be Roger Murtaugh.

Riggs (Mel Gibson), Leo (Joe Pesci), and Murtaugh (Danny Glover) in “Lethal Weapon 2” from 1989 (Mary Evans Picture Library)

And that got me thinking about how, at different stages of life, we’re sometimes lucky enough to closely identify with a particular fictional character in an exceptional way; I would say the experience is even as random and as rarefied as true friendship:  How many times, really, have we “met” a character who speaks so directly to us, whose emotional circumstances so closely reflect our own, that through them we vicariously attain some measure of insight… and maybe even catharsis?

We’re not necessarily talking favorite characters here; those come in spades.  God knows, I love Indiana Jones and Jean-Luc Picard and Philip Marlowe and Chili Palmer, but I don’t necessarily—much as I want to—relate to those characters so much as admire their characteristics.  In that way, they’re more aspirational than they are analogous.

I’d like to know which characters from fiction speak to you—and for you.  I’ll get us started, selecting examples from three distinct phases of my life:  childhood, adolescence, and midlife.  (For those interested, I’ve included each narrative’s Save the Cat! genre.)


Oh, Snap! The Nostalgia-Industrial Complex — ’90s Edition

Et tu, Millennials?  The old nostalgia-industrial complex got its hooks into you, too, huh?  I must ask:  Have you not witnessed firsthand, in horror, what pining for the good old days has done to Generation X…?

To recap:  We Xers have thus far spent the twenty-first century reliving all our childhood favorites—Star Wars, Super Friends, Karate Kid, Ghostbusters, Lethal Weapon, Halloween, Bill & Ted, Tron, Transformers, Terminator, Top Gun—a pathological exercise in self-infantilization that has catastrophically retarded both the culture and a generation of middle-aged adults who are at this point more passionately invested in Skywalkers and superheroes than are the juvenile audiences for whom those characters were originally intended.

Always keen to recognize—and replicate—a winning formula, Hollywood has recently been seized by a new permutation of forward-thinking backward-gazing:  Sell nineties-era nostalgia to the generation that came of age in that decade!  Over the past few years, we got a pair of Jurassic Park remakes-masquerading-as-sequels that didn’t inspire a single word of enthusiasm (certainly not a second viewing), but nonetheless earned over a billion dollars apiece, while our last conventional movie star, Dwayne Johnson, used his considerable clout (or, more aptly, muscle?) to resurrect both Jumanji and Baywatch.  As for this year?  Hope you’re excited for warmed-over helpings of The Lion King, Men in Black, Toy Story, Aladdin, and yet more Jumanji.  And while we’re at it, let’s welcome back slacker duo Jay and Silent Bob, because surely their grunge-era stoner humor still holds up in middle age—

Our sentiments exactly, fellas…

—as well as Will Smith and Martin Lawrence, back from buddy-cop purgatory for more Bad Boys badassery!  You know damn well whatcha gonna do when they come for you:  Buy a ticket!

For an indeterminate, but clearly not immeasurable, swath of moviegoers, there is no marketing campaign more alluring than one that taps into foggy childhood memories. . . .

. . . The great nostalgia-industrial complex will [continue] steamrollering us against our better judgment into multiplexes, hoping for a simulacrum of the first high we felt watching great characters years ago.

Tom Philip, “Summer ’19 Brought To You By Nostalgia-Bait Movies,” Opinion, New York Times, July 4, 2019

Not just multiplexes.  (And how are those even still a thing?)  On the small screen, VH1 revived game-changing nineties slasher franchise Scream this summer (how, for that matter, is VH1 still a thing?), and new iterations of decade-defining teen melodramas 90210 and Party of Five are on the way.  Dope.


Tim Burton’s “Batman” at 30—and the Cultural Legacy of the Summer of 1989

In order to appreciate the state of commercial adolescence to which Generation X has been disproportionately consigned, one needs to consider Tim Burton’s Batman in its sociocultural context:  how it inadvertently provided a blueprint for reconceptualizing superheroes, transforming them from innocent entertainment meant to inspire the imagination of children into hyperviolent wish-fulfillment fantasies for commercially infantilized adults.


The weekly theatrical debut of a new franchise tentpole, voraciously bulling aside the $200 million–budgeted blockbuster released a mere seven days prior, is par for the course nowadays, but back in 1989—thirty summers ago per the calendar, though seemingly as recently as yesterday by the nebulous barometer of memory—we’d never before experienced anything like that.

That was the year that gave us new entries in such ongoing adventures as Indiana Jones, Star Trek, Ghostbusters, The Karate Kid, Lethal Weapon, James Bond, and Back to the Future, lowbrow comedies Police Academy, Fletch, and Vacation, as well as slasher staples Friday the 13th, A Nightmare on Elm Street, and Halloween—to say nothing of launching all-new franchises with Bill & Ted’s Excellent Adventure, Major League, Pet Sematary, Honey, I Shrunk the Kids, Weekend at Bernie’s, and Look Who’s Talking.  To anyone who’d grown up in the nascent home-video era—that period in which all the aforementioned series (save 007) were born and could thus be re-watched and obsessed over ad infinitum—1989 was the Christmas of summer-movie seasons.

Michael Keaton in Tim Burton’s “Batman” (1989)

But none of those films, huge as many of them were, dominated the cultural spotlight that year as pervasively as Tim Burton’s Batman, released on this date in 1989.

Out of the Shadows

I can hear my thirteen-year-old nephew now:  “One superhero movie?  Wow—how’d you handle the excitement?”

Yeah, I know.  But it was exciting.  I was thirteen myself in 1989, spending most of my free time with my grade-school gang at the neighborhood comic shop down on Broadway, steeped in a subculture that hadn’t yet attained popular acceptance.  Richard Donner’s Superman (1978) had been the only previous attempt at a reverent comic-book adaptation, and, creatively and financially successful though it was, most of that goodwill had been squandered in the intervening decade by a succession of increasingly subpar sequels (through no fault of the marvelous Christopher Reeve, who makes even the worst of them watchable).

Christopher Reeve and Margot Kidder in “Superman: The Movie”

As for Batman:  It’s crucial to remember, and easy enough now to overlook, that in the late eighties, the prevailing public perception of the character was not Frank Miller’s Dark Knight, but rather Adam West’s “Bright Knight” from the self-consciously campy acid-trip of a TV series that had aired twenty years earlier.  In the wake of that show’s cancellation, a concerted effort was made by the character’s creative custodians at DC Comics—first Dennis O’Neil and Neal Adams, then Steve Englehart and Marshall Rogers, and most effectively Miller with his aptly titled The Dark Knight Returns—to reestablish Batman as the “nocturnal avenger” he was originally conceived to be.

“Dark Knight Triumphant” (July 1986); art by Frank Miller and Lynn Varley

But if you weren’t following the comics—and, in those days, few over thirteen years old were—the predominant impression the name “Batman” conjured wasn’t the ferocious Miller rendering above so much as this:


Game Over: Why an Unsatisfying “Game of Thrones” Resolution Was a Predictable Inevitability

After eight intense seasons of scheming (on the part of the characters) and puzzling (on the part of the viewership), we finally know who won the Game of Thrones.

I did.

The moment we’ve been waiting for…

A few years back, as friends and colleagues were indulging in fevered speculation about who would ultimately end up on the Iron Throne, I attempted to spare them another Lost-style disappointment by explaining the story conventions of what media theorist Douglas Rushkoff identified as “postnarrative” fiction, which eschews the predictable, linear, closed-ended form of the monomythic arc—Joseph Campbell’s “hero’s journey”—in favor of an unpredictable, nonlinear, “hyperlinked” mode of narrative “that gets more open rather than more closed as it goes along” (Molly Soat, “Digital Disruption and the Death of Storytelling,” Marketing News, April 2015, 44), and accounts for such Digital Age watercooler shows as The Walking Dead, Westworld, Orphan Black, This Is Us, and Mr. Robot.

This mere fraction of the cast—itself three times the amount most other shows carry—alone suggests an unsatisfying “Game of Thrones” resolution was inevitable

To that end, I argued that no series made to service as many characters and concurrent plotlines as Game of Thrones could ever rightfully hope—or even credibly intend—to reach a definitive climax, let alone have any catharsis to offer in exchange for viewers’ time and miss-no-detail devotion:

The opening titles sequence of the show betrays this emphasis:  the camera pans over an animated map of the entire world of the saga, showing the various divisions and clans within the empire.  It is drawn in the style of a fantasy role-playing map used by participants as the game board for their battles and intrigues.  And like a fantasy role-playing game, the show is not about creating satisfying resolutions, but rather about keeping the adventure alive and as many threads going as possible.  There is plot—there are many plots—but there is no overarching story, no end.  There are so many plots, in fact, that an ending tying everything up seems inconceivable, even beside the point.

Douglas Rushkoff, Present Shock:  When Everything Happens Now (New York:  Penguin Group, 2013), 34

The many, many peers who willingly engaged me on the subject by and large dismissed the very notion of postnarrativity—of course all stories are meant to provide closure, the argument went, and A Song of Ice and Fire author George R. R. Martin was on record as knowing the particulars of how his saga would conclude!—and insisted with good-natured sportsmanship that my Game of Thrones prediction (prophecy?) would be decisively debunked come the series finale.  To support that assertion, the legendary five-hour pitch meeting was often cited in which screenwriters David Benioff and D. B. Weiss claimed to have accurately deduced Jon Snow’s true parentage and were accordingly rewarded with Martin’s theretofore elusive blessing to adapt the high-fantasy series for Hollywood.

To which I emphatically called bullshit.  The account of that alleged pitch meeting—much more so than anything from the world of Westeros—is pure fantasy from people who know a thing or two about mythopoeia.

To wit:  Anyone who’s ever written a story—particularly a long-form, multipart saga like A Song of Ice and Fire—knows that a narrative takes on a course of its own as it develops, and an author’s notions about where it’s all going are about as bankable as our grand ideas of how our own lives are going to play out in five, ten, fifteen years.  In life, you got your plans and schemes… and then you got what happens irrespective of those.  The latter always wins.  Fiction works in a similar fashion.  (And—you can take my word for this—little if anything that gets pitched in development meetings survives to the final draft, anyway.)  As David Benioff himself said in 2015:

We’ve had a lot of conversations with George, and he makes a lot of stuff up as he’s writing it.  Even while we talk to him about the ending, it doesn’t mean that that ending that he has currently conceived is going to be the ending when he eventually writes it.

Debra Birnbaum, “‘Game of Thrones’ Creators:  We Know How It’s Going to End,” Variety, April 15, 2015

Exactly.  And whereas a novel is beholden to the vagaries of merely a single determinant—its author—a television show is a complex organism whose creative direction shifts constantly based on content restrictions imposed by the studio, talent availability, production logistics, budgetary considerations… an endless host of factors.

Case in point:  It came to light earlier this year that shortly after completing work on the first season of GoT, series mainstay Emilia Clarke (Daenerys Targaryen) underwent high-risk surgery to treat a life-threatening brain aneurysm.  Had she been unable to resume work on the show, what would that have meant for the so-called “grand plan” of Game of Thrones?

It would’ve been thrown right out the window is what.

Actress Emilia Clarke as Daenerys Targaryen in “The Bells”

That’s the way TV production works.  It’s amorphous.  It’s fluid.  It’s necessarily reactive.  Trying to conceive and carry out a five-year plan for a serialized show is about as tenable as trying to do the same for one’s personal and/or professional life.  It can’t really be done because none of us know what tomorrow might bring.  Any showrunner who insists he knows how it all ends is either full of shit or delusional.

Despite that, my contemporaries maintained the same unwavering faith in the Game of Thrones writers that Tyrion inexplicably invested in Dany, certain all would be paid off and tied up at journey’s end—you’ll see!

“Spoiler alert”:  It wasn’t.


All That You Can’t Leave Behind: On Memories, Memorabilia, and Minimalism

Here’s the story of how a lifelong packrat made an unlikely conversion to minimalism.


Concert tickets.  Refrigerator magnets.  Christmas ornaments.  Comic books.  Trading cards.  Greeting cards.  Bobbleheads.  Bank statements.  Photo albums.  Vinyl records.  Shoes.  Shot glasses.  Jewelry.  Blu-rays.

What does the stuff we collect, consciously or unconsciously, contribute to the story of our lives?

And… what does it mean for us when there’s less of it?

Photo credit: Ticketmaster blog, June 26, 2015

In an opinion piece that appeared in the New York Times earlier this month, columnist Peter Funt laments the obsolescence of analog mementoes in a Digital Age:

And so ticket stubs join theater playbills, picture postcards, handwritten letters and framed photos as fading forms of preserving our memories.  It raises the question, Is our view of the past, of our own personal history, somehow different without hard copies?

Peter Funt, “Does Anyone Collect Old Emails?,” Opinion, New York Times, April 5, 2019

In recent years, I’ve expanded this blog from its initial scope, an exclusively academic forum on storytelling craft, to chronicle my own personal history, often in no particular order.  I am ever and always in search of a clearer, more complete, more honest perspective on my past, and how it has shaped the narrative arc of my life; I mine my memories regularly for content, and for truth.

I have also routinely expressed apprehension about the practices we’ve lost in a Digital Age, the kind to which Mr. Funt refers, particularly as that applies to the corrupted discipline of storytelling itself:  From the superhero crossovers of the “Arrowverse,” to the literary Easter-egg hunt of Castle Rock, to the expansive franchising of Star Wars, today’s popular entertainments are less concerned with saying something meaningful about the human condition than they are with challenging the viewer to catch all their internal cross-references.  Whereas stories once rewarded audiences with insight, now the reward is the esteemed privilege of calling oneself a superfan—a participatory designation earned by following all the breadcrumbs and connecting all the dots… an assignment only achievable if one never misses a new installment:

In a nod to the subscription model of consumption—where we lease cars or pay monthly to a music service—the extended narratives of prestige TV series spread out their climaxes over several years rather than building to a single, motion picture explosion at the end.  But this means energizing the audience and online fan base with puzzles and “spoilers”. . . .

. . . The superfan of commercial entertainment gets rewarded for going to all the associated websites and fan forums, and reading all the official novels.  Superfans know all the answers because they have purchased all the products in the franchise.  Like one of those card games where you keep buying new, expensive packs in order to assemble a powerful team of monsters, all it takes to master a TV show is work and money.

Douglas Rushkoff, Team Human (New York:  W. W. Norton & Company, 2019), 163

Fanboys and -girls thought they were legitimized when the geek subculture went mainstream—when superheroes and sci-fi went from niche hobby to pop-cultural monopoly—but they were really just commodified:  “geek” shifted from a stigmatized social category to a lucrative economic one.  Leveraging our telecommunications-induced FOMO, Hollywood contrived a new permutation of commercial narrative:  the “mega-franchise,” which seeks not our intermittent audience, but rather our habitual obedience.  Sure, you may not have even liked the last four Star Wars or Terminator or Transformers movies… but do you really wanna run the risk of skipping this one?

More is more: Every “Star Wars” character has its own backstory and action figure—collect them all!

So, given those two ongoing preoccupations—personal history and receding traditions in the Digital Age—the thesis of “Does Anyone Collect Old Emails?” would’ve spoken to me regardless, but the timing of it was nonetheless uncanny, as I have devoted no small degree of consideration in recent months to the matter of the physical objects we amass, wittingly or otherwise, and how they tether us to the past.  Here’s the story.


Big News from a Small Climate Reality Chapter: Los Angeles Joins the County Climate Coalition

What can I do about it?  When it comes to the climate crisis, all of us have thought or expressed that sentiment, even—at some point or another—the most passionate environmental activists.  It can be uttered out of well-meaning curiosity… or genuine bewilderment… or political frustration… or apathetic abdication.  Regardless of which mindset it reflects, it is a universally valid—and perfectly understandable—acknowledgment of the overwhelming complexities of the problem of climate change.  What can any of us, as individuals, really do about it?

Especially when individual efforts simply aren’t going to move the needle on this at the speed and scale required; we have ten years, per the IPCC, to halve our greenhouse-gas emissions if we’re going to keep global warming below catastrophic levels.  By all means:  swap out your lightbulbs, compost your trash, take public transportation whenever possible—but understand that the time when “small” personal actions like that could’ve actually made a meaningful difference has passed.

Now this existential crisis must be addressed legislatively, with bold and effective public policy, which is why so much has been made of the Green New Deal resolution, and of the less-publicized but no-less-crucial Energy Innovation and Carbon Dividend Act, a bipartisan bill in Congress that would (finally) put a price on carbon pollution.  As exciting and promising as those steps are, though, in some respects they only make an answer to our intimate question—What can I do about climate change?—seem yet further out of reach.


Representative Alexandria Ocasio-Cortez and Senator Ed Markey outside the U.S. Capitol on Feb. 7, 2019 (Saul Loeb/AFP—Getty Images)

Take me, for instance.  A recovering screenwriter, I’m happy to illustrate at length the storytelling transgressions of Ghostbusters II, or mathematically quantify the similarities between Jack Nicholson’s and Heath Ledger’s interpretations of the Joker (they’re precisely 60% alike, for the record)—ya know, intellectual stuff—but good luck putting those “skills” to use in service of environmental-policy initiatives, right?

Well, not so fast.  Here’s how a bunch of ordinary laypeople banded together to do exactly that—to make a legislative difference in relatively short order—and how a few tricks I picked up in the Hollywood trenches actually came in handy.
