Tag: Superhero (Page 2 of 5)

The Lost Boys of the Bronx: A Tribute to Joel Schumacher

Batman Forever and The Lost Boys director Joel Schumacher died on Monday, June 22, at the age of eighty after a yearlong battle with cancer.  In an industry where branding is sacrosanct, his brand, as it were, was his steadfast refusal to be artistically pigeonholed:  Hit-and-miss though his track record may be, he was a rare breed of filmmaker who worked in virtually every genre, from comedy (D.C. Cab; Bad Company) to drama (Cousins; Dying Young) to sci-fi/horror (Flatliners; Blood Creek) to crime thriller (Falling Down; 8mm) to legal thriller (The Client; A Time to Kill) to musical (The Phantom of the Opera).  His filmography is as winding and unconventional as was his path to commercial success:

Schumacher was born in New York City in 1939 and studied design at Parsons and the Fashion Institute of Technology. . . .

When Schumacher eventually left fashion for Hollywood, he put his original trade to good use, designing costumes for various films throughout the Seventies. . . .  He also started writing screenplays during this time, including the hit 1976 comedy Car Wash and the 1978 adaptation of the musical The Wiz.

In 1981, Schumacher made his directorial debut with The Incredible Shrinking Woman, a sci-fi comedy twist on Richard Matheson’s 1959 novel, The Shrinking Man, starring Lily Tomlin.  Fitting the pattern that would define his career, the film was a financial success but a flop with critics. . . .

Schumacher’s true breakout came a few years later in 1985, when he wrote and directed St. Elmo’s Fire, the classic post-grad flick with the Brat Pack cast, including Rob Lowe, Demi Moore and Judd Nelson.  Two years later, he wrote and directed The Lost Boys, a film about a group of teen vampires that marked the first film to star both Corey Feldman and Corey Haim, effectively launching the heartthrob duo known as “the Coreys.”

Jon Blistein, “Joel Schumacher, Director of ‘Batman & Robin,’ ‘St. Elmo’s Fire,’ Dead at 80,” Rolling Stone, June 22, 2020

Though Schumacher did not write The Lost Boys (1987) as the Rolling Stone piece erroneously asserts (the screenplay is credited to Janice Fischer & James Jeremias and Jeffrey Boam), neither his creative imprint on the project nor the cultural impact of the movie itself can in any way be overstated.  Sure, teenage vampires may be a dime-a-dozen cottage industry now, from Buffy the Vampire Slayer to Twilight to The Vampire Diaries, but if you happened to grow up on any of those Millennial staples, it’s worth knowing that pubescent bloodsuckers had never really been done prior to The Lost Boys—no, that celebrated iteration of the vampire’s pop-cultural evolution is entirely owed to the pioneering vision of Joel Schumacher.

Late filmmaker Joel Schumacher; photo by Gabriella Meros/Shutterstock, 2003

When Richard Donner left the project to direct Lethal Weapon instead, the script Schumacher inherited was essentially “The Goonies… with vampires.”  By aging up the characters from preteens to hormonal adolescents, Schumacher saw a creative opportunity to do something scarier—and sexier.  A cult classic was thusly born, and though The Lost Boys itself never became a franchise (save a pair of direct-to-video sequels two decades later, and the less said about them, the better), its fingerprints are all over the subgenre it begat.  We owe Schumacher a cultural debt for that.

Kiefer Sutherland’s David (second from left) leads a gang of teenage vampires in “The Lost Boys”

And I owe him a personal debt.  Over any other formative influence, The Lost Boys is directly and demonstrably responsible for my decision to study filmmaking in college and then to pursue a screenwriting career in Hollywood.  More than simply my professional trajectory, in point of fact, my very creative sensibilities were indelibly forged by that film:  The untold scripts and novels I’ve written over the past quarter century have almost exclusively been tales of the supernatural with a strong sense of both humor and setting—the very qualities The Lost Boys embodies so masterfully and memorably.  All of that can be traced to the summer of 1994.


Challenging Our Moral Imagination: On Hollywood’s Crises of Climate, Conscience, and Creativity

“What about Thanos?”

A strange question, I’ll concede, to emerge from an impassioned conversation about the transformative systemic overhauls required in our energy policy, our health care, and our economic ideology in the wake of the coronavirus—

—because what could the cartoon villain from the Avengers movies possibly have to do with any of that?

The answer, frustratingly, is:  More than you may realize.

During a recent online confab with the leadership team of the San Fernando Valley Chapter of the Climate Reality Project, the discussion drifted momentarily from existential matters to televisional ones:  What’s everybody been binge-watching?

Now, anyone who knows me—in person or through this blog—is peripherally aware of my immedicable disdain for movies and television.  Yet… with no baseball this spring to occupy my time, I’ve been reluctantly compelled to sample quite a bit of scripted media to which I’d have otherwise turned up my nose.  And, to my surprise, I find myself excited to share a handful of programs that, in my view, embody creativity with a conscience.  (We’ll get to those coveted endorsements shortly.)

The cast of “Schitt’s Creek” (2015–2020)

To that end, one of our Climate Reality Leaders recommended Schitt’s Creek:  “The evolution of the self-absorbed yet well-meaning characters as they deal with the adversity that helps them discover what it really means to love is quite endearing,” my colleague said, “and I believe has left an impact on many who are out there now hoping for the world to refashion itself in that way.”

Schitt’s Creek is one of those shows that got away from me in our era of Peak TV, but I second the motion for more prescriptive fiction that both challenges us to be better—individually and collectively—and provides a model for doing so.  Hard as this may be to fathom for those born into a postnarrative world, our popular entertainments used to reliably perform that public service.  To wit:  Earlier this month, this unflinching indictment of white privilege from a 1977 episode of Little House on the Prairie resurfaced on Twitter to considerable gape-mouthed astonishment:

Bet you didn’t recall that show being so edgy.  Thing is, the stories we tell about the world in which we live are only as aspirational—and inspirational—as the moral imagination of our storytellers.  Alas, ever since meaningless worldbuilding supplanted purposeful storytelling, the function of popular fiction has shifted from lighting a path forward to drawing us down a rabbit hole of “Easter eggs” and “spoilers” that lead only to the next installment of a given multimedia franchise (meaning:  keep your wallet handy).  As the late Neil Peart wrote forty years ago:

Art as expression –
Not as market campaigns
Will still capture our imaginations
Given the same
State of integrity
It will surely help us along

Talk about advice unheeded.  Consequently, our commercial entertainment is often embedded—however unconsciously—with culturally pernicious values, from glorifying vigilante justice (superhero sagas; revenge thrillers), to undermining trust in public institutions (the self-serving White Houses of Scandal and House of Cards were a far cry from the empathetic Bartlet administration), to romanticizing criminal sociopathy (the street-racing “rebels” of Fast & Furious) and—bonus!—thereby validating a mindset in which “environmental degradation is not only a given but a goal” (robin, “The Fast and Furious Films and Mad Max Fury Road,” Ecocinema, Media, and the Environment [blog], September 20, 2019).


Misery Sans Company: On the Opportunities and Epiphanies of Self-Isolation

March?  Please!  I’ve been in self-isolation since January.

No, I was not clairvoyantly alerted to the impending coronavirus pandemic; only our dear leader can claim that pansophic distinction.  Rather, my wife started a new job at the beginning of the year, necessitating a commute, thereby leaving me carless.  (Voluntarily carless, I should stipulate:  I refuse to be a two-vehicle household; as it is, this congenital city kid, certified tree-hugger, and avowed minimalist owns one car under protest.)

My obstinacy, however, comes at a cost:  I don’t live within convenient walking distance of anything save a Chevron station (the irony of which is only so amusing), so while the missus is at work, I’m effectively immobilized.  I got nowhere to go… save the home office opposite my bedroom.  Thusly, I made a conscious decision at the start of the year to embrace my newfound confinement as a creative opportunity—to spend the entirety of winter devoted all but exclusively to breaking the back of my new novel.  I kept my socializing and climate activism to a minimum during this period, submitting to the kind of regimented hourly schedule I haven’t known since my college days.

Johnny Depp in creative self-isolation in “Secret Window” (2004), from Stephen King’s novella

Before long, my period of self-imposed artistic isolation was yielding measurable results, and I was looking forward to emerging from social exile.  The week I’d earmarked for my “coming-out party”?  You guessed it:  The Ides of March.

I instead spent St. Paddy’s week mostly reeling, knocked sideways—as I imagine many were—by the speed and scale at which this crisis ballooned.  But in the days that followed, I resolved to compartmentalize—to get back to work.  I still had my codified daily routine, after all, which required a few adjustments and allowances under the new circumstances, and I had a project completely outlined and ready to “go to pages.”  So, that’s what I turned to.

And in short order, I’d produced the first two chapters, which, for me, are always the hardest to write, because I have no narrative momentum to work with as I do in later scenes.  You open a blank Scrivener document, and—BOOM!—all your careful planning and plotting, your meticulously considered character arcs and cerebral theme work?  It ain’t worth shit at that ex nihilo instant.  You may’ve built the world, but how do you get into it?  Writing that first sentence, that first paragraph, that first scene, that first chapter is like feeling your way around in the dark.  (Fittingly, my first chapter is literally about three guys finding their way through a forest path in the pitch black of night.)

“Going to pages” turned out to be just the intellectual occupation I needed to quell my anxiety, to give me a reprieve from our present reality.  And now that I’ve got story momentum, slipping into the world of my fiction every morning is as easy as flicking on the television.  For the three or four hours a day I withdraw to my personal paracosm, I’m not thinking about anything other than those characters and their problems.  As such, I’ve thus far sat out this crisis in my study, trafficking in my daydreams to pass the time; I’m not treating patients, or bagging groceries, or delivering packages, or working the supply chain, or performing any of the vital services upholding our fragile social order.  Instead, I’m playing make-believe.

Self-isolation didn’t serve Stephen King’s Jack Torrance particularly well in “The Shining”

It wasn’t long ago—Christmas, in fact—that I issued an earnest, hopeful plea that in the year to come we might all forsake our comforting fictions, our private parallel dimensions, in favor of consciously reconnecting with our shared nonfictional universe.  And now here many of us find ourselves, banished from the streets, from the company of others, confined by official decree to our own hermetic bubbles—as of this writing, 97% of the world is under stay-at-home orders—with nowhere to retreat but our escapist fantasies.  I’ve been reliant upon them, too—even grateful for them.

And that got me thinking about Stephen King’s Misery.  Masterful as Rob Reiner’s movie adaptation is, and faithful in plotting to King’s book (working from a screenplay by William Goldman), the theme—the entire point of the narrative—gets completely lost in translation.  This is a story about addiction, as only King could tell it:  It’s about how drugs (in this case, prescription-grade painkillers) help us cope with misery, but it’s also about how art can be an addictive—and redemptive—coping mechanism, as well; how it can turn misery into a kind of beauty, especially for the artist himself.


The Nostalgist’s Guide to the Multiverse—and How We All Might Find Our Way Back Home

Gee, for someone who’s spent the past few years lecturing others on the hazards of living on Memory Lane—by way of curated collections of memorabilia, or the unconscionable expropriation of superheroes from children, or whatever your nostalgic opiate—I quite recently became starkly aware of my own crippling sentimental yearning for obsolete pleasures.  But I’ve also identified the precise agent of disorientation that’s led many of us down this dead-end path… and, with it, a way out.  First, some backstory.

I’ve had occasion this autumn to enjoy ample time back on the East Coast, both a season and region I can never get enough of.  I spent a weekend in Rehoboth Beach, Delaware, with a group of high-school friends, many of whom I hadn’t seen in a quarter century.  I visited my beautiful sister in Washington, D.C., where we took in a Nats game so I could get a firsthand look at the team my Dodgers were set to trounce in the playoffs.  I attended my closest cousin’s wedding (Bo to my Luke), and served as best man at my oldest friend’s—both in New Jersey.  I marched in Greta Thunberg’s #ClimateStrike rally at Battery Park, and took meetings with representatives from the Bronx and Manhattan borough presidents’ offices about bringing both districts into the County Climate Coalition.

(I also got chased out of Penn Station by a mutant rat, so it was about as complete a New York adventure as I could’ve hoped for.)

Wonderful and often productive as those experiences were, though—the subway run-in with Splinter from Teenage Mutant Ninja Turtles notwithstanding—my favorite moments were the ones where nothing so noteworthy occurred.  The pints at my favorite pubs.  The old faces I stopped to chat with “on the Avenue,” as we say back home.  The solitary strolls through the park amidst the holy silence of snowfall.

Brust Park in the Bronx, New York, on December 2, 2019 (photo credit: Sean P. Carlin)

More than any of that, though—the ballgames, the gatherings formal and informal, the walks down the street or into the woods—I did what I always do, regardless of site or circumstance:  entertained quixotic fantasies about moving back.

This has become, over the past half-decade, a personal pathological affliction, as my long-suffering friends and family can lamentably attest.  I mean, I left New York for Los Angeles eighteen years ago.  Eighteen years!  That’s years—not months.  Christ, Carlin, at what point does the former cease to feel like home in favor of the latter?

I can’t say what prompted my recent epiphany, but for the first time in all my exhausting, exhaustive ruminating on the matter, this simple, self-evident truth occurred to me:  I’ve never really left New York.


Naomi Klein’s “On Fire” (Book Review)

Since I trained under former vice president Al Gore to serve in his Climate Reality Leadership Corps just over a year ago—a period in which no fewer than eighty-five federal environmental regulations have been rolled back, greenhouse-gas emissions have spiked (after leveling off in years prior), polar-ice melt is outpacing predictive modeling, and the Intergovernmental Panel on Climate Change has strenuously warned us we have a mere decade to halve our current rate of carbon-burning if we hope to avoid the most catastrophic effects of climate change—there is one distinct emotional state that has been entirely absent from my life.

Despair.

I might, in fact, be happier and more optimistic than at any other point in my adult life.

Activism, I’ve discovered, is the antidote to despair, to doomism.  Over the past year, I’ve given public presentations on the Energy Innovation and Carbon Dividend Act, a bipartisan bill in Congress that would charge fossil-fuel extractors for the privilege of pollution—of treating the public commons of our atmosphere like an open sewer—a privilege they’ve thus far enjoyed free of charge.

This past March, my Climate Reality chapter was proud to enlist Los Angeles into the County Climate Coalition, an alliance of jurisdictions across the United States, formed by Santa Clara County Supervisor Dave Cortese, that have formally pledged to uphold the standards of the Paris Accord.  Less than six months later, we were in attendance as the L.A. County Board of Supervisors voted to adopt the OurCounty sustainability plan, one of the most ambitious green initiatives in the United States.

And just last month, I joined 300,000 activists in Lower Manhattan for the Global Climate Strike as we swarmed the streets of City Hall, marched down Broadway, and rallied at Battery Park—where no less than Greta Thunberg addressed the crowd.  None of that, as it happens, has left much time to actually worry about the climate breakdown.

Greta Thunberg at the Global Climate Strike in New York City on September 20, 2019 (photo credit: Sean P. Carlin)

But that level of activism, I acknowledge, isn’t something to which everyone can readily commit.  So, if you want to share my profound hopefulness about the solutions to the climate crisis—if you want to appreciate the world-changing opportunity humanity has been handed by history—do yourself a favor and read a book that might admittedly be outside your comfort zone:  Naomi Klein’s On Fire:  The (Burning) Case for a Green New Deal.

Naomi Klein’s “On Fire: The (Burning) Case for a Green New Deal”

I promise:  You won’t be inundated with scientific facts and figures; if you want to understand the basic science of global warming, Mr. Gore’s documentaries An Inconvenient Truth (2006) and An Inconvenient Sequel:  Truth to Power (2017) are both excellent primers.  Naomi Klein’s On Fire is a recently published collection of her essays and lectures from the past decade, bookended by all-new opening and closing statements on why a Global Green New Deal is the blueprint for an ecologically sustainable and socially equitable twenty-first century:

The idea is a simple one:  in the process of transforming the infrastructure of our societies at the speed and scale that scientists have called for, humanity has a once-in-a-century chance to fix an economic model that is failing the majority of people on multiple fronts.  Because the factors that are destroying our planet are also destroying people’s quality of life in many other ways, from wage stagnation to gaping inequalities to crumbling services to the breakdown of any semblance of social cohesion.  Challenging these underlying forces is an opportunity to solve several interlocking crises at once. . . .

. . . In scale if not specifics, the Green New Deal proposal takes its inspiration from Franklin Delano Roosevelt’s original New Deal, which responded to the misery and breakdown of the Great Depression with a flurry of policies and public investments, from introducing Social Security and minimum wage laws, to breaking up the banks, to electrifying rural America and building a wave of low-cost housing in cities, to planting more than two billion trees and launching soil protection programs in regions ravaged by the Dust Bowl.

Naomi Klein, On Fire:  The (Burning) Case for a Green New Deal (New York:  Simon & Schuster, 2019), 26

Mirror/Mirror: On Seeing Ourselves in Fictional Characters

Over the past few months, I’ve been helping plan an old friend’s bachelor party, the experience of which has made me starkly aware of just how conservative I’ve become in middle age.  Not politically, you understand—personally.  When I was a kid, I was like Leo Getz in Lethal Weapon (I was seriously that annoying), yet I fancied myself Martin Riggs; somewhere along the way, though, I grew up to be Roger Murtaugh.

Riggs (Mel Gibson), Leo (Joe Pesci), and Murtaugh (Danny Glover) in “Lethal Weapon 2” from 1989 (Mary Evans Picture Library)

And that got me thinking about how, at different stages of life, we’re sometimes lucky enough to closely identify with a particular fictional character in an exceptional way; I would even say the experience is as random and as rarefied as true friendship:  How many times, really, have we “met” a character who speaks so directly to us, whose emotional circumstances so closely reflect our own, that through them we vicariously attain some measure of insight… and maybe even catharsis?

We’re not necessarily talking favorite characters here; those come in spades.  God knows, I love Indiana Jones and Jean-Luc Picard and Philip Marlowe and Chili Palmer, but I don’t necessarily—much as I want to—relate to those characters so much as admire their characteristics.  In that way, they’re more aspirational than they are analogous.

I’d like to know which characters from fiction speak to you—and for you.  I’ll get us started, selecting examples from three distinct phases of my life:  childhood, adolescence, and midlife.  (For those interested, I’ve included each narrative’s Save the Cat! genre.)


Tim Burton’s “Batman” at 30—and the Cultural Legacy of the Summer of 1989

In order to appreciate the state of commercial adolescence to which Generation X has been disproportionately consigned, one needs to consider Tim Burton’s Batman in its sociocultural context:  how it inadvertently provided a blueprint to reconceptualize superheroes from innocent entertainment meant to inspire the imagination of children to hyperviolent wish-fulfillment fantasies for commercially infantilized adults.


The weekly theatrical debut of a new franchise tentpole, voraciously bulling aside the $200 million–budgeted blockbuster released a mere seven days prior, is par for the course nowadays, but back in 1989—thirty summers ago per the calendar, though seemingly as recently as yesterday by the nebulous barometer of memory—we’d never before experienced anything like that.

That was the year that gave us new entries in such ongoing adventures as Indiana Jones, Star Trek, Ghostbusters, The Karate Kid, Lethal Weapon, James Bond, and Back to the Future, lowbrow comedies Police Academy, Fletch, and Vacation, as well as slasher staples Friday the 13th, A Nightmare on Elm Street, and Halloween—to say nothing of launching all-new franchises with Bill & Ted’s Excellent Adventure, Major League, Pet Sematary, Honey, I Shrunk the Kids, Weekend at Bernie’s, and Look Who’s Talking.  To anyone who’d grown up in the nascent home-video era—that period in which all the aforementioned series (save 007) were born and could thusly be re-watched and obsessed-over ad infinitum—1989 was the Christmas of summer-movie seasons.

Michael Keaton in Tim Burton’s “Batman” (1989)

But none of those films, huge as many of them were, dominated the cultural spotlight that year as pervasively as Tim Burton’s Batman, released on this date in 1989.

Out of the Shadows

I can hear my thirteen-year-old nephew now:  “One superhero movie?  Wow—how’d you handle the excitement?”

Yeah, I know.  But it was exciting.  I was thirteen myself in 1989, spending most of my free time with my grade-school gang at the neighborhood comic shop down on Broadway, steeped in a subculture that hadn’t yet attained popular acceptance.  Richard Donner’s Superman (1978) had been the only previous attempt at a reverent comic-book adaptation, and, creatively and financially successful though it was, most of that goodwill had been squandered in the intervening decade by a succession of increasingly subpar sequels (through no fault of the marvelous Christopher Reeve, who makes even the worst of them watchable).

Christopher Reeve and Margot Kidder in “Superman: The Movie”

As for Batman:  It’s crucial to remember, and easy enough now to overlook, that in the late eighties, the prevailing public perception of the character was not Frank Miller’s Dark Knight, but rather Adam West’s “Bright Knight” from the self-consciously campy acid-trip of a TV series that had aired twenty years earlier.  In the wake of that show’s cancelation, a concerted effort was made by the character’s creative custodians at DC Comics—first Dennis O’Neil and Neal Adams, then Steve Englehart and Marshall Rogers, and most effectively Miller with his aptly titled The Dark Knight Returns—to reestablish Batman as the “nocturnal avenger” he was originally conceived to be.

“Dark Knight Triumphant” (July 1986); art by Frank Miller and Lynn Varley

But if you weren’t following the comics—and, in those days, few over thirteen years old were—the predominant impression the name “Batman” conjured wasn’t the ferocious Miller rendering above so much as this:


All That You Can’t Leave Behind: On Memories, Memorabilia, and Minimalism

Here’s the story of a lifelong packrat’s unlikely conversion to minimalism.


Concert tickets.  Refrigerator magnets.  Christmas ornaments.  Comic books.  Trading cards.  Greeting cards.  Bobbleheads.  Bank statements.  Photo albums.  Vinyl records.  Shoes.  Shot glasses.  Jewelry.  Blu-rays.

What does the stuff we collect, consciously or unconsciously, contribute to the story of our lives?

And… what does it mean for us when there’s less of it?

Photo credit: Ticketmaster blog, June 26, 2015

In an opinion piece that appeared in the New York Times earlier this month, columnist Peter Funt laments the obsolescence of analog mementoes in a Digital Age:

And so ticket stubs join theater playbills, picture postcards, handwritten letters and framed photos as fading forms of preserving our memories.  It raises the question, Is our view of the past, of our own personal history, somehow different without hard copies?

Peter Funt, “Does Anyone Collect Old Emails?,” Opinion, New York Times, April 5, 2019

In recent years, I’ve expanded this blog from its initial scope, an exclusively academic forum on storytelling craft, to chronicle my own personal history, often in no particular order.  I am ever and always in search of a clearer, more complete, more honest perspective on my past, and how it has shaped the narrative arc of my life; I mine my memories regularly for content, and for truth.

I have also routinely expressed apprehension about the practices we’ve lost in a Digital Age, the kind to which Mr. Funt refers, particularly as that applies to the corrupted discipline of storytelling itself:  From the superhero crossovers of the “Arrowverse,” to the literary Easter-egg hunt of Castle Rock, to the expansive franchising of Star Wars, today’s popular entertainments are less concerned with saying something meaningful about the human condition than they are with challenging the viewer to catch all their internal cross-references.  Whereas stories once rewarded audiences with insight, now the reward is the esteemed privilege of calling oneself a superfan—a participatory designation earned by following all the breadcrumbs and connecting all the dots… an assignment only achievable if one never misses a new installment:

In a nod to the subscription model of consumption—where we lease cars or pay monthly to a music service—the extended narratives of prestige TV series spread out their climaxes over several years rather than building to a single, motion picture explosion at the end.  But this means energizing the audience and online fan base with puzzles and “spoilers”. . . .

. . . The superfan of commercial entertainment gets rewarded for going to all the associated websites and fan forums, and reading all the official novels.  Superfans know all the answers because they have purchased all the products in the franchise.  Like one of those card games where you keep buying new, expensive packs in order to assemble a powerful team of monsters, all it takes to master a TV show is work and money.

Douglas Rushkoff, Team Human (New York:  W. W. Norton & Company, 2019), 163

Fanboys and -girls thought they were legitimized when the geek subculture went mainstream—when superheroes and sci-fi went from niche hobby to pop-cultural monopoly—but they were really just commodified:  “geek” shifted from a stigmatized social category to a lucrative economic one.  Leveraging our telecommunications-induced FOMO, a new permutation of commercial narrative was contrived:  the “mega-franchise,” which seeks not our intermittent audience, but rather our habitual obedience.  Sure, you may not have even liked the last four Star Wars or Terminator or Transformers movies… but do you really wanna run the risk of skipping this one?

More is more: Every “Star Wars” character has its own backstory and action figure—collect them all!

So, given those two ongoing preoccupations—personal history and receding traditions in the Digital Age—the thesis of “Does Anyone Collect Old Emails?” would’ve spoken to me regardless, but the timing of it was nonetheless uncanny, as I have devoted no small degree of consideration in recent months to the matter of the physical objects we amass, wittingly or otherwise, and how they tether us to the past.  Here’s the story.


Maybe It’s Time: Here’s to Making 2019 the First Official Year of the 21st Century

“Maybe it’s time to let the old ways die.”  How ironically apropos that in a world led by a reality-show president, where facts are subjective and everything from our energy sources to our economic policies to our pop culture is an antiquated vestige of a previous century, a lyric by a fictitious rock star from a remake of a remake of a remake of a movie from 1937 should emerge as the perfect, hopeful mantra of an impending (if belated) new millennial era.  I propose officially adopting it as such; it might make what comes next a little easier to accept for those of us still clinging nostalgically to the 1950s (Baby Boomers) and the 1980s (Gen X).

If you belong to one of those analog generations—I’m an Xer myself—and you’ve ever had the frustrating experience of working with a Millennial, you know their nonlinear minds interpret the world in an entirely different manner than those that came before them.  The first wave arrived in the workforce a decade ago, expecting a seat at the table before they’d earned one, demanding their voices be heard before their opinions were informed by practical experience.  Their operating philosophy seemed to be:  Yeah, but just because we’ve always done it that way doesn’t mean we shouldn’t try it… this way.  In their view, the arduous, incremental, straight-line path of our institutionalized practices and protocols didn’t square with their hyperlinked grasp of our new Digital Age reality.  Thusly, conventional (read:  linear) thinking was to be openly challenged, not obediently emulated.

Like many of my fellow Xers who came up the hard way—those of us who knew our place, paid our dues (there’s that pesky sense of linearity again), and never assumed we had all the answers—I’ve often found that worldview bewildering at best, infuriating at worst.  And the sense of entitlement so endemic to Millennials is only compounded by their corresponding characteristic of impatience:

“They’ve grown up in a world of instant gratification.  You want to buy something—you go on Amazon, it arrives the next day.  You want to watch a movie?  Log on and watch a movie—you don’t check movie times.  You want to watch a TV show?  Binge!  You don’t even have to wait week to week to week.  Right?  I know people who skip seasons just so they can binge at the end of the season.  Right?  Instant gratification.”

Simon Sinek, “Simon Sinek,” Inside Quest with Tom Bilyeu, August 7, 2016

Now, to a middle-aged generation still trying (without success) to take the seat at the head of the table from the unyielding occupancy of the Boomers, the Millennials’ impulse—their self-ordained imperative—to grab the wheel and make “meaningful impact” is their most vexing attribute.

And—Christ help me for saying this—it just might change everything for the better.


Dreaming Dreams and Seeing Apparitions: On Writing Horror and Fighting Climate Change

It certainly occurred to me, ahead of last month’s post, that the blog’s left turn into environmentalism might’ve whiplashed those expecting the customary deep dive into craft or culture.  As part of our training as Climate Reality Leaders, we’re asked to reflect on our personal climate stories—the origins of our interest in the movement—something I’ve invested no small amount of time doing this past month.  To that end, it dawned on me that the very same formative circumstances inspired both my passion for horror fiction and my climate activism; they are not unrelated callings but very much part and parcel of one another.

It was at the confluence of the Harlem and Hudson Rivers, my old stomping ground, that many of my first boyhood adventures were undertaken.  My friends and I would scale the towering steel foundational girders of the Henry Hudson Bridge as high as we could climb.  We’d cross Spuyten Duyvil Creek by way of the century-old railroad swing bridge to explore the Indian caves in the vast, lush expanse of Inwood Hill Park at the northernmost tip of Manhattan.  (Incidentally, those caves feature prominently in the 2003 historical fantasy Forever, Pete Hamill’s centuries-spanning ode to Gotham.  Great novel.)

On weekends, my parents would drive us up the Hudson Valley—to Sleepy Hollow or Nyack or Bear Mountain—which was a particularly spellbinding delight this time of year.  It’s a truly magical region that in many respects looks just the same as it did to the Dutch explorers who first arrived in the early seventeenth century—and, more to the point, the Lenape Indians who called the valley their home for a dozen millennia before that.  For the conservation of this land, you can thank—and I can’t believe I’m saying this—J. P. Morgan.

And not just him—George Walbridge Perkins and John D. Rockefeller, too.  Owing in part to the efforts of these forward-thinking businessmen-philanthropists at the turn of the twentieth century, much of the woodland on the banks of the Hudson was spared from development, as were the Palisades, the magnificent cliffs along the west side of the river.  Consider it:  These capitalists preserved the natural harmony of the Lower Hudson Valley from the ravages of capitalism itself; on account of their preemptive actions, much of it remains to this day virgin forest to be (re)discovered by successive generations.

The woodlands just blocks from where I grew up in the Bronx (photo credit: Sean Carlin, 29 December 2012)

As a writer of supernatural fiction who continues to draw inspiration from this region—virtually all my stories are set there—I walk in the footsteps of literary giants.  Two of the first American authors—horror authors, no less—lived in the area and wrote about it:  Washington Irving and Edgar Allan Poe.  Savor the way Irving lets this “region of shadows,” pregnant with manes, cast a spell over his receptive imagination in the Halloween classic “The Legend of Sleepy Hollow”:

