Do we actually NEED pop culture?
We're all feeling like there are vanishingly few shared experiences anymore, but were we truly better off when there were? (A response to The New York Times, Ross Douthat, and Ross Barkan.)

In 2014, I interviewed a musician who told me something that’s stuck in my mind ever since. “Nowadays,” he said, “you can find bands that accurately re-create the sound of any style of music from any era. It’s like all time periods existing at once.”
“Wait,” I thought, “that’s a good thing???”
This musician, metal vocalist Greg Puciato, certainly thought so.1 He was quite enthused about the idea that we’d reached a point where listeners could now dial up a contemporary facsimile of whatever period they craved, as if the sum total of all recording achievement were now replicable à la carte. I, on the other hand, felt queasy.
With a handful of exceptions, I’ve always been repulsed by music that merely mimics what’s been done before.2 As far as I’m concerned — and I say this as someone who makes music myself — there’s something almost horrifyingly soulless about the endeavor to exhume old music-making techniques in order to wear them as a kind of artificial skin in the present.3 I view the majority of musicians who follow this path as grave robbers and cannibals who live in the creative equivalent of a zombie-like state, willfully allowing themselves to be body-snatched by their romanticized image of the past.4 Needless to say, I feel strongly about this.
Every deeply musical person I’ve met in my life will find some basis on which to draw hard lines around the music they approve of, so as to delineate it from the music they disapprove of. To the outside observer, these lines may appear arbitrarily drawn, even riddled with contradictions, but the deeply musical person will expound with evangelical zeal on why these lines are sacrosanct. They can — and will — go on for hours, as if discussing matters of life and death.
Actually, mere life and death pale in comparison to these Great Concerns. The things that fall on the other side of the music aficionado’s lines must be viewed as abominations that encroach upon our essential humanity on a cosmic scale. If you think I’m joking — or even slightly exaggerating — then you haven’t spent enough time with musicians or serious music fans, both of whom tend to be frighteningly incapable of discerning degrees of importance when it comes to things that run afoul of their preferences. All too often, they construct a kind of musical code of honor out of those preferences, missing the fact that the preferences begat the code — not the other way around.
Okay, I confess to being guilty as charged. But you should know that it wasn’t my pet peeves triggering my reaction during that 2014 interview. The apprehension I felt wasn’t just a matter of taste. Something more urgent was at play…
At that point, I’d already intuited for years that technology was inducing a mass case of sensory-emotional vertigo. Musician Greg Edwards (of the bands Failure, Autolux, A Perfect Circle, Replicants, and Lusk) would soon observe something along the same lines, referring to this condition as a form of “psychic decapitation.”
In 2018, Edwards offered the following, via the press release for Failure’s then-new album In the Future Your Body Will Be the Furthest Thing from Your Mind:
Living inside a screen seems like absolute freedom sometimes, but it’s more like a kind of psychic decapitation. We’ve made aliens of our bodies. We exist in an era where the most primitive structures in our brains are being rewarded and controlled — almost constantly — by extremely sophisticated, interconnected, and self-perpetuating technologies. There seems to be no imaginable way or real desire to moderate this. Artificial Intelligence may be creating itself right beneath our noses and using our bodies as unaware hosts. Everything is talking to everything else but there’s no communication anymore. Only divisions and their promotion. The surrogate reality of the internet sucks us out of our own bodies and puts us in a space where we can imagine we have less and less resemblance to the creatures we actually are.
A decade prior to Edwards’ insights, well before “AI” was a conversation I had any kind of context to understand, I could tell that something was wrong. The period we were living through, for one, felt distinctly flavorless — not on a purely aesthetic level, but at the level of actually being in the present. In hindsight, this makes perfect sense: if you cauterize your five senses with “virtual” engagement long enough, nothing around you will seem tangible anymore. Newly detached from the primal awareness that we exist in physical space, we found out the hard way what it’s like to be unmoored from the flow of time as well.
The suite of co-morbidities we unleashed with the advent of the smartphone has been exhaustively documented, but I think it’s instructive to zero in on one particular factor. For the first time in human history, we were observing ourselves in real time as we tried — also in real time — to lend definition to the present. We were now permanently stuck reminiscing in the present tense, as if we were looking back on the moment with enough distance to imbue it with perspective. But you can’t attain perspective before time has actually passed.
Little did I know in 2014, but writer and technology theorist Douglas Rushkoff had, a year prior, already cataloged this dynamic in his book Present Shock: When Everything Happens Now. In the preface to the book, Rushkoff writes:
We are not approaching some Zen state of an infinite moment, completely at one with our surroundings, connected to others, and aware of ourselves on any fundamental level. Rather, we tend to exist in a distracted present, where forces on the periphery are magnified and those immediately before us are ignored. Our ability to create a plan — much less follow through on it — is undermined by our need to be able to improvise our way through any number of external impacts that stand to derail us at any moment. Instead of finding a stable foothold in the here and now, we end up reacting to the ever-present assault of simultaneous impulses and commands.
My sense is that the disintegration of our temporal anchoring hit a nerve of anxiety in the culture at large. And the subliminal sense that the moment possessed no inherent character of its own triggered a panic. The knee-jerk response was to reach backwards in time in a vain attempt to grasp at something that simply could not be retrieved within the flat, dimensionless voids of Tumblr, Instagram filters, and American Apparel ads. In this context, we can look at the proliferation of music that evoked the past as a kind of topical balm for the feeling of dangling over an existential abyss.
To be clear, art that fetishizes the past is nothing new. Retro-style ska bands in zoot suits, for example, enjoyed a popularity boom during the 1990s. Two decades prior, the iconic TV show Happy Days struck a chord in the American psyche by tapping into then-lingering memories of the 1950s. And if you were of college age or older when Richard Linklater’s period film Dazed and Confused was released in 1993, that means you’re old enough to view the film today while bathing in the warm glow of nostalgia for both the ‘90s and the ‘70s.
The examples go on and on. Trends, of course, are cyclical, and the idea that they eventually resurface twenty to thirty years after going out of style has become axiomatic. Likewise, we could argue that all time periods feel flavorless while we’re experiencing them. After all, we only notice the haircuts, clothing fashions, car designs, and audio-visual characteristics — like film grain, typeface, recording ambience, etc. — after the fact. In the moment, however, all of those elements are as transparent as the air we breathe. We often don’t even realize they’re there until they’ve already slipped through our fingers into the mist of the past. Still, toward the late ‘aughts, I couldn’t shake the feeling that there had been an uptick in regurgitated trends.
Admittedly, I’m just old enough to remember the 1970s, which means I remember them vaguely. The feel of that decade flickers at the periphery of my awareness like the blurred edges of a dream — tantalizingly close yet just out of reach. By comparison, I can very clearly recall ephemera from the 1980s because I was old enough by then for my memories to solidify. As a result, I don’t take ‘80s nostalgia personally. Sure, it’s annoying, but there’s nothing threatening about it.
On the other hand, when actor Burt Reynolds’ mustachioed look came back into vogue in the mid-2000s, I found it deeply unsettling. The inevitable return of ‘70s fashion sense stirred the cobwebs of an almost Alzheimer’s-like space in my mind’s eye. I’m not actually sure it would’ve aroused such a strong response had the actual world around me started to resemble the set of ‘70s sitcoms like Barney Miller.
But there was something eerily symbolic about the way digital filters — a flat medium by definition — echoed the faded overtones of Polaroid film that you could once hold in your hand. Suddenly inundated with flat, deadened imagery, I got a feeling not unlike being in the kind of nightmare where nothing particularly scary takes place and yet the setting emanates a powerful frequency of dread. This didn’t just feel like a fashion trend coming back around. It felt more like a collective haunting.
Naturally, the “Tumblr aesthetic” now recedes further and further into the rearview mirror. (Look out, though! It already appears to be making a comeback.) Regardless, I would argue that the anxious yearning to recapture something essential in the comet trail of days long gone is still alive and well. Take, for example, a new piece by New York Times columnist and Matter of Opinion host Ross Douthat, in which Douthat writes:
There are certainly signs of ferment out there, in technology, religion and intellectual life. But I’m worried about pop culture — worried that the relationship between art and commerce isn’t working as it should, worried that even if the rest of American society starts moving, our storytelling is still going to be stuck. Or maybe not stuck so much as completely fragmented, with forms of creativity that are all intensely niche, like the podcast-splintered marketplace of news consumption.
Certainly that’s the feeling I had reading a lot of “best of” lists from movie critics this year. The films the critics really loved often felt incredibly marginal, more microtargeted even than the old art-house circuit. But the reviewers weren’t being unusually snobbish (my own favorite movie to date, “Anora,” has only made $13 million in North America); the list of genuinely commercially successful movies was just an incredibly dispiriting round of sequels and spinoffs and reboots.
[…]
It’s possible that the idea of an “important” work of popular art, like the idea of movie stardom, simply can’t survive the transition to the digital era. The journalist and novelist Ross Barkan has done interesting writing on this theme, borrowing from Bret Easton Ellis’s concepts of “Empire” and “Post-Empire” to describe a shift from the post-World War II culture that gave us big stars and big movies and Great American Novelists to a culture that’s too fractured for any artist to matter at that kind of scale. (Barkan argues that the brief cultural dominance of Taylor Swift and Travis Kelce was a fleeting throwback, like the last light from a dying sun.)
It would be a galactic-scale understatement to say that I’ve never been a fan of The New York Times. As far back as junior high, I was struck by what I recognized even then as an upper-class blindness that screams so loudly from the genteel tenor of its pages as to be almost deafening. For the record, I harbored no such disdain toward the rich kids with whom I attended an exclusive Manhattan private school. It’s not the people from that world who bother me; it’s the expressions that issue forth from that world, as if no other worlds exist.
With that said, the liturgical incantations of ghetto slang always hit me as equally elitist and myopic in their own way. Let’s just say I have an acute aversion to language that’s built to enclose people within a self-reinforcing outlook. In any case, it isn’t fair to blame Ross Douthat for my reservations about his employer — not any more, at least, than I would blame a hip hop artist for signaling his understanding of the world he comes from in exactly the ways his world demands to be recounted by its inhabitants.
But even still, it’s not like Douthat voices anything the rest of us haven’t been thinking for about twenty years at this point — namely, that the grip of pop culture as we once knew it has eroded to the point where arts and entertainment no longer furnish us with a common language. There are, as so many pundits, essayists, and culture critics have observed, vanishingly few events or phenomena that resonate with the public at large — at least in the moments those events are actually taking place.
Okay, so we’re losing our sense of shared experience. I suspect that few would argue otherwise. Among those few, though, media analyst James Brown — a longtime proponent of Rushkoff’s, and a friend of mine whose podcast Media Studies I appear on as a recurring guest — makes a compelling counter-argument. Shared culture, Brown offered on our latest livestream, remains as much of a presence in our lives as ever; it simply exerts less of a pull because there’s now far more niche material taking up bandwidth. As a result, our common lexicon of entertainment is more diffuse.

I find Brown’s argument persuasive, but I’m with Douthat — we’ve definitely lost something, and it feels permanent. Believe it or not, heavy metal/horror-film icon Rob Zombie’s 2019 comments on The Joe Rogan Experience align perfectly with Douthat’s points. About 15 minutes into the episode, Zombie laments the decline of television and radio as mediums through which the American public achieved a common fluency — or at least a working fluency that served as a kind of unofficial national language. As an example, Zombie reflects back on mainstream FM radio during the early ‘70s when, he says, disc jockeys might swing from playing The Allman Brothers to Diana Ross to KISS to ABBA.
“I’ll just listen to all of it,” says Zombie of the mindset at the time, “because it’s on the radio. There was something about being exposed to everything because there was nothing else.”
In 2023, Zombie revisited this theme as a guest on Howie Mandel Does Stuff, telling comedian Howie Mandel that he grew up on a TV diet that included the music shows Soul Train, American Bandstand, and Don Kirshner’s Rock Concert in equal measure. Later in the conversation, the once-universal reach of popularity comes up again.
“[It used to be that] if something was huge on the radio,” Mandel offers, “whether I liked it or not, I knew it was huge. If somebody was going out and doing stadium tours, I knew who they were. Whether I chose to go and see them, or whether I chose to buy their album, I knew.”
Zombie concurs: “Back in the day,” he answers, “there were artists that I didn’t even like, but [that I still instantly recognized]: Oh, that’s so-and-so. You would just hear it in the supermarket or wherever. It just permeated the culture. Now, it just doesn’t seem to.”
You’d expect that a weakened monoculture would lead to more artistic innovation, but Zombie argues otherwise in the Rogan segment. He posits that what Douthat describes as “microtargeted” distribution has led artists to narrow their horizons. Instead of responding to all the newly available lanes of creativity by taking chances, artists today pander to niche tastes — their own tastes, as well as the tastes of their audiences. All of us, it appears, have developed an expectation of being catered to. And it’s constricting our sense of adventure.
“If you hear a band, you go: Lemme guess what your favorite band is — the band you sound exactly like because you have no other influences. As opposed to: a lot of metal bands I know that are huge, they go My favorite band was actually ZZ Top, so we just decided to play ZZ Top riffs really fast and that’s how we created this [other thing]. But now everybody’s just so like I only like this [gestures with his hands to indicate ‘narrow’].”
Which brings us to our current predicament. If, as Greg Puciato indicated, both audiences and creators have unlimited access to previously established forms of art, then apparently we end up caught in a vicious cycle. The disincentive to make fresh creative choices has generated a feedback loop resulting in a failure to put our own collective stamp on the present. Rob Zombie, Ross Douthat, and others have all properly diagnosed the issue.
I lived through the Seinfeld phenomenon. In its prime, the era-defining sitcom had such a grip on culture that it aired eleven times a week in my TV market — new episodes during its Thursday evening primetime slot on NBC along with not one but two reruns every weeknight on my local FOX affiliate. None of us, as James Brown once told me off-camera, realized that Seinfeld would end up being the last TV show to register in the zeitgeist in such a profound way.
Likewise, I feel incredibly lucky to have seen summer blockbusters like Jaws, Raiders of the Lost Ark, E.T., Back to the Future, and Aliens in the theater. I remember the electric charge of anticipation coursing through the air in the run-up to both The Empire Strikes Back and Return of the Jedi. And even at the age of 4, I could grasp that the original Star Wars film had left an indelible imprint on the world around me. I also recall, with an almost narcotic rush of nostalgia, watching episodes of film critics Siskel and Ebert’s show At the Movies once a week on TV.
Along the same lines, I was present for the ascension of musical game-changers like Michael Jackson, Madonna, Motley Crue, Metallica, the Lollapalooza festival, Nirvana/Pearl Jam, etc. — to say nothing of countless other songs that, as Rob Zombie attests, were omnipresent during their moment in the sun. (I turned my nose up at many of those songs, but these days I tend to regard them fondly.) And let’s not forget that Rob Zombie’s original band White Zombie established a pop-culture presence only because the band was shouted out on Mike Judge’s animated MTV show Beavis and Butt-head, one of the most iconic and enduring cultural powerhouses of the Gen-X era.
If you offered me all the money in the world tomorrow, I wouldn’t trade the experience of having lived through those moments. Their value is so limitless that it simply couldn’t be compensated for by any other means. In that regard, I completely understand why so many people present the fragmentation of culture as a net loss. But is it fair to interpret our circumstances in those terms alone? I mean, after all, haven’t we gained something in this Faustian bargain too?
Take, for example, the late-’80s/early-’90s hard rock band Extreme, who made a splash in 2023 when the video for their then-new song “Rise” went viral — along, of course, with the online reaction to the song.
In a piece titled Hair Metal Is Back in Style Like It’s Going Out of Style, I wrote the following:
When Extreme released “Rise”, the leadoff single from their new album Six, this past March, [Nuno] Bettencourt’s guitar solo generated something of a viral sensation. Justin Hawkins of the Darkness posted a video titled “Holy F***ing Sh*t. I Can’t Handle This” lauding the solo, while a clip by the popular YouTube commentator Rick Beato analyzing the same solo has, to date, drawn two million views. Beato’s subsequent interview with Bettencourt has been watched over a million times, and the album’s first three videos have, as of this writing, racked up more than seven million YouTube views combined. In June, a week after the album’s release, Extreme frontman Gary Cherone told radio host Eddie Trunk that he was “overwhelmed” by the response. [All of those YouTube numbers have, of course, gone up.]
Prior to 2023, the last time Extreme was quote-unquote “relevant” was in 1990-91, on the strength of the smash-hit acoustic ballads “More Than Words” and “Hole Hearted.” If we were to revive the old paradigm, there’s no way a band that had dropped so far off the radar would stand a chance of generating the buzz we saw Extreme stir up in ‘23. It bears repeating that the Boston quartet hit commercial paydirt three and a half decades ago, when it was standard for trends to usurp incumbent trends like bloody revolutions. Had culture not splintered in the way that it has, Extreme would most likely be extinct today.
In the current climate, however, artists that are well past their sell-by date in terms of mass appeal can still function — even thrive — by scaling back their operation while still managing to reach their audience. Sure, the official YouTube clip of “More Than Words” has been viewed more than 786 million times — and it was only uploaded in 2009. “Rise,” meanwhile, hovers at the 5-million-view mark. That sounds like a massive margin until you consider that the official “Hole Hearted” video has been viewed 15 million times (also since 2009) — which puts a brand-new single within shouting distance of one of the band’s two signature hits.
Either way, if we were to ask the fanbases of bands like Extreme whether they would rather revert to the way things were, what do we think they would say? Are we to presume that fans who still get to enjoy their favorite bands in the present feel like something’s missing from the entertainment ecosystem that’s given those bands new life? Do we think those fans crave the tawdry spectacle of pop mega-star Taylor Swift’s relationship with NFL player Travis Kelce? Why on earth would they? The upshot, after all, of Greg Puciato’s “all time periods at once” reality is that all trends can exist at once too. They don’t have to die off, nor does anyone need to be left behind by the unforgiving churn of fashionability.
Could it be that culture will continue to be shared, only within smaller spheres of people? And is it possible that the existence of global-scale narratives is an anomalous occurrence in the first place? What would the likes of Joseph Campbell and Carl Jung predict about where we’re headed? If indeed human beings have an intrinsic need for shared narratives, will we not eventually get back around to creating them and passing them around? Or does the atomized state of media distribution reflect an equally intrinsic need to find sanctuary from the oppressive boot heel of mass popularity?
I personally have never cared whether the music I love is the most popular stuff out there. In fact, I much prefer when it isn’t, because that ensures that the music is allowed to develop away from the corrosive glare of superstardom. Nirvana’s meteoric rise, for example, effectively killed them off as a band, not to mention that it caused an existential crisis among a whole scene of bands whose driving ethos was to exist in opposition to the mainstream.
More important for the sake of this discussion: though Nirvana obviously set off an earthquake within the music industry, they did little to change my life as a listener. And I would go even further and argue that Nirvana may have changed music as a business, but not as an artform. That’s because there have always been less-visible — but not necessarily less functional — strata of popular music.
When the rock band The Strokes made a splash in the immediate wake of 9/11, the then-upstart blogosphere got busy hailing them (and the rest of the downtown New York scene they represented) as a “return to guitar rock.” Except here’s the problem: guitar rock hadn’t actually gone anywhere. I can say this with assurance because, at that point, I’d just spent the last few years working live sound at a rock club that became a premier destination for indie and underground bands like At The Drive-In, The White Stripes, etc. Three nights a week, I stood behind the soundboard as band after band whirred through town, shitloads of guitars in tow. “Guitar rock” may have slipped from the upper echelons of mass awareness, but that didn’t mean that what I’d just witnessed with my own eyes and ears hadn’t happened.
If, as Ross Douthat and Ross Barkan observe, the current state of pop culture has endangered the apex-predator megafauna acts at the top of the food chain, it’s worked wonders for anyone who can scratch and claw their way to establishing even a modest foothold. It may be harder than ever for artists to subsist following a direct-to-consumer, farm-to-table business model, but there was far less infrastructure for that kind of model thirty years ago.

There are, of course, parallels between today’s art/entertainment landscape and what Douthat sees as “the podcast-splintered marketplace of news consumption.” The presumption here is, again, that fragmentation is a bad thing — no surprise coming from an old-media flagship and avowed information gatekeeper like The New York Times. Over the last 10 years, outlets like The Times have relentlessly proselytized for black-and-white stances on complex issues like youth gender “care,” vaccine mandates and, say, racial-essentialist notions that so-called “people of color” are disinclined towards being on time or being precise with math.
If we’re in the business of handing out speeding tickets at the proverbial Indy 500, we need to recognize when mainstream ideas sound at least as batshit insane as the most fringe and supposedly “dangerous” viewpoints that crept in through open crevices in the floorboards. In a nuance-averse information monoculture that militates against skeptics, free thinkers, and even decorated scientists — tarnishing their reputations and tarring them as enemies — I would urge you to be cautious about how freely the same mechanism arrives at determining who is and isn’t hip. That mechanism isn’t built to protect you from danger — it’s only built to numb you to the dangers it routinely exposes you to.
In that light, all the clamoring to “make pop culture great again” should be viewed as a desperate, last-ditch attempt at maintaining power over a culture that no longer has much use for gatekeepers. Sure, it’s more difficult to distinguish signal from noise (i.e., truth from lies, information from propaganda, media presence from grift) these days, but how much of the signal was noise before? Gatekeepers, alas, were never dependable at discerning truth — not in art, culture, or politics. In all three domains, I’ll gladly take my chances in the ruins of the Tower of Babel. It may take more work, but I’d rather do the sense-making myself than outsource it to someone who sees it as their god-given right to do it for me, thank you very much.
With all that said, do we need pop culture? Well… yes and no. But whether or not mass-scale entertainment as we know it survives, it’s probably high time we show the gatekeepers the door. <3 SRK
1. Puciato was at that time the vocalist of the pioneering extreme technical/math-metal group The Dillinger Escape Plan. What he said to me at the top of this post didn’t make it into the piece I ended up extracting from our discussion, but it’s what I remember most about the interview. I quoted him here as best as I could from memory because the interview audio is sitting on an external drive in a cardboard box in a storage unit. I can’t remember if I said “That’s a good thing?” out loud in response or if I just thought it to myself.
2. There’s a gulf of difference between mimicry and inspiration, between plagiarism and drawing from the past to arrive at something new. If you’re even remotely creative, you can tell the difference. Pretending that we can’t is a fool’s errand. Sure, the line can be blurry sometimes, but anyone who insists on that hoary old maxim that “all great artists steal” isn’t an artist — or at least not an artist who’s staying true to their artistry.

So if you find yourself with the words “but all art is derivative” crossing your lips, it’s because you’re rationalizing your (or someone else’s) laziness. And FYI, “all great artists steal” is often cited out of context. Though attributed to figures from Igor Stravinsky to Pablo Picasso, the line traces back to T.S. Eliot, who actually meant it as a repudiation of artists who simply copy others.
3. Let me be clear: I love the work of countless artists who wear their influences from the past on their sleeve. In fact, virtually all musicians channel previous musical generations in overt ways — we simply can’t help it. When we get creative, the music we love listening to seeps up through our pores involuntarily, like sweat. And there’s no reason whatsoever to hide it. Musicians actually should openly wear their influences on their sleeve.
When we let the creative process unfold organically, our influences will inevitably sound fresh — and indeed authentic — as they work their way through our fingertips. I’m convinced that any musician, if they’re following their imagination and letting it do the work of leading them, will arrive at their own individual style. It usually takes some time for that style to coalesce — hence the term “finding one’s own voice” — but as I see it, it takes effort to be unoriginal. You have to put in work to suppress the part of you that’s actually you.
4. My issue is with artists who set out to re-create the past much like a set designer, for example, would re-create the past for a period film. When musicians take that kind of approach, I view the sounds they make not as music, but as cosplay. And (for the most part) I view the musicians themselves as play actors, not as artists. Or, perhaps, as people who’ve undermined their own creativity by stuffing themselves into a mold someone else has already created.
I’m aware that copying other artists is a built-in feature of certain musical forms, like the blues, for example, or the entire canon of Western classical music. Folk music styles the world over are, in fact, meant to be preserved over time, while Indian classical and griot traditions are essentially mandated to be passed down intact from one generation to the next. On a related note: Shakespeare, of all people, “routinely stole plotlines and even whole scenes”!