Not Happily Ever After: Why the Disney Renaissance Ended…


With the recent slate of Disney films arriving in theaters, you could be forgiven for thinking we’re back in the 1990s. In the past two years, Disney has released live-action remakes of Beauty and the Beast, Aladdin, and The Lion King. Next year Mulan will follow suit, and The Hunchback of Notre Dame is now officially in the works too. All of these films are based on blockbuster animated features that Disney premiered in the 1990s.

They were part of the illustrious era of animated feature films known as the Disney Renaissance, which ran from 1989 to 1999. And frankly, it’s easy to see why these films still seem relevant, chronologically and culturally: the generation that grew up with them, mostly Millennials born between the early 1980s and the mid-1990s, has yet to reach age forty at the oldest, and has barely finished college at the youngest. They’re also a notoriously nostalgic group, like most Americans, but with an even more ubiquitous platform to perpetuate that nostalgia: social media and the Web.

They grew up with the Disney Renaissance and have ostensibly kept the relevance of its animated features alive—literally two decades after its official end. With at least half of the films from the era slated for reinterpretation as major live action films, their enduring grasp on popular culture is unquestionable.

As a member of this generation and a bona fide Disney buff, I can attest that the memories of these films are indeed as vivid and fresh today as they were when I was the target age for their appeal. They are touchstones of my childhood, imbuing me with a sense of wonder and a worldview that I still hold onto to this day.

Nonetheless, the Disney Renaissance did have a clear beginning and end: no new films were added to its original run after 1999. As any Disney fan or even a casual observer will recall, subsequent animated features from Disney suffered a steep drop in popularity for nearly a decade afterwards, despite a continual output of releases during that time.

As a fan with unrelenting exposure to these animated films, I have arrived at a conclusion as to why the phenomenon of the Disney Renaissance drew to a close at the end of the last century.

My theory is rooted in the catalyst that started the Disney Renaissance and made it popular in the first place. Kick-starting the era in 1989, The Little Mermaid was the first fairy tale the Disney studio had tackled in thirty years. That was largely why it was a resounding success: it returned to the formula that had worked so well for Disney in the past, a fairy tale musical centered on a princess. Disney had erroneously abandoned this formula in the decades prior, and had suffered commercially and artistically with audiences as a result.

In hindsight, however, I believe the rediscovery of this Disney formula during the Renaissance era would also prove to be its undoing. If the era had one fault, it was that nearly every film adhered stringently to the formula, the lone exception being The Rescuers Down Under, an early entry that, tellingly, proved to be a commercial misfire. Every other animated feature between 1989 and 1999 consisted of a beautiful or outcast protagonist (usually missing one or both parents), amusing sidekicks, a romantic interest, a colorful villain, an epic setting, and a roster of songs covering the requisite themes of life goals, romantic love, and villainy. This is not to dismiss the considerable creative and technical achievement of the era, which produced some of the most ingenious, memorable, and astonishing feats of song, visuals, and character, not to mention an unprecedented level of representation from Disney (lead characters of different ethnicities: Aladdin, Pocahontas, Mulan).

Nonetheless, it’s quite revealing that no previous era of Disney animated features could be accused of the same homogeneity: the Golden Age, from 1937 to 1942, had only two films that featured romantic love, only one princess, and two clear musicals. The Silver Age likewise had several films without romantic love or a musical format. These two eras are arguably the Renaissance’s biggest rivals in popularity and artistic achievement, and both ended for their own disparate reasons: the economic downturn after 1942 due to World War II, and the death of Walt Disney in 1966, respectively.

Signs of this redundancy may have begun to show as early as the midway point of the run: after four stellar blockbusters from the studio, things suddenly slowed down with the release of Pocahontas in 1995, widely viewed as the first critical letdown of the era. Things didn’t immediately return to form in 1996 or 1997 either, with The Hunchback of Notre Dame and Hercules both failing to draw audiences back after the misstep of Pocahontas. Something was amiss.

Audiences were being drawn elsewhere too: computer animation. This is perhaps the most commonly held theory of why the Disney Renaissance came to an end: towards the end of the 1990s, a new medium was dawning, usurping the traditional, hand-drawn (2-D) animation that Disney was known for. With the release of Toy Story in 1995, a resounding success not just as the first major computer-animated feature but as a true triumph of story, audiences found a new outlet for family-friendly films that appealed to all ages. A slew of computer-animated (or CGI) films followed in gradual succession for the rest of the decade and beyond, none of which followed the renowned Disney formula—often to critical and commercial success surpassing even Disney’s. If the Disney Renaissance proved that animated features could appeal to all ages, CGI films proved that they didn’t have to be based on classic, existing literature—opening the door for innovations in story that happened to be married to a very innovative technology, one coming into its own at the end of the twentieth century.

Although I agree that CGI certainly disrupted Disney’s momentum in the latter half of the 1990s—particularly since CGI animated features have remained more popular with audiences and critics alike, and 2-D animation has never come back into vogue since—I still stand by my theory that it was a matter of content more than medium. The onslaught of CGI feature-length films actually arrived rather slowly, and did not immediately crowd the market that 2-D Disney animated features dominated: after Toy Story was first released in 1995, the next CGI films were Antz and A Bug’s Life, both premiering in 1998. That left three full years in between, during which Disney released three animated features that could have vigorously filled the void and maintained its stronghold on audiences—yet they didn’t. The Hunchback of Notre Dame, Hercules, and Mulan came out during this period, and though not critical or commercial failures, they were far less renowned than their predecessors from the Disney Renaissance. Again, a malaise seemed to have settled over audiences, which could be read as a reflection of what the studio’s 2-D output was offering. Audiences surely weren’t just holding off for the next great CGI film after witnessing the medium’s sole initial release in 1995; the average moviegoer had no idea how CGI would eventually fare, though it was clearly a technological advancement. (It wasn’t until 2001 that the medium exploded with simultaneous releases of multiple CGI animated films, cementing it as a mainstay in cinema.)

It was apparent that audiences had simply grown tired of the Disney formula, and so the business changed after 1999, just as it did after the Silver Age in 1967—when the studio entered a prior era of commercial and critical malaise, following the death of Walt Disney.

With that, it’s also helpful to understand what followed the Disney Renaissance: from 2000 to 2008, the Post-Renaissance era echoed the era that followed the Silver Age, the Bronze Age of 1970–1988, when the studio likewise struggled to redefine its identity to audiences. The resulting films in the new millennium reflected these efforts, striking into new territories such as science fiction, original stories not based on classic tales, and even the CGI medium itself, a portent of the studio’s eventual direction. Most of the films from this era didn’t resonate enough with audiences to become classics.

The Revival era followed in 2009, with yet another rediscovery of the formula: Tangled cemented Disney’s return to the zeitgeist, followed by Frozen. Both were clearly fairy-tale musicals centered on a princess, but married to the CGI medium, to which Disney has since converted entirely to fit the times. Aside from the new look, these films hew quite closely to the Renaissance formula. Audiences responded and propelled them into the public consciousness just as they did in the Renaissance era, hence the era’s name.

But if these new films from the Revival are following virtually the same formula as the Renaissance, why did the Renaissance cease in the first place? Shouldn’t it have endured, unabated, by sheer demand?

Again: we just needed a break. As a lifelong Disney fan, with the benefit of hindsight, I can’t fathom a Disney Renaissance-style film being released by the studio every year for two more decades after Tarzan closed out the era in 1999. On some level I would enjoy it purely as a diehard fan, but it would almost become campy—a parody of itself, if you will. As much as audiences loved the Disney Renaissance, we could also sense artistic malaise. The formula had gotten monotonous and stale—again, already by the midway point of the era—and audiences clearly reacted with their wallets.

Does that mean the Revival era is doomed to repeat history? Surprisingly, it may be averting this fate: although it has certainly resuscitated the Disney formula, one telling factor separates it from the Disney Renaissance—it’s not following the formula for every new film. To the studio’s credit, and maybe by calculation, it isn’t doing princess stories or fairy tales exclusively. Maybe that’s part of the era’s success: Big Hero 6 and Zootopia are about as divergent from fairy tales and princesses as you can get, and both met with clear commercial and critical acclaim—unlike the misfires of previous eras that also strayed from the formula.

Whether audiences realize it or not, perhaps this is what they need. We will always love and adore princesses and fairy tales, but there needs to be variety. There’s nothing wrong with having a studio trademark (family-friendly films, music, artistic visuals), but a trademark can be broad and can expand. Art is about change, pushing boundaries, and expanding possibilities. Sure, we do like some degree of familiarity—all art has a core familiarity: a movie has a beginning, middle, and end; music has notes and instruments, verses and choruses. But along with familiarity we need variety.

Perhaps Disney has a new, unprecedented confidence and competency that is allowing it to achieve something it wasn’t quite able to do in the past: successfully tell classic stories and original new stories, concurrently. Disney may have failed at pursuing either one at various times in the past, not because either was theoretically bad, but because the studio just wasn’t truly creating something exceptional. As mentioned, it was going through the motions after a certain point during the Disney Renaissance, settling into a creative ennui—or, alternately, striking into new territory with dubious artistic vision during the Post-Renaissance, for example. But if a story is truly told well, it can succeed. Audiences will respond to something special even if it defies current trends, as they did when The Little Mermaid revived a formula that had lain virtually dormant for nearly two decades, kick-starting the Disney Renaissance in 1989.

Will Disney get it right this time? We can only wait and see.

Any long-running institution is going to experience inevitable peaks and valleys in relevance and vitality—hence the different eras of Disney feature animation in the first place. I am resigned to the fate of the storied Disney Renaissance of my youth because, to borrow a platitude, good things can’t last forever. Sitcoms come to an end. Book and film franchises end—and can even be revived after a certain period of absence (sound familiar?). The much-beloved Disney Renaissance is all the more revered because it wasn’t permanent; its limited duration lends it a rarity that further incites gratitude and veneration. It was beautifully fleeting, as all life is.

It’s almost as if the beginning and ending of each era were inevitable—because like all art, Disney feature animation is an evolving medium. The studio is learning its craft in real time, and we get to watch it unfold onscreen.

 


25th Anniversary: ‘Live Through This’ album review


The rumors about Courtney Love were true: her band’s second album is brilliant. From the deceptively underplayed riffs of opening song “Violet” to its explosive chorus, with Love’s rebel yell backed by her four-piece band, Hole lays the groundwork for an album that flexes considerable muscle for an alternative rock movement then at its peak. It stands as one of the genre’s seminal works.

The elephant in the room is neither ignored nor obsessed over on this album; it is pointedly demolished: can a woman rock legitimately, without negating her femininity?

Love wins, because she has it both ways: she’s so good her gender’s not even relevant, which makes the revelation all the more remarkable. She’s a natural: charismatic, dangerous, cocky, defiant, funny, tender, and poetic. That she happens to wear baby doll dresses is moot.

And the answer is a resounding yes: feminine themes are laced throughout the album’s lyrics and sound, but not at the expense of the genre’s nihilism. Just as Love’s voice can command and dominate with raspy force, it can flirt and dance with a showgirl’s glee.

‘I am the girl you know, can’t look you in the eye,’ Love declares in the raucous first single, “Miss World”. Mixing her favorite concepts of glamour and destruction, the song nakedly implores ‘Watch me break/And watch me burn’, before crunching everything under a guttural chorus: ‘I made my bed, I’ll lie in it’.

Most of the album employs this soft/hard dynamic that dominates the genre, with a few heavy exceptions. “Plump” churns hard guitar riffs like gunfire while Love subverts feminine expectations: ‘I don’t do the dishes/I throw them in the crib.’ “Jennifer’s Body” skitters edgily along until exploding into power pop/rock riffs rivaling any hard-rock contemporaries.

Elsewhere, the slow-burn cautionary tale “Doll Parts” lays down its lyrical and stylistic groundwork expertly, without a hint of artiness: an artist’s dream in the form of twentieth-century grunge rock. ‘Someday you will ache like I ache,’ Love forewarns in the chorus, changing the inflection slightly at every reprise until it bears multiple meanings.

A lone guitar riff periodically accents the throbbing bass showcase of the album’s quietest song, “Softer, Softest”, titillating you just as you’re being soothed by the song’s languid spell. It’s these simple but unexpected sonic twists that captivate and challenge listeners.

Throughout the album, we’re reminded again of the ineffable power of music—what can be achieved by the arrangement of chords and beats from a few instruments in different variations. No matter how crude and humble the parts are, the sum can be transcendent.

The album’s lyrics alone are exemplary too—born from the best conversations neo-philosophers dream of and budding screenwriters would sacrifice a rent check for: ‘If you live through this with me/I swear that I will die for you’, begs the song “Asking for It”. ‘I fake it so real I am beyond fake,’ Love concedes in “Doll Parts”. ‘I don’t really miss God/But I sure miss Santa Claus,’ quips “Gutless”. None of the lines feel precious or pretentious, furthering their impact.

Like the lead singer herself, though, it’s not an easy album to accept at face value. Its compelling sheen is on alternative-rock terms; this is not your grandmother’s female rock star. Many music fans simply won’t have the palate to welcome it, and it’s their loss.

For fans of alternative rock and true music connoisseurs, however, it is undeniable. “Live Through This” is a stroke of genius in its sonic dynamics, thematic scope, and lyrical potency. It’s rife with excoriating ruminations set to indelible hooks that seduce and assault you simultaneously, daring you to embrace and question yourself and the world—like the best rock music does.

‘Mid90s’, middle ground: lacking inspiration.


Mid90s proves that standard movie tropes remain familiar no matter how you dress them up. And first-time director Jonah Hill has certainly earned kudos for dressing his new film up to fit its epochal title: one only has to glimpse a few grainy frames (purposely shot on 16mm film for added effect) to be transported back to the last days before the millennium: compact discs, baggy clothes, big hair, and of course a nostalgic soundtrack by a seminal voice of the era, Trent Reznor.

Although the title references an entire cultural zeitgeist, the film is far from being all-encompassing in scope or subject. Instead, it’s an insular story built on specificity, resting under a rather prosaic and vague title for lack of keener inspiration, which is its biggest flaw.

The story begins in Los Angeles during its titular time period, with a preadolescent boy named Stevie. Hounded by his boorish older brother from the opposite end of the adolescent spectrum, and given free rein by a laissez-faire mother suffering from arrested development, Stevie is primed for one of cinema’s biggest clichés: a summer he’ll never forget.

This leads into another hallmark of the period, the skateboarding underworld, as Stevie sets his sights on befriending a group of older boys at the local board shop.

As soon as he unremarkably worms his way into the affections of the boisterous but nonthreatening slackers, his story ticks off the requisite milestones of coming of age and of its film subgenre: exhilarating new experiences, wise mentors, chafing against his family, high jinks that just skirt the line of true danger and serious trouble.

Since the plot is a standard framework, the question is whether the parts make up for the whole. Stevie is competent enough as a protagonist: he fits the bill in looks and temperament, without hitting any false notes. But the home life he shares with his threadbare family never truly generates a sense of urgency, which curbs any added weight to his arc. Stevie’s older brother and young mother aren’t guilty of anything beyond typical dysfunctional fare: physical taunts from the former and distraction from the latter. As for Stevie’s newfound entourage: they border on caricatures, with raunchy nicknames and slight characterizations about as nuanced as a junior high yearbook.

The film suddenly hits a climax that can only be described as inorganic and, again, contrived—but this is in keeping with its steadily innocuous tone. Mid90s doesn’t seek to innovate or make a statement. It’s a light tale that never truly triumphs or fails abysmally, inhabiting a safe middle ground of familiarity, made all the more evident by its use of epidemic-level nostalgia for a past era that’s bound to pique audience interest. That nostalgia is the only true star of the movie; without it, the film would lose half of its distinction.

The View vs. The Talk


I love watching women gab. As sexist as it sounds, I’ll just say it: they’re good at it. I imagine it’s the equivalent of people tuning in to watch physically fit men play sports. Also, if I really want to fully commit to being politically incorrect: maybe it’s part of my DNA as a gay man to enjoy hearing women yak about everything from the profound to the frivolous. I can relate, and it’s fun.

Since the beginning of this decade, we’ve had two major venues for watching the biggest and brightest women in pop culture do just this, on daytime TV in the U.S.

Venerated journalist Barbara Walters set the precedent in 1997 with a show called “The View”—featuring ‘different women with different points of view’ sitting around a table and discussing the day’s biggest headlines. It ruled the roost as the lone standard for the concept until 2010, when former child actress Sara Gilbert had the sterling idea to do an offshoot of the format (with the angle that it would consist of a panel of “moms”—although its predecessor never played down the maternal status most of its panelists could claim too). As a viewer, though, I wasn’t discerning—it made sense: in a nation as large and diverse as ours, one of the benefits is how we can expand on commodities… like talk shows. After all, there have been multiple late-night talk shows for decades now, competing directly with one another and thriving in their own right regardless of the saturated market. So when a new daytime talk show featuring a panel of half a dozen women discussing topics in the news with their “different points of view” popped up, we took it in stride.

Both “The View” and “The Talk” have succeeded with viewers and been nominated for the same daytime Emmy awards throughout the years, solidifying their place in the pop culture lexicon.

But is there a real difference between the two, or a clearly superior one?

“The View” has the advantage of experience on its side: a thirteen-year head start on its rival. With all that time, it has seen and done plenty it can learn from. Infamously, it learned that placing two panelists who are diametrically at odds with one another is ratings gold: when outspoken liberal Rosie O’Donnell was recruited as the show’s moderator in 2006, during the contentious Bush/Iraq War years, the writing was on the wall—she would ultimately clash spectacularly with then-outspoken conservative panelist Elisabeth Hasselbeck the following year. It was the best daytime drama that needed no script.

The show also has the undeniable class factor that only a highly respected figure in the journalism field like Barbara Walters can provide. Although “The View”’s reputation has ebbed and flowed as any long-running entity is prone to, its pedigree is still rooted in solid stock.

It’s not without its trials. The show has “jumped the shark” as much as a talk show can, in the sense of creative and production malaise. Since the 2010s, there has been highly visible turnover among the show’s panelists—it’s hard to even keep up with who’s officially on the roster these days, like watching your favorite sitcom characters get replaced by new actors or new characters you just don’t care for. Many of the new recruits were blatantly regrettable as well (Candace Cameron Bure and Raven-Symoné dragged down the credibility of the show, imho! Thankfully, their tenures were brief). The show has even rehired previously retired or departed co-hosts such as longtime favorite Joy Behar, Sherri Shepherd, and even Rosie O’Donnell herself (who ultimately stayed for only one season again in 2014, mirroring her infamously clipped first round).

“The Talk” also tinkered with its lineup after its debut season, though that is to be expected of a fledgling show. It found its footing with a consistent lineup afterwards, and has had only one panelist replacement since.

Another difference with “The Talk” is its lighter emphasis on formality. The show humors its audience by directly asking them questions after bringing up a headline—whether a serious news story or celebrity gossip, moderator Julie Chen will offer a concluding prompt to encourage monosyllabic responses, boos, hisses, or laughter from the live audience, reminiscent of, well, a daytime talk show (a 1990s version, more so).

Since the show is filmed in Los Angeles, another distinction from its New York City predecessor, it also features a daily celebrity guest correspondent who contributes a pop culture headline (adding to the pop culture news that inevitably permeates the show anyway), in a segment loosely dubbed “Today’s Top Talker”.

As one can guess, “The View” and its reputation skew toward a more serious, politically themed show. Although its current longtime moderator Whoopi Goldberg is a veteran Hollywood actress, she is outspokenly political and even good-naturedly mocks the more frivolous pop culture news she’s required to broach regularly (read: reality show fodder).

Other panelists, regardless of how short their tenures have been in recent years, have frequently been renowned political pundits as well, something “The Talk” has steered clear of completely. Currently, Senator John McCain’s daughter Meghan McCain is the resident conservative Republican on “The View”.

“The View” has also expanded its most well-known segment, the roundtable discussion dubbed “Hot Topics”, from just a third of the show’s running time to half or more now, betting on attention-grabbing headlines and the often heated exchanges between the ladies on the panel to sustain viewers.

Both shows feature the requisite celebrity guest interview in their latter half. Again, “The View”, naturally more political, regularly invites political figures such as former President Barack Obama, as well as political commentators. “The Talk” relies entirely on celebrity guests, occasionally ones that are not even major draws. This is moot for me, since I only tune in to each show to watch the ladies yak amongst themselves in their roundtable segments.

Judging each show by my own proclivities, I do have a clear conclusion about which one succeeds more. “The View” wins out for me, for the reasons above—it has more legitimacy but is still able to delve into melodrama, camp, and frivolity. Although its high turnover rate is unnerving and dispiriting, it has enough mainstay power players to anchor it. As a child of the 1980s and 1990s, I have a bias for Whoopi Goldberg as a pop culture fixture. Comedian Joy Behar’s sassy Italian schtick hasn’t gotten old—or perhaps, twenty-one years into the show’s run, I’ve simply grown attached to her presence. As for the rest of the current panelists, I feel neither strongly for nor against them. Sara is the bright blonde who keeps things light, or at least centered; Sunny adds more diversity and a touch of primness. Meghan obviously serves as an antidote to the clear liberal slant of the show’s two veterans, and for the most part I enjoy her dynamic. Not to paint her as an archetype, but I love a good “nemesis”, and Meghan is one by default, constantly having to defend her political party whenever President Trump drags it through the mud, which is often.

“The Talk” is sufficient, but my taste doesn’t quite extend to audience participation and an overabundance of pop culture fluff. And although it currently boasts the steadier panel lineup, I’m not especially fond of any of the panelists: moderator Julie Chen is too proper; Sara Gilbert is insightful but staid as well; Sharon is the venerable one who’s been around the block—but is a bit too mannered and biased in her outspokenness; newcomer Eve hasn’t proven her worth yet beyond tugging down the group’s median age; and Sheryl Underwood plays up the sassy black woman trope a bit too much.

Each show brings something to the table, and it’s merely a matter of taste. For me, the edges that separate the shows mostly blur together. They’re like two sitcoms with an overlap of similarities and differences, and I like them both for different and similar reasons.

Why do people love the 80s?!?! (Try the 90s!)


America’s unnatural love of all things 1980s is like society’s reverence towards pregnant women: you can’t really counter it without sounding like a complete monster. But since I’m already an inherent outcast twice removed, I guess I’ll be the brave soul to take a stab at it (the ‘80s).

They say trends come in twenty-year cycles. I was born in the ‘80s, and I remember, as a preteen, being glad when all the saccharine gaudiness of the decade vanished by the early 1990s. Little did I know that it would all come skipping back in an even more mannered, pretentious form ten years later, when I was in my TWENTIES, in the ‘00s.

By 2003, you couldn’t surf the web without coming across an article proclaiming “Check out your favorite redheaded ‘80s celebrities HERE!”, hearing a song that sampled a classic ‘80s synth-pop ballad, or having a conversation with a grown woman squealing: “Ohhh, I LOVE the ‘80s!” Basically, ‘80s nostalgia was like crack in the ‘80s: integral to the social scene.

If you can’t guess by now, I have highly objective reasons why I don’t like the ‘80s. I came of age in the decade that succeeded it: the ‘90s. When I say “come of age”, I mean the (first) era of maturing in one’s life—your teen years.

Nothing is as great (or as bad) as when you are a teenager. If I had come of age during the 1890s, no doubt I would be sitting here clamoring about how great churning butter was, and how kids these days are missing out on savoring fermented cow’s milk procured with your own two hands. So I’m aware that I suffer from a little bias.

I feel sorry that kids today didn’t grow up with angry, forlorn, edgy alternative-rock singers who managed to be somehow both dangerous and mainstream in that perfect window of time known as the 1990s. It was a truly magical time. I mean, MTV not only PLAYED music videos for significant chunks of time, it actually focused on music from earnest, serious artists. Music hadn’t been this socially aware and provocative since the ‘60s!

TV and movies vastly improved in my eyes too. Gone were the days when a movie focused solely on a nuclear family going on vacation, or a kid taking a day off from school. Movies with higher concepts were in vogue now: the term “indie” exploded, with all its subversive and innovative connotations. Disney rode a triumphant Renaissance wave for the first half of the decade. Summer blockbusters pushed their art to new, exhilarating heights, with movies like “Jurassic Park” and “Forrest Gump” setting records.

TV shows delved into darker and more progressive parts of the cultural psyche, with shows like “The Simpsons”, “Seinfeld”, “The X-Files” and “Roseanne” (although some of them debuted in the late ‘80s, they came into their prime in the ‘90s). Shows didn’t have to pander to the ideal family unit anymore. They could push the boundaries of what we found funny or intriguing, and succeed.

Look, I get the objective reasons why people love the penultimate decade of the twentieth century: it was simple. Sweet. Goofy. Over-the-top. Everything my fellow gay men love, which is why all gay men have some voluminous playlist somewhere that is nothing but ‘80s, ‘80s, ‘80s—as well as the perfect ‘80s getup, should they have the divine fortune of crossing paths with an ‘80s-themed party. The ‘80s is like your kooky, fun, and slightly frivolous aunt, whereas the ‘90s is your cooler but more sedate and socially conscious uncle. It’s kind of obvious who you’d rather party with.

But this is why I don’t like the ‘80s: I don’t like things that are simple, sweet, and over-the-top. It’s not my style. I’m the jerk who likes things ironic, dark, and brooding; hence, I will always identify with the Gen-X-dominated ‘90s. And hence why most gay men have a convenient blind spot for that decade altogether. Seriously—can you imagine a gay man squealing about the ‘90s? Only if they were forced to go to a ‘90s-themed party; they’d be squealing about their “other obligations that night” to get out of it. No gay man wants to be reminded of a classic Tarantino movie. It’s way too heavy, and our lives are already heavy enough. The same can be said for society at large, truly.

But the ‘90s were innocent too, compared to the decades that followed. For one, during that decade “social media” only went so far as logging into AOL via your phone cord, selecting a terrible login name, and signing into a god-awful chat room with strangers. We had virtually no digital footprint, and honestly, many minds and lives were saved because of it. Terrorism was not truly a household word until the tragic events of one fateful day in New York City the following decade. We didn’t yet have a country so politically divided by a polarizing president. And a recession the likes of which hadn’t been seen since the 1930s hadn’t yet hit.

So if you want something innocent and fun, but with a little more edge and a smidgen of self-important angst, why not make a pit stop in the decade before the ‘80s (if you’re going backwards in time)? You can geek out to Ace of Base, camp it up to the Spice Girls—but you can also show your gritty, “street cred” side by wearing baggy gangsta pants or grungy thrift-store plaid. The ‘90s had its perks too, ya know.

Thankfully, it is the 2010s now—well over twenty years since my favorite decade started its rotation under the sun. It’s finally getting more of the “respect” I always knew it deserved. Too bad it takes twenty years for some people to arrive at the party—but better late than never.