The View vs. The Talk


I love watching women gab. As sexist as it sounds, I’ll just say it: they’re good at it. I imagine it’s the equivalent of people tuning in to watch physically fit men play sports. Also, if I really want to fully commit to being politically incorrect: maybe it’s part of my DNA as a gay man to enjoy hearing women yak about everything from the profound to the frivolous. I can relate, and it’s fun.

Since the beginning of this decade, we’ve had two major choices for watching the biggest and brightest women in pop culture do just this on daytime TV in the U.S.

Venerated journalist Barbara Walters set the precedent in 1997 with a show called “The View”—featuring ‘different women with different points of view’ sitting around a table and discussing the day’s biggest headlines. It ruled the roost as the lone standard for the concept until 2010, when former child actress Sara Gilbert had the sterling idea to do an offshoot of the format (with the angle that it would consist of a panel of “moms”—although its predecessor never hid the fact that most of its panelists were mothers too). As a viewer, though, I wasn’t discerning—it made sense: in a nation as large and diverse as ours, one of the benefits is how we can multiply our commodities… like talk shows. After all, there have been multiple late night talk shows for decades now, competing directly with one another and thriving in their own right despite the saturated market. So when a new daytime talk show featuring a panel of half a dozen women talking about the news from their “different points of view” popped up, we took it in stride.

Both “The View” and “The Talk” have succeeded with viewers and been nominated for the same Daytime Emmy Awards over the years, solidifying their places in the pop culture lexicon.

But is there any real difference between them, or is one clearly superior?

“The View” has the advantage of experience on its side: a thirteen-year head start over its rival. In all that time, it’s seen and done plenty it can learn from. Infamously, placing two panelists whose perspectives are diametrically at odds is ratings gold: when outspoken liberal Rosie O’Donnell was recruited as the show’s moderator in 2006, during the contentious Bush/Iraq War years, the writing was on the wall—she would ultimately come to blows with then-outspoken conservative panelist Elisabeth Hasselbeck the following year. It was the best daytime drama that needed no script.

The show also has the undeniable class factor that only a highly respected figure in the journalism field like Barbara Walters can provide. Although “The View”’s reputation has ebbed and flowed as any long-running entity is prone to, its pedigree is still rooted in solid stock.

It’s not without its trials. The show has “jumped the shark” as much as a talk show can, in the sense of creative/production malaise. Since the 2010s, there has been highly visible turnover among the show’s panelists—it’s hard to even keep up with who’s officially on the roster these days, like watching your favorite sitcom characters get replaced by new actors or new characters you just don’t care for. Many of the new recruits were blatantly regrettable as well (Candace Cameron Bure and Raven-Symoné dragged down the credibility of the show, imho! Thankfully, their tenures were brief). The show has even rehired co-hosts who had previously retired or exited, such as longtime favorite Joy Behar, Sherri Shepherd, and even Rosie O’Donnell herself (who ultimately stayed for only one season again in 2014, mirroring her infamously clipped first round).

“The Talk” also tinkered with its lineup after its debut season, though that’s to be expected of a fledgling show. It found its footing with a consistent lineup afterwards, and has had only one panelist replacement since.

Another difference with “The Talk” is that it places less emphasis on formality. The show humors its audience and viewers by directly asking them questions after bringing up a headline—whether a serious news story or celebrity gossip, moderator Julie Chen will offer a concluding prompt to encourage monosyllabic responses, boos, hisses, or laughter from the live audience, reminiscent of, well, a daytime talk show (a 1990s version, more so).

Since the show is filmed in Los Angeles, another distinction from its New York City predecessor, it also has a daily celebrity guest correspondent who contributes a pop culture headline (adding to the pop culture news that inevitably permeates the show anyway), in a segment loosely dubbed “Today’s Top Talker”.

As one can guess, “The View” and its reputation skew more towards a serious, politically themed show. Although its current longtime moderator Whoopi Goldberg is a veteran Hollywood actress, she is outspokenly political and even good-naturedly mocks the more frivolous pop culture news she’s required to broach regularly (read: reality show fodder).

Other panelists, regardless of how short their tenures have been in recent years, have frequently been renowned political pundits as well, something “The Talk” has steered clear of completely. Currently, Senator John McCain’s daughter Meghan McCain is the resident conservative Republican on “The View”.

“The View” has also expanded its most well-known segment, the roundtable discussion dubbed “Hot Topics”, from just a third of the show’s running time to half or more, betting on attention-grabbing headlines and the often heated exchanges between the ladies on the panel to sustain viewers.

Both shows have the requisite celebrity guest interview in the latter half of the show. Again, “The View”, naturally more political, regularly invites political figures such as former President Barack Obama, along with various political commentators. “The Talk” relies entirely on celebrity guests, some of whom are not even major draws. This is all moot to me, though, since I only tune in to each show to watch the ladies yak amongst themselves in their roundtable segments.

Judging each show based on my proclivities, I do have a clear conclusion about which one succeeds more. “The View” tides me over, for the reasons above—it has more legitimacy but is still able to delve into melodrama, camp, and frivolity. Although its high turnover rate is unnerving and dispiriting, it has enough mainstay power players to anchor it. As a child of the 1980s and 1990s, I have a bias for Whoopi Goldberg as a pop culture fixture. Comedian Joy Behar’s sassy Italian schtick hasn’t gotten old—or perhaps, twenty-one years into the show, I’ve simply grown attached to her presence. As for the rest of the current panelists, I feel neither strongly for nor against them. Sara is the bright blonde who keeps things light or at least centered; Sunny adds more diversity and a touch of primness. Meghan obviously serves as an antidote to the clear liberal slant of the two veterans of the show, and for the most part I enjoy her dynamic. Not to paint her as an archetype, but I love a good “nemesis”, and Meghan is one by default, constantly having to defend her political party whenever President Trump drags it through the mud, which is often.

“The Talk” is sufficient, but my taste doesn’t quite extend to audience participation and an overabundance of pop culture fluff. And although it currently has the steadier, longer-running panel lineup, I’m not especially fond of any of the panelists: moderator Julie Chen is too proper; Sara Gilbert is insightful but staid as well; Sharon is the venerable one who’s been around the block—but she is a bit too mannered and biased in her outspokenness; newcomer Eve hasn’t proven her worth yet beyond tugging the group’s median age down; and Sheryl Underwood plays up the sassy black woman trope a bit too much.

Each show brings something to the table, and it’s merely a matter of taste. For my part, I mostly blur the edges that separate the shows. They’re like two sitcoms with overlapping similarities and differences, and I like them both for different and similar reasons.


Movie Review: ‘It’ doesn’t deliver


Unless you’re from the tail end of Generation Z, you already know at least what Stephen King’s It is about. The question is whether the new film is a worthy take on the classic novel, which had only been filmed once before, as a well-known 1990 TV mini-series. Spoiler alert: I did give in to nostalgic curiosity and re-watched the original version before viewing this new one. Don’t worry: although I’d long revered it as a fearful preteen back then, I was shocked to find now that it was rather underwhelming—a mild, moody drama with some decent scares thrown in.

So I was primed and as objective as possible about the prospective terrors of an ambitious new take from the best that Hollywood has to offer today. From the opening scenes of the film that lead to the introduction of Pennywise the clown, otherwise known as the titular It, the movie looked promising.

Unfortunately, it doesn’t build on that promise. First off, Pennywise the clown is the centerpiece of the entire story, hence the title. Without his terrifying image or his aura of menacing evil, the story isn’t effective. Not to sound like a purist, but for lack of a better example: the original Pennywise, played by Tim Curry in the mini-series, was far more sinister. Although his look was barely a step away from a typical birthday clown, that’s what made him frightening: he was plausible. Here was a clown that could exist in your neighbor’s backyard, surrounded by innocent children—yet there was a studied vitriol in his gaze and a barely controllable sneer on his painted red lips. When he opened his mouth to lunge at last, that spray of razor-sharp teeth only confirmed our fears. The new Pennywise, played by Bill Skarsgard, is so stylized he’s as flat as a joker from a playing card. And as engaging. His appearances are not particularly memorable and are often upstaged by the other manifestations of “fears” he lobs at his victims, in the forms of an abusive father, a distorted painting of a woman, and a headless child from a history book.

What about the rest of the characters? The story centers on a gang of “losers” in the late 1980s: seven misfits from the local junior high in Derry, Maine, who congregate as a means of survival—first from the social hierarchy of their peers, and eventually from the deadly curse that Pennywise has inflicted on the town for nearly a century. The child actors who portray them are all competent, but only three of the characters are given distinct personalities that leave an impression. Bill, the stuttering de facto leader and protagonist who wants answers about the death of his little brother from the opening of the film, is appealing and bright. The group’s lone female member, Beverly, stands out not just for being a girl, but because she gets the most screen time to develop her troubled back story, which includes an abusive, predatory father. Richie, the loudmouth comic relief of the group, is memorable by default because he’s the most vocal and biting. The rest of the kids aren’t fleshed out particularly well—they end up being ciphers who just provide muscle and exposition to the story.

As for the story itself, it lags in places and could have benefited from more urgent pacing—this is a horror story, after all, where timing is of the essence. The film inevitably lapses into some preteen requisites (crushes, friendships, betrayals), which is fine for the sake of character and plot development, but the overall story suffers as a result. The original novel was sprawling, yet the film still somehow seems unnecessarily long onscreen.

It’s fitting that this movie takes place in the 1980s, because the special effects seem to be almost right out of that era. Visual effects should never be relied on to propel a horror film, but here they are surprisingly disappointing and harmless, given today’s technological advances. Since the movie suffers from middling pacing as well, very little kept me at the edge of my seat. By the time the movie hit its climactic standoff between Pennywise and the brave, bereaved kids, I had given up my search for something truly terrifying to materialize.

Overall, I don’t think this film will join the pantheon of truly classic horror films in my eyes. The hype clearly overshadowed the actual execution of the story onscreen. It ended up just being another underwhelming horror flick.

Movie Review: ‘Beach Rats’ is relevant, compellingly told


In an era where gay issues have been at the forefront of social change and a visible part of mainstream culture with no signs of turning back—regardless of the new presidency in the U.S.—a film like Beach Rats stands out simply for not being politically correct.

How many films in the second decade of the new millennium center solely on a young man living in a premier urban mecca in the U.S. who nonetheless refuses to come to terms with his clear proclivities for other men?

Frankie is a nineteen-year-old, born and raised in Brooklyn along the fabled beachfront that invites the lifestyle for which the movie is named: his routine involves getting high on the beach with a pack of similar-looking bros, often shirtless or decked in wife-beaters during the swampy summer months. It would be idyllic if it weren’t for his covert internal struggle.

Unbeknownst to his virile buddies, Frankie also engages in meaningless sex with older men, whom he meets on a very contemporary platform: a hookup website. From the very first such exchange that he attempts in the film, it is clear that Frankie is hesitant and discreet with this pastime.

And although it’s already been a couple of years since the Supreme Court overturned gay-marriage bans in the U.S., it’s clear why someone like Frankie would still be stuck in the past no matter how fast the rest of the world is moving: entrenched in ostensibly lifelong friendships with typical meathead bros, with no prospects of his own—educationally or professionally, not to mention a dying father and a somber home life—it’s no wonder Frankie doesn’t want to make waves.

It’s easy to forget that this world—including supposedly progressive countries like the U.S.—is still full of stories like this. They could be in your own backyard, even if you live in a major metropolitan city.

Frankie’s narrative pushes into deeper waters when he encounters a young woman on the boardwalk who openly pursues him. He instinctively goes along with the courtship because she is of the right age, beauty, and temperament.

Naturally, tensions and conflicts escalate as Frankie continues to lead concurrent lives that are at odds with one another.

What makes this film rise above whatever connotations may haunt it—the themes of shame, deception, and meaningless lascivious encounters among gay or bisexual men—is its lack of judgment. This isn’t a film about the triumphs of being gay, and it’s not supposed to be. Sure, there have been more than enough films like this since time immemorial, but it’s still part of the gay experience, progress be damned.

The style of the film also invites a more sympathetic ear for such a subject. The laconic, natural pace is almost voyeuristic—heavy on visuals and mood rather than unnecessary plot developments. Frankie is not just a cipher, although it’s easy to label him one due to his reticence and ambiguity as a character. And although none of the other characters are effervescent either, they’re not mouthpieces for exposition or pedantic moralizing. They feel like real people you meet in passing, even if you never get the chance to know them entirely.

Beach Rats is obviously an old story—closeted homosexuality—but it manages to breathe new life into it through an unlikely setting and character, and an uncompromising vision of the subject. Taken on its own merit, outside of our cultural context, it’s simply well done.

Do the Oscars Matter Anymore?


It’s that time of year again: when people come together to talk about what some famous actresses wore—who wore it best—oh, and which film won Best Picture. Probably something artsy and serious. Sometimes it’s deserved—a film of true excellence and craftsmanship in writing, acting, and directing. But usually it’s just a film that you may or may not have seen. (I don’t know about you, but I’ve gotten to the point where I’ve decided all serious dramas will be relegated to DVD viewing—’cause, you know: why do I need to see talking faces on a big screen?) Also, ticket prices are astronomical, so—okay, I see it: I’m part of the cycle, and part of why Hollywood is nickel-and-diming every potential film that passes through its gates in the hopes of getting made. No wonder the studios settle for the bottom line so often—a “sure” thing (read: a sequel, prequel, or remake of something that did legitimate business once). But I digress.

Anyway, it’s the Oscars again. And, of secondary importance, it is 2017. I make a point of the year because frankly, I don’t believe the Oscars have been the same for a long time now.

I often wonder what my younger doppelganger today would think of this Hollywood pastime now. What do young, budding (okay, and gay!) dreamers like me today think of this rapidly declining tradition of awarding the “Best” in Motion Picture Arts and Sciences?

Cut to: me in the early 1990s. Maybe things just look better in retrospect, or maybe I didn’t know any better because I was a kid, but: the Oscars felt like they meant something back then. The five, count ’em, just five nominated films for Best Picture (more on that category being expanded to ten nominees later) really felt like they had earned that coveted spot. Each nominated film felt special, and it was usually a tight race that was more or less about merit, not just politicking by studios and adhering to the social trends of the day.

Culturally, budding gay—I mean, budding dreamers of all stripes only had a few outlets to view their favorite stars back then: People magazine, and “Entertainment Tonight”. Which meant we were primed and hungry to see all these stars convene on one epic night—a smorgasbord of glamour, glitz, and at least to an idealistic kid like me back then: talent!

The Oscars have been cheekily dubbed “The Super Bowl for Women”—in terms of annual cultural impact and significance. But unlike the actual Super Bowl, the Oscars have been morphing and changing notably, gradually eclipsed by other, smaller Super Bowls over the past two decades.

In the age of Twitter, TMZ, and the E! Channel, we can literally follow our favorite stars online 24/7 to see what they ate for breakfast or what color their kids’ poop is; spy on them as they exit an airport terminal via shaky video footage, or consume their daily lives in a craftily executed weekly reality TV show.

With these enlightening options we’ve been blessed with through technological progress, the mystery of what it means to be rich and famous and talented has dissolved; celebrity is now rote and accessible in ways never before imaginable.

I have a feeling my teenage doppelganger today would view the Oscars the same way I viewed silent films or drive-in movie theaters when I was a teen in the 1990s.

Perhaps in response to this changing culture (read: poorer ratings for the telecast—undoubtedly due to the Academy’s penchant for nominating “serious” films that don’t do much business at the box office), the Best Picture category was expanded in 2009 to include up to ten nominees. The Academy claimed this was a throwback to the early years in the 1930s and ’40s, when there were up to ten nominees per year—but many cynical observers assumed it was a blatant attempt to nab more viewers for the annual show. The quip “Are there even ten films worthy of being nominated every year?” hit the web quicker than you could say ‘Action!’. Incidentally, the Oscars had suffered their lowest TV ratings ever the previous year, so read into the subsequent change however you want.

As I alluded to earlier, I can relate to the criticism about the merit of today’s films—let alone their worthiness of such an honor. In our current cinematic climate, I think the cap of five nominees is, or should have been, more relevant than ever—an elite prestige worth striving for, artistically.

Nearly a decade later, the expansion of nominees hasn’t made a mark on me as an Oscar viewer or a movie fan. If anything, it makes it harder for me to remember which films were nominated each year—but that could be more a reflection of my waning interest in the show altogether.

In 2016, the Academy was confronted with yet another issue—this time a moral one. The lack of diverse nominees that year spurred a boycott by many African-American artists and viewers, who claimed a racial bias against them. Although I understood the greater issue of diversity, as a minority myself even I had reservations about the campaign. Was the Academy biased, or were there simply no quality films that year that starred African-Americans (or other ethnic groups)? If it was the latter, the issue wasn’t the Academy, but the movie industry itself.

Nonetheless, true to form, the Academy reacted swiftly with its image in mind—claiming it would add a significant number of women and people of color to its voting bloc. The validity of that gesture aside, the fallout from this detrimental publicity left a viewer like me wondering how sincere future nominations would be. As well intentioned as the campaign was to shed light on the Oscars’ lack of diversity, the consequence could be that the Academy overcompensates and recognizes films (not people, mind you) of lesser merit for the sake of political correctness.

This shifting of objectives and influences only hastened the Oscars’ declining relevance in my eyes. It was no longer simply about awarding the best films—it had become a commercial and social experiment gone awry.

But this was nothing new overall: the Oscars have always been about more than just the merit of moviemaking, of course.

I turned eighteen when the world entered a new millennium in 2000, the year “American Beauty” won against a highly publicized awards campaign for its chief rival nominee, “The Cider House Rules”. Maybe it’s because I’d technically become an adult and therefore achieved full enlightenment at last, but the fact that a movie studio launched a publicity campaign to sway voters toward its film was not lost on me. Apparently, voters don’t just go into hibernation, pick winners, then emerge back into the real world alive and rejuvenated by the purity of their choices.

The validity of their choices has often been debated for other reasons as well: awarding an actor or director for their current, less stellar work simply to acknowledge their greater body of work is another common longstanding ploy.

That being said, it’s safe to say that the curtain has finally come down on my love affair with the Oscars. Honestly, these last few years I’ve been less and less drawn to the extravaganza. As late as 2013, I still recall feeling a few vestiges of the excitement I’d had in my youth—feeling like I was witnessing something greater than myself. But over the past two years, and now on the eve of this year’s ceremony, it’s dawned on me that the show’s heyday has long since passed. That doesn’t detract from the merit of truly good movies, but that’s the thing: good movies and the Oscars are not the same thing, and they haven’t been for a long time.

So it’s that time of year again—when people come together to talk about what some famous actresses wore—and who wore it best. Oh, and which film won Best Picture. Exactly. That’s all it is.

La La Land: The Story of Us (Review)


They say there are no new stories to tell, and nothing new under the sun—that phrase itself is a cliché, see. Yet we still need these tales, for ineffable and primal reasons. Why? Well, it’s ineffable—so sometimes it’s beyond words.

“La La Land” is one of the most familiar stories in the modern canon: girl has a dream of stardom and pursues it in spite of demoralizing setbacks. On top of that, it’s a love story between a girl and a boy—and the boy essentially has the same dream. This sandwich of familiarity is enough to send any quasi-cynic running for shade—and I don’t mean for cover. But oddly, many of us are still on board with this setup; so much so that the film has become the breakout hit of the year: Oscar bait and pop cultural force. And for good reason: for many, it’s simply our story.

The film opens with an inventive musical nod to one of the hallmarks of the city of stars: L.A. traffic. In a gridlock on a steep highway overpass, motorists do the most natural thing in a musical: break into song and dance—jumping on top of their vehicles, courting each other out of their cars, and dancing on the concrete lane with unmitigated reverie, proclaiming in the song’s title how it’s “Another Day of Sun”. For a person who’s lived in Los Angeles longer than I care to divulge, this scene blew straight past my jaded antennae and bowled me over with its unabashed whimsy. Instead of scoffing at the absurdity of it all, I wanted to join in.

At the song’s end, we meet Mia, played by the pixie-ish but quirky Emma Stone. Mia is an aspiring thespian who heeded her childhood calling to Tinseltown to realize her dreams of becoming a star—but mostly to tell stories through her craft, like every bleeding-heart artist on Earth. Although Mia is certainly likable, the film is less about character than about plot and ideals. Stone is competent as always, but, you guessed it, does not add a new wrinkle to this careworn archetype. She does add another notch to her increasingly impressive repertoire, proving that Hollywood may not be so shallow after all: in one scene, after a humiliating audition, Mia zips through a hall of Stone lookalikes who are also vying for the part. In the elevator, flanked by two of these clones, she is clearly the least statuesque and nubile.

This doesn’t stop her from catching the eye of another aspiring artist, Sebastian—a somewhat aging (by Hollywood standards—read: thirties) jazz pianist played by the still smoldering and chiseled Ryan Gosling. He has the slightly more original dream of the two by default: to simply open up a jazz club, which is a feat because it’s a jazz club and this is the twenty-first century.

In a subdued, if not entirely original, setup, Mia is drawn into a nondescript nightclub by the chords of a pensive tune she hears Sebastian playing inside. This melody becomes a smartly recurring musical motif throughout the film. It’s there that she spies Sebastian, but thankfully it’s not love at first sight for either party. They meet again shortly after through serendipity (he’s a keyboardist at a Hollywood party she attends), and the inevitable develops between these two passionate artists—cue heated debates on the merits of their crafts and the plausibility of their dreams, and, of course, romantic love. They inspire each other and cheer each other on, unsurprisingly.

These scenes are padded with more musical numbers—less grandiose in overall production, but still charming and catchy—particularly the lovely, haunting theme of the film, “City of Stars”. These numbers also continue to pay winning tribute to more L.A. trademarks and locales like the Griffith Observatory, beach piers and the Watts Towers. The film does lose its musical momentum in the second half of its story, which will not go unnoticed by musical connoisseurs. For novices like me, it’s the best of both worlds: I enjoyed the songs far more than I’d expected from a traditional film musical, but I was just as happy to be saddled with plain plot and character in the interim, however uneven.

I won’t disclose too much of the remaining plot, because it’s easy enough to guess for a tale as proverbial as this: the story reaches its emotional apex when Mia can’t bear another humiliating failure, and it hits the sore spot that many viewers who bought into this story for personal reasons fear most—the thought that she may not be destined for greatness after all.

Nonetheless, the ensuing conclusion is probably what you’d expect for both aspiring artists in a film like this. And that is the reason these familiar stories still work: we need to be reminded that things are possible. It’s not cliché; it’s human—which one came first? (Well, frankly, the human—but one thing informs the other.) Notably, the movie handles the love story between Mia and Sebastian with less hackneyed results, and I will leave that utterly out of this review for the viewer to discover on their own.

“La La Land” is nothing new, but it’s a tale we’ll never grow tired of because (many of us) will always care about the things it cares about.

Pop Culture and Me: a Forbidden Love Affair


No one expects me to like pop culture. I believe two key factors play into this: my race, and my lack of style. I’m not going to change either one. Or the unyielding fact that I’ve always been quite enamored of pop culture.

Okay, my race I can’t change. But could I change my style so that it translates into a media-savvy hipster? Or at the very least, someone who looks like they watch TV?

How does that work? Should I wear “Walking Dead” t-shirts? Get a “Breaking Bad” tattoo? Wear everything I see at Forever 21 to prove that I’m just like everyone else?

The funny thing about being misunderstood is that although we loathe it, we secretly enjoy it too—because it proves that there’s more to us than meets the eye.

I suppose there are some people out there who are happy being simple and straightforward—easily “read”, or as the kids call it these days: basic. See, I am hip enough to know that.

For the rest of us, being easily read instinctively feels like being shallow, which is generally seen as a pejorative unless you’re a reality star. Check. I know what constitutes a reality show star.

The truth is, I do play a role in my own conundrum too. It’s my lack of desire to assimilate on some levels that distances me from my peers, which fosters animosity and misunderstanding. But if I’m not interested in jumping on the latest bandwagon, that’s my right too. And being an individual does not preclude an awareness of what’s current in popular culture.

It’s not all bad either, to be fair. When I mentioned something about the Golden Globes one year (yes, I’m even an awards show junkie), a friend innocently remarked: “Wow, I thought you’d be—too cool to watch something like that.” Aww, ain’t that sweet? So maybe there is a contingent out there that isn’t attacking my character when assuming things about me. They’re simply deeming me to be more enlightened than I actually am, which is flattering—and less insulting.

But alas, I can succumb to frivolity as much as the next person. Who doesn’t enjoy the latest celebrity news? It’s like a large order of McDonald’s French fries: not good for you, but you’re not interested in being a saint. You’re allowed an indulgence once in a while. How utterly boring would it be if we only did things that were ethically “good” and enriching for us? If that were the case, there’d be no decent TV shows, movies, or music. We’d all be wearing white robes and chanting scriptures and talking about nothing more provocative than the weather.

So there you have it. The unremarkable reason why a person like me can enjoy the latest Adele album or the Oscars is just that: it’s human nature. Sometimes the simplest answer is the hardest one for people to see or accept. Apparently.

Why do people love the 80s?!?! (Try the 90s!)


America’s unnatural love of all things 1980s is like society’s reverence towards pregnant women: you can’t really counter it without sounding like a complete monster. But since I’m already an inherent outcast twice removed, I guess I’ll be the brave soul to take a stab at it (the ‘80s).

They say trends come in twenty-year cycles. I was born in the ’80s, and I remember, as a preteen, being glad when all the saccharine gaudiness of the decade vanished by the early 1990s. Little did I know that it would all come skipping back in an even more mannered, pretentious form—ten years later, when I was in my TWENTIES, in the ’00s.

By 2003, you couldn’t surf the web without coming across an article that proclaimed “Check out your favorite redheaded ’80s celebrities HERE!”, hear a song that didn’t sample a classic ’80s synth-pop ballad, or have a conversation with a grown woman who didn’t squeal: “Ohhh, I LOVE the ’80s!” Basically, ’80s nostalgia was like crack in the actual ’80s: integral to the social scene.

If you can’t guess by now, I have highly objective reasons why I don’t like the ‘80s. I came of age in the decade that succeeded it: the ‘90s. When I say “come of age”, I mean the (first) era of maturing in one’s life—your teen years.

Nothing is as great (or as bad) as when you are a teenager. If I had come of age during the 1890s, no doubt I would be sitting here clamoring about how great churning butter was, and how kids these days are missing out on savoring fermented cow milk you procured with your own two hands. So I’m aware that I suffer from a little bias.

I feel sorry that kids today didn’t grow up with the angry, forlorn, edgy alternative-rock singers who managed to be somehow both dangerous and mainstream in that perfect window of time known as the 1990s. It was a truly magical time. I mean, MTV not only PLAYED music videos for significant chunks of time, it actually focused on music from earnest, serious artists. Music hadn’t been this socially aware and provocative since the ’60s!

TV and movies vastly improved in my eyes too. Gone were the days when a movie focused solely on a nuclear family going on vacation, or a kid taking a day off from school. Movies with higher concepts were in vogue now: the term “indie” exploded, with all its subversive and innovative connotations. Disney rode a triumphant Renaissance wave for the first half of the decade. Summer blockbusters pushed their art to new, exhilarating heights, with movies like “Jurassic Park” and “Forrest Gump” setting records.

TV shows delved into darker and more progressive parts of the cultural psyche, with shows like “The Simpsons”, “Seinfeld”, “The X-Files” and “Roseanne” (although some of them debuted in the late ‘80s, they came into their prime in the ‘90s). Shows didn’t have to pander to the ideal family unit anymore. They could push the boundaries of what we found funny or intriguing, and succeed.

Look, I get the objective reasons why people love the penultimate decade of the twentieth century: it was simple. Sweet. Goofy. Over-the-top. Everything my fellow gay men love, which is why all gay men have some voluminous playlist somewhere that is nothing but ‘80s, ‘80s, ‘80s—as well as the perfect ‘80s getup outfit, should they have the divine fortune of crossing paths with an ‘80s-themed party. The ‘80s is like your kooky, fun, and slightly frivolous aunt. Whereas the ‘90s is your cooler but more sedate and socially conscious uncle. It’s kind of obvious who you’d rather party with.

But this is why I don’t like the ’80s: I don’t like things that are simple, sweet, and over-the-top. It’s not my style. I’m the jerk who likes things ironic, dark, and brooding, hence: I will always identify with the Gen-X-dominated ’90s. And hence: why most gay men have a convenient blind spot for this decade altogether. Seriously—can you imagine a gay man squealing about the ’90s? Only if he were forced to go to a ’90s-themed party; he’d be squealing about his “other obligations that night”—to get out of it. No gay man wants to be reminded of a classic Tarantino movie. It’s way too heavy, and our lives are already heavy enough. The same can be said for society at large, truly.

But the ‘90s are innocent as well, compared to the subsequent decade(s) that follow it. For one: during that decade, “social media” only went so far as logging into AOL via your phone cord, selecting a terrible login name, and signing into a god-awful chat room with other strangers. We had virtually no digital footprint, and honestly: many minds and lives were saved because of it. Terrorism was not truly a household word until the tragic events that ignited it on a fateful day in New York City, the following decade. We didn’t have such a politically divisive country due to a polarizing president yet. And a recession, the likes of which hadn’t been seen since the 1930s, hadn’t yet imploded.

So if you want something innocent and fun, but with a little more edge and a smidgen of self-important angst, why not make a pit stop in the decade just before the ’80s (if you’re traveling backwards in time, that is)? You can geek out to Ace of Base, camp it up to the Spice Girls—but you can also show your gritty, “street cred” side by wearing baggy gangsta pants or grungy thrift-store plaid. The ’90s had its perks too, ya know.

Thankfully, it is the 2010s now—well over twenty years since my favorite decade started its rotation under the sun. It’s finally getting more of the “respect” I always knew it deserved. Too bad it takes twenty years for some people to arrive at the party—but better late than never.