‘Buzz’-kill: Promoting Gay Stereotypes on BuzzFeed

“39 Songs that Shouldn’t Have Flopped but Straight People Let it Happen”

“42 Songs Only Gay People Know”

“30 Songs From the Early 2000s that only Gays and Woke Straights Know About”

These are just a few titles from the popular pop culture website BuzzFeed. I’ll give you a hint: close to none of the songs listed in these articles feature guitars or male vocals. That’s pretty much the gist of it. Apparently, if you’re gay, you only listen to dance/pop music from women who run the gamut from fabulous to glamorous, sexy to sultry, and everything in between. No nuance. No variety. Is that what it means to be gay?

According to BuzzFeed, it is. Don’t get them wrong: BuzzFeed loves gays, women, people of color, transgender people, disabled people, as witnessed in all the articles they post that are usually on the right side of our current woke zeitgeist. BuzzFeed is nothing if not current on trends, that’s for sure.

Yet they are curiously archaic in this seemingly innocuous strand of articles, written by gay authors themselves, that perpetuates the narrowest and most superficial gay stereotypes.

Why does a website that so clearly wants to appear enlightened allow such myopic fodder? 

Maybe because articles about gay people who aren’t fun and flashy aren’t going to sell. Even gay people know that, which is why these gay authors write and sign their names on these frivolous pieces themselves.

Here’s the thing: it’s no secret that stereotypes are generally bad, but the open secret is that people often accept, and even revel in, stereotypes they like, such as gay men’s alleged love affair with fun, disposable pop music.

Who cares if it’s a stereotype, when it’s oh-so-cool and flattering, depicting us as exciting, glamorous, and sexy? (Forget that these same articles could also be accused of portraying us as vapid, shallow, and superficial.)

The main problem, though, is obvious: not every gay person is like this. Even the authors (and readers) themselves probably know it. Consequently, when a gay person sees headlines and articles like these and doesn’t recognize themselves in them, it only deepens the feelings of isolation and exclusion that are already too common among LGBTQA people, particularly young ones.

It’s ironic that in our current zeitgeist, which seems to multiply in identity politics by the year, month, day or hour, we can still perpetuate stereotypes like this so cavalierly–even diligently.

As any gay person knows, it’s still all too common to be judged by our gay peers for not liking the latest Lady Gaga song or for not idling our lives away at the local gay dance club.

Why do we still have these holdouts of ignorance, intolerance, and pressures to conform in the gay community, when we–along with the whole national zeitgeist now–allegedly preach enlightenment, tolerance, and acceptance of different lifestyles?

You do not get to pick and choose which stereotypes are acceptable. Like double standards, you either reject them all or accept them all.

These articles are obviously not the equivalent of preaching hate, bigotry, or violence against anyone, but they are no less damaging to a community that still struggles to find acceptance and inclusion. Such articles offer too little to compensate for how much they can alienate.

25th Anniversary: ‘Jagged Little Pill’ — A New Perspective


Like a lot of older Millennials, I can remember the first time I heard Alanis Morissette. I was in eighth grade, and her U.S. debut music video, “You Oughta Know”, began making the rounds on TV. I didn’t think much of it, but her second single, “Hand in My Pocket”, arrived months later and I became hooked, like the 33 million fans worldwide who would buy her blockbuster album, “Jagged Little Pill”, from 1995 onward.

It wasn’t just that I was a fan; she literally introduced me to music—and at that pivotal age, it was momentous. Like many teenagers, my musical taste would be crucial in informing my identity. Aided by Alanis’ music, I discovered the genre she was ostensibly part of—alternative rock—and I was in love.

I remained a fan of Alanis in the subsequent years, still fondly and regularly playing her first and second albums well into my thirties. A quarter of a century later, however, I unexpectedly found myself coming around to the minority of outliers who, as I recalled, had perplexingly rebuked the phenomenon of her music at the time.

Something had shifted in my sensibilities. Now the instrumentation, style, and concept of her music simply struck me as… inauthentic. They seemed preconceived, affected, and a little silly. It wasn’t that there wasn’t true talent involved; it was that the music was less about art than about entertainment. On that level, yes: the music was certainly catchy—enough for me to listen to it repeatedly for decades. I would never tire of her music in some sense, but I began to realize: maybe she wasn’t such an authentic artist, but again: just an entertainer with a phenomenal gimmick.

These were the accusations from her critics twenty-five years ago, upon the release of “Jagged Little Pill”. Their reservations about a surely indelible and spectacular artist had baffled me then—but now I understood where they were coming from.

I remember hearing people flatly say her music was “whiny” and an outlet for complaining—rather than profound and cathartic, as millions of fans attested. Her most famous critics declared her hackneyed and contrived; I could hear it in the instrumentation now—often, it sounded more like an imitation of rock music than actual rock music, if that makes sense. It was too slick and mannered for its own good. A sophisticated ear is a tall order for a fourteen-year-old; what was my excuse for the last twenty years? Maybe when I listened to her music between then and now, it was clouded by my own nostalgic attachment to it.

Morissette’s credibility was suspect from the start: prior to “Jagged Little Pill”, she’d released two strictly dance-pop albums that were indicative of their time, the early 1990s. Her about-face with “Pill” as an alternative rock singer a few years later, at the peak of the grunge phenomenon, was suspiciously convenient. That she collaborated with veteran music producer Glen Ballard—who was accomplished but best known for polished pop rather than rock music—only deepened doubts about Alanis’ “rock” status. Honestly, this theoretical calculation on her part wouldn’t have bugged me, except that she didn’t pull it off artistically after all.

It’s telling that Morissette never repeated the success of “Pill”—commercially or artistically. Not that an artist should replicate their style or subject repeatedly, but she never attained the same relevance even on a strictly esoteric or artistic level. Her follow-up album came the closest, but even now it suffers from the same quality as its behemoth predecessor: inhabiting a dubious sonic limbo between art and entertainment. In fact, all of her subsequent albums share this trait. That is no accident.

It’s no wonder that no matter how beloved and entrenched “Jagged Little Pill” is in popular culture, it rarely, if ever, lands on any of those contentious, retrospective “Best of” lists from presumably serious music critics. Those lists always lend themselves to debate, which is why their unanimous omission of this album is all the more telling, and rarely debated.

Don’t get me wrong—I still think Alanis Morissette’s music has merit, but it is akin to a blockbuster movie: it may become a beloved fixture in popular culture, but it’s not necessarily the finest example of its medium. In many ways, it’s no less valuable for bearing this quality, and there’s no shame in liking it. I will always have a place in my heart for her brand of music, just like I do for other fun pop music, blockbuster films, or cheesy TV shows. They all serve a purpose. Twenty-five years later, I may have changed, but I can still laud this landmark album for what it undeniably was: a pivotal moment in pop culture—for me, and for the millions of fans who made it one of the biggest albums of all time.

Fiona Apple: A Ranking of Her Albums


Fiona Apple’s new album, Fetch the Bolt Cutters, feels like a milestone. Maybe it’s because of the relief it offers in these unprecedented, ambiguous times of a global pandemic; or because it’s her fifth album, rounding out a discography that officially spans four decades as of this new year; or maybe every album feels like a milestone from the famously reclusive singer, who solidified a slow pace of artistic output with her last release, in 2012.

For all of these reasons, it feels warranted to geek out over Apple’s music—with that most irresistible and contentious of efforts: a list. With five albums under her belt now, Apple’s catalogue feels ripe for a ranking. It’s all the more tempting because it’s no easy feat, considering how consistently ingenious her music has been across those four decades: how would you rank her albums from best to worst, or in her case, best to almost as best?

  1. The Idler Wheel… – Her 2012 release has been aptly described as “distilled” Fiona. It best showcases her artistic sensibility, style, and skill in top form. Although the piano was synonymous with her identity and music at the start of her career, and still is, The Idler Wheel… transcended instrumentation: its pared-down sonic landscape was a stark departure from Apple’s prior albums, but her lyrics and melodies were instantly recognizable—and an extension as well, showing her artistic growth. These assets, always Apple’s greatest strengths, were brought to the foreground. Songs like “Anything We Want”, “Hot Knife” and “Every Single Night” were as rich and potent as any music with multitudes more instrumentation. It’s her most consistent record, without a single weak track. Apple was at her peak: the songs don’t aim to be catchy, but the melodies are indelible anyway. It’s the perfect balance of artistic and accessible.
  2. When the Pawn… – Apple’s sophomore album was also an impressive balance of rich melodies and artistic innovation. In many respects, it’s her most satisfying album because it operates on all cylinders: beautiful production values, potent lyrics, and inventive sounds. It’s no wonder this appears to be the fan favorite, from what I’ve read online—myself included. It draws from several influences (classic rock, hip-hop, show tunes, and spare piano torch songs) and weaves them into a rich tapestry that can be sung along to while also being digested for its lyrical depth. “Paper Bag” remains one of her best songs for good reason: it’s lyrically and melodically taut yet bursting with ripe instrumentation that includes a brass section. “I Know” is one of her loveliest songs—a quiet, infectious meditation on adoration and contentment. When the Pawn… is the complete package.
  3. Fetch the Bolt Cutters – Undoubtedly her least pretty album, but perhaps that’s because it’s the latest step in her artistic progression thus far: her most revealing, in a career that has always prioritized revelation. Like her 2012 release, it moves even further away from instrumentation and focuses more on lyrics and themes. The result is a palpably cathartic album that marries deeply personal experiences with the primal impulse for release: pure art. What the melodies lack in accessibility, they make up for in sheer urgency and authenticity—they’re like chants you made up in the schoolyard as you faced down bullies, or while you lounged quietly in the privacy of your home. They pulse with vitality. “Ladies”, “Heavy Balloon”, and “Relay” touch on themes like jealousy, betrayal, and mental health without being didactic or heavy-handed.
  4. Tidal – Her most accessible album for its sheer sonic gloss, it features her catchiest songs, like “Criminal”, “Sleep to Dream”, and “Shadowboxer”. The seemingly surface beauty of these songs is no detriment to their accomplishments; they sound as vibrant and relevant today as they did a quarter of a century ago. This album ranks lower than her others only because an artist like Fiona can only improve with age, especially having started at 18, as she was when the album was released in 1996. The lyrics are naturally not as mature as those on her subsequent albums, but the melodies and gorgeous piano-laden instrumentation carry them.
  5. Extraordinary Machine – This was always my least favorite album, perhaps because it sounded less urgent and distinctive than the rest. There are a few classic gems that exemplify what I love most about Apple: “Parting Gift” is what she did best at the time: a girl with a piano singing about love askew; “Waltz: Better than Fine” is a throwback, reminiscent of her preceding album’s foray into classic show-tune influences. The rest of the album justified Apple’s talents, but there was a whimsical instrumentation and mood to this record that, though it shouldn’t be synonymous with inferiority, was less appealing.

‘Bolt Cutters’ Opens Wounds, Stretches Boundaries of Music


Fiona Apple has increasingly stretched the boundaries of music with each release, so that her music is no longer driven by artifice but by absolute remove. Her new album, Fetch the Bolt Cutters, is nothing if not a natural progression, then, for the famously reclusive singer, who has established a slow-drip pattern of releasing material. It may be her most riveting album yet, a feat she has secured with each release in a career that now spans four decades.

Apple starts the album with the opener “I Want You to Love Me”, one of its more deceptively conventional tracks. Backed by pretty piano keys that invoke gentle blue skies, she coos with the rich alto fans have long recognized. However, the next track, “Shameika”, is more indicative of the album’s ethos: lacking a catchy chorus, Apple’s lyrics fixate on one vague line: “Shameika said I had potential”—and she repeats it dozens of times by song’s end, made no more decipherable by piano keys that clatter like drums.

By the third and fourth tracks—the title song and a ditty entitled “Under the Table”—the heavy, percussive, threadbare instrumentation and Apple’s idiosyncratic musings threatened to put off even a lifelong fan like me, who essentially grew up with her music when we were both teenagers at the dawn of her career in 1996. I’d been prepped by press releases that referenced an incendiary, stark track, “Hot Knife”, from her previous album, 2012’s The Idler Wheel…, as a precursor to this album, but I was still thrown for a loop—and I loved that track.

I pressed on, and to my chagrin fell for “Ladies”, which is most reminiscent of Apple’s earlier, more accessible work: swooning, lilting melodies with high production values and best of all, Apple’s voice at peak beauty. If one wants a pretty melody though, Apple is no slouch in that department either. “Ladies” is lovely, and shouldn’t be ashamed to be.

Suffice it to say, Cutters is not a pretty record; but upon repeat listens, it’s certainly not easy to ignore either. After a tentative first round, I was drawn back to it and was able to hear it on its own terms. This is not a typical Fiona Apple record, but it’s that very ornery defiance that makes it an utterly typical Fiona Apple record.

The new album is like an exotic meal you try once—discovering that it’s not doused in sugar or spice or anything immediately definable—then you end up craving it for weeks after. That it also happens to be nutritious is only a bonus—this is music of and for the soul.

“Reinvention” is a cheesy word and would never apply to Apple, but the 42-year-old singer has avoided repeating herself since arriving at the tender age of 18 in the only decade she could have gotten traction in: the 1990s, when it was last plausible for a sad-looking and even sadder-sounding musician to go triple platinum, as she did.

Her debut, Tidal, was disarmingly honest and haunting, but with a pop sheen that she would never rely on again. She quickly dispelled any mounting doubts of being a fluke with her 1999 follow-up, When the Pawn…, confidently establishing herself as a true sophisticate with a penchant for timeless melodies and mature craftsmanship on par with her hero, John Lennon. 2005’s Extraordinary Machine expanded her sonic and conceptual reach even more, and 2012’s The Idler Wheel… was her most innovative work yet: stripped down to sonic essentials, it only showcased her lyrical and melodic ingenuity further.

In hindsight, this discography set the stage for Cutters, which rests entirely on something Apple was never short of and still isn’t: authenticity. What draws the listener to the album isn’t escapism, so much as exorcism—of inner demons, anxieties, and revelations that have been percolating under the surface for years or even decades.

Much has already been written about the timely themes that permeate the tracks: Apple’s quest to give voice to the silent majority that still finds itself at the whim of an ostensibly masculine world. However, Apple has held men accountable for their actions since the outset of her career; that she still does so today is perhaps what is most revealing, politically, now. She also clearly ruminates on the tricky nature of female friendships in several songs on Cutters, which is new in her catalogue. “A girl could roll her eyes at me and kill,” she laments gingerly in the title track, needing no more explanation for the adolescent hell that still haunts most adults.

Chants, wordless vocalizing, and half-speaking fill the album—invariably centered on a phrase that does not belong in Top 40. They’re cathartic only in the capable hands of Apple: “Kick me under the table, I won’t shut up”, she repeats like a petulant preteen. It oddly becomes a mantra, not just literally but for its sheer attitude.

“I spread like strawberries/ I climb like peas and beans”, Apple shouts with conviction in “Heavy Balloon”, invoking a startlingly earthy and esoteric metaphor—a reference from a children’s book that described how the aforementioned flora grows.

It’s not to say that the album is devoid of indelible sounds. Aside from the more conventional tracks mentioned, “Heavy Balloon” rouses with a pulsating percussive splash that seems to belie its meditation on depression. The title track has a serene, lazy, almost calypso-style lilt. “Under the Table” has perhaps the catchiest chorus, accented by a shimmering piano loop. “Cosmonaut” recalls the whimsical, melodic instrumentation of her mid-career albums.

Underscoring the visceral quality of this album, it was notable how relieving it was for me to listen to a pure, unabashed pop song from the rest of my iTunes playlist afterwards. It made me appreciate that sort of music even more—while simultaneously appreciating Fetch the Bolt Cutters more, too. There is a time for each urge; when I want unvarnished authenticity and raw muse caught on tape for secondhand witness, I will press play on this album repeatedly.

Not Happily Ever After: Why the Disney Renaissance Ended…


With the recent slate of Disney films being released to theaters, you could be forgiven for thinking we’re in the 1990s again. In the past two years, Disney has released live-action remakes of Beauty and the Beast, Aladdin, and The Lion King. Next year Mulan will follow suit, and The Hunchback of Notre Dame is now officially in the works too. All of these films are based on blockbuster animated features that Disney premiered in the 1990s.

They were part of the illustrious era of animated feature films known as the Disney Renaissance, which ran from 1989 to 1999. And frankly, it’s easy to see why these films still seem relevant, chronologically and culturally: the generation that grew up with them, mostly Millennials born between the early 1980s and the mid-1990s, has yet to reach age forty at the oldest—and has barely finished college at the youngest. They’re also a notoriously nostalgic group, like most Americans, but with an even more ubiquitous platform to perpetuate that nostalgia: social media and the Web.

They grew up with the Disney Renaissance and have ostensibly kept the relevance of its animated features alive—literally two decades after its official end. With at least half of the films from the era slated for reinterpretation as major live action films, their enduring grasp on popular culture is unquestionable.

As a member of this generation and a bona fide Disney buff, I can attest that the memory of these films is indeed as vivid and fresh today as it was when I was at the target age for their appeal. They are touchstones of my childhood, imbuing me with a sense of wonder and a worldview that I still hold onto to this day.

Nonetheless, the Disney Renaissance did have a clear beginning and end. No new films were added to its original run after 1999. As any Disney fan or even a casual observer will recall, subsequent animated features from Disney experienced a steep drop in popularity for nearly a decade afterwards, despite a continual output of releases.

As a fan with unrelenting exposure to these animated films, I have arrived at a conclusion as to why the phenomenon of the Disney Renaissance drew to a close at the end of the last century.

My theory is rooted in the catalyst that started the Disney Renaissance and made it popular in the first place. Kick-starting the era in 1989, The Little Mermaid was the first fairy tale the Disney studio had worked on in thirty years. This was largely why it was a resounding success: it returned to the formula that had worked so well for Disney in the past: a fairy-tale musical centered on a princess. Disney had erroneously abandoned this formula for nearly two decades prior, and suffered commercially and artistically with audiences.

In hindsight, however, I believe the rediscovery of this Disney formula during the Renaissance era would also become the era’s undoing. If the era had one fault, it was that literally every film adhered to the formula, stringently, with the exception of The Rescuers Down Under, an early entry that, tellingly, proved to be a commercial misfire. Every other animated feature between 1989 and 1999 consisted of: a beautiful or outcast protagonist, usually orphaned by one or both parents; amusing sidekicks; a romantic interest; a colorful villain; an epic setting; and a roster of songs that covered the requisite themes of life goals, romantic love, and villainy. This is not to dismiss the considerable diversity of creative and technical achievement of the era—producing some of the most ingenious, memorable and astonishing feats of song, visuals, and character—not to mention an unprecedented level of representation from Disney, with lead characters of different ethnicities for the first time (Aladdin, Pocahontas, Mulan).

Nonetheless, it’s quite revealing that, compared to previous eras of Disney animated features, no other era could be accused of the same homogeneity: the Golden Age, from 1937 to 1942, only had two films that featured romantic love, only one princess, and two clear musicals. The Silver Age likewise had several films without romantic love or a musical format. These two eras are arguably the biggest rivals of the Renaissance in popularity and artistic achievement. Both came to an end for their own disparate reasons—the economic downturn after 1942 due to World War II, and the death of Walt Disney in 1966, respectively.

The redundancy of the Disney Renaissance had possibly begun to take its toll as early as the midway point of its run: after four stellar blockbusters from the studio, things suddenly slowed down with the release of Pocahontas in 1995. Widely viewed as the first critical letdown of the era, it wasn’t followed by an immediate return to form either: The Hunchback of Notre Dame in 1996 and Hercules in 1997 both failed to draw audiences back after the misstep of Pocahontas. Something was amiss.

Audiences were being drawn elsewhere too: computer animation. This is perhaps the most commonly held theory of why the Disney Renaissance came to an end: towards the end of the 1990s, a new medium was dawning, usurping the traditional, hand-drawn (2-D) animation that Disney was known for. With the release of Toy Story in 1995, a resounding success not just for being the first major computer-animated feature but as a true triumph of story, audiences found a new outlet for family-friendly films that appealed to all ages. A slew of computer-animated (or CGI) films followed in gradual succession for the rest of the decade and beyond, none of which followed the renowned Disney formula—often to critical and commercial success surpassing even Disney’s. If the Disney Renaissance proved that animated features could appeal to all ages, CGI films proved that they didn’t have to be based on classic, existing literature—opening the doors for innovations in story that happened to be married to a very innovative technology, now coming into its own at the end of the twentieth century.

Although I agree that CGI certainly disrupted Disney’s momentum in the latter half of the 1990s—particularly since CGI animated features have remained more popular with audiences and critics alike, and 2-D animation has never come back into vogue since—I still stand by my theory that it was more about content than medium. Moreover, the onslaught of CGI feature-length films actually occurred rather slowly and did not immediately crowd the market that 2-D Disney animated features dominated: after Toy Story was released in 1995, the next CGI films were Antz and A Bug’s Life, both premiering in 1998. That left three full years in between, during which Disney released three animated features that could have vigorously filled the void and maintained its stronghold on audiences—yet they didn’t. The Hunchback of Notre Dame, Hercules, and Mulan were the Disney releases of this period, and though not critical or commercial failures, they were far less renowned than their predecessors from the Disney Renaissance. Again, a malaise seemed to have settled over audiences, which could be read as a reflection of the medium’s output. Audiences surely weren’t just holding off for the next great CGI film, having only witnessed the medium’s sole initial offering in 1995. The average moviegoer had no idea how CGI would eventually fare, though it was clearly a technological advancement. (It wasn’t until 2001 that the medium exploded, with simultaneous releases of multiple CGI animated films cementing it as a mainstay in cinema.)

It was apparent that audiences had simply grown tired of the Disney formula, and so the business changed after 1999, just as it did after the Silver Age in 1967—when the studio entered a prior era of commercial and critical malaise, following the death of Walt Disney.

With that, it’s also helpful to understand what followed the Disney Renaissance: from 2000 to 2008, the Post Renaissance era indeed echoed the era that followed the Silver Age—the Bronze Age of 1970-1988, when the studio likewise struggled to redefine its identity to audiences. The resulting films in the new millennium would reflect these efforts, striking into new territories such as science fiction, original stories not based on classic tales, and even the CGI medium itself—which would be a portent of the studio’s eventual direction. Most of the films from this era didn’t quite resonate enough with audiences to become classics.

The Revival era followed in 2009, with yet another rediscovery of the formula—with Tangled cementing Disney’s return to the zeitgeist, followed by Frozen. Both were clearly fairy-tale musicals centered on a princess, but married now to the CGI medium, which Disney has converted to indefinitely to fit the times. Aside from the new look, these films are quite similar to the Renaissance formula. Audiences responded and propelled them into the public consciousness as they did in the Renaissance era, hence the era’s new name.

But if these new films from the Revival are following virtually the same formula as the Renaissance, why did the Renaissance cease in the first place? Shouldn’t it have endured, unabated, by sheer demand?

Again: we just needed a break. As a lifelong Disney fan, with the benefit of hindsight, I couldn’t fathom a Disney Renaissance-style film being released by the studio every year for the two decades after Tarzan, the last of that era, in 1999. On some level I would enjoy it purely as a diehard fan, but it would almost become campy—a parody of itself, if you will. As much as audiences loved the Disney Renaissance, we could also sense artistic malaise. The formula had gotten monotonous and stale—again, already by the midway point of the era—and audiences clearly reacted with their wallets.

Does that mean the Revival era is doomed to repeat history? Surprisingly, it may be averting this fate: although it has certainly resuscitated the Disney formula, one telling factor separates it from the Disney Renaissance—it’s not following the formula for every new film. To their credit, and maybe by calculation, the studio isn’t just doing princess stories or fairy tales exclusively. Maybe that’s part of its success: Big Hero 6 and Zootopia are some of the titles that are as divergent from fairy tales and princesses as you can get, and both met with clear commercial success and critical acclaim—unlike the misfires of previous eras that also strayed from the formula.

Whether they realize it or not, perhaps this is what audiences need. We will always love and adore princesses and fairy tales, but there needs to be variety. There’s nothing wrong with having a studio trademark (family-friendly films, music, artistic visuals), but the trademark can be broad and expand. Art is about change, pushing boundaries, and expanding possibilities. Sure, we do like some degree of familiarity—all art has a core familiarity: a movie has a beginning, middle and end; music has notes and instruments, verses and choruses. But along with familiarity we need variety.

Perhaps Disney has a new, unprecedented confidence and competency that is allowing it to achieve something it wasn’t quite able to do in the past: successfully tell classic stories and original new stories concurrently. Disney may have failed at pursuing either one at various times in the past, not because either was theoretically bad, but because the studio just wasn’t truly creating something exceptional. As mentioned, it was going through the motions after a certain point during the Disney Renaissance, settling into a creative ennui—or, alternately, striking into new territory with dubious artistic vision, as during the Post Renaissance. But if a story is truly told well, it can potentially succeed. Audiences will respond to something special even if it defies current trends, as they did when The Little Mermaid reignited a formula that had gone virtually defunct for nearly two decades, kick-starting the Disney Renaissance in 1989.

Will Disney get it right this time? We can only wait and see.

Any long-running institution is going to experience inevitable peaks and valleys in relevance and vitality—hence the different eras of Disney feature animation that exist in the first place. I am resigned to the eventual fate of the storied Disney Renaissance of my youth because, to borrow a platitude, good things can’t last forever. Sitcoms come to an end. Book and film franchises end—and can even be revived after a certain period of absence (sound familiar?). The much-beloved Disney Renaissance is all the more revered because it wasn’t permanent and was limited in duration. That impermanence lends it a rarity that further incites gratitude and veneration. It was beautifully fleeting, as all life is.

It’s almost as if the beginning and ending of each era was inevitable—because like all art, Disney feature animation is an evolving medium. The studio is learning its craft in real time, and we get to watch it unfold onscreen.

 

‘Mind’ Games Keep You Guessing


The Year I Lost My Mind certainly avoids current gay indie-film tropes, if not most cinematic tropes altogether, with its bizarre collection of idiosyncrasies.

On the surface, it’s a thriller about a troubled young man who dabbles in petty illegal activities, but it’s his particular tics and habits that amount to a tantalizing viewing experience, if for no other reason than to find out just what the hell is going on.

Tom is a pale, offbeat, lonely gay man in his early 20s, living at home with his mother and sister in Berlin. Our first introduction to him sets the tone: he dons a large horse mask, compelling his resigned mother to ask “Why do you enjoy having people be afraid of you?”

Her inquiry is apt. Things only get stranger from there, as her moon-faced, taciturn son walks into ever odder scenarios, often wearing other masks from his bountiful collection.

Tom soon breaks into a stranger’s apartment where the handsome tenant sleeps peacefully, unaware of the passive crime that hovers over him. Tom simply observes the unsuspecting young man, then leaves—making mental notes for some later transgression perhaps.

This leads to a low-grade stalking scenario, spread out over the course of the strange protagonist’s idle days, spying on his subject’s routine around town from a distance.

Through his increasingly disturbing habits, interests, and behavior, one gets the sense that Tom has not only been marginalized by mainstream society but by the gay subculture too, with his unconventional looks that preclude reciprocation when he’s witnessed making advances on other men.

Is this why he is acting out, morally and legally? And to what extent will it manifest?

A subplot unfolds when Tom encounters a fellow masked man—larger, stranger, and more foreboding than him, at one of his haunts around town: the nearby woods where men cruise each other.

This stirs another question: is his doppelganger’s existence real or merely a figment of Tom’s demented imagination?

Tom revisits his previous subject’s apartment regularly, affirming his lascivious motives in the absent man’s empty bed. He skirts the calamity of being caught more than once, escaping through the glass doors of the patio. His subject begins to notice missing cookies, misplaced books—but he also has a cat, so the picture is hazy.

The inevitable occurs when Tom boldly admires the handsome man in his sleep in the middle of the night, but he manages to retreat through the front door, buffered by the shock he’s cast over his newly lucid victim.

It’s through another chance encounter that the victim puts two and two together and resolves to take matters into his own hands—with unexpected results clearly intended by the filmmaker to shock. Although compelling, the turn doesn’t feel quite believable enough to be effective.

After a fairly adroit buildup to this climax, it feels like a bit of a cheat to tumble into improbability. The subplot involving Tom’s frightening double is resolved in a more subdued manner, alleviating some of the discord. Nonetheless, the film is effective enough for everything that occurs before its finale—an interesting study in anomaly: its images, moods, and actions are sure to linger long after the screen fades to black.

‘Mid90s’, middle ground: lacking inspiration.


Mid90s proves that standard movie tropes are always familiar no matter how you dress them up. And first-time director Jonah Hill has certainly earned kudos for dressing his new film up to fit its epochal title: one only has to glimpse a few grainy frames (purposely shot on 16mm film for added effect) to be transported back to the last days before the millennium: compact discs, baggy clothes, big hair and, of course, a nostalgic soundtrack by a seminal voice of the era, Trent Reznor.

Although the title references an entire cultural zeitgeist, the film is far from being all-encompassing in scope or subject. Instead, it’s an insular story built on specificity, resting under a rather prosaic and vague title for lack of keener inspiration, which is its biggest flaw.

The story begins in Los Angeles during its titular time period, with a preadolescent boy named Stevie. Hounded by his boorish older brother from the opposite end of the adolescent spectrum and given free rein by a laissez-faire mother suffering from arrested development, Stevie is primed for one of cinema’s biggest clichés: a summer he’ll never forget.

This leads into another hallmark of the period, the skateboarding underworld, as Stevie sets his sights on befriending a group of older boys at the local board shop.

As soon as he unremarkably worms his way into the affections of the boisterous but nonthreatening slackers, his story ticks off the requisite milestones of the coming-of-age subgenre: exhilarating new experiences, wise mentors, chafing against his family, high jinks that just skirt the line of true danger and serious trouble.

Since the plot is standard framework, the question is whether the parts make up for the sum. Stevie is competent enough as a protagonist: he fits the bill in looks and temperament, without hitting any false notes. But the home life he shares with his threadbare family never truly generates a sense of urgency, which curbs any added weight to his arc. Stevie’s older brother and young mother aren’t guilty of anything beyond typical dysfunctional fare: physical taunts from the former and distraction from the latter. As for Stevie’s newfound entourage: they border on caricature, with raunchy nicknames and characterizations as nuanced as a junior high yearbook.

The film suddenly hits a climax that can only be described as inorganic and contrived—but this is in keeping with its steadily innocuous tone. Mid90s doesn’t seek to innovate or make a statement. It’s a light tale that never truly triumphs or fails abysmally either—inhabiting a safe middle ground of familiarity, evident all the more in its use of epidemic-level nostalgia for a past era that’s bound to pique audience interest. That nostalgia is the only true star of the movie; without it, the film would lose half of its distinction.

Nobody Walks in L.A.


L.A. has the worst pedestrians in the world—because we’re not used to them. It’s bad enough that it takes forever to drive a relatively short distance in this town due to traffic, but when you need to drive through an intersection and a person dares to walk across it first? It’s enough to make you curse the existence of humanity.

Sometimes it’s truly a test: on more than one occasion, I’ve been delayed by the truly physically impaired. Of course I empathize and wait patiently on those occasions, but those moments feel tailored to test the utmost limits of my character. It’s like halting an epic sneeze or cutting off a bowel movement midstream: the absolute urge to purge and the terror of following through with such a deplorable act call on your every last nerve to reverse the impossible.

On one such occasion, I had to make a left turn from a moderately busy lane; a slew of cars rolled through in the opposite direction, deterring me. My receptors were already piqued because this traffic was a tad unusual for this area given it was an early Saturday evening. I scanned my target intersection, and saw two young men idling by on skateboards. They cleared before the train of cars did. Impatient, I began to eyeball the nearest traffic light up ahead that could clip this parade to my left. Then I saw it:

A disheveled, middle-aged man ambled arduously forward towards my designated cross street—on crutches. What’s more—in my periphery, I caught an aberration on one of his legs—yes, his right leg was amputated around the knee. Immediately, my mind jumped to do the math: at his laborious pace and with the yellow light imminent up ahead, he would reach the intersection just as the cars on my left cleared.

I wasn’t in a rush. I wasn’t even angry at him. I was just resolutely amused that this was happening. It felt so indicative of this city. Here I was, driving a car that still functioned well past its purported life expectancy, with takeout on my passenger seat—no plans for the night, half a mile from home—and normally I would’ve flipped out at this pedestrian who dared to cross a public street in direct tandem with my turning into it, except that in this scenario the perpetrator was possibly a transient with clear physical limitations and, by the looks of his tattered appearance, little to no means.

If I had flipped the switch into full selfish-pig mode at that very moment, even just privately in the confines of my car, I knew it still would’ve been a sin in the eyes of my conscience and whatever god may exist. I could also picture an audience of my fellow human beings sneering and groaning at me if I were to recount the story on stage, or if they were privy to it via a hidden surveillance camera—satisfied in their smugness that I was more terrible than they were, convinced that they would’ve felt nothing but angelic compassion in my position.

I drove home and lamented it all: the feckless logistics of this town, the cruel irony of fate, the snide hypocrisy of humans and my own presumptions about them—and my inability to resist being affected by all of this.

Interpersonal Skills: I can’t deal with people.


I’ve come to the conclusion: I can’t deal with people.

Although by my mid-thirties I know life is a constant learning experience and that we can traverse the entire continuum of allegiances and viewpoints, I have gleaned from my social experiences thus far that I’m just not good at interpersonal skills.

First off, I’m poor at asserting myself. When faced with a scenario where a simple expression of my needs would suffice, I am often drowned out by the myriad connotations of the situation: who is involved, how much I love/fear/loathe/need them, what words or actions spurred the need to assert myself—and how it affects me emotionally.

Beyond that, I seem to lack the same interests, motives or needs that many people exhibit in socializing: I don’t crave status, dominance, or social gain through who I associate with.

As any experienced person knows, there are tacit “games” that people play with one another—through physical action, comments, rejection—to assert their needs and agenda in regards to others.

I’m not interested.


I can’t deal with people judging others based on what they look like, who they hang out with, what job they have.

I can’t deal with people who aggressively label me—thinking they “know me” but they really don’t, and when I inevitably prove them wrong they get mad at me, of course, because they’re upset that the world doesn’t fit their perception of it.

I can’t deal with people who put others down in order to build themselves up. I can’t deal with people who gleefully abuse others for this purpose—who have no qualms making an innocent human being miserable.

I can’t deal with people using others for personal gain, including those they had considered their friends and closest colleagues.

I don’t want to trade barbs with people, because on an instinctual level I don’t want to sink to that level. It disgusts and unnerves me to see myself behave that way. For many people, if I can’t do that—then I am simply a target for their deplorable behavior, and therefore I must avoid them for my own safety and self-respect.

Consequently, even if I possessed the fortitude to assert myself more effectively, my general distaste for our social mores and behaviors could still thwart me from ever engaging. I don’t want to have to correct people’s behavior towards me—not just because I’m incompetent at it, but because it offends and repulses me that I would have to display certain traits to attain better treatment.

It sounds like a cop-out, and in a way, it is. After all, life is full of things we don’t want to do but that are essential to a healthy life that truly benefits us. Each day we wake, wash and dress ourselves—that in and of itself is a requisite for a healthy existence. The vast majority of us must work at an occupation to earn the resources that will, in turn, acquire us more resources.

Interpersonal skills are not as tangible as our bodies, food, water, and a roof over our heads—but they are just as vital for the social animal that we are.

This is where I clash. My principles seem to be at odds with the rudimentary mechanics of socializing.

It’s a shame, because what I lack in grit I make up for in other virtues: as a friend, I’ve been told that I’m fun, open-minded, tolerant, and unconventional. I challenge the norms of society for the greater good of seeing the world anew. I am loyal, kind, generous, and gracious. I am accepting and thoughtful most of the time. I am engaging, but also capable of great independence. I have clearly defined interests and opinions that define me and can serve others.

Look, I’m not perfect either, and I can even be guilty of unsavory behavior towards others, but for the most part I believe in a higher state of coexistence. And this is another hindrance to my interactions with others.

At the risk of sounding hopelessly naïve or oblivious, I believe in a world where we tolerate our differences instead of persecuting each other for them. I believe in treating each other with decency and at least minimal respect, even if we differ in lifestyle, views or appearances. I believe in equality—that we are all inherently valuable, and therefore the need for stringent hierarchy or status is irrelevant. I believe that as long as a person is not harming anyone, they should be accepted as they are—not persecuted because of someone else’s expectations or ideology. I know this isn’t always plausible in our world, but it is my core approach to life, and it informs how I view and interact with others.

This is the reason why I feel separate from most people, and different.

I’ve realized this is the reason why I am often confounded when people invariably end up being… human.

It’s all too common for people, including those we’d entrusted ourselves to, to lash out at one another—because of differing temperaments, beliefs, expectations, ideologies, and needs.

At this age, I’ve experienced the disappointment of so-called “friends” who display less than stellar traits towards me, and handle me in a way that directly opposes basic decency and humanity.

I’ve only been able to count on a small handful of friends who haven’t turned on me yet—and of that minority, many are simply not visible enough in my daily life to risk offending me.

This, I feel, must be the resolution to my anomalous condition: to seek out and zero in on the rare people who will not see me as a target for their foibles and dire needs.

When I find such a commodity, I must treasure them and keep them in my life—because they will be my principal social outlet, since it appears I am not capable of much more than that.

Will I ever find such rare exceptions? That’s the question.

The View vs. The Talk


I love watching women gab. As sexist as it sounds, I’ll just say it: they’re good at it. I imagine it’s the equivalent of people tuning in to watch physically fit men play sports. Also, if I really want to fully commit to being politically incorrect: maybe it’s part of my DNA as a gay man to enjoy hearing women yak about everything from the profound to the frivolous. I can relate, and it’s fun.

Since the beginning of this decade, we’ve had two major choices to see the biggest and brightest women in pop culture do just this, on daytime T.V. in the U.S.

Venerated journalist Barbara Walters set the precedent in 1997 with a show called “The View”—featuring ‘different women with different points of view’ sitting around a table and discussing the day’s biggest headlines. It ruled the roost as the lone standard for the concept until 2010, when former child actress Sara Gilbert had the sterling idea to do an offshoot of the format (with the angle that it would consist of a panel of “moms”—although its predecessor never played down the maternal status that most of its panelists could claim too). As a viewer, though, I wasn’t discerning—it made sense, because in a nation as large and diverse as ours, one of the benefits is how we can expand on commodities… like talk shows. After all, there have been multiple late-night talk shows for decades now, competing directly with one another and thriving in their own right regardless of the saturated market. So when a new daytime talk show featuring a panel of half a dozen women talking about topics in the news with their “different points of view” popped up, we took it in stride.

Both “The View” and “The Talk” have succeeded with viewers and been nominated for the same daytime Emmy awards throughout the years, solidifying their place in the pop culture lexicon.

But is there a difference or a clearly superior one?

“The View” has the advantage of experience on its side: thirteen more years than its rival. With that wealth of time, it’s seen and done many things it can learn from. Infamously, placing two panelists who are diametrically at odds with one another in perspective is ratings gold: when outspoken liberal Rosie O’Donnell was recruited as the show’s moderator in 2006, during the contentious Bush/Iraq War years, the writing was on the wall—she would ultimately come to blows with then-outspoken conservative panelist Elisabeth Hasselbeck the following year. It was the best daytime drama that needed no script.

The show also has the undeniable class factor that only a highly respected figure in the journalism field like Barbara Walters can provide. Although “The View”’s reputation has ebbed and flowed as any long-running entity is prone to, its pedigree is still rooted in solid stock.

It’s not without its trials. The show has “jumped the shark” as much as a talk show can, in the sense of creative and production malaise. Since the 2010s, there has been highly visible turnover among the show’s panelists—it’s hard to even keep up with who’s officially on the roster these days, like watching your favorite sitcom characters get replaced by new actors or new characters you just don’t care for. Many of the new recruits were blatantly regrettable as well (Candace Cameron Bure and Raven-Symoné dragged down the credibility of the show, imho! Thankfully, their tenures were scant). The show has even rehired previously retired or departed co-hosts such as longtime favorite Joy Behar, Sherri Shepherd and even Rosie O’Donnell herself (who ultimately only stayed for one season again in 2014, mirroring her infamously clipped first round).

“The Talk” also tinkered with its lineup after its debut season, which is to be expected of a fledgling show. It found its footing with a consistent lineup afterwards, and has only had one panelist replacement since.

Another difference with “The Talk” is its lighter emphasis on formality. The show humors its audience and viewers by directly asking them questions after bringing up a headline: whether it’s a serious news story or celebrity gossip, moderator Julie Chen will offer a concluding missive to encourage monosyllabic responses, boos, hisses, or laughter from the live audience, reminiscent of, well, a daytime talk show (a 1990s version, more so).

Since the show is filmed in Los Angeles, another distinction from its New York City predecessor, it also has a daily celebrity-themed guest correspondent who contributes a pop culture headline (adding to the inevitable pop culture news that permeates the show anyway), in a segment loosely dubbed “Today’s Top Talker”.

As one can guess, “The View” and its reputation skew more towards a serious, politically themed show. Although its current longtime moderator Whoopi Goldberg is a veteran Hollywood actress, she is outspokenly political and even good-naturedly mocks the more frivolous pop culture news she’s required to broach regularly (read: reality show fodder).

Other panelists, regardless of how short their tenures have been in recent years, have frequently been renowned political pundits as well, something “The Talk” has steered clear of completely. Currently, Senator John McCain’s daughter Meghan McCain is the resident conservative Republican on “The View”.

“The View” has also expanded its most well-known segment, the roundtable discussion dubbed “Hot Topics”, from just a third of the show’s running time to half or more, betting on attention-grabbing headlines and the often heated exchanges between the ladies on the panel to sustain viewers.

Both shows have the requisite celebrity guest interview in the latter half of the show. Again, “The View”, naturally more political, regularly invites political figures such as former President Barack Obama as well as political commentators. “The Talk” relies entirely on celebrity guests, some of whom are not even major draws. This is moot to me, since I only tune in to each show to watch the ladies yak amongst themselves in their roundtable segments.

Judging each show by my proclivities, I do have a clear conclusion about which one succeeds most. “The View” tides me over, for the reasons above—it has more legitimacy but is still able to delve into melodrama, camp, and frivolity. Although its high turnover rate is unnerving and dispiriting, it has enough mainstay power players to anchor it. As a child of the 1980s and 1990s, I have a bias for Whoopi Goldberg as a pop culture fixture. Comedian Joy Behar’s sassy Italian schtick hasn’t gotten old—or perhaps, twenty-one years into the show’s run, I’ve simply grown attached to her presence. As for the rest of the current panelists, I feel neither strongly for nor against them. Sara is the bright blonde who keeps things light, or at least centered; Sunny adds more diversity and a touch of primness. Meghan obviously serves as an antidote to the clear liberal slant of the show’s two veterans, and for the most part I enjoy her dynamic. Not to paint her as an archetype, but I love a good “nemesis”, and Meghan is one by default, constantly having to defend her political party whenever President Trump drags it through the mud, which is often.

“The Talk” is sufficient, but my taste doesn’t quite extend to audience participation and an overabundance of pop culture fluff. And although it currently has the steadier panel lineup, I’m not especially fond of any of the panelists: moderator Julie Chen is too proper; Sara Gilbert is insightful but staid as well; Sharon is the venerable one who’s been around the block, but is a bit too mannered and biased in her outspokenness; newcomer Eve hasn’t proven her worth yet beyond tugging down the median age of the group; and Sheryl Underwood plays up the sassy black woman trope a bit too much.

Each show brings something to the table, and it’s merely a matter of taste. For my part, I mostly blur the edges that separate them. They’re like two sitcoms with an overlap of similarities and differences, and I like them both for different and similar reasons.