Fiona Apple: A Ranking of Her Albums

Fiona Apple’s new album, Fetch the Bolt Cutters, feels like a milestone. Maybe because of the relief it spells for us in these unprecedented, ambiguous times of a global pandemic; or because it’s her fifth album, rounding out a discography that now officially spans four decades; or maybe every album from the famously reclusive singer feels like a milestone, given the slow pace of artistic output she solidified with her previous release back in 2012.

For all of these reasons, it feels warranted to geek out over Apple’s music with that most irresistible and contentious of efforts: a list. With five albums under her belt, her catalogue feels ripe for a ranking. It’s all the more tempting because it’s no easy feat, considering how consistently ingenious her music has been across those four decades: how would you rank her albums from best to worst, or in her case, from best to almost-as-best?

  1. The Idler Wheel… – Her 2012 release has been aptly described as “distilled” Fiona: it showcases her artistic sensibility, style, and skill in top form. Although the piano was synonymous with her identity and music at the start of her career, and still is, The Idler Wheel… transcended instrumentation: its pared-down sonic landscape was a stark departure from Apple’s prior albums, yet her lyrics and melodies were instantly recognizable—and an extension of them as well, showing her artistic growth. Those assets, always Apple’s greatest strengths, were brought to the foreground. Songs like “Anything We Want”, “Hot Knife”, and “Every Single Night” are as rich and potent as any music with multitudes more instrumentation. It’s her most consistent record, without a single weak track. Apple was at her peak: the songs don’t aim to be catchy, but the melodies are indelible anyway. It’s the perfect balance of artistic and accessible.
  2. When the Pawn… – Apple’s sophomore album is another impressive balance of rich melodies and artistic innovation. In many respects, it’s her most satisfying album because it fires on all cylinders: beautiful production, potent lyrics, and inventive sounds. It’s no wonder this appears to be the fan favorite, from what I’ve read online—and I count myself among them. It draws from several influences—classic rock, hip-hop, show tunes, and spare piano torch songs—and weaves them into a rich tapestry that can be sung along to while also being digested for its lyrical meaning. “Paper Bag” remains one of her best songs for good reason: it’s lyrically and melodically taut yet bursting with ripe instrumentation, including a brass section. “I Know” is one of her loveliest—a quiet, infectious meditation on adoration and contentment. When the Pawn… is the complete package.
  3. Fetch the Bolt Cutters – Undoubtedly her least pretty album, but perhaps that’s because it’s the latest step in her artistic progression thus far: her most revealing, in a career that has always prioritized revelation. Like her 2012 release, it moves even further away from instrumentation and leans harder on lyrics and themes. The result is a palpably cathartic album that marries deeply personal experience with the primal impulse for release: pure art. What the melodies lack in accessibility, they make up for in sheer urgency and authenticity—they’re like chants you made up in the schoolyard while facing down bullies, or while lounging quietly in the privacy of your home. They pulse with vitality. “Ladies”, “Heavy Balloon”, and “Relay” touch on themes like jealousy, betrayal, and mental health without being didactic or heavy-handed.
  4. Tidal – Her most accessible album for its sheer sonic gloss, it features her catchiest songs: “Criminal”, “Sleep to Dream”, and “Shadowboxer”. The seemingly surface-level beauty of these songs is no detriment to their accomplishments; they sound as vibrant and relevant today as they did a quarter of a century ago. This album ranks lower than the others only because an artist like Apple can only improve with age—and she was just 18 when it was released in 1996. The lyrics aren’t as mature as those on her subsequent albums, naturally, but the melodies and gorgeous piano-laden instrumentation carry their appeal.
  5. Extraordinary Machine – This was always my least favorite album, perhaps because it sounds less urgent and distinctive than the rest. There are still a few gems that exemplify what I love most about Apple: “Parting Gift” is what she did best at the time—a girl with a piano singing about love askew—while “Waltz: Better than Fine” is a throwback, reminiscent of her preceding album’s foray into classic show-tune influences. The rest of the album does justice to Apple’s talents, but there’s a whimsical instrumentation and mood to it that, though it shouldn’t be synonymous with inferiority, I find less appealing.

Not Happily Ever After: Why the Disney Renaissance Ended…

With the recent slate of Disney films released to theaters, you could be forgiven for thinking we’re in the 1990s again. In the past two years, Disney has released live-action remakes of Beauty and the Beast, Aladdin, and The Lion King. Next year Mulan will follow suit, and The Hunchback of Notre Dame is now officially in the works too. All of these films are based on blockbuster animated features that Disney premiered in the 1990s.

They were part of the illustrious era of animated feature films known as the Disney Renaissance, which ran from 1989 to 1999. And frankly, it’s easy to see why these films still seem relevant, both chronologically and culturally: the generation that grew up with them—mostly Millennials, born between the early 1980s and the mid-1990s—has, at most, yet to reach age forty, and at the least has barely finished college. They’re also a notoriously nostalgic group, like most Americans, but with an even more ubiquitous platform on which to perpetuate that nostalgia: social media and the Web.

They grew up with the Disney Renaissance and have ostensibly kept its animated features relevant—a full two decades after the era’s official end. With at least half of the films from that era slated for reinterpretation as major live-action films, their enduring grasp on popular culture is unquestionable.

As a member of this generation and a bona fide Disney buff, I can attest that the memories of these films are indeed as vivid and fresh today as they were when I was at the target age for their appeal. They are touchstones of my childhood, imbuing me with a sense of wonder and a worldview that I still hold onto to this day.

Nonetheless, the Disney Renaissance did have a clear beginning and end: no new films were added to its run after 1999. As any Disney fan or even a casual observer will recall, subsequent animated features from Disney experienced a steep drop in popularity for nearly a decade afterwards, despite a steady output of releases.

As a fan with unrelenting exposure to these animated films, I have arrived at a conclusion as to why the phenomenon of the Disney Renaissance drew to a close at the end of the last century.

My theory is rooted in the catalyst that started the Disney Renaissance and made it popular in the first place. Kick-starting the era in 1989, The Little Mermaid was the first fairy tale the Disney studio had tackled in thirty years. That was largely why it was a resounding success: it returned to the formula that had worked so well for Disney in the past—a fairy-tale musical centered on a princess. Disney had erroneously abandoned this formula for nearly two decades prior, and had suffered with audiences, both commercially and artistically, as a result.

In hindsight, however, I believe the rediscovery of this formula during the Renaissance era would also become its undoing. If the era had one fault, it was that virtually every film adhered stringently to the formula—the exception being The Rescuers Down Under, an early entry that, tellingly, proved to be a commercial misfire. Every other animated feature between 1989 and 1999 featured: a beautiful or outcast protagonist, usually missing one or both parents; amusing sidekicks; a romantic interest; a colorful villain; an epic setting; and a roster of songs covering the requisite themes of life goals, romantic love, and villainy. This is not to dismiss the era’s considerable creative and technical achievements—it produced some of the most ingenious, memorable, and astonishing feats of song, visuals, and character, not to mention an unprecedented level of representation from Disney (lead characters of different ethnicities: Aladdin, Pocahontas, Mulan).

Nonetheless, it’s quite revealing that, compared to previous eras of Disney animated features, no other era could be accused of the same homogeneity: the Golden Age, from 1937 to 1942, had only two films that featured romantic love, only one princess, and two clear musicals. The Silver Age, too, had several films without romantic love or a musical format. These two eras are arguably the Renaissance’s biggest rivals in popularity and artistic achievement, and both came to an end for their own disparate reasons: the economic downturn of World War II after 1942, and the death of Walt Disney in 1966, respectively.

This redundancy may have begun to take its toll as early as the midway point of the era’s run: after four stellar blockbusters from the studio, things suddenly slowed down with the release of Pocahontas in 1995, widely viewed as the first critical letdown of the era. Things didn’t immediately return to form in 1996 or 1997 either, with the releases of The Hunchback of Notre Dame and Hercules; both films failed to draw audiences back after the misstep of Pocahontas. Something was amiss.

Audiences were being drawn elsewhere too: computer animation. This is perhaps the most commonly held theory of why the Disney Renaissance came to an end. Towards the end of the 1990s, a new medium was dawning, usurping the traditional hand-drawn (2-D) animation Disney was known for. With the release of Toy Story in 1995—a resounding success, not just for being the first major computer-animated feature but as a true triumph of story—audiences found a new outlet for family-friendly films that appealed to all ages. A slew of computer-animated (or CGI) films followed in gradual succession for the rest of the decade and beyond, none of which followed the renowned Disney formula—often to critical and commercial success that surpassed even Disney’s. If the Disney Renaissance proved that animated features could appeal to all ages, CGI films proved that they didn’t have to be based on classic, existing literature—opening the door for innovations in story that just happened to be married to a very innovative technology, one coming into its own at the end of the twentieth century.

Although I agree that CGI certainly disrupted Disney’s momentum in the latter half of the 1990s—particularly since CGI features have ostensibly remained more popular with audiences and critics alike, and 2-D animation has never come back into vogue—I still stand by my theory that it was more a matter of content than of medium. The onslaught of CGI feature films actually arrived rather slowly, and did not immediately crowd the market that 2-D Disney features dominated: after Toy Story premiered in 1995, the next CGI films were Antz and A Bug’s Life, both released in 1998. That left three full years in between, during which Disney released three animated features—ample opportunity to fill the void and maintain its stronghold on audiences. Yet they didn’t. The Hunchback of Notre Dame, Hercules, and Mulan, though not critical or commercial failures, were far less renowned than their Renaissance predecessors. Again, a malaise seemed to have settled over audiences, which could be read as a reflection of the films themselves rather than their medium. Audiences surely weren’t just holding off for the next great CGI film, having witnessed only the medium’s sole initial offering in 1995; the average moviegoer had no idea how CGI would eventually fare, though it was clearly a technological advancement. (It wasn’t until 2001 that the medium exploded, with simultaneous releases of multiple CGI animated films cementing it as a mainstay in cinema.)

It was apparent that audiences had simply grown tired of the Disney formula, and so the business changed after 1999—just as it had after the Silver Age ended in 1967, when the studio entered an earlier period of commercial and critical malaise following the death of Walt Disney.

With that, it’s also helpful to understand what followed the Disney Renaissance. From 2000 to 2008, the Post-Renaissance era indeed echoed the era that followed the Silver Age—the Bronze Age of 1970–1988—when the studio likewise struggled to redefine its identity to audiences. The films of the new millennium reflected these efforts, striking into new territories: science fiction, original stories not based on classic tales, even the CGI medium itself—a portent of the studio’s eventual direction. Most of the films from this era didn’t quite resonate enough with audiences to become classics.

The Revival era followed in 2009, with yet another rediscovery of the formula: Tangled cemented Disney’s return to the zeitgeist, followed by Frozen. Both were clearly fairy-tale musicals centered on a princess, but married now to the CGI medium, which Disney has adopted indefinitely to fit the times. Aside from the new look, these films hew quite closely to the Renaissance formula. Audiences responded and propelled them into the public consciousness just as they did during the Renaissance era—hence the new era’s name.

But if these new Revival films are following virtually the same formula as the Renaissance, why did the Renaissance cease in the first place? Shouldn’t it have endured, unabated, by sheer demand?

Again: we just needed a break. As a lifelong Disney fan, with the benefit of hindsight, I can’t fathom a Disney Renaissance-style film being released by the studio every year for the two decades after Tarzan, the last of the era, in 1999. On some level I would enjoy it, purely as a diehard fan, but it would almost become campy—a parody of itself, if you will. As much as audiences loved the Disney Renaissance, we could also sense the artistic malaise. The formula had gotten monotonous and stale—again, already by the midway point of the era—and audiences clearly reacted with their wallets.

Does that mean the Revival era is doomed to repeat history? Surprisingly, it may be averting that fate: although it has certainly resuscitated the Disney formula, there’s one telling factor separating it from the Disney Renaissance—it’s not following the formula for every new film. To the studio’s credit, and maybe by calculation, it isn’t doing princess stories or fairy tales exclusively. Maybe that’s part of the Revival’s success: Big Hero 6 and Zootopia are about as divergent from fairy tales and princesses as you can get, and both met with clear commercial success and critical acclaim—unlike the misfires of previous eras that also strayed from the formula.

Whether audiences realize it or not, perhaps this is what we need. We will always love and adore princesses and fairy tales, but there needs to be variety. There’s nothing wrong with having a studio trademark (family-friendly films, music, artistic visuals), but a trademark can be broad and still expand. Art is about change, pushing boundaries, and expanding possibilities. Sure, we do like some degree of familiarity—all art has a core familiarity: a movie has a beginning, middle, and end; music has notes and instruments, verses and choruses. But along with familiarity we need variety.

Perhaps Disney has a new, unprecedented confidence and competence that is allowing it to achieve something it couldn’t quite manage in the past: successfully telling classic stories and original new stories concurrently. Disney may have failed at pursuing either one at various times, not because either was theoretically bad, but because the studio wasn’t truly creating something exceptional. As mentioned, it was going through the motions after a certain point during the Disney Renaissance, settling into a creative ennui—or, alternately, striking into new territory with dubious artistic vision, as during the Post-Renaissance. But if a story is truly told well, it can succeed. Audiences will respond to something special even if it defies current trends, as they did when The Little Mermaid reignited a formula that had lain virtually dormant for nearly two decades, kick-starting the Disney Renaissance in 1989.

Will Disney get it right this time? We can only wait and see.

Any long-running institution will experience inevitable peaks and valleys in relevance and vitality—hence the different eras of Disney feature animation in the first place. I am resigned to the eventual fate of the storied Disney Renaissance of my youth because, to borrow a platitude, good things can’t last forever. Sitcoms come to an end. Book and film franchises end—and can even be revived after a certain period of absence (sound familiar?). The much-beloved Disney Renaissance is all the more revered because it wasn’t permanent; its limited duration lends it a rarity that invites gratitude and veneration. It was beautifully fleeting, as all life is.

It’s almost as if the beginning and ending of each era was inevitable—because like all art, Disney feature animation is an evolving medium. The studio is learning its craft in real time, and we get to watch it unfold onscreen.

‘Mid90s’, middle ground: lacking inspiration.

Mid90s proves that standard movie tropes remain familiar no matter how you dress them up. And first-time director Jonah Hill has certainly earned kudos for dressing his new film up to fit its epochal title: one only has to glimpse a few grainy frames (purposely shot on 16mm film for added effect) to be transported back to the last days before the millennium: compact discs, baggy clothes, big hair, and of course a nostalgic soundtrack from a seminal voice of the era, Trent Reznor.

Although the title references an entire cultural zeitgeist, the film is far from being all-encompassing in scope or subject. Instead, it’s an insular story built on specificity, resting under a rather prosaic and vague title for lack of keener inspiration, which is its biggest flaw.

The story begins in Los Angeles during its titular time period, with a preadolescent boy named Stevie. Hounded by his boorish older brother from the opposite end of the adolescent spectrum and given free rein by a laissez-faire mother suffering from arrested development, Stevie is primed for one of cinema’s biggest clichés: a summer he’ll never forget.

This leads into another hallmark of the period—the skateboarding underworld—as Stevie sets his sights on befriending a group of older boys at the local board shop.

As soon as he unremarkably worms his way into the affections of the boisterous but nonthreatening slackers, his story ticks off the requisite milestones of the coming-of-age subgenre: exhilarating new experiences, wise mentors, chafing against his family, high jinks that just skirt the line of true danger and serious trouble.

Since the plot is standard framework, the question is whether the parts make up for the sum. Stevie is competent enough as a protagonist: he fits the bill in looks and temperament without hitting any false notes. But the home life he shares with his threadbare family never truly generates a sense of urgency, which robs his arc of added weight. Stevie’s older brother and young mother aren’t guilty of anything beyond typical dysfunctional fare: physical taunts from the former, distraction from the latter. As for Stevie’s newfound entourage, they border on caricature, with raunchy nicknames and characterizations as slight and nuanced as a junior high yearbook.

The film suddenly hits a climax that can only be described as inorganic and, again, contrived—but this is in keeping with its steadily innocuous tone. Mid90s doesn’t seek to innovate or make a statement. It’s a light tale that never truly triumphs but never fails abysmally either, inhabiting a safe middle ground of familiarity—made all the more evident by its reliance on epidemic-level nostalgia for a past era that’s bound to pique audience interest. That nostalgia is the only true star of the movie; without it, the film would lose half of its distinction.

Nobody Walks in L.A.

L.A. has the worst pedestrians in the world—because we’re not used to them. It’s bad enough that it takes forever to drive a relatively short distance in this town due to traffic, but when you need to drive through an intersection and a person dares to walk across it first? It’s enough to make you curse the existence of humanity.

Sometimes it’s truly a test: on more than one occasion, I’ve been delayed by the truly physically impaired. Of course I empathize and wait patiently on those occasions, but those moments feel tailored to test the utmost limits of my character. It’s like halting an epic sneeze or cutting off a bowel movement midstream: the absolute urge to purge and the terror of following through with such a deplorable act call for your every last nerve to reverse the impossible.

On one such occasion, I had to make a left turn from a moderately busy lane; a slew of cars rolled through in the opposite direction, deterring me. My receptors were already piqued because the traffic was a tad unusual for this area on an early Saturday evening. I scanned my target intersection and saw two young men idling by on skateboards. They cleared before the train of cars did. Impatient, I began to eyeball the nearest traffic light up ahead that could clip this parade to my left. Then I saw it:

A disheveled, middle-aged man ambled arduously toward my designated cross street—on crutches. What’s more, in my periphery I caught an aberration on one of his legs: his right leg was amputated around the knee. Immediately, my mind jumped to do the math: at his laborious pace, and with the yellow light imminent up ahead, he would reach the intersection just as the cars on my left cleared.

I wasn’t in a rush. I wasn’t even angry at him. I was just resolutely amused that this was happening. It felt so indicative of this city. Here I was, driving a car still functioning well past its purported life expectancy, with takeout on the passenger seat, no plans for the night, half a mile from home—and normally I would’ve flipped out at a pedestrian who dared to cross a public street in direct tandem with my turning into it, except that in this scenario the perpetrator was possibly a transient with clear physical limitations and, by the looks of his tattered appearance, little to no means.

If I had flipped the switch into full selfish-pig mode at that very moment, even just privately in the confines of my car, I knew it still would’ve been a sin in the eyes of my conscience and whatever god may exist. I could also picture an audience of my fellow human beings at that moment, sneering and groaning at me if I were to recount the story on stage or if they were privy to it via a hidden surveillance camera—satisfied in their smugness that I was more terrible than they were, convinced that they would’ve felt nothing but angelic compassion in my position.

I drove home and lamented it all: the feckless logistics of this town, the cruel irony of fate, the snide hypocrisy of humans and my own presumptions about them—and my inability to resist being affected by all of this.

Movie Review: ‘It’ doesn’t deliver

Unless you’re from the tail end of Generation Z, you already know what Stephen King’s It is about. The question is whether the new film is a worthy take on the classic novel, which had been filmed only once before, as a well-known 1990 TV mini-series. Spoiler alert: I did give in to nostalgic curiosity and re-watched the original version before viewing this new one. Don’t worry—although I’d revered it as a fearful preteen back then, I was shocked to find it rather underwhelming now: a mild, moody drama with some decent scares thrown in.

So I was primed, and as objective as possible, about the prospective terrors of an ambitious new take from the best Hollywood has to offer today. From the opening scenes leading up to the introduction of Pennywise the clown, otherwise known as the titular It, the movie looked promising.

Unfortunately, it didn’t exceed expectations from there. First off, Pennywise the clown is the centerpiece of the entire story, hence the title; without his terrifying image or his concept of menacing evil, the story isn’t effective. Not to sound like a purist, but for lack of a better example: the original Pennywise, played by Tim Curry in the mini-series, was far more sinister. Although his look was barely a step away from a typical birthday clown, that’s what made him frightening: he was plausible. Here was a clown that could exist in your neighbor’s backyard, surrounded by innocent children—yet there was a studied vitriol to his gaze and a barely controlled sneer to his painted red lips. When he opened his mouth to lunge at last, that spray of razor-sharp teeth only confirmed our fears. The new Pennywise, played by Bill Skarsgard, is so stylized he’s as flat as a joker from a playing card. And as engaging. His appearances are not particularly memorable and are often upstaged by the other manifestations of “fears” he lobs at his victims: an abusive father, a distorted painting of a woman, a headless child from a history book.

What about the rest of the characters? The story centers on a gang of “losers” in the late 1980s: seven misfits from the local junior high in Derry, Maine, who band together as a means of survival—first from the social hierarchy of their peers, and eventually from the deadly curse Pennywise has inflicted on the town for nearly a century. The child actors who portray them are all competent, but only three of the characters are given personalities distinct enough to leave an impression. Bill, the stuttering de facto leader and protagonist who wants to get to the bottom of his little brother’s death from the opening of the film, is appealing and bright. The group’s lone female member, Beverly, stands out not just for being a girl but because she gets the most screen time to develop her troubled backstory, which includes an abusive, predatory father. Richie, the loudmouth comic relief of the group, is memorable by default because he’s the most vocal and biting. The rest of the kids aren’t fleshed out particularly well—they end up as ciphers who merely provide muscle and exposition.

As for the story itself, it lags in places and could have benefited from more urgent pacing—this is a horror story, after all, where timing is of the essence. The film inevitably lapses into some preteen requisites—crushes, friendships, betrayal—which is fine for the sake of character and plot development, but the overall momentum suffers as a result. The original novel was sprawling, yet the film still seemed unnecessarily long onscreen.

It’s fitting that this movie takes place in the 1980s, because the special effects seem almost right out of that era. Visual effects should never be relied on to propel a horror film, but they are surprisingly disappointing and innocuous here, given today’s technological advances. Combined with the middling pacing, that left very little to keep me on the edge of my seat. By the time the movie hit its climactic standoff between Pennywise and the brave, bereaved kids, I had given up waiting for something truly terrifying to materialize.

Overall, I don’t think this film will join the pantheon of truly classic horror films in my eyes. The hype clearly overshadowed the actual execution of the story onscreen. It ended up just being another underwhelming horror flick.

Movie Review: Tepid ‘Water’

The Shape of Water reads like an episode of The Twilight Zone or The Outer Limits—and not even a good one at that. It’s amusing and slight, but certainly no opus. From the promotional art and trailers alone, one already gets a clear sense of the inevitable plot: outcast human falls for outrageous human-like creature. It’s bound to be unconventional and exciting in premise, yet thoroughly predictable.

When Sally Hawkins’ mute heroine opines that her shocking new lover “doesn’t judge her,” it’s utterly expected. Her muteness suddenly proves to be nothing more than a plot convenience for an otherwise typical odds-against-them love story.

Granted, the film shouldn’t be punished solely for lack of originality—what film can truly achieve such a feat in a world where storytelling stretches back millennia? The problem is that it rests on clichés in execution as well: Elisa, the heroine, lives in an unnamed mid-twentieth-century metropolis, above a quaint movie theater, and works as a janitor in a drab government facility. The sets for these two locations, and the urban landscape in between, are utterly a twenty-first-century take on such places.

There is no genuine sense of remove. Again, this film could fit snugly on the small screen, within any season of a sci-fi drama series. For a film that aims to be unorthodox and novel in concept, it plays it safe. It could have gone full noir—texturizing the edges of the urban landscape, heightening the grimmer aspects of its story, visually and tonally—but it never does. It’s entirely paint-by-numbers, counting on audiences to relish the singular anomaly of interspecies love: a single wilting rose, when it could have been a bucketful.

In an age of increasingly fanciful storytelling and visuals (mostly found on the more risk-prone modern television medium), this film feels hopelessly quaint for all the wrong reasons: it’s not provocative or unusual, but pretends to be.

City of Broken Dreams

I volunteer at the local gay center occasionally. It’s located in the heart of Hollywood, on Santa Monica Boulevard just off of Highland. If you go a bit further north on Highland, you’ll hit Hollywood Boulevard next to what used to be called the Kodak Theater, where they hold the Academy Awards.

I don’t live too far away geographically, but as with everything in L.A., it’s cultural disparity that separates us, not distance. Driving up from my nondescript, low-key West L.A. neighborhood adjacent to Beverlywood, I’m essentially wading into the gritty, smoggy, unfamiliar waters of Hollywood whenever I venture there. More discerning people would have ardent reservations about even going there, barring an absolute emergency or valid necessity. Geographic prejudice is just one of the many charming traits of Angelenos you’ll discover here; I’m certain many of them take gleeful pride in it, much as they would a fine head of hair or an official job title.

One Monday morning, I gamely made the commute to do some filing for an upcoming event at the Gay Center. It was pleasant—getting out of my routine to help out with a good cause, while brushing shoulders with people I otherwise would never encounter on my own. The free pizza and cookies were just a bonus.

Halfway into my shift, I had to move my car to avoid parking regulations. I walked into the adjacent residential neighborhood, got into my vehicle, circled around onto Highland Avenue, parked, and trekked back to the Center. This unremarkable act spoke volumes about the intensity of this city and its continuing unfamiliarity to me.

In such close proximity to the Gay Center, several of its constituents were milling about: an African American transgender woman strutted down Highland Avenue, bemoaning the heat under her breath; a pair of stylishly dressed young gay men sauntered northward; a lone gay man in his late thirties to early forties glanced at me curiously as I reached the crosswalk.

The street glowed under the unseasonable heat for late October—all concrete, metal, and glass—cars and casually dressed denizens moving forward with purpose. Businesses held shelter like virtue.

Back at the Center, a middle-aged man and woman danced and frolicked to music from a boombox while a small, hairy dog looked on at their side. Their diligence suggested they were rehearsing for an imminent performance. They paid me no mind, and I paid them none.

It was at that moment that I tied everything together. I realized that I no longer possessed a sense of wonder that is synonymous with youth. Not too long ago, I would have been tickled with simple amusement at the sight of this quirky couple and their canine cohort. I would have mused over their arbitrary efforts and location—the myriad possibilities of their intentions and origins.

I would have felt joy at watching the nearby city streets emitting their own special music, new to my ears as a visitor. The pedestrians and storefronts would have told stories that I knew would continue on without my witness—the mystery of it all intriguing me.

I would’ve felt this like a child on a Saturday morning: plain reverence at the beauty of life and all it had to offer on one special day. Now? I’d woken up on a new day, and didn’t recognize what I saw in the mirror anymore. Or I did—I looked just like the hardened cynics who had scoffed at me whenever I expressed unmitigated wonder in this city.

I realized there was no sense of wonder for me anymore, because there was nothing new for me to see in this city. I knew the end of each story now, or rather, I knew where I belonged in the context of each one. I knew what to expect. I’d been trying in vain to make a connection in this fractured city. What did that tell me?

Without ambiguity, there is no need to be curious anymore. This is why people settle down and stop exploring. It isn’t necessarily a choice. It’s an acceptance of who you are and how you are received in this world. I was just holding out on it for much longer than most.

Clothes Don’t Make This Man

Please do not judge me by what I wear. Clothes are merely functional to me. Yes, I do believe people should at least wear something decent and flattering to their physique. I’m aware of the other extreme, and even I am critical of it: I’ve met people who don the sloppiest of attire, and it is truly unbecoming of them. There’s a valid argument for each person’s responsibility for their own presentation. But—and I admit I’m proposing my own biased mindset here—we shouldn’t expect more than that minimum from everyone.

I’m not knocking fashion. Like all creative mediums, it’s an art form in its own right. If you are passionate about it and truly embrace this medium as a form of self-expression: more power to you. But like all art forms—not everybody is interested to the same degree. There are cinephiles who don’t read. Bibliophiles who loathe movies. Foodies who don’t watch films. Fashionistas who don’t care for film. You catch my drift. To hold everyone to the same standard is an imperfect mindset, because like all arts, it’s subjective—and like I said: not everyone is interested to the same degree.

I would wear a potato sack if I could. I’m too busy devoting my time to books, movies, and music—aha, see, I do have aesthetic sense. It just doesn’t extend to what I wear. The fact that I love moody alternative rock does not translate into “moody, alternative” clothing—unless The Gap is considered edgy now. My predilection for obscure, artsy foreign dramas is hardly conveyed by my completely clean canvas of skin, as free of tattoos, piercings, and adornments as the day I was born. If you took one look at me and guessed my taste in culture based on my wardrobe, you’d swear I was a fan of Maroon 5 and the “Paul Blart” movies. (Hint: those are not good things.)

There you go. Sure, there could be some validity in addressing my (lack of) style sense. The decision not to indulge in expressing myself through clothing is a revelation in and of itself. If I had to guess, it would mean: I’m reserved, private about my passions and interests, and maybe—just maybe, I’ll give this much to my most vicious critics—a tad conservative, but only when it comes to appearance. I don’t worry too much about that last one, because my dark sense of humor and worldview are anything but.

See? Even the way I express myself does not translate into what I wear. My closest friends would attest that I’m quite unusual in my beliefs and interests. I’m the one who wants to try new things, goes for the unconventional, and is inherently bored by the ordinary. And yet I probably wear the most ordinary clothes of anyone.

It’s fair to say that we all start out with no fashion sense as children. But as we grow and discover our identity and sensibilities, it’s natural to start determining how we present ourselves on the outside. Some of us invest more time and effort into this than others. Somewhere along the way, I didn’t quite make that leap. I do have some taste in clothing—I know what I am and am not comfortable wearing—but I never went further than the minimum. I never incorporated notable depth into the armor one wears out in this world.

I feel like I’m starting to go in circles while waving my own flag here, so I’ll leave it at this: people can express themselves in many forms, so don’t just start and end with their appearance. For some, that is the last place they would convey any of their expression. Sounds crazy, but it’s true. Dig deeper. Look in other places. Listen and engage before you judge a person’s character. That cliché “don’t judge a book by its cover” was meant to be used in real life, you know.

Pop Culture and Me: a Forbidden Love Affair

No one expects me to like pop culture. I believe two key factors play into this: my race and my lack of style. I’m not going to change either one—or the unyielding fact that I’ve always been quite enamored of pop culture.

Okay, my race I can’t change. But could I change my style so that it reads as “media-savvy hipster”? Or at the very least, someone who looks like they watch TV?

How does that work? Should I wear “Walking Dead” t-shirts? Get a “Breaking Bad” tattoo? Wear everything I see at Forever 21 to prove that I’m just like everyone else?

The funny thing about being misunderstood is that although we loathe it, we secretly enjoy it too—because it proves that there’s more to us than meets the eye.

I suppose there are some people out there who are happy being simple and straightforward—easily “read”, or as the kids call it these days: basic. See, I am hip enough to know that.

The rest of us instinctively feel that that translates to being shallow, which is generally a pejorative unless you’re a reality star. Check—I know what constitutes a reality show star.

The truth is, I do play a role in my own conundrum too. It’s my lack of desire to assimilate on some levels that distances me from my peers, which fosters animosity and misunderstanding. But if I’m not interested in jumping on the latest bandwagon, that’s my right too. And being an individual does not preclude an awareness of what’s current in popular culture.

It’s not all bad either, to be fair. When I mentioned something about the Golden Globes one year (yes, I’m even an awards show junkie), a friend innocently remarked: “Wow, I thought you’d be—too cool to watch something like that.” Aww, ain’t that sweet? So maybe there is a contingent out there that isn’t attacking my character when assuming things about me. They’re simply deeming me to be more enlightened than I actually am, which is flattering—and less insulting.

But alas, I can succumb to frivolity as much as the next person. Who doesn’t enjoy the latest celebrity news? It’s like a large order of McDonald’s French fries: not good for you, but you’re not interested in being a saint. You’re allowed an indulgence once in a while. How utterly boring would it be if we only did things that were ethically “good” and enriching for us? If that were the case, there’d be no decent TV shows, movies, or music. We’d all be wearing white robes and chanting scriptures and talking about nothing more provocative than the weather.

So there you have it. The unremarkable reason why a person like me can enjoy the latest Adele album or the Oscars is just that: it’s human nature. Sometimes the simplest answer is the hardest one for people to see or accept. Apparently.

Why do people love the 80s?!?! (Try the 90s!)

America’s unnatural love of all things 1980s is like society’s reverence towards pregnant women: you can’t really counter it without sounding like a complete monster. But since I’m already an inherent outcast twice removed, I guess I’ll be the brave soul to take a stab at it (the ‘80s).

They say trends come in twenty-year cycles. I was born in the ’80s, and I remember, as a preteen, being glad when all the saccharine gaudiness of the decade vanished by the early 1990s. Little did I know that it would all come skipping back in an even more mannered, pretentious form ten years later—when I was in my TWENTIES, in the ’00s.

By 2003, you couldn’t surf the web without coming across an article proclaiming “Check out your favorite redheaded ’80s celebrities HERE!”, couldn’t hear a song that didn’t sample a classic ’80s synth-pop ballad, and couldn’t have a conversation with a grown woman who didn’t squeal, “Ohhh, I LOVE the ’80s!” Basically, ’80s nostalgia was like crack in the actual ’80s: integral to the social scene.

If you can’t guess by now, I have highly objective reasons why I don’t like the ‘80s. I came of age in the decade that succeeded it: the ‘90s. When I say “come of age”, I mean the (first) era of maturing in one’s life—your teen years.

Nothing is as great (or as bad) as when you are a teenager. If I had come of age during the 1890s, no doubt I’d be sitting here clamoring about how great churning butter was, and how kids these days are missing out on savoring fermented cow’s milk procured with their own two hands. So I’m aware that I suffer from a little bias.

I feel sorry that kids today didn’t grow up with angry, forlorn, edgy alternative-rock singers who somehow managed to be both dangerous and mainstream in that perfect window of time known as the 1990s. It was a truly magical time. I mean, MTV not only PLAYED music videos for significant chunks of the day, it actually focused on music from earnest, serious artists. Music hadn’t been this socially aware and provocative since the ’60s!

TV and movies vastly improved in my eyes too. Gone were the days when a movie focused solely on a nuclear family going on vacation, or a kid taking a day off from school. Movies with higher concepts were in vogue now: the term “indie” exploded, with all its subversive and innovative connotations. Disney rode a triumphant Renaissance wave through the first half of the decade. Summer blockbusters pushed their art to new, exhilarating heights, with movies like “Jurassic Park” and “Forrest Gump” setting records.

TV shows delved into darker and more progressive parts of the cultural psyche, with shows like “The Simpsons”, “Seinfeld”, “The X-Files” and “Roseanne” (although some of them debuted in the late ‘80s, they came into their prime in the ‘90s). Shows didn’t have to pander to the ideal family unit anymore. They could push the boundaries of what we found funny or intriguing, and succeed.

Look, I get the objective reasons why people love the penultimate decade of the twentieth century: it was simple. Sweet. Goofy. Over-the-top. Everything my fellow gay men love, which is why every gay man has some voluminous playlist somewhere that is nothing but ’80s, ’80s, ’80s—as well as the perfect ’80s getup, should he have the divine fortune of crossing paths with an ’80s-themed party. The ’80s is your kooky, fun, and slightly frivolous aunt, whereas the ’90s is your cooler but more sedate and socially conscious uncle. It’s kind of obvious who you’d rather party with.

But this is why I don’t like the ’80s: I don’t like things that are simple, sweet, and over-the-top. It’s not my style. I’m the jerk who likes things ironic, dark, and brooding—hence I will always identify with the Gen-X-dominated ’90s, and hence why most gay men have a convenient blind spot for that decade altogether. Seriously, can you imagine a gay man squealing about the ’90s? Only if he were forced to go to a ’90s-themed party—then he’d be squealing about his “other obligations that night” to get out of it. No gay man wants to be reminded of a classic Tarantino movie. It’s way too heavy, and our lives are already heavy enough. The same can be said for society at large, truly.

But the ’90s were innocent as well, compared to the decade(s) that followed. For one: back then, “social media” went only so far as logging into AOL via your phone cord, picking a terrible login name, and signing into a god-awful chat room full of strangers. We had virtually no digital footprint, and honestly, many minds and lives were saved because of it. Terrorism was not truly a household word until the tragic events of a fateful day in New York City the following decade. We didn’t yet have a country politically divided by a polarizing president. And a recession the likes of which hadn’t been seen since the 1930s had yet to hit.

So if you want something innocent and fun, but with a little more edge and a smidgen of self-important angst, why not make a pit stop in the decade just before the ’80s (if you’re traveling backwards in time, that is): the ’90s? You can geek out to Ace of Base and camp it up to the Spice Girls—but you can also show your gritty, “street cred” side by wearing baggy gangsta pants or grungy thrift-store plaid. The ’90s had its perks too, ya know.

Thankfully, it’s the 2010s now—well over twenty years since my favorite decade started its rotation under the sun. It’s finally getting more of the “respect” I always knew it deserved. Too bad it takes twenty years for some people to arrive at the party—but better late than never.