Boys

The subway doors open and three young men step in together, huddling at the entrance as the train moves forward. I instantly recognize their youth, casual beauty, and modest affluence–all the traits that fit the label of college students. Yet they are perfectly disparate from one another: the one in the center, facing me directly, is compactly built with a clean, clear, masculine face–the bona fide “ladies’ man” of the group. On his right is the most laid-back character, with longer hair and facial hair–the “stoner” of the group if there had to be one; on the left is the tallest but most unorthodox member–with glasses, floppy hair, and severe but pleasant facial features. If you shaved all of their hair and ordered them to don the same shirts and fatigues, they’d be uniformly attractive, yet retain subtler vestiges of these distinctions. In other words: they’re all winners, but some more than others.

They speak to one another with purpose, inaudible to me at my distance. Yet–with my imagination, the knowledge that it’s early evening on St. Patrick’s Day, and the fact that the ladies’ man of the group is sporting a striped green scarf and a bright green beaded necklace–I can fill in the gaps. Ladies’ man is reading his phone as the tall nerd asks him a question. Ladies’ man shows him the phone and answers. No doubt they’re on their way to some watering hole to celebrate the holiday. It must be one of their first–if not the first–since they reached legal drinking age. If I had any plans for the night, it’d be exactly two decades since I reached legal drinking age myself. The other difference is: I don’t drink anymore, and even if I wanted to, I have no posse to go drinking with anyway. It’s just one of the casualties, I’ve learned, that come with age on this earth.

I used to be one of these three young men, though. I used to be part of a group of men who resembled me in youth and beauty, yet were otherwise nothing alike. We had little in common besides those traits, but that was enough at that age. It was enough to explore the world together and enjoy life.

I can’t help but ponder the fate of these young men: I suppose I already answered the question of who would be most likely to get lucky that night, or any other night. But maybe I’m wrong, and I still wonder how each of them would pursue that primal goal, according to his own character, strengths, and knowledge.

To go even further: unbeknownst to them, as it was to me at their age, two decades will be snatched from them and they may find themselves in the most disparate circumstances. Will these doe-eyed, fresh-faced young men be hurtling towards the opposite of where they are headed at this very moment? I couldn’t help but pick the most incongruous scenario: would the seemingly mild “nerd” of the group be saddled with the ultimate responsibility–as a married man, with children? It’d be so different from the tender seedling he appears today, in contrast to his more traditionally masculine companions. It’s possible. It’s as old as time. Nearly everyone, it seems, heeds that ancient call.

On this day, I can’t help but wonder: was there someone watching me and my friends and thinking the same thing when I was that age? I’m sure there was. In fact, when I got lucky with someone I had spied during my many ventures about town, he made an admission: “I always thought you were cute.”

He must’ve watched me from afar several times, analyzing the differences between me and my friends, and secretly creating stories for us in his head. How did I compare to my friends? Was I the cute one? The tall one? The funny one? The nice one? How much joy did he take in admiring my beauty, or all of ours? Probably as much as I do now, with these three young men.

I could be bitter at the callowness on display today and their ignorance of the storm yet to come in their lives, but that’s not my nature. It’s neither here nor there anyway. If I’ve learned anything else in life, it’s that no two stories are exactly the same–similar, yes, but it’s foolish to make up your mind so surely about most matters. Life is as varied as the people in it. Sometimes we triumph, sometimes we fall down spectacularly. And when I did either one? Most of the time, beyond my mortal efforts–the rest was out of my control. I had my fun, and these young men will too. They will have these memories to look back on, as I do my own.

Tim Buckley: The Power of Be-ing

The best things don’t try to be the best. They simply be. If you think about it, that’s often the hardest thing to do. We are often trying to be something: smart, strong, happy, confident–as we work, socialize, make love, and go about nearly every other conceivable act of being human. But to simply be is often overlooked and quite extraordinary, not least for its novelty.

Folk singer Tim Buckley’s most popular song, “Song to the Siren,” is perhaps the best example of a song that achieves this rare grace–though perhaps it’s not fair to say that, since the world has as many songs as it does birds, great and small. His gentle ode is simply the first in a long time that I recognize with absolute clarity for achieving this.

A simple acoustic guitar accompanies Tim’s voice–which is nothing short of regal, like a minstrel’s from centuries ago–equal parts emotion and exposition, mystery and clarity. There is nothing like his voice in modern music. It’s warm and strong, but also cool and gentle.

The lyrics embody the same timelessness. As its title suggests, the song takes up the theme of sirens at sea. It starts off a little slow, lyrically, but by the third stanza it takes an extraordinary turn–painting a picture we’ve never quite seen before.

“Did I dream you dreamed of me?” Tim asks–a question I’d never heard before but that makes utter sense. “Were you hare when I was fox?” he continues–a startling, earthy metaphor that adds to the song’s mythic stature.

Tim reveals that the siren turns him away without reason, and the effect is devastating. That he recounts it in the same tone of voice only adds to his sorrow. We realize that the song was a dirge all along.

Though the song does not quite have the traditional verse/chorus/bridge structure, there is a semblance of one in its melodic and lyrical motifs: “Here I am, Here I am, waiting to hold you.” If the song set out to prove anything, it’s that traditional song structures are quaint and counterintuitive to the art of song. Its closest relative is a hymn; it could’ve been recited during the Renaissance, yet it was written and recorded in the 1960s.

It’s not that the song had no purpose or vision; on the contrary–it’s that it presented itself as naturally as breathing. As the lyricist (Buckley collaborator Larry Beckett) confirmed decades later, the song was a metaphor for unrequited love. Despite the ubiquity of that subject, what sets the song apart is its utter lack of bombast or histrionics. Its silence represents the absence of love itself.

This song, like all of Tim Buckley’s, was never going to be a hit. It was never going to be well-known to the public. It wasn’t going to win any awards. That’s because it would achieve something even rarer.

[See below for the live version of the song, which I believe is superior to, though not much different from, the original–which also featured mostly just his voice and guitar.]

21st Century Problem: Choice Overload

In the third decade of the new millennium, we have an embarrassment of riches to choose from for our movie-viewing pleasure: many of us subscribe to one or more streaming services that put hundreds of thousands of movies and TV shows at our disposal.

There is more content than you could ever conceivably watch, or want to watch, in your lifetime: popular movies, indie movies, comedies, dramas, horror, family-friendly, foreign, old, new–every genre you can think of. Even if you had all the time in the world, you could not make a significant dent in the ever-growing list of content. It’s a buffet table that seems to grow longer and longer every time you go back for more.

This is unfathomable when you consider that, until about a decade and a half ago, we still had to traipse down to the local video store and select one or two new VHS tapes or DVDs to satisfy our immediate entertainment needs.

Although that was a paltry setup by comparison, it was certainly less overwhelming than what we have today. While nothing forces me to watch anything on the streaming sites–not only do they have an immense bounty, some of them are even free now, because of course it’s come to that–their sheer abundance results in another twenty-first-century predicament: fear of missing out (FOMO). There’s a nagging part of me that can’t help but wonder if I’m missing out on some random, amazing film buried on a streaming site–maybe an old gem I’d been meaning to watch or a new film that could change my life–and if it’s free: why the hell am I not taking advantage of it? And since I’m old enough to remember shelling out bucks, gas fare, and physical energy just to select from a fraction of the films I’m privy to now, I feel like I’m betraying my old self, who would’ve been ecstatic for this privilege today!

It’s too much pressure, like having a free buffet every night. How much can I keep stuffing my face before I get sick of it all and crave something simple, small, and different instead?

I know everyone loves to claim that things were “better” when they were younger, but sometimes I do miss only having a couple of new VHS tapes to watch during the weekend. It made things easier: I knew when to start and stop watching movies, and could carry on with my life afterwards. Now, there are no clear boundaries between when to start and stop watching movies. It’s a new skill I’ll have to develop with this new way of obtaining movies.

25th Anniversary: ‘Jagged Little Pill’ — A New Perspective

Like a lot of older Millennials, I can remember the first time I heard Alanis Morissette. I was in eighth grade, and her debut music video in the U.S., “You Oughta Know,” began making the rounds on TV. I didn’t think much of it, but her second single, “Hand in My Pocket,” arrived months later and I became hooked, like the 33 million fans worldwide who would buy her blockbuster album, “Jagged Little Pill,” from 1995 onwards.

It wasn’t just that I was a fan; she literally introduced me to music—and at that pivotal age, it was momentous. Like many teenagers, my musical taste would be crucial in informing my identity. Aided by Alanis’ music, I discovered the genre she was ostensibly part of—alternative rock—and I was in love.

I remained a fan of Alanis in the subsequent years, still fondly playing her first and second albums regularly, well into my thirties. A quarter of a century later, however, I unexpectedly found myself coming around to that minority of outliers who, as I recalled, had initially rebuked the phenomenon of her music–perplexingly, it had seemed to me then.

Something had shifted in my sensibilities. Now the instrumentation, style, and concept of her music simply struck me as… inauthentic. They seemed preconceived, affected, and a little silly. It wasn’t that there wasn’t true talent involved; it was that the music was less about art than it was about entertainment. On that level, yes: the music was certainly catchy—enough for me to listen repeatedly for decades. I would never tire of her music in some sense, but I began to realize: maybe she wasn’t such an authentic artist after all, but just an entertainer with a phenomenal gimmick.

These were the accusations from her critics, twenty-five years ago upon the release of “Jagged Little Pill”. They had baffled me then, in their reservations against a surely indelible and spectacular artist—but now I understood where they were coming from.

I remember hearing people flatly say her music was “whiny” and an outlet for complaining—rather than profound and cathartic, as millions of fans attested. Her most famous critics declared her hackneyed and contrived; I could hear it in the instrumentation now—often, it sounded more like an imitation of rock music than actual rock music, if that makes sense. It was too slick and mannered for its own good. A sophisticated ear is a tall order for a fourteen-year-old; what was my excuse for the last twenty years? Maybe when I listened to her music between then and now, my judgment was clouded by my own nostalgic attachment to it.

Morissette’s credibility was suspect from the start: prior to “Jagged Little Pill,” she’d released two strictly dance-pop albums that were indicative of their time: the early 1990s. Her about-face with “Pill” a few years later—reinventing herself as an alternative rock singer at the peak of the grunge phenomenon—was suspiciously convenient. That she collaborated with veteran music producer Glen Ballard—who was accomplished, but best known for polished pop rather than rock music—only perpetuated doubts about Alanis’ “rock” status. Honestly, this theoretical calculation on her part wouldn’t have bugged me, except that she didn’t pull it off artistically after all.

It’s telling that Morissette never repeated the success of “Pill”—commercially or artistically. Not that an artist should replicate their style or subject repeatedly, but she never attained the same relevance even on a strictly esoteric or artistic level. Her follow-up album came the closest, but even now it suffers from the same affliction as its behemoth predecessor: inhabiting a dubious sonic limbo between art and entertainment. In fact, all of her subsequent albums shared this trait. That was no accident.

It’s no wonder that, no matter how beloved and entrenched “Jagged Little Pill” is in popular culture, it rarely, if ever, landed on any of those contentious, retrospective “Best of” lists from presumably serious music critics. Those dissertations always lent themselves to debate, which is why their unanimous omission of this album is all the more telling—and rarely debated!

Don’t get me wrong—I still think Alanis Morissette’s music has merit, but her music is akin to a blockbuster movie: it may become a beloved fixture in popular culture, but it’s not necessarily the finest example of its medium. In many ways, it’s no less valuable for bearing this quality, and there’s no shame in liking it. I will always have a place in my heart for her brand of music, just as I do for other fun pop music, blockbuster films, or cheesy TV shows. They all serve a purpose. Twenty-five years later, I may have changed, but I can still laud this landmark album for what it remains: a pivotal moment in pop culture—for me, and for the millions of fans who made it one of the biggest albums of all time.

Not Happily Ever After: Why the Disney Renaissance Ended…

With the recent slate of Disney films being released to theaters, you could mistake this moment for the 1990s all over again. In the past two years, Disney has released live-action remakes of Beauty and the Beast, Aladdin, and The Lion King. Next year Mulan will follow suit, and The Hunchback of Notre Dame is now officially in the works too. All of these films are based on blockbuster animated features that Disney premiered in the 1990s.

They were part of the illustrious era of animated feature films known as the Disney Renaissance—from 1989 to 1999. And frankly, it’s easy to see why these films still seem relevant, chronologically and culturally: the generation that grew up with them, mostly Millennials born between the early 1980s and the mid-1990s, has yet to reach forty at the oldest—and has barely finished college at the youngest. They’re also a notoriously nostalgic group, like most Americans—but with an even more ubiquitous platform to perpetuate that nostalgia: social media and the Web.

They grew up with the Disney Renaissance and have ostensibly kept the relevance of its animated features alive—literally two decades after its official end. With at least half of the films from the era slated for reinterpretation as major live action films, their enduring grasp on popular culture is unquestionable.

As a member of this generation and a bona fide Disney buff, I can attest that the memories of these films are indeed as vivid and fresh today as they were when I was at the target age for their appeal. They are touchstones of my childhood, imbuing me with a sense of wonder and a worldview that I still hold onto to this day.

Nonetheless, the Disney Renaissance did have a clear beginning and end. No new films augmented its original run after 1999. As any Disney fan or even a casual observer will recall, subsequent animated features from Disney experienced a steep drop in popularity for nearly a decade afterwards, in spite of a continual output of releases.

As a fan with unrelenting exposure to these animated films, I have arrived at a conclusion as to why the phenomenon of the Disney Renaissance drew to a close at the end of the last century.

My theory is rooted in the catalyst that started the Disney Renaissance and made it popular in the first place. Kick-starting the era in 1989, The Little Mermaid was the first fairy tale that the Disney studio had worked on in thirty years. This was largely why it was a resounding success: it returned to the formula that had worked so well for Disney in the past—a fairy tale musical, centered on a princess. Disney had erroneously abandoned this formula for nearly two decades prior, and had suffered commercially and artistically with audiences as a result.

In hindsight, however, I believe the rediscovery of this Disney formula during the Renaissance era would also become its undoing. If the era had one fault, it was that literally every film adhered stringently to the formula—with the exception of The Rescuers Down Under, an early entry that, tellingly, proved to be a commercial misfire. Every other animated feature between 1989 and 1999 consisted of a beautiful or outcast protagonist (usually missing one or both parents), amusing sidekicks, a romantic interest, a colorful villain, an epic setting, and a roster of songs covering the requisite themes of life goals, romantic love, and villainy. This is not to dismiss the considerable diversity of creative and technical achievement of the era—which produced some of the most ingenious, memorable, and astonishing feats of song, visuals, and character—not to mention an unprecedented level of representation from Disney (lead characters of different ethnicities: Aladdin, Pocahontas, Mulan).

Nonetheless, it’s quite revealing that, compared to previous eras of Disney animated features, no other era could be accused of the same homogeneity: the Golden Age, from 1937 to 1942, had only two films that featured romantic love, only one princess, and two clear musicals. The Silver Age had several films without romantic love or a musical format as well. These two eras are arguably the biggest rivals of the Renaissance in popularity and artistic achievement. Both came to an end for their own disparate reasons—an economic downturn after 1942 due to World War II, and the death of Walt Disney in 1966, respectively.

The redundancy may have begun to take its toll as early as the midway point of the era’s run: after four stellar blockbusters from the studio, things suddenly slowed down with the release of Pocahontas in 1995. It was widely viewed as the first critical letdown of the era, and things didn’t immediately return to form in 1996 or 1997 either, with the releases of The Hunchback of Notre Dame and Hercules. Both films failed to draw audiences back after the misstep of Pocahontas. Something was amiss.

Audiences were being drawn elsewhere too: computer animation. This is perhaps the most commonly held theory of why the Disney Renaissance came to an end: towards the end of the 1990s, a new medium was dawning—usurping the traditional, hand-drawn (2-D) animation that Disney was known for. With the release of Toy Story in 1995—a resounding success not just for being the first major computer-animated feature but as a true triumph of story—audiences found a new outlet for family-friendly films that appealed to all ages. A slew of computer-animated (or CGI) films followed in gradual succession for the rest of the decade and beyond, none of which followed the renowned Disney formula—and often to critical and commercial success, surpassing even Disney. If the Disney Renaissance proved that animated features could appeal to all ages, CGI animated films proved that they didn’t have to be based on classic, existing literature—opening the doors for innovations in story that just happened to be married to a very innovative technology, now coming into its own at the end of the twentieth century.

Although I agree that CGI certainly disrupted Disney’s momentum in the latter half of the 1990s—particularly since CGI animated features have ostensibly remained more popular with audiences and critics alike, and 2-D animation has never come back into vogue since—I still stand by my theory that it was more a matter of content than of medium. Also, the onslaught of CGI feature-length films actually occurred rather slowly, and did not immediately crowd the market that 2-D Disney animated features dominated: after Toy Story was first released in 1995, the next CGI films were Antz and A Bug’s Life, both premiering in 1998. That left three full years in between, during which Disney released three animated features that could have vigorously filled the void and maintained its stronghold on audiences—yet they didn’t. The Hunchback of Notre Dame, Hercules, and Mulan were released by Disney during this period and, though not critical or commercial failures, were far less renowned than their predecessors from the Disney Renaissance. Again, a malaise seemed to have settled over audiences, which could be read as a reflection of the medium’s output. Audiences surely weren’t just holding off for the next great CGI film, having witnessed only the medium’s sole initial output in 1995. The average moviegoer had no idea how the CGI medium would eventually fare, though it was clearly a technological advancement. (It wasn’t until 2001 that the medium exploded with simultaneous releases of multiple CGI animated films, cementing it as a mainstay in cinema.)

It was apparent that audiences had simply grown tired of the Disney formula, and so the business changed after 1999, just as it did after the Silver Age in 1967—when the studio entered a prior era of commercial and critical malaise, following the death of Walt Disney.

With that, it’s also helpful to understand what followed the Disney Renaissance: from 2000 to 2008, the Post Renaissance era indeed echoed the era that followed the Silver Age—the Bronze Age of 1970–1988, when the studio also struggled to redefine its identity to audiences. The resulting films of the new millennium would reflect these efforts, striking into new territories such as science fiction, original stories not based on classic tales, and even the CGI medium—which would be a portent of the studio’s eventual direction. Most of the films from this era didn’t quite resonate enough with audiences to become classics.

The Revival era followed in 2009, with yet another rediscovery of the formula—Tangled cementing Disney’s return to the zeitgeist, followed by Frozen. Both were clearly fairy-tale musicals centered on a princess, but married now to the new CGI medium, which Disney has converted to indefinitely to fit the times. Aside from the new look, these films are quite similar to the Renaissance formula. Audiences responded and propelled these films into the public consciousness as they did in the Renaissance era—hence the new era’s name.

But if these new films from the Revival are following virtually the same formula as the Renaissance, why did the Renaissance cease in the first place? Shouldn’t it have endured, unabated, by sheer demand?

Again: we just needed a break. As a lifelong Disney fan, with the benefit of hindsight, I couldn’t fathom a Disney Renaissance-style film being released by the studio every year for the two decades after Tarzan, the last of that era, in 1999. On some level, I would enjoy it purely as a diehard fan, but it would almost become campy—a parody of itself, if you will. As much as audiences loved the Disney Renaissance, we could also sense the artistic malaise. The formula had gotten monotonous and stale—again, already by the midway point of the era—and audiences clearly reacted with their wallets.

Does that mean that the Revival era is doomed to repeat history? Surprisingly, it may be averting this fate: although it has certainly resuscitated the Disney formula, one telling factor separates it from the Disney Renaissance—it’s not following the formula for every new film. To its credit, and maybe by calculation, the studio isn’t doing princess stories or fairy tales exclusively. Maybe that’s part of its success: Big Hero 6 and Zootopia are among the titles that are as divergent from fairy tales and princesses as you can get. Both met with clear commercial and critical acclaim—unlike the misfires of previous eras that also strayed from the formula.

Whether audiences realize it or not, perhaps this is what they need. We will always love and adore princesses and fairy tales, but there needs to be variety. There’s nothing wrong with having a studio trademark (family-friendly films, music, artistic visuals), but the trademark can be broad and can expand. Art is about change, pushing boundaries, and expanding possibilities. Sure, we do like some degree of familiarity—all art has a core familiarity: a movie has a beginning, middle, and end; music has notes and instruments, verses and choruses. But along with familiarity we need variety.

Perhaps Disney has a new, unprecedented confidence and competence that is allowing them to achieve something they weren’t quite able to do in the past: successfully tell classic stories and original new stories, concurrently. Disney may have failed at pursuing either one at various times in the past, not because either was theoretically bad—but because they just weren’t truly creating something exceptional. As mentioned, they were going through the motions after a certain point during the Disney Renaissance, settling into a creative ennui—or alternatively, striking into new territory with dubious artistic vision, as during the Post Renaissance. But if a story is truly told well, it can potentially succeed. Audiences will respond to something special even if it defies current trends, as they did when The Little Mermaid reignited a formula that had gone virtually defunct for nearly two decades, kick-starting the Disney Renaissance in 1989.

Will Disney get it right this time? We can only wait and see.

Any long-running institution is going to experience inevitable peaks and valleys in relevance and vitality—hence the different eras of Disney feature animation that exist in the first place. I am resigned to the eventual fate of the storied Disney Renaissance of my youth, because, to borrow a platitude: good things can’t last forever. Sitcoms come to an end. Book and film franchises end—and can even be revived after a certain period of absence (sound familiar?). The much-beloved Disney Renaissance is all the more revered because it wasn’t permanent and was limited in duration. That lends it a rarity that invites further gratitude and veneration. It was beautifully fleeting, as all life is.

It’s almost as if the beginning and ending of each era was inevitable—because like all art, Disney feature animation is an evolving medium. The studio is learning their craft in real time, and we get to watch it unfold onscreen.

Nobody Walks in L.A.

L.A. has the worst pedestrians in the world—because we’re not used to them. It’s bad enough that it takes forever to drive a relatively short distance in this town due to traffic, but when you need to drive through an intersection and a person dares to walk across it first? It’s enough to make you curse the existence of humanity.

Sometimes it’s truly a test: on more than one occasion, I’ve been delayed by the truly physically impaired. Of course I empathize and wait patiently on those occasions, but those moments feel tailored to test the utmost limits of my character. It’s like halting an epic sneeze or cutting off a bowel movement midstream: the absolute urge to purge and the terror of following through with such a deplorable act call on your every last nerve to reverse the impossible.

On one such occasion, I had to make a left turn from a moderately busy lane; a slew of cars rolled through in the opposite direction, deterring me. My receptors were already piqued because this traffic was a tad unusual for this area given it was an early Saturday evening. I scanned my target intersection, and saw two young men idling by on skateboards. They cleared before the train of cars did. Impatient, I began to eyeball the nearest traffic light up ahead that could clip this parade to my left. Then I saw it:

A disheveled, middle-aged man ambled arduously toward my designated cross street—on crutches. What’s more—in my periphery, I caught an aberration on one of his legs—yes, his right leg was amputated around the knee. Immediately, my mind jumped to do the math: at his laborious pace and with the yellow light imminent up ahead, he would reach the intersection just as the cars on my left cleared.

I wasn’t in a rush. I wasn’t even angry at him. I was just resolutely amused that this was happening. It felt so indicative of this city. Here I was, driving a car that still functioned well past its purported life expectancy, with takeout on my passenger seat—no plans for the night, half a mile from home—and normally I would’ve flipped out at a pedestrian who dared to cross a public street in direct tandem with my turning into it, except that in this scenario the perpetrator was possibly a transient with clear physical limitations and, by the looks of his tattered appearance, little to no means.

If I had flipped the switch into full selfish pig mode at that very moment, even just privately in the confines of my car—I knew it still would’ve been a sin, in the eyes of my conscience and whatever god may exist. I could also picture an audience of my fellow human beings sneering and groaning at me if I were to recount the story on stage, or if they were privy to it via a hidden surveillance camera—satisfied in their smugness that I was more terrible than they were, convinced that they would’ve felt nothing but angelic compassion in my position.

I drove home and lamented it all: the feckless logistics of this town, the cruel irony of fate, the snide hypocrisy of humans and my own presumptions about them—and my inability to resist being affected by all of this.

Interpersonal Skills: I can’t deal with people.

I’ve come to the conclusion: I can’t deal with people.

Although by my mid-thirties I know life is a constant learning experience and that we can traverse the entire continuum of allegiances and viewpoints, I have gleaned from my social experiences thus far that my interpersonal skills are simply lacking.

First off, I’m poor at asserting myself. When faced with a scenario where a simple expression of my needs would suffice, I am often drowned out by the myriad connotations of the situation: who is involved, how much I love/fear/loathe/need them, what words or actions spurred the need to assert myself—and how it affects me emotionally.

Beyond that, I seem to lack the same interests, motives or needs that many people exhibit in socializing: I don’t crave status, dominance, or social gain through who I associate with.

As any experienced person knows, there are tacit “games” that people play with one another—through physical action, comments, rejection—to assert their needs and agenda in regards to others.

I’m not interested.

I can’t deal with people judging others based on what they look like, who they hang out with, or what job they have.

I can’t deal with people who aggressively label me—thinking they “know me” but they really don’t, and when I inevitably prove them wrong they get mad at me, of course, because they’re upset that the world doesn’t fit their perception of it.

I can’t deal with people who put others down in order to build themselves up. I can’t deal with people who gleefully abuse others for this purpose—who have no qualms making an innocent human being miserable.

I can’t deal with people using others for personal gain, including those they had considered their friends and closest colleagues.

I don’t want to trade barbs with people, because on an instinctual level I don’t want to sink to that level. It disgusts and unnerves me to see myself behave that way. For many people, if I can’t do that—then I am simply a target for their deplorable behavior, and therefore I must avoid them for my own safety and self-respect.

Consequently, even if I possessed the fortitude to assert myself more effectively, my general distaste for our social mores and behaviors could well keep me from ever engaging. I don’t want to correct people’s behavior towards me—not just because I’m incompetent at it, but because it offends and repulses me that I have to display certain traits to earn better treatment.

It sounds like a cop-out, and in a way—it is. After all, life is full of things we don’t want to do but that are essential to a healthy life that truly benefits us. Each day, we wake, wash, and dress ourselves—that in and of itself is a requisite for a healthy existence. The vast majority of us must work at an occupation to earn resources that will acquire us more resources.

Interpersonal skills are not as tangible as our bodies, food, water, and a roof over our heads—but they are just as vital for the social animal that we are.

This is where I clash. My principles seem to be at odds with the rudimentary mechanics of socializing.

It’s a shame, because what I lack in grit I make up for in other virtues: as a friend, I’ve been told that I’m fun, open-minded, tolerant, and unconventional. I challenge the norms of society for the greater good of seeing the world anew. I am loyal, kind, generous, and gracious. I am accepting and thoughtful most of the time. I am engaging, but also capable of great independence. I have clearly defined interests and opinions that define me and can serve others.

Look, I’m not perfect either, and I can even be guilty of unsavory behavior towards others, but for the most part I believe in a higher state of coexistence. And this is another hindrance to my interactions with others.

At the risk of sounding hopelessly naïve or oblivious, I believe in a world where we tolerate our differences instead of persecuting each other for them. I believe in treating each other with decency and at least minimal respect, even if we differ in lifestyle, views, or appearance. I believe in equality—that we are all inherently valuable, and therefore stringent hierarchy or status is irrelevant. I believe that as long as a person is not harming anyone, they should be accepted as they are—not persecuted because of someone else’s expectations or ideology. I know this isn’t plausible in our world, but that is my core approach to life, and it informs how I view and interact with others.

This is the reason why I feel separate from most people, and different.

I’ve realized this is the reason why I am often confounded when people invariably end up being… human.

It’s all too common for people, including those we’d entrusted ourselves to, to lash out at one another—because of differing temperaments, beliefs, expectations, ideologies, and needs.

At this age, I’ve experienced the disappointment of so-called “friends” who display less than stellar traits towards me, and handle me in a way that directly opposes basic decency and humanity.

I’ve only been able to count on a small handful of friends who haven’t turned on me yet—and of that minority, many are simply not present enough in my daily life to risk offending me.

This, I feel, must be the resolution to my anomalous condition: to seek out and zero in on the rare people who will not see me as a target for their foibles and dire needs.

When I find such a commodity, I must treasure them and keep them in my life—because they will be my principal social outlet, since it appears that I am not capable of much more than that.

Will I ever find such rare exceptions? That’s the question.

City of Broken Dreams

I volunteer at the local gay center occasionally. It’s located in the heart of Hollywood—on Santa Monica Boulevard, just off of Highland. If you go a bit further north on Highland, you’ll hit Hollywood Boulevard next to the Kodak Theater where they used to hold the Academy Awards.

I don’t live too far away, geographically, but as with everything in L.A., it’s cultural disparity that separates us, not distance. Driving up from my nondescript, low-key neighborhood of West L.A., adjacent to Beverlywood, I’m essentially wading into the gritty, smoggy, unfamiliar waters of Hollywood whenever I venture there. More discerning people would have ardent reservations about even going there, barring an absolute emergency or valid necessity. Geographic prejudice is just one of the many charming traits of Angelenos you’ll discover here. I’m certain many of them take gleeful pride in it, much as they would a fine head of hair or an official job title.

One Monday morning, I gamely made the commute to do some filing for an upcoming event at the Gay Center. It was pleasant—getting out of my routine to help out with a good cause, while brushing shoulders with people I otherwise would never encounter on my own. The free pizza and cookies were just a bonus.

Halfway into my shift, I had to move my car to avoid parking regulations. Walking through the adjacent residential neighborhood, I got into my vehicle, circled around onto Highland Avenue and parked, then trekked back to the Center. This unremarkable act spoke volumes about the intensity of this city and its continuing unfamiliarity to me.

In such close proximity to the Gay Center, several of its constituents were milling about: an African American transgender woman strutted down Highland Avenue, bemoaning the heat under her breath. A pair of young gay men, stylishly dressed, sauntered northward on the street. A lone gay man in his late thirties to early forties glanced at me curiously as I reached the crosswalk.

The street glowed under heat unseasonable for late October—all concrete, metal, and glass—cars and casually dressed denizens moving forward with purpose. Businesses held shelter like virtue.

Back at the Center, a middle-aged man and woman danced and frolicked to music from a boombox while a small, hairy dog looked on at their side. Their diligence suggested they were rehearsing for an imminent performance. They paid me no mind, and I paid them none.

It was at that moment that I tied everything together. I realized that I no longer possessed a sense of wonder that is synonymous with youth. Not too long ago, I would have been tickled with simple amusement at the sight of this quirky couple and their canine cohort. I would have mused over their arbitrary efforts and location—the myriad possibilities of their intentions and origins.

I would have felt joy at watching the nearby city streets emitting their own special music, new to my ears as a visitor. The pedestrians and storefronts would have told stories that I knew would continue on without my witness—the mystery of it all intriguing me.

I would’ve felt this like a child on a Saturday morning: plain reverence at the beauty of life and all it had to offer on one special day. Now? I’d woken up on a new day, and didn’t recognize what I saw in the mirror anymore. Or I did—I looked just like the hardened cynics who had scoffed at me whenever I expressed unmitigated wonder in this city.

I realized: there was no sense of wonder for me anymore, because there was nothing new for me to see in this city. I knew the end of each story now, or rather: I knew where I belonged in the context of each one. I knew what to expect. I’d been trying in vain to make a connection in this fractured city. What did that tell me?

Without ambiguity, there is no need to be curious anymore. This is why people settle down and stop exploring. It isn’t necessarily a choice. It’s an acceptance of who you are and how you are received in this world. I was just holding out on it for much longer than most.

Do the Oscars Matter Anymore?

It’s that time of year again: when people come together to talk about what some famous actresses wore—who wore it best—oh, and which film won Best Picture. Probably something artsy and serious. Sometimes it’s deserved—a film of true excellence and craftsmanship in writing, acting, and directing. But usually it’s just a film that you may or may not have seen. (I don’t know about you, but I’ve gotten to the point where I’ve decided all serious dramas will be relegated to DVD viewing—‘cause, you know: why do I need to see talking faces on a big screen?) Also, movie prices are astronomical, so—okay, I see it: I’m part of the cycle, and part of why Hollywood is nickel-and-diming every potential film that passes through its gates in the hopes of production. No wonder they’re settling for the bottom line so often—a “sure” thing (read: sequel, prequel, or remake of something that did legitimate business once). But I digress.

Anyway, it’s the Oscars again. And, of secondary importance, it is 2017. I make a point of the year because, frankly, I don’t believe the Oscars are what they once were—nor have they been for a long time now.

I often wonder what my younger doppelganger would think of this Hollywood pastime today. What do young, budding (okay, and gay!) dreamers like me think of this rapidly declining tradition of awarding the “Best” in Motion Picture Arts and Sciences?

Cut to: me in the early 1990s. Maybe it’s because things often look better in retrospect, or I just didn’t know any better because I was a kid, but the Oscars felt like they meant something back then. The five, count ‘em, just five nominated films for Best Picture (more on that category’s expansion to ten nominees later) really felt like they had earned that coveted spot. Each nominated film felt special, and it was usually a tight race that was more or less about merit, not just politicking by studios and adhering to the social trends of the day.

Culturally, budding gay—I mean, budding dreamers of all stripes only had a few outlets to view their favorite stars back then: People magazine and “Entertainment Tonight”. Which meant we were primed and hungry to see all these stars convene on one epic night—a smorgasbord of glamour, glitz, and at least to an idealistic kid like me back then: talent!

The Oscars have been cheekily dubbed “the Super Bowl for Women”—in terms of annual cultural impact and significance. But unlike the actual Super Bowl, the Oscars have been morphing and changing notably, and have gradually been eclipsed by other, smaller Super Bowls over the past two decades.

In the age of Twitter, TMZ, and the E! Channel, we can literally follow our favorite stars online 24/7 to see what they ate for breakfast or what color their kids’ poop is; spy on them as they exit an airport terminal via shaky video footage, or consume their daily lives in a craftily executed weekly reality TV show.

With these enlightening options bestowed on us by technical progress, the mystery of what it means to be rich and famous and talented has become rote and accessible in ways never before imaginable.

I have a feeling my teenage doppelganger today would view the Oscars the same way I viewed silent films or drive-in movie theaters when I was a teen in the 1990s.

Perhaps in response to this changing culture (read: poorer ratings for the telecast—undoubtedly due to the Academy’s penchant for nominating “serious” films that don’t do much business at the box office), the category for Best Picture was expanded in 2009 to include up to ten nominees. The Academy claimed this was a throwback to the early years in the 1930s and ‘40s, when there were up to ten nominees per year—but many cynical observers assumed it was a blatant attempt to nab more viewers for the annual show. The quip “Are there even ten films worthy of being nominated every year?” hit the web quicker than you could say ‘Action!’ Incidentally, the Oscars had suffered their lowest TV ratings ever the previous year, so read into the subsequent change however you want.

As I alluded to earlier, I could relate to the criticism of the merit of today’s films—let alone their worthiness of being nominated for such an honor. In our current cinematic climate, I think the cap of five nominees is, or should’ve been, more relevant than ever—an elite prestige worth striving for, artistically.

Nearly a decade later, the expansion of nominees hasn’t made a mark on me as an Oscar viewer or a movie fan. If anything, it makes it harder for me to remember which films were nominated each year—but that could be more of a reflection of my waning interest in the show altogether.

In 2016, the Academy was confronted with yet another issue—this time a moral one. The lack of diverse nominees that year spurred a boycott by many African-American artists and viewers, who claimed a racial bias against them. Although I understood the greater issue of diversity, as a minority myself even I had reservations about the campaign. Was the Academy biased, or were there simply no quality films that year that starred African-Americans (or other ethnic groups)? If it was the latter, the issue wasn’t the Academy, but the movie industry itself.

Nonetheless, in true form, the Academy reacted swiftly with their image in mind—claiming they would add a significant number of women and people of color to their voting bloc. The validity of this gesture aside, the detrimental publicity also left a viewer like me wondering how sincere future nominations would be. As well intentioned as the campaign was in shedding light on the Oscars’ lack of diversity, the fallout could be that the Academy might overcompensate and recognize films (not people, mind you) of lesser merit to meet political correctness.

This shifting of objectives and influences only hastened the Oscars’ rapidly declining relevance in my eyes. It was no longer simply about awarding the best films—it had become a commercial and social experiment gone awry.

But this was nothing new overall: the Oscars have always been about more than just the merit of moviemaking, of course.

I turned eighteen when the world entered a new millennium in 2000—the year “American Beauty” won against a highly publicized awards campaign for its chief rival nominee, “The Cider House Rules”. Maybe it’s because I’d technically become an adult and therefore achieved full enlightenment at last, but the fact that a movie studio launched a publicity campaign to sway voters to choose their film was not lost on me. Apparently, voters don’t just go into hibernation and pick winners, then emerge back into the real world alive and rejuvenated by the purity of their choices.

The validity of their choices has often been debated for other reasons as well: awarding an actor or director for their current, less stellar work simply to acknowledge their greater body of work is another common longstanding ploy.

That being said, it’s safe to say the curtain has finally come down on my love affair with the Oscars. Honestly, over the last few years I’ve been less and less drawn to the extravaganza. As late as 2013, I still recall having a few vestiges of the excitement I’d felt in my youth—feeling like I was witnessing something greater than myself. But over the past two years, and on the eve of this year’s ceremony, it’s dawned on me that the show’s heyday has long since passed. That doesn’t detract from the merit of truly good movies, but that’s the thing: good movies and the Oscars are not the same thing, and they haven’t been for a long time.

So it’s that time of year again—when people come together to talk about what some famous actresses wore—and who wore it best. Oh, and which film won Best Picture. Exactly. That’s all it is.

La La Land: The Story of Us (Review)

They say there are no new stories to tell, and nothing new under the sun—that phrase itself is a cliché, see. Yet we still need these tales, for ineffable and primal reasons. Why? Well, it’s ineffable—so sometimes it’s beyond words.

“La La Land” is one of the most familiar stories in the modern canon: girl has a dream of stardom and pursues it in spite of demoralizing setbacks. On top of that, it’s a love story between a girl and a boy—and the boy essentially has the same dream. This sandwich of familiarity is enough to send any quasi-cynic running for shade—and I don’t mean for cover. But oddly, many of us are still on board with this setup; so much so that this film has become the breakout hit of the year: Oscar bait and pop cultural force. And for good reason: for many, it’s simply our story.

The film opens with an inventive musical nod to one of the hallmarks of the city of stars: L.A. traffic. In a gridlock on a steep highway overpass, passengers do the most natural thing in a musical: break into song and dance—jumping on top of their vehicles, courting each other out of their cars, and dancing on the concrete lanes with unmitigated reverie, proclaiming in the song’s title that it’s “Another Day of Sun”. For a person who’s lived in Los Angeles longer than I care to divulge, this scene blew straight past my jaded antennae and bowled me over with its unabashed whimsy. Instead of scoffing at the absurdity of it all, I wanted to join in.

At the song’s end, we meet Mia, played by the pixie-ish, quirky Emma Stone. Mia is an aspiring thespian who heeded her childhood calling to Tinseltown to realize her dreams of becoming a star—but mostly to tell stories through her craft, like every bleeding-heart artist on Earth. Although Mia is certainly likable, the film is less about character than plot and ideals. Stone is competent as always but, you guessed it, does not add a new wrinkle to this careworn archetype. She does add another notch to her increasingly impressive repertoire, proving that Hollywood may not be so shallow after all: in one scene, after a humiliating audition, Mia zips through a hall of Stone lookalikes who are also vying for the part. In the elevator, flanked by two of these clones, she is clearly the least statuesque and nubile.

This doesn’t stop her from catching the eye of another aspiring artist, Sebastian—a somewhat aging (by Hollywood standards—read: thirties) jazz pianist played by the still smoldering and chiseled Ryan Gosling. He has the slightly more original dream of the two by default: to simply open up a jazz club, which is a feat because it’s a jazz club and this is the twenty-first century.

In a subdued, if not entirely original, setup, Mia is drawn into a nondescript nightclub by the chords of a pensive tune she hears Sebastian playing inside. This melody becomes a smartly recurring musical motif throughout the film. It’s there that she spies Sebastian, but thankfully it’s not love at first sight for either party. They meet again shortly after through serendipity (he’s a keyboardist at a Hollywood party she attends), and the inevitable develops between these two passionate artists—cue heated debates on the merits of their crafts and the plausibility of their dreams to sustain them, and—romantic love. They inspire each other and cheer each other on, unsurprisingly.

These scenes are padded by more musical numbers—less grandiose in overall production, but still charming and catchy—particularly the lovely, haunting theme of the film, “City of Stars”. These numbers also continue to pay winning tribute to more L.A. trademarks and locales like the Griffith Observatory, beach piers and the Watts Towers. The film does lose its musical momentum in the second half of its story, which will not go unnoticed by musical connoisseurs. For novices like me, it’s the best of both worlds: I enjoyed the songs far more than I’d expected from a traditional film musical, but I was just as happy to be saddled with plain plot and character in the interim, however uneven.

I won’t disclose too much of the remaining plot, because it’s no trouble guessing for as proverbial a tale as this anyway: the story reaches its emotional apex when Mia can’t bear another humiliating failure, and it hits the sore spot that many viewers who bought into this story for personal reasons fear most—the thought that she may not be destined for greatness after all.

Nonetheless, the ensuing conclusion is probably what you’d expect for both aspiring artists in a film like this. And that is the reason why these familiar stories still work: we need to be reminded that things are possible. It’s not cliché; it’s human—which one came first? (Well, frankly, the human—but one thing informs the other.) Notably, the movie does handle the love story between Mia and Sebastian with less hackneyed results, and I will leave that utterly out of this review for the viewer to discover on their own.

“La La Land” is nothing new, but it’s a tale we’ll never grow tired of, because many of us will always care about the things it cares about.