‘Mid90s’, middle ground: lacking inspiration.


Mid90s proves that standard movie tropes stay familiar no matter how you dress them up. And first-time director Jonah Hill has certainly earned kudos for dressing his new film up to fit its epochal title: one only has to glimpse a few grainy frames (purposely shot on 16mm film for added effect) to be transported back to the last days before the millennium: compact discs, baggy clothes, big hair and, of course, a nostalgic score by a seminal voice of the era, Trent Reznor.

Although the title references an entire cultural zeitgeist, the film is far from all-encompassing in scope or subject. Instead, it’s an insular story built on specificity, resting under a prosaic, vague title for lack of keener inspiration, and that is its biggest flaw.

The story begins in Los Angeles during its titular time period, with a preadolescent boy named Stevie. Hounded by his boorish older brother from the opposite end of the adolescent spectrum and given free rein by a laissez-faire mother suffering from arrested development, Stevie is primed for one of cinema’s biggest clichés: a summer he’ll never forget.

This leads into another hallmark of the period, the skateboarding underworld, as Stevie sets his sights on befriending a group of older boys at the local board shop.

As soon as he unremarkably worms his way into the affections of the boisterous but nonthreatening slackers, his story ticks off the requisite milestones of the coming-of-age subgenre: exhilarating new experiences, wise mentors, chafing against his family, and high jinks that just skirt the line of true danger and serious trouble.

Since the plot is standard framework, the question is whether the parts make up for the sum. Stevie is competent enough as a protagonist: he fits the bill in looks and temperament without hitting any false notes. The home life he shares with his threadbare family never truly generates a sense of urgency, which denies his arc any added weight. Stevie’s older brother and young mother aren’t guilty of anything beyond typical dysfunctional fare: physical taunts from the former and distraction from the latter. As for Stevie’s newfound entourage: they border on caricatures, with raunchy nicknames and slight characterizations as nuanced as a junior high yearbook.

The film suddenly hits a climax that can only be described as inorganic and, again, contrived—but this is in keeping with its steadily innocuous tone. Mid90s doesn’t seek to innovate or make a statement. It’s a light tale that never truly triumphs but never fails abysmally either, inhabiting a safe middle ground of familiarity, made all the more evident by its reliance on epidemic-level nostalgia for a past era that’s bound to pique audience interest. That nostalgia is the only true star of the movie; without it, the film would lose half of its distinction.


Nobody Walks in L.A.


L.A. has the worst pedestrians in the world—because we’re not used to them. It’s bad enough that it takes forever to drive a relatively short distance in this town due to traffic, but when you need to drive through an intersection and a person dares to walk across it first? It’s enough to make you curse the existence of humanity.

Sometimes it’s a true test: on more than one occasion, I’ve been delayed by the genuinely physically impaired. Of course I empathize and wait patiently, but those moments feel tailored to test the utmost limits of my character. It’s like halting an epic sneeze or cutting off a bowel movement midstream: the absolute urge to purge and the terror of following through with such a deplorable act call on your every last nerve to reverse the impossible.

On one such occasion, I had to make a left turn from a moderately busy lane; a slew of cars rolled through in the opposite direction, deterring me. My receptors were already piqued because this traffic was a tad unusual for the area on an early Saturday evening. I scanned my target intersection and saw two young men gliding by on skateboards. They cleared before the train of cars did. Impatient, I began to eyeball the nearest traffic light up ahead that could clip this parade to my left. Then I saw it:

A disheveled, middle-aged man ambled arduously toward my designated cross street—on crutches. What’s more, in my periphery I caught an aberration on one of his legs: his right leg was amputated around the knee. Immediately, my mind jumped to do the math: at his laborious pace, and with the yellow light imminent up ahead, he would reach the intersection just as the cars on my left cleared.

I wasn’t in a rush. I wasn’t even angry at him. I was just resolutely amused that this was happening. It felt so indicative of this city. Here I was, driving a car still functioning well past its purported life expectancy, with takeout on my passenger seat, no plans for the night, half a mile from home—and normally I would’ve flipped out at this pedestrian who dared to cross a public street in direct tandem with my turning into it, except that in this scenario the perpetrator was possibly a transient with clear physical limitations and, by the looks of his tattered appearance, little to no means.

If I had flipped the switch into full selfish-pig mode at that very moment, even just privately in the confines of my car, I knew it still would’ve been a sin in the eyes of my conscience and whatever god may exist. I could picture an audience of my fellow human beings at that moment too, sneering and groaning at me if I were to recount the story onstage, or if they were privy to it via a hidden surveillance camera—satisfied in their smugness that I was more terrible than they were, convinced that they would’ve felt nothing but angelic compassion in my position.

I drove home and lamented it all: the feckless logistics of this town, the cruel irony of fate, the snide hypocrisy of humans and my own presumptions about them—and my inability to resist being affected by all of this.

The View vs. The Talk


I love watching women gab. As sexist as it sounds, I’ll just say it: they’re good at it. I imagine it’s the equivalent of people tuning in to watch physically fit men play sports. Also, if I really want to fully commit to being politically incorrect: maybe it’s part of my DNA as a gay man to enjoy hearing women yak about everything from the profound to the frivolous. I can relate, and it’s fun.

Since the beginning of this decade, we’ve had two major options for watching the biggest and brightest women in pop culture do just this, on daytime TV in the U.S.

Venerated journalist Barbara Walters set the precedent in 1997 with a show called “The View”—featuring ‘different women with different points of view’ sitting around a table and discussing the day’s biggest headlines. It ruled the roost as the lone standard for the concept until 2010, when former child actress Sara Gilbert had the sterling idea to do an offshoot of the format, with the angle that it would consist of a panel of “moms” (although its predecessor never played down the maternal status most of its own panelists could claim, either). As a viewer, though, I wasn’t discerning. It made sense: in a nation as large and diverse as ours, one of the benefits is how we can expand on commodities… like talk shows. After all, multiple late night talk shows have competed directly with one another for decades, each thriving regardless of the saturated market. So when a new daytime show featuring a panel of half a dozen women discussing the news with their “different points of view” popped up, we took it in stride.

Both “The View” and “The Talk” have succeeded with viewers and been nominated for the same Daytime Emmy Awards over the years, solidifying their places in the pop culture lexicon.

But is there a real difference between them, or a clearly superior one?

“The View” has the advantage of experience on its side: a thirteen-year head start over its rival. With that plethora of time, it’s seen and done many things it can learn from. Infamously, seating two panelists who are diametrically at odds with one another is ratings gold: when outspoken liberal Rosie O’Donnell was recruited as the show’s moderator in 2006, during the contentious Bush/Iraq War years, the writing was on the wall—she would ultimately clash spectacularly with then-outspoken conservative panelist Elisabeth Hasselbeck the following year. It was the best daytime drama that needed no script.

The show also has the undeniable class factor that only a highly respected figure in journalism like Barbara Walters can provide. Although “The View”’s reputation has ebbed and flowed, as any long-running entity’s is prone to do, its pedigree is still rooted in solid stock.

It’s not without its trials, though. The show has “jumped the shark” as much as a talk show can, in the sense of creative and production malaise. Since the 2010s, there has been highly visible turnover among the show’s panelists—it’s hard to even keep up with who’s officially on the roster these days, like watching your favorite sitcom characters get replaced by new actors or new characters you just don’t care for. Many of the new recruits were blatantly regrettable as well (Candace Cameron Bure and Raven-Symoné dragged down the credibility of the show, imho! Thankfully, their tenures were scant). The show has even rehired previously retired or departed co-hosts such as longtime favorite Joy Behar, Sherri Shepherd and even Rosie O’Donnell herself (who again stayed for only one season, in 2014, mirroring her infamously clipped first round).

“The Talk” also tinkered with its lineup after its debut season, which is to be expected of a fledgling show. It found its footing with a consistent lineup afterwards and has had only one panelist replacement since.

Another difference: “The Talk” places less emphasis on formality. The show humors its audience by directly asking them questions after bringing up a headline—whether a serious news story or celebrity gossip, moderator Julie Chen will cap it off with a prompt to encourage monosyllabic responses, boos, hisses, or laughter from the live audience, reminiscent of, well, a daytime talk show (a 1990s version, more so).

Since the show is filmed in Los Angeles, another distinction from its New York City predecessor, it also has a daily celebrity guest correspondent who contributes a pop culture headline (adding to the pop culture news that inevitably permeates the show anyway), in a segment loosely dubbed “Today’s Top Talker”.

As one can guess, “The View” skews more toward the serious and the political. Although its current longtime moderator Whoopi Goldberg is a veteran Hollywood actress, she is outspokenly political and even good-naturedly mocks the more frivolous pop culture news she’s required to broach regularly (read: reality show fodder).

Other panelists, however short their tenures in recent years, have frequently been renowned political pundits as well, something “The Talk” has steered clear of completely. Currently, Senator John McCain’s daughter Meghan McCain is the resident conservative Republican on “The View”.

“The View” has also expanded its best-known segment, the roundtable discussion dubbed “Hot Topics”, from just a third of the show’s running time to half or more, betting on attention-grabbing headlines and the often heated exchanges among the ladies on the panel to sustain viewers.

Both shows have the requisite celebrity guest interview in the latter half of each episode. Again, “The View”, naturally more political, regularly invites political figures such as former president Barack Obama, along with assorted political commentators. “The Talk” relies entirely on celebrity guests, occasionally ones who aren’t even major draws. For me, this is moot, since I only tune in to each show to watch the ladies yak amongst themselves in their roundtable segments.

Judging each show by my own proclivities, I do have a clear conclusion about which one succeeds most. “The View” tides me over, for the reasons above: it has more legitimacy but is still able to delve into melodrama, camp, and frivolity. Although its high turnover rate is unnerving and dispiriting, it has enough mainstay power players to anchor it. As a child of the 1980s and 1990s, I have a bias for Whoopi Goldberg as a pop culture fixture. Comedian Joy Behar’s sassy Italian schtick hasn’t gotten old—or perhaps, after her twenty-one years on the show, I’ve simply grown attached to her presence. As for the rest of the current panelists, I feel neither strongly for nor against them. Sara is the bright blonde who keeps things light, or at least centered; Sunny adds more diversity and a touch of primness. Meghan obviously serves as an antidote to the clear liberal slant of the show’s two veterans, and for the most part I enjoy her dynamic. Not to paint her as an archetype, but I love a good “nemesis”, and Meghan is one by default, constantly having to defend her political party whenever President Trump drags it through the mud, which is often.

“The Talk” is sufficient, but my taste doesn’t quite extend to audience participation and an overabundance of pop culture fluff. And although it currently boasts the steadier panel lineup, I’m not especially fond of any of its panelists: moderator Julie Chen is too proper; Sara Gilbert is insightful but staid as well; Sharon is the venerable one who’s been around the block, but is a bit too mannered and biased in her outspokenness; newcomer Eve hasn’t proven her worth yet beyond tugging down the group’s median age; and Sheryl Underwood plays up the sassy black woman trope a bit too much.

Each show brings something to the table, and it’s merely a matter of taste. For my part, I mostly blur the edges that separate them. They’re like two sitcoms with overlapping similarities and differences, and I like them both for similar and different reasons.

Album Review: Radiohead’s ‘A Moon Shaped Pool’ is One-of-a-Kind Art Rock


The cover art for Radiohead’s new album A Moon Shaped Pool is a fitting metaphor for the music within: a corrosive, abstract form that defies definition. It is the venerable band’s most oblique and sonically dense record yet, in a catalog that has consistently defied musical categorization.

It’s easy to get lost in theorizing about the mystique and motives of such an artistically lauded band instead of focusing on the work that propelled them into that position in the first place. Their new album stands on its own as a musical work of art, and as an admirable extension of the ambitions and abilities of musicians, writers, and artists who have now been in the public eye for a quarter of a century.

My first response to the album was not immediate devotion as a fan, but a caveat: it was not outwardly accessible, even for a band that never aimed for accessibility. It threw me.

Listened to on a cursory level, the eleven-song cycle could be reduced to opaque whispers, hushed strings, and formless melodies—all wind and sail, with no immediate soul or beat to anchor one’s mind to. Or so I thought.

As with certain records we’ve all doubtless encountered, I had to be in the right headspace to receive it. And as with the muse behind art, there is no formula for discovering its beauty. It simply arrives on its own.

When it occurred, I found myself intoxicated, in ineffable fascination with the album’s sonic landscape. Guitarist Jonny Greenwood was the prime force behind the songs’ use of haunting choruses and emotive strings, imbuing the album with a strange and indescribable beauty that singled it out from any other album I’ve ever listened to in the rock genre.

Whether it was merely artifice or thematic ingenuity, I found myself quietly stunned and enraptured, like a zealot hearing his gospel at church. If music stirs you instinctually or cerebrally or both, it’s done its job.

The opening track and first single, “Burn the Witch”, is the most conventionally structured song on the album. Buoyed by a fast, frightening string section straight out of an arty horror film, it’s a tense and urgent tune that gives way to a far more oblong musical journey afterwards.

“Daydreaming” follows—a languid, sleepy meditation of the kind lead singer Thom Yorke has essentially been writing since OK Computer in 1997. It’s a summation of all the fears, dreads, and wonders the band persistently chases throughout their discography, yielding new insights with each new phase they enter.

“Decks Dark” perhaps best encapsulates the album’s musical identity: an anomalous hybrid of harrowing, archaic choir, weaving strings, Yorke’s croon, and ebullient keyboards. Even when the voice and lyrics aren’t articulate, the emotion is there—it’s all in the sound.

Another track, “The Numbers”, is a heightened version of this combination, propelling the sensation of strings and conventional rock instrumentation to new levels of hysteria and transcendence.

Elsewhere, there are more familiar sounds that Radiohead fans will recognize: “Ful Stop” seeps in quietly, building on a searing guitar loop that crescendos periodically through the cacophony of Yorke’s falsetto wails and shimmering noises.

“Present Tense” is a rollicking, acoustic guitar-driven track that is reminiscent of the band’s prolific first decade of the new millennium. It is no less riveting for being familiar: a plaintive, yearning ode that hooks you throughout its tense course.

Lastly, the album closes with a tune that has batted around the Radiohead canon for two decades now as a fan favorite at concerts, only officially appearing on the band’s live EP I Might Be Wrong: Live Recordings in 2001. Now, “True Love Waits” is punctuated with muted piano chords and lilting keyboard effects. For a fan of the poignant, acoustic guitar-centered live version, it was initially unnerving. But in the context of this searing album, it is fitting. The somber, sedate take feels earned: wary yet hopeful, shattered but enthralled by the wonders of feeling, it’s a serene close to an otherworldly journey.

Movie Review: ‘It’ doesn’t deliver


Unless you’re from the tail end of Generation Z, you already know at least what Stephen King’s It is about. The question is whether the new film is a worthy take on the classic novel, which had been filmed only once before, as a well-known 1990 TV mini-series. Full disclosure: I gave in to nostalgic curiosity and re-watched the original version before viewing this new one. Don’t worry: although I’d long revered it as a fearful preteen back then, I was shocked to find now that it was rather underwhelming—a mild, moody drama with some decent scares thrown in.

So I was primed for, and as objective as possible about, the prospective terrors of an ambitious new take from the best Hollywood has to offer today. From the opening scenes leading to the introduction of Pennywise the clown, otherwise known as the titular It, the movie looked promising.

Unfortunately, it didn’t exceed expectations from there. First off, Pennywise the clown is the centerpiece of the entire story, hence the title. Without his terrifying image and aura of menacing evil, the story isn’t effective. Not to sound like a purist, but the original Pennywise, played by Tim Curry in the mini-series, was far more sinister. Although his look was barely a step away from a typical birthday clown, that’s what made him frightening: he was plausible. Here was a clown who could exist in your neighbor’s backyard, surrounded by innocent children—yet there was a studied vitriol to his gaze and a barely controllable sneer on his painted red lips. When he opened his mouth to lunge at last, that spray of razor-sharp teeth only confirmed our fears. The new Pennywise, played by Bill Skarsgård, is so stylized he’s as flat as a joker from a playing card. And as engaging. His appearances are not particularly memorable and are often upstaged by the other manifestations of “fears” he lobs at his victims, in the forms of an abusive father, a distorted painting of a woman, and a headless child from a history book.

What about the rest of the characters? The story centers on a gang of “losers” in the late 1980s: seven misfits from the local junior high in Derry, Maine, who congregate as a means of survival from the social hierarchy of their peers—and eventually, from the deadly curse that Pennywise has inflicted on the town for nearly a century. The child actors who portray them are all competent, but only three of the characters are given distinct personalities that leave an impression. Bill, the stuttering de facto leader and protagonist who wants answers about the death of his little brother in the film’s opening, is appealing and bright. The group’s lone female member, Beverly, stands out not just for being a girl, but because she gets the most screen time to develop her troubled backstory, which includes an abusive, predatory father. Richie, the loudmouth comic relief of the group, is memorable by default because he’s the most vocal and biting. The rest of the kids aren’t fleshed out particularly well—they end up as ciphers who merely provide muscle and exposition.

As for the story itself, it lags in places and could have benefited from more urgent pacing, given that this is a horror story, where timing is of the essence. The film inevitably lapses into some preteen requisites (crushes, friendships, betrayals, and the like), which is fine for the sake of character and plot development, but the overall momentum suffers as a result. Although the original novel was sprawling, the story still seems unnecessarily long onscreen.

It’s fitting that this movie takes place in the 1980s, because the special effects seem almost right out of that era. Although visual effects should never be relied on to propel a horror film, they are surprisingly disappointing and innocuous here, given today’s technological advances. Combined with the middling pacing, that left very little to keep me on the edge of my seat. By the time the movie hit its climactic standoff between Pennywise and the brave, bereaved kids, I had given up my search for something truly terrifying to materialize.

Overall, I don’t think this film will join the pantheon of truly classic horror films in my eyes. The hype clearly overshadowed the actual execution of the story onscreen. It ended up just being another underwhelming horror flick.

City of Broken Dreams


I volunteer at the local gay center occasionally. It’s located in the heart of Hollywood—on Santa Monica Boulevard, just off of Highland. If you go a bit further north on Highland, you’ll hit Hollywood Boulevard next to the Kodak Theater where they used to hold the Academy Awards.

I don’t live too far away geographically, but as with everything in L.A., it’s cultural disparity that separates us, not distance. Driving up from my nondescript, low-key West L.A. neighborhood adjacent to Beverlywood, I’m essentially wading into the gritty, smoggy, unfamiliar waters of Hollywood whenever I venture there. More discerning people would have ardent reservations about even going there, barring an absolute emergency or valid necessity. Geographic prejudice is just one of the many charming traits of Angelenos you’ll discover here. I’m certain many of them take gleeful pride in it, much as they would a fine head of hair or an official job title.

One Monday morning, I gamely made the commute to do some filing for an upcoming event at the Gay Center. It was pleasant—getting out of my routine to help out with a good cause while rubbing shoulders with people I otherwise would never encounter on my own. The free pizza and cookies were just a bonus.

Halfway into my shift, I had to move my car to avoid parking regulations. Walking through the adjacent residential neighborhood, I got into my vehicle, circled around onto Highland Avenue and parked, then trekked back to the Center. This unremarkable act spoke volumes about the intensity of this city and its continuing unfamiliarity to me.

In such close proximity to the Gay Center, several of its constituents were milling about: an African American transgender woman strutted down Highland Avenue, bemoaning the heat under her breath. A pair of young gay men, stylishly dressed, sauntered northward up the street. A lone gay man, in his late thirties to early forties, glanced at me curiously as I reached the crosswalk.

The street glowed in the unseasonable late-October heat—all concrete, metal, and glass—cars and casually dressed denizens moving forward with purpose. Businesses held shelter like virtue.

Back at the Center, a middle-aged man and woman danced and frolicked to music from a boombox while a small, hairy dog looked on at their side. Their diligence suggested they were rehearsing for an imminent performance. They paid me no mind, and I paid them none.

It was at that moment that I tied everything together. I realized that I no longer possessed a sense of wonder that is synonymous with youth. Not too long ago, I would have been tickled with simple amusement at the sight of this quirky couple and their canine cohort. I would have mused over their arbitrary efforts and location—the myriad possibilities of their intentions and origins.

I would have felt joy at watching the nearby city streets emitting their own special music, new to my ears as a visitor. The pedestrians and storefronts would have told stories that I knew would continue on without my witness—the mystery of it all intriguing me.

I would’ve felt this like a child on a Saturday morning: plain reverence at the beauty of life and all it had to offer on one special day. Now? I’d woken up on a new day, and didn’t recognize what I saw in the mirror anymore. Or I did—I looked just like the hardened cynics who had scoffed at me whenever I expressed unmitigated wonder in this city.

I realized: there was no sense of wonder for me anymore, because there was nothing new for me to see in this city. I knew the end of each story now, or rather: I knew where I belonged in the context of each one. I knew what to expect. I’d been trying in vain to make a connection in this fractured city. What did that tell me?

Without ambiguity, there is no need to be curious anymore. This is why people settle down and stop exploring. It isn’t necessarily a choice. It’s an acceptance of who you are and how you are received in this world. I was just holding out much longer than most.

Clothes Don’t Make This Man


Please do not judge me by what I wear. Clothes are merely functional to me. Yes, I do believe people should at least wear something decent and flattering to their physique. I’m aware of the other extreme, and even I am critical of it: I’ve met people who don the sloppiest attire, and it is truly unbecoming of them. There is a valid argument for each person’s responsibility for their own presentation. But, and I know I’m proposing my own biased mindset here: we shouldn’t expect more than that minimum from everyone.

I’m not knocking fashion. Like all creative mediums, it’s an art form in its own right. If you are passionate about it and truly embrace this medium as a form of self-expression: more power to you. But like all art forms, not everybody is interested to the same degree. There are cinephiles who don’t read. Bibliophiles who loathe movies. Foodies who don’t care for fashion. Fashionistas who never watch films. You catch my drift. To hold everyone to the same standard is an imperfect mindset, because like all arts, it’s subjective—and like I said: not everyone is interested to the same degree.

I would wear a potato sack if I could. I’m too busy devoting my time to books, movies, and music—aha, see, I do have aesthetic sense. It just doesn’t extend into what I wear. The fact that I love moody alternative rock does not translate into “moody, alternative” clothing—unless The Gap is considered edgy now. My predilection for obscure, artsy foreign dramas is hardly conveyed by my completely clean canvas of skin, as free of tattoos, piercings and adornments as the day I was born. If you took one look at me and did me a solid by guessing my taste in culture from my wardrobe, you’d swear I was a fan of Maroon 5 and the “Paul Blart” movies. (Hint: those are not good things.)

There you go. Sure, there could be some validity in reading into my (lack of) style sense. The decision not to indulge in expressing myself through clothing is a revelation in and of itself. If I had to guess, it would mean: I’m reserved, private about my passions and interests, and maybe, just maybe—I’ll give this much to my most vicious critics—a tad conservative, but only when it comes to appearance. I don’t worry too much about that, because my dark sense of humor and worldview are anything but.

See? Even the way I express myself does not translate into what I wear. My closest friends would attest that I’m quite unusual in my beliefs and interests. I’m the one who wants to try new things, goes for the unconventional, and is inherently bored by the ordinary. And yet I probably wear the most ordinary clothes of everyone.

It’s fair to say that we all start out with no fashion sense as children. But as we grow and discover our identities and sensibilities, it’s natural to start determining how we present ourselves on the outside. Some of us invest more time and effort in this than others. Somewhere along the way, I didn’t quite make that leap. I do have some taste in clothing: I know what I am and am not comfortable wearing. But I never went further than the minimum. I never incorporated notable depth into the armor one wears on the outside, in this world.

I feel like I’m starting to go in circles while waving my own flag here, so I’ll leave it at this: people can express themselves in many forms, so don’t just start and end with their appearance. For some, that is the last place where they would convey any of their expression. Sounds crazy, but it’s true. Dig deeper. Look in other places. Listen and engage, before you judge a person’s character. That cliché “don’t judge a book by its cover” was meant to be used in real life, you know.