Greg Oguss on Pop Culture
I AND I
Metaanything, as the self-proclaimed Internet webcocks refer to commentary about commentary, seems to be everywhere these days. On Gawker, snarky columns mock Wired for illustrating the backhanded art of the unflattering cover with a front-page photo of former Star! columnist Julia Allison captioned by a headline about her paltry level of microfame. From Gawker to MSN.com to Newsweek Online, everybody seems to have an opinion on the media’s obsession with celebrity babies. In the early 20th century, literary critic Edmund Wilson wrote metaanything-style articles arguing that attempts by journalists like H.L. Mencken and Gilbert Seldes to democratize criticism contained a condescending notion of what Mencken dubbed the All-American “booboisie.” Although metaanything has been around for at least a century, the Web has exponentially increased its visibility. Along with the invective spewed by Gawker and its rivals, metaanything is the principal reason people dislike the blogosphere. And if Web invective has been accepted as just another indignity of life in the digital age, the most debated aspect of online criticism remains its constant use of the first-person, which the pros regard as a debased form of journalism.
The popularity of the first-person among online critics is largely due to the fact that Web writers are mostly inexperienced amateurs who can only offer an ad hominem reaction to a given subject. The ability to read and self-publish online for next to nothing is the principal reason that criticism, and publishing in general, is suffering. But to discuss Web criticism, a sense of how we read online is necessary. The simplest analogy for online reading is that it’s like a conversation with a cute girl at a party while you’re expecting someone much more exciting to walk through the door at any moment. No matter how talented a conversationalist the girl is, you’ve got one eye on the door the entire time she’s talking to you. Anytime you’re reading an online book review or a blog about so-and-so’s thoughts on the mortgage crisis, the fact that something better may be just a point-and-click away is akin to the Next Guest Phenomenon that forces you to keep one eye on the door at that hipster party.
In order to compete, as Slate’s Michael Agger explains in an article entitled “Lazy Eyes: How We Read Online,” Web critics keep articles brief, with easy-to-follow through-lines, short paragraphs and links galore for the 2.0 crowd. Basically, anyone who can write a decent business report can pull off one of these stories. This means Slate can employ gurus from all walks of life as contributors. The downside is that the quality of the writing fluctuates. An attempt by Slate’s Jack Shafer to do a play-by-play of what he deemed Charles Gibson’s sterling interrogation of Sarah Palin in an ABC News interview was a characteristically pointless example of metaanything. However variable the criticism in Slate or its competitor Salon, their writers rarely use “I.” From the standpoint of the pros, the first-person is problematic. Conversely, How-To guides for bloggers stress the importance of frequently sharing intimate details of your life so that readers will feel a part of your world. This is the appeal of “Mommy blogs,” which offer a support system for young mothers. But it’s also the appeal of fanboy writing on movies, television and gamer culture, where knowledgeable amateurs can befriend others who share their passions. On sites like Film.com, the typical fanboy review provides a highly personal response to a movie, offering a few odd facts about the blogger’s life along with a bit of film trivia to support the reviewer’s opinion. A characteristic example is a review of Pineapple Express recently published in an online magazine I co-edit, written by a blogger from a competitive swimming site, which began with the reviewer complaining that he had to drive all the way from his house in the sticks to find a theater showing the film.
The thrust of the piece was whether the film was worth the drive, although he did provide the requisite knowledge that he’d seen other films featuring character actor Danny McBride, who he felt turned in a solid performance, and that the movie’s score nodded to ’80s action films.
In contrast to the fanboys, the professionals’ view of the first-person is “I try to stay away from it unless it adds something,” as the LA Weekly’s F.X. Feeney told a friend of mine when they were discussing the tendency. This notion of Only When Necessary is a way of devaluing the first-person, which isn’t to say the device can’t be abused. In America, the tradition of writers “everlastingly saying I,” in the words of Edmund Wilson, dates back to H.L. Mencken, the critic constantly referenced as the standard-bearer of journalistic excellence in the pages of Vanity Fair, The Atlantic Monthly and Harper’s Magazine. In his introduction to The Shores of Light, Wilson confessed that American journalists like himself, Mencken and Alexander Woollcott picked up their love of the first-person from George Bernard Shaw and Oscar Wilde, who used it to create larger-than-life personas: the “witty buffoon” in Shaw’s case, the “dandy” in Wilde’s. This isn’t altogether different from the way bloggers are advised to use their websites for “personal branding.” The difference, as the pros would point out, is that Shaw and Wilde’s artistry is a far cry from the run-of-the-mill blogger complaining about a long drive from the boonies to watch a Judd Apatow flick.
While bloggers get a lot of hate for their inelegant use of “I,” the first-person is a trend that has waxed and waned. In his 1952 introduction to the compilation of his early works, Wilson confessed that while “the exploitation of the personality” had seemed “integral to criticism” in the 1920s, the habit had since “gone out of fashion.” Wilson’s observation was a reference to the professionalization process that American print journalism underwent over the course of the 20th century. That process meant the first-person as Mencken used it, to write satiric news stories that seemed, as Alistair Cooke put it, to be a “carefree fantasia on the truth,” was no longer acceptable. As the readership of daily newspapers and magazines grew, journalists eschewed the flair of Woollcott and Mencken in order to appeal to a broader audience. And Wilson’s “exploitation of the personality” faded away.
By the 1960s, New Journalists like Norman Mailer, Tom Wolfe, Joan Didion and Hunter S. Thompson began to once again freely adopt the first-person for a mix of aesthetic and political reasons. Their use of “I” went unquestioned because it coincided with the rise of the counterculture and the sexual revolution, which emphasized the importance of a personal response to political events. Mailer and Thompson’s reporting on politics, Didion’s writings on the apocalyptic mood in California, and Wolfe’s travels with Ken Kesey and the Merry Pranksters all seemed appropriate journalistic documentations of the sweeping changes of the era. Few critics thought to point out that the oppressive sense of paranoia Didion documented may have been the unique perspective of a wealthy writer living in L.A., and not as applicable to Jane Doe in flyover country.
With the revelations in the early seventies about Watergate, CIA misdeeds and the Nixon Administration’s coordinated “rat-fucking” effort against a variety of targets, there was another shift in attitudes toward journalism. Americans came to view the fourth estate with more respect than they did their corporate and political leaders. With this increased authority, the relaxed attitude of the New Journalists fell out of favor among members of the press. While styles go in and out of fashion for a host of reasons, the disappearance of gonzo journalism was also tied to the cultural backlash against the sixties, which historian Philip Jenkins suggests began around 1976 in his book Decade of Nightmares. By the Reagan era, mainstream writers had completely forsaken the rhetorical flourishes of gonzo music critics like Lester Bangs and his imitators. More recently, critics and journalists have seen the respect they earned in the Watergate era erode as ethics controversies have hit 60 Minutes and The New York Times and online journalists like Matt Drudge have scooped the majors on a host of stories. Today, the need for stylistic distinctions is paramount since the line between bloggers and the pros has been thoroughly blurred. Sites started by citizen-journalists like Gawker, Daily Kos and TalkingPointsMemo are seen as just as legitimate as any “old media” institution.
In the 21st century, professional critics threatened by the popularity of fanboy writing have been forced to shorten their columns, simplify their arguments, and write in the jargon-filled style that’s popular on the Web. As Film Comment’s Kathleen Murphy argued on MSN.com in an article entitled “Criticizing the Critics,” there’s a group-think to online criticism which suggests that if it’s popular, it must be good. In Murphy’s rant about the damage blogging has done to criticism, she offered a number of compliments masquerading as satirically worded insults to fellow pros like The New York Times’s Manohla Dargis and A.O. Scott, comparing them to the distinctive voices of earlier critics like the legendary Pauline Kael. While Murphy’s anxiety about the diminished authority of the pros was palpable, her satire fell flat because the comparison is off the mark. Unlike today’s professional film critics, whose preference for art films over cult classics Murphy acknowledges, Kael was known for her promotion of pop culture and her scathing attacks on self-indulgent auteurs like John Cassavetes and Stanley Kubrick. Kael wasn’t a regular at press screenings; she attended films with paying audiences and adopted a first-person perspective to delve into what popular movies revealed about the sociology of America. Older colleagues thought of her as a bitch and a sloppy journalist. But her excesses won her legions of fans and the admiration of younger critics, dismissively dubbed “Paulettes,” who aped her style. Writing in his own oft-imitated but never equaled style, Lester Bangs penned cough-syrup-fueled tributes to Iggy Pop, Lou Reed and cult films like The Incredibly Strange Creatures Who Stopped Living and Became Mixed-Up Zombies that made him the forerunner of today’s fanboy. But unlike the fanboys or pros such as Dargis and Scott, Kael and Bangs didn’t get bogged down in debates about aesthetics.
Opening an article about the Clash in the first-person—“I do not know shit about the English class system, nor do I give a shit about the English class system”—Bangs proceeded to offer a typical gonzo-style account of his travels with the band, one that revealed much about the state of British youth culture despite his calculated pose as an amateur.
To argue that the Internet has ruined criticism, making the rise of another Bangs or Kael impossible, is to misunderstand the connection between talent and technology. As German cultural critic Walter Benjamin argued in his 1935 essay “The Work of Art in the Age of Mechanical Reproduction,” technology may change the way we experience art, but it doesn’t stand in the way of its development. The problem with the Web isn’t that it’s a magnet for those of us who are addicted to the first-person or who all think alike. It’s a magnet for those of us who like attention and access to free criticism, which is nearly all of us. Most of the time, you get what you pay for. There are undoubtedly a few reviews by the next Lester Bangs or Edmund Wilson hiding among the garbage. Unfortunately, the Internet acts the way French film critic André Bazin once suggested the film camera does, concealing as much as it reveals.