Lodinews.com


Is the auteur theory working its way into the mainstream?


Jason Wallis

Posted: Friday, October 15, 2010 12:00 pm

Before we jump into a brief, free-flowing discussion of auteurism — and how it relates to my lack of a review this week — I wanted to take a moment to remind my legions of adoring fans to drop by Battle Royale (http://www.lodinews.com/blogs/battle_royale/) and contribute to the debate. We’re still discussing our favorite movie scenes of all time, and as soon as we get our assigned weekly updates worked out, you can expect daily-ish posts on all manner of movie-related nonsense.

(Special thanks to reader Chris Wallace for taking part with such admirable flair and enthusiasm so early in the process.)

I would particularly appreciate any suggestions readers may have in regard to content — we’ve got a lot of ideas, but are eager for more.

Now, to the subject of this column (kind of): I had originally planned to choose the lesser of many evils and see “My Soul to Take,” Wes Craven’s long-awaited return to feature filmmaking after 2005’s impressive “Red Eye.” However, after reading even a small sampling of reviews, it became clear to me that the entire venture would be pointless. According to near-unanimous critical and public opinion, it’s perhaps the worst film of Craven’s decades-long career, and offers absolutely nothing new or even remotely interesting to genre aficionados.

I would have a terrible time watching it, and in all likelihood, the whole thing would have resulted in nothing more than a meaningless, obligatory critique that could never in a million years be interesting or enlightening in any way, simply because there’s nothing of any relevance to discuss in the first place. (Yes, I realize that I’m trashing a film I’ll never see, but in many instances this is completely reasonable and ethically sound. “You can’t judge a book by its cover” is just another lie we tell kids, like “Violence never solves anything,” and “You shouldn’t lick batteries.”)

I’m willing to suffer for your amusement, but I’ll only go so far. There’s just something so indescribably depressing about witnessing the death rattle of a once-distinctive filmmaker, especially one as trail-blazing as Craven. I recognize that “auteur” (from the French for “author,” applying to a technique-oriented filmmaker whose influence on a given film easily and demonstrably surpasses anything contributed by the writers/actors/producers/etc.) may seem too loaded a word to apply to Craven, but it’s hard to argue against the technical sophistication and aesthetic similarities of his earlier films, from the original “Last House on the Left” (although I’ve always regarded it as a pretty terrible movie, regardless of its importance from a genre perspective) and “The Hills Have Eyes” to “A Nightmare on Elm Street” and the indispensable post-horror satire “Scream.” And now, he’s been reduced to an industry laughingstock. Perhaps “Scream 4” will provide a rebound, but I’m not betting on it.

So what does all this have to do with… well, anything? The way I see it, Craven is just one example of the many directors who, over the course of the past decade or so, have been instrumental in bringing the whole concept of auteurism into the mainstream. In advertisements, “My Soul to Take” is billed as being from the director of “Scream,” and it got me to thinking about how you see that kind of approach so much more often these days. For instance, “The Town” is not primarily promoted as “starring Ben Affleck,” or “featuring Oscar-nominated actor Jeremy Renner,” but rather as being “from the director of ‘Gone Baby Gone.’”

This is by no means a new phenomenon, but it certainly seems we’ve seen a noticeable increase in just the past few years. And this can only be a positive development, as it is my firm belief that, along with a familiarity with various genre histories, an understanding of auteur theory and its evolution is one of the most valuable tools any moviegoer can have at their disposal.

Naturally, there have always been directors who garnered enough public recognition to eventually become household names, from John Ford and Alfred Hitchcock to Steven Spielberg and Tim Burton. (Even Martin Scorsese broke into the mainstream consciousness, though it wasn’t until 1990’s “GoodFellas.”)

But I think it was in the mid-’90s, when post-modern filmmakers like Quentin Tarantino and Joel and Ethan Coen began to dismantle mainstream viewers’ most basic notions about what cinema can be, that audiences began to develop more of an appreciation for the medium’s formal elements, and consequently began to more often recognize and appreciate individual filmmakers’ overall style, tone and sense of mise-en-scène (another cool froufrou French term, referring to the visual composition of a given shot).

This brings us to another filmmaker who emerged in the mid-’90s and has gone on to become, if not a household name, certainly someone who has carved out a unique spot for himself in the cultural zeitgeist. I’m speaking of David Fincher, who, beginning with 1995’s “Se7en” (which followed the heavily studio-influenced but still underrated “Alien 3”), has marked all his films with his own absolutely unmistakable style, most noticeable in his approaches to sustained visual schemes and attentive pacing.

For two weeks, Fincher’s new film, “The Social Network,” (which, as I noted in my review last week, is his most accomplished and inventive effort since 1999’s “Fight Club”) has been the top box-office draw, and even if it’s inevitably going to be dethroned by this weekend’s debut of “Jackass 3D,” such a feat, for this kind of movie, represents a significant achievement.

Audiences are embracing both the film and its director’s novel techniques, and if the sizable font used for Fincher’s name on the film’s advertisements is any indication, a large segment of the viewing public is assumed to know that the film being touted is by the director of “Panic Room,” “Zodiac” and “The Curious Case of Benjamin Button,” without being explicitly told. How encouraging is that?

(I was going to continue to prattle on, shifting focus to the unlikely significance of the comic-book superhero genre to all this auteurism business, but it appears that will have to wait until next week. Word is that “Red” is solid, so expect a look at that as well.)

Jason Wallis is a News-Sentinel copy editor. He can be reached at jasonwallis@comcast.net.
