Tuesday, May 25, 2010

Arbitrary Walter Benjamin Reference

After Cannes last week, Manohla Dargis, Roger Ebert and then even Cinematical weighed in on whether the august film festival should be showing films digitally, whether we can tell the difference, whether the filmmakers care, and what, if anything, is lost along the way.

It is a conversation fraught with anxiety, nostalgia, knee-jerk conservatism, and nerdy one-upmanship. There's a sense in the air that the age of film is dying before our eyes. Buyers aren't buying at the festival, and sellers aren't coming anymore. There were no US films there except for "Robin Hood," which doesn't even seem to have a presence in America. It's not even really made by Americans. Godard, Ebert reports, is now finally and culturally irrelevant.

I think they're really nostalgic for a community that was worldwide and was dependent upon meeting face to face to see art and discuss marketing and sales. Do people really still care about watching films on film? Doesn't it scratch? Wear? Fade? How high is hi-resolution? Isn't everything already done digitally nowadays anyway?

The question should be reframed. The value of a film print as a unique format, preserved and shown in its native state, has nothing to do with the filmmaker's motive or intent. The vast majority of production and industrial habits and processes are ingrained in the capitalistic infrastructure, and the artist has no input into the end result. We read about and discuss outliers like Spielberg, who may insist on cutting on film with a flatbed because he likes the "feel" of the cuts, and is allowed to because he's rich enough to be that eccentric, but even he can't stop "Raiders 4" from being digitally projected in most venues (or prevent the analog stunts in "Raiders 3" from looking fake). Fincher insists he can fix it all in post, because his post started two years before filming, but that freedom (and distraction) is the worst thing that ever happened to him. His grosses are a lot lower, but the studios are more than happy to let him experiment on making Brad Pitt younger: on-the-job research for when the real Brad Pitt isn't around anymore and they still want to make Brad Pitt movies.

And yet Fincher's upcoming remake of "Reincarnation of Peter Proud" will be struck to film and projected at the local Galaxy theatre, possibly out of focus. Digital effects be damned.

Nor should we treat poor exhibition environments as damning the medium in toto. Too often, proponents of digital point out that it doesn't scratch. Yet digital fails unexpectedly in its own ways, and its presumed divorcement from the mortal coil of physicality only trades away the problems of inertia, dust and time; it does not make it invincible. Every venue and every showing is different: film will scratch, bulbs will dim, hard drives will skip and the white balance won't be properly recalibrated. Not to mention human error.

To decide that film or film theatres suck because a showing of "Sex and the City" broke down on us two years ago is to confuse effect with cause. Theatres haven't had union projectionists in their booths for ten years; those fancy new digital projectors will manifest the same inability to keep going without constant and expensive upkeep soon enough.

Instead we should consider the nature of film projection vs. the nature of digital delivery in a utopian, ideal environment, and what the medium, or should I say "format," conveys phenomenologically rather than anecdotally. It's a question of reception rather than of the contingencies of transmission.

Film, rather famously, is indexical; that is, it is a one-to-one representation of what the makers created, a print off the negative, which was cut/edited from actual strips of film that ran through a camera on the set in front of the famous (or infamous) actors there. The line of provenance from object to object is clear and possesses a tangible value as an artifact of a specific time, a place, a method by which it was worked, and its travel through space to arrive behind you as you sit. Light bounced off Ringo Starr and landed on the emulsion in the camera in 1965; that emulsion became the negative from which prints were struck. Those negatives were the actual source of the hundreds of prints sent to theatres. The light shines through the print and you see the shadow of those grains of chemical, a kind of apparition of history rewritten on the wall in a glance of light 24 times a second.

The one-to-one-to-one direct lineage is perhaps more important philosophically than cognitively. But the sense that you're watching the same film that has been seen by thousands of others when you see a distressed print of "Help" gives one pause. It creates an awareness of the extra-narrative context over time.

Digital delivery packages, conversely, are information translated from data and output electronically. The result is not indexical and has no one-to-one provenance from the original event. It is an electronic construct, reconstructed with every showing. Digital files capture images and events by math, not by chemical proximity. This material shift from object to data means the image is as perfect, or as degraded, as it has been programmed to be. It can be smooth, with a creaminess that film, with its 24-frames-a-second grain, can't duplicate. It doesn't manifest scratches. It glows and "feels" perfect.

It doesn't age with experience. Each showing is a new event, a unique performance without previous history.

We respond differently to the flicker of film; some people suggest it creates a dreamlike fugue state in which the dark between the frames, roughly half of the running time, lulls us into a more forgiving, right-brain, epileptic engagement. Digital streams at 60 frames a second, or double that, and commands our attention like a shining inheritance. It's a fluorescence compared to film's incandescent shimmer, exciting our senses without rest.

Our minds process harder and get tired watching digital. Certain films work much better that way: insistent, quirky, of the surface. That image up there, the film-strip dress, will not keep her warm at night. It's something else; it looks tactile and has a visual pleasure that barely hides as much as it reveals. It only suggests, promising more than it can deliver. The question is whether we prefer the shot clear and even, or whether we like the evidence of those 24 frames. Does that comfort us, or do we want an image entirely and continually as smooth as milk?

Tuesday, May 18, 2010

Red Dessert

Like R. Knight, I too have a thing for Monica Vitti and the surreal and alienating wonders that Antonioni managed to slip into the culture, unseen and now inoperable. Criterion is releasing "The Red Desert" (Il deserto rosso, 1964) on Blu-ray, and it seems too little too late, especially when we read that DVD sales are down another 30% this year on top of last year's already alarming 30%.

People aren't buying catalog titles anymore, and they're barely buying the new stuff ("Avatar" is an exception and an outlier, and should be discounted for more than one reason. What are you people thinking?). What does a 45-year-old art-house avant-garde visual tone-poem, Antonioni's first color film, one that so aggressively fucks with the palette to make it seem more finger-painted than choreographed, corrupted rather than blooming, have to offer a generation raised on perfect and digital hi-def hi-res imagery?

"Rosso" is poised between the hip and minimalist "Blowup" (1966) and the stoic and elitist (and to my mind perfect) "L'eclisse" (1962). Circling toward a more popular mode of film-making after overtly existential narratives, influenced by but never embracing the populism (or socialism) of neo-realism, Antonioni seemed genuinely hurt by "L'avventura"'s critical drubbing at Cannes in 1960. It's a tough movie. Not of this time, and I'm not sure of any time. Conceptually more fun to talk about than to watch; I think very few people have really given it a chance, at least until the (seemingly) only camera move in the whole film, that slow push-in on the empty street showing us... nothing. It's indulgent, arrogant, yet transcendent. It opens up the film ontologically as well as metaphysically. Yet it happens a good hour in, and trying to explain that to anyone is a fool's errand, like telling someone to hang with Warhol's "Empire": a bird flies by in hour six.

Ergo, "Rosso." Antonioni seemed insistent on making an art piece, one that looked and felt like Art-with-a-capital-A. Reportedly the set decorators painted walls red and leaves green to give the film a palette more impressionistic than realist. The acting, paced like the prevailing traffic in Europe at the time, is conveyed mostly through words spoken to the table rather than to the people in the room, looks out windows, and walks across the industrial sets to strike a pose, the actors framed within and sometimes dwarfed by the manipulated (and intentionally ugly) sets.

It displays the best of Antonioni, as well as the worst. It's too much and yet not enough, a recipe of elements that don't quite bake together. The existential ennui drowns the narrative momentum in a manner that points ultimately to the explosive and resonant failures of "Zabriskie Point" (1970). Vitti, game if done, finally becomes what she had never been before: a decorative detail, a directorial flourish, just an undigested element in the set design, caught between existential malaise and the director's obsessive blindness.

I can't wait. "Rosso" had a sub-standard DVD release in 1999 that didn't properly master or balance the color or aspect ratio. I'm guessing the Criterion Blu-ray will, at the least, go back to original reference materials to make it look as close to the original release as possible, regardless of how out-of-date and stale it might seem to current audiences. It's the missing link in Antonioni's 1960s oeuvre, like a cake that's a little too dry to go down but that you can't stop nibbling at.

Saturday, May 8, 2010

Drips, Thousands of Them

Last month the Library of Congress took the bold and inevitable (if unenviable) step of deciding to preserve all of Twitter's feeds since its inception in 2006. This was cause for concern in many cultural circles: the corpus of "tweets" is famously inconsequential, brief (140 characters means they have to be as concise as haiku, and they don't even rhyme) and so very much of the moment.

They exist outside any context beyond the immediate, without footnotes or backstory, disconnected from meaningful singular or traceable threads.

Or not. It seems like so much digital noise, either seldom serious or way too personal. I imagine the anxiety in some quarters has to do not with the content as much as the size of this archive. How is this possibly going to illuminate any future historical research?

The problem isn't necessarily in the weight of the corpus: millions of small and disjointed tweets about god knows who and god knows what. They refer to timely and ephemeral cultural events that fade as quickly as they rise in the search-fed trending charts. A cursory look at Twitter's own page of trending topics ("Right Now" vs. "Today" vs. "This Week") reveals the distorted view from the rear-view mirror of historical perspective. Objects are closer than they appear.

A bigger question is how we'll know who wrote these tweets, or why. These terse, clever, obscure koans are anonymous to the larger population; the usernames are often pseudonyms, synonyms, acronyms, homonyms. Is that being archived as well, Twitter's proprietary user backend with IP numbers and geo-locations embedded? What if users disabled that feature? Is there a privacy issue at stake? Who is represented geographically, and who is anonymous?

And whose tweets will have greater historical weight in the future? Which ones will be more heavily researched simply because they have more surrounding context? Spelled things correctly? Have higher levels of "impact," re-tweet factor, rate of followers, whatever?

Many news stories broke on Twitter in "real time," including the widespread dissemination of Michael Jackson's ride to the hospital in the sky; there are legit reasons to track what's discussed in this skewed and auto-democratic forum. The Iranian election protests in the streets in June 2009 led 200,000 users to turn their avatars green (and some still are). What does this say about Iran... or about the average Twitter user: that they're politically committed, or that they forgot?

It's a new level of discourse, outside journalism and academia. The importance of this LOC archive if it lasts - if it's actually maintained - won't be in what people say in those 140-character text-bubbles but how.

Language, when repressed or limited, expands in strange and revealing ways. People express themselves differently if they think they're anonymous and if they've got no time to finesse. When Twitter goes away, and it will, this collection of immediate, inconsequential snapshots, these text notes under the bed, will reveal a time and a place in which, facilitated by mobile devices and the attention spans that all the new toys of our age engender, a community digitally connected in ways never thought possible was still trying to say something meaningful to the people around us.