Sunday, December 13, 2009

The Comfort of the Cut


As a new generation of horror films changes what we scream and cower at - or at least, what we get nervous watching on our couch at home - the techniques horror films employ have to adapt as well.

We are increasingly used to streaming and handmade handheld filmmaking on YouTube and reality television-based weblettes. A POV shot follows the protagonist through their home, waiting for the reveal as it rounds the corner to show the mess in the living room, the girl undressing in the back, or the boat stashed in the leaves on the island.

Our response to such imagery is a combination of suspense and discomfort. We're not sure we're following the right people, that we're in good hands, that the plot will unfold satisfyingly - it being shot off-the-cuff - or even resolve, or is resolvable.

A sense of mediation in constructing film narratives gives us cold assurance, even as the films may be out to fornicate with our heads and create anxiety in such sensual blunderbusses as Saw, Irreversible and Funny Games. Yet such horror films (all stemming directly from the overly formalized experiments of Carpenter through to Tarantino) create mood, tension and plot by careful and methodical arrangements of shots and compositions. They all spectacularly show us what we want to see, but on their editorial terms.

They're not accidental. We get tense as we watch a character walk through the dark basement, the camera directly in front of the heroine, in too close, backing up, and we're unable to see what is ahead of her (although since she's facing forward and continues to walk, we assume there is nothing large or visibly dangerous in her view ahead - whatever that may be is behind us and out of "our" sight).

When the camera is instead behind her, following and in close as it moves with her, we may be able to see what's ahead of her (as she does) but unable to know what's creeping up just behind (besides us, and a steadicam rig. We want her to ... turn around!).

These are specific constructions that visually reveal to us only certain information and perversely withhold the rest (including the cat that is just about to jump or be dropped onto the heroine's shoulder). A well-constructed horror film manipulates and withholds, teasing out bits of eye candy and gruel in careful measure, juxtaposed in blunt and spectacular manner.

We enjoy this Grand Guignol display. When it works we give over to the sensual roller-coaster of confrontational and impossible logic, with a hidden design of frustration and surprises that reveals itself only in the process of our rapture, if not in the quotidian denouement of the plot. A solid story well told is its own reward, and we laugh as we exit, wiping the flopsweat off our brows and telling ourselves it's only a movie.

The held shot, in suspense, is the single unit of horror that challenges our understanding of the underlying design, waiting to commit to an edit in which the intent of the protagonist is finally exposed - when he decides to go into the basement; when the filmmaker will withhold what's ahead and not behind; when the spectator must accept that what's about to be shown is texture rather than text.

The edit creates order. The edit, a collision of images, is a sequence like a sentence, constructed logically. It is in the edit that tension is created and then released, a question is asked and then answered, in which options are abandoned for the one that is chosen. The spirit becomes body, God becomes the word, the word becomes flesh. The comfort of storytelling becomes manifest. Seeing is good.

A handheld and undisciplined camera troubles us. We simply can't get our bearings. We wait for the angle to change so we can see more. Once we see past Jamie Lee Curtis to what's behind her, we discover how close the monster is, what the space is between them.

We wonder about the how rather than the what. The cut is comforting. It releases us from a film's grip, for an instant and instantly, in the space between 2 frames, and for a moment, we can relax.

Friday, November 27, 2009

History Will Be Written By Nobody


The historical record is probably the most important object a civilization creates. It's not a discrete product or manufactured building or monument that is pre-ordained, pre-determined, or pre-meditated.

It's not a cultural mandate or steered agenda. It's not controlled and it's not finished. The historical record is made up of millions of memos and emails, hundreds of thousands of news stories and video feeds. Documents and bank statements and journal entries and tape recordings. Photographs. Paintings and graffiti and poems and testimony.

Evidence. It's authentic and it's honest and it's made for reasons other than historical reasons, which is why it is so valuable. It's not worried about how it will look 100 years from now; it's worried about now. It all survives as a cumulative and infinite monument to who we were and what we cared about so that the culture of the future will understand how we lived, why we lived, what we were trying to discover about the world and about ourselves.

The most important primary sources of 19th-century frontier life were the hand-written letters saved by the pioneers. It was a big deal to get a letter in the old days, and endless minutiae were relayed in those pages, which still exist today for historians to discover how things were in the summer of (18)49, how much bread cost, where the roads were being laid down by whose property, who sired what children.

This everyday discourse isn't written down with pen and paper anymore. It's hiding in emails, Facebook news feeds, or Twitter. Its sheer amount - and the perception that it's all so very unimportant noise - precludes anyone from wanting to save it, or being able to, certainly not the people who first created it. Facebook isn't archiving its site... except to mine your data to place ads. While someone's grandfather may still be printing out all their emails, no one I can imagine is printing out all their friends' status updates.

You won't be able to pull your tweets out of a shoebox under the bed in 100 years like you could a box of letters. The vast bulk of social interaction now takes place between iPhone IMs and Google Docs, and whole new generations of us will never commit our diaries, business contracts, family photos, genealogy, or bank transactions to anything other than the cloud, up there on someone else's server, where no one's saving it for the sake of its historical value.

Only for its financial exploitation.

We all have stories of that hard drive that crashed last year and lost the pictures of our trip to Disney World, or of our Aunt Lora, who's dead now, so we'll never see what she looked like in the last 10 years of her life.

We're likely living in a digital dark ages, right now, and in 100 years we won't be able to know who our friends were, what we said to each other, what roads we travelled next to what properties, how much we made or who sired our children. All, uncommitted to long-term storage and without true historical custodians, will be lost, along, I'm sure, with this post.

Thursday, November 19, 2009

Would You Buy A Used DVD From This Man?


The good news is that we, as a race of consumers, have finally figured out that we really don't need to buy every sell-through DVD in stacks at Best Buy, in spite of any value-added deleted scenes or alternate soundtracks. The DVD of Don Roos's "Bounce" had 120 minutes of stuff that was deleted from the final version, longer than the running time of the film itself.

I for one would have loved to have seen the integral version, all 3 3/4 hours edited artfully together.

DVD sales dropped over 10% last year, and are falling faster this year as consumers figure out how they want to consume their media: either by paying for a 2-disc special-edition box they may not want to watch more than once, or by ordering it instantaneously on the increasingly hi-def devices in their living rooms.

Blu-ray can go home. It's estimated that 20% of people watch some sort of video online daily. Whether it's Hulu, YouTube, or Netflix's streaming, it's clear that consumers aren't beholden to the old model of buying individual widgets anymore. I remember the days of walking into the used DVD aisle in Amoeba and seeing literally 100+ used copies of Cast Away, all for less than a quarter the original price. The disconnect between our need to "possess" a cultural event (which Cast Away arguably was, at least for a month) and realizing we had woken up with the hangover after having drunkenly overindulged was clear to me then. The chilling feeling that our pockets had been picked when we weren't paying attention made us want to just get rid of the evidence and take a long hot shower.

We will have more access to more video and other filmed entertainment once broadband reaches every corner of every coffee shop, every device large and small. Quality will depend on what we're watching and where. We won't have to buy director's cuts of films that had no directors in the first place, or collect deleted scenes just to be completists - a Sisyphean quest, in that it's like trying to collect everything that isn't there.

What we will be buying is access to all this stuff. It'll be in the famous "cloud." It's up there, somewhere. That means it won't be on your shelf, and that also means you won't be in control of it. Consider it Web 3.0. While the last iteration was nice for all you home-brew radio jockeys who got off on changing facts on Wikipedia and remixing Lawrence Lessig, now the corporations have a chance to feed you the films, the videos, the songs, the content wirelessly onto devices they are building to make sure their content plays just for you.

And plays just from them. That's the bad news. No more all-access t.v.s, radios, or computers. (Or even, iPods, a more restrictive but still relatively crackable container, in part because the songs are objects that can move and morph fairly easily.) Set-top boxes like Apple TV and Netflix's Roku are the beginning of the movement to get Trojan horses into your gates. DRM'd, all of them. Disney has their KeyChest scheme and an impressive handful of other major companies have announced DECE. Best Buy and CinemaNow are in cahoots to build and sell and fill these devices in the short term.

This paying for access through a box we don't have the keys to will eventually replace cable, as well as the easy ability to TiVo or record these things off the "air." Disney, as you may have read, is promising to allow access to any film or program you "lease" from them forever.

Forever is a long time. I'm not sure I believe that.

As we, as a race of consumers, let these big players in distribution take charge of where we get our content, and how much we pay, we lose an important part of our rights to choose, to browse, and to do what we want with the content we (think we have) bought. Even if it's copying the damn thing onto our drive to remix it and selling the original to Amoeba.

Sunday, August 9, 2009

Spectacles Public and Private


Movies seem bigger than ever and less relevant than ever. We're not falling in love with going to the movies. Because we don't go, certainly not as often. They're simply around too much. In too many sizes. "Star Trek" notwithstanding, and even that feels like a t.v. show that will translate well to my iPod.

The common lament since about "Star Wars" is that filmmaking stopped being an artform (as if it ever really was) and became only about selling tickets. No more cine-clubs discussing Bergman, Fellini or Pakula. But looking through the nostalgic fog of a past we read about but didn't live through, show business is about spectacle and always has been.

From the earliest days, the hits are those that are the biggest events - the ones that get our attention one way or the other. By electrocuting Topsy the elephant, having sound for the first time, being in color, louder, more expensive, by simply being a new take on an old story, better.

Spectacle grabs people's attention. "Transformers" and "Harry Potter" would be at home in a theatre in 1977, but they're wrapped in 2009 digital fireworks. They're not so much films as controlled burns. The aggressive retro-new excess of something like Scorsese's "New York, New York" was its own film-nerd spectacle of its day, art-school indulgence writ large.

It didn't help anyone's career. It didn't help anyone other than the critical studies majors. But at the time it drew its own attention. Worth doing if not worth the price. A conundrum when we interrogate what and why studios produce what they do.

Nowadays business decisions take the ego and arrogance out of the equation. New modes of delivery mean new modes of audiences. What's old isn't new - it's simply new.

The spectacle is in the way it is engaged: modern, digital, and transformative. The content is less important than the simple fact that there is some.

It's a bottom-up shift, driven by the public who simply don't buy a movie if they don't buy the hype, or buy a ticket in spite of all indications to the contrary if it's what they want to see. The studios are playing catch up and realizing the old ways aren't going to work much longer.

What is available always eventually reflects how people watch moving images. Soon, portably and in chunks, in low-definition - and most fatally - casually. Films won't matter anymore culturally because they won't have a cultural impact. Film will become the moving wallpaper of science fiction.

There will be space for spectacle, for CGI-candy. But Bergman and Pakula are over. They don't translate.

Some will appreciate the past and enjoy it privately. Maybe find a handful of other enlightened individual believers. We will not be watching the same screens.

Thursday, July 16, 2009

Teen Scream


Teen comedies have changed over the last 10 to 20 years because teens have changed.

Of course. The audience is what dictates what's produced, because if a movie shows in an empty theatre, does it make any noise?

In the late '70s and early '80s there was a rash of films about teenagers spying on other teenagers. "Private School," "Fast Times at Ridgemont High," "H.O.T.S.," and of course "Porky's" all depend upon sometimes complicated setups in which teenagers attempt to steal glimpses of the opposite sex undressed in semi-private situations (and often end up naked themselves).

The movies were the only place you could see what a naked person looked like (besides in fine art books), and drive-ins became the preferred and privileged site of such voyeuristic pleasure for teens. Often for more than just what was on the screen.

In the burgeoning age of cable and video, it became easier to experience what was forbidden and withheld. Teen comedies continued to be produced, but they were increasingly out of touch with how teenagers acted and what they wanted - they shifted from a life-style accoutrement to the exploitation they frankly were. I seem to remember some Brendan Fraser films in there somewhere, and the ubiquity of video didn't do teenagers any favors. The increasingly parent-safe "10 Things I Hate About You," "She's All That" and "Clueless" are all based on classics - yet they still feel like your pocket's being picked by 50-year-old men in shark-skin suits.

The "American Pie" movies returned to the earthier trends of the '70s with a knowing, post-modern tone and less desperation in the need to see skin. They simultaneously went farther sexually and embraced a Farrelly Brothers sweetness (which continues through the Apatow comedies) that makes them both controversial and conservative. Now that anyone can see anything online, teen films are no longer merely about the struggle to catch glimpses of naked people, let alone to get laid. Now they strive to make it mean something more than the smarmy sniggering innocence of "Porky's" would have you believe.

The teen films of the '80s are hopelessly dated now, but they capture a specific time in everyone's development when being alone with your lust and fantasies was allowed and commodified.

Teens may not have changed so much, but their modes of finding out about the opposite sex have. With the Internet and 100 channels on cable, the sense of discovery is no longer in a car, in the back seat, at the drive-in. It's in front of a glowing screen revealing secrets.

The emotional attachments, the physical and psychological changes we felt while viewing forbidden images (it's something out of "A Clockwork Orange") aren't there for a new generation.

Teen comedies (and sex comedies in general) are carriers of a different kind of information. They're too damn responsible. They're too damn polite.

Wednesday, July 8, 2009

Dark Archives


In film archiving programs much like the one I am in, what you end up learning is a lot more about library studies than actual preservation of film.

What's important now is not trying to find an extant copy of an old lost classic. Let's presume that most of the films that can be found have been... or have deteriorated past the point of saving. Now "archiving" is figuring out how to present what's still around to future generations, and future generations aren't interested in going to museums.

What archiving means now is to learn how physical document-style record-keeping archives keep track of their stuff. It means cataloging, and creating metadata for the Internet.
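As a sketch of what that cataloging work produces, here's a minimal record in the spirit of the Dublin Core element set (title, creator, date, format, description are real Dublin Core elements); the film chosen and the field values are my own illustration, not any archive's actual entry:

```python
import json

# A toy catalog record using a handful of Dublin Core-style elements.
# The values are invented for illustration, not taken from any archive.
record = {
    "title": "La Regle du jeu",
    "creator": "Jean Renoir",
    "date": "1939",
    "format": "35mm print, black and white, sound",
    "description": "Country-house comedy of manners.",
}

# Serialized, this is what "metadata for the Internet" looks like:
# flat text fields a machine can index and transmit.
serialized = json.dumps(record, indent=2)
print(serialized)
```

Fields like these are exactly the trade-off of the approach: easy for software to index and search, and hopeless at conveying what the film actually feels like to watch.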

Describing moving images with words is a challenge that has yet to be conquered. The better machines and software get at "identifying" what a film clip or series of shots is about, the more a human with some kind of cultural sense and taste needs to intervene and perform triage on the alphabet soup that's created. You can't describe the elegance of a match cut in Renoir with even two stills together on a webpage.

You can't capture the flicker in Marlene Dietrich's eyes. Or the swagger in Asia Argento's poise.

Yet everything is being streamed to us anyway, on the Internet, in any form they can deliver it. We can no longer be concerned with the best possible copies. Now we are beholden to creating the fastest-deliverable ones. There are over 4000 35mm prints of Transformers: Revenge of the Fallen in existence. In 6 months, when it hits DVD, over 3800 of them will have been purposely destroyed to prevent piracy (although it's already on the Internet in digital form). And by the time of the third Transformers film there may be no film prints at all - it will be delivered digitally to your local exhibition spaces.

Newsweek (or was it Film Comment?) was right: Film is dead. They merely announced it a couple of decades too early. Sure, the old classics (and not-so-classics) on film are still being saved, on negative if they still exist, forgotten in dark archives. The temperature is lowered and the lights are dimmed so no more damage is done, for that moment some time in the future when people care about film again and want to see actual light shone through actual chemicals on celluloid and reflected off a silver screen, rather than transmitted with the electronic glow of digital perfection.

The archives are quiet. Companies are releasing the same hits over and over again in newer formats rather than exploring deep into the canon. The industry is trying to shake as much money as possible out of people, but it's hard when everyone is getting everything in a reduced resolution and in small pieces, often only temporarily - and for free.

No one's figured out what to do when people expect so much more for so much less. The old business model of selling atoms people keep is being challenged and undermined forever.

We're in a profound period of transition, psychologically, culturally, financially, and philosophically.

Friday, July 3, 2009

Now


Has there ever been a film more review-proof than Transformers 2? The word is so uniformly and excoriatingly bad - not only from the egghead academic critics of such august publications as the New York Times and Aintitcool.com, but from our friends who saw it and tried, fruitlessly it turns out, to warn us off.

That's a more immediate and direct kind of "word of mouth." From the very type of people who were predisposed to like it, or were at least up for the dare, and waited in lines (and there were long lines) on the 1st, 2nd, and 3rd days and paid for the privilege of being simultaneously bludgeoned by the effects and sound while being insultingly starved by a paucity of content or intentional nuance.

$200 million worth of people saw Transformers 2 (and that's only domestically), and even if the word of mouth really has a serious effect and the grosses fall 75% each week for the next 4 weeks, its sheer momentum will still ensure it finishes closer to $1 billion by August.

Is it worth all the money that was spent on and for it?

It's not high art (though I submit it is art) but rather an instance of performance. A triumph of marketing, branding, of sheer hype and push. No one who hadn't dodged the 1st one wanted this film, but its existence seems to assert itself - as a kind of fait accompli - as an event by its mere monumental presence. It's being sold not as a continuation - as a sequel or even a deeper exploration of plot points introduced and hinted at in the first. Shia and Megan aren't anywhere to be found in the materials.

It's about being in line, surrounded by a hundred other half-drunk fratboys, screaming and "ahh"ing, and covering your ears. The digital billboards on the day it opened didn't even insult us by listing the title with some desperate hat-in-hand attempt to sell tickets.

They merely said "NOW." That muscular blue and orange image was enough.

We get it. This is happening. You in or out? Where's the line?

Friday, June 26, 2009

Dogma


About 10 years ago the Dogme movement emerged from Denmark, attempting to assert a new stripped-down aesthetic in filmmaking. Filmmakers such as Lars von Trier and Thomas Vinterberg embraced a new straight-forward, honest (and presumably cheaper) mode of filmmaking which precluded constructed sets, imported props, post-synced sound and effects, all in an attempt to strip away the over-determined rules that inflected (and infected) normal picture making.

Only natural lighting environments were allowed to be filmed; no extra lights could be added. And only existing objects in real locations could be used. No props or guns or other genre elements to add visual "interest." It all had to be present and available for the filmmakers... or anyone. The idea was to capture the truth as it happened in front of the camera and record it un(pre)mediated as it occurs, with no subjective manipulation, no trickery, no egos. Truth at 24 frames a second.

They were unsuccessful for the most part. While this is an interesting approach to making films - and especially for ones that aren't documentaries - it makes for difficult, overly mannered yet loosely structured and finally rather restrictive results. Such avant-garde narratives - without artifice or production values - are an acquired taste. Without most of the tools of 100 years of filmmaking at their disposal, the dogme-ists paint themselves into an ascetic conundrum in which flights of cinematic fancy are by default precluded.

The last successful Dogme film was 2001's "Italian For Beginners" (and there's consensus that even that didn't follow the Dogme vow-of-chastity rules to the letter). Yet the spontaneous no-production-value aesthetic has been embraced by a new generation of filmmakers. It's a reflection of our familiarity with streaming videos on YouTube and our small personal devices, lo-fi but authentic. Such above-ground hits as "The Blair Witch Project" and "Quarantine" (by way of "REC") appropriate (if they don't rigorously follow) the Dogme ideals of hand-held cameras and off-the-cuff shooting in natural, real-world settings with a documentary narrative drive. The J.J. Abrams-produced "Cloverfield" also uses the videocam reality-t.v. model to great effect, tapping into our voyeuristic tendencies.

(Although it's likely 80% of that film is fake, manufactured by CGI in post.)

Interestingly, and tellingly, all these are horror films.

The Dogme '95 movement was an articulated attempt to capture the spectacle of the real, in unmediated and unfiltered visual terms. It turns out that mode of filmmaking is discomforting.

We like a little artifice between us and reality. The spectacle the camera captures, when allowed to film uninhibited and unfiltered, is truthful, perhaps - but also (or therefore) profound, scary, intense, forbidden, and a bit horrifying.

An unintended progression of those Danes 10 years ago.

Saturday, June 20, 2009

Independent Days


What I really want to do is direct.

What everyone wants to do is direct. Everyone's a closet moviemaker. Everyone's a comedian. Everyone has a screenplay in their bottom drawer, but no one's heard of anyone they know actually making it in Hollywood.

I went the independent production route myself. You get some friends together, scare up a couple thousand dollars, a film camera and shoot your clever Tarantino/Linklater pastiche, convinced that since it costs so little, there's no way it can't make money. The video store is full of them. Why not add to the noise?

We've all heard of the independent filmmaker success stories. Make a film in a weekend (or over 3 years) and it sells at Sundance for $3 million, and the next thing you know you're hired to direct the Luke Cage remake. They know what you can do with pocket change, so if only you had some real money....

It's an elegant theory. But it's disingenuous. For every Bryan Singer, there's a thousand Jacob Freydont-Atties. For every David Gordon Green who (eventually) gets pulled into the majors, a dozen JP Allens remain unknown. Hundreds of films get submitted to each of over 200 festivals in the US every year (and that's just the features), and even for the ones that are selected, it's likely their first, best, and last showing is at these festivals, never getting a distribution deal, or even ending up on DVD except as souvenir home burns for the cast and crew.

There are more movies out there than you can ever find out about. More people want to make movies than the industry can possibly gainfully employ. If you don't believe me, ask yourself how many times you've heard someone say words along the lines of "You know what would make a great movie?"

You've said it yourself. Everyone's got an opinion, and you know what they say about that. We think we can do it better, and perhaps we can. We'd do anything to be in pictures. But it's not just about having a better idea. It's about being in the right place - at the right time, with the right people surrounding you, and often with the right amount of money sitting on the table orphaned and waiting to be invested.

Financing is all - more projects come to fruition because they've been paid for than because they need to be told. Independent films always have a hard and schizophrenic life. They're born of passion and necessity and wear their sponsor-less authenticity as a badge of honor, the entire time putting on airs to convince us they're more than the backyard make-believe they are. They push the envelope and defiantly resist categorization and (often) coherence, because that would be selling out.

Yet they exude a needy greed to be loved, because ultimately they can't afford to piss off their audiences or their producers, and end up playing to the cheap seats, simultaneously wishing for and fearing a state-funded co-opting, or at least the perceived notion of one - pursuing and risking a Kurt Cobain-ian reduction of street cred as the zeros multiply on the residual checks.

Even Ron Howard started as a seat-of-the-pants, go-for-broke exploitation director, which in a way is still reflected in his gilded work on "Angels and Demons," done not for art but to assert his position in the industry. A $200 million budget, completely competent and completely forgettable, reminding us that there are never enough resources of the right kind on any picture. It's the difference between the first "Terminator" and the second, the difference being a budget 10 times the size, so aesthetic challenges aren't so much solved as financed to death.

Ridley Scott makes one movie a year, and while we can discuss the vagaries of "Body of Lies" or "Matchstick Men," we'll never see the mad independence of "The Duellists" again. In today's environment, the list of directors able to generate a meaningful body of work is extremely short. Bad penny Terry Gilliam and rock star Martin Scorsese still can't put together the projects they really want to do. Scorsese has defaulted to music documentaries, which are probably the level of fight he's willing to take on nowadays.

And what about the filmmakers who didn't have the fortune of having worked with Robert De Niro in their early careers? Who had a unique voice but couldn't sell a ticket? They've moved on to shooting cable shows. Or pulling cable.

Or writing for cable. Or writing work orders for cable installation.

Being independent comes with a price. By the time someone offers to pay the bill, you're already face-down in the pool.

Friday, June 19, 2009

Data, Metadata, and Statistics


Digital objects exist in a different way than mere objects in the physical world. They're created, and the information by which they are described is added to the object so it can be found.

This is "metadata" - kind of like the stuff that gets stuck to your shoe that you simply can't rub off.

Every digital object collects this as it moves, gets copied, is altered - even deleted. No fingerprints remain invisible. (Yes, even an object that isn't there still declares itself, if only by virtue of the fact that it is no longer present.) Lots of times metadata is intentionally added to an object. Titles, dates, to-do lists ("Delete after end of quarter," "Save for blog," "unused takes").
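The accretion described above can be sketched in a few lines of Python - a toy model, with invented field names, of how each copy or alteration stamps more description onto the object than the object itself contains:

```python
import copy

def stamp(obj, action, note=None):
    """Append a provenance entry to the object's metadata trail.
    The entry fields are invented for illustration, not any real standard."""
    obj["metadata"].append({"action": action, "note": note})
    return obj

def duplicate(obj):
    """Copying the object copies its whole trail, then adds to it."""
    twin = copy.deepcopy(obj)
    return stamp(twin, "copied")

photo = {"payload": "<jpeg bytes>", "metadata": []}
stamp(photo, "created", "Save for blog")

v2 = duplicate(photo)
stamp(v2, "altered", "cropped for the header")

# The copy now drags along three descriptions of itself,
# while the payload - the object itself - hasn't changed at all.
print(len(photo["metadata"]))  # 1
print(len(v2["metadata"]))     # 3
```

Nothing in this model says which of those entries is worth keeping - that triage is exactly the cost the rest of this post is about.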

But just because this digital object has collected all this extra descriptive information doesn't mean it's the better for it. The object becomes larger as it travels, and it costs time and energy to preserve all this stuff on the object, not just the object itself.

And just because there's all this new information on it doesn't mean it's good info. Much of it may be wrong. Or incomplete. Or mean different things to different people, programs, or systems.

The signal to noise ratio begins to change. And just because it's all info about the object itself also doesn't mean that it's metadata, either. Maybe the info is part of the object's creation, but doesn't actually describe it. It might not be about the object, just riding along, attached accidentally or through someone's ulterior or altruistic motives.

Once an object collects information about itself, that doesn't mean it should all be preserved with the object. But figuring out what belongs, what might be needed in the future, and what's merely a parasitic piece of code costs resources to deduce.

Not all metadata is created equal. It has a lifecycle, and some becomes obsolete at a certain point in the various iterations of the object, as it moves from VHS to laserdisc to DVD to Blu-ray, for example. Just because you got it, just because it's right, doesn't mean anyone's gonna give a damn.

Metadata lives and dies and people get paid a lot to create, preserve, and migrate it. But it's invisible and of unknown value. So we spend more time worrying about it than what it is describing. We shouldn't lose sight of the underlying artistic creation that makes it necessary in the first place, in this world of digital access. A page of poetry or cut of music, a clip of film that people fell in love with, 100 years ago. And maybe 100 years from now.

In the future there will be no record players. You'll want to be able to find Miles Davis, won't you?

Saturday, June 13, 2009

Vistas


As I catch up on all the old films I've never seen - and I could never catch up, even if I saw 6 a day and limited myself to what was made before 1945 - the differences between then and now are striking. Besides a mode of cutting (much slower) and a level of discourse (most of the golden age writers came from Broadway), there's a sense of place that is so resolutely real and beautiful.

Films used actual locations as part of their production value, often as part of the spectacle of visual entertainment. Movies had traded from the very beginning on showing something you had never seen before, whether it was a train coming into the station head-on, the beheading of Ferdinand, Death playing chess, or the images kicking around inside Fellini's head.

Film, being a photographic and (deceptively) real representation of what happened in front of the camera, brought the world to us.

Producers would, as part of their production strategy, cast the location as well. Even small family dramas such as "Rebel Without A Cause" use striking locations to give a sense of scope and natural beauty. (Hitchcock, perversely, would fake some of his up, but that's an epistemological discussion for another time.)

Of course filmmakers use trickery, intercutting, and stunt doubles to convince us that they travelled to Rome to shoot their ancient saga (and not Bronson Canyon). But the setting was considered an important character in the film.

We saw vistas used to dramatic purpose as late as the '90s - I remember "The River Wild" being more interesting to look at than to listen to. But in the current age of CGI, producers have gotten used to, and audiences have grown to expect, whatever spectacular and fantastic visual fireworks can be created with computer graphics.

So we get the nuanced cityscapes of "The Dark Knight," the miles of weather-ruined vistas of "The Day After Tomorrow," and the virgin-sand, blue-sky chicanery of "Cast Away."

It's all spectacular and beautiful. And it's fake.

They can do anything. 90% of what you see in "Star Trek" doesn't exist in the real world. Everything down to the lens flares coming off the lights on the bridge was created in a room of computers months after shooting, and I'm not entirely sure about Chris Pine either.

TV with the 100-cuts-a-minute CSI style editing feeds into our need to see something shinier, faster, brighter than the last half-hour. We expect it now; we've culturally forgotten that locations and place used to be an important part of the texture of movies.

Movies used to show us people, real people, in big places doing interesting things.

No wonder people don't fall in love with the movies anymore. "Casino Royale" attempted to reclaim this dynamic after the cartoony "Die Another Day." And that's why I prefer "The Eiger Sanction," which seems to be shot entirely on a mountain, to "Cliffhanger," which only tries to convince me in certain key scenes. And even when "The Eiger Sanction" got boring, which it certainly did, I could always let my mind wander and look at the scenery.

Thursday, June 4, 2009

Something/Anything?


Readers going back past the last couple of posts here will clearly note that I've gone through various stages of thinking about what this blog discusses.

I started out bemoaning the de-sacredification (new word) of the movie-going experience (such as in Hollywood Ending and Nice Things Destroyed). I was conscious that a new generation wouldn't fall in love with film, and it was a ripe and sitting target close to my heart at the beginning.

As I continue with grad school, I've come to a new understanding of what archiving means, and how the past intersects with the future. What do you save, why do you save it, and for what purpose? Not everything should be saved, even if it's becoming obsolete or out of favor.

That exploration led to a kind of phenomenological investigation of the indexical qualities of film vs. video. Around December of last year I struggled into an amateur Lacanian discourse about aura and semiotics, while trying not to use those words too much. I'm glad I got that out of my system (in Worked Matter and The Real and the True), but I still believe our cognitive relationship to what's filmed and projected, versus what is captured and streamed, is very different, and shapes whether we are transported by what we see, or merely amused.

So, I've realized how deeply we're steeped in a new age of spectatorship and reception. By January, I'd accepted the Border Between Calm and Catastrophe. The new moguls and post-moderns engage with moving images and films not as objects, not as site-specific performance events (such as a movie theatre), but as hypermodern events, infinitely accessible and duplicatable.

Although, importantly, not with the same qualities. The future of moving images is as a stream. That matters in the long run as we figure out what we will be saving, preserving, restoring, and archiving for some mythical future audience: the actual object, a digital version of the analog object (which won't have the same stability or longevity), an analog copy of the digital object (which won't act in the same interactive, dynamic ways), or merely a proxy that approximates the look and feel, like a faded souvenir.

I've always been aware of the impulse to fetishize the film object (Fin de Cine) and am guilty of it myself. But more and more, moving images will be born digital and delivered that way, never enjoying status as physical objects. Without existing in some repository, they will remain around only as long as people copy and share them.

When they stop being used and migrated, they will deteriorate, lost to the past, in a cloud of memory.

This realization of the shift in the quality of moving images will be reflected in where I hope this blog is going.

Tuesday, May 26, 2009

Incredible Third Dimension


Pundits, and Jeffrey Katzenberg, would have us believe that the new 3-D is a technological breakthrough that will change the face of motion picture history and narrative story-telling as important, as game-shifting, as the coming of sound and color.

Perhaps. But it smacks so very much of gimmick. You know, like the last two times 3-D came around, in 1952 and in 1982. It's about right on schedule, if a couple years late.

Blame it on the theatres, who don't want to install those $100,000 digital projectors.

Those 2 times, Hollywood was in desperation panic mode: first when television caused people to stay home (attendance never recovered from its mid-'40s levels), then when VHS and cable did the same in the '80s. Now of course it's the Internet, and the desperation is acute. At least back then the studios figured out how to sell their product to those young upstarts (making more billions producing TV and selling rights to ancillary markets); now it's different. When content is routinely ripped and streamed online, they find themselves holding a bag filled with hype and the sound of crickets TiVoing through the commercials.

If some new and technologically-dependent system forces people to go to theatres again, maybe Hollywood will survive this downturn. So everything under the sun is being produced in 3-D versions now, including a new version of A Christmas Carol (with Jim Carrey, whom I always thought was 3-dimensional enough), a remake of Piranha, and new "second eye view" re-renderings/re-issues of every Pixar movie.

But it has yet to be proven that films can adopt a spatial dimension into their narrative strategies with any traction. The flat and indexical surface of a photographic image works as an artistic abstraction that creates meaning from counterpoint, sequencing, and measured use of composition.

Pulling the plane of focus off the screen's surface reduces your viewing experience from a narrated one to a vertiginous demonstration of technological disorientation.

All it does is make you endlessly re-adjust your focus. If this really does "change" the way films are made and told from now on, I'll stay home and watch my flat screen. And of course the proof of the new storytelling mode will be how it translates to all the other devices and places people watch films: at home, on their computers, on a hand-held device.

None of which will have the immersive 3-D technology - which is why it's being pushed. It can't be duplicated elsewhere, and you have to go to the movies again. Out in the real 3-D world.

If the content doesn't work anywhere else but in giant digital thunderdomes, the long tail of revenue has been prematurely flattened.

I wouldn't buy any version of "The Polar Express." 3-D seems more at home at Disneyworld.

Saturday, May 16, 2009

Prints And The Revolution


Film prints, it is reported, can last upward of 200 years if treated properly: kept in a dry and dark place, with a minimum of humidity, and not stored next to gasoline, mold, or vinegar.

Yet there is a move nowadays to digitize everything and get it all saved. To preserve it once and for all. The precious visual imagery of Hollywood past, your grandmother's photos, the paintings of the masters - all must be stored on computers.

Safe. Digital and forever.

But the words "digital" and "preservation" don't belong together. To preserve something is to freeze its physical state in a moment close to perfection or originality.

For a while. For a long while.

The format in which it is frozen must be stable, or it isn't preserved - only copied. Digital formats are electronic rather than analog and object-based; they aren't preserving the thing, only changing it into another format: ones and zeros. You don't make a new digital object, you only duplicate the information, elsewhere and over and over again. This suggests that digital is endless; actually it's only promiscuous.

You never save a digital object - you only make successive copies, over time and from system to system.

Sure. I guess that could work. But magnetic storage devices are vulnerable to every magnet in your home, wallet, and cell phone, to gamma rays, and to the passage of time, in non-predictable and catastrophic ways. It's all intangible and temperamental. Digital discs fail suddenly and fatally. A VHS tape may turn to snow over the years every time you watch it, but even as it stretches it still pulls its way through the heads. A digital file will work at 2:00 p.m. and then not at 2:05.

Recopying over and over has its pitfalls as well. Each migration of a digital file to a new medium loses 1% to 5% of the formatting, depending on the conversion and compression strategy.

Digital looks fantastic, when projected properly, but that is only for the short-term. There is no standardized way to keep it.

Nowadays the Hollywood studios all preserve their films - even (and especially) the ones shot on digital cameras, which have no negatives - by burning them out onto 35mm celluloid as three-color separation masters.

Analog. That's how they preserved old Technicolor films, too. "The Wizard of Oz" still looks vibrant every time it's reissued in a new digital format because they go back to the color film negatives - the original best copy with all the visual information still intact, not the 2K scan they did in 2002.

Those film masters'll probably still be around in 200 years. All the drives the digital files are on will still be here, too. As door stops.

For now, the only way to preserve those digital files is to move them to an analog format. Which renders them less than they were - no longer having the qualities that make them unique, interactive, portable, "digital" - in exchange for making them stable.

That is the crux of the revolution happening now. Do we abandon physical objects in order to look forward, or incorporate them into our preservation strategy, embracing a nostalgic and retrograde technology at the risk of preventing progress?

Friday, May 8, 2009

New Moguls


We are nostalgic for the old studio days, in which a paternal (or dictatorial) regime bought, sold and traded properties, actors, directors and screen counts to fill the theatres of the nation.

Many classics were made under this system, and many more low (and high) budget programmers. It was a "mode of production" that kicked in around the capital-intensive 1920s, didn't really begin deteriorating until the Paramount decree in 1948, and came to resemble full nitrate-style deterioration by the '70s, with bankruptcies, mergers, the selling off of ruby slippers, and Disney buying ESPN in an attempt to diversify out of the film business.

The studio system is dead, and independent production is now dying as well. Part of it is that there's no loose money in the economy. Part of it is that there are no audiences anymore willing to go out and take a chance on art films, no culture to discuss and support marginal, original, challenging films; only the roller-coaster blockbusters.

Part of it is the "film experience" itself - it's changed for good.

I'm not just talking about people not going to theatres anymore (I did that here). I'm not talking about how everything seems to be bootlegged and online weeks or days after their expensive wide theatrical releases (I did that here). Making cinema itself has finally become decentralized. Less privileged. And finally democratic.

The avant-garde promise of 16mm in the late '50s and early '60s was hindered by the high barrier to distribution. The VHS and DVD revolution allowed home-filmmakers to get their product out to a public if they could reach them. Now with the Internet, the barrier to entry is near zero. If you have it, you can download it. Even if it isn't yours.

And it's therefore available... TO THE ENTIRE WORLD. The biggest theatre ever.

The second important part of the film equation that has become democratized is the equipment. Not just digital cameras - which at the high end (the Red, which shot Soderbergh's Che and Sommers's upcoming G.I. Joe) now go for under $12,000, one-twentieth of what similar cameras did 4 years ago - but also the ability to edit, score, and of course upload.

In 1970, an 80-minute film shot on black-and-white 16mm stock would cost $200,000 at least, in lab work and processing alone. And that was if you could borrow the equipment from your local film school. In 2009, an 80-minute film shot on digital doesn't have to cost more than $100.

The final part of the way cinema has moved away forever from the studios is the changing aesthetics of audiences. With QuickTime streaming, bite-sized iPhone links, and portability valued over a ritualized community experience, it simply doesn't matter that your film isn't in the multiplex (or arthouse) down the street. You don't even have to burn a copy - you just upload to Rapidshare or YouSendIt and you're a film distributor.

The new Saturday night is pulling out a camera and capturing footage, miming to your reference copy of "There Will Be Blood," ripping music in, and uploading to your stealth site. It doesn't even have to go viral - those 12 hits back home in Kentucky mean more to you than any $30 million opening-weekend gross. The studios didn't keep much of that anyway, after the independent contractors got done with them.

The new film experience is at home, with your friends and less than $1000 in equipment. You can carry it around. You can be your own mogul, in charge from beginning to end.

You can imagine how much this panics Hollywood, which doesn't know how to crack this market. Or even if it is a market.

And if your film is good, well then... prepare to be co-opted by a young executive in Culver City, charged with trying to hold on for dear life to last year's model.

Saturday, May 2, 2009

Insurance Insurance


I recently got a small (if you call a terabyte hard-drive "small") portable back-up for my computers at home. It plugs in and automatically copies any new files or changes since the last time, once an hour if you'll let it, and makes a full back-up every week.

This is so if (when) my computer crashes, I can restore almost all my information onto my next future door stop. The key is that I have to keep the thing plugged in, or at least have done so 5 minutes before my digital storage and work device goes south for the winter.

I'm not sure what the best strategy is for back-up. Once the drive is full, it apparently begins to piecemeal write over itself, and the clear headroom it needs is now clogged with old "back-ups." Is it a back-up if you never use it, or does it merely have the "potential" to be a back-up, unrealized and impotent until it's cured, released to fulfill its destiny?

There's no instruction manual - well, there is, but it's 26 pages in 13 different languages. That's not a manual, that's a collection of inserts stapled together. Once you plug it in, it automatically loads the full, complete manual onto your computer's hard drive (and then proceeds to back it up for you, back onto itself).

If you don't install it with the pre-loaded software, the manual doesn't load and it asks you if you'd like to.

It knows what you've done, or didn't do. It's post-modern; it refers to itself. It's recursive in that it needs itself to work, to create its own presence for you to figure out how to work it (assuming you even read how).

You can otherwise merely let it do its magic without a thought and have a blissful and unfounded faith that everything is now "safe."

I don't know if I want my back-up to be so self-aware. I appreciate not having to be in charge, but if it's infringing on my need to not think about it by being so insistent, I'm shopping for a different back-up strategy that's a little less needy.

Monday, April 27, 2009

Man Without A Movie Camera


Peter Tscherkassky, in Austria, comes from a long line of avant-garde artists who prefer to work with the actual material of film. Man Ray, and Stan Brakhage and Bruce Conner after him, all preferred to work with celluloid, and most resisted making the transition to (arguably cheaper and more flexible) video, and then to digital.

Tscherkassky starts with previously existing footage, reprinting (and misprinting) images over themselves, misregistering the film so sprocket holes become visible, framelines jump in and out of focus, images blur and stutter on top of and into one another. It's filmmaking without a film camera - only the darkroom, some unexposed film, and some previously exposed work which becomes the victim.

His films approach a kind of virtual performance art, embracing and emphasizing the visual surface of the object, as images run through a projector, and we "read" them as they flicker past. It's about film stock and dim light and flash cuts.

The tension has less to do with the narrative content (although that has something to do with it) than with whether or not the film will actually make it to the end without fatal mishap.

This process of re-printing may result in the destruction of the representational. Photography is unique as an artform in that it is more than artistic representation, more than symbolic rendering - for all the artifice created by filmmakers (before and after the shutter is opened), film carries a profound and powerful meaning by being an index of reality.

By moving onto a post-modern and self-referential plane, what's shown is no longer nearly as important as how it is shown. Tscherkassky obscures the text, and seems to suggest the film is an index of itself, a kind of recursive performance art in which the film itself is the subject.

And in this process which seems little more than a film-school exercise, subtext in the imagery (which may or may not be intentional) becomes highly visible. Normally "conservative" footage reveals itself to be laden with political meaning, symbolic and stripped of narrative limitations, now free and convulsing as it's seen.

Who needs a camera?

Monday, April 20, 2009

One Year Anniversary

I've been at this one full year, and I'd like to direct you to 5 of my favorite previous posts. When you write a blog, generally only the most current postings are visited, the others falling into disuse and neglect through the tyranny of the "archive." So the best (or at least most interesting) work you've done in the past gets hidden.

Should I wait before posting something new, so everyone can see the latest as long as possible, or do I keep putting up new stuff, generating new hits and hoping to finally get it right?

Some of my favorite posts of the last year:

May 26, 2008 - These Sawdust Caesars



...in which I explain most emotionally that the teenage movie-going audience had more power to change the movie-going experience, and maybe what movies were actually produced. But they just. Don't. Behave.

June 12, 2008 - The Perfect Time to Think Silver



... a quote from Warhol, and a meditation on nitrate, the avant-garde, silver screens in cinemas, and how nice that all is.

October 17, 2008 - Forget It, Jake



...because I got to talk about my favorite Hollywood film, "Chinatown" and tie it to new audiences who don't seem to appreciate it.

January 12, 2009 - The Border Between Calm and Catastrophe



... an actually (for me) optimistic and realistic acceptance of the coming age of digital cinema, with the caveat that it makes us anxious, a cool title and picture of Edie Sedgwick, who was poised on the border of catastrophe herself.

July 2, 2008 - Nice Things Destroyed



...in which I elucidate a main concern - that this thing called movies may go away if we don't pay attention - with a clearer explication of film as object, without going into the technical or academic theoretical specifics I sometimes do (here, for example).

Sometimes I shouldn't try so hard.

Thanks for reading.

Wednesday, April 15, 2009

The Worthy Actioners


The magnetic tapes behind Glyn Johns' original acetates of the "Let It Be" sessions, originally known as "Get Back," survive and allow future generations to hear the original versions of what was supposed to be the Beatles' return-to-their-roots record, after the over-produced "Sgt. Pepper's" and the over-determined White Album.

"Abbey Road" would come later, perhaps the most fingered and massaged record of their career, disarmingly named simply for the location of its creation, with a cracked brick sign on the back, disingenuously suggesting that it was humble and authentic, but indeed close to "the end."

There are, by most accounts, over 100 hours of reference tapes made by director Michael Lindsay-Hogg's crew during the filming of the "Let It Be" sessions. They were made on monaural Nagra machines with room mikes at Twickenham, and have an audible "beep" every minute for syncing - audio "cues" regardless of what future Beatle classic or past Chuck Berry classic is being played.

These also still exist, and have been bootlegged in various forms over the last 35 years. The coating on these magnetic tapes has lasted. Every fight, missed note, snippy comment and cigarette break has been preserved. The old masters of the "Yellow Submarine" originals allowed a remix of that soundtrack 9 years ago that sounded better than any playback technology back in 1968 could demonstrate.

But Glyn Johns may not have been the best fit for the greatest pop band in the world. He had been producing the greatest rhythm & blues band up to that time, and created 3 different versions of the Get Back record, none of which met the Beatles' approval. The tangled history of the Get Back mixes includes getting leaked and bootlegged, and becoming a blunt object that came between the Beatles while they were trying to figure out how to stop being Beatles themselves.

This allowed "Abbey Road" to be recorded and released in the meantime, until finally Phil Spector took what was supposed to be a stripped-down and naked collection of improvs and dressed it up in hooker's make-up to get it out of the house one last time.

No one was happy, except maybe Capitol/EMI. The damn label didn't even have the right color apple on it.



The complete rooftop concert on January 30, 1969, some of which ended up on the Spector mix, became its own unique version of the "Get Back" sessions. Filmed and recorded specifically for the film, with most of the songs performed twice ("Take two" - the best takes landed on the record.), it has its own integrity and historical circumstances. Not only is it the last Beatle "concert," it has these provocative production realities embedded within it to create a subtext beyond the shortened playlist.

30 years later, 2 Beatles were dead and Paul finally intended to pull those Spector strings off "The Long and Winding Road," which had been bugging him for the last 3 decades. With the original tapes, he spearheaded a final and Beatle-authorized version of the sessions, once and for all.

The resultant "Let It Be...Naked" has a different song order, with the half-assed improvisations such as "Maggie Mae" gone along with the incidental talking that always seemed precious and a little fussy. Instead are only Beatle songs, in democratic order (John, Paul, George, Paul, Paul, John, George...).

They even "fixed" a sour guitar note in "Dig a Pony" and edited the first half of one "Don't Let Me Down" on the roof to the second half of the other.

This most recent version, "naked" and without Johns' or Spector's superfluous influences or attempts at authenticity, exists due to the longevity of the original tapes and the ability of new digital tools to manipulate the information on a granular level.

The original performances, drunken or strung-out, never properly played, captured or released in the spirit in which they were intended, come to us a 4th time, this time due to action of one of the participants, the resilience of the original recordings, and 35 years of hindsight.

In many respects, it's the most manipulated version of all.

Saturday, April 11, 2009

Altered States


I saw "Eraserhead," rather naively, at a midnight show a long time ago before I quite knew what I was getting into. Not everything was available on VHS in the late '80s, and actually going out to see films in theatres was part of the movie-watching process.

"Eraserhead" is an experience not quite like any other. It's like a film from another planet. The viewer may be able to distinguish the shot-countershot construction, deduce the narrative, and understand protagonist-antagonist conflicts. But the very way in which it is told - the actual visual fabric of the film, an askew, out-of-step, filmy and mythic cadence - comes through in a non-intuitive, instinctual, and subtly but ultimately unsettling way.

I walked out of that film feeling stoned, unable to see the world in the same way. Lynch used to have that power, and that's why we are still talking about him, perhaps like we will talk about no other filmmaker. Images now are delivered online, in digital byte-sized pieces - not large, overpowering, linear, and unmediated.

No longer through the eyes and ears, right into the soul.

The last time I felt that way in a David Lynch film was walking out of "Blue Velvet." I wasn't high then, either, except on the film, which seemed on the surface so "normal" but underneath so subversive, transgressive, and morally frightening. That was the whole beautiful idea, of course. Eventually Lynch knew what he could get away with - or was getting away with - and "Twin Peaks" and "Wild At Heart" don't seem other-worldly so much as merely weird. I made it a point to see "Inland Empire" in a theatre, all three hours of it, but still came out more annoyed than anointed.

Art has the power to transform the viewer, but only if it's allowed to be received in the best possible, most effective way. Art has a way of making you drunk, high, confused and immoral. Of wanting you to go out and take down the government. Or make art of your own. The best art doesn't compromise. And as a viewer, you shouldn't compromise how you engage with it.

Sit back, shut up, pay attention, and let it work on you.

Tommy Wiseau's "The Room" has a similar power to intoxicate. An overpowering mix of inept acting, plodding plotting and confounded mise-en-scène, tied with a (perhaps unintentional) sincerity worse than Ed Wood's. Micro-budgets are the new authentic, and perversely he reports the film cost $6 million to achieve its epic shabbiness. Perhaps if Wiseau ever makes another film, he'll be revealed as just an artless opportunist. That'll be a shame, but perhaps to be expected.

His film has been playing for almost 5 years in West LA, once a month at midnight, and being out late, on the Sunset Strip, in a movie theatre, certainly adds to the insidious power of the film. For now, I'll be happy drifting along considering him in touch with something otherworldly. Something alien and special and wrong and outside the majority of us.

Not a lot of films make me feel that way nowadays.

Sunday, April 5, 2009

Contempt


Sometimes the conversation gets around to what good movies there are about Hollywood. We know most of the usual suspects, but which ones, we sometimes wonder, really capture the true essence of Hollywood and the movie-making dream machine.

They've been making movies about movies pretty much since they started making movies. Chaplin made more than one behind-the-screen film as early as 1914, trading on the assumption that audiences already had an understanding of how the flickers were manufactured: on sets with cameras on tripods, fat directors, and piles of flour and styrofoam pillars just waiting to be thrown or tripped over.

The films have changed as Hollywood - and its perception of itself - has changed over the years. It's a continuum. Early in the sound era, Hollywood was content to place romances or gangster plots on the backlots of the studios, using the soundstages as picturesque backdrops and taking the opportunity to stick a cameo or two in there. 1930's "Free and Easy" and 1938's "Crashing Hollywood" both have a jovial let's-put-on-a-show tone that belies our foreknowledge that both stars were drunken has-beens by the end of the years in which they were produced. 1936's "Hollywood Boulevard" follows a has-been silent actor and initially criticizes the fickle tastes and fortunes of those tied up in star-making, but ultimately devolves into a blackmail story as he tries to regain his respect, ending in a Hooray-for-Hollywood finale with barely an ounce of irony.

Things had changed after the war. Long after the Arbuckle and Normand scandals and a couple of versions of "A Star Is Born," Hollywood was producing (or allowing to be produced) such poison letters as "Sunset Blvd.," "The Big Knife," and "The Bad And The Beautiful," all haunted by a Budd Schulberg snark that links decadence, histrionic acting and an insider cachet that made them seem new and modern. The entertainment media was reporting on stars, deals and how much money everyone was making, and the capitalist inspiration could not be ignored. Television cast a pale glow on Hollywood that made everything seem fallow and undead. We all hated Hollywood, in part because we wanted to get rich like everyone else with no talent had.

By the late '60s, there had developed a fatalistic resignation and a sense of humor about how Hollywood corrupts - absolutely. A score of films, including Charles Grodin's "Movers and Shakers," McTiernan's "Last Action Hero," and Mamet's "State & Main," playfully and rather impotently tried to pierce the veil of deceit, ego and arrogance in Hollywood, none to much financial gain. Which meant the adage was true about people not wanting to see films about films. If we wanted to be told we were suckers for believing all the lies we were told, we'd become screenwriters.

A minor counter-movement tried to regain a sense of glamour and opportunity Hollywood offers in "The Big Picture" (1989), "Hollywood Shuffle" (1987), and even "The Muppet Movie" (1979). This would soon be balanced by a darker and excoriating trend started by "The Player" (1992) and continuing through "Swimming With Sharks" (1994) and "An Alan Smithee Film" (1997).

Except for "The Player," all bombs at the box office. And tellingly all the films minus the "A Star Is Born"s are basically pitched as comedies, most rather dark, bitchy, and absurdist. The topic may dictate the approach. Yet they're all, strangely, good-natured.

They're throwing darts at the thing they profess to hate but can't hide their affection for. People make movies about things they care about, and they care about Hollywood. The process. The struggle to do good work. To do any work.

My favorite documentation of the process is still probably Godard's "Le Mépris" (1963), from the increasingly cynical early 1960s (and an increasingly cynical Godard), which folds Brigitte Bardot, the Odyssey, Hollywood hubris, Fritz Lang and the French New Wave into a concoction that gently and effectively skewers the Hollywood process, and all the drama that happens behind the scenes when people are trying to make art.

And as a bonus, it has those loving shots of Bardot's naked butt, which were inserted at producer Carlo Ponti's insistence - an insistence Godard followed to the perverse letter, creating an abstract, discordant, beautiful visual non sequitur as she and Piccoli talk: a color-geled panning shot up her half-draped behind for what seems like the first 10 minutes of the film.

That imagery colors and inflects any amount of wry commentary that might be thrown our way for the next 2 hours. I believe Sofia Coppola was up to the same trick in "Lost In Translation."

It's the perfect example of how films are not born whole but are the sum of their discordant parts, regardless of the inspirations ("I want you to have more shots of Bardot's rear in the film!") that get them there. It's an essential ingredient in what could otherwise have been a bitchy avant-garde satire of Hollywood; instead it suggests character motivations, and all the ways films arrive at our multiplex - something more real and life-affirming than Hollywood back-stabbing.

Now Bardot's derriere is an essential part of Godard's love poem to cinema, to montage, and to the viewer's fantasy of Hollywood and picture making. Yes, they call it show business - but they also call it show.

Wednesday, April 1, 2009

Hypermodern


The Archives Trilogy - Part Three

_ _ _ _ _

We've reached a stage where we have access to anything we may want... or think we want... without going to other people or other places to get it.

We have moved beyond the age of post-modernism, in which everything is ironic, disconnected, and self-aware. Classical meaning is undermined by the post-modern, and the ability of archives to curate and contextualize is undermined as well, specifically by the audience's disregard for the effort.

Now post-modern meaning has been undermined as well.

We've arrived at what can be described as "hypermodern." Experience is stripped of context, even ironic context, as everything is everywhere and all is available and within reach. Nothing trumps anything else - there is no authority or final word. Only comparison.

We are no longer limited by the burden of the past or of geography. We aren't defined by politics or our access to goods. We travel through the virtual realm of social networks without being social. Online communities that have no tradition and no memory. Political "clouds" that have no force, because they have no mass. All access, no opinions.

We're alone and by ourselves - together.

The technological advances have lured us away from the comfort and awkwardness of social situations where we go bowling with bosses, or negotiate baby-sitting, or talk face to face about what is wrong, really wrong with "Slumdog Millionaire" and can you back that up, mister? We don't go out on dates - we hook up online, tweet at the club, and bang in the back of the roller disco. We're reduced to our individual IP addresses. And there are no consequences.

We are all travelling in the same direction, but in our little boxes. When there are no social ramifications to actions, do actions matter? And if there is no link to tradition or history, there is no reason to preserve or value it. When context no longer has meaning, archives - specifically designed and charged with preserving culture - lose their purpose.

When people are no longer able to access the flow of tradition or the arrow of history, will there be a reason to go back to old films?

Cultural resonance is embedded in what passes before us. Advertisements blend with reality television blends with post-anti-neo-architecture. Fellini and Bogart don't appear in anyone's remixed postings (a cultural force no one's archiving either) as much as Stephen Colbert does. Sergio Amadio nowadays makes the cut by virtue of Italian democracy. Or is it Gloria Guida?

Do we fight that or bathe in it?

A post-modern archive may attempt to embrace the new convergent intelligence of the hypermodern by running into the oncoming traffic of re-mixed media, allowing digital access to non-authorized versions and recreating a social space around the artists rather than the artwork. The archives, unable to name, arrange, control or limit material, must ride the web of meaning, not try to assert a gravity at one end where too many strands have already been unraveled. Everything's free on the digital highway - intellectual property, advice, storage, context, and meaning.

So value travels back to the experience of spectatorship, not the ownership of objects or the control of the information. Instead, merely a familiarity with it, engendered by YouTube and BitTorrent and delivering a pale ghost of the original impact.

Archives can't be the sole curator of culture anymore. The copyright holders have taken back their property. Culture comes to the people, not the people to the culture.

An attempt to embrace the art-space around the content rather than to limit, drive, or control the content, is how an archive of the future will remain a resource. A move away from the object and to the experience. From static to dynamic.

There will remain a need or desire to repurpose material for whatever the new digital delivery system will be . . . redoing what has already been done, over again, to the extent that our attention and the finances available will bear.

That work is now to be done by others, unofficially and without authorization, in a vacuum.

For future users of archives, the destination has to be worth the trip.

Monday, March 23, 2009

(Mis(Re))Presentation


As digital projection systems slowly infiltrate neighborhood movie theatres, it's worth noting that old-style mechanical systems have been pretty reliable for the last 100 years. That's a big reason why they're still around. Three years ago, when I worked with AMC, there was a digital system in San Francisco that broke down about once every 2 days - maybe once every 10th showing.

That's a 10% failure rate. That's unacceptable.

I have been in enough movie theatres to be there when the film has broken in the middle, rather than at the beginning (which strongly suggests user error - that it never got started correctly). I always presumed it had something to do with the shape the prints arrived in, but having worked in theatres, I now know that even in the middle, it's more often something the projectionist did - or didn't - do that's at the bottom of it.

Of course there is no projectionist. Actual trained union-certified projectionists nowadays are a pleasant anachronism, and sure as hell aren't in your local multiplex. The platters installed hold the entire film on one horizontal reel and have allowed one candy girl to run 12 or 16 films at a time, pushing a button and walking away to the next projector. That's why if it's out of focus it stays out of focus. She's over in house 10, or downstairs putting hot dogs on the roller.

The theatres got rid of the projectionists in the '70s. Film prints get irreparably ruined about twice a year, and cost the chain $2000-$3000 or so each. The savings from cutting one union projectionist were about $45,000 annually. Do the math.
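Since the author invites us to do the math, here's a quick sketch using only the figures quoted above (the midpoint print cost is my assumption):

```python
# Back-of-the-envelope version of the "do the math" invitation,
# using the post's own numbers; the $2500 midpoint is assumed.
ruined_prints_per_year = 2          # "about twice a year"
cost_per_print = 2500               # midpoint of the quoted $2000-$3000
projectionist_salary = 45000        # quoted annual union salary

print_damage_cost = ruined_prints_per_year * cost_per_print
net_savings = projectionist_salary - print_damage_cost

print(f"Annual cost of ruined prints: ${print_damage_cost}")
print(f"Net savings from cutting the projectionist: ${net_savings}")
# → Annual cost of ruined prints: $5000
# → Net savings from cutting the projectionist: $40000
```

Even at the high end of the quoted print cost, the chain still comes out roughly $39,000 ahead per booth, per year - which is the whole argument, minus the bad will.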

The key variable in that equation is the word "irreparably." More on that in a second. Also hidden in those savings is the amount of bad will created when a film breaks down and stays broken. If you had an expert up in the booth, the celluloid wrapped around the gears and electronic brain would be discovered early and undone in a minute, and the show would go on. As it is, if the noon show of "The Da Vinci Code" goes haywire, you just hand the audience passes and direct them down the hall to the 1:10 show.

It's showing on 4 screens, and the built-in redundancy lessens the short-term risk.

I remember a guy a couple of years back, when I worked in theatres, who came back with one of those passes and went to "Lady In The Water" instead. His anger at this misguided decision was mitigated by the fact that this one broke half-way through as well, and he was relieved of the latest M. Night tomfoolery (until "The Happening" of course - more due next year) and free to go for "The Break-Up" the following week.

That dumped three-quarters of the way through as well, and I directed him to the free-pass line, where he collected another admission ticket that allowed him to see "Miami Vice" the following week.

"Vice" didn't break (at least, I didn't see him the time that it did) (it only seems like it with that cold opening) but he had managed to see at least 5 hours of bad Hollywood cinema on one ticket price, 4 parking fees, and an inordinate amount of bother and anxiety.

That's what going to the movies is about for him now. And I'm not sure he will be so easy to get back once he discovers Netflix.

People respond differently depending upon the movie. If they're enjoying it, they'll patiently sit and wait, counting the minutes as the projectionist resplices or rethreads, polite but anxious. But if they've been hating the entire experience ("Crooklyn" comes to mind), the audience will revel in the opportunity to demand - nay, insist on - their money back; NO they don't want to wait, NO they don't want a free pass... unless it's good for something else too; and when does "Crooklyn" leave? I'll be back one day after that.

When a film breaks or freezes, the frame in the gate melts and burns from the heat of the lamp, creating a spectacular 20-foot mandala onscreen. It's only a frame or 2, and you splice that bit out to put it right. But when the film wraps around the platter "brain," you have to cut through inches of stacked celluloid, wound around the mechanical feeder in the middle (the brain), and then get it feeding back in order.

I once cut chunks out of a print of "Snakes On A Plane" and threw them on the ground trying to free the film from a particularly nasty tangle of machine knotting. I then haphazardly spliced the pieces back together, and the mish-mash of intercutting seemed to go with whatever the hell was happening at that time, in or out of order. At least no one ever mentioned the apparent avant-garde editing strategy during that one portion of the film.

I'm guessing all the pieces were right side up.

These brain wraps tend to reach critical mass late in the films: after slowly wrapping themselves tighter and tighter for an hour, working themselves into a tight ball of acetate plastic around the brain, the film finally stresses and breaks at the 3rd-act mark, just as the hero has begun to enter the villain's lair. By that time, there's 4 inches of film wrapped around the inner roller-set of the platter, and when it stops, it means it. Someone's going to be untangling celluloid for the rest of their shift.

I never saw the last half hour of "The Italian Job" - I presume they got away with the money (or perhaps not - there's a sequel coming). Maybe the next time they'll get it right. But only if the projectionist does.

Thursday, March 19, 2009

New Model. Original Parts.


In an amazing marketing move, Universal is selling the new "Fast & Furious" as the old "Fast & Furious" - no Roman numeral 4, no subtitle - not to fool us (though that may not be entirely unintentional), but to remind us of what we think we like about the series.

Vin Diesel has always been the best thing in his films, but that depended in a way on the films being pretty bad. As soon as he moves too close to the camera or up the credit list ("XXX," "A Man Apart"), his power diminishes proportionally. But I still remember "discovering" him in "Boiler Room" and "Pitch Black." He was on the sidelines, with more charisma in a single look than Spielberg could pull out of Matt Damon with far more screen time and 10x the budget.

While audiences may think the "F&F" franchise has drifted since Justin Lin's previous installment, the only thing to do is go back and make what's old new again. It helps that Diesel is flirting with straight-to-video films now (an ignoble fate for the guy who stole the original "F&F" from a bunch of really cool car chases) and is exec-producing this one.

And in a telling detail, Lin directs again as well. It's not about who's behind the camera but who's in front of it. It won't matter if this is a remake of the original or a new plot taking place in Singapore (as IMDB would have you believe) or in London, as the above (premature) pre-release poster toyed with.

With the original 4 actors together again, they should have gone with a Van Sant shot-for-shot remake. The original's only 8 years old, and no classic - not only is everything old new, everything new is new.

I just hope they don't give Vin too much screen time. Less is more, at least in his case.

Tuesday, March 10, 2009

Nobody Knows Anything


(The "Jagged Edge" story.)

William Goldman is a famous screenwriter who famously said the title line up there, in reference to Hollywood and the moguls who try to predict what will be successful. But we are not here to talk about William Goldman, the celebrated writer of "Marathon Man," "Absolute Power," "Magic," and "The Ghost and the Darkness."

We are here to talk about Joe Eszterhas, the celebrated writer of "F.I.S.T.," "Sliver," "An Alan Smithee Film," and "Jade."

It's a fact that Eszterhas has never written a quotable line of dialogue, except maybe a couple from the risible "Showgirls," and I'm not sure who's responsible for those drunken binges.

Eszterhas's heyday was in the late '80s, when he collected a series of ever-rising paydays after being involved in "Flashdance" and "Jagged Edge," getting the (still) astronomical $3 million for "Basic Instinct" (a record beaten 6 months later when Shane Black sold "The Long Kiss Goodnight" for $3.5m plus a producing fee). Eszterhas managed to beat that, in a way, by selling the pitch for "One Night Stand" on the back of a napkin for about $1.7 million. The thinking is that he could expand that to at least 3 or 4 napkins and set the new record.

All of Eszterhas's scripts are similar - a protagonist (usually a woman) enters a dangerous yet seductive, previously unknown aspect of her own past, and finds out that a person she loved or an institution she had faith in did exactly the opposite of what she expected. She's in love with (or has faith in) the morally reprehensible person/thing, and at some point during the running time she is naked or making love a lot.

In "Jagged Edge" lawyer Glenn Close is defending and falls in love and into bed with the slick and attractively rich Jeff Bridges, accused of murdering his wife. The slimy and unconvincing Peter Coyote tries to tell her not to mix business with pleasure, and at the end hard-scrabble detective Robert Loggia (right out of a b-movie) kills the masked killer who's trying to steal the last bit of evidence that's kept there in Glenn Close's bedroom.

Loggia pulls the mask off the rain-soaked dead killer and reveals it was, indeed, Jeff Bridges the entire time. Shock, aha, it all makes sense now, "fuck him - he was trash," love sucks, the end.

But - master director Richard Marquand gives us a shot of Bridges, dead, wet, in shadow, and upside down, that doesn't quite look like him on screen. Many people in the audience "thought" it was him, but felt that the visual confusion of the shot indicated that it wasn't. Because they never got a 100% convincing shot of Bridges, dead and in full focus, they took that to mean it wasn't him.

They came out of the theatre (I worked this film) saying, "But who was the killer?"

"Jeff Bridges."

"But it doesn't look like him."

"It was. Who else could it be?"

"It could be no one else. No one else makes sense."

The film's primary mode of pleasure is teasing us into suspecting and deducing that the only possible suspect is Bridges, while tying a series of unlikely alibis to the 1000-watt charm he was capable of generating at the time. ("The Big Lebowski" in a way hurt his career in that he made it look too easy. We figured out he's walking through these roles, may very well be high or drunk half the time, and is picking roles in which that works as his method.)

The film delivers to the audience exactly what they wanted - but not in the way they wanted it, a solution that's askew and worrisome (and undermined by our own insecurity about the people we love, often for the wrong reasons). Eszterhas in his prime confounded us and teased out an anxiety that was for the most part intellectually satisfying but didn't deliver emotional closure.

Tied to the right director, the texts remained "open."

It's a fine line to walk. Testing and re-shooting the ending to "Sliver" revealed a complete misunderstanding of how audiences engage in these high-trash films.

Once a patron walked in 5 minutes late and asked me what happened in the first couple of minutes. I replied, in spite of myself, that Jeff Bridges had killed his wife, and now Glenn Close, who didn't yet know it, was trying to defend him. The patron thanked me and watched the rest of the film, perhaps enjoying it in a completely different but equally valid way right up to the non-surprise ending.

And they wouldn't have been confused by that last shot of Bridges on the ground.

Closure. Now that's an ending.