Sunday, December 13, 2009

The Comfort of the Cut


As a new generation of horror films changes what we scream and cower at - or at least, what makes us nervous watching on our couch at home - the techniques horror films employ have to adapt as well.

We are increasingly used to streaming and handmade handheld filmmaking, on YouTube and in reality-television-based weblettes. A POV shot follows the protagonist through their home, waiting for the reveal as it rounds the corner to show the mess in the living room, the girl undressing in the back, or the boat stashed in the leaves on the island.

Our response to such imagery is a combination of suspense and discomfort. We're not sure we're following the right people, that we're in good hands, that the plot will unfold satisfyingly - it being shot off-the-cuff - or even resolve, or is resolvable.

A sense of mediation in constructing film narratives gives us cold assurance, even as the films may be out to fornicate with our heads and create anxiety in such sensual blunderbusses as Saw, Irreversible and Funny Games. Yet such horror films (all stemming directly from the overly formalized experiments of Carpenter through to Tarantino) create mood, tension and plot by careful and methodical arrangements of shots and compositions. They all spectacularly show us what we want to see, but on their editorial terms.

They're not accidental. We get tense as we watch a character walk through the dark basement, the camera directly in front of the heroine, in too close - backing up, so we're unable to see what is ahead of her (although since she's facing forward and continues to walk, we assume there is nothing large or visibly dangerous in her view ahead; whatever that may be is behind us and out of "our" sight).

When the camera is instead behind her, following and in close as it moves with her, we may be able to see what's ahead of her (as she does) but unable to know what's creeping up just behind (besides us, and a steadicam rig. We want her to ... turn around!).

These are specific constructions that reveal to us visually only certain information and perversely withhold other information (including the cat that is just about to jump or be dropped onto the heroine's shoulder). A well-constructed horror film manipulates and withholds, teasing out bits of eye candy and gruel in careful measure, juxtaposed in blunt and spectacular manner.

We enjoy this grand guignol display. When it works we give over to the sensual roller-coaster of confrontational and impossible logic, with a hidden design of frustration and surprises that reveals itself only in the process of our rapture if not in the quotidian denouement of the plot. A solid story well told is its own reward, and we laugh as we exit, wiping the flopsweat off our brows and telling ourselves it's only a movie.

The held shot, in suspense, is the single unit of horror that challenges our understanding of the underlying design, waiting to commit to an edit in which the intent of the protagonist is finally exposed: when he decides to go into the basement; when the filmmaker will withhold what's ahead and not behind; when the spectator must accept that what's about to be shown is texture rather than text.

The edit creates order. The edit, a collision of images, is a sequence like a sentence, constructed logically. It is in the edit that tension is created and then released, a question is asked and then answered, in which options are abandoned for the one that is chosen. The spirit becomes body, God becomes the word, the word becomes flesh. The comfort of storytelling becomes manifest. Seeing is good.

A handheld and undisciplined camera troubles us. We simply can't get our bearings. We wait for the angle to change so we can see more. Once we see past Jamie Lee Curtis to what's behind her, we discover how close the monster is, what the space is between them.

We wonder about the how rather than the what. The cut is comforting. It releases us from a film's grip, for an instant and instantly, in the space between 2 frames, and for a moment, we can relax.

Friday, November 27, 2009

History Will Be Written By Nobody


The historical record is probably the most important object a civilization creates. It's not a discrete product or a manufactured building or monument that is pre-ordained, pre-determined, or pre-meditated.

It's not a cultural mandate or steered agenda. It's not controlled and it's not finished. The historical record is made up of millions of memos and emails, hundreds of thousands of news stories and video feeds. Documents and bank statements and journal entries and tape recordings. Photographs. Paintings and graffiti and poems and testimony.

Evidence. It's authentic and it's honest and it's made for reasons other than historical reasons, which is why it is so valuable. It's not worried about how it will look 100 years from now; it's worried about now. It all survives as a cumulative and infinite monument to who we were and what we cared about so that the culture of the future will understand how we lived, why we lived, what we were trying to discover about the world and about ourselves.

The most important primary sources for 19th-century frontier life were the hand-written letters saved by the pioneers. It was a big deal to get a letter in the old days, and endless minutiae were relayed in those pages, which still exist today for historians to discover how things were in the summer of (18)49: how much bread cost, where the roads were being laid down by whose property, who sired what children.

This everyday discourse isn't written down with pen and paper anymore. It's hiding in emails, Facebook news feeds, or Twitter. Its sheer amount - and the perception that it's all so very unimportant noise - precludes anyone from wanting to save it, or being able to, certainly not the people who first created it. Facebook isn't archiving their site... except to mine your data to place ads. While someone's grandfather may still be printing out all their emails, no one I can imagine is printing out all their friends' status updates.

You won't be able to pull your tweets out of a shoebox under the bed in 100 years like you could a box of letters. The vast majority of social interactions now take place between those iPhone IMs and Google Docs, and whole new generations of us will never commit our diaries, business contracts, family photos, genealogy, or bank transactions to anything other than the cloud, up there on someone else's server, where no one's saving it for the sake of its historical value.

Only for its financial exploitation.

We all have stories of the hard drive that crashed last year and lost the pictures of our trip to Disney World, or of our Aunt Lora, who's dead now; we'll never see what she looked like in the last 10 years of her life.

We're likely living in a digital dark ages, right now, and in 100 years we won't be able to know who our friends were, what we said to each other, what roads we travelled next to what properties, how much we made or who sired our children. All, uncommitted to long-term storage and without true historical custodians, will be lost, along, I'm sure, with this post.

Thursday, November 19, 2009

Would You Buy A Used DVD From This Man?


The good news is that we, as a race of consumers, have finally figured out that we really don't need to buy every sell-through DVD in stacks at Best Buy, in spite of any value-added deleted scenes or alternate soundtracks. The DVD of Don Roos's "Bounce" had 120 minutes of stuff that was deleted from the final version, longer than the running time of the film itself.

I for one would have loved to have seen the integral version, all 3 3/4 hours edited artfully together.

DVD sales dropped over 10% last year, and are falling faster this year as consumers figure out how they want to consume their media: paying for a 2-disc special-edition box they may not want to watch more than once, or ordering it instantaneously on the increasingly hi-def devices in their living rooms.

Blu-ray can go home. It's estimated that 20% of people watch some sort of video online daily. Whether it's Hulu, YouTube, or Netflix's streaming, it's clear that consumers aren't beholden to the old model of buying individual widgets anymore. I remember the days of walking into the used DVD aisle in Amoeba and seeing literally 100+ used copies of Cast Away, all for less than a quarter the original price. The disconnect between our need to "possess" a cultural event (which Cast Away arguably was, at least for a month) and realizing we had woken up with the hangover after having drunkenly overindulged was clear to me then. The chilling feeling that our pockets had been picked when we weren't paying attention made us want to just get rid of the evidence and take a long hot shower.

We will have more access to more video and other filmed entertainment once broadband reaches every corner of every coffee shop, every device large and small. Quality will depend on what we're watching and where. We won't have to buy director's cuts of films that had no directors in the first place or collect deleted scenes just to be completists, a Sisyphean quest in that it's like trying to collect everything that isn't there.

What we will be buying is access to all this stuff. It'll be in the famous "cloud." It's up there, somewhere. That means it won't be on your shelf, and that also means you won't be in control of it. Consider it Web 3.0. While the last iteration was nice for all you home-brew radio jockeys who got off on changing facts on Wikipedia and remixing Lawrence Lessig, now the corporations have a chance to feed you the films, the videos, the songs, the content wirelessly onto devices they are building to make sure their content plays just for you.

And plays just from them. That's the bad news. No more all-access t.v.s, radios, or computers. (Or even, iPods, a more restrictive but still relatively crackable container, in part because the songs are objects that can move and morph fairly easily.) Set-top boxes like Apple TV and Netflix's Roku are the beginning of the movement to get Trojan horses into your gates. DRM'd, all of them. Disney has their KeyChest scheme and an impressive handful of other major companies have announced DECE. Best Buy and CinemaNow are in cahoots to build and sell and fill these devices in the short term.

This paying for access through a box we don't have the keys to will eventually replace cable, as well as the easy ability to TiVo or record these things off the "air." Disney, as you may have read, is promising to allow access to any film or program you "lease" from them forever.

Forever is a long time. I'm not sure I believe that.

As we, as a race of consumers, let these big players in distribution take charge of where we get our content, and how much we pay, we have lost an important part of our rights to choose, to browse, and to do what we want with the content we (think we have) bought. Even if it's copying the damn thing onto our drive to remix it and selling the original to Amoeba.

Sunday, August 9, 2009

Spectacles Public and Private


Movies seem bigger than ever and less relevant than ever. We're not falling in love with going to the movies. Because we don't go, certainly not as often. They're simply around too much. In too many sizes. "Star Trek" notwithstanding, and even that feels like a t.v. show that will translate well to my iPod.

The common lament since about "Star Wars" is that filmmaking stopped being an artform (as if it ever really was) and became only about selling tickets. No more cine-clubs discussing Bergman, Fellini or Pakula. But look through the nostalgic fog of a past we read about but didn't live through, and show business is about spectacle and always has been.

From the earliest days, the hits are those that are the biggest events - the ones that get our attention one way or the other. By electrocuting Jumbo, having sound for the first time, being in color, louder, more expensive, by simply being a new take on an old story, better.

Spectacle grabs people's attention. "Transformers" and "Harry Potter" would be at home in a theatre in 1977, but they're wrapped in 2009 digital fireworks. They're not so much films as controlled burns. The aggressive retro-new excess of something like Scorsese's "New York, New York" was its own film-nerd spectacle in its day, artschool indulgence writ large.

It didn't help anyone's career. It didn't help anyone other than the critical studies majors. But at the time it drew its own attention. Worth doing if not worth the price. A conundrum when we interrogate what and why studios produce what they do.

Nowadays business decisions take the ego and arrogance out of the equation. New modes of delivery mean new modes of audiences. What's old isn't new - it's simply new.

The spectacle is the way in which it is engaged: modern, digital, and transformative. The content is less important than the simple fact that there is some.

It's a bottom-up shift, driven by the public who simply don't buy a movie if they don't buy the hype, or buy a ticket in spite of all indications to the contrary if it's what they want to see. The studios are playing catch up and realizing the old ways aren't going to work much longer.

What is available always eventually reflects how people watch moving images. Soon, portably and in chunks, in low-definition - and most fatally - casually. Films won't matter anymore culturally because they won't have a cultural impact. Film will become the moving wallpaper of science fiction.

There will be space for spectacle, for CGI-candy. But Bergman and Pakula are over. They don't translate.

Some will appreciate the past and enjoy it privately. Maybe find a handful of other enlightened individual believers. We will not be watching the same screens.

Thursday, July 16, 2009

Teen Scream


Teen comedies have changed over the last 10 to 20 years because teens have changed.

Of course. The audience is what dictates what's produced, because if a movie shows in an empty theatre, does it make any noise?

In the late '70s and early '80s there was a rash of films about teenagers spying on other teenagers. "Private School," "Fast Times at Ridgemont High," "H.O.T.S.," and of course "Porky's" all depend upon sometimes complicated setups in which teenagers attempt to steal glimpses of the opposite sex undressed in semi-private situations (and often end up naked themselves).

The movies were the only place you could see what a naked person looked like (besides in fine art books), and drive-ins became the preferred and privileged site of such voyeuristic pleasure for teens. Often for more than just what was on the screen.

In the burgeoning age of cable and video, it became easier to experience what was forbidden and withheld. Teen comedies continued to be produced, but they were increasingly out of touch with how teenagers acted and what they wanted - they shifted from a life-style accoutrement to the exploitation they frankly were. I seem to remember some Brendan Fraser films in there somewhere, and the ubiquity of video didn't do teenagers any favors. The increasingly parent-safe "10 Things I Hate About You," "She's All That" or "Clueless" are all based on classics - yet they still feel like your pocket's being picked by 50-year-old men in shark-skin suits.

The "American Pie" movies returned to the earthier trends of the '70s with a knowing, post-modern tone and less desperation in the need to see skin. They simultaneously went farther sexually and embraced a Farrelly Brothers sweetness (which continues through the Apatow comedies) that makes them both controversial and conservative. Now that anyone can see anything online, teen films are no longer merely about the struggle to catch glimpses of naked people, let alone to get laid. Now they strive to make it mean something more than the smarmy sniggering innocence of "Porky's" would have you believe.

The teen films of the '80s are hopelessly dated now, but capture a specific time in everyone's development when being alone with your lust and fantasies was allowed and commodified.

Teens may not have changed so much, but their modes of finding out about the opposite sex have. With the Internet and 100 channels on cable, the sense of discovery is no longer in a car, in the back seat, at the drive-in. It's in front of a glowing screen revealing secrets.

The emotional attachments, the physical and psychological changes we felt while viewing forbidden images (it's something out of "A Clockwork Orange") aren't there for a new generation.

Teen comedies (and sex comedies in general) are carriers of a different kind of information. They're too damn responsible. They're too damn polite.

Wednesday, July 8, 2009

Dark Archives


In film archiving programs much like the one I am in, what you end up learning is a lot more about library studies than actual preservation of film.

What's important now is not trying to find an extant copy of an old lost classic. Let's presume that most of the films that can be found have been... or are deteriorated past the point of saving. Now "archiving" is figuring out how to present what's still around to future generations, and future generations aren't interested in going to museums.

What archiving means now is to learn how physical document-style record-keeping archives keep track of their stuff. It means cataloging, and creating metadata for the Internet.

Describing moving images with words is a challenge that has yet to be conquered. The better machines and software get at "identifying" what a film clip or series of shots is about, the more a human with some kind of cultural sense and taste needs to intervene and perform triage on the alphabet soup that's created. You can't describe the elegance of a match cut in Renoir with even two stills together on a webpage.

You can't capture the flicker in Marlene Dietrich's eyes. Or the swagger in Asia Argento's poise.

Yet everything is being streamed to us anyway, on the Internet in any form they can deliver it to us. We no longer can be concerned with the best possible copies. Now we are beholden to creating the fastest-deliverable ones. There are over 4000 35mm prints of Transformers: Revenge of the Fallen in existence. In 6 months when it hits DVD, over 3800 of them will have been purposely destroyed to prevent piracy (although it's already on the Internet in digital form). And by the time of the third Transformers film there may be no film prints at all - it will be delivered digitally to your local exhibition spaces.

Newsweek (or was it Film Comment?) was right: Film is dead. They merely announced it a couple of decades too early. Sure, the old classics (and not-so-classics) on film are still being saved, on negative if they still exist, forgotten in dark archives. The temperature is lowered and the lights are dimmed so no more damage is done, for that moment some time in the future when people care about film again and want to see actual light shone through actual chemicals on celluloid and reflected off a silver screen, rather than transmitted with the electronic glow of digital perfection.

The archives are quiet. Companies are releasing the same hits over and over again in newer formats rather than exploring deep into the canon. The industry is trying to shake as much money as possible out of people, but it's hard when everyone is getting everything in a reduced resolution and in small pieces, often only temporarily - and for free.

No one's figured out what to do when people expect so much more for so much less. The old business model of selling atoms people keep is being challenged and undermined forever.

We're in a profound period of transition, psychologically, culturally, financially, and philosophically.

Friday, July 3, 2009

Now


Has there ever been a film more review-proof than Transformers 2? The word is so uniformly and excoriatingly bad, not only from the egghead academic critics of such august publications as the New York Times and Aintitcool.com, but from our friends who saw it and tried - fruitlessly, it turns out - to warn us off.

That's a more immediate and direct kind of "word of mouth." From the very type of people who were predisposed to like it or were at least up for the dare and waited in lines (and there were long lines) on the 1st, 2nd, and 3rd days and paid for the privilege of being simultaneously bludgeoned by the effects and sound while being insultingly starved by a paucity of content or intentional nuance.

$200 million worth of people saw Transformers 2 (and that's only domestically), and if the word of mouth really has serious effect and the grosses fall 75% each week for the next 4 weeks, its sheer momentum will still ensure it finishes closer to $1 billion by August.

Is it worth all the money that was spent on and for it?

It's not high art (and I submit, it is art) but rather, an instance of performance. A triumph of marketing, branding, of sheer hype and push. No one wanted this film who hadn't dodged the 1st one, but its existence seems to assert itself - as a kind of fait accompli - as an event by its mere monumental presence. It's being sold not as a continuation - as a sequel or even a deeper exploration of plot points introduced and hinted at in the first. Shia and Megan aren't anywhere to be found in the materials.

It's about being in line, surrounded by a hundred other half-drunk fratboys, screaming and "ahh"ing, and covering your ears. The digital billboards on the day it opened didn't even insult us by listing the title, as some desperate hat-in-hand attempt to sell tickets.

They merely said "NOW." That muscular blue and orange image was enough.

We get it. This is happening. You in or out? Where's the line?

Friday, June 26, 2009

Dogma


About 10 years ago the Dogme movement emerged from Denmark, attempting to assert a new stripped-down aesthetic in filmmaking. Filmmakers such as Lars von Trier and Thomas Vinterberg embraced a new straightforward, honest (and presumably cheaper) mode of filmmaking which precluded real actors, constructed sets, post-synced sound or effects, all in an attempt to strip away the over-determined rules that inflected (and infected) normal picture making.

Only natural lighting environments were allowed to be filmed; no extra lights could be added. And only existing objects in real locations could be used. No props or guns or other genre elements to add visual "interest." It all had to be present and available for the filmmakers... or anyone. The idea was to capture the truth as it happened in front of the camera and record it un(pre)mediated as it occurs, with no subjective manipulation, no trickery, no egos. Truth at 24 frames a second.

They were unsuccessful for the most part. While this is an interesting approach to making films - and especially for ones that aren't documentaries - it makes for difficult, overly mannered yet loosely structured and finally rather restrictive results. Such avant-garde narratives - without artifice or production values - are an acquired taste. Without most of the tools of 100 years of filmmaking at their disposal, the dogme-ists paint themselves into an ascetic conundrum in which flights of cinematic fancy are by default precluded.

The last successful Dogme film was 2001's "Italian For Beginners" (and there's consensus that even that didn't follow the Dogme vow-of-chastity rules to the letter). Yet the spontaneous no-production-value aesthetic has been embraced by a new generation of filmmakers. It's a reflection of our familiarity with streaming videos on YouTube and our small personal devices, lo-fi but authentic. Such above-ground hits as "The Blair Witch Project" and "Quarantine" (by way of "REC") appropriate (if they don't rigorously follow) the Dogme ideals of hand-held cameras and off-the-cuff shooting in natural, real-world settings with a documentary narrative drive. J.J. Abrams' "Cloverfield" also uses the videocam reality-t.v. model to great effect, tapping into our voyeuristic tendencies.

(Although it's likely 80% of that film is fake, manufactured by CGI in post.)

Interestingly, and tellingly, all these are horror films.

The Dogme '95 movement was an articulated attempt to capture the spectacle of the real, in unmediated and unfiltered visual terms. It turns out that mode of filmmaking is discomforting.

We like a little artifice between us and reality. The spectacle the camera captures, when allowed to film uninhibited and unfiltered, is truthful, perhaps - but also (or therefore) profound, scary, intense, forbidden, and a bit horrifying.

An unintended progression from those Danes of 10 years ago.

Saturday, June 20, 2009

Independent Days


What I really want to do is direct.

What everyone wants to do is direct. Everyone's a closet moviemaker. Everyone's a comedian. Everyone has a screenplay in their bottom drawer, but no one's heard of anyone they know actually making it in Hollywood.

I went the independent production route myself. You get some friends together, scare up a couple thousand dollars, a film camera and shoot your clever Tarantino/Linklater pastiche, convinced that since it costs so little, there's no way it can't make money. The video store is full of them. Why not add to the noise?

We've all heard of the independent filmmaker success stories. Make a film in a weekend (or over 3 years) and it sells at Sundance for $3 million, and the next thing you know you're hired to direct the Luke Cage remake. They know what you can do with pocket change, so just imagine if you had some real money....

It's an elegant theory. But it's disingenuous. For every Bryan Singer, there's a thousand Jacob Freydont-Atties. For every David Gordon Green who (eventually) gets pulled into the majors, a dozen JP Allens remain unknown. Hundreds of films get submitted to each of over 200 festivals in the US every year (and that's just the features), and even for the ones that are selected, it's likely their first, best and last showings are at these festivals, never getting a distribution deal, or even ending up on DVD except as souvenir home burns for the cast and crew.

There are more movies out there than you can ever find out about. More people want to make movies than the industry can possibly gainfully employ. If you don't believe me, ask yourself how many times you've heard someone say words along the lines of "You know what would make a great movie?"

You've said it yourself. Everyone's got an opinion, and you know what they say about that. We think we can do it better, and perhaps we can. We'd do anything to be in pictures. But it's not just about having a better idea. It's about being in the right place - at the right time, with the right people surrounding you, and often with the right amount of money sitting on the table orphaned and waiting to be invested.

Financing is all - more projects come to fruition because they've been paid for than because they need to be told. Independent films always have a hard and schizophrenic life. They're born of passion and necessity and wear their sponsor-less authenticity as a badge of honor, the entire time putting on airs to convince us they're more than the backyard make-believe they are. They push the envelope and defiantly resist categorization and (often) coherence, because that would be selling out.

Yet they exude a needy greed to be loved, because ultimately they can't afford to piss off their audiences or their producers, and end up playing to the cheap seats, simultaneously wishing for and fearing a state-funded co-opting or, at least, the perceived notion of one - pursuing and risking a Kurt Cobain-ian reduction of street cred as the zeros multiply on the residual checks.

Even Ron Howard started as a seat-of-your-pants, go-for-broke exploitation director, which in a way is still reflected in his gilded work on "Angels and Demons," done not for art but to assert his position in the industry. A $200 million budget, completely competent and completely forgettable, reminding us that there are never enough resources of the right kind on any picture. It's the difference between the first "Terminator" and the second, the difference being a budget 10 times the size, so aesthetic challenges aren't so much solved as financed to death.

Ridley Scott makes one movie a year, and while we can discuss the vagaries of "Body of Lies" or "Matchstick Men," we'll never see the mad independence of "The Duellists" again. In today's environment, the list of directors able to generate a meaningful body of work is extremely short. Bad penny Terry Gilliam and rock star Martin Scorsese still can't put together the projects they really want to do. Scorsese has defaulted to music documentaries, which are probably the level of fight he's willing to take on nowadays.

And what about the filmmakers that didn't have the fortune of having worked with Robert DeNiro in their early careers? Who had a unique voice but couldn't sell a ticket? They've moved on to shooting cable shows. Or pulling cable.

Or writing for cable. Or writing work orders for cable installation.

Being independent comes with a price. By the time someone offers to pay the bill, you're already face-down in the pool.

Friday, June 19, 2009

Data, Metadata, and Statistics


Digital objects exist in a different way than mere objects in the physical world. They're created, and the information by which they are described is added to the object so it can be found.

This is "metadata" - kind of like the stuff that gets stuck to your shoe that you simply can't rub off.

Every digital object collects this as it moves, gets copied, is altered - even deleted. No fingerprints remain invisible. (Yes, even an object that isn't there still declares itself, if only by virtue of the fact that it is no longer present.) Lots of times metadata is intentionally added to an object. Titles, dates, to-do lists ("Delete after end of quarter," "Save for blog," "unused takes").

But just because this digital object has collected all this extra descriptive information doesn't mean it's the better for it. The object becomes larger as it travels, and it costs time and energy to preserve all this stuff on the object, not just the object itself.
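The accrual described above can be sketched in a few lines of Python. Everything here is invented for illustration - real systems use standards like EXIF, XMP, or PREMIS to structure this kind of information - but the shape is the same: the payload never changes, while the descriptive baggage grows with every copy.

```python
import copy
import datetime

class DigitalObject:
    """A hypothetical digital object that accumulates metadata as it travels."""

    def __init__(self, payload, title):
        self.payload = payload               # the object itself
        self.metadata = {"title": title}     # intentional, descriptive info
        self.provenance = []                 # the stuff stuck to its shoe

    def duplicate(self, actor):
        # Each copy carries everything the original collected, plus a new
        # provenance entry recording the copy event itself.
        clone = copy.deepcopy(self)
        clone.provenance.append({
            "event": "copied",
            "by": actor,
            "when": datetime.datetime.now().isoformat(),
        })
        return clone

photo = DigitalObject(b"...", title="Aunt Lora at Disney World")
copy1 = photo.duplicate("grandfather")
copy2 = copy1.duplicate("facebook-import")

# The payload is identical in all three, but the object has grown:
# the original has no provenance entries, the second copy has two.
print(len(photo.provenance), len(copy2.provenance))
```

Preserving `copy2` faithfully now means preserving not just the bytes of the payload but deciding which of those accumulated entries are worth the storage and migration costs - which is exactly the triage problem described below.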

And just because there's all this new information on it doesn't mean it's good info. Much of it may be wrong. Or incomplete. Or mean different things to different people, programs, or systems.

The signal to noise ratio begins to change. And just because it's all info about the object itself also doesn't mean that it's metadata, either. Maybe the info is part of the object's creation, but doesn't actually describe it. It might not be about the object, just riding along, attached accidentally or through someone's ulterior or altruistic motives.

Once an object collects information about itself, that doesn't mean it should all be preserved with the object. But figuring out what belongs, what might be needed in the future, and what's merely a parasitic piece of code costs resources to deduce.

Not all metadata is created equal. It has a lifecycle, and some becomes obsolete at a certain point in the various iterations of the object, as it moves from VHS to laserdisc to DVD to Blu-ray, for example. Just because you got it, just because it's right, doesn't mean anyone's gonna give a damn.

Metadata lives and dies and people get paid a lot to create, preserve, and migrate it. But it's invisible and of unknown value. So we spend more time worrying about it than what it is describing. We shouldn't lose sight of the underlying artistic creation that makes it necessary in the first place, in this world of digital access. A page of poetry or cut of music, a clip of film that people fell in love with, 100 years ago. And maybe 100 years from now.

In the future there will be no record players. You'll want to be able to find Miles Davis, won't you?