How Do YOU Decide What Movies To See?

6 04 2008

Anne Thompson, in last week’s Daily Variety, has a column about the recent spate of firings of newspaper and magazine film critics. She makes some valuable points about how her students at USC can’t name any film critics besides Roger Ebert (thanks to his television show). Contrast that with the era when names like Pauline Kael and Andrew Sarris were known for their reviews and their theories.

While I don’t disagree with her facts (and, as a former film reviewer, I have a certain sympathy for those who have to sit through five or six horrible films a week and then write about them), I find her conclusions both obvious and unregrettable.

Younger moviegoers are fickle; they’re just as likely to play Guitar Hero or download episodes of “The Office” from iTunes. And the studios, for the most part, continue to bank on short-term, wide-release youth movies vs. riskier, execution-dependent movies for adults.

Thus, as boomers age and their subscriptions expire, the increasingly desperate economics of newspaper publishing are forcing aging movie critics out the door. And younger ones too.

We hear the same lament from studio heads and a plethora of old-media types. The democratization of the media applies to critics as well.

These students — and today’s youth auds in general — more often get their movie info straight from the studio marketing departments, who couldn’t be happier. These kids go to YouTube, Yahoo Movies and Apple to find trailers. As they surf the Web, bits of movie flotsam and visuals planted by the studios on MSN Movies or IGN or JoBlo eventually cross their eyeballs. But they also listen to their friends more than any authority figures, and distrust obvious studio hype.

I don’t know about you, but I find that holding up Sarris and Kael as examples of all film critics is like pointing to Hank Aaron and Mickey Mantle as examples of all baseball players — both major and minor league. In fact, I’ve found only one daily film critic who could tell me anything illuminating about a film — and Art Murphy is no longer with us. I also find Elvis Mitchell’s interviews/critiques of films on his KCRW show “The Treatment” to be amazingly insightful and educational. Most film critics are really no more than reviewers: rehashers of basic plot and opiners on whether they liked the performances, cinematography or direction.

I’m not saying that I don’t like reading newspaper and magazine critiques of films. In rare cases, I can also use them for viewing decisions. But, in general, I have never used reviews (printed or otherwise) as a guide to help me decide whether I should see a film or not. I didn’t when I was 18 and I don’t now that I’m 108.

So, how do we decide what we want to see?

If you’re like me, it’s a combination of a number of factors — subject matter, my mood at the time, the proximity of the theater, the creative talent behind the film (I’ll go see most any movie that Scott Rudin, Sam Mendes or Robert De Niro is involved in), and how well the film’s schedule and mine overlap. And, perhaps most importantly, what my friends and colleagues are saying about it.

I will sometimes see a movie before any of my friends, and then the other factors become prominent. But the so-called water cooler effect (in which a group of office buddies gathered around the water cooler creates buzz about a particular subject) looms largest in my mind. For years, publicity departments at the studios have spent millions of dollars trying to create that water cooler buzz, to greater or lesser effect. I remember that buzz boosting BORAT, but look what it did to THE POSTMAN.

The obvious point here is that the Internet in general, and social networking in particular, have become this decade’s water cooler. Reviews of films that I used to get from my neighbor have now moved onto Facebook and Rotten Tomatoes. That’s no different than it used to be; it’s just larger and more ubiquitous.

Thompson makes two very cogent, and somewhat depressing, points later in the article.

Over the years, critics helped audiences appreciate the likes of Orson Welles’ “Citizen Kane,” Alfred Hitchcock’s “Psycho,” Stanley Kubrick’s “2001: A Space Odyssey,” Arthur Penn’s “Bonnie and Clyde,” Bernardo Bertolucci’s “Last Tango in Paris,” Brian De Palma’s “Dressed to Kill,” Robert Altman’s “The Player,” the Coens’ “No Country for Old Men” and Paul Thomas Anderson’s “There Will Be Blood.” Where would we have been without them? It will soon be up to Pajiba or Cinematical Indie to influence readers to seek out small releases once heralded by critics.

and

There’s hope for critics at well-funded magazines: John Powers soldiers on at Vogue, Anthony Lane and David Denby compete for space at the New Yorker, Gleiberman and Lisa Schwarzbaum are well-read at EW, and David Edelstein writes and blogs at New York Magazine, which has invested heavily in an improved — and well-trafficked — website.

So, the problem of distribution for independent films, off-the-beaten-track films, small niche films, continues to rear its ugly head. Now that we’ve got YouTube, how do those films get noticed? And, now that we’ve got the “thumbs-up, thumbs-down” philosophy, how do those films get reliably reviewed?

Of course, it’s worth noting that Thompson is talking about mainstream films. Virtually no large-circulation newspaper reviewed Stan Brakhage films, as far as I’m aware.

But, in my mind, what Thompson is talking about fits squarely in the middle of the same argument that we’ve all been having — how are the Internet and socialized media changing the world of old media, and what can old media do to stay relevant in this new world?



Death Comes In Twos

18 03 2008

Last year, when both Ingmar Bergman and Michelangelo Antonioni died on the same day, it felt like more than a coincidence. It was as if some uber film critic was making a cosmic, ironic comment on the state of movies today.

What, then, are we to make of the deaths of both Arthur C. Clarke and Anthony Minghella today? There is no cosmic joke here, just a sad realization that the man who gave us the book of 2001: A Space Odyssey and the man who gave us TRULY, MADLY, DEEPLY and THE ENGLISH PATIENT will create art no more.

Ben Kuchera, in a column in Ars Technica today, quotes the three laws that Clarke was famous for.

  1. When a distinguished but elderly scientist states that something is possible, he is almost certainly right. When he states that something is impossible, he is very probably wrong.
  2. The only way of discovering the limits of the possible is to venture a little way past them into the impossible.
  3. Any sufficiently advanced technology is indistinguishable from magic.

The last of the three is famous in and of itself. I often wonder, imbued with the good ol’ sensawonda, just how someone merely 100 years old can hope to internalize all of the changes in his or her lifetime. I know that when I emerge from the editing of a film and look around, it seems like the editing technology has drastically changed. A mere five years ago, a mention of the acronym DI (digital intermediate) would have gotten you stares of incomprehension (unless they thought you were talking about drunk driving). And that’s just in my little neck of the woods, and in just five years’ time.

Clarke (who wrote “Against the Fall of Night,” “Childhood’s End,” “Rendezvous With Rama,” and “Islands in the Sky,” in addition to the novel he wrote with Stanley Kubrick) had been writing since 1937 and, in that time, created some remarkably detailed and plausible future worlds. Remember, when he started writing, the concept of launching anything into space was incomprehensible. The Internet? Not even a gleam.

Yet Clarke, and a few other science fiction writers of the time, managed to conceive of all of this, at a time when the magazines that published science fiction were more concerned with Bug-Eyed Monsters and the women in their clutches.

Now, that is a visionary.

Still, I’m particularly entranced by that second law, that one needs to go beyond what we consider possible in order to discover reality’s true limitations.

Speaking narrowly, there are two types of directors in the reshaping process of editing. There are those who will make big, broad changes early on and see what breaks. They will remove entire scenes, rearrange whole sections of the film, drop favorite moments and excise great lines. Then they’ll see what absolutely needs to go back to the way it used to be (or, to be more precise, go back a little way toward what used to be).

There are also directors who will make smaller, incremental changes, slowly chipping away at problems until they arrive at a comfortable resolution.

Neither approach is more right than the other. Both of them work (though the second method takes longer).

My own preference is to make broad changes — to push past the possible into the impossible — and see what works. It is axiomatic that once you take a scene out of a film, no one misses it. When someone does, you know you’ve got to keep it in the film in some form. So, plenty of things that I’ve resisted changing for what I thought were very good reasons turn out to be quite expendable in the long run. You never know what is going to work and what won’t (within reason). It’s a cliché, but, really, you never know.

So, Clarke’s second law has ramifications everywhere.

Anthony Minghella didn’t share Clarke’s speculative side, but he managed to blaze a few paths in storytelling and character development. The people in TRULY, MADLY, DEEPLY (including the awesome Alan Rickman, years before the caricature he plays in the Harry Potter series) felt blindingly real. In that story of a woman who cannot let go of her husband after his death at an early age, the emotions that Juliet Stevenson portrayed were touching. Not because they were telegraphed, but because they weren’t.

THE ENGLISH PATIENT was a different canvas altogether. Those of you who have taken my Intermediate Editing course know that I play the Caravaggio interrogation scene to demonstrate the use of silence and sound contrast. Walter Murch is given credit for the concept but, as we all know, nothing gets put in a film without the director’s permission, and I’m sure that Minghella was enthusiastically aboard for the beautiful use of sound and music to create the horrifying mood of the scene.

It’s that kind of collaboration that we all seek in this business. I know that Walter Murch admires Minghella almost as much as Thelma Schoonmaker admires Martin Scorsese. That comes from a respect for talent, of course, but it also comes from a realization that their directors allow them to do good work. These directors have the ability to step back and let their collaborators come up with ideas.

Not every director can open up that easily. The ones who do are worth their weight in gold.

I will certainly miss the art that Anthony Minghella and Arthur C. Clarke created, even though it will live on — past my own death, I’m sure.



Editing Your Own Films

14 03 2008

Occasionally I like to veer off the path of this blog and head into media reviews. Just because I can. It’s my blog and I’ll cry if I want to.

One of my pet peeves, as an editor, is the director who decides to cut his or her own film. I rarely see that work. Most of my students at USC do it because “only I can really understand what I want for my film.” There’s so much wrong with that statement. On almost every level.

First, that word understand. I’ve worked with directors who can’t understand their own films on levels that differ from their original conceptions. But the key to making a film accessible to many people, as opposed to a masturbatory, self-involved work, is to realize that the best films appeal to people on multiple levels — levels beyond their author’s original conceptions. In order to do that, the filmmaker needs to be challenged. He or she needs to be helped to see other points of view. In classical terms, it’s the thesis/antithesis/synthesis flow. An original thesis, when challenged by an antithesis, creates an idea that is better than either one individually — a synthesis of ideas that can bring the film to a higher level.

Peter John Ross, over at sonnyboo.com, wrote a piece in American Movieworks which tackled this issue and started with this introduction:

If you are one of those directors who can look at the raw footage, or even edit a scene together, look at it in the context of the movie & make a decision to cut out one of the best moments the actor gave because you realize that the scene is erroneous THEN SKIP THIS ARTICLE. Or if you have what you thought was one of the funniest jokes on paper, and even if it’s not 100% great delivery, but you choose to use it anyway because it “might” be good, then please READ ON.

I could argue that John Sayles’ best movies are the ones he did not edit himself. I think that James Cameron is a better director of editing than he is an editor (when I worked with Milos Forman, I was always impressed with his editing acumen, but equally impressed that he worked with other editors to get the best picture). I certainly feel that Robert Rodriguez has long needed an editor (and a cinematographer, but I’ll let people better versed in that art take up this argument).

And, even though I really liked the film NO COUNTRY FOR OLD MEN, I continue to feel that the Coen Brothers would have done better work if they had had someone to work with.

Now, I’ve never felt the strong pull that most people feel towards the Coen brothers’ films. I have enjoyed a few of them — BARTON FINK and THE BIG LEBOWSKI — but I have normally found them too clever by half and, even in FARGO, more distanced from their characters than involving. I’ve enjoyed the laser penetration of Peter Stormare in FARGO, but I can’t say that I found any of the characters in their films worth spending much time with, aside from John Turturro’s tortured writer in FINK and the fun of The Dude in LEBOWSKI.

Now, NO COUNTRY comes along and I’m almost ready to jump over to their side, thanks to some amazing performances completely in tune with the story and filmmaking of the piece. But there is enough holding the film back that I doubt that I’ll ever jump over to the side of director/editors.

The shaping of the lead characters in NO COUNTRY is particularly fine. Javier Bardem, well-deserving of his Academy Award, plays a character who is consistently driven but seems well understood by the filmmakers. Josh Brolin, while much more enigmatic and thinly drawn, manages to build a steady, interesting performance, even against Bardem’s juggernaut of a role.

I’m less entranced by Woody Harrelson’s and Tommy Lee Jones’ performances, however. I don’t believe that I need to have everything explained to me in order to like a film. Far from it. But I like to have characters who, in the words of a director I once worked with, “earn their moments.” To put it another way, I want a character’s behavior in a film to grow out of what we know about him or her, not just because it says so in a script.

But that is one of the hardest things for writer/directors to do. They live inside their characters’ heads for so long, and have had so much discussion and interaction with the actors playing those characters, that it is extremely easy to see connections where they don’t really exist. It is way too easy to ascribe more to a look or a body movement than a normal audience would.

Even editors are prone to falling into this trap, though it’s one that we train ourselves to fight. In order to freshen our view of our films, we use preview screenings. They help to ground us. When I worked on the movie HAIR, we had a screening in which someone in the discussion group afterwards (we didn’t call them “focus groups” back then, and we didn’t have NRG Research to run them for us) mentioned that he “really liked the part where Claude’s sister watched Treat Williams dance on the table.”

The problem was that Claude didn’t have a sister in the film. This audience member was confused. And while we’d never recut a movie based on one comment, if enough people can’t follow plot or character, then it’s time to look at what we know about our film.

The real problem for writer/director/editors is that there is precious little opportunity to have someone say “Wha??” There is less day-to-day input from the world outside the director’s mind.

And, even with some preview screenings and good producers (Scott Rudin may be the most interesting producer in the world today, along with Christine Vachon, in terms of the variety of projects he brings to the screen), the world of filmmaking just gets too insulated. Where was the person who asked the Coen Brothers to step back and see if Harrelson’s character went for caricature and plot, instead of real contrast to Bardem’s? Where was the person who discussed the shaping of the Brolin death scene, and how it impacted the rest of the film’s energy and emotion?

[As an aside: even though I didn’t like the choice, I’m not going to fault the film for handing off the story from Brolin to Jones two-thirds of the way through. But I will note that the way in which it was done replaced one character’s more interesting search with another, less developed one. It was an imbalance that the film never recovered from.]

In the best of all worlds, who would have been able to ask those and other questions about the choices being made? Who would have advocated for the audience’s side?

It would have had to be an editor. And that is what a good, honest, direct editor can bring to a project that a director cannot. Not possible, not even close. Even with really, really great directors.



Oscar Cynicism

25 02 2008

Cintra Wilson, over at salon.com, posted a snippy (and often funny) review of the Oscars which takes the odd stance that, because people are still hating each other from the writers’ strike, we all went out and voted for foreigners to win the acting awards. Aside from the odd notions that:

  1. there are no foreigners in the Academy,
  2. American-born members can’t recognize value wherever they find it,
  3. we are some huge monolithic block that tends to vote in lock-step, and
  4. the Academy is an American-only institution

this completely ignores the fact that most of the other categories went to American-born writers, directors, editors, etc. (well, not all, of course, but Dante Ferretti can take home this and any future Oscars that we give out — he’s that good).

However, she did get in a funny dig about a fictional meeting between “Hollywood power brokers in $6,000 Brioni suits” as they… oh, hell, I’ll let her tell it.

It must have been grim at that academy meeting, just a few weeks ago. No writers, just a bunch of liminal Hollywood power brokers in $6,000 Brioni suits sitting glumly around a large obsidian table in one of the Carrara-marble, earthquake-proof bunker-vaults deep in the ground under CAA, too depressed even to eat their grilled seafood salads.

“Editors,” someone finally said, the idea light bulb suddenly reflecting off his hairless scalp.

“Huh?”

“Fuck the writers. They’ll all eventually eat each other like the Donner party. We have editors. This Oscars? We break new territory.”

Eyes peer up hopefully through $3,000 Japanese glasses frames made of hammered titanium and hand-carved wood.

“This year? All new: all old. We just montage the living shit out of it. Wall-to-wall montages of Oscar footage recycled from the last 80 years.”

“Great.”

“Thank God.”

“Let’s go home.”

Actually, I’m sure the lack of writing time accounted for the preponderance of mind-numbing montages that were presented last night. (Though I should point out that even the more written stand-up routines often felt… well… unwritten. Or, at least, not written very well. But at least they beat most of Jon Stewart’s ad libs.)

To, once again, quote Wilson:

For nearly every major award, there was a montage of all 79 other winners from the past. In short: This year, Oscar honored the heart-touching magic of the film industry’s celebration of life by sucking every possible ounce of spontaneous life, marrow and energy out of the event by waterboarding it to the point of gag-reflex failure with canned montages.

Wilson then veers off into a strained argument (a self-parody of the liberal American) about how we are all self-hating Oscar voters.

Not that anybody asked me, but I almost yearned for the days of the atrocious, bloated stagings of the Best Song nominees. Aside from the earnestness of the song from ONCE, the other four songs suffered the twin disadvantages of being both too glitzy and too boring.

My biggest disappointment, however, is that ATONEMENT wasn’t completely shut out.



Hyper Kinetic Editing — Part Two

23 02 2008

Elizabeth Shoemaker comments on my post on Hyper Kinetic Editing by asking:

How do you feel about the use of multiscreen images to propel stories in television (i.e. CSI MIAMI). And, do you think that audiences will just adapt to the hyperkinetic editing? I had an experience years ago watching the classic “M.” Many in the group thought it was too slow. And I had to wonder if it’s because audiences are so much better at making leaps in story that it made them impatient for the movie to “move on.”

Personally, I think that picture-in-picture/split screen/multiple screen editing can be quite effective if it is used to tell the story properly. Split screen was used very early on to show two people in a phone conversation (much like 24 does today, though slightly less kinetically). It seemed to take off in the Sixties, after DuPont, IBM and a number of other companies used it in the films they showed at the New York World’s Fair (1964-1965) — films directed by Charles and Ray Eames, and Francis Thompson (who I worked for in the late seventies, by the way).

Check out the original THOMAS CROWN AFFAIR, for instance, to see a use of split screen that isn’t about telephone conversations.

So, it’s not the fact that there is split screen but the fact that it’s used more energetically than before that creates the difference.

On Elizabeth’s second point, there is no doubt in my mind that this rapid style of editing is both influenced by and has an influence on the culture that we live in. It has long been pointed out that editing changed with the advent of MTV. It has also been noted that the number of edits per 2000-foot film reel (about 22 minutes of film) has gone up since the introduction of digital non-linear editing. It is also obvious to me that experimental filmmakers like Ed Emshwiller, Kenneth Anger, Michael Snow, and Stan Brakhage created a filmmaking style that made it possible for kinetic editing to move to the mainstream. But I think that it is far from clear which is the chicken and which is the egg.
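(A quick aside on that 22-minute figure, for anyone wondering where it comes from: 35mm film carries 16 frames per foot and runs at 24 frames per second, so a full 2000-foot reel works out to about 22 minutes:)

$$\frac{2000\ \text{ft} \times 16\ \text{frames/ft}}{24\ \text{frames/s}} \approx 1333\ \text{seconds} \approx 22.2\ \text{minutes}$$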

I had an “M”-type experience like yours. A few years ago I decided to rewatch THE FRENCH CONNECTION, with the intent of using its famous car chase underneath the elevated train tracks in an editing class of mine. However, when I took a look at Friedkin’s direction and Jerry Greenberg’s editing, while the chase still blew me away as amazing, it no longer seemed to be the frenetic, nausea-inducing editing that caused many viewers to complain that they couldn’t watch it. It had become too familiar an experience.

Culture changes, and experimental cinema generally is in the forefront of that change. Commercial film, on the other hand, always lags behind. So, now that frenetic editing seems to be in every film short of a Shakespeare adaptation, the audience is used to it. But, in my opinion, it’s not because film is changing our sensibilities. It’s because film is following our evolving sensibilities.

If you take a look at many movies from the 30s, you’ll notice that there’s a lot of shoe leather — shots of people walking or driving from one place to another. A character will say “I’m going home now.” And then he will turn and walk away, open the door, go out into the hallway, get into the elevator. We’ll watch the floor indicator descending, and then see him get out of the elevator and walk through the lobby to his car (always conveniently parked right in front of the building — were there always convenient spots in that era?). He’ll get in, start the car and take off. After a shot or two (with a wipe between them) of the car driving, it will pull up in front of the house. The man will shut off the engine, get out of the car and walk to the front door. He’ll open it, step inside, and we’ll cut to him walking into the living room. “I’m home,” he’ll announce.

Here, in 2008, we’ll hear him say “I’m going home” and there’ll be a cut to him stepping into the living room, where he’ll announce “I’m home.”

Audiences change. Film eventually changes with them.



Best Gazillion Movies of All Time

18 02 2008

I mentioned this back a hundred years ago, in the first incarnation of this blog, but I thought it deserved another mention.

USC apparently sends out a list of movies that it would like incoming students to have seen, and Mike Gerber published the list. It’s actually a pretty impressive list, and I wondered how many of them all of you have seen. As for me, I’ve seen many of them (the films I haven’t seen are in italics — go ahead, razz me now):

MOVIES:
A Hard Day’s Night
African Queen
Alice in the Cities
Alien
All About Eve
Amadeus
American Friend, The
American Graffiti
Annie Hall
Apartment, The
Apocalypse Now
Apu Trilogy, The
Band of Outsiders
Band Wagon, The
Barton Fink
Battle of Algiers
Being John Malkovich
Bicycle Thief, The
Big Lebowski, The
Black Orpheus
Blade Runner
Blow-Up
Blue
Blue Velvet
Bob le Flambeur
Bonnie and Clyde
Boyz ‘n the Hood
Breathless
Butch Cassidy and the Sundance Kid




Predicting The Oscars

18 02 2008

Film School Rejects takes on the unenviable chore of predicting the winner of the Best Editing Oscar.

Frankly, this is a fool’s errand (though I’m perfectly happy to have fools other than myself do it). I’ve been a member of the Academy for years now, and I can never figure out why one film gets the applause and others do not. I’ve sat in the midst of the Academy weekend screenings and heard the audience hiss and boo, and then watched as the film went on to get nominated. (It happened last year with DREAMGIRLS.) I know that the films I nominate or vote for rarely get the award.

That having been said, the site notes that:

An award since 1934, the winner has often been films that have raked in plenty of other awards. It’s not always shared with Best Picture, but it usually comes out of that category. Seen as a technical feat as much as an artistic feat, editing is important to pacing, story and character. People may not remember all Best Editing winners (like Barbara McLean for Wilson in 1944), but more often than not, it’s known for honoring a major film.

Then it goes on to talk about each film and describe why it might win, and why it might not. A sample of why BOURNE ULTIMATUM might win:

Rouse isn’t new to the Oscars, although he hasn’t won. He was nominated for United 93, so he carries a degree of reputation. Also, this is the only film in the fray that fits the big-budget action style that this category often honors.

Honestly, I can’t imagine that anyone in the Academy actually pays attention to what “degree of reputation” a nominated editor might have. Most of them can barely figure out what we do, much less what we’ve done before. In my experience, they often choose either the flashiest film (because they believe that editing is all about splicing), or the film that they liked for Best Picture (because “Hey, I liked that film. So, I guess it was well edited” — actually, not a bad guideline now that I think of it). No one thinks of an editor’s oeuvre, unless that editor is Dede Allen or Verna Fields.

An example of why they think THERE WILL BE BLOOD might not win:

Quite simply, the performances of Daniel Day Lewis and Paul Dano, along with the top-end awards that P.T. Anderson is vying for, might leave Dylan Tichenor in the dust. Additionally, it’s hard to award an editing honor to a film that runs as long as this one did.

Last night I sat at the Eddie Awards, the yearly awards given by the American Cinema Editors organization. There was much talk of the subtlety of the editing in BLOOD and the success of the editing in NO COUNTRY FOR OLD MEN (there was much laughter at the picture of Roderick Jaynes, the nom de montage of the Coen brothers, and sighs of relief when the film didn’t win). Ultimately, BOURNE won for Drama, proving that even editors are influenced by quick cutting. [An aside here: I also thought that the editing on that film was masterful — the scene in Waterloo Station is so intricately shaped that I smiled both times I saw it.]

Film School Rejects’ pick — BOURNE. No rejects they.



Shaping Scenes — even if by accident

10 02 2008

Cristian Mungiu directed the Cannes sensation 4 MONTHS, 3 WEEKS AND 2 DAYS. This is a film that had me scratching my head during most of it. The direction is so formalist (virtually every scene is done in a single-shot master) that, for me, it undercut the emotion of the characters. Many critics disagreed with me, though, oddly, the Foreign Film Branch of the Academy pointedly omitted the film from its list of nominees this year.

Despite the rigidity of the direction, however, a great example of editing did come through, and Sean Axmaker, in an interview with Mungiu on his blog, highlights it in a very interesting way.

There’s one scene in particular that sticks out stylistically, with the two girls talking to Bebe in the hotel room, which is the only scene where you actually cut in the middle of a scene. You cut from the two-shot of Otilia and Bebe to a close-up of Gabita, where she realizes the gravity of the situation and what’s really at stake for Otilia, and she tries, late as it is, to take the responsibility upon herself.

Honestly, you are the first person to identify something which is a mistake in the film. That was not supposed to be like that, I can’t claim that I have an explanation for this. It only happened because I changed the dialogue that Bebe had to say and I needed to have it off-camera, that’s all. I don’t have an explanation for this. It doesn’t make sense, it shouldn’t happen like this.

So, in order to solve a storytelling problem, he chose to break his formalistic structure. That happens all the time. I don’t think I’ve ever worked on a film where we could afford to be dogmatic and rigid in our structure (is that where they got the term Dogme for that filmmaking manifesto?) (and that’s a joke, by the way).

However, the next question and answer are particularly revealing.

I feel that, because it’s the only time you cut in the middle of a scene, and it abruptly jumps into a big close-up, it brings the scene to her in a very powerful way.

This is why I hope that this is why I decided that I will change the dialogue and go for this, but this is not what triggered the decision. What I wanted to do was to make sure that I never make a formal decision belonging to me as an author and not divide from what the characters do in the shot. If you watch the film from this perspective, you will see that there is no pan in the film unless there is a line by some other character or there is a movement in the shot triggering the camera into a specific direction. We were very much following what was happening in the scene, except in this scene.

In other words, despite his claim that he would never make a formalistic decision separate from what the characters would do, if it hadn’t been for the fact that he had to cut to her in order to change the dialogue, he would have blown off the possibility of emphasizing her emotion in that moment.

I understand that there are many ways of emphasizing character and plot moments besides editing. In fact, my upcoming book, THE LEAN FORWARD MOMENT, is all about that. So I don’t think that he needed to make a cut every time. But this is a perfect example of form leading function, and it seems wrong to my mind. It also drives home, perhaps, why I didn’t respond to the film — the decisions seem to be based on form rather than on the individual storytelling needs of a moment.



Man With A Movie Camera – Part Deux

5 02 2008

Dziga Vertov directed a pretty fascinating film called MAN WITH A MOVIE CAMERA back in 1929. In it, we see a cameraman shooting a day-in-the-life documentary about Russian life. We see the film being edited, we see the film being screened, but we don’t see much of the film itself. It’s a very clever approach to documentary work, and it is really about filmmaking as much as it is about Russia.

Well, there’s a project afoot over at Man With A Movie Camera to recreate the film, shot by shot. The website lists every shot in Vertov’s film, and visitors/filmmakers are encouraged to recreate the shots in their own way. It’s fascinating and very creative (the shots aren’t copied frame by frame, of course — that would be Van Sant’s PSYCHO, and we don’t want to talk about that).

Check it out.




Grand Central Freeze-In

1 02 2008

ImprovEverywhere uploaded a video to Vimeo which documents a mass event in which 207 people stopped at exactly the same time in Grand Central Terminal in NYC. They didn’t move, despite being poked by curious onlookers or honked at by workers in vehicles trying to get by. The way the video is edited creates a real shape for the event, including applause when they broke their poses after five minutes.

By the way, ImprovEverywhere has a whole bunch of these events captured on Vimeo, including a pretty cool synchronized “swimming” event in the Washington Square Park fountain.