Production and Post Wars (or Why Red Should Buy Final Cut)

29 04 2010

Well, all right, I’m exaggerating there. I don’t really think that Red should buy FCP, and Production and Post aren’t exactly at war (though sometimes you’d be forgiven for thinking so), but I want to make a point here.

Every year, it seems, camera manufacturers create more “improved” codecs that answer their needs — increased quality with reduced file size. However, that goal is pretty much immaterial to post-production professionals. We don’t care if an image takes up a large file size. In fact, with faster processors and cheaper storage (last I checked, a medium-ish quality 2TB drive costs less than $300 on Amazon), we don’t much care what size the original file is. If it’s too big to use, we’ll just create a lower-res transcode in ProRes or DNxHD and edit with that. What’s far more important to editors is that the footage be easy to edit.
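
To put some rough numbers on that storage claim, here’s a back-of-the-envelope sketch in Python. The ~220 Mbit/s figure is the approximate published target for 1080-line ProRes 422 HQ at 30fps, and the drive size uses decimal terabytes, so treat the result as a ballpark, not gospel:

```python
# Rough estimate: how many hours of ProRes 422 HQ footage fit on a "2TB" drive?
DRIVE_BYTES = 2 * 10**12       # a 2TB drive, decimal terabytes
BITRATE_BPS = 220 * 10**6      # approx. ProRes 422 HQ at 1080p30

bytes_per_hour = BITRATE_BPS / 8 * 3600   # bits -> bytes, seconds -> hours
hours = DRIVE_BYTES / bytes_per_hour
print(f"Roughly {hours:.0f} hours of footage")  # roughly 20 hours
```

Twenty-odd hours of high-quality transcodes on a sub-$300 drive is exactly why file size is the camera department’s obsession, not ours.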

This means that Long GOP file formats, in which most frames are stored not as full frames but as a smaller list of changes from the preceding frame, are horrible to edit with. Whatever speed gains we might conceivably get from working with smaller files are more than undermined by the extra work our NLEs need to do in order to display them.

[Note of ignorance: I haven’t yet had a chance to play with the parts of the new version (5.0) of Avid Media Composer which allegedly make a lie out of that last sentence. Pushing their Avid Media Access technology forward, and allowing the Media Composer to work natively in QuickTime, Red and various Long GOP formats, Avid promises to make editing much easier with these previously hated formats. AMA has proved its worth in my experience with the Sony EX-1 and EX-3 cameras, so this could be a great boon. I’ll talk about that in a few paragraphs, so stay tuned.]

Let’s face it. Editors are never going to get camera manufacturers to stop looking for their version of “better” codecs. We’ve long since learned to live with it. But it does mean that, unless these manufacturers work ahead of time with the NLE manufacturers (the way Red did with Apple, for instance, before the initial release of the Red One) it’s going to take some time for our favorite NLEs to catch up with each new release of a camera codec.

It’s a war and the winner of that war is… well… no one. But the biggest loser is the filmmaker.

This is less of a visible problem on bigger-budget productions, where the camera and editorial departments are made up of different people, each of whom has a level of tech support that goes beyond typing “Long GOP won’t work” into a Google search bar. But as more and more of us are shooting with small crews and taking the footage back into the editing room, where we have to ingest, edit and output it ourselves, this becomes more than an annoyance; it becomes an impediment to our livelihoods (you know who I’m talking about, you WEVA folks out there).

So, what’s the best solution to this war? Is hope for reconciliation only slightly less feasible than the Democrats and Republicans agreeing on anything in Washington today?

Well, yes it is. But there are some signs of hope.

I’ve already mentioned Avid’s AMA. What that does is create a set of open architecture hooks for camera manufacturers, so that they can more easily create a way for editors to edit natively in the Media Composer. It’s an attempt to make it easier to do what Red did with Final Cut before the Red One’s release.

In both cases, it’s the NLE manufacturers telling the camera manufacturers — “Hey, if you’re going to create your own camera codecs, you’ll have to create your own editing codecs.” Well, not exactly, but Apple and Avid are placing the onus on the camera manufacturers to dig themselves out of their self-constructed hole. And that makes sense, so long as your NLE is one that has enough of an audience to make it worth the camera folks’ attention. I might be wrong, but I doubt that Sony, Panasonic, Red and the HD-DSLR manufacturers are going to spend buckets of money writing plug-ins for Liquid or Vegas.

So, what are our other alternatives?

In the old days, every single camera manufacturer had to create cameras that worked with the industry-standard 35mm film gauge. If they wanted to create film of a different width — such as, say, 38mm — they had to be able to manufacture the film stock, the lab processing equipment, the editing equipment and the projectors to accommodate it.

Needless to say, we never saw 38mm film. [We did see 16mm and 70mm film — which, at half and double the normal gauge, were easy for Kodak to manufacture film for. When it became clear how they opened up new markets, the camera, editing and distribution worlds came along for the ride (to a greater or lesser degree).]

But what if a company could manufacture camera and editing and distribution equipment (like Sony) and didn’t have its head up its posterior (like, uh… like… oh, never mind)? In a frighteningly anti-competitive way, it could then create a camera codec that worked fine in both capture and post-production.

We haven’t yet seen that company, though if Red bought Final Cut from Apple (or MC from Avid, let’s say) it would certainly be a start in that direction. Please note, I have absolutely no inside information on anything that Red, Final Cut or Avid might be up to. For all I know, Apple is planning on buying Red, though that would shock me in ways that I can’t describe in public.

In the meantime, Red Cine X and AMA are two ways that post and production are attempting to bridge the gap. Last time I looked, Avid wasn’t manufacturing cameras, which will make it more difficult for AMA to keep up with Red Cine X.

When Cisco bought Flip last year, I was hoping that we’d see some real synergy in the production and post areas. At the very least, I was hoping that we’d see some changes in the Flip that would enable them to interact with the web backbone much more easily. That hasn’t happened yet, and there’s no indication that it’s imminent.

But wouldn’t it be awesome if someone came up with a series of codecs that could take footage shot by a camera and make it easily editable and trivially distribution-ready? By this, I mean more than projector-ready (something that I am hoping Red Ray will pave a path for); I mean something that will make it easier to distribute files safely to theater owners, television networks, web distributors, mobile device partners, et al.

And, I’m hoping that these solutions are provided by multiple companies so we don’t have to be tied to one technology.

Whoever creates that chain will be the Dag Hammarskjöld of all things digital video, and their company will be its United Nations. Peace at last!

Assistant Editor Appreciation Day

26 08 2009

Thanks to Scott Simmons and the French web site FinalCutMtl, I’ve learned that tomorrow, August 27th, is I Love My Assistant Day.  Awwww.  Go out and hug your assistant.  For those of us who don’t have assistants (I’m presently copying my media from the transport drive that was sent to me from the East Coast yesterday to a backup drive), go out and hug yourself.

Collaboration, The Sequel — And A Contest

23 08 2009

Daisy Whitney, host of New Media Minute

Seems like just yesterday that I finished writing about collaboration (it wasn’t, it was actually two days ago) and I’ve just watched Daisy Whitney’s latest episode of New Media Minute, which is all about collaboration.  (Daisy is one of the most informed, entertaining correspondents in the media arena, hosting This Week In Media as well as writing for a slew of magazines and web sites.)  She talks about new technology that is enabling people to collaborate across great distances, including Wiredrive, web conference software Adobe ConnectNow and sites like video hiring hall Spidvid and online collaborative amateur site Pixorial.

Along the way, Daisy also mentions a project that I was involved in earlier this year — Mass Animation’s “Live Music”.  This was a Facebook application in which animators from across the globe were able to download a trial copy of Maya and use it to create individual shots in an animated short that will be released at the top of Sony’s fall film PLANET 51. There were weekly contests, polls and judged competitions. I was one of a panel of judges that looked at individual sections of the film, gave feedback to the worldwide animators, and awarded badges to the shots we judged the best. It was a fantastic experience, and it created a much better film than it would have been without that diverse input.

Daisy also announced a contest for web videomakers that I want you all to know about. To dovetail with the publication of a friend’s book (Alison Winscott’s “The Time of My Life”), she has asked videomakers to create and post a short 10-to-15-second video based on the idea of “The Time of My Life”. Send her the link and, after judging, the winning video will run on her popular show along with a featured interview. Sounds worth it to me. Also, a good chance to learn more about yourself.

The contest (read more details about it on Daisy’s blog for New Media Minute) has no announced final date but, as usual in life, earlier is better. So get those videos shot, edited and in.

Real Collaboration – Editors and Directors, Editors and Editors

21 08 2009

Over on my other blog I long-windedly answered a question that someone sent me on my Twitter feed a few weeks ago: “How do you deal with directors who ask you to do stupid things?”

The short version of my answer was that, if each of you is doing your job right, then there really aren’t any stupid requests, because each one is a window into what the director really wants, even if he or she isn’t capable of communicating it well.

But that led me to start thinking about two times when I’ve seen editorial collaboration help enormously in the editing room.

I was an assistant editor and assistant music editor on the film HAIR, way back in the Editorial Stone Age. We had two great editors on the film – Lynzee Klingman and Stan Warnow – as well as a director (Milos Forman) who really knew editing. But there was one sequence that none of the three could quite figure out how to edit. It was a song called “Black Boys/White Boys” in which a row of Army medical examiners decided whether a line of inductees were healthy enough to march off to Vietnam. Choreographer Twyla Tharp had designed this clever set of homoerotic dance moves for the two trios of examiners to be intercut with two trios of women who sang and made eyes at the boys around them in Central Park. The idea was that the juxtaposition of these very straight military men, the naked inductees in front of them, and the trios of seductive women in the park would make the entire medical exam seem absurd and somewhat surreal.

It was supposed to be clever and funny and it absolutely didn’t work.

So Milos and the producers hired Alan Heim with the specific goal of having him edit that sequence. Alan had been Bob Fosse’s editor for quite a while and had cut films like ALL THAT JAZZ (still one of the most amazing biographies in Seventies cinema – and way ahead of its time), LIZA WITH A Z and LENNY. He was hired one day and disappeared, with an assistant, into a room at the Trans Audio Building on 54th Street in New York (above the famed Studio 54) and came out a week or so later with a first pass that blew everyone away. It wasn’t perfect, and it underwent many changes between then and the final cut of the film. But it so clearly pointed Milos and his other editors in the correct direction that Alan was convinced to stay on and work on the film in its entirety.

It by no means belittles the editing contribution of Lynzee and Stan to say that the scene could not have been shaped as well without the outside viewpoint that broke the logjam of their preconceived ideas.

The second example came the second time I worked with director Michael Lehmann. We had previously worked on the film HEATHERS together, and it was a fantastic experience for me. When he asked me to move on to his next film, MEET THE APPLEGATES (a satirical farce starring Ed Begley Jr., Stockard Channing and Dabney Coleman, about large Brazilian bugs who get sick of humans destroying their habitat, turn into humans, and move to Ohio to blow up a nuclear power plant terrorist-style), I jumped at the chance.

The film came together relatively easily, considering its low-budget nature and high ambitions, but in places it still didn’t feel like the movie that we wanted to make.  There were areas that weren’t funny enough. Other scenes had great moments, but didn’t propel the story forward enough.

So we brought in a mutual friend, editor Barry Malkin, to look at the areas of the film that most concerned us (and any others that he wanted to work on).  We put Barry, who had worked on THE COTTON CLUB and had been an editor with Francis Coppola for years, in a room with a Moviola, an assistant and a ton of film. In a few days he did two things. First, he told us that he understood perfectly why we had edited the individual scenes the way we did. He would have done it the same way. But he had some ideas on rethinking scenes in ways that we hadn’t really considered. We let him go back into the room and, a few days later, he started showing us a few scenes that had been subtly or greatly revamped.

Like on HAIR, the changes weren’t perfect, and they went through many revisions before we locked the film a little while later. But they opened up thought processes and brain synapses that we hadn’t used before. It helped to bring us out of our mindset. (Barry got a credit as “Editorial Consultant.”  He should have been credited as “Logjam Breaker.”)

Every project needs a place where its creators can step back and re-evaluate what they’ve been doing. Most of the time, there’s neither the time nor the money to do that. What is most painful is when you could do it, but don’t because you’re locked into a conception of your project that can’t move.

The Greeks, I’m told, talk about it this way. Every idea (a “thesis”) needs to meet up with a second different idea (the “antithesis”). When they are allowed to work off of each other, they create a third, usually better, idea (the “synthesis”). The key to making this work in both HAIR and APPLEGATES was to allow the new editor to actually sit and work the material, as opposed to simply giving notes. Sometimes great ideas can come from a comment, but often those ideas just don’t work when they’re exposed to the light of day. You can’t find a character’s smile, or there is no close-up when you’d need one. But with enough time and freedom, a good editor will work towards that alternative goal.

The goal of good collaboration is to allow good new ideas to bubble to the surface without distracting the leader from the overall spine of the project. It’s not easy sifting through thousands of ideas over the course of the day-to-day work on a film. But that is what distinguishes a good director from a mad or mediocre one.

Even Orson Welles Makes Mistakes…

29 07 2009

… but you have to be over 40 to know it.

Shane Ross, over at his fantastic blog Little Frog In Hi-Def, has posted an old video in which Orson Welles talks about editing.  It’s an incredibly wise, and short, piece in which, standing over a 16mm flatbed, Welles talks about the musicality of editing and how being in an editing room is “home” for a filmmaker.

“A Moviola is as important as a camera… This is the last stop between the dream in the filmmaker’s head and the public.”

But there’s one big mistake which makes me realize just how divorced he was from the actual mechanics of editing. See if you can spot it.

This does raise the issue of the difference in involvement between the great editors of the past and those of today, but we’ll talk about that when I see what sort of response I get to this challenge.

Color Correction Made Easy — Well, Easier

23 07 2009

Color Correction Window in Media Composer

One of the mystical and wonderful aspects of finishing a film is color correction where you get the opportunity to give an entire visual “feel” to your image. When I did the low budget JACK IN THE BOX, we couldn’t really afford to light every nook and cranny of the basement location in the dark, moody feel that the director wanted. In post production, using Magic Bullet Looks, among other tools, the colorist (and that was not me — my wife insists that I must be color blind when she sees what I wear to work every day) was able to put the characters into an arena of increasing panic and jeopardy.

But whenever I go in to tackle color correcting some work, it’s clear that the task is not as easy as Apple or Avid would have you believe.  “Just click on the flesh tone” or “Just click on something that must be black” or “Find me the whitest part of the frame.”  It never looks right to me.

And then there’s the question of what I’m viewing the image on. The temptation among editors (and certainly among many of my students) is to color correct with whatever is right in front of them — often a laptop screen, or the perfectly good but not-meant-for-color-correction client monitors.

Mike Jones, over at Digital Basin, has gone a long way toward helping me understand the concepts behind color correction (or “colour grading,” as he calls it). He essentially breaks the process down into three parts:

1. Impression – our visual response
This kind of grade is one designed to imprint on the mind of the viewer an element beyond the picture; to leave an impression by creating a visual response from a set of tones overlaying the image.

2. Expression – our emotional response
The Expressionist grade reflects emotional states, emotional changes and emotional journeys.

3. Construction – our cultural response
A Constructivist grade is one that builds upon, exploits or plays with or against pre-existing knowledge the viewer may have.

This is actually a pretty good way of thinking about the process. How can we get it to look right, then how can we get it to feel right, and finally how can we get it to seem right within our world? He goes into much more detail about this, including giving valuable examples, and it would be well worth a trip to his site to check it out.

He also has a link to a colorist who has a great site of his own, Kevin Shaw. The site has a number of great resources for color-blind people like me. One article, From One Light to Final Grade, is a particularly good description of the entire process.

Oh, also, there is a section in my book, THE LEAN FORWARD MOMENT, in which I deal with how color and camera influence storytelling.

Brighter hopes for Digital Theaters

22 06 2009

The recent news that Sony and Regal Theaters reached an agreement to install 4K projectors at Regal Theaters, combined with Friday’s item that the German Federal Film Board (FFA) agreed to provide 40 million Euros (that’s over 55 million US type dollars) to help the digitization of German theaters, shows that the feature film world is finally beginning to get its digital film houses in order.

Of course, there is plenty of desperation in these measures, as well as a large dollop of politics (the FFA co-produces films, and Sony is one of the majors and mini-majors still standing). But as the panicked move into 3-D and IMAX shows, the distributors and exhibitors — who are often on opposite ends of the interest continuum when it comes to showing films — are both smelling the snapping dog of internet distribution behind them.

It’s not that 4K makes the films look much better than a typical HD projector. Of course, there are those who see the differences, but most filmgoers couldn’t tell the difference if the words “This Is Better” were flashed on screen during the 4K projection. But it’s that 4K fits into the present filmmaking workflow so much better when you start to look at the very gimmicks that could keep recalcitrant filmgoers in theater seats. The high-powered digital effects of Big Tentpole monstrosities like TRANSFORMERS are created in that high res.  Digital Intermediates are increasingly being done in 4K. 3-D begs for higher resolution in order to create lower cost distribution.

In short, 4K finally makes sense as a differentiator between the theater experience and your living room (even if you’ve got a nerdlike sound system and huge-screen television there). If you don’t have the story to bring them in, at least get the high-priced splash and, for now, that looks way better on a big screen with great sound and incredible effects of things blowing up. All things that the smaller-budgeted indie films and web-based projects can’t really deliver.

I’m not sure where this leaves a film like Woody Allen’s latest WHATEVER WORKS, which had a visual effects component that could barely fill up one screen’s worth in the end credits. But, after years of pooh-poohing 4K as a real possibility in theaters, I must say that I’m thinking that it could really happen. In this case, it’s not the audience that is clamoring for it. And it’s not solely the distributors, finally. It’s the entire chain — all the way to the exhibitors.

I’m not sure how I feel about this. Will it make the filmgoing experience more awesome? I doubt it. Will it make the filmmaking experience easier? I doubt it. Will it make the transition of films to all sorts of ancillary markets easier? Probably, by a hair’s breadth. I’m waiting to see if it does what the industry clearly wants it to — to bring more butts into the seats, and to make the entire process a little cheaper.

The iPhone and the Future of Filmmaking

16 06 2009

Okay, that title is more than pompous, but just follow me for a second.

Debra Kaufman writes in her blog “Mobilized TV” about an application for filmmakers that she found at last weekend’s Cinegear.  Called Helios, it is ideal for cinematographers — it shows “a graphical representation of the sun’s position on a compass dial (azimuth) for any time of day, showing the sun’s elevation and proportional length of shadow an object would cast.”

What I’m interested in seeing, now that the new iPhones and the new operating system are about to hit the street, is how developers start to create niche applications that they can really make some money out of. There are several advances that Apple is providing that can make all of the difference.

The first is that the hardware interface will be opened up — so people can start to sell gizmos that hook into the iPhone and interact with it. Think of engineering firms that can input directly into an app on the phone. Think of medical instruments being able to hook directly into this tiny phone/iPod touch and interact with an application inside that gives real time feedback in both directions.

And then think of how your iPhone can hook directly into your Red One or a script supervisor’s keyboard and then broadcast timecode data, along with subsets of any necessary metadata back to a post house or the editing room. It’s going to make the set, the editing room, the producers’ office, the lab/post house, and all of the other pieces of the film chain much more integrated. At very low cost.

So, for now, go read Debra’s review and start imagining.

When Is Too Much, Too Much?

3 03 2009

Sony's PMW-EX1

Or, just because you can shoot a lot, should you?

I hope this isn’t too much “inside baseball,” but there was a meeting of a lot of the production faculty here at USC last weekend where we got a chance to sample the new workflow using our Sony XDCAM EX-1 and EX-3 cameras. (Ironically, two days later Avid announced a great upgrade to their Media Composer product, MC 3.5, which makes it possible to edit the XDCAM-EX files natively, but that’s a story for another post.) It is a transition that we started making this past fall, and it is slowly taking over the film school. Our higher-end classes are using the F900 or the EX-3, but we are definitely making the move to HD and digital capture across the entire school.

The really interesting point came in a long discussion that we ended up having about one of our key undergraduate courses — called Production III — which moved to the EX-3 this past fall. Now, if you’ll pardon me, I’m going to take a little detour to tell you how the class is set up, since it’s germane to the central question of how we move into the file-based capture world.

The class, CTPR 480, is a course in which four teams of about ten undergrads each make a short film in an intense collaborative format. Each film has a director and two producers, as well as two each of cinematographers, editors, sound recordists/designers and production designers, plus one AD. They use other students to help fill out their crews. So this turns out to be the class in which these students learn how to work in very detailed ways in a particular specialty, as well as how to work collaboratively with a large group of people. (A trailer for one of these 480 films can be found on YouTube.) Up until last semester, the students shot on 16mm film, with a total allotment of 4400 feet of film — or about two hours’ worth of original shooting. This gives a shooting ratio of about 10:1, since the films have a maximum length of 12 minutes without credits.
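
For anyone who wants to check the shooting-ratio arithmetic, here’s a quick sketch. The 16mm figures (40 frames per foot, 24 fps) are standard; the rest just restates the numbers above:

```python
# 16mm film has 40 frames per foot; at 24 fps that works out to 36 feet per minute.
FEET_PER_MINUTE = 24 / 40 * 60    # = 36.0

allotment_feet = 4400                             # the class's total film allotment
shot_minutes = allotment_feet / FEET_PER_MINUTE   # ~122 minutes, about two hours

film_length_minutes = 12          # maximum cut length, without credits
ratio = shot_minutes / film_length_minutes
print(f"Shooting ratio: about {ratio:.0f}:1")     # about 10:1
```

That hard 10:1 ceiling, enforced by physics and the film budget, is exactly the constraint that disappears with file-based capture.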

The bad news about this is that students are always stressed about the amount of footage that they have, and they sometimes tend to shoot in tiny little bursts — a line at a time, precutting the film in camera. The good news is that it requires the students to really think ahead of time about what is important to their overall story — once they run out of film, they simply can’t get any more. The entire class and faculty can watch all of the dailies every class and really look at how the students are progressing week to week.

But what happens when there is no longer a physical/cost limitation on the amount of film that can be shot because they are capturing digitally with a file based format? In other words, if they can shoot 26 takes of a set-up, with no film cost penalty, what changes in the class? And, if I can be presumptuous, what changes in the filmmaking process?

Well, the first thing that the teachers in the class learned is that, yes, students will shoot 26 takes. If they need to do ten more takes to get the perfect dolly move, they will. But what happens to the actors’ performances over that length of time? What happens to the crew’s?  What happens to the rest of the shooting schedule? And, from my point of view, what happens to the post-production schedule, which hasn’t changed at all?

To move this out of film school, what happens when you remove one of the barriers to excessive shooting, but not the others?

As anyone who has ever been on a set with an indecisive director can tell you, shooting take after take after take doesn’t ensure better takes. In fact, it usually ensures the exact opposite — you may end up getting a dolly move without a bump, but a performance suffers. You may end up getting a great performance from one of the actors, but the other (who peaked after take four) goes downhill. And when you get into the editing room, does the indecisiveness really end? What about trying a version with a small smile? What about one with a quizzical frown?

Nope. In my opinion, though there is a lot to be gained from experimentation, it rarely helps to broaden the boundaries of what you want as a filmmaker to the point where your collaborators can’t figure them out. I describe my process as “crawling up inside the head of my director,” and it helps me to be creative in a way that can advance the overall project. It’s the way a good director can get my artistry without going all over the map.

But if the inside of the director’s head is a huge maze of constantly dead-ending corridors, I’m not going to know what to do, and it will be hard for me to create in a way that the filmmaker is going to consider helpful. I can cut a sequence 80 different ways, but only ten of them might be helpful to the overall story. What I’d really love my director to do is give me the outlines of the territory of the film, so I can deduce those ten ways and execute them in the most effective manner. If I’m trying to cram five months of work into two months, then I’m going to have to eliminate at least 70% of those dead ends, since each change expands the work exponentially (it affects the way I cut the scene before and the scene after it).

And that’s just in the editing.

So, the idea that unlimited footage equals better filmmaking is a complete sham (unless you have unlimited money and time, as well as an unlimited capacity for getting bad results). Just because you can shoot 26 takes, doesn’t mean you should.

The Sundance Film Festival and Me

17 01 2009

I’m up at the Sundance Film Festival, where I’ll be speaking on Monday and signing copies of my book on Tuesday.

Here are the details for the Monday afternoon panel:

Monday, January 19, 2009
From 2-3:30 p.m. – New Frontier on Main Street
Long time film editor, USC Professor and author Norman Hollyn will moderate a panel with 2009 Sundance filmmakers on a topic loosely based on his book “The Lean Forward Moment: Create Compelling Stories for Film, TV, and the Web.”  Hear directly from directors, producers and editors with films at this year’s Sundance Film Festival about how they find their “lean forward” moments and turn those into compelling stories that entertain millions.

Panelists include:
Jason Stewart, editor of 2009 Sundance Film “World’s Greatest Dad.”
Sterlin Harjo, director/writer of 2009 Sundance Film “Barking Water” and 2007 Sundance Film “Four Sheets to the Wind.”
Ondi Timoner, director/producer of 2009 Sundance Film “We Live in Public” and 2004 Sundance Grand Jury Prize Winner “Dig.”

I will also be signing copies of the book the next day at Dolly’s Bookstore, right in the heart of Park City at 510 Main Street, from 1:00 to 2:00pm.  That’s on Tuesday, January 20th.  That’s right, watch the inauguration and then come and have me sign a book.