Is Film Dead? Then Why Do People Keep Wishing For It To Return?

11 04 2012

I am a member of the Academy of Motion Picture Arts and Sciences, which is that Academy.  The one that gives out the Oscars every year. Though, actually, that’s only one teeny tiny part of what the Academy does.

One other thing that it does is to recognize great student work from around the world — by giving out Student Oscars. I am one of a whole slew of members who watch shorts (defined as 40 minutes or under — which often doesn’t seem so short) from non-U.S. film schools so we can vote on the ones that we think represent filmmakers we would love to see nominated for feature films in the future. It’s a great committee.

But something odd happened the other night, and it dovetailed nicely with an annual survey that Harry Miller conducts for A.C.E.

Here’s the odd thing that happened.  One of the committee members got up and noted that fewer and fewer of the films submitted to us are captured on film. This member wondered if there wasn’t some way that we could acknowledge and reward films that were actually shot on film. He wasn’t suggesting that we vote with that in mind, he hastened to add. He just felt that the Academy awarded films. And he wanted to acknowledge those that were shot on film.

With that, my jaw nearly dropped to the floor and one of my row-mates asked if I wanted to stand up and kick some butt.  Well, I did want to do that, though it was not the forum for that. So I kept my seat, and put my jaw back in its proper place.

You see, it seems to me that what we really do in the Academy is honor good stories, well told (THE ARTIST notwithstanding). It doesn’t matter if they’re captured on a Flip Cam (well, not anymore, I guess) or 70mm. Entrancing, captivating stories know no format.

This was borne out by a survey that Harry Miller helps to conduct every year among members of A.C.E. who are editing movies and television. Since 2004 he has asked a number of questions. One of them is what format (“camera original” in his survey) the editors’ projects were captured on. Back in 2004, the breakdown went something like this:

16mm film 7.5%
35mm film 72.6%
70mm 0%
DV-HD 0%
HD (24p) 10%
Digital (Drive/Tape/etc.) 0%

Now, let’s jump ahead a mere seven years to last year – 2011.

16mm film 2.48%
35mm film 15.53%
70mm 0.62%
DV-HD 15.53%
Digital (Drive/Tape/etc.) (includes 24p) 62.11%
Other 4.35%

If my math is correct (and I was pretty damned good at simple math back in high school) that is a six-fold increase in digital acquisition, while 35mm film fell to roughly one-fifth of its 2004 percentage.
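
For anyone who wants to double-check that simple math, here it is as a few lines of Python, using the survey numbers quoted above (treating 2004’s HD 24p share as that year’s only digital acquisition, since the Digital category was 0% then):

```python
# A.C.E. survey numbers quoted above
digital_2004 = 10.0    # HD (24p) was the only digital capture reported in 2004
digital_2011 = 62.11   # Digital (Drive/Tape/etc., includes 24p) in 2011
film35_2004 = 72.6
film35_2011 = 15.53

print(digital_2011 / digital_2004)  # roughly 6.2, a better-than-six-fold increase
print(film35_2011 / film35_2004)    # roughly 0.21, about a fifth of the 2004 share
```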

Now Harry would be the first to confess that this survey was completely non-scientific. It includes pretty much whoever wanted to respond and doesn’t include anyone who either forgot or didn’t want to respond. But the trend is completely obvious. Kodak isn’t just in bankruptcy, its film side is dead, dead, dead. Labs may be making some decent money making prints worldwide, but more than 50% of U.S. theaters are digital now and the world is fast catching up. Those cinematographers who are still shooting film negative are looking at a future in which it will get increasingly difficult (and, hence, more expensive) to process film neg. Which means that fewer and fewer productions will shoot film. Which means that lab work will get even more expensive.

Which means that film will pretty much die. No, let me take that back.  It won’t “pretty much die,” it will totally absolutely die.

Since all of our theaters will eventually be digital projection (and nearly 100% of our films will go through a digital finish anyway), I defy anyone’s mother or non-industry friend to tell the difference between a digitally captured film like THE GIRL WITH THE DRAGON TATTOO or the upcoming THE AMAZING SPIDER-MAN, and a film capture. Either subconsciously or consciously.

Wishing that film would come back seems about as pointless to me as pining after those really great lemon cookies that Keebler used to make that I loved so much.  That now are dead, dead, dead.

I think it’s time to reward “good stories, well told” and forget how they were shot. Or, let’s bring those Keebler Lemon Cookies back.



Keeping Organized – A Free Webinar

8 09 2010

One of the things that many low budget productions suffer from, as well as nearly all student films, is a lack of organization. It makes those already tough projects even harder, but no one ever feels they have the time to set up their systems.

This is crazy shortsightedness. To give a few examples of what I mean by organization, I’m going to take some material from my book, THE FILM EDITING ROOM HANDBOOK, 4th Edition, and present it (in my usual rambling fashion) during a webinar being given by the good folks over at New Media Webinars.

Every editor does things differently, and Shane Ross has done a pretty good DVD on the subject within Final Cut Pro. I’m going to toss my own thoughts into the ring  tomorrow, Thursday, September 9, 2010 at 10am Pacific time.

There are some good things about this webinar — the first is that it’s free, if you can make it at that time (NMW will be making the webinar available for a fee afterwards, along with some added content — a video where I’ll talk about organizing a VFX  workflow, as well as a copy of the glossary from my book). You’ll also get a chance to win some prizes, always a good thing.

Finally, I think that you’ll learn some things and, if you haven’t, you’ll have a chance to ask questions.

It should be a blast.  And you don’t even have to be in LA to see it.  So, c’mon down.  Just click on the link below.

Editing Bootcamp. Get Organized!!



The Right Tool For The Job and ROI

27 05 2010

AppleInsider ran an article on May 18, 2010 titled “Apple Scaling Final Cut Studio Apps to fit prosumers,” which generated a ton of blogosphere panic. Even I was caught up in the rumor mongering here, reacting to a post I’d read on Twitter and then, after reading the AI piece, tweeting about it myself. Phillip Hodgetts had a very intelligent post on his blog last week that used a historical approach to take the AppleInsider piece apart, rebutting nearly everything that the article said. Larry Jordan followed up with another article which also took pains to point out why that original piece was Dead Wrong.

But, in doing so, he made another excellent point.

For me, this is the key point — as editors our job is to tell stories visually. The tools we have today do a really great job of helping us put food on the table and pay the rent.

The emphasis is mine, by the way.

Now, I’d be the last one to paraphrase Larry (though I will be doing a bunch of that in a vidcast with him which will start in mid-June — more details on that to come), but let me try. What I think was so cogent about Larry’s comment is this: We only need enough tools to do the best job we can.

Of course, there’s a lot to pick apart in that statement. We were fine working on 35mm and 16mm film, drawing diagonal grease pencil lines down the middle of the film to indicate dissolves. But then videotape editing came along and, soon, we were able to actually see that dissolve. Very quickly, those diagonal lines were not “doing the best job” anymore.

Then there’s the reality that one editor’s “need” is another one’s “nice to have but I don’t care.” New tools in Avid’s Media Composer make displaying 3D footage much easier, but almost no one I know works in 3D so (for now) most of us won’t care about it.

But those issues aside, the truth of that statement is strong. It’s not as important for us to have access to every tool out there as it is to have the right tool. Until very recently, many feature films were edited on a very old version of Avid’s Media Composer hardware and software because that version of the program was stable, worked beautifully and gave editors everything they needed. Of course, with the advent of HD and visual effects, you can’t say that anymore, unless your job only involves straightforward SD editing; in that case the urge to upgrade just isn’t there. Businesses call it ROI (“return on investment”) and the equation holds true in editing as well. Will we make or save as much money upgrading to a new tool as it will cost to buy it, install it and (most importantly) learn it?
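
That ROI question reduces to grade-school arithmetic. Here’s a toy sketch of it in Python; every number is invented for illustration, so plug in your own rate and estimates:

```python
def upgrade_roi(price, learning_hours, hours_saved_per_job, jobs_per_year, hourly_rate):
    """First-year return on an editing-tool upgrade: what it earns or
    saves, minus what it costs to buy and (most importantly) learn."""
    cost = price + learning_hours * hourly_rate
    benefit = hours_saved_per_job * jobs_per_year * hourly_rate
    return benefit - cost  # positive means the upgrade pays for itself

# A hypothetical $500 upgrade that takes 10 hours to learn and saves
# 2 hours on each of 12 jobs a year, for an editor billing $50/hour:
print(upgrade_roi(500, 10, 2, 12, 50))  # 200 -- it barely pays for itself
```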

As the world changes, our editing tools must change of course. But the inverse is not necessarily true; as our editing tools change, the world doesn’t have to change as well. If something works really well in version 4.0 or in version 6, why should we upgrade to 5.0 or 7?

Incorporating new technology into our own work lives can be fraught with peril and we’ll only jump at the changes that make sense. How can we determine what makes sense without reflexively avoiding something just because it’s a change, or darting to every new bell and whistle just because it is new? Good question. We deal with that all the time.

Recently, I’ve been playing with two tools that are designed to make editing life more sensible and I’ve now incorporated them into my own editing life. In each case, I got more from the change than I had to put out in order to make it. That is real-life ROI.

I first saw PluralEyes back at NAB in 2009, where it was stuck all the way at a side wall. The way it was pitched to me got my juices excited — this is a tool for editors (FCP only at the time; it has now expanded to Premiere and Vegas; where is Media Composer???) that will automatically sync takes from different cameras that were shot at the same time and have matching audio. This seemed to be a godsend for editors of music videos or events (think speeches or weddings) that are captured using multiple cameras. Six cameras capturing a speech can be easily synced up to each other, even if the audio is of varying quality. Editors who have to sync multiple takes of a musical performance that was shot to a common playback will also benefit from this.

What a cool idea, right? I can hear editors all over the world counting up the amount of time that they will be saving in syncing up footage. In the “old days” this would have involved finding common points between each and every take (a verse where the band sang the word “Killer”, for instance — hard consonants like “K” are useful in finding sync), marking a sync point at those spots in all of the takes, and combining the takes into one multicamera clip. This was pretty reliable but was incredibly time consuming and prone to error, especially if the person doing the syncing had to make sure that he/she wasn’t using that same word, but from different verses. In addition, at times the audio on an individual camera might not have been at the same level or quality as another camera, making it harder to find the exact match by listening or looking at the audio waveforms in our NLEs.
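
That hunt for a hard “K” is exactly what automatic syncing replaces. I don’t know what’s actually inside PluralEyes, but tools like it presumably rest on audio cross-correlation: slide one waveform against the other and find the offset where they line up best. A bare-bones sketch of the idea in Python with NumPy, on clean synthetic audio (real recordings are far messier):

```python
import numpy as np

def find_offset(reference, other):
    """Return the lag k (in samples) where `other` best matches
    `reference`, i.e. other[n] ~ reference[n - k]. A negative k means
    `other` starts |k| samples into `reference`. Conceptual only."""
    corr = np.correlate(other, reference, mode="full")
    return int(np.argmax(corr)) - (len(reference) - 1)

# Two "cameras" hearing the same event; the second is quieter
# (think camera mic) and starts 500 samples late.
rng = np.random.default_rng(0)
event = rng.standard_normal(10_000)
cam_a = event
cam_b = 0.5 * event[500:]

print(find_offset(cam_a, cam_b))  # -500: cam_b begins 500 samples into cam_a
```

Once the offset is known, lining the clips up on a timeline is just arithmetic.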

So, PluralEyes could be a great timesaver, but to be one it has to require less work to set up than it saves us. As examples, Avid’s ScriptSync used to take too much of my editing time to set up and so I never used it. Once they put voice recognition into it, it became a very usable tool and I now love it. On the other hand, I’m still waiting for Adobe’s transcription tool to get to a usable state — right now I get around 50% accuracy, which creates more work fixing a transcription than I’ve saved by doing it automatically in the first place — Scott Simmons has a great review of it in his Editblog.

So, was PluralEyes helpful? Does it pass that test?

Way yes!! It can’t sync everything, but it does a great job of finding the sync points between takes, even if one of the clips is only a partial subclip from waaaaay down in a take. It does a remarkable, though not flawless, job of matching audio recorded at different levels and with different amounts of echo. I was able to effortlessly sync two cameras with direct feed audio, up to one that was using the camera mic, with all of its attendant room echo and noise. In the one or two cases where, for no known reason, it couldn’t sync up a track, it created a separate FCP timeline with those clips on it. This made it easy to see what wasn’t automatically synced up so I was able to hand-sync those pieces. Syncing two or three pieces, rather than thirty, is a huge time saving and so PluralEyes deserves a place in my editing tool chest.

It was the Right Tool for that very limited job and, even at $149, it was way worth it. (Honesty Policy: Singular sent me a review copy of PluralEyes, so I didn’t pay that $149. But that doesn’t change my feeling about its worth.) I don’t know what your pay scale is, but if you use this application on three jobs and it saves you two hours on each, the price works out to about $25 an hour. If you’re not charging at least that for your time, you are either a student or starving or both. One key to this program’s success is its laser-beam focus on one thing — helping editors sync audio takes together quickly. That’s it. Priced accordingly, it’s a no-brainer for anyone who needs that one thing.

As an aside, Larry Jordan mentioned in his May 20, 2010 Digital Production Buzz podcast that he has more editing applications on his computer than you can “shake a stick at”. (I’m not sure why you’d want to shake a stick at a computer — I often shake my fists, but that’s different.) He went on to say that he used different ones because not every NLE is as good as another at specific things. I got to thinking about that. I use Media Composer a lot for my editing, but I absolutely hate its title creation tools — both Marquee and AvidFX/Boris — so I usually bop over to Motion to create lower thirds and the like and then import those files into my Avid machine. The right tool for the job. This is another example of focusing on single tasks. When I want to teach students how to create a simple DVD I’d rather use iDVD than DVD Studio Pro (even in its simple mode) because it’s Stupid Easy. But it’s phenomenally awful for anything more complicated. For that I use DVD Studio Pro.

I apologize here for my total lack of knowledge of most Adobe products. I’ve been quite impressed by their improvements in the last few years, but my main body of knowledge still revolves around the NLEs that we use most here in the US — primarily the Media Composer and Final Cut.

Another tool that I’ve been testing on and off for several months is something called Sorenson 360, which makes it much easier to upload videos that I’ve created for viewing and approval by my producing and directing collaborators. It will come as no surprise to those of you who have been reading this blog for a while that I am a strong proponent of long distance collaboration. I believe that, for editors of the future to be successful, we are going to have to be working with clients all over the world, often many of them at the same time. The feature I’m cutting now has me sitting in front of my computer in Los Angeles, while the director is in Rhode Island and the producer is in Massachusetts. We need to be able to easily show each other sequences without flying all over the U.S. To that end, a number of cloud-based review and approval sites have been born on the web. They make compressing, commenting and approving much easier.

Sorenson 360 does all of that to a great degree. Like any good compression tool, Sorenson Squeeze can take a while to efficiently and decently compress your films. For a 2 minute trailer that I recently created for that feature I mentioned, it took over an hour. For a documentary that I’m editing on Global Rivers, I had to create a 12 minute excerpt reel. The compression on that sequence, which was originally shot in HD/P2 format, took at least three hours — I left it after about 50 minutes and let it work overnight. When it was done, I had the site send my producers and me an email message saying that the upload was ready for them, and gave them the password. It could have sent us a text message as well.

Now, as anyone who has ever done any compression can tell you, finding the right compression settings is never as easy as they tell you. I’m okay at this, but I never can find the proper settings for quality, size and platform right out of the gate. Most compression programs give you a number of presets for each use but I find that these are no more than starting points. I am continually tweaking the settings for optimal image quality and web playability. Of course, once you determine the best setting for a particular project you should save it in a preset so you can use it all the time without the need to experiment each and every time (and I usually create a preset or two for each project I do — compression seems to be that finicky).

So, Sorenson Squeeze does all of that, as does Compressor. But Sorenson also provides a direct connection to its Content Delivery Network — the aforementioned Sorenson 360 — as well as the notifications that streamline the approval process. It also gives me some rudimentary metrics — such as how many views each video received as well as the viewing duration for each video. This is great for web videos because you can basically tell where a viewer stopped watching your show (I find that the average viewer often dumps out of a video part way through — this way you can find out a bit of the “why”).
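
Those duration numbers are all you need to find the drop-off point. If your service hands you raw per-viewer watch times (the figures below are invented for illustration), a few lines will show you where people bail:

```python
# Hypothetical per-viewer watch times, in seconds, for a 720-second video
watch_seconds = [45, 700, 120, 118, 720, 115, 119, 60, 117, 116]
VIDEO_LENGTH = 720

# How many viewers were still watching at each minute mark?
for minute in range(VIDEO_LENGTH // 60 + 1):
    still_watching = sum(1 for s in watch_seconds if s >= minute * 60)
    print(f"minute {minute:2d}: {still_watching}/{len(watch_seconds)} viewers")

# With these made-up numbers, attendance collapses just before the
# 2-minute mark -- that's the spot in the video worth a hard look.
```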

So, is this a tool that you need? And is it a tool that’s worth the cost (after a year of the free service that comes with Sorenson 6, the costs “start at $99” and, yes, their website is that opaque about the costs, saying that it’s “pay-as-you-go”)? Well, it depends on what you need it for. Brightcove, a leader in the CDN space (also acronymed the “ODN space” — Online Delivery Network), already provides pretty strong streaming on a variety of platforms with a full set of the statistics necessary for advertisers and sponsors. Can Sorenson deliver the same goods? Their prices range from the same $99 per month (50 videos and 40GB of bandwidth) to $500 (for 500 videos and 250GB of bandwidth).

I have to say that I’m not a Brightcove user so I don’t know the answer to that question. The real question is whether I’d re-up with Sorenson 360 when my free year is up, and that is also a decision based on my own needs. I don’t create so many videos per month that $1200/year is worth it for me. But if you’re a video professional who finds himself or herself increasingly working over distances, this also might be the right tool for the job. I love its integration with Sorenson Squeeze (my compressor of choice). I love that I can drop a timecode window on top of my video in Squeeze to provide my producers with an easy way to key their notes to a specific spot in the video. I like the RTMP streaming, which enables viewers to easily start a video from any point within the stream, rather than at the top. I don’t like the fact that there are presently only two real formats for display — H.264 or Flash. I’d like some HTML5 capabilities as well. But it’s a great tool; well thought out and (with the recent upgrade to Version 2) increasingly sophisticated.

To see an example of how I used this tool on the Global Rivers documentary, you can temporarily check it out at my Sorenson360 site. I output that 12 minute excerpt reel to a QuickTime movie, compressed it in Sorenson 6 and uploaded it to the site behind a password which, in this case, is “globalrivers”.

But, for many people, these applications could be another example of The Right Tool. Would it be really cool if we could get all of this in Final Cut or Media Composer? Maybe. Would it be awesome to be able to create Edit Lists or Film Cut Lists right in our NLE (the way we used to in Media Composer) without having to jump out to a separate program? Again — maybe.

Larry Jordan’s point is well taken. Not every tool needs to do everything. In fact, at a certain point, a tool that does everything is going to resemble Microsoft Word, where most users don’t take advantage of 95% of what the program can do, but it loads incredibly slowly nonetheless because Microsoft keeps putting everything into the tool. Every NLE is going to need just the right tools to let editors do their job, and no more. The real trick, with so many different editors out there, is figuring out just what the bulk of them need, and then giving them The Right Tools to do that.

[PluralEyes disclaimer added - June 2, 2010]



Production and Post Wars (or Why Red Should Buy Final Cut)

29 04 2010

Well, all right, I’m exaggerating there. I don’t really think that Red should buy FCP, and Production and Post aren’t exactly at war (though sometimes you’d be forgiven for thinking so), but I want to make a point here.

Every year, it seems, camera manufacturers create more “improved” codecs that answer their needs — increased quality with reduced file size. However, that goal is pretty much immaterial to post-production professionals. We don’t care if an image takes up a large file size. In fact, with faster processors and cheaper storage costs (last I checked, a medium-ish quality 2TB drive costs less than $300 on Amazon), we don’t much care what size the original file is. If it’s too big to use, we’ll just create a lower rez transcode in ProRes or DNxHD and edit with that. What matters more to editors is that footage be easy to edit.

This means that Long GOP file formats, where most frames are not stored as full frames but as a smaller list of changes from the preceding frame, are horrible. They are exceedingly hard to edit with. Whatever speed gains we might conceivably get from working in a smaller file size are more than undermined by the extra work our NLEs need to do in order to display them.
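
To see why, consider what “display frame N” costs an NLE. With an all-I-frame codec like ProRes or DNxHD, every frame decodes on its own. In a Long GOP stream, the NLE has to walk back to the last full frame and decode everything in between. A rough sketch of the difference (the 15-frame GOP is an illustrative assumption; real streams vary):

```python
# Rough illustration of why Long GOP formats are slow to scrub.
# Assumes a 15-frame GOP: one self-contained I-frame followed by
# 14 frames stored only as changes from their predecessors.
GOP_SIZE = 15

def decodes_needed(frame_number, gop_size=GOP_SIZE):
    """How many frames must be decoded to display `frame_number`
    when only the first frame of each GOP stands on its own."""
    return frame_number % gop_size + 1  # the I-frame plus every delta up to us

print(decodes_needed(1004))  # 15 decodes just to show this one frame
# In an all-I-frame codec the answer is always 1.
```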

[Note of ignorance. I haven't yet had a chance to play with the parts of the new version (5.0) of Avid Media Composer which allegedly make a lie out of that last sentence. Pushing their Avid Media Access technology forward, and allowing the Media Composer to natively work in Quicktime, Red and various Long GOP formats, they promise to make editing much easier with these previously hated formats. This has proved to be true in my experience with the Sony EX-1 and EX-3 cameras, so this could be a great boon. And I'll talk about that in a few paragraphs, so stay tuned.]

Let’s face it. Editors are never going to get camera manufacturers to stop looking for their version of “better” codecs. We’ve long since learned to live with it. But it does mean that, unless these manufacturers work ahead of time with the NLE manufacturers (the way Red did with Apple, for instance, before the initial release of the Red One) it’s going to take some time for our favorite NLEs to catch up with each new release of a camera codec.

It’s a war and the winner of that war is… well… no one. But the biggest loser is the filmmaker.

This is less of a visible problem on bigger budget productions, where the camera and editorial departments are made up of different people, each of whom has varying levels of tech support that go beyond typing “Long GOP won’t work” into a Google search bar. But as more and more of us are shooting with small crews, and taking the footage back into the editing room where we have to ingest, edit and output it ourselves, this becomes more than an annoyance; it becomes an impediment to our livelihoods (you know who I’m talking about, you WEVA folks out there).

So, what’s the best solution to this war? Is hope for reconciliation only slightly less feasible than the Democrats and Republicans agreeing on anything in Washington today?

Well, yes it is. But there are some signs of hope.

I’ve already mentioned Avid’s AMA. What that does is create a set of open architecture hooks for camera manufacturers, so that they can more easily create a way for editors to edit natively in the Media Composer. It’s an attempt to make it easier to do what Red did with Final Cut before the Red One’s release.

In both cases, it’s the NLE manufacturers telling the camera manufacturers — “Hey, if you’re going to create your own camera codecs, you’ll have to create your own editing codecs.” Well, not exactly, but Apple and Avid are placing the onus on the camera manufacturers to dig themselves out of their self-constructed hole. And that makes sense, so long as your NLE is one that has enough of an audience to make it worth the camera folks’ attention. I might be wrong, but I doubt that Sony, Panasonic, Red and the HD-DSLR manufacturers are going to spend buckets of money writing plug-ins for Liquid or Vegas.

So, what are our other alternatives?

In the old days, every single camera manufacturer had to create cameras that worked with the industry standard 35mm film gauge. If they wanted to create film of a different width — such as, say, 38mm — they had to be able to manufacture the film stock, the lab processing equipment, the editing equipment and the projectors to accommodate it.

Needless to say, we never saw 38mm film. [We did see 16mm and 70mm film -- which at half and double the normal size was easy for Kodak to manufacture film for. When it became clear how it opened up new markets, the camera, editing and distribution worlds came along for the ride (to greater or lesser degree).]

But what if a company could manufacture a camera and editing and distribution equipment (like Sony) and didn’t have their heads up their posteriors (like, uh.., like… oh never mind)? In a frighteningly anti-competitive way, they could then create a camera codec that worked fine in both capture and post production.

We haven’t yet seen that company, though if Red bought Final Cut from Apple (or MC from Avid, let’s say) it would certainly be a start in that direction. Please note, I have absolutely no inside information on anything that Red, Final Cut or Avid might be up to. For all I know, Apple is planning on buying Red, though that would shock me in ways that I can’t describe in public.

In the meantime, Red Cine X and AMA are two ways that post and production are attempting to bridge the gap. Last time I looked, Avid wasn’t manufacturing cameras, which will make it harder for them to keep up with Red Cine X.

When Cisco bought Flip last year, I was hoping that we’d see some real synergy in the production and post areas. At the very least, I was hoping that we’d see some changes in the Flip that would enable them to interact with the web backbone much more easily. That hasn’t happened yet, and there’s no indication that it’s imminent.

But wouldn’t it be awesome if someone came up with a series of codecs that could take footage shot by a camera and make it easily editing ready and trivially distribution ready? By this, I mean more than projector-ready (something that I am hoping Red Ray will pave a path for); it would make it easier to distribute files safely to theater owners, television networks, web distributors, mobile device partners, et al.

And, I’m hoping that these solutions are provided by multiple companies so we don’t have to be tied to one technology.

Whoever creates that chain will be the Dag Hammarskjöld of all things digital video, and their company will be its United Nations. Peace at last!



Techy Talk

12 04 2010

I’ve got a post percolating about the use of the iPad in education but it’s not really ready yet. In the meantime, I wanted to spend a post or two talking about some more tech-y things.

It’s so damned easy to get swallowed up by the technology in post production nowadays. About five years ago, no editor that I know was using the term “workflow” and now it seems that that is all we talk about. Codecs?  Why should I know about them?  Well, honestly, it’s because that knowledge helps us to do our job better.  When I was a wee assistant editor, I made it my business to learn how the film optical houses did their job, as well as the labs.  I learned about white core mattes and black core mattes, so I could talk more intelligently about them when I was conveying our requests.

Now, take that and multiply by a thousand. I’ve talked before about how we need to know VFX, sound design, color correction and much much more in our editing rooms. Sometimes it seems overwhelming. Luckily, there are tools out there to help us do our jobs better.

Color correction is one thing that continually stumps me.  My wife, in fact, thinks that I’m color blind; she often stops me as I am on my way out the door in the morning with a “You’re wearing those together?”.

So, when Christian Förster, over at Avid Screencasts podcast, recently posted three separate vidcasts about color correction on the Media Composer I devoured them.  I waited until all three were released so I could watch them at one sitting and it was well worth the while.  You can go to his website, Avid Screencasts, to see them (as well as a number of other valuable episodes) or go directly to any of the three episodes here:

Color Correction Basics I – Laying the Groundwork

Color Correction Basics II – Manipulating Contrast

Color Correction Basics III – Manipulating Color Balance with Curves

Hey, Christian, you should put these three casts together into one, add some deeper discussion (primary vs. secondary for instance) and then sell them.  They’re that good. I’m going to put the three together for some of my classes.



The iPad, Film Editing, My Book and Delays

10 02 2010

My book sitting quietly in a Barnes and Noble bookshelf

Long-time readers of this blog will realize that it has been a long time — since I’ve posted. There are some very good reasons for that, not the least of which is that my new book was being written, rewritten, rewritten again, and published — all of which required a time-sucking amount of work.  All of which I’m thrilled about.

This is the fourth edition of my ancient book on editing room workflow, written originally back before anyone knew what the word “workflow” meant. It is a total page one rewrite and, because I’m not an assistant editor any longer, I had to do a ton of research with assistants (those that are left). I learned a tremendous amount about what assistant editors do today and much of that shows up in the new book. I’ll be dropping some of that on you in the weeks ahead.

Of course, I want each and every one of you to go out and buy 50 copies of the book.  But that’s not what I’m interested in talking about today. So, let me go on.

Another reason why this latest posting has been inordinately delayed is that I’ve been editing one or two films. One of them is a great comedy road movie that follows a self-destructive screenwriter as he drives across the country accompanied by the young kid who’s been assigned by the film’s producers to babysit the guy. The film is, I think, going to be loads of fun, but what’s really interesting about it for me is that I’m editing it long distance. My co-editor is in Massachusetts and my director is in Rhode Island.

That means that the three of us are going to spend lots of time shooting copies of our Avid bins back and forth to each other so we can see what each of us is doing. This excites me a lot, but that may be because I’m slightly crazy about the future. A conversation I had a little while back showed me that not everybody shares this mania.

Last summer, when Final Cut Pro 7 (or whatever they’re calling it) came out, I remember enthusiastically talking to a friend about the iChat Theater function, which allows the editor to play out anything in FCP over an iChat video conference, simply by pointing to it. It’s an easy way to play dailies or your sequence to any of your collaborators. It doesn’t have any of the real interactive functions that would make it a true shared editing platform (I’ll be looking at Fuze soon, which promises much more), but it certainly is a start to long distance communication in the editing process and I was telling my friend about it.

He looked at me horrified and said “I’ve got one word for you — outsourcing.” He was worried about his job going overseas.

“But you’ve got to look at it from the other side,” I told him. “You’re an accomplished Hollywood feature and television editor. There will be plenty of people around the world who would love to work with you. But they haven’t been able to because you live here in Los Angeles and they don’t.”

He agreed that this was possible but then said “A lowering tide lowers all boats. Even if I could get those jobs, my salary is going to go down. Way down.”

Hard to disagree with that.  Welcome to the 21st century. With the collapse of television syndication and the advertising market, the days of 10-month guaranteed jobs for TV editors are going away. As Hollywood moves more and more to large tentpole films, the number of mid-range films is also disappearing and, along with them, a sizable number of cushy mid-level jobs. Those of us who live off of these types of projects are going to have to get used to the fact that our incomes are going to go down, unless we adapt to the new markets.

And, miraculously, those markets are all over the world. What my friend, and all of us, are going to have to do is learn to juggle multiple jobs across multiple time zones. Some of us are doing that already. It’s really only the larger job markets that haven’t been doing it. No producer is going to share his/her editor’s time with someone across the globe. But if that same producer is hiring his/her editor for a few months, laying them off, bringing them back on again for a month or two, and then laying them off again — well, they’re going to have to get used to sharing them with the rest of the world.

So working long-distance is going to be a smart thing to learn how to do. And somehow I’ve stumbled right into it.

Apple's new iPad

Then, enter the iPad. I’ve been asked endlessly whether I’m ready to rush out and buy one. Honestly, not really. I’ll wait until the device matures a bit more (just like I waited for the iPhone 3G and am thrilled that I did). However, the possibilities that this new device gives us in the vertical market that is filmmaking are thrilling.

Imagine a producer pitching a project to a studio. Right now they send a script and, perhaps, some accompanying materials, to the studio where (if their readers like it) it is sent home with 50 or so executives to be read over the weekend. This is called, in a predictable burst of studio originality, the “weekend read.” Many studios have moved the weekend read from paper to the Kindle, which saves paper but does nothing to brighten the experience for those poor junior executives.

Now, imagine, if you will, that the producer has loaded the script onto an iPad and that there are embedded links within the script to location photos, audition tapes, CAD drawings of sets, and 3D mockups of the worlds that are only hinted at in the script. That is going to be a clearer, more interesting vision of the story for every single one of those bored-to-tears weekend readers. It’s also going to be more helpful to me when I read a script before an interview, and to an art director as he/she tries to figure out what’s inside the director’s mind.

And that’s just one single use for this device. Take a look at the dozens of applications for filmmakers already available on the iPhone (Taz Goldstein has a great list, adapted from his recent Supermeet talk, up at his site Handheld Hollywood — and, by the way, the Supermeet was a great event, even if I did have to watch it streamed on Ustream — you should go and look at it right now). There are slates galore, some of which will even help you import your footage into your NLE. There’s a very cool application that lets you remotely control the f-stop settings on your camera. There are director’s viewfinders, storyboard creators, teleprompters and research tools. And that’s just for the iPhone.

Imagine what we’ll be able to get with a 10″ screen.

Here’s my point. For years we’ve been on the cusp of something really new and exciting in the filmmaking world. We’ve gone all digital — from capture through editing. We’ve also seen the world of distribution change — so the need to print film for theaters is fast disappearing, and we will be easily distributing to each of the four screens that people watch their entertainment on (see an earlier post of mine about Four Play).

What’s been missing is the ease of getting from this digital creation, to the digital consumption in any way that resembles a realistic viewing format.

The iPad is more than a hint at that future; it’s the door ajar (not fully open yet, but not closed).



Final Cut Pro – Baby Steps Into The Future

23 07 2009

For the two or three of you who don’t know yet, Apple released its updates to its suite of video applications today.  Final Cut Studio 3 has updates and new enhancements to nearly all of the parts of the suite, including some cool title manipulation tools in Motion, voice level matching in Soundtrack Pro (a boon to quick and easy temp mixing), cooler markers and more flavors of ProRes in Final Cut, and more. Some of the features, like a floating timecode window and global transitions, are attempts to catch up with Avid’s Media Composer, which has had them for a very long time. (Apple’s list of new features can be found on this page on their website.)

That, by the way, is a great advantage of competition.

But it is in the aspects of ease-of-use and collaboration that Apple has shown that it is paying attention to what its core market really wants. Despite the high-end videos of Francis Coppola and Walter Murch on TETRO, Final Cut’s appeal has always been to people on the lower-priced end of the market — the students, the low-budget indies, the people putting together their own shops. The entire suite concept caters to them — if your market is made up of people who can’t afford to hire separate title designers and sound editors, then the idea of charging people separate amounts for separate applications is a non-starter. For the indie filmmakers and podcasters who are creating their own soundtracks and pushing them out to the web in record time, buying Pro Tools and Media Composer is just too expensive. Even if Soundtrack Pro is way inferior to Pro Tools, it simply doesn’t matter to that market. Having everything in a box (with round-tripping between the apps) is The Way To Go.

I’ll talk about the coolest indicator in a minute, but let me also say that the ease-of-use factor is also huge for this market. If I’m doing my own lower thirds, and I’m not a visual effects guru like Mark Christiansen, then I want easy-to-use templates that provide me with a great default setting.  I’ll change the look and feel if I want, but the fact that I don’t need to program in a motion effect, with a glow, and time everything out from scratch means that I can get things done much more efficiently (even at the expense of greater individuality).

So starting with something much higher than Ground Zero appeals to many of the filmmakers that Apple is targeting as their market.

But here’s the cooler thing for me.

As many of you know, I’ve been harping on the idea of long distance collaboration for several years. It’s clear that more and more of us are working with people who we don’t see every day. Two years ago, I co-edited a small horror film called JACK IN THE BOX. Its director and my co-editor were both on the East Coast, while I sat in Los Angeles editing. We exchanged files and projects via the net. It was a successful collaboration, but a bit frustrating because of the lack of face-to-face contact. This month I’m starting a new film where the director will be in Rhode Island, my co-editor in Massachusetts and me — still in California.

My point is that this is becoming more of the norm, rather than a rare instance. Commercials, corporate films, sponsored videos, and more are fast being done by the People Who You Want To Hire, even if they’re in another city. But the tools just aren’t there yet to help re-create the face-to-face experience. We’ll be experimenting with some newer techniques on this one and I’ll report back, but the struggle is always to help all of us feel like we’re in the same room.

Now Apple has introduced iChat Theater, which allows the editor to play back his or her timeline right over iChat. If I read the tutorials properly, you no longer need to create a QuickTime export and then upload/FTP it. In fact, you no longer even need to create a QuickTime at all. This feature of Final Cut allows others on the iChat to look directly into a Viewer (or Canvas) on the editor’s machine. That’s it.

Now, it doesn’t have the real interactivity that I’d love — to have my iChat buddy be able to use his or her mouse to stop and scroll the cursor around on the timeline (like Syncvue, for instance, does) — and I don’t know if you can have more than two people on the iChat, but you can video chat with each other while you’re scrolling around. Mike Curtis says that you can show the timecode window as well, and that will be great for more precise discussion. But you certainly can’t take a mouse or Wacom tablet pen and circle items on the screen (which would be handy for discussing visual effects) like you can on some services. It would also be cool if you could attach comments/markers to particular places on the timeline — so you could easily accumulate notes. But, using a screen grab tool like Snapz Pro X, you could record a notes session for later playback.

Very cool, since one of the biggest issues in distance collaboration (as well as in any notes meeting, now that I think about it) is misinterpretation of notes.

My point, however, is that Apple has once again identified a growing need in their core market. Many of us working in lower budget ranges need to work with people across great distances. Apple hasn’t given us any real groundbreaking tools to do that, but it is clear that they are thinking about it, and slowly introducing early versions of the tools that we will all need very, very soon. These tools are very basic, and don’t really do much more than take ideas that have been floating around elsewhere for a while and bring them into the suite. But the real takeaway here is that they’ve now brought these things into their own tool and made them easy to use and integrate with their other tools. And that is going to be very appealing to this market.

Another aspect of this distance collaboration is their Easy Export feature which, at first glance, looks like an easy way to upload to YouTube, MobileMe and more (including Blu-ray — cool: direct export to DVDs from the timeline).

Oh, and one final point. They’ve made both the price of the suite and the upgrade price incredibly low. The upgrade for someone who already has a purchased copy is $299. That means that they are essentially telling the community that they’d be idiotic not to upgrade. No one who has the money to make a video project of any kind doesn’t have $300. (The full price, for those people who don’t have access to an educational discount or their own copy already, is $999.) Once again, Apple is saying to the indie and low budget community — this is for you.

Now it’s time for Avid and Adobe to decide if this is a market that each of them want, and then go for it.

==============================

By the way, some other bloggers are beginning to post their own thoughts on this. Steve Cohen, over at Splice Here, is one of them. Richard Harrington, at the Pro Video Coalition, and Mike Curtis are two others who you should check out.



What User Groups Can Do

18 06 2009

Just got back from the June meeting of the Los Angeles Final Cut Pro Users Group which was, as usual, a blast. Let me tell you what was on the official agenda (and please stick around for my point which follows two paragraphs down).

Andy Neil gave a demonstration of the design capabilities of Motion; two people from Adobe discussed some of the new things in Premiere CS4.1, including the ability to do a simpler RED workflow and read VOB files directly without ripping; Bruce Nazarian discussed some of the new developments in Blu-Ray that might make it usable for most of us; and SmartSound’s Stephanie Joyce gave a demonstration of the new Sonicfire Pro plug-in for Final Cut Pro, which actually is a major step on the way to simplifying and improving needledrop music.

But let me tell you about the things that were not on the agenda that were even more valuable.

I got to talk to Philip Hodgetts about how his program First Cuts can be integrated into the workflow of an editing room, despite its brute force method of determining editing points. Along the way, we had a great discussion about the various types of editing rooms, editors and clients, and how to teach a new generation of editors who often have more to teach us than we have to teach them.

I chatted with a representative from, and got to see a demo of, Veescope Live, a program which does live keying.

I got to chat with a woman who was an early proponent of digital editing in Los Angeles and is now a farmer; she has recently completed a film about a restaurant owner who has developed a clientele for locally based produce.

In short, I got involved in a lot of discussions that were more about ideas and breaking boundaries than about uses for Final Cut.

And this is the great advantage of any user group — whether it is one devoted to Avid, Final Cut, Premiere, the RED, or a group of local basket weavers.

It is the contacts and conversations that will provoke your mind and help you grow, and it is those very things which will make you more and more attractive as a filmmaking collaborator.

I’m sure that you’ve got a user group near you.  Most of the companies that make the products that you like to use have lists of them on their web sites. Avid, Final Cut Pro, and Premiere (among others) all have active, thriving user groups.
Among the biggest events each year are the Final Cut Pro User Group Supermeets, which attract hundreds and hundreds of rabid fans who listen to people present, exchange tips and tricks themselves, and vie to win raffle prizes that can reach thousands of dollars. The next Supermeet will be held in London and, for those of you who haven’t been to one, I’d insist that you go.  To be held on Thursday, June 25th, at the Kensington Conference and Event Centre in London, the FCPUG SuperMeet will feature speakers from major equipment and software manufacturers, filmmakers and a speech by Walter Murch on his work on TETRO.

This will be an event that you will be kicking yourself in the butt for years to come if you miss it. And because it only costs £15.00 to get in, there’s virtually no excuse not to go (I will accept the fact that you won’t be in Europe as a valid excuse — that will be mine).

But if you’re anywhere near London around that time, I guarantee that you’ll have a great time meeting tons of great people.

And that is the real value of User Groups. It’s how we move forward in this Freelance Editor world of ours.



The iPhone and the Future of Filmmaking

16 06 2009

Okay, that title is more than pompous, but just follow me for a second.

Debra Kaufman writes in her blog “Mobilized TV” about an application for filmmakers that she found at last weekend’s Cinegear.  Called Helios, it is ideal for cinematographers — it shows “a graphical representation of the sun’s position on a compass dial (azimuth) for any time of day, showing the sun’s elevation and proportional length of shadow an object would cast.”
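Helios presumably uses far more rigorous astronomical math than this, but the underlying idea is simple spherical trigonometry. Here is a rough sketch in Python (all function names are mine, not the app’s) using the textbook declination and hour-angle approximation, just to show the kind of calculation such an app performs:

```python
import math

def solar_position(day_of_year, solar_hour, latitude_deg):
    """Approximate the sun's elevation and azimuth, in degrees.

    Classic declination / hour-angle approximation: accurate to roughly
    a degree, which is plenty for eyeballing light on a location.
    """
    # Solar declination: the sun's "latitude", swinging +/-23.44 deg over the year.
    decl = -23.44 * math.cos(math.radians(360.0 / 365.0 * (day_of_year + 10)))
    # Hour angle: 15 degrees per hour away from local solar noon.
    hour_angle = 15.0 * (solar_hour - 12.0)

    lat, d, h = (math.radians(x) for x in (latitude_deg, decl, hour_angle))

    sin_alt = math.sin(lat) * math.sin(d) + math.cos(lat) * math.cos(d) * math.cos(h)
    alt = math.asin(sin_alt)

    # Azimuth measured clockwise from north; the clamp protects acos from rounding.
    cos_az = (math.sin(d) - sin_alt * math.sin(lat)) / (math.cos(alt) * math.cos(lat))
    az = math.degrees(math.acos(max(-1.0, min(1.0, cos_az))))
    if hour_angle > 0:  # afternoon: the sun has moved into the western sky
        az = 360.0 - az
    return math.degrees(alt), az

def shadow_ratio(elevation_deg):
    """Length of a cast shadow as a multiple of the object's height."""
    return 1.0 / math.tan(math.radians(elevation_deg))
```

At 45 degrees of sun elevation a lamppost casts a shadow exactly as long as itself; drop to 10 degrees, around golden hour, and shadow_ratio tells you the shadow stretches to nearly six times the lamppost’s height.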

What I’m interested in seeing, now that the new iPhones and the new operating system are about to hit the street, is how developers start to create niche applications that they can really make some money from. There are several advances that Apple is delivering here that can make all the difference.

The first is that the hardware interface will be opened up — so people can start to sell gizmos that hook into the iPhone and interact with it. Think of engineering firms that can input directly into an app on the phone. Think of medical instruments being able to hook directly into this tiny phone/iPod touch and interact with an application inside that gives real time feedback in both directions.

And then think of how your iPhone can hook directly into your Red One or a script supervisor’s keyboard and then broadcast timecode data, along with subsets of any necessary metadata back to a post house or the editing room. It’s going to make the set, the editing room, the producers’ office, the lab/post house, and all of the other pieces of the film chain much more integrated. At very low cost.

So, for now, go read Debra’s review and start imagining.



NAB

11 04 2009

The largest get-together of television, film and media makers and distributors is the National Association of Broadcasters convention in Las Vegas every April. This year NAB (as it is called) happens April 18-23 and I’ll be speaking at a number of venues, as well as going to the first NAB Tweetup.  If you’re going to be there please drop me a note (norman@normanhollyn.com) and let’s try and get together. For now, here’s what I think I’ll be doing while I’m there:

Monday, April 20, 2:00 PM — I’ll be signing copies of my new book, THE LEAN FORWARD MOMENT, at the NAB 2009 Official Bookstore

Monday, April 20, 8:15 PM — I’ll be at the ProMax Digital Lounge, talking about Shaping Stories Through Editing.

Tuesday, April 21, 9:35 AM — I’ll be at the Avid Technology booth (Booth # SU 902, South Hall), talking about “Where are the new editors coming from? And how will they learn how to get there?”

Wednesday, April 22, 9:30 AM — I’ll be at the Official NAB Podcast Digital Production Buzz booth, being interviewed by Larry Jordan

Wednesday, April 22, 11:00 AM — I’ll be at the Final Cut Pro Users Group Booth (Booth #SL10129), talking about “15 Film School Tips in 20 Minutes”

Stop by.  At many of them I’ll be giving away a few copies of my new book!!

See you there.
