Monday, November 16, 2009

The Dehumanization Of Movies: Scrooge Was Always Into Numbers, But Now He's Made Of Them




By Lars Trodson

Is it possible to adapt yet another version of Charles Dickens' "A Christmas Carol" without having read the book?

There are so many versions of this story -- starring everything from Barbie to animals to Muppets to Mr. Magoo -- yet all remain so faithful to the same template that it is not unreasonable to ask if filmmaker Robert Zemeckis even cracked open this modest ghost tale to make his digitized 3-D version of the story.

He may have read it, but his inspiration -- for the screenplay and not the look of the film -- appears to be other filmed versions of the story. Zemeckis even yanked, for no apparent reason, a reference from a lovely little 1935 English version of the story called "Scrooge" into his own. It's at the beginning of both films when a butcher throws out a chicken leg to some hungry kids.

In Zemeckis' version I think the butcher has the voice of Bob Hoskins (who also later voices old Fezziwig, alive again!). There's no specific reference to this butcher in Dickens' text, but the fact that Zemeckis and his animators copied the shape of the window directly from the 1935 version (and also made it a basement window, as the 1935 film does) seems to indicate a preference for filmic sources rather than written ones.

Zemeckis even has Bob Cratchit slide down an icy path on the street, as does the Cratchit in the 1935 film. There is a reference to Cratchit doing this in the novella, but hardly anyone seems to have used this detail except these two versions.

It probably doesn't matter how a new version of "A Christmas Carol" is arrived at. The joys of the story seem almost quaint now -- and its familiarity is ruinous to the true power of the narrative. We're so intimate with the details of Scrooge's life, with both his trials and his triumphs, that we hardly pay attention any more.

But that is why it may have been more important than ever for Zemeckis to wring something original out of the story. Instead, all he did was decide how to film it in a new way; he didn't figure out the why.

Walt Disney Pictures and Robert Zemeckis have pulled off a mighty public relations coup with this digital adaptation of “A Christmas Carol.” There has been so much huffing and puffing about “performance capture” and the painstaking methodology it uses to reproduce human facial expressions and body movements that you may actually believe this approach works. But you'll only believe that if you don’t really bother to look at Zemeckis' film.

There is no question the filmmakers have done a superlative job in fussing over the tiniest detail of the main character, Ebenezer Scrooge (voiced? -- played? -- by Jim Carrey), but almost every secondary character, and certainly all the background players -- the partygoers, the townspeople, Bob Cratchit’s family -- are so badly realized that at times the production looks like one of those garish, mass-produced digital cartoons that are actually tough to watch.

All you need to do is concentrate on the scene during the Ghost of Christmas Past segment when Scrooge is taken back to Old Fezziwig’s Christmas party. Fezziwig looks like a creepy, red-faced, blank-eyed doll. But he’s downright lifelike compared to the figures who cavort in the background during the dance. If you haven’t seen the movie yet, look for them when you do: you’ll see bodies that barely move, and if they do move they certainly don’t move like any human being you know.

The famous “eye problem”, which caused so much talk when Zemeckis made “The Polar Express” a few years ago, still exists. The problem is that “performance capture” can’t get the eyes right -- they look lifeless. The word used most often to describe the look is "creepy." In Scrooge's face the eyes look OK, but Zemeckis and his crew also cheat at this a little bit. In many of the shots the eyes are in shadow -- including some of the closeups of Scrooge. This happens to many of the other characters, too. So Zemeckis did his best to disguise the fact that technology, no matter how enamored of it we may be, or how much faith we have in it to recreate real life, is finding a stubborn adversary in the more mysterious and spiritual elements of life.

It is difficult to figure out what this latest version of “A Christmas Carol” is about, anyway. Something tells me that it is nothing more than an effort to do a little research and development on “performance capture” and new 3-D technology, but on our dime.

Disney may feel that it is continuing its long tradition of innovation, but I think there is a qualitative difference between the era when a stable of illustrators locked themselves away and experimented with animation so that drawn characters behaved and acted realistically, and this moment, when we're trying to create real people on the screen without having to use real people at all.

The early Disney animated features drew some fire, in fact, from critics who were confused about the need for animation if the intended result was simply to mimic real life. But Disney understood, just as the early CGI pioneers understood, that audiences would marvel at the technology -- at the spectacle of it. An animated film, whether handmade or built inside a computer, was never intended to have the audience suspend its disbelief. The intended reaction has always been: "That looked real!"

But now, with performance capture technology, we're heading into a weirder, stranger area. It wants to be a kind of animation, but animation that is so hyper-real that we actually do suspend disbelief long enough to forget we're in an animated movie. The filmmakers don't want us to think it "looks real" but that it is real -- or at least as real as we ever think a movie can be.


They may very well succeed, but at what cost to artistry? Make no mistake: if "performance capture" becomes a preferred method of filmmaking, there will be no way to differentiate between films and filmmakers, because a director will be able to tinker with an image and a film until it comes out right. It didn't use to be that way. There was real pressure, because directors knew they had only a limited number of chances to get it right.

One director was distinguished from another by getting unique performances out of actors in the time allotted to get them. If they didn't succeed, the film suffered. If they didn't get the shot before the light ran out, the film suffered. If they couldn't get the crowd scene framed right, the film suffered. If they forgot to get coverage for a certain shot or a certain line of dialogue, then the film suffered.

Not to worry anymore.

There's an old and tired joke among filmmakers: If something doesn't go right, you can always fix it in post.

Soon, it seems, the entire process of creating a movie will be a kind of post-production. If you didn't get a line of dialogue just right two months ago, no matter: you can digitize a whole new shot -- a wonderful shot! -- of a long-departed actor without even leaving the editing room. We'll no longer be able to appreciate the beautiful precision that went into building something like the opening shot of Welles' "Touch of Evil" if it can be easily and seamlessly recreated in a machine simply by stitching digitized actors and pieces of data together.

There won't be any joy in it at all, in fact, and isn't that a problem?