The Hobbit 48 FPS Preview Divides Audiences at CinemaCon

The Hobbit 48 fps Trailer

Unveiling 10 minutes of Hobbit footage in 3-D at the revolutionary frame rate of 48 frames per second (vs. the standard 24 fps), as Warner Bros. did Tuesday at CinemaCon, should have been the first big buzz moment for Peter Jackson's return to Middle-earth. The immediate reaction to the presentation, however, was anything but good news for the studio or for proponents of the cutting-edge high frame rate technology that Jackson and folks like James Cameron and Douglas Trumbull have been championing as the future of film. Instead, it left members of the blogger corps calling it "jarring," "non-cinematic," and "like a made-for-television BBC movie," and predicting that audiences will be split over the brave new advance.

The footage, preceded by a taped introduction from Jackson, opened strong: aerial sequences whisking over wide landscape shots, in the style of an IMAX nature doc, seemed to prompt unanimous praise. Then came the character footage, which told another story: at the increased frame rate, Jackson's 48-fps scenes were reportedly almost too realistic, approximating what many compared to an HDTV or soap-opera look.

Badass Digest's Devin Faraci described the effect thusly:

...Here's what The Hobbit looked like to me: a hi-def version of the 1970s I, Claudius. It is drenched in a TV-like - specifically 70s era BBC - video look. People on Twitter have asked if it has that soap opera look you get from badly calibrated TVs at Best Buy, and the answer is an emphatic YES.

Slashfilm's Peter Sciretta concurred:

The movement of the actors looked… strange. Almost as if the performances had been partly sped up. But the dialogue matched the movement of the lips, so it wasn’t an effect of speed-ramping... It didn’t look cinematic.

Variety's Josh Dickey was a bit more reserved in his reaction:

48 fps has an immediacy that is almost jarring. And lighting it just right will be a learning process, as 3D was and still is...48 fps also, unfortunately, looks a bit like television. But it does bring 3D to a different level.

And The L.A. Times' Amy Kaufman called the feel of 48fps "hyper-realistic," quoting one anonymous projectionist in attendance ("It was too accurate -- too clear") as well as an unnamed film buyer who wasn't quite ready to discount The Hobbit's playability:

"The question is if people want to watch movies that really look real or not. I was expecting a subtle difference, but this was dramatic," he said. "Might that work against a narrative? I don't know. But I'm not going to judge it based on 10 minutes."

Stay tuned for more from the Hobbit camp as Warner Bros. and Jackson regroup from the CinemaCon blow, and in the meantime sound off below: Does 48 fps still seem like the future of cinema?


  • Max Renn says:

    This was my concern regarding the choice of 48 fps; that it would look like that weird effect LCD TV motion smoothing has on TV shows and movies. Some non-techy people like it and say it looks better, while others like myself find it strange to watch and very unlike film (I usually have to turn the effect off or at least put it on the lowest setting). The Hobbit may very well be a divisive movie event for audiences.

  • I saw a film at Disney World way back in 1978 that had been shot at 48fps. It was in Fantasyland as a special attraction at that time and I cannot remember its title at all. It had lots of nature footage.

    I recall being quite taken aback by it. It was TOO realistic looking. The film did 'star' a few people, and when they were in conversation, I don't know how to explain it, but everything seemed... slow motion. I THINK because the brain is having to process so much more information.

    Although it LOOKED great, almost 3D, it was entirely too realistic and because of the 'time slowing' effect, I could not ever imagine seeing a real narrative movie in this format.

    Tl;dr, I had a feeling once people saw this high frame rate, it would not go over very well.

    • Steve says:

      48fps? It will never catch on!

      Just like talkies, films in color, and HDTV never caught on either.

      • Baco Noir says:

        Thank you, Steve. You beat me to it. Exactly. Talkies. Sheesh. Never catch on.

        • j'accuse! says:

          Dude, it's not remotely the same. Adding color or sound still allows for that dreamlike, suspended-disbelief effect that 24fps allows. 48fps? Hate it. Have seen a lot of it and still hate it. This makes me sad, because I wanted to see this movie. Guess I can save my money.

      • Jake says:

        Yes, just like Smell-O-Vision? And 3D? Boy, that really caught on when it disappeared for 50 years. Now it's back and guess what? Most people hate it again. And let's be realistic. Very few REAL people thought talkies weren't going to stick around. Only some stubborn industry people were against it. How many people are refusing to buy the latest HDTVs with refresh rates above 120? The manufacturers have all been backing off that technology in the past few months. Listen to the people. Real people loved talkies, films in color, and HDTV. They are not going to love the unnatural look of 48fps.

        And HDTV? Why did you include that? No one thought that was going to fail. All you had to do was look at it once and you were hooked.

        Sorry, but your logic is stupid. 48 fps is going to fail because it looks unnatural because the motion blur is not like the normal human eye. Crazy as it seems, 24 fps (captured with a shutter speed of 1/48th of a second) is actually very close to what we view with our eyes. So shooting 48 fps (with a normal shutter speed of 1/96th) is totally off-putting. People will reject this technology. (except ignorant sycophants who fail to think for themselves)

        • Bob says:

          Besides the fact that our eye sees close to 60fps?

          • Jake says:

            Not entirely true. The eye sends information to the brain somewhere between every 1/48th of a second to 1/60th of a second. Which, surprise, surprise, is the same as shooting 24 fps to 30 fps. These frame rates have a shutter speed of 1/48 and 1/60 respectively so the motion and motion blur is very similar to our own eyes. Shooting at 48fps has a shutter speed of 1/96th. Which I clearly stated above. That is why the motion captured at 48 fps looks UNNATURAL. Because it's not what our eyes do.

            Bob, you should learn to read better.

            And beyond that, all it takes is a simple look test. This stuff clearly looked bad. Let's hope the movie is good though. Luckily, based on the reaction, I won't have to see it in 48 fps because the exhibitors aren't going to pay to upgrade their projectors to have it look like a daytime soap opera. Also, lucky for me, where I live, theater owners realized that most people hate 3D, so when a big movie comes out they will have only 1 screen show 3D while the other 4 to 6 screens are in 2D. I feel bad for people in LA where it's harder to find 2D versions (though not impossible).

          • Per Lichtman says:

            I thought you and Jake might want to take a look at this document rather than arguing with each other. Among other things, it addresses how complicated the question is. 🙂


        • harold says:

          Per Lichtman nailed it with that page he linked to. The idea that "The eye sends information to the brain somewhere between every 1/48th of a second to 1/60th of a second" is just not true. 48fps is not going to fail because it "looks unnatural". Quite the opposite: it looks much more realistic than film. Which is why people were amazed by the aerial nature shots from Jackson's footage, but as soon as we see a couple of guys in makeup and wigs on a set pretending to be hobbits, it kicks everybody right out of the movie.

          Basically, whenever anything moves at a decent speed in 24fps film you get a lot of blur. It's not necessarily obvious until you compare it side by side with 48fps or higher. 24fps feels like you are being taken to another world, whereas hi-def 48fps feels like you are right there on set. And unless people are willing to adjust to that, it's potentially quite at odds with the sort of "movie magic" that helps immerse you in Middle-earth or any other fictional world.

          • I don't agree, Harold, but I liked the 100fps page a lot. I think you missed the phrase "maybe you need 4000 fps, maybe less, maybe more" at the bottom of the page. I think the reality doesn't set in for all people at 48fps, maybe some, but to the rest of us it doesn't look real, it looks fake. From the info on the 100fps page, maybe a frame rate of 48 is too slow to make it blur properly, like real life. The "soap opera look" sickens me for sure. I saw a demo of Sony's $25,000 4K projector at CES this past January with exactly the same problem. So it's not limited to Best Buy TVs...

  • jdempcy says:

    I bought a TV last year after my previous flatscreen died and was surprised that _there was no way to disable 120hz upscaling_ or whatever they call it, when playing movies over the network. (If I watch a movie on my PS3 then I can disable the weird upscaling effect). Anyway, after months and months, I thought I would never get accustomed to that effect but I finally got used to it! It used to annoy the hell out of me, making everything look "soapy" and whatnot. But literally something in my brain changed and I just don't see it that way any more. When friends come over and I'm watching something upscaled they say "wow it looks like crap, all soap opera" and whatnot, but I literally don't even notice any more until they point it out, then I have to kind of stop and closely pay attention to notice what they're talking about. But I _do_ notice watching non-upscaled 24FPS films now because they look .. almost choppy, like a strobe light effect! I'm quite interested to see if movie viewers as a whole adapt to this change or not.

    • McRib says:

      I feel much the same as you on the subject. I'm willing to give it a chance, and personally I think the complainers will just have to deal with it. People can get used to anything; there's nothing "more realistic" about 24fps; it's just the best standard that was possible with the technology of the time.

    • kevin says:

      same i hated it now its normal to me

  • MOVIOLA says:

    To Jake: What you say is sort of right. The Hobbit was shot with a 1/64 shutter speed. They did this as a compromise because they still had to make 24fps versions of the film for theatres and Blu-ray; Blu-ray can't handle 48fps, and that is the format most people will actually see it in. At 24fps, if you shoot below 1/48, the image looks like daytime soaps from the '70s. When you shoot above it, you get something like the "Saving Private Ryan" point-of-view effect (that was 24fps @ 1/96). If they had shot 48fps at 1/96, then when they dropped frames for the 24fps version, The Hobbit would look jerky like that SPR stuff. They should have found a way to shoot two streams, via beam splitter or something. So the 48fps version is going to look like bad TV and the 24fps version is going to look like a strange, less dreamlike movie, but probably the better of the two. 48fps at 1/96 might actually be different but not bad, like the video-game look everyone compares it all to. It might not be bad, but of course they would have to spend more on lights, makeup, effects, sets, etc. to shoot at that rate.
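
    The shutter arithmetic being argued over in this thread is easy to check. A minimal sketch (Python; the helper names are hypothetical, and the figures come from the comments above) relating frame rate, shutter angle, and per-frame exposure time:

```python
def exposure_time(fps, shutter_angle_deg=180.0):
    """Exposure per frame in seconds: (angle / 360) / fps."""
    return (shutter_angle_deg / 360.0) / fps

def shutter_angle(fps, exposure_s):
    """Shutter angle implied by a given exposure time at a frame rate."""
    return 360.0 * exposure_s * fps

# Classic film look: 24 fps with a 180-degree shutter -> 1/48 s exposure
print(exposure_time(24))          # 0.020833... (1/48)
# The Hobbit, per the comment above: 48 fps at 1/64 s -> 270-degree shutter
print(shutter_angle(48, 1 / 64))  # 270.0
# A "normal" 180-degree shutter at 48 fps -> 1/96 s exposure
print(exposure_time(48))          # 0.010416... (1/96)
```

    In other words, shooting 48fps at 1/64 (a wider-than-normal 270-degree shutter) keeps each frame's motion blur closer to the familiar 24fps look than a 180-degree 1/96 shutter would.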

    • Jake says:

      Thanks for the info. I just get tired of people bowing down to Cameron and Jackson when the look doesn't pass the smell test.

      At the end of the day, what's most important is that Jackson made a good movie. Fingers crossed on that.

      • j'accuse! says:

        Jake is right all over this comments section. I will only try 48fps if they also bring back Percepto! so I can get a totally immersive experience.

  • Stoy says:

    48 fps may look too realistic, but 3D in 24 fps is a pain. The jerkiness and the abundance of motion blur give the eyes a hard time. And 3D is already breaking the cinema illusion, so it is not a big deal going to higher rates.
    Here is a balanced piece that I largely agree with:

    • Max Renn says:

      Clearly it IS a big deal going to higher frame rates. The real test will be in December.

  • Ed says:

    48fps is an entirely new experience, which means that the methods for filming, lighting, panning, angles, etc need to be tweaked.

    I haven't seen the clip, but if the filming methods were the same as on a normal 24fps film, then it is like trying to make a color movie using the same lighting, angles, etc. that worked in a black-and-white film.

    The new frame rate may well be an inevitable part of the movie's evolution, but it still needs more work to look right, in the same way that computer-generated special effects needed work to look right. Our eyes can all tell when a scene has bad special effects, usually unconsciously, through lighting, panning, and angles that haven't been fully tweaked. The movie industry took years to get this right.

    48fps right now seems like special effects did in the era of '70s/'80s films, when they were the new thing... entertaining, though with that subtle, awkward feeling of doubt about whether it looks real.

  • filmfan says:

    It's interesting that back in the late '90s someone tried to get a 48 fps movie system launched. I can't think of his name right off, but he was the film editor for The Fugitive, starring Harrison Ford. He called the process Maxivision and it worked by shooting 3-perf 35mm film frames at 48 fps and then projecting them back at the same speed. I believe Roger Ebert saw a demonstration of this and was impressed by it. He also had designs for improving standard 24 fps film projection by using some really high-tech projectors that would eliminate film weave in the gate and thereby improve picture sharpness.

    The one thing that some movie makers are either forgetting or ignoring is that cinema works because at 24 fps movies look "real" but NOT "live". Because the movie does not look live, it allows the viewer to buy into the make-believe, or fantasy, playing on the screen. "Live" looks ordinary, but the cinema or movie look makes things seem more important somehow. Plus, actors look better in a movie than in real life because they are not presented in super-detailed images. Motion picture film presents a pseudo-sharpness: the image recorded might not show every sweat pore or wrinkle on the performers' faces, but when the print is made from the negative the contrast is boosted a bit and the resulting image appears sharp. Therefore it has a beautifying effect on the actors, costumes, and sets.

  • anonymous says:

    There is some very misguided, utter nonsense being thrown around in this thread, as if it were fact.

    Clearly certain people do not understand this topic as much as they think they do.

    Video terminology being confused with film terminology, shutter speeds being confused with shutter angles, refresh rates being confused with discrete frame rates, etc.

    The notion that 48fps looks "unnatural".

    24fps looks unnatural, but it is a technical compromise that we have become accustomed to as an aesthetic (captured with a 180 degree shutter angle).

    48fps looks shockingly more "natural" and will probably take some getting used to.

    60fps looks indistinguishable from reality.

    Now it may be the case that a fantasy movie, that inherently showcases prosthetic makeup and hair, costumes and sets, as well as other vfx, may not be the best way to convince an audience of the radical improvement in image quality.

    It is going to reveal any flaws in the craft, that would otherwise be lost in motion blur, and potentially make the fantasy a lot less convincing.

    And so there may well be a learning curve, as to which are the most suitable genres for higher frame rates, as well as advances in makeups etc.

    48fps is just the first step - a necessary compromise, as it easily facilitates conversion (a simple software prompt) for existing cinema projectors, as well as converting to 24fps for Blu-ray.

    Currently, this has the advantage of drastically reducing motion blur on a per-frame basis, and reducing projector flicker.

    This is great news for 3D presentations: not only does motion blur carry no three-dimensional information, but the inherent nature of 3D's alternating left-eye/right-eye presentation (144 "frames" per second, alternating between left and right eyes, 72 "frames" each) means each frame has to be repeated more times at 24fps, which results in strobing and blurring.

    The big limiting factor is still that projector bulbs are expensive, and theatre managers never like to run them at full brightness, even on 2D presentations.

    With 3D presentation, the light has to travel through the polarising filter on the projector, and then through the polarising glasses that you wear.

    The projectionist ought to increase the lamp's brightness to compensate for this (though they are never allowed to these days).

    Instead, where the screen should be reflecting light at a brightness of 15 foot-lamberts, it is usually down to about 2 foot-lamberts.

    The big advantage of higher frame rates, besides the elimination of motion blur, is that it enables the image to be projected brighter (which enables greater contrast range, and more vividly realistic colours) without revealing projector flicker.

    The plan at the moment is that by the time Avatar 2 is released (shooting at 60fps, as per Showscan), cinemas will be willing to convert existing projectors further (remember, they are already capable of displaying 144fps for 3D).

    Douglas Trumbull is currently developing "HyperCinema" which will run at 120 discrete frames per second - that is 60 frames for each eye, being flashed only once, rather than repeated as is currently necessary - onto a high gain Torus screen that yields an image at 30 foot-lamberts AFTER polarisation filtering at both the projector and glasses.

    The point is, that it will seem "real", possibly even hyperreal.

    It is not intended to become the standard approach to all cinema, and will probably generate future debate about which seems "natural":

    24fps with its flicker and motion blur, or 60/120fps with its uncanny window like quality.
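
    The flash-rate figures cited above multiply out straightforwardly. A minimal sketch (Python; the helper name is hypothetical) of how per-eye frame rates combine in alternating-eye 3D projection:

```python
def total_flashes(fps_per_eye, flashes_per_frame, eyes=2):
    # Each eye's frames are flashed several times per frame,
    # and the projector alternates between the two eyes.
    return fps_per_eye * flashes_per_frame * eyes

# Triple-flashed 24 fps 3D: 24 * 3 = 72 flashes per eye, 144 total
print(total_flashes(24, 3))  # 144
# Trumbull-style 60 fps per eye, each frame flashed once: 120 discrete frames/s
print(total_flashes(60, 1))  # 120
```

    The point being made is that today's 3D projectors already push 144 flashes per second, so 120 discrete frames per second is within reach of the same hardware.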

    • Jacob says:

      Understood, but if it doesn't make the movie more enjoyable it is simply a pointless technical exercise.

  • anonymous says:

    If this looks the way people are saying it does, like a television soap opera, I'm not going to like it or think it fits a fantasy movie like The Hobbit. Sure, you can get used to that look. I once watched a show that changed the way it was shot, and I got more or less used to it, but I never thought it was better. When I watched old clips, it still looked much better and more fitting to the type of show it was, and I wished they hadn't changed it. Hollywood seems obsessed with the notion that newer technology is always superior. It's not.

  • biggles says:

    dunno why, but whenever i see video on high fps, i just hate it - always have done and always will. it reveals all the flaws and imperfections in the world, and watching a film is one of the ways i can ignore all the flaws and imperfections in the world.

  • Steven McKay says:

    Seems to me like the main problem here is the fact "The Hobbit" is a fantasy movie. It would work perfectly well as a cartoon. I WANT to see unrealistic scenes, that's what fantasy IS! Unless they are going to somehow create REAL Hobbits and dragons, instead of fake ones with make-up and rubber feet etc, I don't understand why a film-maker would want to make it look realistic.

  • marcericjo says:

    OK, so if we suppose 24fps is a perfect mathematical subset of the actual frame rate our eyes see, it would appear sensible that an exact doubling of the frame rate would have the same effect, with improved detail.

    What if 24fps works because it is fairly close to the perfect mathematical subset, and when we double the frame rate, the brain somehow can't correct for the error anymore, so it starts saying "this isn't real"?

    Maybe we need to go far faster to truly feel that this is reality, so that all who view like it. There is no question that detail is good, but it isn't the detail I complain about in high frame rate presentations I've seen. It's something empty, flat, unreal. Somehow my brain is complaining (or warning) that this isn't right.

    I think we should wait for Trumbull's 100FPS experience. It might be that we finally get both effects we want. Lots more detail, and no warnings from the brain that this just looks wrong...

    • anonymous says:

      You're over thinking this issue, and putting the cart before the horse.

      There was never some grand scientific study in persistence of vision, made by neuroscientists at the behest of the world's filmmakers, that determined "24fps is the 'frame rate' the human eye sees at" (that's not even remotely accurate, as it's nothing more than an analogy).

      It's not even the case that 24fps was chosen because it "fools the eye" into thinking it is seeing fluid, continuous movement. At the time 24fps was established, cinema audiences were perfectly happy being transported away by films presented anywhere between 12fps and 18fps.

      24fps was standardised for one simple reason: the optical soundtrack.

      When "talkies" became all the rage, synchronous sound became of paramount importance. That meant a projector with a fixed-speed motor was fundamental: a film cranked too fast in order to squeeze extra screenings into a day would make the actors sound like chipmunks, and dual-system sound might drift out of sync.

      Furthermore, the supposed "standard" speed of silent films (16fps) could not deliver adequate audio fidelity or bandwidth (dialogue loses intelligibility, and a sustained sung note, or a held note on a piano or violin, sounds hideously warbled), and so the speed was increased.

      24fps was a good technical compromise because it yielded adequate sound fidelity, bandwidth, dynamic range, and most importantly it did not render filmmaking cripplingly expensive. Another deciding factor in 24fps standardisation (over 23fps or 25fps) is that it is mathematically convenient, in that 24 whole frames = 1 second, 12 whole frames = 1/2 second, 6 whole frames = 1/4 second, 3 whole frames = 1/8 second.

      Were it not for the fact that film stock is expensive, developing film on a "per-foot" basis is expensive, theatrical prints are expensive, distribution (on a per-kilo basis) is expensive, storage (by volume) is expensive, and film is already bulky (think about increased bulk, and cost for re-tooling cameras / developers / flatbed editors / projectors etc), it might have been the case that 32fps, or 48fps, or higher might have been chosen as the standard. But economic and technical limitations prohibited that, and 24 fps adequately achieved the goal of good quality sync sound.

      The fact that 24fps also improved picture quality, was simply an added benefit. A side effect. It was never the goal.

      The notion that 24fps looks "real" is laughable.

      24fps looks "cinematic", and it's part of the language of film that we have become accustomed to, that's all.

      24fps is not going away, and high frame rates are not intended to become the new standard, or the way in which all future films are made - though it may be the case that as audiences become accustomed to that look, that's what they expect, and that's what film makers will actually want to deliver.

      Everybody relax.

      • marcericjo says:

        I think you completely missed my point. I certainly never claimed there had been some grand study; it seems from the information on the 100fps page that no one really knows. I didn't claim that 24fps was real, just that when I view stuff at higher frame rates it bothers me, and reminds me of my original reaction when watching TV in the '70s: I hated the way non-filmed shows looked. I speculated that perhaps all of us who dislike 48fps would actually like a faster frame rate, BECAUSE at 48fps there might be a mismatch with our brain that doesn't happen with 24fps...

        • anonymous says:

          No, I 100% got your point, it's just that you're speculating without understanding.

          I really don't mean that to sound patronising, and I'm actually trying to help everyone on this thread have a better understanding of this subject.

          The language that you, and others, are using in your posts presupposes that there is a psychological, or cognitive reason, in terms of visual acuity, that 24fps was standardised - be it scientifically understood at the time, or not.

          There is nothing magical about 24fps, or "in-sync" with our brains, it's just a matter of becoming conditioned to it as looking "filmic" or "cinematic".

          As I said, it was established for primarily mechanical reasons (sync-sound fidelity), and not psychological reasons (persistence of vision).

          The reason TV shows ('70s or otherwise) look different from film has nothing to do with the fact that NTSC ("Never The Same Colour", as we used to mock) has a higher frame rate of 30fps, compared to film at 24fps - just look at the difference between PAL (25fps) and NTSC...

          I wasn't singling you out, and much of my prior post was in relation to earlier statements by others, particularly in terms of 24fps looking "real".

          I'm not saying 48fps (or 60fps for that matter) is "better" than 24fps, because it depends entirely on what you are trying to achieve, and what your criteria for "better" are - ultimately it's a different look, but also a different feel, certainly in terms of the "immersive" effect it imbues (which is only suitable for certain types of movie).

          I can tell you now, that if you like 24fps, and hate 48fps because it seems odd, video-like or "soap-opera-ish", or too "real" - you are going to absolutely hate 60fps and 120fps because they could not look further removed from "film".

          There is no "mismatch" with our brains that 60fps, or 120fps will re-align. Instead you will be confronted with an experience much like looking out of a window, rather than at a screen.

          Think about how strange the reality of that will look, in terms of a modern action movie, with its big close-ups, swooping crane shots, and rapid editing.

          If you're prepared to accept that movies will continue to look the way we have been conditioned to accept, as they will continue to be shot at 24fps, and that "Immersive Cinema" (using higher frame rates) is a different beast, that happens to be made in largely the same way as conventional movies, and exhibited in largely the same venues, I think you'll be fine with the whole situation.

          It's really all down to conditioning, and in a generation's time (assuming HFR takes off), the kids who have grown up watching ALL of their media content in HFR, are going to watch 24fps movies thinking they look strange - perhaps not as odd as when we watch Keystone Cops, Harold Lloyd, or Buster Keaton movies, but maybe something like the difference in 50's & 60's film stock compared to 80's film stock.

          Hope that clears things up.

          • marcericjo says:

            I beg to disagree. I believe the discomfort I felt when watching '70s video vs. film on TV is the same discomfort I feel when watching 4K video demos. Since color and detail are both visibly improved with 4K and other high-frame-rate interpolators, it has nothing to do with NTSC "never the same color twice." I think, as do many others here, that it's related to frame rate. I'd like to see some 100 fps demos; perhaps I'll like it. Your argument is that the difference is simply conditioning. I don't buy that. I don't think the dissonance is caused by it "looking too real". I think there is something else at work. Time will tell.

          • Jacob says:

            You seem to be quite emotionally invested in this technology. The point you are missing is people are not enjoying the experience. Hope that clears things up.

  • marcericjo says:

    Once again, I (and I think everyone here) understand that 24fps was not created on the basis of scientific analysis of brain/eye perception. However, we feel the research doesn't appear to have been done yet. You certainly haven't quoted any. You simply say that you KNOW. How do you know? Where is the scientific study of brain/eye perception that convinced you?

  • gary says:

    Perhaps 48 fps seems unnatural because it changes our ALPHA/BETA BRAIN WAVE PATTERNS.
    Maybe 24 fps does induce a waking, dreamlike state in the brain, leading to the "suspension of disbelief" necessary for a good cinema experience.

  • Tim says:

    I am not sure why everyone is so up in arms about embracing this technology. I have been watching movies for 2 years cranked up on a 120Hz TV, and I have to say that I can't go back. I maxed out the 120Hz and the judder reduction and watched Avatar rendered in crystal-clear Blu-ray at 1080p, and that scene of the helicopter flying down the waterfall, where you can see the individual drops of water as the camera pans, is unparalleled clarity. Now, yes, I agree that it gives it a home-video look, but honestly, after you watch about 10 movies and become acclimated to the clarity and smoothness, you'll find that you don't even notice the home-video feel anymore, while becoming spoiled by the clarity of fast-moving scenes. Going back to 24 frames per second, however, is almost painful: you feel like you are chasing the scene, and it is staying ahead of you, giving you a blurry rendition and leaving you longing for the smoothness you have become accustomed to.

    I have become so used to the appearance of 60fps that when I saw The Hobbit an hour ago, I thought the scenes were still too blurry, even at 48 fps. I am no expert, and please correct me if I am wrong, those of you with technical insight in this field, but if the movie is shot and played back at 48fps versus 24fps, that is less of a jump when converting to the appearance of 60fps on a 120Hz HDTV, so I would think it would look even better than the original trilogy watched at 24fps in 1080p dialed up to 60fps by the 120Hz set. I say give it about 10 movies and see if you even notice the "TV" look. And afterwards, try to watch a traditional 24fps movie and see if you can sit through it without missing your 60fps. My challenge to you 😉

    Also, for those of you worried about the loss of the fantasy "feel" in The Hobbit due to the lifelike look: don't! There is, however, one caveat: I recommend watching a few movies with the 120Hz and judder reduction cranked up so you can get used to it and don't ruin the experience the movie can give you. That being said, Jackson's team must have done an amazing job editing the CGI in the film with the real actor shots, because I never lost the feeling that I was immersed in Middle-earth.

  • marcericjo says:

    Hmm, Tim, you might have a point, certainly worth the ten film test you recommend. I'll try it for sure and report back.

    Last Friday night, I watched the Hobbit in the HFR version and essentially complained (internally) throughout the film. However, I'm no longer sure that the frame rate was the cause or perhaps not the sole cause of my unhappiness.

    I'm an amateur landscape photographer and do a fair amount of work with Photoshop and other image-editing applications. In landscape photography, the "dynamic range" of the image is paramount in almost every image we work on. The human eye can see an astonishingly large range: in one view of a landscape brightly lit by the sun, the naked eye can detect detail in the brightest contours of the fluffy white clouds and still see detail in the shadows of the scene. So when a landscape photographer says an image looks "flat" or "lifeless", we are typically describing an image with a very limited light palette, with not much detail in the bright areas ("blown highlights") and/or not much detail in the shadows ("blocked shadows").

    In scene after scene of The Hobbit I saw this flatness. In the interior scenes, and in the flying-camera shots over the mountain landscapes, I saw well-exposed mid-tones without shadow detail and without highlight detail. Flat; no drama to anything. I wondered if this is an artifact of the polarizing glasses used to produce the 3D effect. When I took the glasses off, the image was quite a bit brighter, and it was completely obvious that the highlights of the on-screen image were completely blown out, losing all detail. This means that in the process of creating the 3D version, substantial parts of the image were lost, compressing the dynamic range. I'm sure the original in-camera image was not exposed that badly.

    I will go to see the 2D version and check out the difference. I'm sure that everything will look much better.

    As for the detail of the HFR version, I found it very pleasing most of the time. I don't think the theory of too much sharpness holds any weight, in my view. Though I wish the beard makers had made beards as good as Gandalf's for the dwarves: they could go through any kind of battle and emerge with perfectly combed beards, not a hair out of place. That's more of a creative beef, though. Gandalf's beard looked good, lots of little tangles in it...

    Maybe Peter Jackson should take a look at HFR without 3D, it might look much better...

  • Try this. Hold out your hand, with your finger pointing up, against a black or high-contrast background. Now shake or wave your finger back and forth like a 'naughty-naughty' gesture. Notice the blur.

    I see my finger become a smeared arc of blur, with ghostly finger images on each end.

    My point is that we see blurs, and 24fps looks more like this phenomenon to me than this new stuff.
