Originally Posted by manasecret
The explanation as I understand it:
First off, there is some confusion that 120Hz and the Auto Motion Plus stuff are the same thing. That is not true. The confusion arises because they both first appeared on the same TVs at the same time, and were often advertised together. They are in fact two separate things.
1. 120Hz:
First, consider that the previous standard for LCD refresh rates was 60Hz. Then consider the frame rates of all the typical sources these days -- at least here in North America, they boil down to either 24 frames per second, 30 fps, or 60 fps. As I remember it, 30 and 60 fps are typically TV sources, while 24 fps is typical of film on DVD and Blu-Ray.
The problem is that 60Hz is not evenly divisible by 24 (60/24 = 2.5). And since 24 fps is the standard for the highest quality source out there -- Blu-Ray movies -- it's rather important when it comes to HDTVs.
Since you can't just put up half a frame and expect people not to notice, you have to do some trickery. The math is a little hard to get my head around, but the essence of what happens is that some of the 24 frames get held on screen longer within the 60Hz cycle than the others (technically the technique is called 3:2 pulldown, part of the telecine process). This usually goes unnoticed, but it can be very obvious on long, slow pans, where it looks like the camera jerks along instead of making one smooth pan.
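If it helps to see the cadence laid out, here's a rough Python sketch of 3:2 pulldown. The 3/2 alternation is the standard pattern; the function name and the printing are just my own illustration, not anything a TV actually runs:

[code]
# Rough sketch: mapping 24 film frames onto a 60 Hz display with 3:2 pulldown.

def pulldown_schedule(num_film_frames=24):
    """Return how many 60 Hz refreshes each film frame is held for."""
    schedule = []
    for i in range(num_film_frames):
        # Alternate: even-numbered frames held for 3 refreshes, odd for 2
        holds = 3 if i % 2 == 0 else 2
        schedule.append(holds)
    return schedule

sched = pulldown_schedule()
print(sched)       # [3, 2, 3, 2, ...] -- some frames sit on screen 50% longer
print(sum(sched))  # 60 -- 24 film frames fill exactly one second of 60 Hz refreshes
[/code]

The uneven 3-2-3-2 hold times are exactly what shows up as judder on those slow pans.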
So basically the complaint comes down to you're not getting the original source to play as it was meant to be played. The source has to be changed to make it work on 60Hz. And the change can introduce some very noticeable problems into the film.
You may already be ahead of me at this point, but now think about 120Hz. It is equally divisible by 30, 60, and 24. So every frame is simply doubled (120Hz/60Hz = 2), quadrupled (120Hz/30Hz = 4), or quintupled (120Hz/24Hz = 5). There is no changing the source to make it fit into the LCD's cycle.
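To make the arithmetic concrete, here's a trivial snippet (purely illustrative) showing the whole-number repeat counts:

[code]
# How many times each source frame is repeated on a 120 Hz panel,
# versus the fractional refreshes a 60 Hz panel would need.
for fps in (24, 30, 60):
    print(f"{fps} fps: 120 Hz repeats each frame {120 // fps}x (remainder {120 % fps}); "
          f"60 Hz would need {60 / fps} refreshes per frame")
# 24 fps -> 5x at 120 Hz, but 2.5 refreshes at 60 Hz (hence the pulldown trickery)
[/code]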
So, that is all 120Hz does. It plays every 24, 30, and 60 fps source exactly as it was meant to be played.
(Note that some will say that 120Hz is also simply better than 60Hz because it reduces motion blur and makes motion appear smoother -- the same idea as with games, where 60 fps looks better than 30 fps. While that would be true if the source material actually went up to 120 fps, I think it's entirely bullshit when the source can only go as high as 60 fps. Watching a 60 fps source on a 60Hz cycle should look exactly the same as watching it on a 120Hz cycle, because the source is still only 60 fps.)
2. Auto Motion Plus (or equivalent)
My understanding of Auto Motion Plus is that it uses an algorithm to interpolate between two frames of the source material, estimating what an in-between frame would have looked like if it had been captured in the original, and then shows those extra frames. It's like taking a 24 fps film and using an algorithm to fill in the frames between each of the 24 to make it a 120 fps source. (That's a rough description -- see the sketch below.)
Think about it. With a 24 fps source and a 120Hz cycle, you are now showing each frame five times in that 120Hz cycle. That's a lot of extra time in there that you can do some fancy stuff in.
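Here's a very rough Python sketch of the idea, assuming a naive blend between neighbouring frames. Real TVs use motion-vector estimation rather than a simple crossfade, and the function name and the 24-to-120 ratio here are just my own illustration:

[code]
import numpy as np

def interpolate_to_120hz(frames_24fps):
    """frames_24fps: list of HxWx3 image arrays. Returns 5x as many frames."""
    out = []
    for a, b in zip(frames_24fps, frames_24fps[1:]):
        for step in range(5):       # 5 output slots per source frame at 120 Hz
            t = step / 5.0          # blend factor between frame a and frame b
            out.append((1 - t) * a + t * b)
    out.extend([frames_24fps[-1]] * 5)  # hold the final frame
    return out
[/code]

Four of every five output frames are made up by the TV -- which is both where the extra smoothness comes from and where the artifacts can creep in.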
The upside is that it smooths motion and brings out detail that you wouldn't see without it. The downside is that you're making up frames that aren't there, and as with any algorithm it can introduce very noticeable artifacts into the image. But on the good TVs the artifacts are hardly ever seen.
The other downside (for some, anyway) is that you're not seeing the director's original intent. The director only wanted you to see the 24 fps, not what the TV interpolates in between. While this premise is arguable to begin with, I'm on the side that prefers to see the extra detail when, in my opinion, it looks that good.