Does the Hz on a TV actually make any difference? How many should I get (60, 120, 240)?

by Blogger ‎09-23-2010 10:46 AM - edited ‎11-01-2010 07:31 AM

A TV's Hertz rating comes down to how many frames per second it displays. This mainly matters on LCD/LED sets, since plasmas use 600Hz subfields to display a moving image. The reason 120Hz is sold as an advantage to customers is that the TV's processor interpolates extra frames, doubling the frame rate and creating a smoother transition between frames. This reduces motion blur (most visible on scrolling images).
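As a rough illustration of the idea (real TVs use motion-compensated interpolation, not the naive pixel blending sketched here), inserting one computed frame between each pair of source frames doubles the frame rate:

```python
def interpolate(frame_a, frame_b):
    """Naive linear blend of two frames (each a flat list of pixel values)."""
    return [(a + b) / 2 for a, b in zip(frame_a, frame_b)]

def double_frame_rate(frames):
    """Insert one blended frame between each pair of source frames."""
    out = []
    for a, b in zip(frames, frames[1:]):
        out.append(a)
        out.append(interpolate(a, b))
    out.append(frames[-1])
    return out

# 3 source frames of two "pixels" each -> 5 output frames
source = [[0, 0], [10, 20], [20, 40]]
print(double_frame_rate(source))
# [[0, 0], [5.0, 10.0], [10, 20], [15.0, 30.0], [20, 40]]
```

The blended in-between frames are what make motion look smoother; the motion-estimation step real TVs add is what keeps moving objects sharp instead of ghosted.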


One big functional advantage of a 120Hz refresh rate is that 120 is the least common multiple of film's 24fps and NTSC's 30fps. This allows very smooth film playback that is close to what the director intended (as long as that feature is activated on the Blu-ray player) without it looking awkward.
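The arithmetic behind that claim: because 120 divides evenly by both 24 and 30, every film frame can be held for exactly 5 refreshes and every NTSC frame for exactly 4, with no uneven repeat cadence (the 3:2 pulldown judder you get on a 60Hz set):

```python
from math import gcd

def lcm(a, b):
    """Least common multiple of two positive integers."""
    return a * b // gcd(a, b)

print(lcm(24, 30))   # 120
print(120 // 24)     # each 24fps film frame shown 5 times
print(120 // 30)     # each 30fps video frame shown 4 times
print(60 % 24)       # 60Hz does NOT divide evenly by 24, hence 3:2 pulldown
```

On a 60Hz panel, 24fps film has to alternate 3-refresh and 2-refresh frames, which is exactly the judder that even frame repetition at 120Hz avoids.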


240Hz is an effects feature that doubles the frame rate again, attempting to enhance motion even further. Personally, I cannot see the enhancement beyond 120Hz. The advantage I see in purchasing a TV with this feature is that the TV needs considerably more processing power to support it. If you turn the feature off, the TV can divert its processing power to motion handling itself, allowing it to create a moving image without dropping in resolution* (this usually looks better than a 120Hz image).


*An overloaded TV processor will drop the resolution of the moving image in order to maintain speed and fluidity.

By GSI1 Installer Nick Free

by n0xin Luminary
on ‎09-23-2010 12:11 PM

Awesome article. I was unaware that turning MotionFlow off on my 240Hz LG would result in more direct motion processing, without the interpolated frame rate and without degrading the resolution to compensate for speed. I'm going to try this today with video gaming to see what changes it makes visually, especially for FPS games, where blurring is something I'm trying to shy away from despite having 240Hz.

by Legendary Oracle
on ‎09-23-2010 11:39 PM

I prefer quicker refresh rates when I look at screens... yet when it comes to movies, I notice that enthusiasts prefer 24 frames per second (slower) over 30 frames per second. Can anyone explain why movies look better at a lower frame rate or Hz?

by infingity
on ‎09-24-2010 01:14 PM



Film buffs prefer 24 because the cameras Hollywood filmmakers use shoot 24fps with a 180-degree shutter (meaning light is allowed to pass through for exactly half of each frame interval). In the theatre, the film is played back using the same 24x180 system our brains have become accustomed to, producing the distinctively choppy "film look". TVs in North America use the NTSC system, playing back at 30fps with interlaced scanning. The motion is smoother, but the interlacing leaves those funny-looking scan lines behind on the screen.
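The 180-degree shutter figure works out to a concrete exposure time: the shutter is open for half of each frame interval, so at 24fps each frame is exposed for 1/48 of a second. A quick calculation:

```python
def exposure_time(fps, shutter_angle_deg):
    """Seconds of exposure per frame: the fraction of the frame
    interval (shutter angle / 360 degrees) that the shutter is open."""
    return (shutter_angle_deg / 360) / fps

# Classic cinema: 24fps with a 180-degree shutter -> 1/48 s per frame
print(exposure_time(24, 180))   # 0.0208333... (i.e. 1/48 s)
```

That 1/48s exposure produces the particular amount of motion blur per frame that, together with the 24fps cadence, makes up the familiar film look.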


Those scan lines, combined with the generally lower production values of television compared to film, have caused enough people's brains to associate the film look (slightly choppy) with higher production value than the smooth look. This preference persists at 30p or even 60p, where there isn't any interlacing. Although higher frame rates give cinematographers much more flexibility in filmmaking (such as the ability to pan faster than a snail's pace), society's preference for the film look is unlikely to change any time soon. In any case, it's always preferable to watch video at its original frame rate.