Manufacturers have added "smart" picture processing that inserts frames between the actual movie film frames (film runs at 24 fps), and then things get confusing because TV uses a roughly 30 fps display format. TV is actually a 60 Hz based system, but each frame is interlaced: the 60 fields per second each carry half the lines (odd or even), so you only get 30 complete frames. Newer displays are 1080p, the "p" meaning progressive; there is no interlace, just one full frame at a time, and the info in each frame is different. The smart functions delay the display until the frames ahead of and behind can be compared, and the inserted frames are computed to fill in the motion without distorting or over-enhancing the picture. 60 Hz is the normal or assumed minimum; 120 Hz and 240 Hz sets have these smart settings, and they don't work on all program sources. 120 Hz is sufficient unless it's a projector, where 240 Hz is best.
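To make the two steps above concrete, here's a minimal sketch (purely illustrative, with toy numbers standing in for frames): first the classic 3:2 pulldown that repeats 24 fps film frames to fill a 60 Hz cadence, then the kind of in-between frame a "smart" mode synthesizes. Real sets use motion estimation, not a plain blend; the linear blend here just stands in for that idea.

```python
# 3:2 pulldown: each pair of 24 fps film frames becomes 5 video fields
# at 60 Hz, so frames repeat in a 3, 2, 3, 2... cadence.
def pulldown_3_2(frames):
    out = []
    for i, frame in enumerate(frames):
        out.extend([frame] * (3 if i % 2 == 0 else 2))
    return out

# Motion interpolation (the "soap opera" path): synthesize a brand-new
# in-between frame from the neighbours ahead of and behind it.
# t = 0.5 means halfway between the two originals.
def interpolate(a, b, t):
    return a + t * (b - a)

film = [0, 24, 48, 72]              # toy "frames": just brightness numbers
print(pulldown_3_2(film))           # 4 film frames -> 10 fields of 60 Hz video
print(interpolate(0, 24, 0.5))      # one inserted frame midway between two originals
```

The pulldown output repeats existing frames (no new picture information, but a slightly uneven cadence); the interpolated value is information the film never contained, which is why the result looks different from the 24p original.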
Can someone offer some delineation between the "24 frame rate" setting and the 120 Hz / 240 Hz / 480 Hz frame-rate enhancement processing? I understand that the 24-frame setting allows 24 fps media to be matched exactly, and that the other settings (Motionflow on my Bravia) anticipate and seem to add frames. I'm missing something elementary here, because it seems like one setting is reducing the frame rate to 24 (from 30, I think) while the other is trying to add more frames for clarity (thus giving the soap opera effect). How are they different, and what am I missing? It seems like we're going in opposite directions here.
