hubsi Wrote:On my 100 Hertz Samsung F86, I'm experiencing jittering when I set my HTPC to 60 Hertz, regardless of whether I'm using XBMC or MediaPortal with the Cyberlink codec. When I set the HTPC to 24 Hertz, everything is smooth. Unfortunately, playback at 24 Hertz draws significantly more CPU power, which is probably why the audio is out of sync (just guessing). With Beta 2, however, the delay is constant, so I was able to even it out by delaying the audio. I'm also getting a couple of frame drops. In summary, I'm now able to watch 24 Hertz material smoothly and with synced audio. This is probably a result of the dual-core support in Beta 2.
Maybe a quad core would eliminate the need for the manual audio delay and the dropped frames. Any opinions on that?
My Setup:
AMD 64 x2 5200+
GeForce 8600GT
XP MCE 2005 SP2
Nvidia driver settings:
Threaded optimization: off
Multi display/mixed GPU acceleration: single display performance mode
Triple buffering: on
The way I see it, it's as follows: the source is running at 24 Hertz. The flat-screen panel is running at e.g. 100 Hertz (or at 60 or 120 Hertz). So upscaling is definitely required. We have two options:
1) Let the software (codec) do the upscaling by setting your HTPC to 100 Hertz
2) Let the TV do the upscaling by setting your HTPC to 24 Hertz
I claim that you cannot change the refresh rate of your flat screen. You can only change the refresh rate of the input, and if it doesn't match the panel's refresh rate, the TV scales up. Correct me if I'm wrong.
If I'm right, then using intermediate values between 24 Hertz and your panel's refresh rate (like 60 Hertz in my case) is a bad idea, because then upscaling is done twice: once by the codec and once by the TV.
The question is: Who do you trust more to do the upscaling correctly?
Flat-screen TVs were built with Blu-ray in mind, so they do the upscaling from 24 Hertz to whatever the panel refresh rate is pretty well.
So my conclusion is: set the refresh rate of your HTPC to match the refresh rate of the source and let the TV worry about the rest.
I'd highly appreciate the possibility to switch the refresh rate within XBMC.
Well, you've got it a bit mixed up.
Refresh rate is the number of times per second a screen updates - that's between the video card/PC and the display.
Sources don't have refresh rates, they have frame rates.
Upscaling also isn't the right word to use - it's not a matter of scaling, it's a matter of timing.
It's the job of the software to make sure that the frame rate of the source is evenly timed to the refresh rate of the display.
Monitors have variable refresh rates - old CRTs are very flexible, LCDs not so much. They all do 60hz, most also do 75hz, some do 70hz or 72hz. Some might be able to go lower; I've personally never used them.
TVs are an entirely different beast. The *vast* majority of TVs (all non-HD sets, and all non-120hz HD sets) are hard-locked to 60hz.
The new 120hz TVs are a bit odd. They can display and accept either a standard 60hz TV signal/refresh rate or switch over to 24hz to directly display a 24fps blu-ray movie. They run internally at 120hz, but they can't accept a 120hz/fps signal.
120hz is a special number because it's divisible by the common frame rates of video (24, 30, and 60 fps). So the first benefit is that they can display those frames perfectly evenly spaced. The second benefit is that 120hz leaves plenty of "extra frames" that allow for the opportunity of smoothing the frame rates on top of evenly spacing the "real frames".
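The arithmetic behind that can be sketched in a few lines of Python (just an illustration, not anything from XBMC itself):

```python
# Why 120hz is convenient: each common frame rate divides 120 evenly,
# so every source frame can be held the same number of refreshes
# with nothing left over.
for fps in (24, 30, 60):
    hold, leftover = divmod(120, fps)
    print(f"{fps} fps: hold each frame for {hold} refreshes, {leftover} left over")
```

At 60hz the same check fails for 24 fps (60 / 24 leaves a remainder), which is exactly why the cadence trick below is needed.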
Since the vast majority of content is going to come through a 60hz signal, it needs to be able to determine the frame rate of the actual content by looking at the pattern of how the frames are spaced out in the signal. 30fps is easy (every other frame); 24fps is a bit trickier, but there's a *very* standard way to do this (3:2 pulldown) which has been in place for decades.
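That decades-old cadence can be shown with a tiny (hypothetical) sketch:

```python
# 3:2 pulldown: to fit 24 fps film into a 60hz signal, source frames
# are alternately held for 3 refreshes and 2 refreshes.
cadence = [3 if i % 2 == 0 else 2 for i in range(24)]
assert sum(cadence) == 60  # 24 source frames fill exactly one second of 60hz
# A TV (or player) can recognise 24 fps content inside a 60hz signal
# by spotting this repeating 3,2,3,2,... pattern.
```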
The problem is that XBMC isn't respecting those standard frame spacings. For instance, it might be displaying all 30 frames each second, but it's not displaying a frame every other hz. The end result is jerky pans, because the video is technically speeding up and slowing down several times a second.
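To make the judder concrete, here's a made-up comparison of two frame schedules (not real XBMC behaviour, just an illustration):

```python
# Both schedules show 30 frames in one second of 60hz,
# but only the even one looks smooth.
even = [2] * 30        # every frame held exactly 2 refreshes
uneven = [1, 3] * 15   # hold times alternate between 1 and 3 refreshes
assert sum(even) == sum(uneven) == 60
# The uneven schedule still delivers "30 fps", yet motion alternately
# speeds up and slows down several times a second -> jerky pans.
```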
Having to resort to switching refresh rates isn't a solution - there's no reason it shouldn't be outputting properly at 60hz. Whether or not *you* can change the refresh rate of your display isn't the problem; it's the 90% (at least) of displays that can't run at anything but 60hz. It needs to respect the standard.
I'm sure it's just a bug and all of this is well known to the devs. I just hope it gets fixed soon!
Edit: Just so there's no confusion - 24, 30, 60 are the US standards - European TV uses 25 and 50 instead - so they use 100hz TVs instead of the 120hz we get here.