Posts: 26,215
Joined: Oct 2003
Reputation:
187
Can you even buy an interlaced display nowadays?
Posts: 1,832
Joined: Nov 2010
Reputation:
58
If I understand correctly, the goal isn't support for CRTs (interlaced displays) but to delegate deinterlacing to the TV's deinterlacer instead of XBMC.
Posts: 89
Joined: Jan 2010
Reputation:
1
Yes, when I refer to an interlaced display I include flat panels that accept 1080i, and there are a lot of those.
I am pushing this because it potentially avoids a lot of problems associated with watching TV material (XBMC-PVR) and camcorder home movies, which are generally interlaced. The range of hardware that can handle this properly is woefully limited. For example, no nVidia ION-based system can perform combined IVTC and temporal-spatial deinterlacing for 1080i video at the required frame rate. AMD E-350 boards running Linux cannot do ANY hardware-accelerated deinterlacing because it is not exposed through VAAPI (this actually applies to all AMD graphics cards).
Currently the only adequate hardware setup for Linux and XBMC is one containing an nVidia graphics system with enough grunt to perform IVTC and temporal-spatial deinterlacing at 1080i 60fps. Since the only graphics systems capable of this are discrete PCIe cards, the motherboard must have a PCI Express x16 slot too. That means a large form factor PC case, and these cards are also power hungry.
Posts: 110
Joined: Apr 2012
Reputation:
8
Shine
Senior Member
This sounds interesting.
Did you solve the issue of scaling interlaced material, or are you relying on output at native resolution? Native resolution would be pretty annoying because I haven't seen a single TV yet where you could disable overscan for SD content (only for HD content), so you'll lose the edges of the picture when outputting SD material at native resolution.
Posts: 89
Joined: Jan 2010
Reputation:
1
No, I did not solve the issue of scaling interlaced material. It has to be IVTC'd and deinterlaced before scaling.
You are right to be concerned about SD content. In the most basic hardware setups displaying SD at native resolution would be desirable, but I suspect this is very tricky. For HDMI-connected displays, 576i and 480i modes require pixel doubling to stay above the lower pixel clock limit specified for HDMI. This means both the video and all of the GUI would need to be rendered to a 720x576-pixel frame for 576i (720x480 for 480i) and then stretched to the full 1440-pixel-wide interlaced raster. In OpenGL you would texture map with glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST) to double the pixels horizontally.
Also, mode switching the TV display and the low-res GUI might be annoying. I suspect a halfway-house approach might be more suitable: an ION system, for example, is quite capable of IVTC and deinterlacing of SD material, so you could display that on a 1080i display. For 1080i material the deinterlacer would be turned off and the interlacer turned on.
Posts: 473
Joined: Dec 2008
Reputation:
5
This would be of HUGE interest to me, and I know a lot of others. Many people with HT setups have Pre/Pros or even receivers (in addition to those that feed directly to their TVs) with onboard chips that handle scaling and deinterlacing duties far more effectively than XBMC can in software (or even with hardware assist). To me, this goes hand-in-hand with bitstreaming audio: let the components that are designed for this do the work.
Posts: 89
Joined: Jan 2010
Reputation:
1
Some good news: the patch to the FFmpeg tinterlace filter has been adopted (slightly modified) by the FFmpeg maintainers.
I'm now trying to see if it is possible to do something similar for VDPAU output. I ruled out the get-bits and put-bits method because for 1080i this would involve something like 350 MB/s of transfers, which I suspect ION would not be able to achieve. Instead I'm trying the NV_vdpau_interop nVidia OpenGL extension to manipulate the video using OpenGL. It's a steep learning curve for me.
The problem with this approach is it requires the use of OpenGL header files specific to nVidia drivers. Do the XBMC developers have any kind of policy on the use of OpenGL extensions specific to a particular vendor?
Posts: 515
Joined: Feb 2012
Reputation:
13
Is there a branch that I could use to test with VDPAU rendering?
Posts: 89
Joined: Jan 2010
Reputation:
1
Yes, Fernetmenta has just pulled the WEAVEX2 modification into his master branch.
Posts: 6,743
Joined: Oct 2008
Reputation:
317
noggin
Posting Freak
2013-01-22, 04:56
(This post was last modified: 2013-01-22, 04:57 by noggin.)
An alternative approach for 576i and 480i might be to do what broadcast digital video effects devices did when they had to scale interlaced video.
If you scale an interlaced frame as a frame, you end up mangling your interlaced fields together and you get all sorts of nastiness. (Motion judder, odd banding on motion etc.)
However, if you scale in the field-based domain, you avoid this. Of course you don't get the benefit of the improved vertical resolution that might be possible with a high-quality de-interlace, scale (in the 2x frame domain) and re-interlace - but it might be an option? (As most DVEs were shrinking rather than zooming pictures, the resolution loss was less of an issue.)
Effectively you'd scale each 240 (480i) or 288 (576i) line field to a 540 line field (1080i). As you are scaling within the field domain you don't end up with mangled fields.