Interlaced video output - proof of principle

Stu-e Offline
Member
Posts: 85
Joined: Jan 2010
Reputation: 1
Post: #1
I've submitted a feature request http://trac.xbmc.org/ticket/12960 that includes patches to demonstrate how interlaced video output could be achieved.

The method is described in more detail in this old post:
http://forum.xbmc.org/showthread.php?tid=81834

The method overcomes the problem of random bad field sync when watching interlaced video on an interlaced display.

My hope is that this will remove the need for a power-hungry deinterlacer in the media center, relying instead on the deinterlacer in your TV.

Thanks
jmarshall Offline
Team-XBMC Developer
Posts: 25,685
Joined: Oct 2003
Reputation: 169
Post: #2
Can you even buy an interlaced display nowadays?

joethefox Offline
Team-XBMC Member
Posts: 1,175
Joined: Nov 2010
Reputation: 21
Post: #3
If I understand correctly, the goal isn't support for CRTs (interlaced displays) but to let the TV's deinterlacer do the deinterlacing instead of XBMC.
Stu-e Offline
Member
Posts: 85
Joined: Jan 2010
Reputation: 1
Post: #4
Yes, when I refer to an interlaced display I include flat panels that accept 1080i, and there are a lot of those.

I am pushing this because it potentially avoids a lot of problems with watching TV material (XBMC-PVR) and camcorder home movies, which are generally interlaced. The range of hardware that handles this properly is woefully limited. For example, nVidia ION based systems cannot perform combined IVTC and temporal-spatial deinterlacing of 1080i video at the required frame rate. AMD E350 boards running Linux cannot do ANY video-accelerated deinterlacing because it is not supported by VAAPI (this actually applies to all AMD graphics cards).

Currently the only adequate hardware setup for Linux and XBMC is one containing an nVidia graphics system with enough grunt to perform IVTC and temporal-spatial deinterlacing of 1080i at 60 fps. Since the only graphics systems capable of this are discrete cards, the motherboard must have a PCI Express x16 slot too. That means a large form factor PC case. These cards are also power hungry.
Shine Offline
Junior Member
Posts: 48
Joined: Apr 2012
Reputation: 1
Post: #5
This sounds interesting.

Did you solve the issue of scaling interlaced material, or are you relying on output at native resolution? Native resolution would be pretty annoying because I haven't yet seen a single TV where you can disable overscan for SD content (only for HD content), so you'll lose the edges of the picture when outputting SD material at native resolution.
Stu-e Offline
Member
Posts: 85
Joined: Jan 2010
Reputation: 1
Post: #6
No, I did not solve the issue of scaling interlaced material. It has to be IVTC'd and deinterlaced before scaling.

You are right to be concerned about SD content. In the most basic hardware setups, displaying SD at native resolution would be desirable, but I suspect this is very tricky. For HDMI connected displays, 576i and 480i modes require pixel doubling to stay above the minimum pixel clock that HDMI specifies. This means both the video and all of the GUI would need to be rendered to a 720x576 frame for 576i (720x480 for 480i) and then stretched to the full 1440x576i. In OpenGL you would texture map with glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST) to double the pixels horizontally.
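
To make that concrete, here is a minimal OpenGL sketch of the idea (an illustration only, not code from the patch; frameTexture is a hypothetical texture id holding the rendered 720x576 frame):

Code:
#include <GL/gl.h>

// Draw a 720x576 frame into a 1440x576 output. GL_NEAREST magnification
// replicates pixels instead of blending them, so stretching the quad to
// double width doubles every pixel horizontally.
void DrawPixelDoubled(GLuint frameTexture)
{
    glEnable(GL_TEXTURE_2D);
    glBindTexture(GL_TEXTURE_2D, frameTexture);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);

    glViewport(0, 0, 1440, 576);       // the full 1440x576i mode
    glBegin(GL_QUADS);                 // fixed-function path for brevity
    glTexCoord2f(0.f, 1.f); glVertex2f(-1.f, -1.f);
    glTexCoord2f(1.f, 1.f); glVertex2f( 1.f, -1.f);
    glTexCoord2f(1.f, 0.f); glVertex2f( 1.f,  1.f);
    glTexCoord2f(0.f, 0.f); glVertex2f(-1.f,  1.f);
    glEnd();
}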

Also, switching the TV into a low-res mode and running the GUI at that resolution might be annoying. I suspect a halfway-house approach might be more suitable: an ION system, say, is quite capable of IVTC and deinterlacing of SD material, so SD could be deinterlaced in XBMC and displayed on a 1080i display, while for 1080i material the deinterlacer would be turned off and the interlacer turned on.
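
As a sketch of what that halfway-house policy might look like (the names here are hypothetical, not from any patch):

Code:
// Hypothetical policy: interlaced SD is IVTC'd/deinterlaced in XBMC (an
// ION can manage that) and rendered progressively into the 1080i mode,
// while 1080i material bypasses the deinterlacer and is woven straight
// out so the TV's deinterlacer receives the original fields.
enum class OutputPath { DeinterlaceInXbmc, WeaveToDisplay };

OutputPath ChoosePath(bool srcInterlaced, int srcHeight, bool displayIs1080i)
{
    if (srcInterlaced && srcHeight > 576 && displayIs1080i)
        return OutputPath::WeaveToDisplay;    // e.g. 1080i broadcasts
    return OutputPath::DeinterlaceInXbmc;     // SD and progressive sources
}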
eversteegt Offline
Junior Member
Posts: 21
Joined: Jun 2009
Reputation: 0
Post: #7
A big +1 from here!!!

Since a great deal of content is still being published in interlaced form (especially SD and HD TV in the Netherlands), deinterlacing is among the last problems I am still trying to tackle.

My nVidia ION system certainly does not have the power to properly deinterlace 1080i content, and even a rather powerful PCI-E card (GT 430) in my upstairs PC is not always up to the job.

Being able to send the output to my TV in its original interlaced form (just as the set top box of my TV provider does) would be awesome!
deh2k7 Offline
Donor
Posts: 436
Joined: Dec 2008
Reputation: 5
Post: #8
This would be of HUGE interest to me and I know a lot of others. Many people with HT setups have pre/pros or even receivers (in addition to those that feed directly into their TVs) with onboard chips that handle scaling and deinterlacing duties far more effectively than XBMC can in software (or even with hardware assist). To me, this goes hand-in-hand with bitstreaming audio: let the components that are designed for the job do the work.
kortina Offline
Donor
Posts: 146
Joined: Jun 2007
Reputation: 1
Location: Australia
Post: #9
Another +1 here...

I record a lot of free-to-air TV with TVSchedulerPro (http://sourceforge.net/projects/tvschedulerpro/) - around 3 hours per day.

The Australian broadcasts that I record all suffer from the interlacing issue described; unfortunately my ION and ION2 systems are not really up to the deinterlacing.

This would greatly improve the playback of recorded content for me (and my wife).
Stu-e Offline
Member
Posts: 85
Joined: Jan 2010
Reputation: 1
Post: #10
Some good news: the patch to the FFmpeg tinterlace filter has been adopted (slightly modified) by the FFmpeg maintainers.

I'm now trying to see whether something similar is possible for VDPAU output. I ruled out the get-bits/put-bits method because for 1080i it would involve something like 350 Mbytes/s of transfers, which I suspect ION would not be able to achieve. Instead I'm trying the GL_NV_vdpau_interop OpenGL extension to manipulate the decoded video with OpenGL. It's a steep learning curve for me.
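
The basic shape of the interop is roughly this (a sketch only, not my patch; device, getProcAddress and videoSurface come from the existing VDPAU decode path, the glVDPAU* entry points are assumed to be already resolved, and error handling is omitted):

Code:
#define GL_GLEXT_PROTOTYPES
#include <GL/gl.h>
#include <GL/glext.h>        // GL_NV_vdpau_interop
#include <vdpau/vdpau.h>
#include <cstdint>

void MapDecodedSurface(VdpDevice device, VdpGetProcAddress* getProcAddress,
                       VdpVideoSurface videoSurface)
{
    GLuint tex[4];
    glGenTextures(4, tex);   // luma+chroma for the top and bottom fields

    glVDPAUInitNV(reinterpret_cast<const void*>(static_cast<uintptr_t>(device)),
                  reinterpret_cast<const void*>(getProcAddress));

    // A VdpVideoSurface registers as four GL_TEXTURE_2D textures; usefully
    // for interlaced output, the two fields arrive as separate textures.
    GLvdpauSurfaceNV glSurf = glVDPAURegisterVideoSurfaceNV(
        reinterpret_cast<const void*>(static_cast<uintptr_t>(videoSurface)),
        GL_TEXTURE_2D, 4, tex);

    glVDPAUSurfaceAccessNV(glSurf, GL_READ_ONLY);
    glVDPAUMapSurfacesNV(1, &glSurf);
    // ... sample tex[0..3] with normal OpenGL rendering here ...
    glVDPAUUnmapSurfacesNV(1, &glSurf);
}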

The problem with this approach is that it requires OpenGL header files specific to the nVidia drivers. Do the XBMC developers have any kind of policy on the use of OpenGL extensions specific to a particular vendor?
Stu-e Offline
Member
Posts: 85
Joined: Jan 2010
Reputation: 1
Post: #11
More good news. After some advice from FernetMenta I dropped the FFmpeg filter approach and successfully implemented a new interlaced output mode within the XBMC render manager, which I have called RENDER_WEAVEX2. The mode works with both the software and shader rendering paths. It can also work with VDPAU rendering, but to achieve that the new mode must be added to this fork of XBMC:
https://github.com/FernetMenta/xbmc

I will request that the new mode be implemented in both the official and FernetMenta XBMC repos.

The current limitation is that WEAVEX2 works well in 1080i output modes with perfect field sync, but 576i and 480i modes are not possible yet because of the pixel doubling that HDMI requires at those low resolutions. I have tested 576i video at its original size within a 1080i mode (imagine big black borders round a small picture) and the TV quite happily accepts that, but it is not suitable for comfortable viewing.
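
For reference, showing the 576i picture unscaled inside the 1080i mode is just a matter of centring the viewport. One subtlety (my observation, not something from the patch): the vertical offset must stay even so that source fields land in display fields of the same parity:

Code:
// Centre an unscaled 720x576 picture in a 1920x1080i mode.
const int x = (1920 - 720) / 2;   // 600
const int y = (1080 - 576) / 2;   // 252 - even, so field parity is kept;
                                  // an odd offset would swap the fields
glViewport(x, y, 720, 576);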
jdembski Offline
Fan
Posts: 501
Joined: Feb 2012
Reputation: 12
Post: #12
Is there a branch that I could use to test with VDPAU rendering?

Stu-e Offline
Member
Posts: 85
Joined: Jan 2010
Reputation: 1
Post: #13
Yes, FernetMenta has just pulled the WEAVEX2 modification into his master branch.
noggin Offline
Fan
Posts: 568
Joined: Oct 2008
Reputation: 11
Post: #14
An alternative approach for 576i and 480i might be to do what broadcast digital video effects devices did when they had to scale interlaced video.

If you scale an interlaced frame as a frame, you end up mangling your interlaced fields together and you get all sorts of nastiness (motion judder, odd banding on motion, etc.).

However, if you scale in the field domain, you avoid this. Of course you don't get the benefit of the improved vertical resolution that might be possible with a high-quality deinterlace, scale (in the 2x frame domain) and re-interlace, but it might be an option? (As most DVEs were shrinking rather than zooming pictures, the resolution loss was less of an issue.)

Effectively you'd scale each 240-line (480i) or 288-line (576i) field to a 540-line field (1080i). As you are scaling within the field domain, you don't end up with mangled fields.
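
In code the idea looks something like this (a sketch; ScaleLines() is a placeholder for any 1-D vertical resampler):

Code:
#include <cstdint>

// Scale an interlaced frame field-by-field: lines 0,2,4,... form one field
// and lines 1,3,5,... the other. Each field is resampled on its own (e.g.
// 288 -> 540 lines for 576i -> 1080i) and written back to every second
// output line, so the two fields are never mixed.
void ScaleFieldBased(const uint8_t* src, int srcLines, int srcStride,
                     uint8_t* dst, int dstLines, int dstStride, int width)
{
    for (int field = 0; field < 2; ++field)
    {
        // Stepping by 2*stride walks the lines of a single field.
        ScaleLines(src + field * srcStride, 2 * srcStride, srcLines / 2,
                   dst + field * dstStride, 2 * dstStride, dstLines / 2,
                   width);
    }
}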
Stu-e Offline
Member
Posts: 85
Joined: Jan 2010
Reputation: 1
Post: #15
Sorry for not keeping this thread up to date. I think the Weave deinterlacer in Frodo has been modified to perform field-rate weave (double-rate weave), but only for software rendering.

Quote:However if you scale in the field-based domain

This is called Bob. Select the Bob deinterlacer and try it.
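
For anyone unfamiliar with it, bob in its simplest form line-doubles each field to a full frame, once per field period (a minimal sketch; better bobs interpolate between field lines instead of repeating them):

Code:
#include <cstdint>
#include <cstring>

// Expand one field (field = 0 for top, 1 for bottom) of an interlaced
// frame to a full-height picture by repeating its lines.
void BobField(const uint8_t* frame, int width, int height, int stride,
              int field, uint8_t* out)
{
    for (int y = 0; y < height; ++y)
    {
        int srcLine = (y / 2) * 2 + field;   // nearest line of this field
        std::memcpy(out + y * stride, frame + srcLine * stride, width);
    }
}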

Quote: Of course you don't get the benefit of the improved vertical resolution that might be possible with a high quality de-interlace

What is worse is that if you are watching a progressive movie delivered over interlaced video, your TV will not get the chance to perform inverse telecine and you will lose half the vertical resolution.

Stu-e