XBMC Community Forum
XBMC for Linux VDPAU - NVIDIA GPU video decoding support (now in the mainline SVN) - Printable Version

+- XBMC Community Forum (http://forum.xbmc.org)
+-- Forum: Help and Support (/forumdisplay.php?fid=33)
+--- Forum: Kodi General Help and Support (/forumdisplay.php?fid=111)
+---- Forum: Linux and Live support (/forumdisplay.php?fid=52)
+---- Thread: XBMC for Linux VDPAU - NVIDIA GPU video decoding support (now in the mainline SVN) (/showthread.php?tid=45525)



- motd2k - 2009-02-25 16:20

sofakng: davilla runs a lesser GPU on an Atom 330. It's fine - killa, with no frame drops.

rodercot: I upgraded the database version, which is partly why I suggested not using 'make install' with this branch - it sounds like you perhaps did that? As for 'make distclean', I'd not usually recommend it unless something is broken - you'll normally get away with just 'make' - but always try it before posting here when something is broken. Regarding the green image, I'll need your logfile, please.




motd


- dbldown768 - 2009-02-25 16:42

motd2k Wrote:Please note that I'll very shortly be implementing changes which require NVIDIA driver version 180.35.

http://www.nvnews.net/vbulletin/showthread.php?t=122606




motd

What is the best way to upgrade the NVIDIA drivers from the earlier releases? Does anything need to be uninstalled, or can we just download the new package from the link and install it on top of the previous VDPAU drivers?

Does this overwrite my previous xorg.conf file? I have custom modelines set up for the overscan on my TV.


- fasteddy - 2009-02-25 17:30

dbldown768 Wrote:What is the best way to upgrade the NVIDIA drivers from the earlier releases?

Here's what I do - someone can correct me if they see a glaring error. (This assumes you're not currently using a version from the official Ubuntu repos; if that's the case, see here.)

1. SSH into the machine (or access the machine directly and drop into a console with <ctrl><alt><F2>)
2. Stop the display manager
Code:
/etc/init.d/gdm stop
3. Remove the nvidia module
Code:
modprobe -r nvidia
4. Uninstall current driver
Code:
sudo nvidia-uninstall
5. Get the latest driver:
Code:
wget ftp://download.nvidia.com/XFree86/Linux-x86/180.35/NVIDIA-Linux-x86-180.35-pkg1.run
6. Install it (Note: Don't bother downloading precompiled kernel interfaces when asked. Have it compile one for your machine - it takes less than 30 seconds)
Code:
sudo sh NVIDIA-Linux-x86-180.35-pkg1.run
7. Start the display manager
Code:
/etc/init.d/gdm start
8. Verify it worked:
Code:
grep NVIDIA /var/log/Xorg.0.log
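For convenience, the eight steps above can be collected into one small script. This is only a sketch of what this post describes - the 180.35 version and package name are taken from this thread, and by default it just prints the commands so you can review them; set DRY_RUN=0 and run it as root from a text console to actually execute them:

```shell
#!/bin/sh
# Sketch of the driver-upgrade steps above; the 180.35 version/package
# name comes from this thread - adjust for whatever driver you need.
# By default (DRY_RUN=1) commands are only printed for review; set
# DRY_RUN=0 and run as root from a text console to execute them.
VERSION="180.35"
PKG="NVIDIA-Linux-x86-${VERSION}-pkg1.run"
DRY_RUN="${DRY_RUN:-1}"

run() {
    if [ "$DRY_RUN" = "1" ]; then
        echo "$*"    # dry run: just show the command
    else
        "$@"         # execute it for real
    fi
}

run /etc/init.d/gdm stop             # 2. stop the display manager
run modprobe -r nvidia               # 3. unload the nvidia kernel module
run nvidia-uninstall                 # 4. uninstall the current driver
run wget "ftp://download.nvidia.com/XFree86/Linux-x86/${VERSION}/${PKG}"  # 5. fetch
run sh "$PKG"                        # 6. install (let it compile a kernel interface)
run /etc/init.d/gdm start            # 7. restart the display manager
run grep NVIDIA /var/log/Xorg.0.log  # 8. verify the new driver loaded
```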


dbldown768 Wrote:Does this overwrite my previous xorg.conf file? I have custom modelines set up for the overscan on my TV.
Yes, I believe it does (though it should back up your previous version). However, it should (AFAIK) keep your previous custom options. For example, I updated last night and it kept DPMS enabled, which I had previously added by hand. As always, though, YMMV.
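If you'd rather not rely on the installer's own backup, you can keep a timestamped copy of the file yourself before upgrading. A small sketch - /etc/X11/xorg.conf is assumed to be the location; the XORG_CONF variable lets you point it elsewhere:

```shell
# Keep a timestamped backup of xorg.conf before running the installer.
# /etc/X11/xorg.conf is the usual location; override XORG_CONF if yours differs.
XORG_CONF="${XORG_CONF:-/etc/X11/xorg.conf}"
if [ -f "$XORG_CONF" ]; then
    BACKUP="${XORG_CONF}.$(date +%Y%m%d-%H%M%S).bak"
    cp "$XORG_CONF" "$BACKUP"
    echo "backed up to $BACKUP"
else
    echo "no $XORG_CONF found - nothing to back up"
fi
```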


- rodercot - 2009-02-25 17:42

motd2k Wrote:sofakng: davilla runs a lesser GPU on an Atom 330. It's fine - killa, with no frame drops.

rodercot: I upgraded the database version, which is partly why I suggested not using 'make install' with this branch - it sounds like you perhaps did that? As for 'make distclean', I'd not usually recommend it unless something is broken - you'll normally get away with just 'make' - but always try it before posting here when something is broken. Regarding the green image, I'll need your logfile, please.

motd

Thanks chum,

OK, so here is what I did: I had updated to 180.35 and NONE of my VC1 files would play - I got that pink/orange/yellow tinged screen, but with sound - and all my H264 files were fine.

I rolled back to 180.29, and VC1 and H264 are playing great now.

Remember, all my tests are on an M2N-SLI Deluxe mainboard with 2GB of PC6400 RAM and an 8600 or 9400GT video card, using a VGA connection to a Sony CRT running 1280x1024 @ 85Hz. My Samsung 24" only has a single DVI input and an analog VGA. I am trying to convince the wife to let me swap my Samsung for her new LG 24" with 3 HDMI inputs, but that is not sitting well with her - LOL.

Yes! I did sudo make install, sorry. It is my test machine, so blowing things away and breaking and fixing things is what it's all about on that machine.

dbldown,

There are some instructions a few pages back for installing the drivers. If you are concerned, you should back up your xorg.conf file first.

I always choose yes when the installer asks to overwrite the existing xorg.conf file, and it does not change my options in my xorg file, but to be safe you should have a backup. I think the instructions were on page 7 or 8 of this thread.

rgds,

Dave


- dafart - 2009-02-25 22:39

180.35 adds VC1 decoding to my GF 9500GT Smile
So far I have not encountered major problems with the last few days' revisions of the SVN tree, just the known lack of support for SSA subtitles and some trouble getting digital audio to work, which is most likely related to PulseAudio.

Code:
./vdpinfo
display: :0.0   screen: 0
API version: 0
Information string: Unknown

Video surface:

name   width height types
-------------------------------------------
420     4096  4096  NV12 YV12
422     4096  4096  UYVY YUYV

Decoder capabilities:

name          level macbs width height
------------------------------------
MPEG1             0  8192  2048  2048
MPEG2_SIMPLE      3  8192  2048  2048
MPEG2_MAIN        3  8192  2048  2048
H264_MAIN        41  8192  2048  2048
H264_HIGH        41  8192  2048  2048
VC1_SIMPLE        1  8190  2048  2048
VC1_MAIN          2  8190  2048  2048
VC1_ADVANCED      4  8190  2048  2048

Output surface:

name              width height nat types
----------------------------------------------------
B8G8R8A8          8192  8192    y  Y8U8V8A8 V8U8Y8A8
R10G10B10A2       8192  8192    y  Y8U8V8A8 V8U8Y8A8

Bitmap surface:

name              width height
------------------------------
B8G8R8A8          8192  8192
R8G8B8A8          8192  8192
R10G10B10A2       8192  8192
B10G10R10A2       8192  8192
A8                8192  8192

Video mixer:

feature name                    sup
------------------------------------
DEINTERLACE_TEMPORAL             y
DEINTERLACE_TEMPORAL_SPATIAL     y
INVERSE_TELECINE                 y
NOISE_REDUCTION                  y
SHARPNESS                        y
LUMA_KEY                         y

parameter name                  sup      min      max
-----------------------------------------------------
VIDEO_SURFACE_WIDTH              y         1     4096
VIDEO_SURFACE_HEIGHT             y         1     4096
CHROMA_TYPE                      y  
LAYERS                           y         0        4

attribute name                  sup      min      max
-----------------------------------------------------
BACKGROUND_COLOR                 y  
CSC_MATRIX                       y  
NOISE_REDUCTION_LEVEL            y      0.00     1.00
SHARPNESS_LEVEL                  y     -1.00     1.00
LUMA_KEY_MIN_LUMA                y  
LUMA_KEY_MAX_LUMA                y



- danillll - 2009-02-25 23:05

danillll Wrote:motd2k

I just finished fixing the "save filter" issue. You were going the right way by saving to the DB; however, the code does not read the video settings from the DB anymore, and the db settings table is never updated - something is broken, not sure what, or if it's just a deprecated feature. Anyhow, all the settings are read/saved from/to guisettings.xml, which is controlled in Settings.cpp.

I tested the fix extensively and the filter settings are now working as designed. I also made sure that the saved value, once XBMC is rebooted, gets set correctly by the dp_video_mixer. Here is the diff against the latest rev of Settings.cpp:

xbmc@XBMC:~/xbmc-vdpau/XBMC$ diff xbmc/Settings.cpp ../../../xbmc/Desktop/Settings.cpp
1026,1028d1025
< GetFloat(pElement, "vdpaunoise", g_stSettings.m_defaultVideoSettings.m_NoiseReduction, 0.0f, 0.0f, 1.0f);
< GetFloat(pElement, "vdpausharpness", g_stSettings.m_defaultVideoSettings.m_Sharpness, 0.0f, -1.0f, 1.0f);
< XMLUtils::GetBoolean(pElement, "vdpinversetelecine", g_stSettings.m_defaultVideoSettings.m_InverseTelecine);
1661,1663c1658
< XMLUtils::SetFloat(pNode, "vdpaunoise", g_stSettings.m_defaultVideoSettings.m_NoiseReduction);
< XMLUtils::SetFloat(pNode, "vdpausharpness", g_stSettings.m_defaultVideoSettings.m_Sharpness);
< XMLUtils::SetBoolean(pNode, "vdpinversetelecine", g_stSettings.m_defaultVideoSettings.m_InverseTelecine);
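For anyone following along, with this change the three VDPAU values would be persisted in guisettings.xml as something like the fragment below. This is a sketch: the tag names come straight from the GetFloat/SetFloat keys in the diff, but the enclosing element name is an assumption about how the defaults are grouped in that file.

```xml
<!-- Assumed grouping; tag names taken from the diff above -->
<defaultvideosettings>
    <vdpaunoise>0.000000</vdpaunoise>
    <vdpausharpness>0.000000</vdpausharpness>
    <vdpinversetelecine>false</vdpinversetelecine>
</defaultvideosettings>
```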


motd2k
Not sure why the HTML is adding a space in the above ^^^^ - for example, m_InverseTelecine becomes m_InverseTelec ine (note the space) when I save the comment. I will email you the file.


@motd2k
Did you get a chance to check in the above so we can get the save feature working? I have been merging my fix every time I check out a new revision.

I saw you implemented the brightness and contrast - kudos to you, very fast... now it's time to try it Smile


UPDATE:
I just noticed Changeset 18089 - so you are trying to fix the database route? No go for guisettings.xml? I guess I will try it.


- jmarshall - 2009-02-26 00:43

The XML-based settings are only used for the defaults (i.e. when you "Set as default for all movies").


- danillll - 2009-02-26 00:58

jmarshall Wrote:The XML-based settings are only used for the defaults (i.e. when you "Set as default for all movies").


Right, and this is what is missing for the VDPAU settings.


- motd2k - 2009-02-26 04:16

committed


- SofaKng - 2009-02-26 04:18

I'm not sure what you committed or what else is going on, but I really, really appreciate your work with VDPAU, motd2k!!


- slicemaster - 2009-02-26 04:31

motd2k Wrote:sofakng: davilla runs a lesser GPU on an Atom 330. It's fine - killa, with no frame drops.

motd
What mobo is he running? I didn't know they had Atom-powered boards with NVIDIA graphics available yet.


- mr.b - 2009-02-26 05:01

Intel Atom 330 board
http://www.newegg.com/Product/Product.aspx?Item=N82E16813121359

with 8400GS PCI card
http://www.newegg.com/Product/Product.aspx?Item=N82E16814187042


- BLKMGK - 2009-02-26 05:51

Have a look at -> http://www.newegg.com/Product/Product.aspx?Item=N82E16856167037

I have two of these running single-core Atoms as kids' computers, and IMO they are darned nice. IF the dual-core ones have enough power - and with this branch it's possible - they are very small, very low-powered, and cheap! Heck, the ones I have run off a wall-wart power supply! :lol:


- rudi123 - 2009-02-26 09:32

mr.b Wrote:Intel Atom 330 board
http://www.newegg.com/Product/Product.aspx?Item=N82E16813121359

with 8400GS PCI card
http://www.newegg.com/Product/Product.aspx?Item=N82E16814187042


Does a PCI graphics card really have enough power for 1080P with an Intel Atom??
wow...


- slicemaster - 2009-02-26 10:34

BLKMGK Wrote:Have a look at -> http://www.newegg.com/Product/Product.aspx?Item=N82E16856167037

I have two of these running single-core Atoms as kids' computers, and IMO they are darned nice. IF the dual-core ones have enough power - and with this branch it's possible - they are very small, very low-powered, and cheap! Heck, the ones I have run off a wall-wart power supply! :lol:

I am amazed at how cheap those Atom-based nettop PCs are... simply amazing. The cases of most of them look as though they would fit reasonably well in a home theater setup. Too bad most of those barebones rigs don't come with a PCI slot for discrete graphics, so that you could use an NVIDIA PCI card for HD decoding. I hope some of the big Taiwanese manufacturers jump on the ION platform for nettops. If they could come out with a $150.00 ION-based platform like that, I am sure loads of Linux media center junkies would be on them like white on rice.

Slice