2014-03-12, 05:21
I've been using XBMC for a little over a year with little issue, outputting to a 720p DLP HDTV.
We just upgraded to a new LED flat-panel TV, and I've started replacing my library with 1080p source material. Unfortunately, I've been getting dropped frames playing back those 1080p files (.mkv).
Plex plays them great, but I don't want to go back and deal with the many niggling issues it has on OS X.
I've tried going through Settings > Video > Playback, systematically selecting and then de-selecting the render method, adjust display refresh rate, and so on. Most deviations from the defaults resulted in worse performance.
I finally brought up the real-time performance overlay (the "O" key) and saw that CPU usage was over 100% on the scenes with noticeable frame drops. Along with that came dips below 22 fps, sometimes as low as 15-17 fps. On scenes with lots of black/dark areas and shading, the CPU pegged near 115%.
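In case it's useful, here's a rough Python sketch of how the overlay numbers could be cross-checked from outside XBMC. It assumes the psutil package is installed, and the process name "XBMC" is my guess at how the app shows up in the process list on this build:

# A minimal sketch, assuming the psutil package (pip install psutil) is
# installed. The process name "XBMC" is an assumption -- adjust if needed.
import time
import psutil

def sample_cpu(process_name="XBMC", seconds=10):
    """Print one CPU-usage reading per second for the matching process."""
    target = None
    for proc in psutil.process_iter():
        try:
            if process_name.lower() in proc.name().lower():
                target = proc
                break
        except (psutil.NoSuchProcess, psutil.AccessDenied):
            continue
    if target is None:
        print("No process matching %r found." % process_name)
        return
    target.cpu_percent(None)  # prime the counter; the first call returns 0.0
    for _ in range(seconds):
        time.sleep(1)
        # Like top(1), this is per-core: 100% = one full core, so a
        # dual-core machine can legitimately read up to ~200%.
        print("%6.1f%%" % target.cpu_percent(None))

if __name__ == "__main__":
    sample_cpu()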
Does this seem right? System specs aren't that lousy (see below).
The same file, same scenes, runs at about 11-15% CPU usage on my Windows desktop (granted, that box has a 4 GHz 8-core CPU and, I assume, DXVA hardware decoding, but still...).
Here is the log. It's my first time posting a log here, so take it easy on me if I haven't done so correctly.
Here is the MediaInfo for the source file.
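Since the full MediaInfo dump is long, here's a minimal Python sketch for pulling out just the fields that usually matter for decode load. It assumes the pymediainfo package (a wrapper around the MediaInfo library), and "movie.mkv" is a placeholder path, not my actual file:

# A minimal sketch, assuming pymediainfo is installed
# (pip install pymediainfo); "movie.mkv" is a placeholder path.
from pymediainfo import MediaInfo

def summarize(path="movie.mkv"):
    """Print the video-track fields that usually matter for decode load."""
    for track in MediaInfo.parse(path).tracks:
        if track.track_type == "Video":
            # High bitrates and High@L4.1+ profiles are the usual
            # suspects when software decoding falls behind.
            print("format:    ", track.format)
            print("profile:   ", track.format_profile)
            print("resolution:", track.width, "x", track.height)
            print("frame rate:", track.frame_rate)
            print("bit rate:  ", track.bit_rate)

if __name__ == "__main__":
    summarize()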
Client Specs:
2010 Mac Mini (2.4 GHz Intel Core 2 Duo P8600, 8GB RAM, nVidia 320M)
OS X 10.8.5
Display: 50" Sharp LED panel
Gigabit Ethernet
XBMC v12.3 GIT:20131212-9ED3E58 (compiled: Dec 23 2013)
Server Specs:
AMD Phenom II x4 965 3.4GHz, 8GB RAM
WHS2011
Gigabit Ethernet
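Since everything runs over gigabit Ethernet, one more thing worth ruling out is the network itself. Here's a quick read-throughput sketch I could run against the share; the mount path is a placeholder for my actual setup:

# A minimal sketch; "/Volumes/Media/movie.mkv" is a placeholder for
# wherever the share is actually mounted. Reads up to 512 MB and
# reports the average rate.
import time

def read_throughput(path="/Volumes/Media/movie.mkv",
                    chunk=4 * 1024 * 1024, limit=512 * 1024 * 1024):
    total = 0
    start = time.time()
    with open(path, "rb") as f:
        while total < limit:
            data = f.read(chunk)
            if not data:
                break
            total += len(data)
    elapsed = time.time() - start
    # A ~20 Mbit/s 1080p file only needs ~2.5 MB/s; a healthy gigabit
    # link should deliver tens of MB/s, which would rule the network out.
    print("%.1f MB/s" % (total / (1024.0 * 1024.0) / elapsed))

if __name__ == "__main__":
    read_throughput()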
Hope I've included all the needed info.
Thanks in advance!