g0dders Wrote: I think the point is that sample is about the toughest bit of H.264 you're ever going to have to decode. So if that works, you can be pretty sure you're not going to encounter problems later.
If you're happy with the possibility that some movies are going to stutter in busy scenes, then fine, don't bother. (Personally I would find it very hard to actually enjoy a movie if I knew it was dropping frames - I'd spend the whole time trying to work out how to fix it.)
Bingo! You see, until I bumped my CPU up I WAS dropping frames in some movies! Not just super-high-bitrate movies either, but movies *I* encoded from HD-DVD and Blu-ray discs. Usually action movies; movies with lots of foliage seemed the worst, and the Bird Scene, encoded by me from HD-DVD, was indeed one of the ones that had issues! That's a REALLY rough scene.
Apple Trailers? Cake. Can play them all day long, and could before I bumped the CPU clockspeed too. They are not terribly demanding as near as I can tell. They do, however, look freakin' great, so by all means try encoding an entire movie and see how it looks. Bottom line is it's up to you as to what's acceptable, but please don't try to sell others on "good enough" when more than one of us has found out otherwise - people reading this are often making purchasing decisions based on what's posted here.
Thread title is Best Hardware... My philosophy is to not cut corners and end up wishing I'd spent a little more, but also not to spend stupid money. Do it right the first time; set it and forget it. If that means an extra $100, I will do it - this device supports my entertainment, so it's worth it.
Yes, the XBMC code is getting better at decoding, and yes, "someday" we'll see hardware video acceleration in the drivers. Now go read the developer threads where they're working on that and judge for yourself how long it's likely to be!
I'm using my system now, not months or a year from now. ATI is having issues opening up that hardware decode module, NVIDIA isn't even trying, and the closed-source drivers haven't done it. ATI may have to fall back on less-optimized pieces of the hardware to accomplish it - and we're already doing that on the CPU.
BTW - just played back my encode of the Bird Scene. It's from Disc 1 of the BBC Earth series, right near the front. 24 Mbit/s on my encode, and both cores pushed up past 50%. No, I don't think that means I have CPU to spare, either! The Killa sample peaks at over 40 Mbit/s and over 60% CPU by comparison.
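For anyone wondering what a "peaks at over 40 Mbit/s" figure actually means: peak bitrate is usually measured over a short sliding window (say, one second) rather than averaged over the whole file, which is why a 24 Mbit/s encode can still spike hard in a busy scene. A minimal sketch of that measurement - the packet list here is synthetic, not data from the clips discussed above, though you could feed it real per-packet timestamps and sizes dumped by a tool like `ffprobe -show_packets`:

```python
def peak_bitrate(packets, window=1.0):
    """Return the highest bits-per-second seen in any `window`-second span.

    `packets` is a list of (timestamp_sec, size_bytes) pairs in time order.
    """
    peak = 0.0
    start = 0    # index of the oldest packet still inside the window
    running = 0  # total bytes currently inside the window
    for ts, size in packets:
        running += size
        # Slide the left edge forward until the window fits.
        while packets[start][0] < ts - window:
            running -= packets[start][1]
            start += 1
        peak = max(peak, running * 8 / window)
    return peak

# Synthetic stream: a steady 250 kB every 0.1 s for 5 seconds.
packets = [(t / 10, 250_000) for t in range(50)]
print(peak_bitrate(packets) / 1e6, "Mbit/s")
```

The window size matters: a 1-second window will report much higher peaks than a whole-file average, and it's those short peaks that the decoder (and your CPU) actually has to survive.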