Hannes The Hun Wrote:poofy, you know usually I'm 100% with you on HTPC stuff, but here I beg to differ. I don't see any value in these broadcom-like dedicated decoding cards as they are nothing more than a compromise for very weak systems and are becoming obsolete right as we speak.
It is way more than that. Every system that lacks the general-purpose power needed to decode H.264 uses a dedicated decoding chip: Blu-ray players, Popcorn Hour-like boxes, cell phones, the new AppleTV, etc.
In fact, the only decoding solutions in common use (that are still sold) that don't rely on dedicated decoder chips are: CPUs in HTPCs, GPUs in HTPCs (using general-purpose shaders), and the PS3 (using those crazy Cell SPUs). That is why these are the three most upgradable and robust decoding solutions on the market.
Quote:1) the future for *cheap* consumer-ready set-top boxes lies in the fully integrated ARM-based chipsets as demonstrated by the new Apple TV or by nvidia's tegra2 platform.
I don't know of any ARM chip in the pipeline that has the raw power to decode 1080p H.264 entirely on its own. Most ARM platforms (such as the new AppleTV or cell phones) pair the CPU with a dedicated H.264 decoder chip.
In fact, I think you and I do believe in the exact same future.
Quote: while we'll see fully integrated, 1080p and 3D capable, highly integrated and cheap ARM smart phones and set-top boxes as soon as next year, the likes of syabas/sigma and realtek will start to become obsolete in the media player market and vanish (that's why I didn't like your popcorn hour example).
Popcorn Hours make a great example because what we have experienced with them (standard encodes run easily, custom encodes hardly at all) is exactly what we can expect from this ARM+decoder-chip future.
I mean seriously: the new AppleTV MIGHT technically be a different beast than the Popcorn Hour, but its dedicated decoder gives it the same disadvantages and the same tight limits.
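The "standard encodes easily, custom encodes hardly" behavior comes down to fixed-function decoders being certified for a single H.264 profile/level envelope, while general-purpose hardware just decodes anything (slower if need be). Here's a minimal sketch of that idea; the limit values and field names are illustrative assumptions, not any vendor's actual spec sheet:

```python
# Hypothetical sketch of why group-1 boxes choke on custom encodes.
# A dedicated decoder ASIC accepts only streams inside its certified
# envelope (typically around High@L4.1, 8-bit, few reference frames
# for 1080p consumer boxes); anything outside that envelope fails
# outright instead of merely playing back slower.

HARDWARE_LIMITS = {  # illustrative numbers, not a real chip's datasheet
    "profiles": ("Baseline", "Main", "High"),
    "max_level": 4.1,
    "max_ref_frames": 4,
    "max_bit_depth": 8,
}

def decoder_can_play(stream, limits=HARDWARE_LIMITS):
    """True if a fixed-function decoder within `limits` accepts the stream."""
    return (stream["profile"] in limits["profiles"]
            and stream["level"] <= limits["max_level"]
            and stream["ref_frames"] <= limits["max_ref_frames"]
            and stream["bit_depth"] <= limits["max_bit_depth"])

# A standard Blu-ray-style encode: inside the envelope.
bluray = {"profile": "High", "level": 4.1, "ref_frames": 4, "bit_depth": 8}
# A typical "custom" encode: 16 reference frames, level 5.1.
custom = {"profile": "High", "level": 5.1, "ref_frames": 16, "bit_depth": 8}

print(decoder_can_play(bluray))  # True
print(decoder_can_play(custom))  # False
```

A group-2 box has no such gate: the CPU or shader-based decoder handles the custom encode too, which is the "robustness" I keep coming back to below.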
You see, to me, it is about two distinct groups:
1. Dedicated decoder chips stuck with a "good enough" CPU
Or
2. Powerful general-purpose hardware pressed into decoding duty
The Popcorn Hour, the old AppleTV+Broadcom, the new AppleTV, probably the GoogleTV, and the Boxee Box are all in group 1.
Our ION boxes (using their general-purpose GPUs), a Mac Mini, an AMD Fusion box, a home-built HTPC, etc. are in group 2.
The only practical differences are price and decoding robustness: group 1 wins on the former while group 2 wins on the latter.
Quote:2) for HTPC enthusiasts like us that want to have fully customized, bleeding edge hardware for diverse requirements, the FUSION stuff (I like how they hid the "ION" part in there...) or the integrated intel GPU stuff are a very good starting point also if the dedicated graphics cards will still be ahead technologically as we see this right now with 3D, HD audio and HDMI 1.4.
This is where we disagree. I think Intel's direction for its HTPC future (aka the chipset behind the Boxee Box) is to bundle decoder chips with Intel CPUs instead of ARM CPUs. Intel can see the writing on the wall and knows ARM+decoder chip is cheaper, so it is trying to stop that trend before it starts. But even if Intel succeeds, its GPU division gives us NO hope of ever delivering a decoding solution as robust as Nvidia's, so it would be pretty much a downgrade from our current position.
The best (and last) hope for real HTPCs with robust decoding (now that Nvidia is cut out at the chipset level) is AMD's Fusion...