Intel Sandy Bridge CPU integrated GPU?
#16
You know Poofy, in forums there are topics, and every person chooses to talk in the topics they like. If you want to give additional information (bad or good), no problem. But if you don't want to read about that "garbage" and just answer that Intel sucks and will continue to suck, I don't know why you're staying here.
#17
EuhicS Wrote:You know Poofy, in forums there are topics, and every person chooses to talk in the topics they like. If you want to give additional information (bad or good), no problem. But if you don't want to read about that "garbage" and just answer that Intel sucks and will continue to suck, I don't know why you're staying here.

I had no problem with it... and I am the OP! I wanted a true perspective on whether or not the Intel Core 2011 (Sandy Bridge) CPUs would benefit XBMC in any drastic way. I was specifically curious about the integrated graphics (as seen in the title of the post).

To be honest, I am glad poofy came in and was brutally honest. This allows me to put things in perspective, not get my hopes up, and realize I can settle on a current-generation Core i3 530 (which will probably drop in price since the new 2011 CPUs will be coming out) and an Nvidia GT 430. This seems like the best combo there is.
#18
Regardless of performance or the lack of it, Sandy Bridge STILL does not properly handle 24fps....
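
(For anyone wondering what "properly handle 24fps" actually means: film content is mastered at 24000/1001 ≈ 23.976 fps, so the GPU has to output a matching refresh rate or frames get repeated on a regular schedule. Here is a rough back-of-the-envelope sketch of the arithmetic; the numbers and function names are my own illustration, not anything measured on Sandy Bridge itself.)

Code:
# Rough illustration (standard video math, nothing measured in this thread) of
# why an HTPC needs a true ~23.976 Hz output mode for "24p" film content.
FILM_FPS = 24000 / 1001            # "24p" Blu-ray content actually runs at ~23.976 fps

def stutter_interval(display_hz: float) -> float:
    """Approximate seconds between repeated frames when the display refresh
    rate doesn't exactly match the content rate (assuming no resampling)."""
    drift = abs(display_hz - FILM_FPS)   # frames of error accumulated per second
    return 1.0 / drift if drift else float("inf")

for hz in (24.000, 23.976):
    print(f"{hz:.3f} Hz output -> a frame repeats roughly every "
          f"{stutter_interval(hz):,.0f} seconds")
# At a fixed 60 Hz the failure mode is different: 3:2 pulldown shows frames for
# alternating 2 and 3 refresh cycles, which reads as constant low-level judder.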

Oh, and personally I think PowerVR do a pretty good job; if they had the resources of ATI/Nvidia their stuff would rock. What is the point of rendering stuff you aren't gonna display?
#19
Also note that most comments poofy makes are targeted at Linux usage, while many end users prefer Windows.

In the end, hardware choice is more a matter of budget and preferences (specs & platform) than brand specifics between AMD, Intel and Nvidia.

All the posts poofy provided give a good overview of the possibilities. Without the ranting, most of it really is valuable info.... Let the reader be the judge.
#20
I just think he focuses too much on Intel and its past, and not on the CPU itself.
But OK, if you all think his comments are useful, I won't complain. My bad.

I still think there is a lot of potential in Sandy Bridge: a low-consumption HTPC, possibly in a very small case (no discrete GPU needed), with a not-too-weak CPU and the ability to encode movies at very high speed. But of course, if the drivers and developer support don't follow, I'll go back to a Core i5 530 + GPU.
#21
EuhicS Wrote:You know Poofy, in forums there are topics, and every person chooses to talk in the topics they like. If you want to give additional information (bad or good), no problem. But if you don't want to read about that "garbage" and just answer that Intel sucks and will continue to suck, I don't know why you're staying here.

Because I am passionate about hardware.

I am MORE than willing to accept Intel again if they do the right things to dig themselves out of the hole, but for now it is a messy situation.

I mean the article linked above basically says "Intel's hardware might be good, but with another round of bad display drivers we can't really tell." Nvidia/AMD had 24p nailed a LONG time ago, and here is Intel bringing up the rear.

I am not outright anti-Intel. Without their platforms my second favorite love to HTPCs- hackintoshing- could not exist.

But what Intel is doing is plain wrong. They know that their GPUs suck compared to AMD's and Nvidia's offerings, but instead of pumping more resources into that department to make it better (like AMD does), they are using their CPU line to force the GPUs onto us. Sure, some of it is for power savings, and some of it is streamlining, but the way they purposefully EXCLUDE Nvidia from their platform means that it is something more insidious than that. This is EXACTLY like what Microsoft did back in the day, force-bundling IE to kill Netscape, except that AMD has a large enough part of the CPU market (BARELY) that Intel can't be prosecuted.

I mean, will ANYONE argue that the great Intel, CPU champion of the world, couldn't make better GPUs than they do? That some in-house Intel tech wouldn't "mesh" better with their CPUs than outside tech? I mean, if AMD's Fusion were their CPU chips with a Broadcom Crystal HD bolted on, they would be the laughingstock of the tech world. But Intel does it and it's OK because they are Intel.

They made TONS of promises about open source support, then they signed a desperation contract with PowerVR that keeps all the specs locked tight. For crying out loud, there EXISTS a Linux driver for the GoogleTV platform that WE CAN'T HAVE thanks to NDAs. In all my years of hardware love (going back to a Tandy 8088), I have NEVER seen BS like that.

Who knows, maybe Intel has a bigger plan. Maybe they are trying to destroy Nvidia so that one day they can swoop in and buy them for pennies on the dollar. Maybe they figured that pleasing enthusiasts like us doesn't give them the same ROI as Boxee Boxes. Who knows?

All I know is what I see, and what I see is that Intel hasn't done anything to please hardcores like XBMC users in YEARS. I see them TERRIFIED by ARM, and I see a desperate company doing everything they can to lock down everything, consumers be damned. It is like they are running play-by-play out of Microsoft's playbook, and nobody gives MS a break.

#22
drewy Wrote:Oh, and personally I think PowerVR do a pretty good job; if they had the resources of ATI/Nvidia their stuff would rock. What is the point of rendering stuff you aren't gonna display?

Agreed. Old PowerVR tech (not rendering what you don't see) was pretty cool. I was really sad when the 3DFXs, Matroxes and PowerVRs exited the mainstream consumer GPU market.

I have no beef with PowerVR. They didn't write these terrible Intel drivers. Intel did, per their contract. The blame is all on Intel.

And the saddest part is that the hardware this round doesn't seem as terrible as their current offerings. The benchmarks show this generation of Intels could really be great chips if they had decent drivers.

#23
Robotica Wrote:Also note that most comments poofy makes are targeted at Linux usage, while many end users prefer Windows.

In the end, hardware choice is more a matter of budget and preferences (specs & platform) than brand specifics between AMD, Intel and Nvidia.

All the posts poofy provided give a good overview of the possibilities. Without the ranting, most of it really is valuable info.... Let the reader be the judge.

If it was just about how terrible Intel is to Linux users, you might have me there. But their Windows drivers and performance are pretty terrible too! I mean, no 24p? Current Core GPUs being weaker than entry-level 8xxx series Nvidia GPUs? With the amount of resources they have?

Intel's problem is beyond one generation of hardware, or any certain platform.

As far as my ranting goes, well I am a guest here like anyone else. If the XBMC powers that be decide my ranting is not helpful then I will be glad to go obsess about hardware on some other part of the net. But I am not gonna change my style.

I am passionate about hardware. I love hardware. And I am not gonna apologize for that...

#24
poofyhairguy Wrote:If it was just about how terrible Intel is to Linux users, you might have me there. But their Windows drivers and performance are pretty terrible too! I mean, no 24p? Current Core GPUs being weaker than entry-level 8xxx series Nvidia GPUs? With the amount of resources they have?

Intel's problem is beyond one generation of hardware, or any certain platform.

As far as my ranting goes, well I am a guest here like anyone else. If the XBMC powers that be decide my ranting is not helpful then I will be glad to go obsess about hardware on some other part of the net. But I am not gonna change my style.

I am passionate about hardware. I love hardware. And I am not gonna apologize for that...

I know you love hardware, and Nvidia in particular. I've learned a lot from you about all that, especially GPU-specific (and open source) stuff.

But I believe that the HTPC king is a combination of power efficiency, CPU and GPU power, and price. Since you are very focused on the GPU-specific part (and hardware performance) of the discussion, we sometimes feel differently about things. I am pretty simple: I just would like a Linux-based (free!) HTPC which does it all, and I would like AMD to gain some market share from Intel (future prices). So I am a big fan of the new AMD E-350 for HTPC.... I am even prepared to pay 10% extra to AMD for -10% performance, just to help AMD in their battle with Intel. Luckily, the new AMD products are perfect for HTPC usage....

But I don't doubt your knowledge, style (maybe I used the wrong word with "ranting", sorry) or attitude. I think it is great how much knowledge you have given to XBMC users. So no need to apologize whatsoever.
#25
Robotica Wrote:I know you love hardware, and Nvidia in particular. I've learned a lot from you about all that, especially GPU-specific stuff.

You are right, I do love Nvidia. But that is because they deliver. If you had asked me this time last year what I wanted to see from GPUs before the end of the year, I would have said: "Give me a GPU lineup with:"

1. HD Audio Bitstreaming
2. Enough power to de-interlace in the LOWEST end GPU of the line
3. Enough heat and power savings for these GPUs to be fanless
4. 3D support

Nvidia's GT4xx line did all that, and they did it in a way that we Linux users can benefit from (mostly; no 3D support). I will admit, though, that AMD's 6/7xxx line seems tasty for the same reasons. Oh, I ESPECIALLY like how AMD redid the way they use their shaders; I expect the next set of AMD GPUs to have the same robust decoding that Nvidia has today. Heck, it might be better.

Quote:But I believe that the HTPC king is a combination of power efficiency, CPU and GPU power, and price. Since you are very focused on the GPU-specific part (and hardware performance) of the discussion, we sometimes disagree.

I do disagree with that, but that is because I don't think the CPU matters as much. In the long run, if it is done correctly, EVERYTHING an HTPC does should mostly be done by the GPU. On the Windows platform we are basically there, which is why ION systems with the CPU power of a Pentium 3 do the job so well.
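
As a minimal sketch of what "done by the GPU" looks like in practice on Linux (my own illustration; it assumes the standard vdpauinfo tool is installed, and the string check on its output is a rough heuristic rather than any official API):

Code:
# Minimal sketch (Linux-only, assumes the vdpauinfo tool is installed) of how you
# might check whether the GPU, not the CPU, will do the H.264 heavy lifting.
import shutil
import subprocess

def gpu_can_decode_h264() -> bool:
    """Return True if the VDPAU driver advertises an H.264 decoder profile."""
    if shutil.which("vdpauinfo") is None:
        return False                  # tool not installed; can't tell
    try:
        out = subprocess.run(["vdpauinfo"], capture_output=True,
                             text=True, timeout=10).stdout
    except (OSError, subprocess.TimeoutExpired):
        return False
    return "H264" in out              # profiles show up as e.g. H264_HIGH

if __name__ == "__main__":
    if gpu_can_decode_h264():
        print("VDPAU reports H.264 decode -- even a weak CPU should cope")
    else:
        print("No GPU decode reported -- the CPU has to do the heavy lifting")

If that check comes back positive, the CPU is mostly just doing demuxing and presentation, which is the whole reason ION boxes get away with Atom-class CPUs.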

Also, I will admit I put performance over price. But that is because I am TERRIFIED about what a 100% focus on price in the HTPC market would do to us: such a shift would forcibly move us ALL over to the dedicated-decoding-chip world. Basically, the GoogleTV platform (no). We XBMC users want more for ourselves than that.

Quote:But I don't doubt your knowledge, style (maybe I used the wrong word with "ranting", sorry) or attitude. I think it is great how much knowledge you have given to XBMC users. So no need to apologize whatsoever.

Thanks, I really appreciate that.

#26
poofyhairguy Wrote:I am passionate about hardware. I love hardware. And I am not gonna apologize for that...

Amen and I thank you for sharing that passion!
#27
Wink

#28
For those who don't believe me:

Look at this:

[benchmark chart: Intel GMA X4500 vs. mobile GeForce 8400 GS]

There is Intel's X4500, its top chip in many product lines, getting handled by a MOBILE 8400 GS. Mobile Nvidia GPUs are always clocked down compared to their desktop counterparts, so you could easily call this the weakest Nvidia GPU in the 8xxx line (which debuted as a lineup in 2006). And it spanks Intel's best across many top product lines, including too many netbooks.

Seriously, until they break down and buy a decent GPU company (Nvidia), there is nothing to get excited about with Intel GPUs. They are years behind today, and Nvidia and AMD keep moving the goalposts. I used to give Intel a pass because of their open source position, but now that is shot too.

Back in the early GMA 950 days (2005), Intel's GPUs were behind Nvidia GPUs from 1999. They had a five-year lag from the get-go, so without investing tons of capital Intel was never gonna catch up. Intel did eventually invest real money in the GPU department, but instead of shooting for discrete GPUs (or just flat-out decent mobile GPUs), Intel was gonna "create this whole new market" with Larrabee, which was basically a crappier version of Nvidia's CUDA. And they failed miserably at it, Itanium-style:

http://www.theregister.co.uk/2009/12/08/...e_letdown/

Intel's arrogance got the best of them.

So instead of trying again, and faced with GPUs too weak to do the next big thing (decode HD video), they turned to PowerVR and showed them their entire hand in a display of unconditional surrender. PowerVR sticks what is really a mobile (as in cell phone) design into Intel's stuff, and suddenly Intel finds themselves in the 21st century. But at what price? Oh, nothing, except their big open source stance is shot to pieces because PowerVR's stuff is protected by patents, and it is now plain that Intel lacks the in-house know-how to produce the CPU/GPU chips needed for a new Decade of the Tablet™.

We the consumers are stuck in the middle, our netbooks and nettops infected by this potluck-dinner Intel GPU approach. Of course the drivers suck even on Windows, because without unified and planned-out (as in years ahead) hardware it is IMPOSSIBLE to make decent GPU drivers.

And for the record, that last point is why I am an Nvidia fanboy (I will admit it): because they DO have a plan. And it usually rocks.

#29
poofy -- I am also a Hackintosh fan. I have only gone the easy route and installed SL on my Dell Mini 10v. (I am actually a moderator on MyDellMini... but unfortunately have not been very active in the community recently.)

Speaking of the Intel Sandy Bridge platform and Hackintoshing... the CPUs are not even in "real" Macs yet and they have already been Hackintoshed running a custom kernel. Rofl
#30
ASRock Vision 3D, or build a Sandy Bridge-based HTPC?

Will be used mostly for 3D Blu-ray ISO playback.
