Bulldozer.. the hype?

Alright guys and gals..


I'm interested to hear your views on some thoughts. There seems to be a lot of hype for Bulldozer and I'll be honest, I have no idea why. Perhaps you can explain?


AMD has always centred around the price-to-performance ideology. It's great for people gaming on a budget, don't get me wrong, or who perhaps don't want to pay the 'premium' for Intel.


Previously (looking back to 775 and the AMD counterpart) you used to be able to simply look at price to performance. The features and technologies on/in the different boards were so, so similar. Now? I can't help but feel that AMD is being left completely behind.


Whilst some Z68 and soon-to-be X79 boards are incorporating PCI-E 3.0 (whether it will affect performance in the short run is debatable, it certainly will in the long run - that's another 'small talk' topic), the latter will also be using quad-channel memory (I'm sure we'll see how effective this is in the future too, and whether it's worth a premium - see the rough numbers below). Furthermore, USB 3.0 - Intel has been supporting the technology for a while now and AMD are only really just supporting it - the same with SATA 3. I grant that they have implemented it; they did, however, take their time.
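
On the quad-channel point, the theoretical peak bandwidth just scales with the number of channels, so here's a quick back-of-the-envelope sketch (assuming plain DDR3-1600 purely as an illustrative example, not any specific board or kit):

[code]
# Rough peak memory bandwidth: transfer rate (MT/s) x 8 bytes per transfer x channels.
# DDR3-1600 is used here only as an example speed.
TRANSFER_RATE_MT_S = 1600   # DDR3-1600
BYTES_PER_TRANSFER = 8      # 64-bit wide channel

for channels in (2, 4):
    gb_per_s = TRANSFER_RATE_MT_S * BYTES_PER_TRANSFER * channels / 1000
    print(f"{channels}-channel DDR3-1600: ~{gb_per_s:.1f} GB/s peak")
[/code]

So on paper quad channel doubles the peak figure (roughly 25.6 GB/s vs 51.2 GB/s here); whether desktop workloads actually see that is exactly the 'worth a premium' question.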


Is it just me, or are they playing catch-up? I just get the impression that as soon as AMD mobos make these things standard, Intel will be pushing something else new that AMD don't do.


Don't get me wrong, AMD produce some good CPUs, just not for me. In my eyes they really don't qualify as worth spending big money on - maybe that's the point? I will also say that Intel are bringing out some pretty good low/mid-range CPUs as well.


I think I would be right in saying that AMD are well aware of PCIe 3.0, as the last time I checked they still produce excellent GPUs, so why aren't they planning to use the technology?


Isn't it about time they came out with something that combines their CPUs, their GPUs, their innovations - something perhaps only they can do? They surely have the means.


Whilst this has come out rather long (thanks to those who read the whole thing), I'm interested to hear your views - from what I've been reading, rtw seems pretty 'hyped' about Bulldozer, would I be right in saying that?


Bevan.


Is it the age-old question of...


[img]http://mytechknowledge.com/wp-content/uploads/2010/02/IntelVsAmd.jpg[/img]

You have to have a pretty haul-ass card for it to be affected even by x8, unless you have something like Crossfire, but x16 2.0 is more than enough for right now as nothing needs that much bandwidth, and likely won't for a while. Now, for 3.0 the bandwidth is doubled again, right? So you would only need an x8 lane for your 2.0 x16 card, meaning x8/x8 would be fine for Crossfire. Now, on the topic of AMD vs. Intel on new CPUs, I don't know much, but in the past it has been: with Intel you give an assload of money for something that is only a little bit more haul-ass than the other, when you give only a little bit of money for just the regular haul-ass AMD CPU. Sure, the other is slightly faster, but your wallet will be more than slightly fatter. This of course is my own opinion and nothing more.
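
If you want to put rough numbers on that 'doubled again' point, here's a quick sketch using the published per-lane rates (the exact figures shift a little with encoding overhead, so treat these as ballpark):

[code]
# Rough usable PCIe bandwidth per direction, in GB/s per lane.
# Gen 1/2 use 8b/10b encoding, Gen 3 uses 128b/130b, hence the different
# usable figures below.
PER_LANE_GB_S = {
    "1.0": 0.25,   # 2.5 GT/s * 8/10
    "2.0": 0.5,    # 5.0 GT/s * 8/10
    "3.0": 0.985,  # 8.0 GT/s * 128/130
}

def slot_bandwidth(gen, lanes):
    """Usable bandwidth of a PCIe slot in GB/s (one direction)."""
    return PER_LANE_GB_S[gen] * lanes

for gen, lanes in [("2.0", 16), ("2.0", 8), ("3.0", 16), ("3.0", 8)]:
    print(f"PCIe {gen} x{lanes}: ~{slot_bandwidth(gen, lanes):.1f} GB/s")
[/code]

So a 3.0 x8 slot lands at roughly the same ~8 GB/s as a 2.0 x16 slot, which is why x8/x8 on 3.0 shouldn't be a bottleneck for today's cards.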

My first custom-built computer had a dual-core AMD; around that time it was considered better than Intel's "dual core" CPUs in the same class (Intel only used Hyper-Threading at that time, so it was not a real dual core). But when I started looking for parts for my next build, it was almost like I picked Intel by default, because I wanted to be able to run Nvidia SLI, and at that time very few AMD motherboards had that feature, and those that did had compromises in other places. So what I like about Intel is that there's no compromise. I can admit that the prices are sky high, but like anything else that's specced with the newest technologies and features, it costs. So I totally agree with Bevan, Intel is pulling away from AMD. Personally I think I'll be sticking with Intel, but people have to choose what they think is the best choice for them. (Unless they want a Mac.)


PS: there might be misspellings :P

Um... AMD chips have shared integer ALU capability, so they are truly threaded at the hardware level... instead of having a virtualized core like Intel does...


I have always preferred AMD / ATI, as I can get more out of my system for the money. Also... the bandwidth addition and lower overhead of PCIe 3.0 won't have any visible impact now... and by the time cards are fast enough that it will, AMD will have implemented it.

Oh look. Another Intel fanboy.


TAKE COVER!

Well, the only reason I went with Intel for my upgrade is that I got my Core i5 2500K for $170, thus a no-brainer.


Anyone else notice that bandwidth like PCI Express 2.0 isn't even fully utilized now, at least at x16? By the time people even start using 3.0 and 2.0 is maxed out, PCI Express 4.0 will be out.


Technology moves faster on these boards than there is hardware to utilize it.

In terms of combining the CPU and GPU, AMD is doing this with their APU, or Accelerated Processing Unit, and Intel already sort of has a foot in on it with the integrated graphics on Sandy Bridge.

RuffeDK said:

Oh look. Another Intel fanboy.


TAKE COVER!

Lol, I'm more of an EVGA fanboy, but they only make Intel boards though :P

I like AMD CPUs for their value compared to their relative performance. The most CPU-intensive thing I do is play games. I have had both ATI (now officially AMD) and Nvidia GFX cards. Heat and power consumption are not really determining factors for me; rather, frame rates in light of how much it costs are. I have had four AMD cards (HD 3850 AGP, HD 4870, and two HD 4770s) and one Nvidia card, which I own now (EVGA GTX 580).


I have just come to the point where I don't mind putting off and saving for a better GFX card, since that is probably the greatest factor when it comes to playing any game. I prefer a single-card solution if possible and no more than two GFX cards at most. Based on my research, even when AMD was performing better than or fairly close to its Nvidia counterpart, Nvidia seem to largely have the higher minimum frame rate.


I don't care what the average frame rate is so long as it's around 40 fps minimum. But I do care about a GFX card's minimum benchmark frame rate, just because I can't stand when a game slows down or stutters in intense or detailed moments due to a sudden frame rate drop. So that is why I opted to drop a few extra dollars on an Nvidia GFX card.

How many games even utilize DX11 because of console gaming?

I have a Phenom 965 with two 5770s in Crossfire and I can play every game on high settings with frame rates around 50 to 60 FPS.

So why would I want to upgrade?

I'm waiting for DX12 or another Crysis to come along and make Nvidia and AMD play catch-up again!!

But probably won't see that happen thanks to CONSOLE GAMING!!

Dude, by the time consoles adopt DX11, PCs will be on DX15.

The consensus on PCIe is as I thought.. and what I think also.


Mainstream graphics cards are not even maxing out 2.0, so why bother? (There's something like a 1/2% drop in frames between x8 and x16 and something like 8% between x4 and x8 - I'd link you to the article but can't remember what site it was on.)


Enterprise SSDs are pretty much maxing it out, so in that respect I'm all for it. It's good to know the technology is there; no point implementing it on boards which will be pretty much obsolete in 3 years though.


Interesting article on it: http://www.tomshardware.co.uk/pci-express-3.0-pci-sig,review-31962.html


RuffeDK said:

Oh look. Another Intel fanboy.


TAKE COVER!

Constructive comment, well done you. Would love to know how you came to that conclusion.


I'm still sticking to my initial thought.. They've got some catching up to do.

FYI, I'm an X58 user, and a previous P45 user... And I am NEVER going to "waste" my money on another "high-end" Intel platform. By "wasting" I mean that I don't need the kind of power and features (4GHz, HyperThreading, etc.) that e.g. a Core i7 delivers. For me it's just burning $$$. That's why I'm looking into AMD for my next build.


As for all the DELAY-dozer "hype", people think it's gonna be the "new AMD64" from back in the 2003 days. Which I hope it will be.

Hopkiller said:

Dude, by the time consoles adopt DX11, PCs will be on DX15.

I agree totally !!

MASSKILLA said:

I agree totally !!

I think I read an article over on GameSpot about a week or so ago which was an interview with someone working on Xbox (or at Microsoft, whichever you prefer). He said something to the effect that he thought the Xbox 360 was just now reaching about the middle of its lifespan with the release of Kinect.

I have learned from my years of building rigs that if I go Intel I never go for the top top tier; I usually go for somewhere around the mid-to-top. Why? Because you can get great deals on those processors, and they are either the same price or around the same as AMD. When they are pretty much neck and neck, I choose Intel.


Like I just got my Core i5 2500K for $170, which is very similar to the price of the AMD Phenom II X6 1090T, but the i5 outperforms it in benchmarks. So I went with Intel.

Bevan said:

The consensus on PCIe is as I thought.. and what I think also.


Mainstream graphics cards are not even maxing out 2.0, so why bother? (There's something like a 1/2% drop in frames between x8 and x16 and something like 8% between x4 and x8 - I'd link you to the article but can't remember what site it was on.)


Enterprise SSDs are pretty much maxing it out, so in that respect I'm all for it. It's good to know the technology is there; no point implementing it on boards which will be pretty much obsolete in 3 years though.


Interesting article on it: http://www.tomshardware.co.uk/pci-express-3.0-pci-sig,review-31962.html


Constructive comment, well done you. Would love to know how you came to that conclusion.


I'm still sticking to my initial thought.. They've got some catching up to do.

I'm going to use programming as an analogy here.... Two different programmers start from the basics... one goes into enterprise application development, the other starts making games.... What you're basically saying is "oh look, he's not making any games, he has some catching up to do!"


No... AMD and Intel are going separate ways, even though they are both making processors... One company is focusing on low-wattage APUs for mobile devices and laptops. On the other hand, Intel is going with low-power, low-wattage CPUs for their laptops, and integrated CPU/GPU technology for desktops.


Same thing with the AMD / Intel high-end desktop chips....

AMD is going with shared integer processing, where separate cores can share ALUs to increase the speed of integer processing, allowing multithreading at a hardware level.


Intel is going with virtualized Hyper-Threading, where the CPU presents extra logical cores to the operating system. They also include an option to disable it to improve stability at higher clocks.


- You can compare performance, but you can't compare it without also analyzing the target audience and intended applications.
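
If you want to see how those extra threads actually show up, here's a quick sketch (assuming the third-party psutil package is installed, purely as an illustration) that just compares the logical and physical core counts the OS reports - on a Hyper-Threading chip the logical count comes out at double the physical count:

[code]
# Quick look at how the OS exposes the CPU's threading.
# Needs the third-party 'psutil' package (pip install psutil).
import psutil

logical = psutil.cpu_count(logical=True)    # schedulable hardware threads
physical = psutil.cpu_count(logical=False)  # physical cores

print(f"Logical CPUs:   {logical}")
print(f"Physical cores: {physical}")

if logical and physical and logical > physical:
    print(f"SMT/Hyper-Threading exposed: {logical // physical} threads per core.")
else:
    print("One thread per core reported - no SMT exposed to the OS.")
[/code]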

Good points, and you know more details about it than me.. so I ain't arguing.

Although..


ztrain said: What you're basically saying is "oh look, he's not making any games, he has some catching up to do!"

I am not that naive, and that is not what I was trying to suggest.


I was, however, initially talking about motherboard features.. and sure, whilst those features are dependent on chipsets and CPUs, I was simply making a comparison between features and not the technologies within.


As I said, they were merely thoughts/observations.

Intel supports neither USB 3.0 nor SATA 3 natively; both of those features are added through third-party chips. Intel refuses to support USB 3.0. You'll see them backing their own proprietary Thunderbolt standard.

Thunderbolt will never be adopted as widely as USB 3.0 will be, since other companies won't use it.