AMD, GCN, and pulling Fury X review samples

I'm going to come out and say it: I am a Lenovo fan, currently stuck on a Lenovo G570 with a shitty Sandy Bridge Core i3, and I hate the performance my Intel chip gives me. My previous notebook was an AMD Athlon variant and it played Arma 2! This POS CPU/iGPU was supposed to be an 'upgrade'. My ass.
My next laptop will be a Lenovo running a Carrizo APU. The only saving grace Intel has had is that when I dual-boot into Linux, the performance and battery life are a bit better. But Intel doesn't release drivers with every new distro release, and AFAIK doesn't plan to for a while once kernel 4.0 releases. On Linux, to get the best performance out of an Intel chip, you have to install the intel-microcode package.
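
If you want to verify the update actually took, here's a minimal sketch, assuming a Linux x86 box (the kernel exposes a per-core `microcode` field in /proc/cpuinfo):

```python
# Print the loaded microcode revision for each logical CPU.
# Linux-only: parses the "processor" and "microcode" fields of /proc/cpuinfo.
def microcode_revisions(path="/proc/cpuinfo"):
    revisions = {}
    core = None
    with open(path) as f:
        for line in f:
            key, _, value = line.partition(":")
            key, value = key.strip(), value.strip()
            if key == "processor":
                core = value
            elif key == "microcode" and core is not None:
                revisions[core] = value
    return revisions

if __name__ == "__main__":
    for core, rev in microcode_revisions().items():
        print(f"core {core}: microcode revision {rev}")
```

If the revision changes after installing the package and rebooting, the update loaded.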

Bulldozer was fine. Sure, it failed to live up to the hype, but AMD went the CMT route while Intel went the SMT route. Intel bought out the market and devs programmed for SMT, which is why Windows 7 and 8 needed special scheduler patches to work properly with AMD's cores. CMT is amazing at computational loads like cracking passwords and Bitcoin/cryptography. If it is a scientific problem, AMD has you covered. OpenCL, for example, skull-f*%#ks CUDA in serious computational workloads; otherwise all the Bitcoin miners would have bought Nvidia cards and Intel chips. When I mined, I could run my 8120 and 7970 on tandem workloads and almost double my output. Eight CMT cores could crunch an obscene amount of data efficiently.
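
To put a number on that kind of embarrassingly parallel crunching, here's a toy sketch of a brute-force/mining-style workload (the names and the toy "difficulty" check are mine; multiprocessing uses separate processes, so every core gets used):

```python
import hashlib
import os
import time
from multiprocessing import Pool

def hash_range(args):
    """SHA-256 a range of candidate strings, counting toy 'difficulty' hits."""
    start, count = args
    hits = 0
    for i in range(start, start + count):
        digest = hashlib.sha256(str(i).encode()).hexdigest()
        if digest.startswith("0000"):  # stand-in for a mining difficulty target
            hits += 1
    return hits

if __name__ == "__main__":
    workers = os.cpu_count()            # e.g. 8 on an FX-8120
    total = 4_000_000
    per = total // workers
    jobs = [(i * per, per) for i in range(workers)]
    t0 = time.perf_counter()
    with Pool(workers) as pool:
        hits = sum(pool.map(hash_range, jobs))
    dt = time.perf_counter() - t0
    print(f"{total / dt / 1e6:.2f} M hashes/s, {hits} hits, {workers} workers")
```

Throughput on a workload like this scales close to linearly with core count, which is exactly where the 8120 earned its keep.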

Unfortunately, the market is a gaming market, and 90% of games don't do that sort of thing. Arma does, though, and they optimized for Intel's architecture. When I had my 8120 I could easily host a Wasteland server and game on that same server at the same time, from the same machine. My current Intel chip, a 4770K, suffers hard doing the same task.

I think it is the OS and kernel. Linux and macOS don't build in optimization for one brand over another like Micro$haft Winblows does. Again, Windows needed patches just for FX chips to work right. Do you need those patches on Linux or Mac? NOPE. Microsoft and Intel (and probably Nvidia) are bed buddies, and it shows.
Nvidia's Linux drivers were utter shite until Linus Torvalds publicly gave Nvidia the finger and said 'FU, Nvidia' in a public forum over the drivers. And it still took Nvidia how many years to release an open-source version of PhysX? The fact that AMD open-sources the majority of their stuff only helps them in the Linux/Mac sector.

@Kat The topic of the Bulldozer architecture is rife with danger, dashed hopes, and dogged determination against distortions.

I think Logan said it best when he said that Bulldozer was essentially workstation parts released for the average consumer, with some enthusiast features. Some aspects of Bulldozer were spot-on - overclocking is a simple and straightforward affair. Managing to fit more into the real estate available with the given lithography was also an accomplishment. Having more cores would be great for an environment that valued multi-threaded workloads, but, sadly, the gaming market was, and is, slow to evolve in that direction. So Bulldozer's performance for gaming was kind of poor. Since the efficiency didn't improve until the Vishera update was released, yeah, the Bulldozer launch was poorly received. Ideally, Vishera should have been the first released product, and AMD should have released a Richland update for the AM3+ socket. Oh well, though.

However, in games that are properly optimized, or in workloads that can utilize multiple threads, the aging Vishera products are still great performers. Even without those optimizations, though, the original Bulldozer cores weren't terrible; I cackle like mad when I see an FX-4170 sneak into a review today and still perform just fine.

EDIT - Whoops. Hit submit too early.

Now, as for the 300 series and the pulled review samples... After sleeping on it, I have come to the conclusion that I do not care. AMD could just as easily have pulled samples because of the comments site-goers were posting, or because they felt reviewers missed the point. The 300 series changed its tone and what it was marketed for compared to the previous generation. If you are only looking at the hardware, you may miss that point - for better or worse.

Intel's iGPUs have always been terrible. Mind you, Intel has never shown any interest in its integrated graphics doing anything beyond displaying your GUI on a monitor. So, in a way, I would argue it's your own fault for investing in an Intel-based laptop with no dedicated graphics and expecting to game on it. Most of them barely can as it is.

Intel didn't really buy out the market; if they truly had, you wouldn't be able to get ANY AMD-based product. All Intel really did was pay OEMs to use Intel CPUs, because AMD hadn't bothered to bring anything truly amazing to the table. PLUS, AMD processors weren't good overall performers; they performed almost exactly the same as an i3, and the only thing AMD had was computational performance, as you mentioned. AMD has always prided itself on OpenCL and OpenGL performance; nothing could touch them there. Also, even if Intel hadn't paid, people were going to use Intel anyway, because the Phenom II didn't do so well, and everything else didn't do so well either.

The market was moving toward single-threaded performance being king, and AMD didn't bother to get on the boat until Zen. So the whole "Intel paying OEMs / Win-Tel master race" thing is really AMD's fault for letting it get so bad. Clearly AMD didn't bother to make something that would annihilate Intel, and Intel, sitting on the throne for so long, didn't bother to keep innovating. For what? They were king regardless of what they did. Now AMD is getting on the ball? Pssh.

As for the Intel/AMD driver thing on Linux: let's be real, Intel doesn't care about Linux. They say they do, but they don't. As for AMD and Linux, AMD cares about Linux but doesn't deliver. In some ways I would argue that AMD thrives on the false hope of the tech community, because they tell you "we've got this thing coming, it's amazing!" and really it isn't. HSA has been around for almost a year and a half now; where's the progress? AMD kept mentioning that they have Kaveri-based tablets; where are they? AMD said Carrizo-based products are coming? How many, two? Or three? It can't be a one-way street in the tech community. You can't blame Microsoft for AMD's problems. AMD has been digging its own grave for a while. Clearly CMT wasn't amazing, and AMD admitted that themselves. Now that AMD has broken themselves, it's time to rebuild. AMD needs to rebuild its strategy, because they allowed Intel to dominate for far too long.

As for gaming on Linux, blame developers for that. There are more people gaming on Windows than on Linux. Why? Because Microsoft has the PC market in an unholy grasp that, in my opinion, isn't going to be broken. Plus, there were too many problems with drivers back then anyway, so not developing for Linux means less headache for developers.

All the AMD APUs from 2011-ish onward were based on the same CMT architecture as Bulldozer/Piledriver/Vishera. All of those APUs had two modules with two cores per module, and the cores in the modules were weak as hell, because each module shared its cache, instead of how SMT works, where each individual core has its own cache. I would say that after Piledriver they should have thrown that architecture out the window, because CLEARLY it wasn't working, but AMD continued to beat the dead horse: wasting more R&D money on the 9370 and the 9590 and trying to sell them to us for $500+ when our existing Vishera chips were performing exactly the same. They focused too much on APUs and not enough on everything else. Like I mentioned in my comment above, AMD thrives on false hope. "Hey, we have this 12-core coming, it's going to be amazing!" Then it gets released: "Oh, we meant 12 compute cores: 4 processor cores and 8 graphics cores." You mad, people!?
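
You can actually watch the module sharing bite on an FX chip. Here's a rough Linux-only sketch (the CPU numbering is an assumption: on FX parts, logical CPUs 0 and 1 usually sit in the same module, while 0 and 2 sit in different ones; a pure-Python loop only approximates a real shared-front-end workload):

```python
import os
import time
from multiprocessing import Process

def spin(cpu, iterations=20_000_000):
    """Pin this worker to one logical CPU and grind integer work there."""
    os.sched_setaffinity(0, {cpu})  # Linux-only syscall wrapper
    x = 1
    for _ in range(iterations):
        x = (x * 1103515245 + 12345) & 0xFFFFFFFF  # cheap LCG step
    return x

def run_pair(cpus):
    """Time two workers running at once, each pinned to a given CPU."""
    procs = [Process(target=spin, args=(c,)) for c in cpus]
    t0 = time.perf_counter()
    for p in procs:
        p.start()
    for p in procs:
        p.join()
    return time.perf_counter() - t0

if __name__ == "__main__":
    # Assumed FX layout: CPUs 0 and 1 share a module; 0 and 2 do not.
    print(f"same module:      {run_pair((0, 1)):.2f}s")
    print(f"separate modules: {run_pair((0, 2)):.2f}s")
```

On a module-based FX, the same-module pair tends to come out slower because the two cores fight over the shared front end and L2; run the two workers on two full, separate cores and the times should be about equal.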

Like I mentioned to @LinuxMaster9:

Linux is just 1 or 2% of the market; AMD would go out of business if they focused strictly on Linux. Their processors perform great everywhere else that isn't Linux. And it can't be a one-way street: "oh, it's Microsoft's fault that AMD sucks." No, AMD just truly hasn't brought anything groundbreaking. The only thing AMD did that really broke the tech community was the 9590. If the 9590 had actually been a brand-new architecture like Zen, actually clocked at 5 GHz, and we didn't know how it was going to perform, the tech community would have been screaming their heads off in excitement the way we did when the Fury X was finally shown to us.

Should have and could have are two different things. AMD had already paid its fabs to tool their factories for the CMT-based architecture. AMD would have gone belly-up if they had cancelled the order on parts and retooled the fabs for a different architecture; the downtime from retooling would have hurt the company more than just finishing out the contract.

You mean how AMD cancelled Skybridge? That was cancelled and it didn't really hurt them. AMD continued to beat the dead horse that was the CMT-based architecture, and like I've said many times, I don't think they should have kept investing in it. If they hadn't released the 8370E, the 8370, the 8320E, the 9370, and the 9590 and had just focused on Zen, we probably would have had it by January. But NOPE, gotta have more APUs, you know, the one that has "12 cores". And don't forget the same FX processor re-released five times.

AMD cancelled Skybridge BEFORE it went into production. Once a chip is in production, it is VERY costly to retool the factory.

Also, remember that Skybridge was the AMD+ARM chip: a different architecture.

Well, shared cache was one of several perceived "deficiencies" of the Bulldozer family, and AMD helped offset it by giving the chips an obscene amount of cache. Long pipelines and lower IPC are other oft-cited causes of the inferior performance. Furthermore, I doubt AMD spent much R&D on the FX-9370 and FX-9590 (they are, after all, Piledriver cores); more resources likely went into sourcing the Asetek-derived AIO cooling solutions and the binning process that qualified the chips. As silly a move as it was to release these "furnaces" - at least in the minds of the knowing and the enthusiast - from a marketing standpoint it was not a terrible one. I was not surprised when AMD did it, and the same goes for the "energy efficient" FX line. Take a quick browse through just about any tech forum and you're bound to run into people who have purchased either of these two processors. Not everyone likes to delve into the nitty-gritty details, and some people are just willing to throw down extra money for something they think they want, without the added effort. That is the brilliance of marketing and market segmentation - if you can sell the same, or a similar, product in a different way, you will make that many more sales. It works.
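
On the IPC point: instructions per cycle is something you can measure rather than argue about. A minimal sketch, assuming a Linux box with the perf tool installed and permission to read the hardware counters (the wrapped workload is just an example):

```python
import re
import subprocess

def measure_ipc(cmd):
    """Run `cmd` under `perf stat` and compute instructions per cycle."""
    # perf prints its counter summary on stderr.
    result = subprocess.run(
        ["perf", "stat", "-e", "instructions,cycles", "--"] + cmd,
        capture_output=True,
        text=True,
    )
    counts = {}
    for line in result.stderr.splitlines():
        m = re.match(r"\s*([\d,]+)\s+(instructions|cycles)", line)
        if m:
            counts[m.group(2)] = int(m.group(1).replace(",", ""))
    return counts["instructions"] / counts["cycles"]

if __name__ == "__main__":
    # Example workload; substitute whatever you want to compare across CPUs.
    ipc = measure_ipc(["python3", "-c", "sum(x * x for x in range(10**7))"])
    print(f"IPC: {ipc:.2f}")
```

Run the same workload on an FX and on a contemporary Core i5/i7, and the IPC gap the reviews kept citing shows up directly in the numbers.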

Anyway, I am digressing. The point I was going to make is that AMD's current offerings do not fit our enthusiast-minded mold of delivering great performance in the particular way we expect. While games like Tomb Raider are well optimized and it hardly matters what CPU you have, a lot of other games do depend on the single-threaded performance that Intel provides. If more games were coded like Tomb Raider, AMD and the Bulldozer family would not receive the "hate" they do. We can certainly blame AMD for its shortcomings, but we shouldn't forget to blame game developers for being lazy. Thank god AMD produced Mantle to show them the error of their ways!

As for APUs... Well, as I said before, the topic of the Bulldozer architecture is rife with danger, dashed hopes, and dogged determination against distortions! In other words, there is a lot to say, especially on the subject of expectations versus reality. Perhaps, one day, parallel computation and HSA will be more mainstream. Until then, we can only hope.