Nvidia vs AMD business strategies?

So I have noticed a growing trend in the GPU market... Nvidia has been winning the enthusiast market on Windows, and is working on Linux. Whereas AMD has good support in the Linux community, and has been winning all the major business deals with consoles and recently with MediaTek. Source

What do you guys think of this?

If I were to invest I would probably choose AMD, because unfortunately I don't see consoles going away, and this recent deal with MediaTek could make a huge difference for them.

NOTE: I'm not baiting red vs green arguments. I am also not talking about performance. It's literally all business here.

AMD is working with whoever they can to make money and push innovation in the marketplace and in technology. Nvidia seems more worried about pushing its own proprietary products and dealing with backlash over some choices they made.


So you figure they are doing it out of despair? I never thought about it that way. Could be.

both are making $.

AMD shakes hands with their customers. Nvidia instead shoves a fist in their ass, but they seem to enjoy that more than a handshake.


And Intel is the biggest GPU vendor in the world, and according to the Steam hardware survey stats, Intel iGPU is the single most popular GPU used for gaming... all first world problems, lolz, status items.

E.g.: the nVidia GeForce series: it used to be usable for media production, but not any more; you need to buy a Quadro for that now, and guess what, then you'll also need to buy a GeForce (which is the same card but locked down in a different direction) to play games, because you can't really do that on a Quadro. And guess what, both of them suck for computation in a scientific environment, because they don't support the full OpenCL feature set and can't be scaled because they don't support OpenMP at all.

AMD, nVidia, Intel, etc... they're all just selling, if you want to benefit as a buyer, you have to find ways to play them against each other, so that they have to offer more value. AMD almost went belly up because of the guerilla marketing techniques used by Intel and nVidia, so they came through and started offering more value. Enough to make Intel react and offer more value, but not enough to make nVidia react and offer more value.

On the other hand, you know that you're going to be screwed on the value part if you're buying a 600 USD GPU. The Intel iGPUs are not really good value because they are very weak and Intel can't seem to get Beignet working, but on a cheap low-power Atom chip they're very good value, because they make use of all of the gaming performance the CPU can deliver. The iGPUs on an AMD APU in a console are very good value for the same reason: they make use of all of the gaming graphics power the CPU can deliver.

Discrete GP-GPUs only make sense if they can be deployed in a scalable and flexible manner, so that the user can determine the exact value depending on the application and the rest of the hardware. So AMD GP-GPUs are scalable, but the R9 285 is the first one that is realistically scalable for most users, because of its lower thermal and power specs; scaling older AMD GPUs requires special cooling and power arrangements that pretty much kill the value.

It's always a struggle to find value. In order to have the best shot at having access to value, it's important for users to be able to play several providers against each other. Because they will only provide value if the playing field is leveled, and if they are facing tough contenders.

Fanboyism is almost always a bad thing when it comes to leveling the playing field. That's why there are people that actually get paid to post fanboy stuff on fora, or to attack people that post good things about competitive products. Because guerilla marketing works, it brings more profit than selling actual value to customers.

This is a short video that relates to a similar situation in another technology branch where profits are mainly made with guerilla marketing instead of product value, a branch that is very similar to the computer branch, and more particularly to the discrete GPU branch. It's a video where a dude with a certain amount of valuable life experience tells the truth, a truth that is also very valid when it comes to computing hardware and software:

It's all about signal/noise... it really is lolz.

Example: which is worth more:

A. Getting 52 instead of 46 fps in some game;

B. Being able to use an open source driver, or a user-space-confined proprietary driver that doesn't require you to modprobe any binary blobs into your kernel?

Think of it...
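If you're curious which camp your own box falls into, here's a minimal sketch of how you could check on Linux by looking at the loaded kernel modules. The module names are my own assumption of the usual suspects (nvidia/fglrx proprietary, amdgpu/radeon/nouveau/i915 in-tree open source), not an exhaustive list:

```python
import os

# Out-of-tree proprietary GPU modules commonly seen in the wild (assumed list).
PROPRIETARY = {"nvidia", "fglrx"}
# In-tree open source GPU drivers (assumed list).
OPEN_SOURCE = {"amdgpu", "radeon", "nouveau", "i915"}

def classify_gpu_modules(proc_modules_text):
    """Scan /proc/modules-style text and report which GPU drivers are loaded."""
    loaded = {line.split()[0] for line in proc_modules_text.splitlines() if line.strip()}
    return {
        "proprietary": sorted(loaded & PROPRIETARY),
        "open_source": sorted(loaded & OPEN_SOURCE),
    }

if __name__ == "__main__" and os.path.exists("/proc/modules"):
    # On a real Linux box, read the live module list.
    with open("/proc/modules") as f:
        print(classify_gpu_modules(f.read()))
```

If the "proprietary" bucket is non-empty, you've got a binary blob sitting in your kernel.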

Another example:

What's the difference between 55 fps and 65 fps in some game? Is it the same as the difference between 60 and 90 fps? Or the same as the difference between 27 and 34 fps?

Think of it...
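One way to see why these jumps are not equivalent is to convert fps into frame times (milliseconds per frame), which is closer to what you actually perceive. A rough sketch:

```python
def frame_time_ms(fps):
    """Milliseconds spent rendering each frame at a given frame rate."""
    return 1000.0 / fps

def frame_time_gain_ms(fps_low, fps_high):
    """How many milliseconds per frame the higher frame rate saves you."""
    return frame_time_ms(fps_low) - frame_time_ms(fps_high)

# The three comparisons from the post above.
for low, high in [(55, 65), (60, 90), (27, 34)]:
    print(f"{low} -> {high} fps: {frame_time_gain_ms(low, high):.2f} ms saved per frame")
```

The same-looking fps gaps translate into very different frame-time gains: going from 27 to 34 fps shaves off far more milliseconds per frame than going from 55 to 65.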

What portion is signal, what portion is noise? How much of the signal portion and how much of the noise portion has determined your last choice of a GPU?

I'm not casting any stones, I'm as guilty as anyone else in this matter. I just thought that some reflection about it can be useful sometimes.