Nvidia embraces competition in its very own way

It didn’t take them 2 years to design it. Nowhere near that.

2 years of legal back and forth perhaps to get into the GPU game, yes.


Eh Dell is not a GPU vendor so … not sure how that’s relevant.

Because businesses are honest and would never break an agreement, right? Also, they were not allowed to talk to anyone about it, so why would they talk to their competition? Doesn’t make sense. What does make sense, though, is taking the shitty deal and hoping the competition doesn’t, in the hope of getting better and/or more supply.

Pretty sure they have decently paid lawyers who looked into that. It probably resulted in no available legal action (because the contract wording may not have given grounds for it), or even if there was one, it would have taken too long. Time they wouldn’t have if the competitors had already agreed and were getting the nice chips they weren’t.

Yup, the hardware side isn’t the issue, and I never talked about 2 years anyway, unless I was drunk and don’t remember it.

PS: BTW, I’m not saying that I agree with the decision to go along with the GPP; I’m just saying it makes sense as a business decision. In business you don’t always have to like what you (have to) do.

Memory suppliers are competitors as well and are now facing yet another lawsuit for price fixing. ¯\_(ツ)_/¯

Edit: Businesses are not like countries; they always talk about something, because at some point even the competition might be useful to partner with.

That’s a different kind of beast, though. One is being the supplier (memory), the other is being dependent on a supplier (GPU makers).

Apples and Oranges and stuff…

Is it? I thought we were talking about companies talking to each other despite being competitors. I see no difference whether it’s oil companies, memory suppliers, or GPU vendors talking about stuff they are not supposed to talk about.

For fuck’s sake:

There is so much bullshit in this response. First, you have the fact that one is green, one is red. Second, they created a complete mess with the 1060, where the 3GB and 6GB versions have different silicon. Third, you have the 1030’s different memory configurations. Then of course you have the MX150 laptop shit. But they want… clarity.


Meanwhile at Nvidia

[screenshot]

… the gaming experience of a graphics card depends so much on the GPU that is chosen.

It shouldn’t. This is why we have Vulkan/OpenGL/Direct3D, so that you don’t need to care about what kind of GPU you are using! The card you use affects performance, but that’s not just the GPU; it’s also dependent on memory and cooling, and whether the GPU you have is cut down or not.
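To make that concrete, here’s a minimal sketch (assuming a Vulkan loader and driver are installed) of enumerating GPUs through Vulkan’s C API; the exact same code path runs whether the device is AMD, Nvidia, or Intel:

```c
/* Vendor-agnostic GPU enumeration via Vulkan.
   Minimal sketch: error handling and extensions omitted. */
#include <vulkan/vulkan.h>
#include <stdio.h>

int main(void) {
    VkInstanceCreateInfo ci = { .sType = VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO };
    VkInstance instance;
    if (vkCreateInstance(&ci, NULL, &instance) != VK_SUCCESS)
        return 1;

    uint32_t count = 0;
    vkEnumeratePhysicalDevices(instance, &count, NULL);   /* query device count */
    VkPhysicalDevice devices[8];
    if (count > 8) count = 8;
    vkEnumeratePhysicalDevices(instance, &count, devices);

    for (uint32_t i = 0; i < count; i++) {
        VkPhysicalDeviceProperties props;
        vkGetPhysicalDeviceProperties(devices[i], &props);
        /* vendorID/deviceID are available, but nothing here branches on them */
        printf("GPU %u: %s\n", i, props.deviceName);
    }
    vkDestroyInstance(instance, NULL);
    return 0;
}
```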

The exception would be VESA Adaptive Sync support, but Nvidia could easily solve that problem by supporting it on their cards. Since it was derived from embedded DisplayPort, I wouldn’t be surprised if the circuitry is already in their mobile GPU chips.

What exactly does the API have to do with the number of compute units in a card? Of course the GPU matters… and yes, different cards perform differently with the same GPU type, but the GPU type arguably has more impact.

Nvidia can thank mining for that. I think Nvidia predicts a major loss next quarter, but that would just be getting back to even.

The GPU type distinction Nvidia wanted to push was AMD/Nvidia, not even Polaris/Pascal/Vega/Volta, let alone what chip is used.

If you aim to fixate consumers even more (assuming that’s possible) on the GPU manufacturer rather than the architecture, that suggests you might be planning more proprietary APIs (G-Sync, CUDA) or APIs that favour one company’s products (GameWorks/HairWorks).

My point is that the brand/type/manufacturer of GPU shouldn’t matter directly, only the resulting performance of the architecture in a particular card’s configuration; to suggest otherwise is to lay the groundwork for a return to Voodoo-era APIs specific to each vendor, like 3dfx’s Glide.

You can see something similarly worrying happening on Linux, where Nvidia was trying to push EGLStreams over GBM (the system everyone else already uses).
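For reference, GBM is a deliberately small buffer-allocation API; here’s a rough sketch of the interface everyone else codes against (the /dev/dri/card0 device node is an assumption and varies per system):

```c
/* Allocating a scanout-capable buffer with GBM (link with -lgbm).
   Sketch only: a real compositor would wire this into EGL and KMS. */
#include <fcntl.h>
#include <stdio.h>
#include <unistd.h>
#include <gbm.h>

int main(void) {
    int fd = open("/dev/dri/card0", O_RDWR);  /* assumed device node */
    if (fd < 0) { perror("open"); return 1; }

    struct gbm_device *gbm = gbm_create_device(fd);
    if (!gbm) { close(fd); return 1; }

    /* A buffer the display engine can scan out and the GPU can render to. */
    struct gbm_bo *bo = gbm_bo_create(gbm, 1920, 1080,
                                      GBM_FORMAT_XRGB8888,
                                      GBM_BO_USE_SCANOUT | GBM_BO_USE_RENDERING);
    if (bo) {
        printf("allocated %ux%u buffer\n",
               gbm_bo_get_width(bo), gbm_bo_get_height(bo));
        gbm_bo_destroy(bo);
    }
    gbm_device_destroy(gbm);
    close(fd);
    return 0;
}
```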

I’m at the point now where I don’t even care how strong Nvidia’s hardware is. They’re on my shit-list for the foreseeable future.

AMD hardware is “strong enough” for my purposes, they’re supporting open source, and their hardware is more general-purpose and stronger at anything outside of gaming, so I’m not too fussed about how quickly Nvidia runs legacy DX11 software. Any card runs legacy software quickly enough.


Having an open source driver really decides it for me; there’s nothing quite as future-proof as having the source code for your hardware’s driver.

Although, I remember the Console Hacking talk about the PS4 that mentioned reverse-engineering the AMD GPU firmware itself; that would be even better!


An ideal world, of course, would be Nvidia officially supporting Nouveau.

With both drivers open, you could probably better compare actual hardware performance; or at least see what game-specific optimisations are being used.

In an ideal world Nvidia wouldn’t be a bunch of shitheads… but here we are.


AMD and Intel should just cross-license a new PCIe bus spec and not give Nvidia a license.

They’re both enemies of NV now and could probably do it quite easily. Previously, when Intel tried it, AMD/ATi weren’t so close to them…

Edit:
I say that from a “how Intel and AMD could kill Nvidia” perspective, not actually wishing for that to happen. Mostly.

How would that work? PCI-SIG lets anyone with a membership have access to the specifications for free.

I imagine if that did somehow happen, you would see NVLink for GPU-to-CPU connections being pushed even more, with IBM POWER CPUs for the high end and Nvidia ARM CPUs for the low end.

It would torpedo their gaming market share, but the company itself would still have CUDA dependence in datacenters and on workstations.

They’d invent a new slot/bus type.

I’m not saying it’s good for the market, merely that an Intel/AMD partnership could conceivably do this to Nvidia. It may even be likely if Nvidia aren’t permitted (or able) to put their GPUs on-package with Intel or AMD CPU cores, due to not having a license to the EMIB technology or control of a fab that builds the things.

I’m saying that if Intel and AMD wanted to, it’s merely an extension/retry of what Intel tried to do to Nvidia back in the nForce days, and to VIA some time prior.

The gaming market would not shift to POWER CPUs when all the consoles are on AMD APUs anyway.

Prior history of these shenanigans:


Mech 2 sounds a lot better than Gaming

It is more like Armor Mk2 than Gaming.