NVIDIA Q2 FY2021 Earnings; Datacenter overtakes Gaming

@SoulFallen That’s interesting. I thought Nvidia had tried to stop their gaming cards from being used in data centers? I suppose that is easier said than done…

…do they tell you to do one if you try to RMA a faulty gaming card that has only ever been run in a server?

Oh they have done a lot to try to stop it. They refuse to partner with us, plus a bunch of other hostile shit such as refusing to sell direct to us or give us early product samples (that hurts me specifically), because we sell gaming cards in our servers. It’s also why we have such a huge share of that market: we’re one of the only integrators that does it.

Usually a non-consideration… they tend to live for years, and when we do have to get them replaced we don’t say it’s from a server, because again we don’t have direct relations.


AI, supercomputers, calculations and more.

GPUs are superior to CPUs at many tasks.
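A toy illustration of the kind of task where that’s true: data-parallel math, where every element is independent, so a GPU can spread the work across thousands of cores. This is just a NumPy sketch run on the CPU to show the pattern; the whole-array formulation is exactly the shape of work that maps well onto a GPU.

```python
import numpy as np

def saxpy_loop(a, x, y):
    # Serial, one element at a time -- the CPU-style formulation.
    out = np.empty_like(x)
    for i in range(len(x)):
        out[i] = a * x[i] + y[i]
    return out

def saxpy_vectorized(a, x, y):
    # Whole-array formulation: every element is independent,
    # which is why this kind of op flies on a GPU.
    return a * x + y

x = np.arange(4, dtype=np.float32)   # [0, 1, 2, 3]
y = np.ones(4, dtype=np.float32)
print(saxpy_vectorized(2.0, x, y))   # [1. 3. 5. 7.]
```

Branchy, serial logic is the opposite case, and that’s where CPUs still win.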


Does the data center usage have something to do with their release of GeForce Now?

No, not in particular.

It makes sense. A lot of datacenter GPUs are used for AI/machine learning development. It is more cost-effective for businesses to offload computation to another host rather than develop in-house solutions. The biggest tech companies that specialize in hosting and building such solutions include Amazon, Google, and Oracle.

Nvidia’s datacenter focus can be seen in the introduction of tensor cores: first in the Volta architecture (specifically the Titan V), and then in gaming GPUs through DLSS and, to a lesser extent, ray tracing with RT cores.
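For anyone unfamiliar, the basic tensor core operation is a fused matrix multiply-accumulate, D = A×B + C, on small tiles (4×4 in Volta), with FP16 inputs and FP32 accumulation. A rough NumPy sketch of that primitive (illustrative only, not how you’d actually program tensor cores):

```python
import numpy as np

def tensor_core_mma(a_fp16, b_fp16, c_fp32):
    # Sketch of the tensor-core primitive: D = A @ B + C on a tile.
    # Inputs arrive in FP16; the multiply-accumulate runs in FP32.
    return a_fp16.astype(np.float32) @ b_fp16.astype(np.float32) + c_fp32

a = np.eye(4, dtype=np.float16)             # identity tile
b = np.full((4, 4), 2.0, dtype=np.float16)
c = np.zeros((4, 4), dtype=np.float32)
print(tensor_core_mma(a, b, c))             # a 4x4 tile of 2.0s
```

Deep-learning workloads are dominated by exactly this operation, which is why one fixed-function unit for it pays off in both datacenter training and DLSS inference.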

If you think about Nvidia from a production and business perspective, they can sell the super-premium tensor-core parts for data centers and reuse chips that didn’t pass QC as Quadro cards for Turing/Ampere. This saves money and keeps the chips from going to waste.

I think GeForce Now is just an application of what I talked about above. The project probably started off as a marketing example for Microsoft and Google. If I had to speculate, Nvidia probably reuses extra datacenter GPUs from the past generation for that service. If demand for the service rises, they will produce more.

I vaguely remember @wendell, Steve from GN, or Linus from LTT mentioning this same method of reusing datacenter parts for consumer goods.

Here’s a good read if you are interested in tensor cores