Nvidia - Titan X Discussion

Let's be real: what do you think of the Titan X?

We're going to have an educated discussion about this gaming card. YES, we know it should be a developer's card, but it is advertised as a gaming graphics card. [Also, to kill all arguments: the box of the Titan X clearly states "Inspired by gamers, Built by Nvidia."]

What do you think? Here are a few questions to start this discussion.

  • Do you think it will outperform the GTX 980?
  • Do you think Nvidia will release a GTX 980ti?
  • Do you think this card can game on three 4K monitors at reasonable frame-rates?
  • How do you think this card will do in Linux?
  • Do you think the GTX Titan X Makes sense now?
  • Troll Question: Do you think it will have 11.5GB of memory instead of 12GB of VRAM? lol

I kinda think it's going to start the internet rage when reviewers get ahold of it, the same way it did when the very first TITAN card was announced: "It's amazing, but not worth $1000+."

  1. Yes
  2. Yes, but not soon
  3. NO
  4. Yes
  5. For gamers I think it doesn't make any sense (sad that it's advertised as a gaming card); for developers or productivity it makes total sense. To me it's just like a cheap Quadro (with DP, but no Quadro drivers)


It is basically the Titan 2. It can game well, but it is also good at productivity work. I bet that it is a good chunk faster than the 980 in games and likely much faster in productivity tasks. 12GB of VRAM is like when the first Titan was released with 6GB: overkill for gaming right now, but part of what makes it a card for devs. People who have a ton of money to drop on a gaming rig tend to be nVidia fanboys in my experience. They want the best of the best, regardless of the price, and that is what you get with the Titan X. As far as I am concerned, those are the only people who will be getting the Titan X for gaming alone; people with more money than sense, that is. It is kind of a jack of all trades.

As for how it will compare to other cards, I imagine that it will fit into the lineup the same way the original Titan did: faster in games than the 980 and likely a bit slower than the 980ti (iirc that's how it went with the first Titan). And yes, I think that there will definitely be a 980ti. The x80ti is for top-of-the-line gamers; the Titans are for top-end gamers who also use their GPUs for other things. The 980ti will likely be faster in games than the Titan X, but with less VRAM, while being slower in non-gaming applications.

Honestly, I don't think that there will be any single card able to play AAA titles at 3x 4K resolution any time soon. Maybe 2-4 Titan Xs could handle it, but one? Definitely not. It might get a lot closer to 60fps at 4K in modern titles though (here's hoping). My bet is that they released it now because of AMD, though this is all conjecture. It seems to me that they like being able to say they have the fastest card "in the world". With the 390X looming overhead, they know the 980 won't stand up, so they drop the Titan X. It should be about on par with the 390X, if not beat it in many cases (importantly, gaming). Then they release the 980ti, which will stomp the 390X, and everyone loses their minds again about "fastest in the world".

Gaining and keeping the title of "fastest card in the world" is a good way to get brand loyalty and make people want to keep buying new cards.
"Pshhh. You have the 780ti? Yeah, sure that WAS the fastest card around, but I have the 980. Pleb." That kind of sentimentality leads to more sales, which is good from a business perspective.

  1. Outperform 980? Without a doubt.
  2. Release a 980ti? With that name, no; the "ti" suffix is mostly reserved for fully enabled versions of a core. E.g. the 780 and 780ti used the same core, but the 780ti was fully enabled (minus DP), and the 980 is already fully enabled (so we're told)... But then again, knowing how these GPU companies like to ditch good naming schemes (looking at you, AMD)... So yes, they most likely will, as they still need a GM210 card to reign supreme without the "Titan" moniker that implies ultra price-gouging and pretentiousness
  3. Nothing games on 3x 4K monitors, but then again it all depends on what game you're running.
  4. Knowing Nvidia...
  5. Depends on the launch price. Regardless, the highest-end stuff is the most expensive, and those who have the coin will use it to game. Is it for anyone else besides the .001%? Most likely no (see #4)
  6. Without a doubt

12GB VRAM on an itty tiny bus. All I have to say is:
Every day I'm bufferin'

But in all seriousness, this "let's use a smaller bus because compression" is really getting on my nerves. Compression or no, the GPU will still only be able to clock out 384 bits per cycle, and when you're trying to fill 9.6*10^10 bits (12GB) of VRAM with frames that are many kbits in size, well, math happens. Quadros got away with this because they didn't need to move frames at quite as rapid a pace as gaming cards.
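
To make the "math happens" bit concrete, here's a back-of-the-envelope sketch (the 384-bit bus and 7Gbps effective memory rate are the Titan X's published specs; the fill-time figure is just illustrative arithmetic, not a benchmark):

```python
# Back-of-the-envelope: how fast can a 384-bit bus actually move data?
bus_width_bits = 384        # bits moved per memory cycle
effective_rate_hz = 7.0e9   # 7 Gbps effective GDDR5 data rate (published spec)
vram_bits = 9.6e10          # 12GB of VRAM expressed in bits

peak_bytes_per_s = bus_width_bits * effective_rate_hz / 8
print(f"peak bandwidth: {peak_bytes_per_s / 1e9:.0f} GB/s")  # 336 GB/s

# Time just to touch every bit of that 12GB once, even at the peak rate:
fill_time_ms = vram_bits / (bus_width_bits * effective_rate_hz) * 1e3
print(f"time to cycle the full 12GB: {fill_time_ms:.1f} ms")  # ~35.7 ms
```

So just streaming through the whole 12GB once takes a couple of frames' worth of time at 60fps, which is why the bus width matters more than the VRAM number on the box.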

Honestly, all I care about are the results. If they think that shrinking the memory bus will help them get higher frame rates, then let them. So long as the numbers keep going up with no appreciable side effects, I couldn't care less.

That said, I have to question how what they are doing affects the picture and whatnot. Does it really have no appreciable side effects, or does it degrade the picture? Not too long ago, frame times and latency were never mentioned in GPU reviews. Maybe something else will come up soon to measure whatever side effects their compression has. Maybe it has none. Who knows? For now, I want to see how the frame rates do.

After doing some maths, I have figured out that the Titan X is only ever so slightly faster bandwidth-wise than the 290X: Titan X = 336GB/s, 290X = 320GB/s.
Only a 16GB/s difference in bandwidth.
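
Those numbers fall straight out of bus width times effective memory clock; a quick sketch (the card specs here are the published ones for each card, nothing measured):

```python
# Peak memory bandwidth = (bus width in bits / 8) * effective data rate per pin.
def peak_bandwidth_gbs(bus_bits: int, gbps_per_pin: float) -> float:
    """Theoretical peak bandwidth in GB/s."""
    return bus_bits / 8 * gbps_per_pin

titan_x = peak_bandwidth_gbs(384, 7.0)  # 384-bit @ 7 Gbps
r9_290x = peak_bandwidth_gbs(512, 5.0)  # 512-bit @ 5 Gbps
print(titan_x, r9_290x, titan_x - r9_290x)  # 336.0 320.0 16.0
```

Interesting that they get there from opposite directions: the 290X with a wide, slow bus, the Titan X with a narrow, fast one.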

But will that affect frame rates? Will it be limited because of that? Considering the money that they poured into this thing, I highly doubt that it will matter all that much, to be honest. We will see. Again, all I care about are results.

  1. Yes.
  2. Probably no.
  3. LMAO no. Especially not with that tiny little bus.
  4. No clue. IDK much about Linux and gaming at least when it comes to GPU vendors.
  5. No. Not at all. Will elaborate below.
  6. Of course. I'd be disappointed if it actually had 12.

To me the Titan X and the Titan family in general do not make sense at all. They are workstation GPUs being marketed as gaming graphics cards. To me it is like putting a cow into a tracksuit. Or maybe a better metaphor would be using a Bentley Continental to pull a trailer. Yeah it can do it and it has the power but there are better, cheaper, tools for the job.

People who need that much VRAM and the better double-precision performance of the Titan series will be buying Quadros or Teslas, not the Titan. Whereas gamers will be buying a 980 or 390X, which will be cheaper and do a decent or better job for much less money. Considering the Titan is going to be $1300+, I don't think it makes much sense from the perspective of a gamer. Yes, one could use it for gaming and productivity, but if you're serious about either you should have a dedicated tool for the job, i.e. a GeForce/Radeon or a Quadro/FirePro.

I think this will be a card for the rich fanboys, just to say they have a "Titan." All in all, this seems to me very much like the Titan Z: the last wheezes of a company that has taken its consumers for granted and run out of ideas. In my view nVidia has lost the will or ability to innovate. They have rabid fanboys who will buy whatever they make and defend it to the grave; they have no motivation to do anything better. I think, though, after the issues with the 900 series and G-Sync, that there are cracks appearing in nVidia's armor, and they are struggling to deal with them and with a changing market.

I would think it could be a limiting factor.

Compare the 290X and the 980. The 980 is a more powerful card and benchmarks generally back that up at 1080p and even (albeit to a lesser extent) at 1440p with the 980 generally being around 15-25 FPS faster.
However, once you make the jump to 4K and memory bandwidth becomes an issue, that boost disappears. The 980's advantages crumble to the point where it is only 1-3 FPS faster, and even the same or slower in some titles. That tells me that the VRAM and memory bus are becoming a problem.
The Titan X probably won't suffer as much as the 980, but it is still concerning, especially when the 390X is supposed to have crazy-fast stacked VRAM. Seems to me that AMD is looking forward while nVidia is clinging to the past.

  1. Yes
  2. Yeah, though the k1ngp1n is about to drop or has dropped, I think, and I think Luke from LTT said it had an $800 price point or something
  3. I can dream
  4. Meh
  5. Meh
  6. 11.9999 GB

Looked into the 390X: a 4096-bit bus @ 4GHz works out to about 2TB/s, so the 390X could in theory not need the VRAM because it can spit out whole frames at a time. The Titan X will have steep competition come June
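
For what it's worth, plugging those figures in (a sketch; the 4096-bit bus and 4GHz effective rate are this thread's speculation, not a datasheet):

```python
# How many raw 4K framebuffers could a rumored 4096-bit HBM bus move per second?
# (Bus figures are speculation from this thread, not confirmed specs.)
bus_bits = 4096
effective_rate_hz = 4.0e9                      # rumored effective data rate
bandwidth = bus_bits * effective_rate_hz / 8   # bytes per second, ~2.05e12

frame_bytes = 3840 * 2160 * 4                  # one 32bpp 4K frame, ~33.2 MB
frames_per_s = bandwidth / frame_bytes
print(f"{bandwidth / 1e12:.2f} TB/s, ~{frames_per_s:.0f} raw 4K frames/s")
```

Even at that rate, textures and render targets still have to live in VRAM; raw framebuffer traffic is only a small slice of what the bus actually moves.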

I know that it was a bit of hyperbole, but it would still need vram even if it could "spit out a whole frame at a time" as textures need to be stored for easy access from one frame to the next. It will always need memory on board, but I am sure that you understand that fully.

But that is making the assumption that it is the memory bus making the difference here. It could be that different architectures do different things better. AMD has been looking to 4K for what seems like forever now. They tend to have the sort of foresight about the future of hardware that few other companies have; they just have good vision and good R&D. So is it the bus? I just don't know. Maybe AMD could get away with a smaller bus too. I don't actually know how much of a limiting factor it is, and I am sure that different approaches to GPU design result in differing dependencies on bus width. Until I can see some real-world numbers where the bus width is manually manipulated, so that I can definitively see the effect it has on performance, I am going to stay unsure about it. That said, the idea that the bus is a considerable bottleneck is plausible. It makes sense to me, but you would have one hell of a time rigging up a bus-width-manipulation benchmark, so I doubt I will see that any time soon.

Yes, I understand that. What I'm talking about is that the memory shouldn't need to be used as a buffer, because the bus can move so much data it could push the frame to the output in one go rather than store it in a buffer to be built before pushing it out.

I think the bus is an issue, but I agree that it isn't the only deciding factor. The lower number of CUDA cores is a factor too; it just doesn't have the ability to keep up at 4K. But I think that problem is made worse by the small bus. If it weren't an issue, nVidia wouldn't have bothered developing texture compression technology, which seems to me to be a bit of an engineering band-aid.

The Titan Z came out at an earth-shattering $3000 USD, over 3 times the price of the 780ti and more than twice the price of the Titan Black.

Maxwell, however, is so FP64-crippled that NVIDIA isn't even going to bother creating a Tesla with this GPU.

The Titan X will likely let you play 4K games, but if it comes in over $3k it doesn't have the draw of being a Tesla K80 you can play games on. Maxwell saved power and used its silicon for FP32. It might be a good board for machine learning for guys with large convolutional networks, if they can keep them within 12GB rather than spanning to a second card; but guys with models that big are usually using thousands of compute nodes, not deciding between 1 or 2 cards.

If it drives down the price of the Titan Black it has my vote but I'm just not sure who is going to want it.

  • Do you think it will outperform the GTX 980? absolutely
  • Do you think Nvidia will release a GTX 980ti? the cut-down version of the chip will likely be a 980ti or something
  • Do you think this card can game on three 4K monitors at reasonable frame-rates? doubtful. i'll believe it when i see it
  • How do you think this card will do in Linux? dependent on too many factors to make an accurate prediction. could be anywhere from awesome to subpar compared to Windows.
  • Do you think the GTX Titan X Makes sense now? yes in the sense of the market this card is intended for.
  • Troll Question: Do you think it will have 11.5GBs of Memory? instead of 12GB VRAM lol no it will be a full 12gb. i doubt nvidia would pull that shit on their top desktop part.

Why do people say "productivity stuff" instead of saying what it's good for?