So, nVidia, 780ti

So, the big announcement from nVidia today was... hang onto your hats... THE 780Ti!

 

I somehow expected... more... from nVidia. The 290X will probably still beat the 780Ti, considering that it beat the 780 while running in quiet mode...

It will probably be priced at $700-800, so it's not going to be too common anyhow.

Overall, unlike AMD's feature releases, there's pretty much nothing new here: just another overpriced card for fanboys to drool over and rage/brag/shove in everyone's face.

Maybe there are some workstation applications at a price point between the 780 and the Titan, but there's only a ~$300 difference there, which isn't a big gap at that end of the price scale.

I predict that fanboys will be rather excited for the next month or so, but there's nothing really new, aside from a slight increase in house fires.

Now, there is also G-Sync, but it's nothing as awesome as Mantle: just a select set of expensive monitors that add little to your overall gaming experience.

I think nVidia needs to start innovating, rather than just pulling the same stunts year after year...

 

Featured image courtesy of Linus.

And we thought the Titan would fall in price. I mean, that would have been the smart thing, right?

I don't really understand where the 780Ti can come in or how it can be priced. A decently overclocked 780 can beat a Titan, so will the 780Ti beat the Titan? I just don't understand where the card fits.

G-Sync seems like quite a good idea and a fix to the problem, but to be honest I generally don't notice the tearing anyway unless, like Linus said, the picture is running at low frame rates, in which case you should just turn down your settings. It's going to be VERY expensive, and surely there's nothing to stop it from running off of AMD GPUs?

 

No, this wasn't the big announcement from nVidia today. They barely spent any time talking about it. The big deal was G-Sync, which could revolutionize PC gaming. I would say it's probably going to be a bigger thing than Mantle. On one hand, Mantle gives you better performance; on the other hand, G-Sync reduces lag and completely removes stutter and tearing, which gives you a smoother experience. My money is on G-Sync.
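
To illustrate the stutter point with a toy example of my own (just a sketch of the general idea, not anything nVidia published): with plain V-Sync on a fixed 60Hz panel, a finished frame has to wait for the next refresh tick before it appears, while a variable-refresh display can show it as soon as it's ready.

```python
import math

REFRESH_MS = 1000 / 60  # a fixed 60Hz panel refreshes every ~16.7ms

def vsync_display_time(finish_ms: float) -> float:
    """With V-Sync, a frame appears at the next fixed refresh tick after it finishes."""
    return math.ceil(finish_ms / REFRESH_MS) * REFRESH_MS

def variable_refresh_display_time(finish_ms: float) -> float:
    """With a variable-refresh display, the panel refreshes when the frame is ready."""
    return finish_ms

for finish in (10.0, 17.0, 30.0, 34.0):
    wait = vsync_display_time(finish) - finish
    print(f"frame ready at {finish:5.1f} ms -> V-Sync makes it wait another {wait:4.1f} ms; "
          f"variable refresh shows it immediately")
```

Those uneven waiting times are the stutter people complain about; removing them is the whole pitch.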

Lol, I smell desperation.

It's more proprietary shit, and it only works with nVidia-recommended monitors, so.....

This technology will be available soon on Kepler-based GeForce graphics cards but will require a monitor with support for G-Sync; not just any display will work.

How is the 780Ti going to be any different from either a fully unlocked GK110 with 3GB of VRAM, or just another Titan?

What the fuck is nVidia doing:

- At their event, they showcased 10-year-old Linux capabilities on the ancient 3.8 kernel as if they'd invented Linux and it was the newest thing;

- The best new hardware they could come up with is the pathetic 760Ti and 780Ti, both of which are two-year-old technology;

- They have actually said that by the time the Steam Machines are officially released (which is still months away), they'll have a good driver for SteamOS, which is based on the old 3.4 or 3.8 kernels, and we know from their promise to release architecture info about their cards (which never really happened) how well they keep their promises;

- The last time they produced a half-decent proprietary Linux driver (not even speaking about open-source drivers; let's not, because that's nVidia's biggest clusterfuck) was more than a year ago;

- When everyone is moving towards more open platforms, they introduce an even more closed platform, limiting the use of their cards to certain monitors.... Really, nVidia? So they gave V-Sync a new name and a smaller proprietary application field and are trying to sell it as the newest and greatest thing ever? Are they on drugs?

- They still don't have a working GK180 card;

- Their overpriced piece of crap Shield has been out for maybe a few months (it's still not available in all retail stores) and they're already dumping it by offering 30% discounts;

- ...

What the fuck is happening? They seem to be on a mission to piss off all of their customers. Are they closing up shop? Are they in the process of being taken over by another company?

History will absolve me, but the way nVidia has been behaving over the last year or so indicates to me that they are in negotiations/due diligence for a complete company sell-out, possibly with some major asset-stripping. Sell, sell, sell before it's too late!

I agree, except about the Shield. Depending on who you are and your gaming style, the Shield is very nice. It may not be for you and me, but I have friends who really want one. The Shield has its place.

They have gone through worse situations; at one point, just before the 8800 was released, most of their cards were DOA. It was specifically EVGA cards, but it gave nVidia a massive bad rep at the time.

It will probably take more horrible leadership decisions before we see a downfall of nVidia, as they still have a powerful fan base to leech off. Corporations do it all the time: once they get a mass fan base to fight for them, they can slack off for a while, rake in massive profits, then get back into the game just before fiscal destruction. Some survive, some don't. Examples: Chevy and BlackBerry.

As soon as I saw G-Sync or whatever it is, I stopped reading the liveblog. I do not see the point in getting G-Sync. One thing I don't understand is whether G-Sync is a chip you install in the monitor or an actual monitor. Anyway, G-Sync, to me, sounded like an excuse for any issues with their GPUs that they didn't want to fix.

The point is, neither side would have perfect competition.

It seems like a desperation move to me.... It will be interesting to see how it is priced and how it competes with the R9 290X once it's launched!

The 780Ti is not even going to be competition for the 290X. It is going to be priced a few hundred dollars more and may offer similar performance at best.

G-Sync sounds like a huge gimmick. Remember adaptive V-Sync? It now has a new name and costs more money.

I care so much. I love seeing new $800 cards released. Yay. :L

 

All I want is a $200 card that is just amazing, kind of like the AMD Radeon 7850 or the nVidia 660...

nVidia, you're killing yourself slowly and painfully.

I'm not so sure about this.

Before we get into the whole "fanboyism" stuff, let's remember: the GTX Titan is a 4-Way SLI card, and the GTX 780 is a 3-Way SLI card. From what was rumored, the GTX 780 Ti is supposed to be 3-Way SLI as well. Note that gaming does suffer in 4-Way SLI anyway, and only certain synthetic or compute applications would benefit from 4 Titans in SLI.

Next, let's also establish that the GK110 GPU is not fully-enabled in any GTX card from nVidia, only in the Tesla and Quadro card series.

Also worth noting is that cooling performance has increased, especially given that nVidia has OEM partners who can design coolers to keep their cards cooler.

Lastly, let's also remember that the GTX Titan has its Double Precision Floating Point units fully enabled, giving it DP throughput at a third of its single-precision rate. This means it works for rendering and compute, as well as gaming. Think of the GTX Titan as something more akin to the Z87-WS by ASUS, which blurs the line between enthusiast and workstation motherboards. The GTX Titan is somewhere in between a "gaming" GPU and a productivity/workstation GPU. It can be thought of as a "diet Quadro" for people who work and game on the same machine.

So, here's where the GTX 780 Ti comes in. It probably won't feature much Double Precision Floating Point processing power, or if it does, not much more than the GTX 780 has. It probably will have more cores, though (as many as the GTX Titan, or a fully-enabled GK110 with 2880 CUDA cores).

I doubt it'll have 6GB of GDDR5. nVidia wants a large profit margin, and putting 6GB of GDDR5 on it would be bad for that right now. With the price of DRAM going up since the Hynix fab issues, putting more GDDR5 on a card would be a bad idea for profits. Also worth noting is that Hynix produced some of the best GDDR5 chips out there, which may also be bad news for the GTX 770's 7GHz (effective) GDDR5 memory modules, so we might see a shortage or a price hike.

Next, let's consider this: nVidia didn't want partners changing the cooler on the GTX Titan (only EVGA could, and that was to put liquid coolers on it). That seems to be because nVidia wanted its product to be recognized (remember that GTX Titan logo in green LEDs everyone was talking about?), but it also seems it wanted its product to look "professional-grade". Again, part of the "workstation + gamer blurred line" strategy.

But with the GTX 780 Ti, it seems that blurry line is going away. I think if nVidia is smart, it'll give gamers a full GK110 to play with, but only allow 3-Way SLI, whilst also allowing OEM partners to offer non-reference PCBs, non-reference coolers, and factory overclocks that can push it way beyond the GTX Titan. The GTX Titan would be more of a "compute, productivity, rendering, benchmarking + gaming hybrid" GPU, whilst the GTX 780 Ti would be more for gaming and benchmarking, and that's it.

Keeping it at 3-Way SLI means gamers get less lag, especially from those PLX chips. With alternate frame rendering, each GPU renders one frame in turn. So if your framerate is 60fps and you're running 3-Way SLI, each individual GPU is only delivering 20 frames per second, which means a frame takes about 50ms to render. That's roughly the minimum time before anything you do (mouse or keyboard) shows up on screen: the number of GPUs divided by the total framerate, times 1000, gives the render latency in milliseconds, so [ (3 / 60fps) × 1000 ] = 50ms. If you thought 8ms (GTG) was bad for a monitor, imagine what SLI or CrossFire can do to your lag! (And that's not including internet, monitor, mouse, keyboard, etc.)
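
To put numbers on that, here's a tiny back-of-the-envelope sketch (my own, just encoding the arithmetic above: render latency ≈ number of GPUs divided by total framerate):

```python
# Rough AFR render-latency estimate (illustrative only). Assumes the latency is
# roughly the time one GPU spends on its frame in alternate frame rendering.

def afr_render_latency_ms(total_fps: float, gpu_count: int) -> float:
    """Approximate per-frame render latency under alternate frame rendering."""
    per_gpu_fps = total_fps / gpu_count  # each GPU only delivers its share of the frames
    return 1000.0 / per_gpu_fps          # milliseconds one GPU spends rendering its frame

for gpus in (1, 2, 3, 4):
    print(f"{gpus}-way at 60fps total: ~{afr_render_latency_ms(60, gpus):.1f} ms")
# 1-way: ~16.7 ms, 2-way: ~33.3 ms, 3-way: ~50.0 ms, 4-way: ~66.7 ms
```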

That's lag no gamer could stand to use in a competitive setting. So why add more lag by adding a 4th card in SLI?

2-Way SLI or CrossFire is the practical limit for most competitive gamers, but generally a single more-powerful GPU is better than two less-powerful GPUs (due to multi-GPU issues, driver issues, game compatibility, lag and other reasons).

The GTX 780 Ti could be really good. What nVidia needs to do to make it competitive:

- Full GK110 (2880 CUDA cores, or at the very least 2688) enabled by default. That's 15 or 14 SMX clusters.

- Allow OEMs to ship non-reference PCBs, non-reference coolers, and factory overclocks waaayyyy beyond standard clocks.

- Ship with a reference core clock of 893MHz (boost of 935MHz), with 3GB of GDDR5 clocked at 6GHz (effective) on the 384-bit memory bus.

- Price it at $699.99 (USD).

- Include the nVidia game bundle with the card.

- 3-Way SLI maximum.

- Allow only 384 CUDA cores to be used for Double Precision Floating Point calculations, so it can't be used as much for compute or rendering. It should be meant for gaming as its main purpose, instead of trying to be all things at once. By limiting Double Precision, it can't be used as a workstation product, and thus won't compete with their Quadro or Tesla cards, and it won't be used for productivity (at least not much more than the GTX 780 is). Another option is that only 32 CUDA cores in every SMX cluster could run Double Precision calculations, meaning either 480 CUDA cores running Double Precision (about half that of the GTX TITAN) if we're talking about 15 SMX clusters (2880 CUDA cores), or 448 CUDA cores running Double Precision if we're talking about 14 SMX clusters (2688 CUDA cores). (See the quick numbers sketch right after this list.)

- Keep the TMUs and ROPs  of the GTX TITAN. Don't castrate the GTX 780 Ti like the GTX 780 was.
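
To make those Double Precision numbers concrete, here's a quick sketch of my own (assuming GK110's layout of 192 CUDA cores and 64 FP64 units per SMX, plus the hypothetical 32-DP-core-per-SMX cap suggested above):

```python
# Quick numbers for the capped-DP idea above. The 32-per-SMX cap is the
# hypothetical limit suggested in the list, not an announced spec.

CORES_PER_SMX = 192        # GK110: FP32 CUDA cores per SMX
DP_CAP_PER_SMX = 32        # suggested artificial DP limit per SMX
TITAN_DP_UNITS = 14 * 64   # GTX TITAN: 14 SMX enabled, 64 FP64 units each

for smx in (14, 15):
    total_cores = smx * CORES_PER_SMX
    capped_dp = smx * DP_CAP_PER_SMX
    print(f"{smx} SMX: {total_cores} CUDA cores, {capped_dp} DP-capable "
          f"(vs {TITAN_DP_UNITS} FP64 units on the TITAN)")
# 14 SMX: 2688 CUDA cores, 448 DP-capable (vs 896 FP64 units on the TITAN)
# 15 SMX: 2880 CUDA cores, 480 DP-capable (vs 896 FP64 units on the TITAN)
```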

As for pricing of their other cards, here's what I'd do:

GTX TITAN (lower performance than the GTX 780 Ti, except in compute and 4-Way SLI) drops from $999 to $879 (becomes the workstation/productivity card with a side of gaming, and for benchmarkers)

GTX 780 Ti enters at $699 (USD) (competes with the R9 290X)

GTX 780 drops from $649 to $499 (USD) (competes with the R9 290 Non-X)

GTX 770 (4GB) drops from $449 to $359 (USD) (competes with the R9 280X, and has Battlefield 4's recommended 3GB of VRAM)

GTX 770 (2GB) drops from $399 to $319 (USD) (competes with the R9 280X's price of $299)

GTX 760 (4GB) drops from $299 to $249 (USD) (competes with the R9 270X, meets Battlefield 4's recommended 3GB of VRAM, and also competes with the R9 280X on price)

GTX 760 (2GB) drops from $249 to $219 (USD) (competes with the R9 270X, and beats the price/performance ratio of the R9 280X)

That would be a much better option. It beats AMD, and offers more value to their customers.

Of course, nVidia might not choose to do this right away. It's more likely they'll wait to see what AMD does in regard to pricing first, and they'll wait for the launch of the R9 290X and R9 290 Non-X. Once that's done, they'll settle on new price points. But given all this, it's unlikely nVidia is going to stand still with their prices.

I think nVidia will have to pull out some big guns. I don't think they'll engineer a whole new GPU, at least not yet. The GK180 sounds like too much work for something that won't even be here for another 9 months (Q3 2014 and 22nm/20nm from TSMC, I'm looking at you).

Right now, I think this will be near the end of new GPU architectures and new card lineups from AMD and nVidia for a while. They can't keep pouring money into engineering, testing and whatnot like this indefinitely. They'll probably let these cards settle in, then battle it out with price cuts and game bundles. Afterwards, once 22nm/20nm is out, they'll launch a new generation with a new architecture, and it'll be the same thing all over again.

EVGA GTX 780 4-Way SLI Benchmarks

http://www.youtube.com/watch?v=hKQf931A7v4


You know, that video explains exactly what I was saying. And I did say nVidia artificially disabled 4-Way SLI support on the GTX 780 because they wanted people to buy 4x GTX TITAN for 4-Way SLI instead.

Also worth noting: 3-Way SLI is more than what most people would use or pay for anyway (and few games support 4-Way SLI, so why bother?). 3-Way SLI is better for gaming anyway, like the video says, and most gamers would opt for 2-Way or just a single, better GPU. nVidia still gets to charge a premium for the GTX TITAN like they want, but for most gamers the GTX 780 Ti will be enough.

So yeah, I'm glad you agree with me. =)

Yes, that would have been my first guess: disable 780 4-Way SLI to sell TITAN 4-Way SLI. Not that anyone is going to need 4-Way SLI for gaming, and if you need that kind of power for professional work, there are better cards geared to that.