Several Nvidia board partners about to launch 4GB GTX 960

So, several Nvidia board partners have recently announced 4GB versions of the GTX 960, which should be coming out later this month. Do you think that this solves at least one of the drawbacks of the 960 for people who need a larger frame buffer at 1080p but don't need the 970's horsepower (and price tag)?

With the recent large VRAM caching trends in game development (since "next-gen" consoles have weaker GPUs but can field up to 4.5 GB of VRAM), won't a 4GB 960 be slightly more "futureproof" than a 2GB one, even with the same 128 bit bus? Does Nvidia's compression tech make any real difference there?

No one has ever needed 4GB of frame buffer at 1080p tbh, but 4GB versions will be better for those who want 1440p SLI.

It is truly a stupid card if they put 4GB of VRAM on it. 128-bit bus + weak card + 4GB of VRAM = pointless card. The 960 is already weaker than AMD's cards that cost less, and now you're adding cost to it. The 4GB version is going to cost pennies less than the 970, which is a far superior card in every way possible. Using a single 4GB 960 at 1080p will be dumb since you wouldn't have the horsepower in a single card to need 4GB of VRAM. And having SLI 960 4GB versions is dumb since you can get SLI 970s for pennies more, and 970s provide a serious power boost. It's a dumb card without a need. They'll probably sell like hot cakes to idiots who have no idea what they're buying anyway...

lol, 960s are so weak, what are they gonna do with 4GB of VRAM? Just chug along just as slow, that's what.

You're right, it will only use 3.5GB

But yeah, seriously, 4GB is overkill for what is essentially a 1080p card.

And like @thecaveman said, a 128-bit bus? Really, Nvidia... smh

I'd rather have a 3GB 192-bit bus version honestly. That should have been the default 960, because as it is right now, this card has no place on the market. 4GB won't change that. I can still get a 280X for the same money and it's a better card.

I really hope AMD does not screw up this launch; they need to slap Nvidia around for making crap like the 960.

If the 960 had been released sooner, they probably wouldn't be getting such a bad rap.

They're great 1080p cards, but now that AMD cards have hit an equilibrium with lower pricing, it's kind of hard to justify purchasing one unless you're just that much of an Nvidia patron.

They probably should have included a 4GB version at launch, or at least a 3GB version. With the 2GB mark finally passed, and being passed more every day, a lot of people are looking for more RAM with a mind for longevity. Even some casual buyers think, "Hm, perhaps later I can Crossfire/SLI this card with another one..." when they invest in a card, and it doesn't take a lot of research to figure out that having more RAM on your card is a nice thing for such an application.


Well, according to the info from Nvidia after the 970 thing, this is just impossible. Maxwell, unless you have the top banana, can't handle that amount of memory. Should be a fun few months coming up.

Prediction: it comes out, gets praised, one week goes by, and then it goes down in flames with actual users rather than paid-off reviewers. AMD chuckles in the corner and releases the 390X.

The 970 can't handle a "solid" 4GB because it's a crippled GM204 with a 1/8 VRAM prosthetic. The GM206 on the GTX 960 is a fully enabled chip, but with half the performance of a non-crippled GM204. EVGA, Asus, Gigabyte, MSI and co. are just going to offer a version with VRAM modules that are twice as big on the same bus.

It's pretty clear that 4GB of VRAM on a single 960 will only prove useful in games that don't saturate the 960's memory bus at 1080p with higher details/AA; i.e., modded open-world games or recent "next-gen" cross-platform games that abuse VRAM caching. On the other hand, the 960 has The Witcher 3 bundled with it, and it supposedly falls within its "recommended" specs. :S

So far, Nvidia always brings up their "lossless delta color compression" to explain why their cards have a ridiculously small bus, and why they apparently perform better than they should with that small bus. What, then, would be the "standard" bus width equivalent of a Maxwell card with compression taken into account? If its effective bandwidth is twice as much, then the extra 2GB might not all be useless for things like a modded Skyrim or some "next gen" memory-hog console ports.
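The bandwidth side of that argument is easy to put numbers on. Here's a rough sketch: the clock and bus figures below are from public spec sheets, but the 25% compression savings is a made-up illustration, since Nvidia has never published an average figure.

```python
# Rough peak-bandwidth arithmetic for the cards in this thread.
# The 25% traffic savings from delta color compression is a
# hypothetical number for illustration, not an Nvidia spec.

def bandwidth_gbps(bus_bits, effective_clock_mhz):
    """Peak memory bandwidth in GB/s: bus width in bytes * transfer rate."""
    return bus_bits / 8 * effective_clock_mhz / 1000

gtx_960 = bandwidth_gbps(128, 7010)   # 128-bit GDDR5 @ 7010 MHz effective
r9_280x = bandwidth_gbps(384, 6000)   # 384-bit GDDR5 @ 6000 MHz effective

# If compression saved 25% of memory traffic on average, the 960
# would behave like a card with this much raw bandwidth:
effective_960 = gtx_960 / (1 - 0.25)

print(f"GTX 960 raw:       {gtx_960:.0f} GB/s")    # ~112 GB/s
print(f"R9 280X raw:       {r9_280x:.0f} GB/s")    # ~288 GB/s
print(f"GTX 960 effective: {effective_960:.0f} GB/s")
```

Even with a generous assumed savings rate, the 960 lands well short of a 256-bit card's raw bandwidth, which is why the "doubled effective bandwidth" framing deserves skepticism.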

As for the AMD comparison, a 4GB GTX 960 would probably land in price-dropped R9 290 territory, at a minimum of $250 USD. In that context, raw GPU price-to-performance really isn't a valid point of comparison anymore if you don't take other factors into consideration. In the US, AMD's high-end GPUs are so cheap that it's a no-brainer to recommend them over Nvidia in the same price bracket. That's also why there's a bunch of people complaining on the internet about the R9 280Xs/R9 290s that they ran on 500W power supplies — they didn't know better, and nobody asked for their PC's specs before telling them which card to buy. XD

I know I personally couldn't go for an R9 280X over a GTX 960, even at US prices, because the higher TDP might require the additional purchase of a slightly higher-wattage power supply (and maybe even better fans to cool my mITX case), which makes the card's individual price only one of many decisive criteria when you compare products that weren't meant to compete in the same category.

Ah right, I was under the impression that these were all further cut-down versions of the 970, with the 980 above that — just die-cut to nerf the chip.

I don't put as much stock in the compression as some people do. But hey, not a concern — the 960 just isn't worth it for anything when the 280X is cheaper and more powerful.

Chip manufacturers actually don't cut chips to "nerf" them and sell their high-end products at lower prices. Lower-end chips in a product line are usually defective production rejects. Semiconductor production unfortunately isn't flawless, so imperfect silicon chips have their defective sectors disabled and are sold as lower-end products instead of being thrown away. The launch of higher-end and lower-end cards based on the same core architecture is thus dependent on accumulating a sufficient stock of chips for each spec bracket.

So, yeah, a GTX 970 is, at its core, a defective 980, while the 960 is a whole other chip.

Hence the myriad of jokes on various forums about how the 960 now gets more frame buffer than a 970. XD

Oh, and about the 280X being cheaper, that really depends on where you can get it. GTX 960s are actually cheaper than the R9 280 non-X where I live... and by a large amount. $250 CAD for a baseline Zotac 960, $300 CAD for a Sapphire R9 280...

And shopping online isn't much of a solution with the limited delivery options in a student campus residence.

Useless on that small bus at anything higher than 1080p and high AA.

On paper, yeah, that's how it's supposed to be. The old saying is that you need 128 bits of bus for each gigabyte of VRAM, otherwise it bottlenecks the fps.

But apparently, the way games have been coded since 2013 seems to contradict that. We've already hit the point where benchmarks repeatedly show that 2GB of VRAM is lacking at 1080p, and it seems to be the VRAM that caps out before the bus width becomes a bottleneck.

I mean, back in 2010, we used to say that a card with a 256-bit bus could barely access 2GB of VRAM... Over the years, it seems like reference-design buses tended to shrink while VRAM grew.

It makes no sense to me, but that's apparently where cross-platform development is headed: optimizing for the APU equivalent of a 7870 GPU with access to about 4GB of GDDR5 over a 256-bit bus...