Nvidia lied to the consumer

Maybe wait and see what AMD's next card looks like first

 

Imagination has the best ray tracers

especially in cases where a 290 or 290X can't be handled due to power/heat/space limitations (pretty sure there aren't any ITX 290s or 290Xs).

Okay, I am NOT going to get drawn into this discussion too deeply because, well, I think the reasoning is obvious. However, I would like to take a moment to pull this off topic and address the above quote.

I will concede the size issue: our 290/290X cards are among the largest cards out right now, and it is true there is no ITX card above the 285 in our lineup. However, I feel the heat and power issues might be blown out of proportion.

I have, sitting on the desk next to me, a system built using an APU and a 290X Tri-X that is running perfectly fine on a Silverstone SFX 600W PSU. A 600-watt PSU is a pretty standard size amongst gaming rigs, and the power draw of the card, while higher than others', is hardly so high as to be a real issue.

Concerning heat, our Tri-X and Vapor-X designs do not heat up more than pretty much any comparable competitive card. While our cooler design gets a lot of flak for not being a blower style, the heat put back into the case has a MINIMAL impact on overall system temperatures in any build big enough to use the card.

Just wanted to address these two points, as you were.... :-)

 

Having read this and the LTT post as well, this is false advertising from team green. It shows why technical specifications need to come straight from the engineers, with the marketing team basing their materials on that, instead of fudging the details.

However, the actual impact of this is essentially that the GPU effectively has 3.5 GB of RAM instead of the advertised 4. While this isn't a good thing that nVidia did to their customers, it's also not some massive end-of-the-world difference between their advertised specs and what was delivered. As explained above, the missing ROPs aren't actually bottlenecking the card or adversely impacting performance, and games needing more than 3.5 GB of VRAM is still a niche scenario. All cards crater in performance once you pass their onboard memory threshold, since they then have to spill over to system RAM, which is much slower to reach across the PCIe bus.
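Tangent for the technically inclined: the way people eventually demonstrated the 970's memory behaviour was with a chunked VRAM bandwidth probe (Nai's benchmark being the famous one). Below is a minimal sketch of the same idea, assuming a CUDA toolchain; the 128 MiB chunk size and the half-block copy are illustrative choices, not the actual benchmark:

```cuda
// vram_probe.cu -- sketch of a chunked VRAM bandwidth probe.
// Allocates device memory in 128 MiB blocks until allocation fails,
// then times a device-to-device copy inside each block. On a 970 the
// blocks that land in the final 0.5 GiB segment report much lower
// bandwidth than the first 3.5 GiB.
// Build (typical): nvcc -O2 vram_probe.cu -o vram_probe
#include <cstdio>
#include <vector>
#include <cuda_runtime.h>

int main() {
    const size_t chunk = 128ull << 20;  // 128 MiB per block
    std::vector<char*> blocks;

    size_t freeB = 0, totalB = 0;
    cudaMemGetInfo(&freeB, &totalB);
    std::printf("VRAM reported: %.0f MiB total, %.0f MiB free\n",
                totalB / 1048576.0, freeB / 1048576.0);

    // Grab blocks until the allocator gives up, so the later blocks are
    // forced into the top of the physical address space.
    for (;;) {
        char* p = nullptr;
        if (cudaMalloc(&p, chunk) != cudaSuccess) break;
        blocks.push_back(p);
    }

    cudaEvent_t start, stop;
    cudaEventCreate(&start);
    cudaEventCreate(&stop);

    for (size_t i = 0; i < blocks.size(); ++i) {
        // Copy the first half of each block onto its second half, so each
        // measurement stays within a single 128 MiB region.
        cudaEventRecord(start);
        cudaMemcpy(blocks[i] + chunk / 2, blocks[i], chunk / 2,
                   cudaMemcpyDeviceToDevice);
        cudaEventRecord(stop);
        cudaEventSynchronize(stop);
        float ms = 0.0f;
        cudaEventElapsedTime(&ms, start, stop);
        // Traffic is chunk/2 read + chunk/2 written = chunk bytes total.
        double gbps = (chunk / 1e9) / (ms / 1e3);
        std::printf("block %3zu: %.1f GB/s\n", i, gbps);
    }

    cudaEventDestroy(start);
    cudaEventDestroy(stop);
    for (char* p : blocks) cudaFree(p);
    return 0;
}
```

(The very first copy tends to read a bit low from driver warm-up, so run it twice if you care. The point is just that "4GB" isn't always 4GB of equally fast memory.)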

As for the AMD driver thing, well, drivers impact a lot more than the handful of games that need over 3.5 GB, so it was a bigger deal and had a much wider impact. That being said, both are still VASTLY better than Intel bribing benchmark companies to make AMD CPUs seem like crap. Because even though it's not exactly as nVidia advertised, the 3rd-party benchmark results are still valid, since they were done with the actual cards you buy and not based off of nVidia marketing BS.

nVidia needs to make good with its customers somehow, but at the same time, people need to stop acting like nVidia shot their dog.

Wait, I thought Voodoo were the best cards on the market!

When did this happen?!?!

Hey if you call 20-30 fps playable then go for it.

So the "just the tip" philosophy works here, if they screw people just a little its not as bad as them shooting your dog.

Yes, actually. I expect you to react according to the actual damage done. If someone slaps you in the face and you go running around saying they attempted murder then you're just an idiot.

As I said, nVidia definitely erred in its marketing and needs to make amends. But it doesn't warrant the "nVidia is worse than Comcast, Time Warner, Hitler, and AIDS combined, all their stuff is shit, they raped my mother, how dare they! Those cockroaches should all put napalm up their asses and light the match because they're irredeemable human beings who deserve to suffer!" reaction that some of you are having.

Also, as I said before, their marketing material did not affect 3rd-party benchmarks in any way. They could have said the things had 140GB of GDDR27 with a jetpack and the benchmarks would still show the same result.

Do you honestly base your GPU purchasing decisions on how many ROP units they say the card has? Or on what its bandwidth is?

No, you look at benchmarks and price points so you can tell WHAT THOSE NUMBERS ACTUALLY FUCKING MEAN TO YOUR GAMING PERFORMANCE and then you make a decision.

 

Stupidity on nvidia's part, but this isn't the world falling in.

I tend to be pretty impartial when it comes to PC products, only because I have been around long enough to know all tech companies are bending us over. The hilarious part about it is that benchmarks are nothing more than a junk-measuring contest, and it's amazing how many individuals think that just because something scored well, it's all good.

Let's take this into context. I'm not saying nvidia doesn't release good products, but it still boils down to the fact that they sent a product into the world saying it was one thing, and it turned out to be something different. Who honestly is fine with being lied to? That's like your GF saying she has been faithful, and then telling you, "oh, there was this one night though, I was drunk and didn't remember till now." It's crude, but a perfect example.

 

Well, when I talk about benchmarks I refer to in-game engine benchmarks. Like if you play a lot of Arma 3 and you want to know what card to get to improve your performance, you can get some charts giving you a great idea of how the cards perform inside of that game. This is incredibly valuable to consumers, since it gives them a very tangible way of seeing real-world performance, rather than talking about ROPs, GFLOPS, buffer sizes, etc., so that they can make a more informed decision.

And your metaphor is actually not very accurate. This would be akin to you asking how many ex-lovers your partner has had, them saying 4, but later saying, "5 actually, forgot one." It's not like they said they had 64 ROPs but had 0.

My crummy metaphors aside, nvidia lied and got caught, and please don't tell me you think it's OK because it was just a "minor mistake."

I just got an MSI Gaming 780 Ti for $330.
Score! I hope the price plunges on the 970s. I still think it's a neat card. It's just really effed that they lied. But they all do it; it's just a matter of whether they get caught. I'd be really pissed if I were trying to push 4K games, but this kind of thing happens when you buy in early to new tech. I'm referring to the new 4K craze and having to have the GPU(s) to push it, as you need a minimum of 4GB for a frame buffer.
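(For what it's worth on the frame-buffer point, the frame buffer itself is tiny even at 4K. Back-of-the-envelope, assuming 32-bit colour and triple buffering:

$3840 \times 2160 \times 4 \text{ bytes} \approx 31.6 \text{ MiB per buffer}, \qquad 3 \times 31.6 \text{ MiB} \approx 95 \text{ MiB}$

What actually eats the 4GB at 4K is textures and the render targets that scale with resolution, not the swap chain itself.)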

lmao, I fixed it! Hooray!

I never said it was OK, I just said temper your anger because it's out of line with the actual offense.

Nvidia are either grossly incompetent at best or fucking liars at worst.

The actual specs don't match the specs they published at launch.

 

 

Actually, 290X CFX is faster than 970 SLI at 4K.

http://www.hardocp.com/article/2014/11/19/nvidia_geforce_gtx_970_sli_4k_nv_surround_review/5#.VMizQCusVSk

Dude, I saw a guy in a tech forum that was like:

The card has all the specs promised except:

"The VRAM arquitecure is different and the ROPS are disabled"

So it doesn't have the same specs, or what was expected.

I don't get how nVidia fanboys get off on defending nVIDIA.

What a bunch of morons.

I'm a little shocked at the way threads like this tend to go. Jumping up and down like a bunch of whining 12-year-olds.

When the 970 was released, it was praised; benchmarks across the internet showed very good performance. Everyone was as happy as a pig in shit with it.

Now this story has come to light, and it should have. I am by no means defending nVidia's apparent lack of communication between the teams working on the product. Yes, it should have been labelled correctly, but it still uses its 4GB of memory, just in a different way.

The 970 still offers great performance, and 99% of mature people who purchased one would probably still have bought it even with different stated specs. They buy based on performance in reviews and value for money, not whether there is 1.75 or 2MB of L2 cache.

Meh,

I am an AMD person through and through, and I might troll nvidia for giggles every now and then, but they don't deserve this level of silliness... they get a pass from me on this one.

I have used two 970s on friends' machines and they are excellent cards; I would have no problem recommending them to people.

...Although I try to steer people to AMD (purely because I dislike CUDA and PhysX and G-Sync and... motherf*cking GameWorks), even I see that different jobs require different tools.

*tin foil hat time* and COMPLETELY separate from this nonsense...

980, 970, 960

all of these are part of the same product stack, where the 980 sits at the top with the silicon firing on all cylinders and operating as it should...

and where units are defective, they are made into 970s where possible...

and when they aren't good enough to be 970s, they then switch more bits off and get 'em to be 960s...

Was the 960 released PURELY because nvidia have had a high enough failure rate of chips not good enough to be 970s?

It's common practice, so I am thinking this is so. My question is...

and, without knowing the full figures of shipping 960s, does this seem like it's too early to be seeing this card?

e.g. do you think nvidia have had a higher-than-normal failure rate on Maxwell?

 

It's just a thought.