Anyone knows when we can expect Pascal to be released?

Hi guys,

I was wondering if there is any information about when we can expect the first Pascal GPUs to be released.

I was thinking of buying a GPU lately (still got a GTX 670), but a friend told me to save the money and put it into a Pascal GPU later.

Well what's your upgrade budget?

Otherwise, not much info. AMD's new stuff looks like it's going to launch before Nvidia's, and Nvidia apparently showed a bunch of mock-ups at CES; no clue whether that means their production is delayed.

I guess budget is relative. I just don't want to buy a 300+ € GPU now and then buy a 500 € GPU again in six months.

I think if you invested in a high-end GPU now, it would still be comfortable with future games for a long while, so there would be no need to upgrade again in a couple of months; chasing that is the kind of thing enthusiasts who have to have the latest hardware do. You will always be waiting for the next thing if you don't spend, and you will always be out of date if you do; it's the nature of the computer market.


Yeah... I guess that's a good point, though.

Am I right in thinking that AMD has switched to a 14/16nm process, meaning their GPUs should theoretically outperform Nvidia's unless Nvidia has something similar in the pipeline? Also, my understanding is that DX12 should even up the playing field even more.

I dunno, but is AMD actually back in the game? A while ago nobody would have recommended buying an AMD card. I'm actually thinking about an R9 390X.

Well, it's not like a 390 bought now is going to be a pushover anytime soon. AMD pretty much owns the mid-range at the moment; a 390 can even do some light 4K gaming.

Yeah... it's just the energy consumption that I would call insane.

That's why their new 14nm tech is so anticipated: it should dramatically reduce energy consumption.


Is there any mid-range model that would be worth buying now and is still able to run, let's say, The Witcher 3 on max @ 1080p? I don't want to buy a high-end card now, because I'm quite sure I'll want one of the new generation as soon as they come out.

Not really under a typical gaming load. Aside from that, if you were looking more towards the high end, the Nano just got a price drop. In addition, AMD cards currently look to support DX12/Vulkan better than Maxwell does, and FreeSync can save you anywhere from 100-300 on an adaptive-sync display.

The 390 is going to end up somewhere in the middle:
http://www.tomshardware.com/reviews/amd-radeon-r9-390x-r9-380-r7-370,4178-9.html

Nvidia's cards tend to do better at 1080p, especially in a game like The Witcher, which is Nvidia-optimized.


Some extra benchmarks; the 390 wins out in the end, more so at higher resolutions.


I too am waiting for Pascal, but I will probably get a GTX 980 once the price drops after Pascal. My SLI setup performs about equal to a 970 most of the time, but some games don't support SLI at all, and a single older card is unacceptable. I'm gonna wait and see if I want Pascal, but honestly a 980 will be fine for me.

As for AMD, my friend just got a Fury Nano and she loves it. The Furys are really good. Somehow they are equal to a 980 at 1080p but better than a 980 Ti at 4K. I am very tempted to switch teams because FreeSync monitors are so much cheaper than G-Sync screens. I can't stand screen tearing and always use V-Sync. The only thing she doesn't like is the coil whine, but I have that now and it's not noticeable except when running a silent benchmark.

ceofreak, all I can say is I have the same video card as you, and I am waiting until an affordable single card can do 4K at 60 FPS with max detail in most games, so I guess I will be waiting for some time. If the card you have does what you need it to do right now, there is no sense in upgrading. Good luck deciding what to do.

Thanks, yeah, I'm not exactly sure yet. It's basically about games like Fallout 4, GTA 5, and The Witcher; I just want to play them at full detail to get the full experience. Whether it's worth upgrading now instead of waiting is the question...

I basically have enough games that still need playing, and they run great on my setup now.

I would personally wait on buying a new card.
Save some cash while waiting and splurge on the card later.
Also, GTC 2016 is only 10 weeks away. We'll know more by then.

I don't usually recommend waiting, since, you know, there's always that next thing just around the corner!
But this is a slightly different situation considering the manufacturing node we've been stuck with for a while.

If you can't wait then, well, buy the fastest card you can afford right now and don't look back. Either new or used.

Or go SLI and get 1.5-2x more performance (around 970/980 level) than you have now with minimal investment, if you don't mind dealing with double the cards, heat, and noise, plus potential SLI problems.
Here in Finland (the land of a thousand lakes and ridiculously expensive used tech) the 670s go for 100-150€ depending on the model. That's 110-160 USD for you Americans; the cheapest 670 I found on Ebay.com was an EVGA FTW model for 65 USD. I'd buy that in a heartbeat.


But commence rambling:
The smaller manufacturing node (14/16nm FinFET) is going to be amazing.

  • Same performance + lower power consumption
  • Higher performance + same power consumption

We've been stuck on the damn 28nm manufacturing node for four years now; think about what the node shrink is going to give us!
(Power consumption numbers and relative performance figures taken from TechPowerUp)

Remember the GTX 580 > Titan jump.

  • Performance at 1920x1200 resolution:
    GTX 580 = 58% of Titan (100%)
    Performance at 2560x1600:
    GTX 580 = 54% of Titan (100%)

  • Average/peak power consumption under gaming load:
    GTX 580 : 214W/229W
    GTX Titan : 208W/238W

That's insane.

And yes, I'm not comparing the GTX 580 to a GTX 680, since the GTX 680 is a mainstream chip (GK104) whereas the GTX 580 was a high-end chip (GF110).

GF110 to GK110 is the logical comparison, big core vs big core.
Although this is a comparison of full core vs partial core.
The perfect comparison would be GTX 480 vs Titan and GTX 580 vs Titan Black.
Unfortunately, TechPowerUp didn't include the 480 in the Titan review's relative performance graphs, AND TechPowerUp hasn't tested the Titan Black, so... 580 vs Titan will do.


Or we could compare GF114 (560 Ti) to GK104 (680). In that case:

  • Performance at 1920x1200 resolution:
    GTX 560 Ti = 59% of 680 (100%)
    Performance at 2560x1600:
    GTX 560 Ti = 53% of 680 (100%)

  • Average/peak power consumption under gaming load:
    GTX 560Ti : 148W/159W
    GTX 680 : 166W/186W


Since then without any node shrinks we've gotten these results with

  • tweaking the architecture
  • mastering the manufacturing process (improving yields and so on)

Which allowed for

  • better utilization of the shaders
  • throwing more (+35%) area (read shaders) at the problem (which is 28nm)

While keeping the power consumption in check, at least with the mainstream chip. GK110 to GM200 is a different story.

GTX 680 > GTX 980 (GK104 > GM204)

  • Performance at 1920x1080 resolution:
    GTX 680 = 64% of 980 (100%)
    Performance at 2560x1600:
    GTX 680 = 63% of 980 (100%)

  • Average/peak power consumption under gaming load:
    GTX 680 : 166W/175W
    GTX 980 : 156W/184W

Whereas GTX Titan > Titan X (GK110 > GM200) is a different story, like I said.
This is just about the limit of what you can do on 28nm, and it shows. ~550-560mm^2 on 28nm was nuts; 600mm^2 is outright ridiculous.
The power consumption didn't "quite" stay in check. Look at the average power consumption.

  • Performance at 1920x1080 resolution:
    GTX Titan = 64% of Titan X (100%)
    Performance at 2560x1440:
    GTX Titan = 62% of Titan X (100%)

  • Average/peak power consumption under gaming load:
    GTX Titan : 186W/252W
    GTX Titan X : 223W/243W

And if you look at Titan X vs 980 Ti, you'll see:

  • Performance at 1920x1080 resolution:
    GTX Titan X = 103% of the 980 Ti (100%)
    Performance at 2560x1440:
    GTX Titan X = 104% of the 980 Ti (100%)

  • Average/peak power consumption under gaming load:
    GTX 980 Ti : 211W/238W
    GTX Titan X : 223W/243W

tl;dr?
With Pascal + 16nm FinFET, I expect jumps in performance similar to those we've seen in the past with 580 > Titan, 560 Ti > 680, 680 > 980, and Titan > Titan X.
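If you want to play with the numbers above yourself, here's a rough sketch (my own helper, not from any of the posts) that turns TechPowerUp's "old card = X% of new card" figures into a speedup factor and a perf-per-watt gain:

```python
# Sketch: convert relative-performance percentages (old card as a % of
# the new card at 100%) plus average gaming-load wattages into speedup
# and performance-per-watt multipliers. Figures are the ones quoted above.

def node_jump(old_pct, old_watts, new_watts):
    """old_pct: old card's performance as a % of the new card (100%).
    Returns (speedup, perf_per_watt_gain) of the new card vs the old."""
    speedup = 100.0 / old_pct
    perf_per_watt = speedup * (old_watts / new_watts)
    return speedup, perf_per_watt

# GTX 580 -> Titan at 1920x1200: 58% of Titan, 214W vs 208W average
s, ppw = node_jump(58, 214, 208)
print(f"580 -> Titan: {s:.2f}x faster, {ppw:.2f}x perf/W")

# GTX 680 -> GTX 980 at 1920x1080: 64% of 980, 166W vs 156W average
s, ppw = node_jump(64, 166, 156)
print(f"680 -> 980:  {s:.2f}x faster, {ppw:.2f}x perf/W")
```

So the 580 > Titan jump works out to roughly 1.7x the performance at nearly identical power draw, which is why that comparison looks so impressive.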


Old meme is old. Yes, the power consumption is higher, but it equates to about 8 dollars a year in extra power usage. It is nowhere near what people like to make up and spout as truth.
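To show where a figure like that can come from (these are my assumed inputs, not the poster's: ~100W extra draw, ~2 hours of gaming a day, ~$0.12/kWh):

```python
# Back-of-the-envelope yearly cost of extra GPU power draw.
# All three inputs are assumptions for illustration, not measured values.

def yearly_cost(extra_watts, hours_per_day, price_per_kwh):
    kwh_per_year = extra_watts / 1000.0 * hours_per_day * 365
    return kwh_per_year * price_per_kwh

print(f"${yearly_cost(100, 2, 0.12):.2f} per year")  # about $8.76
```

Change the hours or the electricity rate and the number moves, but it stays in the "few dollars a year" range, not the horror stories.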

And regardless of the feeling, AMD has never been that far behind in GPUs, and their cards get better over their lifetime, so by the end they are either tied with the equivalent nVidia card or outpacing it.

You could buy nVidia now, but they don't support parts of the DX12 standard well, if at all, and are pulling tricks to offload graphics computation to the CPU. Pascal should fix this, but then you are looking at new nVidia prices, which are hugely inflated at launch.

I would say wait for the new AMD Polaris or nVidia Pascal stuff either way, unless you are in a huge hurry to upgrade. It will be a few months, probably four; if you are okay with that, then wait it out. If not, go for the current higher-end AMD, as it ties or beats nVidia at all matching price points in games that are not heavily broken by nVidia GameWorks.

GTA 5 works great on my 290: a solid 60 at high-to-max across the settings. I don't have Fallout 4, but word is that if you disable some of the GameWorks junk, which can actually look worse, it runs great.