So, nVidia, 780 Ti

Yeah. Also, remember that the GTX 780 and GTX TITAN are only a few percentage points apart. For the GTX 780 Ti to be successful, it needs to offer at least a low double-digit percentage improvement, which means super-GTX-TITAN performance.

GTX TITAN will still be their compute/productivity/professional/workstation/gaming hybrid card, which will also be good for those crazy-rich people doing benchmarks and whatnot. But for most people, it'll be useless - it's there mostly for marketing's sake.

I think the GTX 780 Ti will probably be 12~15% faster than the GTX 780, and it might be sold for anywhere between $649 (I doubt it) and $799. Although, unless it beats the R9 290X, I think it'll be priced around $699 (USD).

For that to happen, nVidia is going to have to drop the price of the GTX 780 enough to make it worthwhile. Otherwise, Newegg and other retailers will be sitting on a huge stock of these cards, and nVidia won't have anything to compete against the R9 290 non-X.

So I think the GTX 780 is going to sell for between $499 and $549 (USD), because that'll put a lot of pressure on AMD. Gamers responded tepidly to nVidia's release of G-Sync and the GTX 780 Ti, so unless nVidia can re-heat their enthusiasm, it'll be a tough sell.

Also, remember that the GTX 780 is selling for $649 right now. A $100~$150 price cut will be a sight for sore eyes, especially considering this is a 3GB card we're talking about.

** EDIT ** : Hexus agrees with me. http://hexus.net/tech/news/graphics/61405-nvidia-unveils-geforce-gtx-780-ti-radeon-r9-290x-killer/

It will just be a bumped-up 780 with some extra CUDA cores and a better base clock. I can't see them making it better than a TITAN, because that would piss off too many TITAN owners who dropped $1k on the 'fastest single card'.

They never said it was going to be their fastest card yet either. Here is their live announcement: http://www.youtube.com/watch?v=M1bEJDzft-A

So I am kind of puzzled as to where this GPU is going to sit.

http://hexus.net/tech/news/graphics/61405-nvidia-unveils-geforce-gtx-780-ti-radeon-r9-290x-killer/

Nvidia boss Jen-Hsun Huang unveiled the GeForce GTX 780 Ti, described as the "best GPU that's ever been built."

I'm not puzzled at all. nVidia loves to recycle old GPU models, so a recycled GK110 sounds reasonable. However, if it's not faster than the GTX TITAN, what's the point? And if the TITAN can't beat the R9 290X, then how is a card SLOWER than the GTX TITAN going to beat it? If it's meant to compete, it should be competitive - right?

So whilst it may anger many people, nVidia looks to make money first and foremost. They are a corporation with shareholders after all.

As for how it would be faster: not by much. Think 12~15% faster than the GTX TITAN in gaming, but not in compute or rendering. Thus, it's a "pure gaming" card, not a workstation card, nor is it a 4-Way SLI card.

The GTX TITAN still holds a place - it's just not going to be "the best gaming GPU" anymore. It'll be "the best workstation + gaming GPU hybrid" if you don't want to pony up for a Quadro or Tesla. That being said, perhaps nVidia knows that some people at the higher end want a GPU that can do both for a lower price. So maybe nVidia will have four segments, instead of three:

- Pure Gaming - GeForce GTX series of cards.

- Medium Workstation/Productivity/Compute/Rendering and Top-Tier Gaming on the Side - GTX TITAN-series

- Pure Top-Tier Workstation/Productivity/Rendering - Quadro series of cards

- Pure Top-Tier Compute - Tesla series

So we'd see the TITAN-series as a new segment, meant for people like Logan, Wendell, and other people who game but also do video editing and whatnot. Not for *most* gamers. It makes sense to add this segment, so people don't have to buy one Quadro card for rendering and professional stuff, and another GeForce GTX card for gaming.

Just you fuckers wait until Volta..

So not a TITAN... but not a GTX 780... THIS IS IRRELEVANT! I WILL DRAW MY ATTENTION BACK TO MY INTERNET MEMES AS THEY SEEM MORE IMPORTANT NOW!

I'm thinking the Ti will be a TITAN with just 3GB of VRAM. More powerful, but they really want to keep the TITAN up there. The whole "beating the TITAN while in quiet mode" thing is stupid. My GTX 780 smashes the TITAN and I can't hear it. The only fans I can hear are from my crap Vaio desktop replacement. Bloody thing won't shut up.

 

+1

I was watching the WAN Show, where TimmyTechTV said that maybe nVidia took what they wanted to be the 880 or 870, renamed it to 780 Ti, and released it. I think that's 100% plausible.

Unless there is a picture of the back of the PCB, no one has a clue. All we can see is a 780 with "Ti" on the end. Nothing appears to be different. If we could find a picture of the back, we would at least know how much VRAM it has.

The 800 series will not be Kepler; it is to be Maxwell (20nm, I think).

He said it is a Kepler GPU.

crossing my fingers for 4GB and a 512-bit bus

I heard that too, but it's very doubtful. The 22nm/20nm fabrication process isn't ready. And considering that they just launched the GTX 700-series of GPUs, it's doubtful they'd stab their customers in the back by launching a new series less than six months after the 700-series made its debut.

So, I really, *really* doubt it. So it can't be a Maxwell GPU, or if it is, it isn't made on the new die shrink. If they are adopting the method of putting a new architecture on an old die shrink, that's Intel's older-than-dirt "Tick-Tock" model that they use with their CPUs. It's doubtful they'd use that on GPUs.

And also, think of the amount of engineering power it would require to make a new chip design (like the rumored GK180). If engineering the GK104 and GK110 cost a metric butt-tonne of money, then why would they spend even more money on an even larger die size and even lower yields, and then have to spend even more money optimizing their drivers for the new design? It makes very little economic sense.

Personally, I think it's much more likely they'd recycle the GK110 and improve upon it. It makes much more sense from an economic perspective, even though the hardware enthusiast side of me sighs in disappointment. I'd like to be optimistic about what nVidia plans to do, but I have a lot of experience watching the progress of hardware and observing market tendencies, as well as the costs of engineering, research and development, software development, quality assurance (beta testing), marketing and whatnot. For nVidia to make a new chip sounds economically wasteful. And without a 20nm fabrication process ready for mass production from TSMC or GlobalFoundries, they wouldn't be able to deliver that card any time soon. So, no. It's going to be a "GTX TITAN" that's overclocked and has less Double Precision, that's all.

Should have just brought out the 790.

 

Your article gave me cancer. Please don't ever post an article based on your own personal opinions.

You're giving TekSyndicate a bad name.

Since when is expressing your opinion about something bad? TekSyndicate is all about saying whatever you like about a product without bias, fanboyism or affiliation.

Nah, dude.

I'm an amateur tech journalist (non-professional/hobbyist), and I know writing is an art that is honed over time. Nobody is born the next Tolkien, or becomes the next H.G. Wells overnight. It takes time to hone one's skill.

And yes, improving one's tech knowledge and writing skills is important, but I don't believe censorship or negative criticism is the way to go about doing that. Instead of saying his writing s*cks, give him constructive criticism on how to improve his writing.

For example, here is my own article/blog about the subject:

https://teksyndicate.com/users/rsilverblood/blog/2013/10/21/gpu-wars-episode-v-nvidia-strikes-back

It may not be perfect, and it may be filled with some quite non-professional comments at times (that's because I'm not writing professionally here; it's a blog that I'm writing as a hobby, not an article I'm submitting to a formal website nor am I a paid tech journalist like Linus or Elric). But it does make a lot of good points, it uses a lot of good references (I didn't bother to add the links, since I usually write way late at night after I arrive from the night shift at work), and it takes a lot of things into context.

So yeah, don't be so upset that someone is learning. It wouldn't be OK for a tech veteran on a forum to criticize someone for not knowing the difference between SATA 3.1 and SATA 3.2, for example, so why should someone who writes for a hobby be expected to write professionally or with a great degree of skill?

Maybe it's just me, but I remember being bullied as a kid, and I didn't like it. Now that I'm more grown and have more knowledge, I can bully others with my knowledge, but my experience has taught me better. Because if you wish to grow a great community, you shouldn't try to discourage others from learning or growing. We all make mistakes, but we improve. Make a list of non-offensive, constructive suggestions and/or guidelines on how to improve his work. Don't just dish out sh*t because you didn't like it; explain how it needs improving in a non-emotional tone.

Furthermore, if you wish to see TekSyndicate have some way of featuring articles and whatnot... we do have that already. Maybe we should make a suggestion to the Tek team that we add a way to nominate blog posts written by community members so that they might appear in a sort of "community spotlight" place, where we feature "the best blog posts written by our forum members". It's a great way of getting amateur/hobby tech journalists to get their foot in the door of this type of career.

And that, for example, is how you give constructive criticism to someone. I didn't need to give constructive criticism to the person who wrote this mini blog article; I gave it to you. Be constructive, rather than negative. You encourage a better community that grows together, learns together and respects each other more because of it. It's one of the lessons I learned when I was a moderator of forums way back in the early 2000s. :)

Cheers!

OP's writing is flawed by ignorance.

I somehow expected... more... from nvidia. the 290x will probably still beat the 780ti, considering that it beat the 780 whilst in quiet mode...

- Nvidia is countering the R9-290X with a better GTX780. This is how competition works.

it will probably be priced at $700-800, so it's not going to be too common anyhow.

MSRP is USD $649, just like the GTX 780. When the GTX 780 Ti is out, prices on GTX 780s will fall to approximately $550.

overall, unlike AMD's feature releases, pretty much nothing new here, just another overpriced card for fanboys to drool over and rage/brag/shove-in-everyone's-face.

The ignorance is making my butt hurt more than a white boy in prison. Nvidia has announced a shit-ton of new innovative technologies (G-Sync as an example). Overpriced? Look at the sales. GK110 graphics cards are selling like hotcakes. The GTX TITAN was a bigger success than Nvidia had hoped. Just because the majority cannot afford the highest-end video cards doesn't mean they are overpriced.

I predict that fanboys will be rather excited for the next month or so, but nothing really new, aside from a slight increase in house fires.

What is this BS?

now, there is also g-sync, but it's nothing as awesome as Mantle, just a select set of expensive monitors that add little to your overall gaming experience.

Enjoy your screen-tearing gaming experience.

I think nVidia needs to start innovating, rather than just pulling the same stunts year-after-year...

Right, because the first release wave of R7 and R9 Radeon cards are not rebranded HD7000-series graphics cards... The R9 290 and 290X are based on a brand new architecture, and I'm looking forward to seeing what they have to offer.

Competition is only good for the consumer simply because we get better products for better prices.

Before stigmatizing me as an Nvidia-boy: I like AMD. I've owned both ATI/AMD and Nvidia cards in my time. Most of my client builds are based on AMD products, simply because they perform better for the money.

My problem here is OP's writing. He is spreading the cancer through ignorance and is giving TS a bad reputation, because people think it's OK to post BS like this.

oookkkkk....

the 780 and 780 Ti cost at least as much as the 290X, and the 290X beat the 780 whilst in quiet mode. FACT.

 

and I didn't say much about G-Sync, but I might as well. G-Sync is a fix for an uncommon issue. It actually only appears to help when your FPS is under 60... not all that common with top-end hardware, and even with mainstream hardware, simply because you're going to turn your settings as high as possible while still getting 60 fps. Additionally, screen tearing isn't that common. I, for one, have NEVER seen it. Not with my GTX 260, nor with my 7970...

the part about house fires, that was a joke...

the thing with AMD's rebrand is that they actually added a bit to the hardware, such as higher clocks and more output lanes. They have an entire additional DisplayPort output vs. the 7970.

 

thank you. I really don't write reviews/articles much, even as a hobby. I do appreciate constructive criticism, but just saying "this post is ignorant and gave me cancer" is, at best, just as stupid as the original post. It tells them nothing about what you disagree with, nor how to improve upon it.

in fact, I may start to issue warnings about such posts, especially in blogs/news articles. it's bullying, nothing more, unless the OP is obviously trolling.

nvidia... blah

g-sync....yarnnnn

my wallet...what performs best for the money? AMD? ill take it!!

the 780 and 780 Ti cost at least as much as the 290X, and the 290X beat the 780 whilst in quiet mode. FACT.

No, it's not fact, because the official R9 290X benchmarks are not released yet. Only a few specific benchmarks have been shown, in AMD-optimized games where the R9 290X has proven to be faster than the GTX 780. With time and driver optimization, the R9 290X may even be a better gaming GPU than the GTX TITAN.

AMD drivers have gotten better ever since the HD4000-series which is a good thing.

and I didn't say much about g-sync, but I might as well. g-sync is a fix for an uncommon issue. it actually only appears to help with your FPS under 60.[...]

G-Sync makes your monitor's refresh rate dynamic - like the GPU - instead of continuously fixed. As the name indicates, G-Sync makes your monitor's refresh rate sync with your GPU's fps output (up to a certain limit). A 60Hz panel might not be able to hit 75Hz. I have tried to overclock my cheap ASUS VW222 monitor, and it went from 60 to 67 Hz.

In short, G-Sync is a new way to eliminate screen tearing without having to sacrifice performance the way V-Sync does. There's also Nvidia's adaptive V-Sync, but I'm still seeing some screen tearing in certain games...
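To make the V-Sync vs. variable-refresh difference concrete, here's a toy timing sketch (my own simplified model, not actual driver or panel behavior): with V-Sync, a finished frame has to wait for the next fixed refresh tick, so a GPU averaging ~50 fps on a 60Hz panel effectively drops to 30 fps; with a G-Sync-style variable refresh, the panel scans out as soon as the frame is ready.

```python
# Toy model (an illustration, not how the hardware actually works):
# compare when frames appear on screen with fixed-refresh V-Sync
# vs. a G-Sync-style variable refresh.
REFRESH_INTERVAL_MS = 1000 / 60  # fixed 60 Hz panel

def vsync_present_times(render_times_ms):
    """With V-Sync, a finished frame waits for the next fixed refresh tick."""
    presented, t = [], 0.0
    for r in render_times_ms:
        t += r  # frame finishes rendering here
        ticks = -(-t // REFRESH_INTERVAL_MS)  # ceil to the next refresh tick
        t = ticks * REFRESH_INTERVAL_MS      # frame is shown at that tick
        presented.append(t)
    return presented

def gsync_present_times(render_times_ms, min_interval_ms=1000 / 144):
    """With variable refresh, the panel scans out as soon as the frame is
    ready (limited only by the panel's maximum refresh rate)."""
    presented, t = [], 0.0
    for r in render_times_ms:
        prev = presented[-1] if presented else 0.0
        t = max(t + r, prev + min_interval_ms)
        presented.append(t)
    return presented

# A GPU averaging ~50 fps (20 ms/frame): V-Sync rounds every frame up to
# a 33.3 ms slot (effective ~30 fps); variable refresh keeps the full ~50 fps.
frames = [20.0] * 10
print(vsync_present_times(frames)[-1])  # ~333 ms for 10 frames
print(gsync_present_times(frames)[-1])  # ~200 ms for 10 frames
```

Same frames, same GPU; only the presentation policy changes, which is why G-Sync helps most exactly when your fps dips below the panel's refresh rate.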

the part about house fires, that was a joke...

Flew right over my head. 

the thing with AMD's rebrand is that they actually added a bit into the hardware, such as higher clocks, and more output lanes. they have an entire displayport additional, vs the 7970.

From what I know, rebranding a product line is something Nvidia has done since the 8000-series -> 9000-series (a smaller die meant better performance and thermals at lower wattage). The GeForce 9800-series product line was a success, given that it was only a tweaked 8800-series chip.

Same goes for GF100: the 400-series -> 500-series transition was also a great success - by reworking the transistors, the last core cluster in the chip could be enabled without thermal issues.

And now there's GK100 rebranding, with the GK110 as the best seller.

AMD's HD6000-series cards were also rebranded and tweaked HD5000 GPUs. Both product lines did really well against Nvidia's Fermi. I loved my HD6870.

Oh well, I'm talking too much again.