G-Sync

So, Tom's Hardware just did a test on G-Sync, and so far things look promising.

http://www.tomshardware.com/reviews/g-sync-v-sync-monitor,3699.html

So far, things seem to work really well for 1440p+ monitors.  That's really good because a lot of the time you'll be below 60fps, which means you'll be prone to tearing and ghosting.  What do you guys think after reading this?
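
To put rough numbers on why dipping under 60fps on a fixed 60Hz panel hurts, here's a quick toy sketch (my own simplified model, nothing from the article): it just measures how long a finished frame sits around waiting for the next scanout, which is exactly the delay a variable refresh is supposed to remove.

```python
# Rough sketch (my own toy model, not from the article): how frames finishing
# at ~45 fps line up with a fixed 60 Hz scanout versus a variable refresh
# that follows the GPU. The display behaviour is deliberately simplified.

REFRESH_MS = 1000 / 60   # fixed 60 Hz scanout interval (~16.7 ms)
FRAME_MS = 1000 / 45     # GPU finishes a frame every ~22.2 ms (45 fps)
FRAMES = 10

# Fixed refresh with V-Sync: a finished frame waits for the next scanout,
# so some frames end up held on screen for two refresh cycles (judder).
fixed_wait = []
for i in range(1, FRAMES + 1):
    done = i * FRAME_MS
    next_scanout = ((done // REFRESH_MS) + 1) * REFRESH_MS
    fixed_wait.append(next_scanout - done)

# Variable refresh (G-Sync-style): the panel refreshes when the frame is
# ready, so the wait between "frame done" and "frame shown" is roughly zero.
variable_wait = [0.0] * FRAMES

print(f"avg added wait, fixed 60 Hz : {sum(fixed_wait) / FRAMES:4.1f} ms")
print(f"avg added wait, G-Sync-like : {sum(variable_wait) / FRAMES:4.1f} ms")
```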

I didn't read the article, but if this works for 1440p I shall be satisfied. I heard rumours that it didn't work particularly well with IPS panels, so this is good news. I will at least consider G-Sync. The drawback is that if I fitted the module to my 1440p ASUS, I would have to continue using Nvidia in the future; I couldn't switch back to AMD without giving it up.

I don't have too much ghosting or lag with my ASUS, but it would be nice to push a higher refresh rate.

I have the same problem with it I've had since the beginning: vendor lock-in.

I'm pretty positive that if Nvidia adopts Mantle, then they'll open G-Sync up for everyone.  It's really stupid to lock hardware down to one specific manufacturer, and if Nvidia does this, I'll be very disappointed in their decision.  I currently have an Nvidia card, but later on I may want AMD for my needs.  GCN is generally the architecture for gaming, so I'm pretty sure that within, let's say, a year, games will be more optimized for AMD because of the next-gen consoles and AMD generally dominating the gaming industry.

Maya and 3ds Max 2014 have already done their optimizations for AMD, so CUDA doesn't have the advantage it used to.  On the other hand, UDK and CryEngine are still optimized for CUDA for now, so it'll possibly be another generation until stream processors are properly supported.  Unity currently prefers CUDA only because tessellation in the DX11 package works better on CUDA; everything else seems fine for AMD.

Back to G-Sync, I've got to say that G-Sync won't benefit people using 1080p resolutions, and 1440p won't be the standard for another 5-10 years.  Maybe 1440p will become the standard, but most gamers at the moment game at 1080p.  1440p is still way too expensive to be a worthwhile monitor upgrade; maybe when 1440p panels drop to $200 we'll see them become dominant, but until then G-Sync will only be great for a small percentage of people.  By the time the majority move to 1440p, I have a feeling G-Sync will be pointless because much more powerful hardware will compensate for it.  Maybe that's just me, but I think G-Sync will really work well at 4K; I think that's the real focus. Then again, who will want 4K for gaming? At the moment it's a stupid idea, mainly because your high-end hardware will be rendered useless; who wants to game at barely 30fps if you're lucky? I'd rather go 1440p and get an extra 20fps.

Disclaimer: I'm not a fanboy.

Even though this technology is limited to Nvidia, it's not like the monitor can't be used with AMD; you just won't see the benefit. Also, the AMD cards haven't put a lot of pressure on Nvidia. Nvidia has said that Maxwell will support G-Sync. I think that Maxwell will be as big as the 8800GT. This could make Nvidia/G-Sync a no-brainer.

I have a decent PC, but I've been holding back on buying nice monitors; I have two pretty basic 1080p displays. I've had a feeling that something big was about to happen with monitors and that I should wait before investing in something HD+. I'm glad I did. I'll be waiting patiently for Maxwell and a nice 1440p, possibly 4K, G-Sync display.

Nvidia spent a lot of money developing this technology, and it's exactly what the industry needed in my opinion; they deserve to make some of my money on the first round of this tech. Who knows, by the time I'm ready to upgrade again, maybe AMD will have something competitive.

I wouldn't expect Nvidia to adopt Mantle. Mantle isn't the first of its kind, and Nvidia will probably replicate it; then they'll have the best of both worlds. But it's still heavily dependent on developers adopting low-level APIs for their games, and for some games low-level APIs don't make sense at all. I'm pretty excited for Mantle, and I do hope it's widely used. They demonstrated a 20-50% performance increase by almost completely removing the CPU bottleneck.
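
For anyone wondering how "removing the CPU bottleneck" turns into a number like that, here's a back-of-the-envelope toy model (all the figures below are made up for illustration, not from AMD's demo):

```python
# Toy model (hypothetical numbers, not AMD's): a frame can't finish faster
# than the slower of "CPU submits draw calls" and "GPU renders them".
# A thinner API like Mantle mainly cuts per-draw-call CPU cost, which
# only helps while the CPU side is the bottleneck.

def frame_time_ms(draw_calls, cpu_us_per_call, gpu_ms):
    cpu_ms = draw_calls * cpu_us_per_call / 1000
    return max(cpu_ms, gpu_ms)

draw_calls = 5000   # hypothetical draw-call-heavy scene
gpu_ms = 12.0       # assume the GPU itself needs 12 ms per frame (~83 fps ceiling)

before = frame_time_ms(draw_calls, cpu_us_per_call=4.0, gpu_ms=gpu_ms)  # fat API, CPU-bound
after = frame_time_ms(draw_calls, cpu_us_per_call=1.0, gpu_ms=gpu_ms)   # thin API

print(f"before: {1000 / before:.0f} fps   after: {1000 / after:.0f} fps")
# before: 20 ms of CPU work -> 50 fps; after: CPU drops to 5 ms, GPU-bound at 12 ms -> ~83 fps
```

Once the GPU itself is the limit, a thinner API stops helping, which is probably why the quoted gains vary so much from game to game.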

I agree that most of the closed-source tools are steadily being overcome. The industry is going to be reshaped over the next couple of years, but I couldn't say what the outcome will be.

I think that Maxwell will be as big as the 8800GT

What makes you think that?

Nvidia spent a lot of money developing this technology

Same question.

Nvidia was able to pull the 780 Ti out of thin air, so I wouldn't be surprised, given their resources, if Maxwell is intended to demolish AMD's R9 lineup.

The 780ti is a fully unlocked GK110, not a new part. Those have been manufactured ever since the first Titans rolled off the line last September. I would assume they just took the ones that fully worked and saved them for a rainy day.

It's too late for Maxwell to be any kind of direct response to Hawaii, if they want it out by the middle of 2014. They can play with frequencies and such, but the GPU industry moves too slowly for any sort of design to be changed by now. Decisions are generally made 18-24 months before a product is launched.

Honestly, G-Sync sounds like an idea that wouldn't be hard for AMD to implement as well, or any company that makes or develops displays; Nvidia just happened to come up with it first. Everyone was locked into the idea of a fixed refresh rate for so long, and now that a new way of thinking has come along, I think everyone will follow suit.

The hardware industry moves quite quickly.  The only reason things don't seem that great is that the software side is lagging.  That's not entirely their fault, but if they put more emphasis on drivers over hardware, they could pull more performance out of a card, which would be better for the industry.

Yes, you're right, but it was as easy as a snap of the fingers for them to get it done; within the week we already knew.  Maxwell is said to be a big step up, using all GK110-class components, so that means they're going for even bigger numbers, like doubling the CUDA core count.  Nvidia is known to pay people off to get information from their competitors.  You guys forget the 400 series? They already knew what AMD was doing and just beat them to the punch.  Intel and Nvidia have deep pockets; they're able to find out about anything their competitors are up to. You think they're worried about Mantle or HSA? I doubt it, even though it would be nice to hear them say it.  They may not innovate as much as AMD does, which is sad to say, the little guy doing the innovating, but they always know how to win.  I loved my AMD 6970s; they were beasts at the time, but still behind Nvidia's 570s. I hate how Nvidia plays dirty, but they can because they're Nvidia.

Well, yes, but they made sure the graphics card dictates the refresh rate; that's why it's so well implemented. I get what you're saying, but ASUS, BenQ, AOC, Dell, and everyone else in the monitor industry don't make GPUs, so it took someone like Nvidia to dish it out.  Though at the moment I wouldn't even want it, since I've got dual 1080p screens and my fps is past 60 in every game I play.

Though if AMD implemented it, it wouldn't be as successful, mainly because Nvidia has deeper pockets and deeper connections, and they blackmail. Origin or Maingear, I forget which, was told not to build any AMD-based towers for a set amount of time or they would be sued, or something along those lines.  I bet Nvidia did the same with these monitor manufacturers, like a quid pro quo.

To answer your first question: just rumors, really, and I hope they're true. A big architecture change, which Maxwell is rumored to be, usually introduces some serious performance gains. Because, you know, why else redesign the architecture?

 

To answer the second question: Nvidia stated it themselves; I watched the PCPer and Linus streams of the event. They could be pulling my leg, but I believe it.

1. Architecture changes mean changes, not massive performance gains. ~30% per generation (or less) is pretty typical, regardless of architecture changes. The biggest gains will likely be in power efficiency.

2. Nvidia stating something themselves is about the least reliable source of information.